hovz said: you probly just ignore the fps drops. make a log file with fraps of a dm game. and check how many times fps drops to the 30s.

Haha. This has already been done in this thread. Do you really think the results will be any different the second time?
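For what it's worth, the log check hovz describes is easy to script. A minimal sketch, assuming the FRAPS log has been reduced to one FPS sample per line; the sample data and the 40 fps threshold here are purely illustrative:

```python
# Count how many separate times FPS dips into the 30s (below a threshold)
# in a FRAPS-style log. A "dip" is a run of consecutive low samples, so a
# three-second drop counts once, not three times.

def count_dips(samples, threshold=40):
    """Count runs of consecutive samples below threshold."""
    dips = 0
    below = False
    for fps in samples:
        if fps < threshold and not below:
            dips += 1
        below = fps < threshold
    return dips

# Made-up per-second samples from a hypothetical DM game.
samples = [72, 70, 68, 35, 33, 31, 69, 71, 38, 70]
print(count_dips(samples))  # -> 2
```

In a real run you would read the samples from the FRAPS log file instead of hard-coding them.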
if u cant see the diff between 70 and 35 than you have some issues
hovz said: ok so then ut would be doing it the same way if it was designed well. which would still mean far cry is keeping track of much more culled objects. i dont even see how you guys can argue that far cry isnt doing much more wortk than ut.

When an object is culled, it is ignored. The only objects that have to be kept track of are dynamic objects or those being rendered. If an object is only culled from view and therefore not rendered, it is only relevant if there is another object that might collide with it. Even then, only the objects within that partition need be checked against. Also, engines cull on a per-partition basis -- not a per-object basis.
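A toy sketch of the idea above, using a uniform grid as the partition scheme (real engines use BSPs, octrees, portals, etc. -- all names and cell sizes here are illustrative, not any engine's actual API):

```python
# Toy uniform-grid partitioning: collision candidates are looked up only
# in an object's own cell, and whole cells outside the view are skipped
# in one test rather than testing every object individually.
from collections import defaultdict

CELL = 10.0  # world units per grid cell (arbitrary)

def cell_of(x, y):
    return (int(x // CELL), int(y // CELL))

class Grid:
    def __init__(self):
        self.cells = defaultdict(list)

    def insert(self, obj_id, x, y):
        self.cells[cell_of(x, y)].append(obj_id)

    def candidates(self, x, y):
        # Only objects sharing this cell need a collision check.
        return self.cells[cell_of(x, y)]

    def visible(self, view_cells):
        # Per-partition culling: whole cells are accepted or skipped.
        out = []
        for c in view_cells:
            out.extend(self.cells.get(c, []))
        return out

g = Grid()
g.insert("tree", 3, 4)      # static scenery
g.insert("barrel", 5, 6)
g.insert("player", 95, 95)  # far away, lands in a different cell

print(g.candidates(4, 5))          # -> ['tree', 'barrel']
print(g.visible([cell_of(4, 5)]))  # -> ['tree', 'barrel']
```

The point is that the culled-away "player" cell costs nothing here: it is never iterated, which is why a big outdoor level full of culled objects is not automatically doing "much more work."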
Entz said: This is quite a funny thread.

hovz said: if u cant see the diff between 70 and 35 than you have some issues

If you can really tell the difference between the two, it bugs you to no end that you cannot play during the drops, and you feel the need to bash the game and the developers, then you need to do one of two things IMHO:
1) Upgrade your computer so your minimum is high enough that you feel you can live through the gaming session without blaming potential gaming inadequacies on the sudden drop in frame rate.
or
2) Stop playing this "horrible, terrible, badly coded game," as it would seem you are not enjoying it anyway. Perhaps you could sell it to a friend or a used gaming store and go see a movie or something (you know, go out and see the real world). Thousands of other players have no problem with it.
----
#2 seems like the best option as I am sure you will have the exact same problem with #1. Oh noes it drops from 80fps to 50fps!!!11! I can't live on!
Perhaps if you spent less time looking at the frame counter you would be happier? Overall gameplay experience > per-instant framerate.
Comparing it to other games, written by different developers using different tools, constructs, shortcuts, etc., is quite pointless. If every game used the exact same engine for all time, games would become extremely boring. Variety is a good thing. You need to accept that the FPS you get in that game is a result of the way it was designed; you have no control over it, and you have to learn to live with it or not play it. If you slow down at one point, everyone will slow down at that point, making it even.
If you had only an FX 5200, the CPU dependency of UT200X would be looking mighty nice right now (esp. vs FarCry).
I think Q3 running at 250fps has spoiled some people.
hovz said: the results proved me right.

The results showed that FarCry and UT2004 performed almost the same. UT2004 was slightly lower with 15 bots, but with that many AIs active and fighting, such a performance difference is certainly not unexpected.
hovz said: and i dont need stat fps on to tell me when im dropping to 30 fps.
ShePearl said: After seeing many benchmarks based on the new nVIDIA GeForce 6800 Ultra and ATI X800 XT (and Pro), I'd like to figure out what aspects of UT2004 make it *more* CPU bound compared to Far Cry.
I understand that both ATI and nVIDIA cards are very fast, and unless they're run at very high resolutions such as 1600x1200 with max image quality settings, these cards are almost always bottlenecked by the mainstream CPUs currently available in the market.
But it seems to me that those new cards are bottlenecked more by CPUs in UT2004 than they are in Far Cry.
What are the aspects of UT2004 that make it more CPU dependent than Far Cry?
Cheers.
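A rough mental model for why this shows up in benchmarks: per-frame time is approximately max(CPU work, GPU work), so a CPU-bound game (heavy bot AI and game code, as in UT2004) barely speeds up when you lower the resolution, while a GPU-bound game scales with it. A toy illustration with entirely made-up numbers:

```python
# Toy model: frame time ~= max(cpu_ms, gpu_ms scaled by pixel count).
# All constants are invented for illustration, not measured from either game.

def fps(cpu_ms, gpu_ms_per_mpixel, width, height):
    mpixels = width * height / 1e6
    frame_ms = max(cpu_ms, gpu_ms_per_mpixel * mpixels)
    return 1000.0 / frame_ms

for w, h in [(1600, 1200), (1024, 768), (640, 480)]:
    cpu_bound = fps(cpu_ms=12.0, gpu_ms_per_mpixel=4.0, width=w, height=h)
    gpu_bound = fps(cpu_ms=4.0, gpu_ms_per_mpixel=12.0, width=w, height=h)
    print(f"{w}x{h}: cpu-bound {cpu_bound:.0f} fps, gpu-bound {gpu_bound:.0f} fps")
```

In the cpu-bound column the FPS is flat across resolutions, which is exactly the flat-scaling signature reviewers see with UT2004 on a 6800 Ultra or X800 XT: the card finishes early and waits on the CPU.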
i need to get out? ROFL dont even get me started you stupid nerd. thats the last thing you should be saying to me. i can already picture you in my head. pimply, scrawny, lanky, goofy, ugly, lonely, single, virgin...just leave it