UT2004, Far Cry - CPU and VPU (nv40, R420)

You probably just ignore the fps drops. Make a log file with Fraps during a DM game and check how many times the fps drops into the 30s.
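For reference, counting the dips is easy to script. Here is a rough Python sketch, assuming the Fraps log has been saved as a plain text file with one per-second FPS reading per line; the filename and exact format are placeholders, so adjust the parsing to whatever Fraps actually writes out:

# Rough sketch: count how many one-second samples fall into the 30s.
# "fps_log.txt" is a placeholder name; one FPS reading per line is assumed.
def count_drops(path, low=30.0, high=40.0):
    drops = total = 0
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            try:
                fps = float(line)
            except ValueError:
                continue  # skip header or junk lines
            total += 1
            if low <= fps < high:
                drops += 1
    return drops, total

drops, total = count_drops("fps_log.txt")
print(f"{drops} of {total} one-second samples were in the 30s")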
 
hovz said:
You probably just ignore the fps drops. Make a log file with Fraps during a DM game and check how many times the fps drops into the 30s.
Haha. This has already been done in this thread. Do you really think the results will be any different the second time?
 
I don't care how many times Fraps catches it. I care about what my eyes can see. And if I can't see it, then who gives a crap. I see it in Far Cry, I don't see it in UT2004. Just because I can see it in Far Cry doesn't mean the game is crap.
 
If you can't see the difference between 70 and 35, then you have some issues.

When you are watching TV or a DVD, do you notice any stuttering? They run at 24 to 30 fps, and I personally have never been able to notice the difference between that and 120 fps. In a game, anything over 30 fps is usually acceptable. Problems arise when lag occurs (which can happen for a number of reasons) and the game randomly pauses. Now, if you are talking refresh rates, then that is a different story, as the monitor goes completely black between each refresh and the eye can see the flickering on and off. There is a DIFFERENCE between REFRESH RATES and FRAMES PER SECOND.

Sigh :rolleyes:
 
This is quite a funny thread.

hovz said:
If you can't see the difference between 70 and 35, then you have some issues.
If you can really tell the difference between the two, if it bugs you to no end that you cannot play during the drops (:roll:), and if you feel the need to bash the game and the developers, then you need to do one of two things IMHO:

1) Upgrade your computer so your minimum framerate is high enough that you feel you can live through the gaming session without blaming potential gaming inadequacies on a sudden drop in frame rate.

or

2) Stop playing this "horrible, terrible, badly coded game," as it would seem you are not enjoying it anyway. Perhaps you could sell it to a friend or a used game store and go see a movie or something (you know, go out and see the real world). Thousands of other players have no problem with it.

----
#2 seems like the best option as I am sure you will have the exact same problem with #1. Oh noes it drops from 80fps to 50fps!!!11! I can't live on!

Perhaps if you spent less time looking at the frame counter you would be happier? Overall gameplay experience > per-instant framerate.

Comparing it to other games, written by different developers using different tools, constructs, shortcuts, etc., is quite pointless. If every game used the exact same engine for all time, games would become extremely boring. Variety is a good thing. You need to accept that the FPS you get in that game follows from the way it was designed; you have no control over it, and you have to learn to live with it or not play it. If you slow down at one point, everyone will slow down at that point, making it even.

If you had only an FX 5200, the CPU dependency of UT200X would be looking mighty nice right now (especially vs. Far Cry).

I think Q3 running at 250fps has spoiled some people.
 
Damn ISP. . . My internet connection was inexplicably down for a few hours.

hovz said:
OK, so then UT would be doing it the same way if it was designed well, which would still mean Far Cry is keeping track of many more culled objects. I don't even see how you guys can argue that Far Cry isn't doing much more work than UT.
When an object is culled, it is ignored. The only objects that have to be kept track of are dynamic objects or those being rendered. If it is only culled from view and therefore not rendered, it is only relevant if there is an object that might collide with it. Even then, only the objects within that partition need be checked against. Also, engines cull on a per-partition basis -- not a per-object basis.
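To make that concrete, here is a toy sketch of per-partition culling (my own Python illustration, not code from either engine; the Box/Partition classes and the object names are made up). Partitions outside the view are rejected with a single test, and collision checks only look inside the cells the mover actually overlaps:

# Toy sketch of per-partition culling; not taken from either engine.
class Box:
    def __init__(self, x0, y0, x1, y1):
        self.x0, self.y0, self.x1, self.y1 = x0, y0, x1, y1
    def overlaps(self, other):
        return (self.x0 <= other.x1 and other.x0 <= self.x1 and
                self.y0 <= other.y1 and other.y0 <= self.y1)

class Partition:
    def __init__(self, bounds):
        self.bounds = bounds      # box covering this cell of the map
        self.objects = []         # (name, box) pairs registered in this cell

def visible_objects(partitions, view):
    # Whole partitions outside the view are rejected with one test;
    # the objects inside them are never touched at all.
    out = []
    for cell in partitions:
        if not cell.bounds.overlaps(view):
            continue
        out.extend(name for name, box in cell.objects if box.overlaps(view))
    return out

def collisions(partitions, mover_bounds):
    # Collision checks only look at cells the mover actually overlaps,
    # so a culled object on the far side of the map costs nothing.
    hits = []
    for cell in partitions:
        if not cell.bounds.overlaps(mover_bounds):
            continue
        hits.extend(name for name, box in cell.objects if box.overlaps(mover_bounds))
    return hits

# Tiny usage example
a = Partition(Box(0, 0, 100, 100)); a.objects.append(("crate", Box(10, 10, 20, 20)))
b = Partition(Box(100, 0, 200, 100)); b.objects.append(("barrel", Box(150, 50, 160, 60)))
print(visible_objects([a, b], view=Box(0, 0, 90, 90)))        # only the crate survives
print(collisions([a, b], mover_bounds=Box(12, 12, 18, 18)))   # only partition a is checked

The point is that a culled object sitting in a rejected partition is never even iterated over, so "keeping track of" it costs essentially nothing per frame.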
 
Entz said:
This is quite a funny thread.

hovz said:
If you can't see the difference between 70 and 35, then you have some issues.
If you can really tell the difference between the two, if it bugs you to no end that you cannot play during the drops (:roll:), and if you feel the need to bash the game and the developers, then you need to do one of two things IMHO:

1) Upgrade your computer so your minimum framerate is high enough that you feel you can live through the gaming session without blaming potential gaming inadequacies on a sudden drop in frame rate.

or

2) Stop playing this "horrible, terrible, badly coded game," as it would seem you are not enjoying it anyway. Perhaps you could sell it to a friend or a used game store and go see a movie or something (you know, go out and see the real world). Thousands of other players have no problem with it.

----
#2 seems like the best option as I am sure you will have the exact same problem with #1. Oh noes it drops from 80fps to 50fps!!!11! I can't live on!

Perhaps if you spent less time looking at the frame counter you would be happier? Overall gameplay experience > per-instant framerate.

Comparing it to other games, written by different developers using different tools, constructs, shortcuts, etc., is quite pointless. If every game used the exact same engine for all time, games would become extremely boring. Variety is a good thing. You need to accept that the FPS you get in that game follows from the way it was designed; you have no control over it, and you have to learn to live with it or not play it. If you slow down at one point, everyone will slow down at that point, making it even.

If you had only an FX 5200, the CPU dependency of UT200X would be looking mighty nice right now (especially vs. Far Cry).

I think Q3 running at 250fps has spoiled some people.

I need to get out? ROFL, don't even get me started, you stupid nerd. That's the last thing you should be saying to me. I can already picture you in my head: pimply, scrawny, lanky, goofy, ugly, lonely, single, virgin... just leave it.

And I don't need "stat fps" on to tell me when I'm dropping to 30 fps.
 
hovz said:
The results proved me right.
The results showed that Far Cry and UT2004 performed almost the same. UT2004 was slightly lower with 15 bots, but with that many AIs active and fighting, such a performance difference is certainly not unexpected.
 
Remember the original question?
ShePearl said:
After seeing many benchmarks based on those new nVIDIA GeForce 6800 Ultra and ATI X800 XT (and Pro) cards, I'd like to figure out what aspects of UT2004 make it *more* CPU bound compared to Far Cry.

I understand that both ATI and nVIDIA cards are very fast, and unless they're run at very high resolutions such as 1600x1200 with max image quality settings, these cards are *almost* always bottlenecked by the mainstream CPUs currently available in the market.

But it seems to me that those new cards are bottlenecked more by the CPU in UT2004 than they are in Far Cry.

What are the aspects of UT2004 that make it more CPU dependent than Far Cry?

Cheers.
 
While I haven't played Far Cry, one aspect that typically makes single-player games less CPU-bound is that the AI is not active unless the player comes within a certain radius of the enemy, whereas the AI for bots in UT is always active.
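Roughly, the difference looks like this (my own sketch, not Far Cry's or UT2004's actual code; the activation radius and the think callbacks are made-up placeholders): single-player enemies outside an activation radius stay dormant, while every bot in a UT match runs its AI every tick.

import math

ACTIVATION_RADIUS = 75.0   # made-up value; a real game tunes this per enemy type

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tick_single_player(enemies, player_pos):
    # Only enemies near the player spend CPU on AI this tick.
    for e in enemies:
        if dist(e["pos"], player_pos) <= ACTIVATION_RADIUS:
            e["think"]()   # pathfinding, target selection, etc.

def tick_bot_match(bots):
    # Every bot thinks every tick, whether or not anyone can see it.
    for b in bots:
        b["think"]()

# Tiny usage example
enemies = [{"pos": (10, 0), "think": lambda: print("nearby enemy thinks")},
           {"pos": (500, 500), "think": lambda: print("far enemy stays dormant")}]
tick_single_player(enemies, player_pos=(0, 0))   # only the nearby enemy thinks

With 15 bots always active, that per-tick AI cost lands squarely on the CPU regardless of what the video card is doing.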
 
hovz said:
I need to get out? ROFL, don't even get me started, you stupid nerd. That's the last thing you should be saying to me. I can already picture you in my head: pimply, scrawny, lanky, goofy, ugly, lonely, single, virgin... just leave it.

Yep, you REALLY do need to get out, especially when you start resorting to crude insults over a damn game.
 
This thread has served its purpose - why waste more server space going over the same bloody thing? El lockedo.
 