UT2004, Far Cry - CPU and VPU (nv40, R420)


ShePearl

Newcomer
After seeing many benchmarks of the new nVIDIA GeForce 6800 Ultra and ATI X800 XT (and Pro), I'd like to figure out what aspects of UT2004 make it *more* CPU bound compared to Far Cry.

I understand that both the ATI and nVIDIA cards are very fast, and unless they're run at very high resolutions such as 1600x1200 with max image quality settings, these cards are *almost* always bottlenecked by the mainstream CPUs currently available on the market.

But it seems to me that those new cards are bottlenecked more by the CPU in UT2004 than they are in Far Cry.

What are the aspects of UT2004 that make it more CPU dependent than Far Cry?

Cheers.
 
Well, in Far Cry there's a lot of physics and geometry. For whatever reason all polygons are still processed on the CPU, and then the details are added on the GPU. You'd think by this point in time the GPU would be able to do more of the work, but :/

UT2k4 is just badly programmed. Epic's programmers have never been particularly good. The fact that on a top-end CPU your fps will still frequently drop into the 30s regardless of how low you put the settings is a testament to Epic's lack of programming knowledge. I could see it if this was cutting-edge technology, but come on, the fucking game is DirectX 7 for the most part. Very few scenes are over 75,000 polys per frame... and there's virtually no advanced physics except for Karma.
 
UT2k4 is just badly programmed. Epic's programmers have never been particularly good.
Ah, so that's why we've seen so many games based on the Unreal Engine ;)!

The fact that on a top-end CPU your fps will still frequently drop into the 30s regardless of how low you put the settings is a testament to Epic's lack of programming knowledge.
I'm quite sure that I've never seen the fps drop below 30 in any of the levels, and that's on an AMD Athlon XP 2600+!
 
ShePearl said:
What are the aspects of UT2004 that make it more CPU dependent than Far Cry?
I think you are approaching this the wrong way. Your question should be:

What are the aspects of Far Cry that make it more GPU dependent than UT 2004?
 
ShePearl said:
What are the aspects of UT2004 that make it more CPU dependent than Far Cry?

Cheers.
I'd say it's because UT2k4 makes less use of pixel shader techniques and makes heavy use of a scripting language. Basically, the Unreal engine, while its performance is pretty good for the visuals, does still suffer because one of the largest motivating factors in designing the engine is ease of content creation.
 
hovz said:
UT2k4 is just badly programmed. Epic's programmers have never been particularly good.
Ignorant idiot. I do hope Vogel reads this, though, should give him a good laugh.
 
No, it's true: my fps low in UT is always lower than my low in Far Cry. There is NO excuse for that.

Here's a little summary of the UT experience:

Cruising along at 85 fps. Oops, an enemy starts shooting at you; the fps begins to fluctuate rapidly between 60 and 85, making it harder to aim. Uh oh, now there are 3 people on your screen shooting, and the fps hovers from 40 to the 60s, with lots of fluctuation making it seem choppier than it is. Uh oh, you're in an open part of the map with 8 people on screen, and now your fps is hopping down into the 20s and 30s frequently, which causes mouse lag. Oh, I know, I'll just turn on "reduce mouse lag". Uh oh, can't do that, as it drops your fps by an additional 10 to 20.
 
The fact that, on such a relatively simple game technically speaking, you're CPU bottlenecked on an Athlon 64? You get what, 50 fps average? That means the lows are around 30? Explain to me again how good this engine is!!!!
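For what it's worth, the average-versus-minimum distinction being argued here is easy to make concrete. Below is a minimal sketch (my own illustration, using made-up frame times rather than anything measured from UT2004) of how a run that averages around 50 fps can still have its slowest frames land near 30 fps, which is what you actually feel as a drop:

```cpp
// Sketch: why an "average fps" number can hide frame-rate drops.
// The frame times below are hypothetical, chosen only to illustrate the point.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical frame times in milliseconds for a short gameplay burst:
    // mostly smooth, with a few spikes when lots of players are on screen.
    std::vector<double> frame_ms = {12, 13, 12, 14, 20, 33, 31, 15, 13, 12,
                                    28, 35, 14, 13, 12, 12, 30, 32, 13, 12};

    double total_ms = 0.0;
    for (double ms : frame_ms) total_ms += ms;

    // Average fps = frames rendered / total time elapsed.
    double avg_fps = 1000.0 * frame_ms.size() / total_ms;

    // Worst-case fps = the single slowest frame, which is what you feel as a hitch.
    double worst_ms = *std::max_element(frame_ms.begin(), frame_ms.end());
    double min_fps = 1000.0 / worst_ms;

    std::printf("average fps: %.1f\n", avg_fps);
    std::printf("minimum fps: %.1f (slowest frame %.0f ms)\n", min_fps, worst_ms);
}
```

The average divides total frames by total time, so a handful of 30+ ms spikes barely move it, while each one is individually visible as a stutter.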
 
Because everyone I know who doesn't extreme-overclock their system has the same problems?

A couple of people you know, out of a million players, is a very small percentage :LOL:!

Why don't you check out the Unreal Engine 3 videos and then comment on how good Epic programmers are?
 
Have to agree with hovz 100%; this is one of the things that infuriates me the most about PC games. No matter how many dollars you throw at the minimum-fps problem, you can never quite eradicate it.

Carmack's engines are by far the best, which is why I expect Doom 3 to maintain a consistent frame rate, unlike HL2, which looks like a great game, but if you watch any of the Bink videos you can observe the same seemingly "random" and downright annoying frame rate drops, particularly when the screen is turning. Even in the X800 demonstrations of HL2 I can observe this.

Engine programmers should concentrate most on keeping the frame rate consistent through all eventualities in their game. Instead they pile on the effects (HDR this, pixel shader that) with complete disregard for the inevitability that their engine will slow to a crawl in certain situations, even when run on bleeding-edge hardware.
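One blunt way to get the consistency being asked for here is to cap the frame rate at something the engine can hold in its worst case, which is roughly what the 60 fps lock mentioned in the next reply buys Doom 3. A minimal frame-limiter sketch (my own illustration, not code from any actual engine):

```cpp
// Sketch: cap the frame rate so every frame is presented on the same cadence,
// trading peak fps for consistency. Purely illustrative.
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(16667); // ~60 fps cap

    auto next_deadline = clock::now() + frame_budget;
    for (int frame = 0; frame < 10; ++frame) {
        auto start = clock::now();

        // Stand-in for simulation + rendering work; real frames vary in cost.
        std::this_thread::sleep_for(std::chrono::milliseconds(5 + (frame % 3) * 4));

        // Sleep off whatever is left of the 16.7 ms budget, so a fast frame is
        // held back instead of being shown "as fast as possible".
        std::this_thread::sleep_until(next_deadline);
        next_deadline += frame_budget;

        auto total = std::chrono::duration_cast<std::chrono::microseconds>(
            clock::now() - start);
        std::printf("frame %d: %.1f ms\n", frame, total.count() / 1000.0);
    }
}
```

Of course, a cap only smooths things out as long as the worst frame still fits inside the budget; it does nothing for the spikes that blow past it.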
 
But Unreal doesn't even have any effects, and the fps still jumps all over the place on the best of systems. I don't care about Unreal 3; in 2006 we can talk about it. I'm talking about the 2k4 and 2k3 engine. It's 2004, the game doesn't even have above-average graphics anymore, and the fps can't even stay consistently above 60. Again, my fps is much steadier in Far Cry than in 2k4. Far Cry looks far, far better than 2k4, and is doing far, far more work too. Please explain to me again how good Epic's engine is :rolleyes:
 
Because everyone I know who doesn't extreme-overclock their system has the same problems?

Sorry, can't help it that you and 'everyone' you know can't set up their PCs or the game correctly. ;)

Carmack's engines are by far the best, which is why I expect Doom 3 to maintain a consistent frame rate

Being locked to 60fps should help.

Engine programmers should concentrate most on keeping the frame rate consistent through all eventualities in their game. Instead they pile on the effects (HDR this, pixel shader that) with complete disregard for the inevitability that their engine will slow to a crawl in certain situations, even when run on bleeding-edge hardware.

Oooh, games slow down when lots is occurring. Wow, I wonder what could be causing that? :rolleyes:
 
Heathen said:
Because everyone I know who doesn't extreme-overclock their system has the same problems?

Sorry, can't help it that you and 'everyone' you know can't set up their PCs or the game correctly. ;)

Yes, that's why I can run every other game without constant random fluctuation. That's also why every review states that UT2004 is completely CPU limited even on an Athlon 64 3400+. I guess those hardware sites could learn a lot about setting up systems? :rolleyes: Morons.
 
hovz said:
Well, in Far Cry there's a lot of physics and geometry. For whatever reason all polygons are still processed on the CPU, and then the details are added on the GPU. You'd think by this point in time the GPU would be able to do more of the work, but :/
What do you mean, exactly, by "all polygons are still processed on the CPU"?
hovz said:
The fact that, on such a relatively simple game technically speaking, you're CPU bottlenecked on an Athlon 64? You get what, 50 fps average? That means the lows are around 30? Explain to me again how good this engine is!!!!
Consider what's happening in-game when the framerate drops so low. There are a lot of AI and physics algorithms in action, ya know. Also, you must realize that virtually all gameplay and game content is coded not in C++, but in UnrealScript. UnrealScript, like Java, is probably compiled down to byte-code of some type and run on a virtual machine -- almost like a CPU in software. This makes it faster than interpreted high-level code, but always much slower than compiled C++ code. As such, I don't find it at all surprising that the Unreal Engine gets CPU bound.
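To illustrate the "byte-code on a virtual machine" point, here is a toy sketch (my own illustration; this is not UnrealScript's real instruction set, just a made-up stack machine) showing the per-instruction fetch/decode/dispatch work a VM pays on every operation, work that simply doesn't exist in natively compiled C++:

```cpp
// Toy stack-machine interpreter, purely illustrative of VM dispatch overhead.
#include <cstdio>
#include <vector>

enum Op : unsigned char { PUSH, ADD, MUL, PRINT, HALT };

void run(const std::vector<unsigned char>& code) {
    std::vector<int> stack;
    size_t pc = 0;
    for (;;) {
        switch (code[pc++]) {            // fetch + decode: paid on every single op
        case PUSH: stack.push_back(code[pc++]); break;
        case ADD: { int b = stack.back(); stack.pop_back(); stack.back() += b; break; }
        case MUL: { int b = stack.back(); stack.pop_back(); stack.back() *= b; break; }
        case PRINT: std::printf("%d\n", stack.back()); break;
        case HALT: return;
        }
    }
}

int main() {
    // Byte-code equivalent of the one-line native expression: (2 + 3) * 10
    run({PUSH, 2, PUSH, 3, ADD, PUSH, 10, MUL, PRINT, HALT});
}
```

Every PUSH/ADD/MUL costs a branch and some stack traffic on top of the actual arithmetic, which is the kind of overhead described above; it's faster than re-parsing source text, but it all lands on the CPU.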
 
Chalnoth said:
hovz said:
But Unreal doesn't even have any effects,
Riiight. Try playing the game.

I'm referring to advanced effects used in recent games, not basic alpha blending, primitive smoke and fire effects, and ugly fake reflections.
 
Ostsol said:
hovz said:
Well, in Far Cry there's a lot of physics and geometry. For whatever reason all polygons are still processed on the CPU, and then the details are added on the GPU. You'd think by this point in time the GPU would be able to do more of the work, but :/
What do you mean, exactly, by "all polygons are still processed on the CPU"?
hovz said:
The fact that, on such a relatively simple game technically speaking, you're CPU bottlenecked on an Athlon 64? You get what, 50 fps average? That means the lows are around 30? Explain to me again how good this engine is!!!!
Consider what's happening in-game when the framerate drops so low. There are a lot of AI and physics algorithms in action, ya know. Also, you must realize that virtually all gameplay and game content is coded not in C++, but in UnrealScript. UnrealScript, like Java, is probably compiled down to byte-code of some type and run on a virtual machine -- almost like a CPU in software. This makes it faster than interpreted high-level code, but always much slower than compiled C++ code. As such, I don't find it at all surprising that the Unreal Engine gets CPU bound.

All polys are first processed on the CPU; then they are sent to the GPU to be textured, lit, and everything else.
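A rough sketch of one reading of that claim (my own generic illustration, not anything taken from Far Cry or the Unreal engine): the CPU walks every vertex and transforms it before the geometry is handed to the card, which is per-vertex work that scales with polygon count and that a vertex shader could instead be doing on the GPU:

```cpp
// Sketch of CPU-side per-vertex transform work, purely illustrative.
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

// Row-major 3x4 transform: rotation/scale in the 3x3 part, translation in the last column.
struct Mat3x4 { float m[3][4]; };

Vec3 transform(const Mat3x4& t, const Vec3& v) {
    return {
        t.m[0][0] * v.x + t.m[0][1] * v.y + t.m[0][2] * v.z + t.m[0][3],
        t.m[1][0] * v.x + t.m[1][1] * v.y + t.m[1][2] * v.z + t.m[1][3],
        t.m[2][0] * v.x + t.m[2][1] * v.y + t.m[2][2] * v.z + t.m[2][3],
    };
}

int main() {
    // A tiny "mesh" and an object-to-world transform (translate by +1 on x).
    std::vector<Vec3> mesh = {{0, 0, 0}, {1, 0, 0}, {0, 1, 0}};
    Mat3x4 object_to_world = {{{1, 0, 0, 1}, {0, 1, 0, 0}, {0, 0, 1, 0}}};

    // CPU loop over every vertex, every frame -- work that grows with polygon
    // count and is spent before the GPU ever sees a triangle. A vertex shader
    // can do this same multiply per vertex on the GPU instead.
    std::vector<Vec3> world(mesh.size());
    for (size_t i = 0; i < mesh.size(); ++i)
        world[i] = transform(object_to_world, mesh[i]);

    for (const Vec3& v : world)
        std::printf("%.1f %.1f %.1f\n", v.x, v.y, v.z);
}
```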

There's no AI in multiplayer, and Karma is only applied to player deaths and vehicles. So again, where is all the physics work?

That also doesn't explain why Far Cry never stutters and drops like Unreal does.
 