UT2004, Far Cry - CPU and VPU (NV40, R420)

So has someone answered the earlier question of why Far Cry can run better (though it doesn't on my machine)?
Are the physics/AI less complex, or is it coded more efficiently? The graphics are certainly better.
 
I'd imagine UT2K4 benches are more CPU-bound because (wait for it) they're botmatches, whereas Far Cry benches tend to be more like fly-bys (simple run-throughs with FRAPS).
 
When I play Far Cry my FPS very rarely drops below 40; in UT it drops into the 30s all the time.
 
Pete said:
I'd imagine UT2K4 benches are more CPU-bound because (wait for it) they're botmatches, whereas Far Cry benches tend to be more like fly-bys (simple run-throughs with FRAPS).
I think this is the best answer I've seen in this thread. :) Thanks.

Botmatches, indeed. Lots of AI and physics calculations going on. I wonder how UT2004 fly-by benchmarks (rather than botmatches) would turn out with these new cards, compared to how botmatches scale with the CPU.
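To see the distinction in numbers, here's a toy sketch of the usual diagnostic: run the same demo at several resolutions and watch whether FPS moves. The Python below is purely illustrative and every FPS figure in it is made up, but the pattern is the point.

```python
# Illustrative sketch: deciding whether a benchmark run is CPU- or GPU-bound.
# The FPS numbers below are invented; in practice you'd get them from the
# game's built-in benchmark or a FRAPS run at each resolution.

def classify_bottleneck(fps_by_resolution, tolerance=0.05):
    """If average FPS barely moves as resolution (pixel load) rises,
    the CPU is the limiter; if FPS falls with resolution, it's the GPU."""
    fps_values = list(fps_by_resolution.values())
    spread = (max(fps_values) - min(fps_values)) / max(fps_values)
    return "CPU-bound" if spread < tolerance else "GPU-bound"

# Typical botmatch pattern: FPS is flat, so AI/physics on the CPU dominate.
botmatch = {"640x480": 61.0, "1024x768": 60.2, "1600x1200": 58.9}
# Typical fly-by pattern: FPS tracks pixel count, so the VPU dominates.
flyby = {"640x480": 180.0, "1024x768": 130.0, "1600x1200": 75.0}

print(classify_bottleneck(botmatch))  # CPU-bound
print(classify_bottleneck(flyby))     # GPU-bound
```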
 
Pete said:
I'd imagine UT2K4 benches are more CPU-bound because (wait for it) they're botmatches..
Exactly. The other thing people seem to forget when using botmatches to benchmark is that UT2004 is essentially an online game. For the most part you don't play against bots online.
 
Diplo said:
Pete said:
I'd imagine UT2K4 benches are more CPU-bound because (wait for it) they're botmatches..
Exactly. The other thing people seem to forget when using botmatches to benchmark is that UT2004 is essentially an online game. For the most part you don't play against bots online.
Has anyone tried benchmarking with a recording of an online game?
 
Ostsol said:
Diplo said:
Pete said:
I'd imagine UT2K4 benches are more CPU-bound because (wait for it) they're botmatches..
Exactly. The other thing people seem to forget when using botmatches to benchmark is that UT2004 is essentially an online game. For the most part you don't play against bots online.
Has anyone tried benchmarking with a recording of an online game?
I think [H] does it for their reviews.
 
Diplo said:
Pete said:
I'd imagine UT2K4 benches are more CPU-bound because (wait for it) they're botmatches..
Exactly. The other thing people seem to forget when using botmatches to benchmark is that UT2004 is essentially an online game. For the most part you don't play against bots online.
This means that when playing online against human opponents, it becomes *less* CPU-bound compared to botmatch benchmarks, right?
 
Also, I wonder whether the upcoming Doom 3 and Half-Life 2 are more like UT2004, more like Far Cry, or a mixture of both?

For instance, Half-Life 2, with its second-generation Havok physics, will demand a lot from the CPU, I believe. And it seems to be a very shader-intensive game as well. Ah... it sounds like a mixture of both: lots of CPU power as well as VPU. That probably means crawling on current-generation hardware. :(
 
ShePearl said:
This means that when playing online against human opponents, it becomes *less* CPU-bound compared to botmatch benchmarks, right?
Yep, that's right. Of course there will be some extra overhead in the netcode, but I can't imagine this would be as great as the processing power needed for the AI. You will still have all the physics to contend with, but bear in mind you can set the physics detail to low, medium or high in UT2K4, depending on how powerful your CPU is.

Oh, and Steve Polge is responsible for all the AI coding in the UT series, not Tim Sweeney. I doubt many people, unless they are hard-core players, really understand just how damn good the AI is in UT2004.
 
That is assuming that the AI is the reason it slows down when characters are on screen (botmatch vs. fly-by demo).

I don't know about this release, but on the previous UT release I got acceptable framerates on a 1200 MHz system by turning down the shadowing options and turning off EAX.

I suggest that hovz (and others) try modifying some of the quality settings (audio and video) and compare botmatch and in-game framerates. Nearly everything here is evidence-free speculation, yet the game itself ships with a benchmark and many config options. Change them and figure it out yourself!

It was clearly not the AI's fault last time I tweaked an Unreal engine. But I'm all uninstalled now...

Think about it: I was playing the original Unreal on a 233 MHz system with SOFTWARE rendering, and the UnrealScript AI wasn't a problem. Most of that UnrealScript is backed by native C++ anyhow.

From my evidence before, it was the CPU time spent on shadowing calculations and some very bad 3D sound support when combined with EAX.
Surprisingly, changing the sound options to use software DirectSound3D was much faster! My minimum framerates shot way up by changing sound settings and shadow detail. Neither is AI related.

Maybe it is the same thing here? Try out some other video and audio quality settings. Maybe it's something else? But this speculation so far is pointless...
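For anyone who wants to follow that advice systematically, here's a rough sketch of the change-one-setting-then-re-run method. It assumes a FRAPS-style frametimes log with one cumulative millisecond timestamp per frame; the file names and exact format are illustrative, not guaranteed.

```python
# Rough sketch of the "change one setting, re-run, compare" method.
# Assumes a FRAPS-style frametimes log: one cumulative timestamp (ms)
# per frame. File names and column layout here are illustrative.

def load_frame_times_ms(path):
    """Read cumulative per-frame timestamps (ms), one per line."""
    with open(path) as f:
        return [float(line.strip()) for line in f if line.strip()]

def summarize(times_ms):
    """Return (average FPS, worst instantaneous FPS) for one run."""
    deltas = [b - a for a, b in zip(times_ms, times_ms[1:])]
    avg_fps = 1000.0 * len(deltas) / (times_ms[-1] - times_ms[0])
    min_fps = 1000.0 / max(deltas)   # the single slowest frame
    return avg_fps, min_fps

# Run the same botmatch once per setting, then compare:
for run in ["eax_on_shadows_high.txt", "eax_off_shadows_low.txt"]:
    avg, worst = summarize(load_frame_times_ms(run))
    print(f"{run}: avg {avg:.1f} fps, worst frame {worst:.1f} fps")
```

Minimum (worst-frame) FPS is the number to watch here, since that's what the EAX and shadow settings moved in my testing, not the average.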
 
You may be right there. EAX is a sound *deceleration* API IMHO. I don't know what the hell is wrong with Creative, but EAX/2/3 severely saps performance on my system. Battlefield Vietnam is virtually unplayable with it on.

Does EAX run as badly on non-Creative sound cards (e.g. Nvidia's MCP)?
 
DemoCoder said:
You may be right there. EAX is a sound *deceleration* API IMHO. I don't know what the hell is wrong with Creative, but EAX/2/3 severely saps performance on my system. Battlefield Vietnam is virtually unplayable with it on.

Does EAX run as badly on non-Creative sound cards (e.g. Nvidia's MCP)?

AFAICR, the likes of the MCP only support the original EAX. EAX 2.0/3.0 seems to be something that Creative have kept for themselves as a unique selling point.

EAX suffers from the same performance issue as high-end graphics cards, i.e. you are asking your system to do a lot more work (environments, occlusions, etc.), so you end up with a much more complicated sound environment that causes a bigger performance hit (even with hardware support) than you get with a simpler sound environment running in software.
 
thop said:
Neeyik said:
I think [H] does it for their reviews.
Yes, and they still barely got above a 60 FPS average with their A64, no matter the resolution.
I just used UMark to benchmark a 12-bot Bombing Run game (BR-Anubis) and got an average of 92 FPS at highest detail at 1024x768 on a GF4 Ti 4400 with an Athlon XP 2800+.
 
Bouncing Zabaglione Bros. said:
EAX suffers from the same performance issue as high-end graphics cards, i.e. you are asking your system to do a lot more work (environments, occlusions, etc.), so you end up with a much more complicated sound environment that causes a bigger performance hit (even with hardware support) than you get with a simpler sound environment running in software.

I disagree. EAX shouldn't cause a huge CPU impact, just as fixed-function T&L shouldn't (in fact, EAX's should be far less). You are asking the hardware to do the work, not the CPU. And unlike Aureal's A3D, which *actually did work* to figure out the environment by using a low-res geometry version of it, EAX simply requires tagging your map with some predefined environment cues (how much reverb, how much occlusion, room size, or just using presets).
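Just to make concrete how little the application actually has to do, here's an illustrative sketch. Every name and number below is invented for illustration; the real parameters are passed through the sound driver's EAX listener property set, but the per-room data is about this small.

```python
# Illustrative sketch of why EAX tagging is cheap for the application.
# These names are made up for illustration; the real parameters live
# behind the sound driver, but the data per room is about this small.

from dataclasses import dataclass

@dataclass
class RoomAcoustics:
    """Authored once at map-creation time, per zone."""
    reverb_level: float      # how much reverb
    decay_time_s: float      # how long it rings
    occlusion_db: float      # attenuation through walls
    room_size: float         # scales reflection timing
    preset: str              # or just pick a canned environment

HANGAR = RoomAcoustics(reverb_level=0.9, decay_time_s=10.0,
                       occlusion_db=-12.0, room_size=50.0, preset="hangar")

def on_player_enter_zone(zone_acoustics, driver_set_listener_params):
    # At runtime the game just forwards a handful of scalars to the
    # driver; there is no geometry processing here, unlike Aureal's A3D.
    driver_set_listener_params(zone_acoustics)

on_player_enter_zone(HANGAR, lambda params: print("->", params))
```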

I can only conclude that Creative's cards really don't accelerate much of EAX and that the drivers are doing too much of the work on the CPU. I don't notice these issues on Sensaura engines, and my Xbox, for example, can not only handle everything EAX does but can encode Dolby 5.1 to boot.

I think the consolidation in the audio card industry that has made Creative a virtual monopoly has been bad for the advancement of sound.
 
DemoCoder said:
I think the consolidation in the audio card industry that has made Creative a virtual monopoly has been bad for the advancement of sound.

Understatement of the year Democoder.

In June or July 1998 I built my first custom PC. For sound I bought an SB Live and for video a Voodoo 2 12MB (both cutting-edge at the time). Since then, video technology has followed an exponential growth curve, making the V2 long obsolete, while the SB Live was never eclipsed until its death two months ago.
 
DemoCoder said:
I disagree. EAX shouldn't cause a huge CPU impact, just as fixed-function T&L shouldn't (in fact, EAX's should be far less). You are asking the hardware to do the work, not the CPU.

Shouldn't do, but it does. "Disagreeing" doesn't change the facts of the matter. That's why (for instance) Epic tells you to disable EAX if you experience performance problems. In fact, the pop-up help in UT2K3/4 warns you, as you try to set this in-game, that it reduces performance.

DemoCoder said:
I can only conclude that Creative's cards really don't accelerate much of EAX and that the drivers are doing too much of the work on the CPU. I don't notice these issues on Sensaura engines, and my Xbox, for example, can not only handle everything EAX does but can encode Dolby 5.1 to boot.

I think the consolidation in the audio card industry that has made Creative a virtual monopoly has been bad for the advancement of sound.

Yes, that's why moves such as nForce sound and VIA's Envy are good. Unfortunately, Creative have gobbled up Sensaura the same as they did Aureal. I wouldn't mind so much if they produced good products, but for the most part Creative's sound hardware and drivers have been problematic for many people for a long, long time.

It's an interesting parallel with other parts of the industry. What would Intel be like without AMD, or Nvidia without ATI to keep them honest? I think Creative illustrates perfectly what happens when a monopoly of one is the only game in town.
 
Bouncing Zabaglione Bros. said:
Shouldn't do, but it does. "Disagreeing" doesn't change the facts of the matter.

I wasn't disagreeing that Creative's EAX slows down on Creative cards; I was disagreeing with this:

You said:
EAX suffers from the same performance issue as high-end graphics cards, i.e. you are asking your system to do a lot more work (environments, occlusions, etc.)

The fact of the matter is, EAX does not impose "a lot more work" on the application. The content authors merely select several variables (or presets) for each room. This can be done at map-creation time. At runtime, the application just passes these parameters to the EAX driver along with information on the sound emitter and receiver.

Ergo, the EAX driver itself is chewing up too much CPU rather than offloading the work to the onboard DSPs, which raises the question: why not?

This parallels the situation with video encoding/decoding, where many video cards do not fully accelerate all parts of the video codec pipeline, including some of the most expensive parts, which means that work still has to be done on the CPU.
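A toy model of that point, since it's easy to miss: partial offload doesn't buy you much if the expensive stage stays in software. Every stage name and relative cost below is invented purely for illustration, not a measurement of any real codec or driver.

```python
# Toy model: partial hardware offload can leave the most expensive
# pipeline stages on the CPU. Stage names and relative costs are
# illustrative only, not measurements of any real codec or driver.

# (stage, relative cost, offloaded to hardware?)
pipeline = [
    ("bitstream parse",      5, False),
    ("inverse DCT",         20, True),   # commonly accelerated
    ("motion compensation", 30, True),   # commonly accelerated
    ("deblocking filter",   25, False),  # often left on the CPU
    ("colour conversion",   20, True),
]

total = sum(cost for _, cost, _ in pipeline)
cpu_left = sum(cost for _, cost, hw in pipeline if not hw)
print(f"CPU still carries {100 * cpu_left / total:.0f}% of the work")
# Even with most stages "accelerated", one expensive stage left in
# software keeps the CPU busy -- the same story as an EAX driver that
# doesn't push the reverb work down to the card's DSPs.
```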
 