After seeing many benchmarks of the new nVIDIA GeForce 6800 Ultra and ATI X800 XT (and Pro), I'd like to figure out what aspects of UT2004 make it *more* CPU bound compared to Far Cry.
I understand that both the ATI and nVIDIA cards are very fast, and that unless they're run at very high resolutions such as 1600x1200 with maximum image-quality settings, they're almost always bottlenecked by the mainstream CPUs currently on the market.
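To make clear what I mean by "bottlenecked": a frame can't finish until both the CPU and the GPU have done their share of the work, so the frame rate is gated by whichever side takes longer. Here's a rough Python sketch of that crossover; the per-frame costs are made-up numbers purely for illustration, not measurements from either game:

# Toy model of the per-frame bottleneck (hypothetical numbers, not measurements).
CPU_MS = 10.0          # assumed CPU cost per frame: game logic, physics, draw-call setup
GPU_MS_PER_MPIX = 7.0  # assumed GPU cost per million pixels shaded

for width, height in [(800, 600), (1024, 768), (1280, 1024), (1600, 1200)]:
    gpu_ms = GPU_MS_PER_MPIX * (width * height) / 1e6
    frame_ms = max(CPU_MS, gpu_ms)  # the slower unit gates the frame
    limiter = "CPU" if CPU_MS >= gpu_ms else "GPU"
    print(f"{width}x{height}: {1000 / frame_ms:.0f} fps ({limiter} bound)")

With numbers like these, only 1600x1200 tips over to being GPU bound, which is the pattern the benchmarks seem to show.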
But it seems to me that these new cards are CPU-bottlenecked more heavily in UT2004 than they are in Far Cry.
What aspects of UT2004 make it more CPU dependent than Far Cry?
Cheers.