So how CPU-limited ARE modern games using a G80?

Have there been any tests comparing overclocked/non-overclocked performance in various games with various settings?

Would be nice to dispel that myth once and for all ;)
 
Does it matter?

If your CPU gets 60fps at low res in all games, then that's all you need.

G80 is overkill for today. You buy it so that you can enable all the eye-candy at ultra-high resolution for today's games, and a couple of years down the road it still works well at decent resolutions. Games tend to up the graphics load a lot faster than the CPU load, so who cares if you're CPU limited at 150fps.
 
More CPU power, more FPS. That's all I'll say.

The more GPU-limited the scenario, the smaller the difference. It still goes without saying, though, that under ideal conditions one should couple a high-end GPU with a high-end CPU and not with some past-generation Smelleron :D
 
Games are CPU dependent because
1/ DirectX sucks (prohibitive draw call overhead; rough numbers in the sketch below)
2/ popular "3D engines" are CPU-dependent junkyards
3/ game devs are, for many, lazy asses writing shitty code

(You don't have to pick just one; all of those are true)
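To put a very rough number on point 1, here's a toy model (plain C++ arithmetic, not real D3D code) of how a fixed per-draw-call CPU cost caps the frame rate no matter how fast the GPU is. The ~40µs per call is an assumed DX9-era ballpark for runtime plus driver overhead, not a measurement:

[code]
#include <cstdio>

int main() {
    // Assumed ballpark: DX9-era runtime + driver CPU overhead per draw call.
    // Purely illustrative, not a measurement.
    const double msPerDrawCall = 0.04;                       // ~40 microseconds of CPU time
    const int drawCallsPerFrame[] = { 500, 1000, 2000, 5000 };

    for (int calls : drawCallsPerFrame) {
        double cpuMs  = calls * msPerDrawCall;               // CPU time spent just issuing draws
        double maxFps = 1000.0 / cpuMs;                      // cap even with an infinitely fast GPU
        std::printf("%5d draw calls/frame -> %5.1f ms of CPU -> max %6.1f fps\n",
                    calls, cpuMs, maxFps);
    }
    return 0;
}
[/code]

Which is exactly why engines batch so aggressively, and why lower per-call overhead in DX10 matters.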
 
If memory serves me well, Ace's Hardware did a quick editorial on the performance of the GeForce 3 with different CPUs. It made interesting reading... can't seem to find it right now.
 
Games are CPU dependent because
1/ DirectX sucks (prohibitive draw call overhead)
2/ popular "3D engines" are CPU-dependent junkyards
3/ game devs are, for many, lazy asses writing shitty code

(You don't have to pick just one; all of those are true)

As I understand it, DX10 is much better when it comes to draw calls, and finally on par with OGL. Then again, that will change once OGL3 makes its appearance. Also, I only somewhat agree with 2 & 3, because there are always exceptions. For example, Yann Lombard, renowned at GameDev.net. I know he's working on something, but I don't know what exactly. However, whatever it is, I'm sure it will have the best graphics of any game to date, given his history.
 
Even though basically everything that's been said so far is true, I thought this link deserved a double mention, since it's the first direct answer to the OP that reviewers have provided so far. Driver Heaven compares a 2.66GHz Core2 Quad to an OC'ed 3.6GHz C2 Quad, and sees next to no gain in both FEAR and Oblivion at 19x12 with 4xAA.

The point is, while the generic CPU dependencies that Ingenu so dispassionately (;)) listed may keep overall framerate in check, there are always IQ features you can enable to push the GPU load up to match the CPU limit, essentially nullifying any CPU limitation. Yeah, it's not an ideal choice, but it's not like a G80 will necessarily be spinning its wheels with anything but a 3+GHz Core2 Duo.

Even if you're monitor-limited to 16x12 or (gasp) 12x10, there's always HDR, 8xQ, and Transparency AA (and NV's very useful "Enhance Application" AA setting) to cut the legs out from under the burliest GPU. But the reality is that someone who's willing to pony up $600 for a high-end card that loses its crown after basically a year probably has also invested in a high-res display to maximize that high-end card.
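Some back-of-the-envelope numbers on how fast those features add up even at 16x12 (assuming an FP16 HDR colour target at 8 bytes per sample and a D24S8 depth/stencil buffer at 4 bytes, ignoring any compression):

[code]
#include <cstdio>

int main() {
    // Assumptions (purely illustrative): FP16 RGBA colour = 8 bytes/sample,
    // D24S8 depth/stencil = 4 bytes/sample, no colour/Z compression counted.
    const int width = 1600, height = 1200;
    const int bytesColour = 8, bytesDepth = 4;
    const int msaaSamples[] = { 1, 4, 8 };

    for (int samples : msaaSamples) {
        double mb = double(width) * height * samples * (bytesColour + bytesDepth)
                    / (1024.0 * 1024.0);
        std::printf("1600x1200, %dx MSAA, FP16 HDR: ~%.0f MB of render target\n",
                    samples, mb);
    }
    return 0;
}
[/code]

That's render-target memory alone, before textures, and every one of those samples costs bandwidth to write and resolve, so even a 16x12 panel can keep a G80 honest.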

As for CPU limitations for modern games in general (not specifically with a G80), Xbit Labs published two articles on exactly that last year. I can't access their site ATM, but IIRC they're in the CPU section.
 
Very. We are running out of pixels to push. Every generation we go higher: 1600x1200 -> 1920x1200 -> now 2560x1600. I'm afraid we have hit the end of the road. We need more complex shaders. Bring on the UE3 titles!
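For scale, that works out to roughly 1.92, 2.30 and 4.10 megapixels respectively, so even the jump to 2560x1600 only a bit more than doubles the pixel count of 1600x1200.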
 
Any mid/high-end CPU is good enough to push a G80; as long as the CPU can keep the GPU fed at minimum frame rates, it's fast enough.

Very. We are running out of pixels to push. Every generation we go higher: 1600x1200 -> 1920x1200 -> now 2560x1600. I'm afraid we have hit the end of the road. We need more complex shaders. Bring on the UE3 titles!

I ran out of resolution ages ago; my monitor's default res is 1280x1024, and I agree 100%: bring on the eye candy.
 
That Driver Heaven test was useful indeed.

Based on comparative results between different cards at different resolutions on various sites I have read, it seems as if most games are NOT significantly CPU bound even at lower resolutions (1024, 1280) if moderate IQ features (4xAA, 16xAF) are enabled. There are some exceptions, like some of the Doom 3 engine games and CoD2.

Not that big a surprise to me, as almost all games were GPU bound with the previous top cards and a C2D, as [H] showed so clearly in that controversial "CPU review" of theirs. A 2x boost in GPU power hasn't narrowed the gap entirely, I think.
 
Very. We are running out of pixels to push. Every generation we go higher: 1600x1200 -> 1920x1200 -> now 2560x1600. I'm afraid we have hit the end of the road. We need more complex shaders. Bring on the UE3 titles!
High-res LCD screens will continue to get cheaper and cheaper - 1920x1200 displays are quite affordable now, and the 8800GTS will still bog down with recent titles at that res and good AA, not to mention more shader-heavy titles coming in the future.

If you think a 19" LCD is nirvana, good for you, but as soon as 24" LCDs get below $500 CDN I'm on one. Chances are I'll need the latest GPU at that time to get 60fps in modern titles with all quality options cranked.
 
Wanna bring the G80 to its knees? Easy:
just have 20-30 lights onscreen with good-quality shadows.
True, this also taxes the CPU, but it absolutely murders GPUs.
That's the reason no games do it (except mine of course :p then again I've gotta really cut down on the shadow quality, and I mean really, really cut down).
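As a rough illustration of the scaling (the millisecond costs below are made-up placeholders; the point is only that every shadowed light adds a full extra shadow-map pass plus a lighting pass):

[code]
#include <cstdio>

int main() {
    // Made-up ballpark costs, purely to show how the work scales with light count:
    // each shadow-casting light adds one shadow-map pass over the scene geometry
    // plus one per-pixel lighting pass.
    const double msBaseScenePass   = 5.0;   // scene pass with no dynamic lighting
    const double msPerShadowPass   = 3.0;   // render the scene into one shadow map
    const double msPerLightingPass = 1.5;   // lighting contribution of one light
    const int lightCounts[] = { 1, 5, 10, 20, 30 };

    for (int lights : lightCounts) {
        double frameMs = msBaseScenePass + lights * (msPerShadowPass + msPerLightingPass);
        std::printf("%2d shadowed lights -> ~%5.1f ms/frame (~%5.1f fps)\n",
                    lights, frameMs, 1000.0 / frameMs);
    }
    return 0;
}
[/code]

And that's with simple spot-style shadows; point lights with cube shadow maps multiply the shadow passes by six, which is why shadow quality is the first thing that gets cut.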
 
Actually, with the era of resolution independence coming with Vista (sigh, taken long enough) it'll be practical to use 200dpi or higher screens.

Photographs on screen would look so sweet then. I hate seeing the pixels.

It's just gonna take a few more years yet :cry:

Jawed
 
Actually, with the era of resolution independence coming with Vista (sigh, taken long enough) it'll be practical to use 200dpi or higher screens.

Photographs on screen would look so sweet then. I hate seeing the pixels.

It's just gonna take a few more years yet :cry:

Jawed
The screen-door effect bugs the hell out of me, needs more CRT :D
Just be sure your desk can support one ;)
 
:oops: Some of the resolutions people are using are getting ridiculous, I think.

I mean, 2560x1600 makes sense if that is the native res of your screen, but screens don't need resolutions higher than that unless they are significantly bigger, IMO. And modern games certainly don't have the detail necessary for such high resolutions to make much of a difference, I think.
 