Next-generation GPUs to kill Core 2 Duo/Quad and Phenom?

almighty

Banned
Hey guys, I've been overclocking all night to squeeze out that little bit more, as you do :)

Anyway, there are a few articles floating around the net showing that you need a CPU running at at least 3.6GHz not to bottleneck a GTX 260/275/280/285/295, a 4870 X2, etc., or any of the SLI/Crossfire configs available.

And in a few games it's actually closer to a 3.8GHz clock that's needed. This got me thinking: if the cards NOW need that much CPU power, what are the next-gen DX11 cards going to need?

I mean, in some tests an i7 is still holding back tri-SLI/quad-CrossFire systems in certain games, so what's going to happen when you SLI/CrossFire the next-gen cards?

IMO the Core 2 architecture and the Phenoms will have had it when the next-gen cards come out :(

We need more CPU power :(
 
Nope. There is no such thing as a fixed balance between CPU and GPU power. It's completely dependent on the settings you use.

Really, you should never need more than a fixed 60fps framerate. So if your CPU is capable of delivering that (which an i7 is in every game), then you are by default GPU limited.

Your GPU is only there to be cranked up to the point that it matches the framerate delivered by your CPU, so unless your CPU is literally incapable of delivering playable framerates at any graphical settings, you are pretty much GPU bound.

Just dial up the resolution, or the AA. Add TSAA if you have to, or 16xQ AA. Throw in vsync, and if you're still hitting a fixed 60fps, don't consider yourself CPU limited; just consider yourself lucky that your system is practically limitless in current games! But be aware that your GPU will run out of power much faster than your CPU will.
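A rough sketch of the test being described: crank up the resolution and watch the framerate. If it barely moves, the CPU is the ceiling; if it collapses, the GPU is. The threshold and all the numbers below are made up for illustration, not benchmarks.

```python
# Hypothetical bottleneck check: compare average fps across resolutions.
# A small drop as pixels are added suggests the CPU is the limit.

def likely_bottleneck(fps_by_resolution, tolerance=0.05):
    """fps_by_resolution: {(width, height): average fps}.
    Returns which side appears to be the ceiling."""
    fps = [fps_by_resolution[r] for r in sorted(fps_by_resolution)]
    drop = (fps[0] - fps[-1]) / fps[0]
    return "CPU-bound" if drop < tolerance else "GPU-bound"

# Framerate stays pinned near 60fps as the resolution climbs: CPU-bound.
print(likely_bottleneck({(1280, 1024): 62, (1680, 1050): 61, (1920, 1200): 60}))
# Framerate collapses as pixels are added: GPU-bound.
print(likely_bottleneck({(1280, 1024): 90, (1680, 1050): 70, (1920, 1200): 48}))
```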
 
To elaborate: all games/programs are CPU/GPU limited to some extent. One or the other, or even both, can be idling (or have portions idling) at any time, even if the system is "mostly" bottlenecked by something else.

You just have to find the limit that is comfortable for you. For example, I'm quite peachy with 30 FPS as long as there is no input lag and the minimum FPS doesn't drop below 15 or so. So I'm quite happy bumping up resolution and IQ until it drops below that.

On the other hand, some people aren't satisfied with anything under 60fps and dial down resolution and IQ until they are.

I'm sure that if I dialed the resolution up to 16000x10000, for example, I'd bottleneck anything in existence for the next few years. Although if I was only doing wireframes, it might not. :D

Regards,
SB
 
Anyway, there are a few articles floating around the net showing that you need a CPU running at at least 3.6GHz not to bottleneck a GTX 260/275/280/285/295
This is utter bullcrap.

What game are we talking about? At what resolution, in-game effects level, texture and anti-aliasing quality level, and so on?

I don't know of any graphics rig that is CPU limited in Crysis with everything maxed out at HD resolutions, do you?

If by "bottlenecked" you mean you're running 200fps at 3.5GHz and 230fps at 3.8GHz, thus showing a CPU "bottleneck", well, think about what possible use you could have for those extra frames, considering all they result in is massive tearing on your monitor, which most likely doesn't show more than 60fps anyway.
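The arithmetic behind this point: converting fps into frame time shows how little those "extra" frames matter on a 60Hz panel. The figures below just restate the example numbers above.

```python
# Frame time (ms per frame) is the reciprocal of the framerate.

def frame_time_ms(fps):
    return 1000.0 / fps

print(frame_time_ms(200))            # 5.0 ms per frame at the "slow" clock
print(round(frame_time_ms(230), 2))  # 4.35 ms per frame at the "fast" clock
print(round(frame_time_ms(60), 2))   # 16.67 ms between 60Hz refreshes
# Both CPUs produce several frames per refresh interval; the monitor can
# only ever show one of them, so the 30fps gap never reaches the screen.
```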

Speaking about CPU bottlenecks is pointless today.
 
This is utter bullcrap.

What game are we talking about? At what resolution, in-game effects level, texture and anti-aliasing quality level, and so on?

I don't know of any graphics rig that is CPU limited in Crysis with everything maxed out at HD resolutions, do you?

If by "bottlenecked" you mean you're running 200fps at 3.5GHz and 230fps at 3.8GHz, thus showing a CPU "bottleneck", well, think about what possible use you could have for those extra frames, considering all they result in is massive tearing on your monitor, which most likely doesn't show more than 60fps anyway.

Speaking about CPU bottlenecks is pointless today.

How so? Look at this.
http://www.behardware.com/articles/759-8/amd-phenom-ii-x4-955-black-edition.html

Granted, the Crysis CPU bench might be a little artificial, but CPU power pretty much determines the best framerate you can get.

I agree that bottlenecks are often overrated (comments such as "you should not get GPU Y because your CPU X is too slow", or "why do you keep your 8800GTX on your Core i7?").
If your CPU is too slow, you can set higher details (bar CPU-dependent ones) and more AA/resolution.

Yet in the end, a fast CPU with a crappy GPU is preferable to the reverse if you want to be able to run recent games.
Take a Core 2 Duo with a modern IGP or a lowest-end GPU: you will suffer, but you may be able to run some games at 800x600 with a mix of low and medium details. Now try running that game on an XP 1800+ with a 9800 Pro or 6800 GT. That's a considerably better GPU, but all you can achieve is a good-looking slideshow.
 
Now, when you read this article, consider that the GT300 and ATI 5000 series will be a lot faster and require even more CPU power than the card tested here:

GTX 295 CPU Scaling : http://www.legionhardware.com/document.php?id=807

I'm not seeing any CPU limits in that article, though.

Every CPU tested, right down to the 2.0GHz C2D, is more than capable of pushing out playable framerates in every game. The two games where the slowest CPU doesn't break 60fps (Crysis and FC2) are both running below their max settings anyway (Crysis is way below), and hence any apparent CPU limitations would become GPU limitations when the settings are turned up.

The only game benchmark there that makes a decent case for CPU limitations is World in Conflict, where a C2D at 2GHz is limited to 35fps but an i7 at 3.6GHz can push out over 60fps. Still, that only shows a limitation of today's slower dual cores, not the best quads. It's clear that the i7s are running into GPU limits as the resolution goes up, and that's with only basic 4xAA. What happens with 16xQ CSAA plus TSAA? And then throw in some ambient occlusion if the game supports it...
 
How so? Look at this.
http://www.behardware.com/articles/759-8/amd-phenom-ii-x4-955-black-edition.html

Granted, the Crysis CPU bench might be a little artificial, but CPU power pretty much determines the best framerate you can get.

I agree that bottlenecks are often overrated (comments such as "you should not get GPU Y because your CPU X is too slow", or "why do you keep your 8800GTX on your Core i7?").
If your CPU is too slow, you can set higher details (bar CPU-dependent ones) and more AA/resolution.

Yet in the end, a fast CPU with a crappy GPU is preferable to the reverse if you want to be able to run recent games.
Take a Core 2 Duo with a modern IGP or a lowest-end GPU: you will suffer, but you may be able to run some games at 800x600 with a mix of low and medium details. Now try running that game on an XP 1800+ with a 9800 Pro or 6800 GT. That's a considerably better GPU, but all you can achieve is a good-looking slideshow.

I more or less agree. A CPU limit is real if the CPU is literally incapable of pushing playable framerates. That's an extremely rare occurrence with modern CPUs, however, and even in today's most demanding in-game CPU test (the Crysis one), a Core 2 Quad Q9400 or an i7 920 can push out >30fps average.

You could argue that that's not enough and that we need faster CPUs to enable higher framerates, but even on a GT300 I'm betting you could easily bring the framerate down to, and below, that 30fps mark simply by upping the graphics settings. Hence the game is effectively still GPU limited, because it's unplayable at higher graphical settings and playable (albeit at 30fps) at lower graphical settings, where the bottleneck moves from the GPU onto the CPU.

Some games today require pretty beefy CPUs to be playable; Crysis and GTA4 are the two I can think of, but at least those CPUs do exist today. It's going to be a long time before we see a game that is unplayable on anything less than a 3.3GHz i7, regardless of what graphics settings you apply.
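A toy model of the argument above: the delivered framerate is simply the slower of what the CPU can simulate and what the GPU can render at the chosen settings. The numbers are illustrative, not benchmarks.

```python
# The delivered framerate is capped by whichever side is slower.

def delivered_fps(cpu_fps, gpu_fps_by_setting, setting):
    """cpu_fps: what the CPU can simulate per second (roughly settings-independent).
    gpu_fps_by_setting: {settings level: fps the GPU can render at that level}."""
    return min(cpu_fps, gpu_fps_by_setting[setting])

gpu = {"medium": 90, "high": 45, "very high": 22}
cpu = 35  # e.g. a mid-range quad in a heavy in-game CPU test

print(delivered_fps(cpu, gpu, "medium"))     # 35: the CPU is the limit
print(delivered_fps(cpu, gpu, "very high"))  # 22: upping settings moves the bottleneck back onto the GPU
```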
 
The only game benchmark there that makes a decent case for CPU limitations is World in Conflict, where a C2D at 2GHz is limited to 35fps but an i7 at 3.6GHz can push out over 60fps. Still, that only shows a limitation of today's slower dual cores, not the best quads. It's clear that the i7s are running into GPU limits as the resolution goes up, and that's with only basic 4xAA. What happens with 16xQ CSAA plus TSAA? And then throw in some ambient occlusion if the game supports it...

Add to that, even competition-grade RTS play doesn't need much more than 30fps. It isn't as if precision twitch aiming is required, as in competition-grade FPS play, where you may have to hit something only a few pixels across in the distance that is there for only a second or two at most.

Regards,
SB
 
How so? Look at this.
http://www.behardware.com/articles/759-8/amd-phenom-ii-x4-955-black-edition.html

Granted, the Crysis CPU bench might be a little artificial, but CPU power pretty much determines the best framerate you can get.

I agree that bottlenecks are often overrated (comments such as "you should not get GPU Y because your CPU X is too slow", or "why do you keep your 8800GTX on your Core i7?").
If your CPU is too slow, you can set higher details (bar CPU-dependent ones) and more AA/resolution.

Yet in the end, a fast CPU with a crappy GPU is preferable to the reverse if you want to be able to run recent games.
Take a Core 2 Duo with a modern IGP or a lowest-end GPU: you will suffer, but you may be able to run some games at 800x600 with a mix of low and medium details. Now try running that game on an XP 1800+ with a 9800 Pro or 6800 GT. That's a considerably better GPU, but all you can achieve is a good-looking slideshow.

Why the arbitrary choices? I thought it was generally accepted logic that, as a PC gamer, you are far better off with a mid-range CPU and a high-end GPU than vice versa!

You basically picked an antique CPU just to make your "point". Now, what if I said that with a 2GHz Core 2 Duo and a 4890 you'll be much better off than with a Core i7 960 OCed at 4GHz and an IGP? I would be right as well, just because I changed the mix.

In the end I think that, in this era of fixed-pixel displays, what you're limited by is decided by the particular game and resolution. For example, on my 1680x1050 monitor, I'm GPU limited when I can't max the settings at that res. However, when I can max the settings and my framerate doesn't change, the CPU is my bottleneck. So it just depends on the game and your LCD's native resolution.
 