Do PC Gaming bottleneck trends carry over to consoles also?

RobertR1

From PC land we have learned that as resolution goes up, the CPU matters less and the GPU becomes the key component. This is increasingly common with newer games: CPU scaling graphs often show minimal to no performance difference when benchmarking various CPU models and clock speeds.

How does this differ in console land (if at all)?

I'm talking strictly in reference to graphics.
 
You can optimize more for the CPU and therefore offload tasks it's good at from the GPU onto it. This frees the GPU to do more.

Not so easy on a PC, because you can't always count on what CPU will be in the machine, but easy on a console.
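
A minimal sketch of the kind of task that gets pinned to the CPU on a fixed box (hypothetical names, C++ only for illustration): coarse frustum culling run on a CPU worker thread, so the GPU is only handed the draw calls that survive.

```cpp
// Sketch, not a real engine: on a fixed console the spare CPU budget is
// known up front, so a task like coarse frustum culling can be pinned to
// the CPU, leaving the GPU free to spend its time on shading.
#include <cstddef>
#include <vector>

struct Sphere  { float x, y, z, radius; };
struct Plane   { float nx, ny, nz, d; };   // plane equation: n.p + d = 0
struct Frustum { Plane planes[6]; };

// Runs on a CPU worker thread; only the survivors are submitted to the GPU.
std::vector<std::size_t> CullOnCpu(const Frustum& f,
                                   const std::vector<Sphere>& bounds)
{
    std::vector<std::size_t> visible;
    visible.reserve(bounds.size());
    for (std::size_t i = 0; i < bounds.size(); ++i) {
        const Sphere& s = bounds[i];
        bool inside = true;
        for (const Plane& p : f.planes) {
            // Signed distance from the sphere centre to the plane.
            float dist = p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d;
            if (dist < -s.radius) { inside = false; break; }
        }
        if (inside) visible.push_back(i);
    }
    return visible;   // draw list handed to the GPU submission code
}
```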
 
On a console, you have a known combination of GPU, CPU, and resolution, so you can balance the workload across the available resources, knowing that the balance will be the same for everyone on the platform.
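
A minimal sketch of what that balance can look like in practice (illustrative numbers, not from this thread): with the hardware and output resolution fixed, the per-frame budget can be split once and treated as a hard constant during development.

```cpp
// Sketch with made-up numbers: a fixed platform means a fixed frame budget
// split between CPU and GPU, tuned once and valid for every unit shipped.
#include <cstdio>

struct FrameBudget {
    float cpu_ms;   // simulation, animation, culling, draw-call setup
    float gpu_ms;   // geometry + shading at the single target resolution
};

// 30 fps -> 33.3 ms per frame, partitioned once for this exact box.
constexpr FrameBudget kConsoleBudget = { 14.0f, 19.3f };

// During development, going over budget is a bug to fix rather than a
// "recommended spec" to bump, because the hardware never changes.
void ReportFrame(float cpu_ms, float gpu_ms)
{
    if (cpu_ms > kConsoleBudget.cpu_ms)
        std::printf("CPU over budget by %.2f ms\n", cpu_ms - kConsoleBudget.cpu_ms);
    if (gpu_ms > kConsoleBudget.gpu_ms)
        std::printf("GPU over budget by %.2f ms\n", gpu_ms - kConsoleBudget.gpu_ms);
}
```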
 
First, you can certainly get some dramatic performance increases in PC land when changing CPUs, especially at the resolutions that matter for consoles. At 1280x720, even with 4x FSAA, even "last generation" top cards (R580, G71) were often CPU-bottlenecked in many games (and that's without taking SLI or the 8800 into account). If anything, the whole "HD era" paradigm of this generation of consoles is highly amusing to the seasoned PC gamer. I don't think I've run below 1024x768, with some level of FSAA, since I got my 9700 Pro.

In addition, console and PC games differ a lot in what they offer, and the CPU is often the weakest link for PC developers when targeting a given configuration. GPU-wise, they can get away with disabling some effects, lowering texture resolution, and letting gamers tweak IQ settings such as resolution, AA, and AF. But CPU work doesn't scale that way: a certain amount has to be done regardless (AI, physics, general game code). As such, PC developers tend to take a conservative approach to the CPU budget.
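
A minimal sketch of that asymmetry (hypothetical settings structs): the GPU side is exposed as user-adjustable options, while the CPU-side workload is the same constant on every machine, so it has to fit the slowest CPU the game targets.

```cpp
// Sketch only: GPU cost scales through options menus, CPU cost does not.
struct GpuOptions {          // adjustable per user / per detected GPU
    int  render_width  = 1280;
    int  render_height = 720;
    int  msaa_samples  = 4;
    int  anisotropy    = 8;
    bool soft_shadows  = true;
};

struct CpuWorkload {         // identical on every machine that runs the game
    int ai_agents        = 64;
    int physics_substeps = 2;
    int pathfinding_hz   = 10;
};
```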

Being fixed boxes, consoles don't have these problems. Also, consoles tend to have less CPU overhead than PCs, since there are nearly no background tasks to speak of.
 