Were previous console jumps only 8-10x improvements?

The PSone to PS2 jump saw a massive increase in both transistor count and clock rate, going from roughly 30 MHz to 300 MHz. The PS2 to PS3 jump went from 300 MHz to roughly 3 GHz, along with another large jump in transistors.
While PSone games had around 3k polygons per frame, PS2 games had on the order of 100k polygons per frame, and PS3 games have millions of polygons per frame, IIRC.

It seems to me like the jumps were larger than 10x.
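Putting rough numbers on that: here's a quick sketch using the ballpark figures above. The clock speeds and per-frame polygon counts are approximations rather than official specs, and "millions" is taken as roughly 2M.

Code:
# Rough generation-over-generation ratios, using the ballpark figures
# quoted above (approximate clocks and per-frame polygon counts).
specs = {
    "PS1": {"clock_mhz": 33,   "polys_per_frame": 3_000},
    "PS2": {"clock_mhz": 294,  "polys_per_frame": 100_000},
    "PS3": {"clock_mhz": 3200, "polys_per_frame": 2_000_000},  # "millions" -> ~2M assumed
}

gens = list(specs)
for prev, curr in zip(gens, gens[1:]):
    clock_ratio = specs[curr]["clock_mhz"] / specs[prev]["clock_mhz"]
    poly_ratio = specs[curr]["polys_per_frame"] / specs[prev]["polys_per_frame"]
    print(f"{prev} -> {curr}: clock ~{clock_ratio:.0f}x, polys/frame ~{poly_ratio:.0f}x")

On those numbers the clock alone is roughly 9-11x each generation, and the polygon jump is well past 10x.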
 
I am not a hardware expert, but I think the previous improvements were greater than 8-10x. The issue we now face is diminishing returns: to get the same improvement in quality that we got going from the PS2 to the PS3, we would need a system with much more powerful hardware than the PS4. Personally, I think the PS4's RAM is possibly enough, but we would have needed a more powerful GPU.

However, there will be big jumps in areas other than just realism. With the PS4, worlds will be much bigger, there will be more characters on screen at once, and so on. If you don't care about photorealism (and Crysis 3-level games are good enough for you), the PS4 is a dream machine.
 
The PS1 and PS2 generations each lasted about five years. For things that follow Moore's Law, such as CPUs, that's roughly 10x progress. However, I think GPUs were progressing well beyond Moore's Law around the PS2 days.

http://csgillespie.wordpress.com/2011/01/25/cpu-and-gpu-trends-over-time/
[Graph from the link above: CPU vs. GPU FLOPS over time, log scale]


Pay attention to the slopes on this graph (note that the vertical axis is on a logarithmic scale). CPU FLOPS were roughly following Moore's Law, but GPU FLOPS could grow by more than 100x in 5 years. So the PS2-to-PS3 generational leap was much better than a 10x jump.
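To put numbers on those slopes, here's a small sketch of the compounding involved. The doubling times are assumptions chosen to illustrate the graph: roughly 18-24 months for CPU FLOPS, versus something like 9 months for GPU FLOPS in that era, which is about what a ">100x in 5 years" slope implies.

Code:
def improvement(years, doubling_months):
    # Improvement factor after `years` of doubling every `doubling_months` months.
    return 2 ** (years * 12 / doubling_months)

# Assumed doubling times, chosen to match the slopes discussed above.
for label, months in [("CPU, Moore's Law pace (~18 months)", 18),
                      ("CPU, slower pace (~24 months)", 24),
                      ("GPU FLOPS, PS2-era pace (~9 months)", 9)]:
    print(f"{label}: ~{improvement(5, months):.0f}x over 5 years")

That works out to roughly 6-10x for CPUs over a five-year generation, but around 100x for GPUs at the PS2-era pace.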

A somewhat related matter is that GPU processing power no longer progresses as quickly as it did before 2009. That means that, while the PS4 will be relatively less powerful than PCs at launch, PC GPU power may not pull ahead over the following years, in ratio terms, by as much as it did in the PS3 era.
 
Though FLOPS aren't the be-all and end-all of CPU performance. CPUs have focused more on that area lately, but their IPC improvements and the move to multi-core oughtn't be ignored, IMO.
 
I think this jump is less noticeable because you only make the jump to 3D, or to HD, once. You do face diminishing returns, because as you start to approach realism, more and more effort is spent accurately modeling in real time things like physics that were previously pre-computed, and few people will ever notice the difference.

As such, what people fear the most is "good enough".
 
Law of Diminishing Returns:

PS2 Graphics was "good enough" for 50% of the customers
PS3 Graphics was "good enough" for 80% of the customers
PS4 Graphics will be "good enough" for 90% of the customers
PS5 Graphics will be "good enough" for 95% of the customers
PS6 Graphics will be "good enough" for 96% of the customers
 
If you look at chip progression you will see that these things changed:

1) Transistors got smaller
2) Power consumption got higher
3) Die size got larger
4) Frequency got higher

However, eventually these things slowed down. Frequencies were limited by more than just power consumption and temperature. Die-size increases became less and less financially practical due to the limitations of manufacturing. Power consumption plateaued due to limits in packaging and cooling, and a desire not to cost people too much on their electricity bills or cause problems with their home wiring (for all practical purposes, you'd eventually hit a limit where a standard 15 A wall socket couldn't power the thing).

The only one still improving at a fairly steady rate is the transistor shrink, although that too has started showing signs of slowing down across the industry, and this will only become more apparent.

So it's not that they wouldn't like to improve things as much as they have in the past; they're hitting limits in their ability to do so.
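The frequency/power part of that is easy to see from the usual CMOS dynamic-power relation, roughly P ∝ C·V²·f, with the catch that higher clocks generally also need higher voltage. A toy sketch; the "20% more clock needs about 10% more voltage" figure is just an illustrative assumption:

Code:
# Toy model of CMOS dynamic power: P ~ C * V^2 * f.
def relative_power(freq_scale, volt_scale):
    return freq_scale * volt_scale ** 2

# Illustrative assumption: a 20% clock bump needs ~10% more voltage.
bump = relative_power(1.2, 1.1) / relative_power(1.0, 1.0)
print(f"+20% clock -> ~{bump:.2f}x the dynamic power")  # ~1.45x

So a modest clock increase costs disproportionately more power, which is why the frequency curve flattened even before the wall-socket limit came into play.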
 
So it's not that they wouldn't like to improve things as much as they have in the past; they're hitting limits in their ability to do so.

The biggest limit is financial: I'm not sure Microsoft or Sony actually made money this generation if you take the starting point as the launches of the Xbox 360 and PS3 respectively. Sure, the last few years have been profitable, but did they make up for the first 2-3?
 
Of course they made money this generation. They don't sell at an initial loss with no hardware roadmap that allows for recovery. I don't think attachment rate was phenomenally below expectations. MS makes hundreds of millions a year just from Live subscriptions.

Sony may no longer be able to afford selling at a loss but MS could probably pull it off to decent effect.
 
The PSone to PS2 jump saw a massive increase in both transistor count and clock rate, going from roughly 30 MHz to 300 MHz. The PS2 to PS3 jump went from 300 MHz to roughly 3 GHz, along with another large jump in transistors.
While PSone games had around 3k polygons per frame, PS2 games had on the order of 100k polygons per frame, and PS3 games have millions of polygons per frame, IIRC.

It seems to me like the jumps were larger than 10x.

Your metrics seem to be jumping around a bit. Are you talking about clock speed? FLOPS? Triangles per second?
 
Compare franchises across time.

Clocks, transistors, the amount of DRAM, or any number of other hardware metrics don't do nearly as good a job of showing how far we've come as actual images do.
 
Just count the number of transistors in each system and see the improvements. More transistors = more performance.
 
I think that when the first generation of '3D' consoles was released, the tech was so new, and everyone was coming from 2D graphics, that the following generation was bound to see huge improvements from that experience alone, never mind all the advances in tech and hardware. But PS3 to PS4 is a 16x increase in memory and a massive increase in parallelism, just to name a few features. I definitely agree that graphics do not scale linearly when it comes to human perception, though, and game designers had better think hard about spending the new power wisely and not just on enhanced graphics. People need new experiences, and graphics are only there to support immersion. I'm still hoping physics will add some cool features, but online interactions are going to be just as important again, I bet.
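For reference, the 16x memory figure follows from the commonly cited specs (not quoted in this thread): 256 MB of XDR plus 256 MB of GDDR3 in the PS3, versus 8 GB of unified GDDR5 in the PS4.

Code:
ps3_ram_mb = 256 + 256   # 256 MB XDR main memory + 256 MB GDDR3 video memory (commonly cited)
ps4_ram_mb = 8 * 1024    # 8 GB of unified GDDR5 (commonly cited)
print(f"PS3 -> PS4 memory: {ps4_ram_mb // ps3_ram_mb}x")  # 16x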
 
I definitely agree that graphics do not scale linearly when it comes to human perception, though, and game designers had better think hard about spending the new power wisely and not just on enhanced graphics. I'm still hoping physics will add some cool features, but online interactions are going to be just as important again, I bet.

Thing is, the intangibles we're likely to see with the PS4 and 720 are hard to measure. There's no such thing as "8x better artificial intelligence!", and "20x better physics!" can't be seen in screenshots; plus, it's just what we expect. I mean, the reaction you'd expect from the average person when you tell them "But it's realistic now!!" is "What, were you faking it before?"
 
Your metrics seem to be jumping around a bit. Are you talking about clock speed? FLOPS? Triangles per second?

I meant that both transistor count and clock speed showed massive jumps, resulting in great graphical leaps; as an example, I gave the polygons-per-frame figures. So I'm talking about overall performance.
 
Well, the transistor increase from say the PS3 to PS4 GPU is greater than the transistor increase from Xbox GPU to PS3 or 360 GPU.

9.3X vs. roughly 5X.
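A quick sanity check of those ratios, using approximate transistor counts: the Xbox and PS3 GPU figures are commonly cited numbers, while the PS4 GPU figure is an assumption (treating it as roughly Pitcairn-class silicon), not an official spec.

Code:
# Approximate transistor counts, in millions. The PS4 GPU figure is an
# assumption (roughly Pitcairn-class), not an official number.
xbox_nv2a = 60
ps3_rsx   = 300
ps4_gpu   = 2_800

print(f"Xbox GPU -> PS3 GPU: ~{ps3_rsx / xbox_nv2a:.1f}x")  # ~5x
print(f"PS3 GPU -> PS4 GPU:  ~{ps4_gpu / ps3_rsx:.1f}x")    # ~9.3x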
 
Well, the transistor increase from say the PS3 to PS4 GPU is greater than the transistor increase from Xbox GPU to PS3 or 360 GPU.

9.3X vs. roughly 5X.

Interesting, but I still think that, at least for the CPUs, the combined transistor + clock-speed performance increase from PS2 to PS3 is not matched in the latest jump.
 
Interesting, but I still think that, at least for the CPUs, the combined transistor + clock-speed performance increase from PS2 to PS3 is not matched in the latest jump.

There's a limit to how high you can run clock speeds, period. It's not a sharp wall, though; your power goes up like crazy as you try to approach it. That wall was hit pretty much around the time the Xbox 360 came out (see Pentium 4). The mentality behind Xenon's and Cell's designs somewhat echoes that of the Pentium 4.

You could make the argument that the game consoles in particular clocked higher than they should have - one has to wonder how much die space it would have cost to have half the clock speed, far fewer pipeline stages, somewhat wider execution, and twice as much SIMD width.

Smaller transistors were being run faster and faster and were using more power. These days the trend has settled on using more transistors to get the same workload done with less power, both to back away from the peak-power wall and to allow scaling down into smaller form-factor devices.
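A back-of-the-envelope version of that "wider but slower" trade-off, using the same P ∝ C·V²·f relation as in the earlier sketch: doubling the SIMD width roughly doubles the switched capacitance, but halving the clock lets the voltage drop. All the numbers here are illustrative assumptions, not measurements.

Code:
# Toy comparison: narrow-and-fast vs. wide-and-slow at the same peak throughput.
# Dynamic power ~ C * V^2 * f; the ~25% voltage drop at half clock is assumed.
def power(cap, volt, freq):
    return cap * volt ** 2 * freq

def peak_throughput(width, freq):
    return width * freq

narrow_fast = {"cap": 1.0, "volt": 1.00, "freq": 1.0, "width": 1}
wide_slow   = {"cap": 2.0, "volt": 0.75, "freq": 0.5, "width": 2}

for name, d in [("narrow/fast", narrow_fast), ("wide/slow", wide_slow)]:
    print(f"{name}: throughput {peak_throughput(d['width'], d['freq']):.1f}, "
          f"relative power {power(d['cap'], d['volt'], d['freq']):.2f}")

Same peak throughput, roughly half the dynamic power - the cost being the extra die area for the wider units.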
 
My estimate of Jaguar's transistor count is about 520-600 million. Cell's is 241 million; however, Cell runs twice as fast, so it's getting more out of those transistors. In the end I guess it could be a wash, with Cell being better at some things and Jaguar better at others. Here's hoping they can get Jaguar's clock speed up to 2 GHz.
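Taking the figures in that post at face value (the 520-600 million Jaguar estimate is the poster's own, and "twice as fast" maps to roughly 3.2 GHz versus 1.6 GHz), a crude transistors-times-clock metric does come out close to a wash:

Code:
# Crude "transistors x clock" metric using the figures quoted above.
# It ignores IPC, SIMD width, the memory system, etc., so it is only a
# ballpark illustration of the "could be a wash" point.
cell        = 241e6 * 3.2e9   # Cell: 241M transistors at ~3.2 GHz
jaguar_low  = 520e6 * 1.6e9   # Jaguar, low transistor estimate, ~1.6 GHz
jaguar_high = 600e6 * 1.6e9   # Jaguar, high transistor estimate, ~1.6 GHz

print(f"Jaguar / Cell (low estimate):  ~{jaguar_low / cell:.2f}x")   # ~1.08x
print(f"Jaguar / Cell (high estimate): ~{jaguar_high / cell:.2f}x")  # ~1.24x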
 
1) Transistors got smaller
2) Power consumption got higher
3) Die size got larger
4) Frequency got higher

CPUs hit the high point of their power budget sooner, too.
GPUs were coming from a lower baseline (in 1999/2000, when the PS2 came out, you could still buy a GPU without a heatsink) and ended up as 200 W monsters, which is where they slow down on the curve.
 