nAo said:
aaaaa00 said:
The original patent had a half-GPU stuck on the end of 2 CELLs that just did rasterization didn't it? Something like a 4 CELL CPU, and a 2 CELL + rasterizer back end GPU.
It was just wishful patenting.
Looking back at 1999, who knew that fixed function, and then programmable, GPUs would take off like they did?
This is not directed at you nAo... while thinking about that question I did some digging on the 1999/2000 timeframe and what the industry was like. Looking at it from a 1999 perspective I would say that Sony was probably at least semi-seriously exploring the BE option. Not setting their hearts on it, mind you, but looking at how things had developed it was definitely an option worth looking at.
In 1997 we got the Riva 128, and in 1998 the Riva 128 ZX. 1998 also brought the Voodoo 2. (I actually had a Riva 128 paired with a Voodoo 2 on a PII 233MHz w/ MMX and 128MB of PC100 memory... boy that was fast!). The TNT came out in late 1998. In spring of 1999 we saw the TNT2, and 3Dfx shipped the Voodoo 3.
In 1999 we get the first GeForce (GeForce256) which basically moved the geometry load (T&L) to the GPU from the CPU. And it was not until the GF3 and DX8 (in early 2001?) that we got our first taste of programmable shaders.
From Sony's perspective, when they first mentioned the PS3 and "CELL" (was that in 1999?), GPUs were still in their infancy. They were only beginning to migrate away from fixed function--so maybe a really fast CPU with a lot of floating point performance would be much more flexible, and would be best for letting developers decide how much to use for graphics versus general game code. In theory it sounds like a great idea at least. And the fact that CELL is a streaming architecture, very similar to how a modern GPU works, would make it the natural way to get a CPU to do GPU-like tasks.
We know that Intel overestimated where chip frequencies would be. I believe we were projected to surpass 8GHz a while back. Chip makers have also hit issues of leakage that have resulted in increased heat and power consumption. While these were to be expected, I am sure that not even Intel planned on a 100W+ CPU.
And back to CELL... the SPEs are called "synergistic processing elements". Right now I believe the SPEs are SIMD, but you could put other stuff there like a scalar unit... or a rasterizer?
So if we had reached 65nm, and clock speeds were up in the 8GHz-10GHz range, we would be looking at ~2 CELLs in the same area as 1, with 2-3x the frequency... that gets us into the TFLOPs range. Add in another pair (instead of the RSX GPU) and that is 2TFLOPs. Interestingly, that is the same claimed total programmable performance of the PS3. Specialize some of the SPEs for raster work... voilà! Broadband Engine!
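Just to sanity-check that back-of-the-envelope math, here is a quick sketch. The assumptions are mine, not from any Sony spec: each SPE is a 4-wide single-precision SIMD unit retiring a fused multiply-add per cycle (8 flops/cycle), and each CELL has 8 SPEs.

```python
# Peak-FLOPS sketch for the hypothetical all-CELL "Broadband Engine".
# Assumed: 8 flops/cycle per SPE (4-wide SIMD * multiply+add), 8 SPEs per CELL.

FLOPS_PER_SPE_PER_CYCLE = 8
SPES_PER_CELL = 8

def cell_gflops(clock_ghz: float, cells: int = 1) -> float:
    """Peak single-precision GFLOPS for `cells` CELL chips at `clock_ghz`."""
    return FLOPS_PER_SPE_PER_CYCLE * SPES_PER_CELL * clock_ghz * cells

print(cell_gflops(3.2))     # shipped PS3 CELL at 3.2GHz -> 204.8 GFLOPS
print(cell_gflops(8.0, 2))  # two CELLs at 8GHz as the "CPU" -> 1024.0 GFLOPS
print(cell_gflops(8.0, 4))  # add another pair in place of RSX -> 2048.0 GFLOPS
```

So under those assumptions, a pair of 8GHz CELLs lands around 1 TFLOPS, and four of them hit the ~2TFLOPs figure the post mentions.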
Now that never happened, and maybe Sony never planned on it happening. It could have been a twinkle in Kutaragi's eye. It could have been a "Plan B if everything lines up... and if it does not line up, our Plan A is a traditional GPU with CELL as the CPU," or vice versa. Maybe Toshiba had a CELL-based GPU in the works alongside a more traditional GPU design, but Sony opted for NV because of better IP, because NV was in a crunch and would give them a great deal, and most importantly because of the tools.
Maybe NV/Sony approached each other way back in 2002 when MS/NV were having their pricing tiff. I surely do not know.
But as SLOW as this industry seems to move at times (Come on DAVE! We want that Xenos article!!), looking back at the last 5 years gives you a lot of perspective. A LOT has changed, and very quickly. We went from less than 1GHz to over 3GHz quite fast... and then hit a wall. We went from very basic GPUs to including T&L, and then to programmable GPUs. All of that happened during the initial planning stages of the PS3.
To move forward to today... think about what the next 5-6 years will bring. I bet Sony and MS will put out quite a few patents of ideas, especially over the next 3 years. Then they will select the best couple of scenarios and begin moving forward with designs (each design probably having more than one variant... e.g., more CPU cores with less cache and faster memory, or fewer CPU cores with more cache and more, slower memory).
Who knows, we may eventually see the PS3 vision in the PS4.
Me on the other hand... I have always been a big fan of GPUs. The first computer I paid for out of my own pocket had a Riva 128 and eventually a Voodoo2. Seeing how the GPU market has blossomed, and how there is a strong current toward more flexibility and programmability, I do not see them going anywhere--especially given how they have pushed memory development and efficiency. So I have always been on the GPU bandwagon I guess... but only because of what my eyes were showing me. What Sony showed of CELL at E3 seemed to indicate it is no weakling... so who knows. If not PS4, maybe PS5!