PS3: The GPU that will be vs the GPU that could have been

Carl B

'Visualizer,' 'Pixel Engine,' 'Graphics Synthesizer 3': these were all terms back in the day for possible components and/or implementations of Sony's graphics system in PS3. Fillrate in the tens of Gigapixels. Software rendering. Ray-casting based; possibly raytracing as well. Co-operation with Cell. Tile-based aspects. Things were looking weird, exciting, and confusing, because several different patents led people to reach their own conclusions.

But what I want to know is: what happened?

Why the RSX but not the 'Visualizer?'

Now, obviously it just didn't work out - I'm looking for people's thoughts on what they think the reasons for that were.

Too much die space needed relative to what they hoped, i.e. too few Cells in PS3?

Window to develop the tech just wasn't big enough?

Advantage seen in traditional rasterizer/hardware?

Anyway, just some food for thought with the dawn of the new console era upon us. :) I myself am fascinated by what might have happened behind the scenes over at Sony/Toshiba for it to fall apart like it (obviously) did.
 
I think it's mainly that the moment another device can push out more FLOPS--by any measurement--Sony starts drooling uncontrollably. ;)
 
I simply think it's another example of Sony PR writing a check that its ass couldn't cash. I'm actually surprised that they went with Nvidia. These Japanese companies tend to have a lot of pride and want to do their own thing. They learned a lesson from PS2's woefully under-powered hardware.
 
Megadrive1988 said:
short answer:

SGI technology > Sony/Toshiba technology

8)

Alright Megadrive, but I'm not letting you get away that easy! ;)

Can you expound in terms of my questions?

Or are you going the route that NVidia can bring a more robust technology to the table? Just trying to understand your answer beyond the context of a one-liner. :p
 
wco81 said:
What is the fillrate of the RSX?

Someone mentioned the Xenos fillrate is 34 gigapixels?

I thought Xenos was 16 Gigapixels with 4x AA??
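
For what it's worth, the usual back-of-the-envelope math behind those Xenos numbers (assuming the commonly quoted 8 ROPs at 500MHz - an assumption, not confirmed specs) works out like this:

```c
/* Rough fillrate math, assuming the commonly quoted Xenos figures
 * (8 ROPs at 500 MHz); these are assumptions, not confirmed specs. */
#include <stdio.h>

int main(void)
{
    double rops     = 8.0;    /* pixels written per clock (assumed) */
    double clock_hz = 500e6;  /* 500 MHz core clock (assumed) */

    double raw_gpix = rops * clock_hz / 1e9;  /* raw pixel fillrate */
    double aa_gsamp = raw_gpix * 4.0;         /* samples/s if 4x MSAA is counted */

    printf("Raw fillrate:     %.1f Gpixels/s\n", raw_gpix);   /* 4.0  */
    printf("Counting 4x MSAA: %.1f Gsamples/s\n", aa_gsamp);  /* 16.0 */
    return 0;
}
```

So the 16 Gigapixel figure would really be ~4 Gpixels/s of raw fill, counted as 16 Gsamples/s once the eDRAM's 'free' 4x MSAA is folded in; where a 34 figure would come from, I have no idea.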
 
wco81 said:
What is the fillrate of the RSX?

Someone mentioned the Xenos fillrate is 34 gigapixels?

No word that I can find - only '100 billion shader ops' and '51 billion dot products.'
 
What's with Sony and their FLOPs obsession by the way?

Genuine question, I've been wondering for a while why they have focused so much attention on the FLOPs figures with so many other things they could get obsessed with...
 
london-boy said:
What's with Sony and their FLOPs obsession by the way?

Genuine question, I've been wondering for a while why they have focused so much attention on the FLOPs figures with so many other things they could get obsessed with...

Because that's what you want when calculating ever more realistic scenes: more FLOPS!!!
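
To put a number on it, here's roughly where a headline Cell FLOPS figure comes from - a sketch assuming the usual 3.2GHz clock and a 4-wide single-precision multiply-add per SPE per cycle, which are the commonly cited figures rather than anything official:

```c
/* Sketch of where a headline FLOPS figure comes from; clock, width and
 * SPE count are the commonly cited numbers, not official specs. */
#include <stdio.h>

int main(void)
{
    double clock_ghz       = 3.2;  /* assumed Cell clock */
    double flops_per_cycle = 8.0;  /* 4-wide single-precision multiply-add = 8 FLOPs/cycle */
    int    spes            = 7;    /* SPEs said to be enabled in PS3 */

    double per_spe = clock_ghz * flops_per_cycle;  /* ~25.6 GFLOPS per SPE */
    printf("Per SPE: %.1f GFLOPS\n", per_spe);
    printf("%d SPEs: %.1f GFLOPS (before counting the PPE)\n",
           spes, per_spe * spes);                  /* ~179 GFLOPS */
    return 0;
}
```

Peak numbers like that only show up if every unit is doing a multiply-add every single cycle, which is exactly why they make for better PR than for predicting game performance.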
 
Designing a powerful CPU that can compete with Intel, AMD, etc. can cost billions. Designing a powerful GPU that can out-function and out-perform ATI and Nvidia can cost a lot too. Just ask PowerVR, Matrox, BitBoys, SiS, Trident, S3, etc.
 
Pozer said:
Designing a powerful CPU that can compete with Intel, AMD, etc. can cost billions. Designing a powerful GPU that can out-function and out-perform ATI and Nvidia can cost a lot too. Just ask PowerVR, Matrox, BitBoys, SiS, Trident, S3, etc.

The tech to be used in the chip was in many ways very similar to the Cell itself - or so it seemed. I don't disagree with the idea that chip design can be expensive, but the process was clearly underway already when they decided to abandon it. It's those possible reasons for abandonment mid-development that got me thinking.
 
london-boy said:
What's with Sony and their FLOPs obsession by the way?

Genuine question, I've been wondering for a while why they have focused so much attention on the FLOPs figures with so many other things they could get obsessed with...
It's not Sony per se, but their PR department. The PS3 specs sheet looks eerily like a cut-and-pasted Xbox 360 specs sheet with omissions where it would appear to the layman that the PS3 was at a disadvantage (e.g. polygon throughput and number of shaders).

I mean, without this spec sheet, Sony wouldn't have much of a Power Point presentation at their presser. Microsoft is clearly at a disadvantage since they served Sony up a floater that they could manipulate to their advantage.
 
xbdestroya said:
It's those possible reasons for abandonment mid-development that got me thinking.

A combination of underestimating the rapid progress of mainstream GPU performance, their own solution not meeting estimates (and possibly a bunch of licensing issues - ATI and NVIDIA have a lot of patents between them).

Cheers
Gubbi
 
Gubbi said:
xbdestroya said:
It's those possible reasons for abandonment mid-development that got me thinking.

A combination of underestimating the rapid progress of mainstream GPU performance, their own solution not meeting estimates (and possibly a bunch of licensing issues - ATI and NVIDIA have a lot of patents between them).

Cheers
Gubbi

Thanks Gubbi - this is along the lines of what I was looking for. By the way, what are your thoughts on software rendering vs hardware rendering in general in terms of viability in future solutions?
 
Sony mentioned several times that partnering with nVidia meant they inherited the whole tools and development infrastructure built around nVidia.

Maybe that was as important as the silicon design itself.

Now, that will attract PC developers, but the genre the PC is best noted for is FPS games. Sony could definitely use help in FPS, but other than Halo, have FPS games been blockbusters on consoles?
 
I think a key point is software tools. With Sony wanting to go with OpenGL, nVIDIA could offer a package solution. That's been a LONG time in development and they're tools devs already know, plus nVIDIA offers a graphics system already common in high-end workstations, which is another market Sony want to leverage Cell in.

That and the realisation a 4-way BBE wasn't going to work, as they wouldn't make 65nm in time. The choice was 2 Cells or Cell+GPU, and the GPU would provide a better graphical solution than a Cell.

If they HAD gone with Cell, maybe it would be real-time raytracing, but we'd be in a PS2-type situation when it came to writing anything. Without the tools the development would be too uphill, the games wouldn't materialize, and XB360 would walk away with first place.
 
wco81 said:
Sony mentioned several times that partnering with nVidia meant they inherited the whole tools and development infrastructure built around nVidia.

Maybe that was as important as the silicon design itself.

I like this - great point! With MS at the time talking up how easy the 360 would be to program for, maybe Sony decided to move away from their 'shoot the moon' strategy and get something a little more understood and conventional for their devs.

EDIT: Good point also Shifty - that's why I threw in the 'die-size' aspect of things, not to mention the toolsets, which are something I didn't consider but should have.
 
london-boy said:
What's with Sony and their FLOPs obsession by the way?

Genuine question, I've been wondering for a while why they have focused so much attention on the FLOPs figures with so many other things they could get obsessed with...

Smart marketing. Sony has a clear advantage in FLOPs, so they are taking the PR opportunity to win mindshare.

It has been so effective that there is an active post on the forum where someone even asked, "What tasks can even be done with a general processor?" To me that is shocking--but it clearly demonstrates the mindshare Sony has won with their specs. Fast FP units are nice and will indeed accelerate certain tasks (vertex processing, procedural synthesis type tasks, probably physics and collision detection), but looking at past consoles and the PC market, general processing cores have done quite well for games. Obviously they are not perfect for all types of games and tasks, but they are quite flexible, and with some tweaking they can improve in their FP performance. The question is: will the single PPC core be enough to handle all the tasks the SPEs are not well designed for? Who has made the best tradeoffs?

Sony with one PPC core and 7 extremely fast vector units? or

MS with 3 PPC cores with beefed up VMX units?

Sony is betting most games will benefit most from more FP performance, while MS believes general processing performance is still quite important. Not that either sucks in either department (both quite excellent designs), but they do have different philosophies toward where game development is going.

The question is who has made the right gambles, imo. Both companies are being faced with the issues and hurdles of getting multithreaded apps to work well with games. Sony has gone with a streaming processor design that has individual *physical* FP-centric units to deal with each thread. MS has gone a more traditional route with 3 general processing cores with a bump in FP performance.

MS developers need to find a way to get enough FP performance to keep up, while Sony developers need to find ways to get the SPEs to do tasks generally tasked to a general processing core.

Based on developer feedback from a number of sources, it really does seem that properly multithreading applications is a HUGE hurdle, so it may be years before we see developers begin to tap the power of these systems. With cross-platforming and both systems having PPC cores, I get the feeling a lot of initial 3rd party software will be very limited in how it takes advantage of the designs. Games are not your typical streaming tasks, but I am probably more excited about what CELL can do for physics. Considering the relatively poor performance Intel and AMD have shown with their multicore chips, I wonder how well the XeCPU will perform in a closed-box scenario. I know that x86 software so far really has not been tailored well for multithreading, but I guess I have some doubts about how well typical processors perform in the real world--I guess it depends how well developers can thread their apps.
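
For anyone wondering what the 'streaming FP work' being discussed actually looks like, here's a toy sketch - plain C, not actual SPE or VMX code - of the kind of loop (a vertex transform) that maps naturally onto Cell's vector units, as opposed to the branchy, pointer-chasing game logic that still wants a general-purpose core:

```c
/* Toy illustration of a streaming, FP-heavy workload (vertex transform).
 * Plain C sketch for discussion purposes, not real SPE or VMX code. */
typedef struct { float x, y, z, w; } vec4;

/* Transform n vertices by a 4x4 column-major matrix m. */
void transform_vertices(const float m[16], const vec4 *in, vec4 *out, int n)
{
    for (int i = 0; i < n; i++) {
        /* 16 multiply-adds per vertex, no branches, flat sequential memory
         * access: exactly the shape of work wide FP units are built for. */
        out[i].x = m[0]*in[i].x + m[4]*in[i].y + m[8]*in[i].z  + m[12]*in[i].w;
        out[i].y = m[1]*in[i].x + m[5]*in[i].y + m[9]*in[i].z  + m[13]*in[i].w;
        out[i].z = m[2]*in[i].x + m[6]*in[i].y + m[10]*in[i].z + m[14]*in[i].w;
        out[i].w = m[3]*in[i].x + m[7]*in[i].y + m[11]*in[i].z + m[15]*in[i].w;
    }
}
```

It's all multiply-adds over flat arrays with no branching, which is the case Sony's design is betting on; AI, scripting and general game logic don't decompose this neatly, and that's where the single PPE vs three cores question comes in.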
 
To take the tool-set theory a little farther, I do recall the rumor being that NVidia approached Sony, and not the other way around - and it would have been during the time when Microsoft began touting its unsurpassed ease of development. I could really believe the development advantage alone (how difficult is software rendering?) was a key determiner in the decision, and that it was one of NVidia's primary selling points as well.
 