PS3 hardware design choices - good or bad? *spawn

DoctorFouad

Newcomer
Me too...

& probably pretty much every studio that's got a scalable job-scheduler-based engine already built to support PS3 (i.e. probably most of them nowadays)

Easy-to-program doesn't come into it when you've already invested in the technology to leverage a specific architectural hardware design, however esoteric it may be.

It would just be a case of recompiling against a new platform SDK, fixing up your HAL layers for HID, command buffer generation, shader patching, system info & config interrogation etc., and switching your #define in the job scheduler from MAX_NUM_CORES = 5 to 127 or something...
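A minimal sketch of the kind of switch being described (platform macro and function names here are hypothetical, not from any real SDK):

```c
/* Hypothetical platform switch: the scheduler only needs to know how many
 * worker contexts it can spawn; the job code itself never changes. */
#if defined(PLATFORM_PS3)
  #define MAX_NUM_CORES 5     /* SPUs available to game code */
#elif defined(PLATFORM_NEXTGEN)
  #define MAX_NUM_CORES 127
#else
  #define MAX_NUM_CORES 4     /* generic fallback for this sketch */
#endif

/* Work is split across MAX_NUM_CORES at submission time, so widening
 * the machine changes the split, not the jobs. Ceiling division. */
static int jobs_per_core(int num_jobs)
{
    return (num_jobs + MAX_NUM_CORES - 1) / MAX_NUM_CORES;
}
```

With the scaling isolated behind one constant, the same job code targets a 5-SPU machine or a many-core successor.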

The idea of using SPEs to do graphics jobs instead of specialized GPU hardware is really not a very good idea, as has been proven time and time again in almost every multiplatform PS3/Xbox 360 game.

For graphics I would take a more powerful GPU any time over SPEs.

For the physics engine, I don't know if modern GPGPUs can do a better job than the SPEs.

For the AI engine, I don't know if a modern OoO general-purpose CPU can do a better job than the SPEs.

I think the SPEs in the PS3 would have made perfect sense if the PS3 had had a powerful GPU à la Xbox 360. If that had happened, the SPEs would have been used for more complex AI and physics. That would have been very impressive indeed.

But that was a historical opportunity lost by Sony. For next-gen hardware, things have changed a lot....
 
The idea of using SPEs to do graphics jobs instead of specialized GPU hardware is really not a very good idea, as has been proven time and time again in almost every multiplatform PS3/Xbox 360 game.

How do you come to this conclusion at all? If anything without CELL, PS3 games would look a LOT worse. Sure you could say "remove CELL and put a more powerful GPU into the console"... but that's a moot point anyhow.

On the other hand, PS3 exclusives which are very much designed around using CELL for GPU tasks excel at graphics. Now I am not saying they are the best looking games, mind you, but GOW3, Uncharted 2 and 3, as well as others, do get most "best graphics" awards in this console cycle.

And... "time and again"... wow. Just wow. The further we go along this generation, the more indestinguishable the games become. Just look at Max Payne 3 now, and compare the progression of all Rockstar games released in the past.
 
How do you come to this conclusion at all? If anything without CELL, PS3 games would look a LOT worse. Sure you could say "remove CELL and put a more powerful GPU into the console"... but that's a moot point anyhow.

The assumption being, of course, that Cell is a net advantage over every other option. I don't think that's actually clear.
 
How do you come to this conclusion at all? If anything without CELL, PS3 games would look a LOT worse. Sure you could say "remove CELL and put a more powerful GPU into the console"... but that's a moot point anyhow.

Without CELL, PS3 games would look worse? Really? Are you sure that without CELL Sony would have used the same RSX GPU and video memory?
The Xbox 360 doesn't have CELL, and it was released a whole year ahead of the PS3 in the US and almost a year and a half ahead in Europe, yet more than 90% of multiplatform games look and perform better on Xbox 360, the console without CELL.

My point is that using a more powerful GPU for graphics is simply more efficient and makes creating a graphics engine easier than using a more powerful programmable CPU for graphics.

On the other hand, PS3 exclusives which are very much designed around using CELL for GPU tasks excel at graphics. Now I am not saying they are the best looking games, mind you, but GOW3, Uncharted 2 and 3, as well as others, do get most "best graphics" awards in this console cycle.

I am not going to enter into any kind of debate around the question: are the Uncharted games possible on the Xbox 360? Are the GOW games possible? Killzone?

But just consider this possibility: if Naughty Dog created an exclusive game for the Xbox 360, wouldn't they tailor their graphics engine to what the Xbox 360 does best? Wouldn't they use, for example, higher-res alpha buffers compared to what we see now in their PS3 games? The fire effects? The explosions? Just think about it....


And... "time and again"... wow. Just wow. The further we go along this generation, the more indestinguishable the games become. Just look at Max Payne 3 now, and compare the progression of all Rockstar games released in the past.

That's the point: BECAUSE of CELL, it took developers years just to achieve multiplatform games parity.....

I like the CELL processor, but not for graphics. I would have loved it if CELL had been used for more complex, impressive AI routines and physics. Unfortunately the RSX's deficiencies didn't allow CELL to shine in those respects....

Honestly, I understand Sony's decision not to go with CELL for the PS4; that would have been suicidal.
 
So... your argument that "CELL is bad at GPU tasks" is, in fact, "a different console would've been different"... at least that's how I read it.

That's the point: BECAUSE of CELL, it took developers years just to achieve multiplatform games parity.....

No... without CELL they couldn't have done it. And they couldn't have done Uncharted 3's water simulation. And neither GOW3's MLAA... and mind you that nearly all early games were done for the 360 first and then shoved onto the PS3. Of course that didn't work out well, especially if they used the eDRAM to a major degree. On the other hand, with games made for the PS3 first and ported to the 360 later, we never saw these problems. In fact, some of those games (not many) are MASSIVELY better on PS3, too. Like Final Fantasy 13.

Of course, my assumption is that RSX is still in use if CELL wasn't used. You are saying that if Sony hadn't used CELL, they would've used a different GPU...

And your "distaste" for CELL... I am not sure why that is, though. I mean, yes, it is more complicated and overall Sony could've done better... but that never was CELLs fault (in PS3). It was (nearly) ALWAYS RSXs fault.

EDIT: Thinking about it... your original point was that using CELL for GPU tasks was bad... in some ways you are right, no doubt, but in others I must disagree.
 
And your "distaste" for CELL... I am not sure why that is, though. I mean, yes, it is more complicated and overall Sony could've done better... but that never was CELLs fault (in PS3). It was (nearly) ALWAYS RSXs fault.

But if Sony had spent the millions they put into developing Cell on driving Nvidia to provide a custom GPU solution, rather than an outdated off-the-shelf model, then the PS3 could have ended up with Xenon for the CPU and a GPU akin to G80. Even with only 75% of the final capability of G80, it would have walked all over the 360.

So I'd say that on balance Cell was a pretty bad choice.
 
But if Sony had spent the millions they put into developing Cell on driving Nvidia to provide a custom GPU solution, rather than an outdated off-the-shelf model, then the PS3 could have ended up with Xenon for the CPU and a GPU akin to G80. Even with only 75% of the final capability of G80, it would have walked all over the 360.

So I'd say that on balance Cell was a pretty bad choice.

Dude, the 7800 GTX wasn't even released at the time Sony first showed off the PS3, so that statement is silly imo.

And as a die-hard PC gamer, in my eyes PS3 exclusives do walk over 360 ones.
 
Dude, the 7800 GTX wasn't even released at the time Sony first showed off the PS3, so that statement is silly imo.

And as a die-hard PC gamer, in my eyes PS3 exclusives do walk over 360 ones.

Somehow, MS managed to get Nvidia to produce a GPU for the Xbox with capabilities beyond what was available in their PC products at the time, and followed that up by having ATI design a GPU for the 360 that had capabilities beyond what was available in their PC products at the time. I don't think it's silly to suggest that Sony had the option to do the same for the PS3.
 
Dude, the 7800 GTX wasn't even released at the time Sony first showed off the PS3, so that statement is silly imo.

I'm not sure what that has to do with anything. Xenos was basically an early iteration of R600, and it launched 2 whole years earlier. The PS3 and G80 launched on virtually the same day. Clearly Sony could have got something much closer to G80 than Microsoft got to R600.

And as a die-hard PC gamer, in my eyes PS3 exclusives do walk over 360 ones.

Then we disagree, but I don't care enough to argue about it.
 
Dude, the 7800 GTX wasn't even released at the time Sony first showed off the PS3, so that statement is silly imo.
And the machines they used to "show off PS3" before the 7800 GTX was released were really just PCs with 6800s IIRC, or possibly Cell + 6800s with some bridge between them. They had nothing to do with RSX, however.
The early "game demos" were also pure CGI, not realtime-rendered game demos.
And as a die-hard PC gamer, in my eyes PS3 exclusives do walk over 360 ones.

The exclusive games are always a matter of taste.
 
From the graphics presentations I have been reading in the last few days by Sony first parties (and others), a lot of them say that the SPEs are not as fast as a GPU (for graphics tasks) most of the time, but they use them to take load off the RSX, the SPEs being extra computing resources, and thus get a net gain in total.
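That net-gain argument is just parallel-pipeline arithmetic; a toy model with invented throughput numbers (nothing here comes from actual PS3 figures):

```c
/* Toy model, made-up numbers: the GPU clears `work` units at gpu_rate
 * units/ms, the SPUs at a lower spu_rate. Offloading a fraction of the
 * frame to the SPUs shortens the critical path even though the SPUs
 * are slower at the task, because the two now run concurrently. */
static double frame_ms(double work, double gpu_rate, double spu_rate,
                       double offload_fraction)
{
    double gpu_ms = work * (1.0 - offload_fraction) / gpu_rate;
    double spu_ms = work * offload_fraction / spu_rate;
    /* the frame ends when the slower of the two finishes */
    return gpu_ms > spu_ms ? gpu_ms : spu_ms;
}
```

For example, with SPUs half as fast as the GPU at the task, offloading a quarter of the frame still cuts the modeled frame time from 10 ms to 7.5 ms: slower extra hardware is still extra hardware.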


Dude, the 7800 GTX wasn't even released at the time Sony first showed off the PS3, so that statement is silly imo.

And as a die-hard PC gamer, in my eyes PS3 exclusives do walk over 360 ones.

The Xenos was based off tech that only made its way to PCs (in a more powerful form, mind you) a few years later.

they couldn't have done Uncharted 3's water simulation. And neither GOW3's MLAA... and mind you that nearly all early games were done for the 360 first and then shoved onto the PS3. Of course that didn't work out well, especially if they used the eDRAM to a major degree. On the other hand, with games made for the PS3 first and ported to the 360 later, we never saw these problems. In fact, some of those games (not many) are MASSIVELY better on PS3, too. Like Final Fantasy 13.

Uncharted 3-like water is in Hydrophobia, a multiplatform game.

MLAA and FXAA have been done in a lot of 360 games.

Then we have both BF3 and Crysis 2 looking about the same on both platforms, even though both make heavy use of the SPEs on the PS3.
 
Have you SEEN the Uncharted 3 water (and the sand tech, for that matter)? It's not just the graphics, but the physics simulation behind it. I've only played Hydrophobia for a short amount of time, so I can only comment in a limited capacity. But I haven't seen ANYTHING that amounts to what Uncharted 3 has shown... water affecting the wavy motion of the ship sitting on top of it... (though I am watching a Hydrophobia video now and must say, it does look good... not sure how good it looks on consoles, though)

http://gdcvault.com/play/1015309/Water-Technology-of

But... I guess it's also a taste thing, so it doesn't really make sense to argue about it.

My general point is... I don't think CELL was the problem in the PS3. Ever. Put a good GPU into the PS3, say a downsized 8800 GTS (to make it as small as RSX) or the 360's GPU, and the PS3 would've walked all over any 360 game, as it would've enabled CELL to use its processing power for gameplay/physics stuff instead of helping RSX render. Not sure how many developers (esp. third party) would've used it, but it would've enabled them to do it.
 
But if Sony had spent the millions they put into developing Cell on driving Nvidia to provide a custom GPU solution, rather than an outdated off-the-shelf model, then the PS3 could have ended up with Xenon for the CPU and a GPU akin to G80. Even with only 75% of the final capability of G80, it would have walked all over the 360.

So I'd say that on balance Cell was a pretty bad choice.

I totally agree. When people talk about the PS3 development and R&D process, they forget that Ken Kutaragi's initial plan was to get rid of dedicated graphics hardware (a GPU) altogether and just use another CELL for graphics (which would have ended a lot worse for Sony).

It is really simple: Ken Kutaragi had an idea, an original one: spend a lot of money to create a very powerful parallelized CPU capable of doing GPU tasks even better than GPUs, while at the same time keeping the benefits of a CPU: flexibility, since you can also program complex physics and AI on it.... But history has proven Kutaragi's idea a bad one; it caused a lot of trouble for the PlayStation business.
 
Hm... I am not sure his idea was bad, though. I mean, what are GPUs today? And where are they going? They are becoming a LOT more like CPUs. Kutaragi just went the other way around (making the CPU more GPU-like, instead of the GPU more CPU-like).
 
KK's original idea was to have something similar to two Larrabees, but time has shown that two Larrabees are no match for a dedicated CPU and GPU. With die sizes being equal, a CPU + GPU combination will always win when it comes to performance.
 
The PS3 and G80 launched on virtually the same day.
Because the PS3 was delayed thanks to BRD issues. If the PS3 had been designed for release at the end of 2006, they could have chosen a G80 or a different GPU. Assuming, of course, that when they started designing the PS3 in 2005 they could predict what GPUs would be out, when, and what they'd be capable of.

Or putting it another way, the XB360 does as well as it does because it has a great GPU with advanced features that ATi was developing anyway. What if ATi's US tech had been another year or two out, and MS's choice had been the same as Sony's? Would an XB360 with an RSX and eDRAM be as competitive as the XB360 with Xenos? And what custom part could Sony get for spending a few hundred million on a custom GPU instead of Cell? They couldn't fabricate a larger chip and make it affordable, and nVidia couldn't have matched ATi's unified shaders because they weren't up to speed with US (nVidia PR claimed ATi's US wasn't any good). Just throwing money at that problem wouldn't have worked. Well, if Sony had commissioned a GPU with unified shaders, maybe they could have got something, but its performance would have been an unknown and could have gone horribly wrong. And they'd have had to make that choice around 2003, maybe, three years into having already started next-gen development and having chosen to invest in Cell.

It's all very well and good evaluating choices with hindsight, but you need to view the engineers' choices from where they were to get context. The future of graphics wasn't certain back in ~2000 when Sony started thinking about their next console. They chose programmable performance, and the ability of GPUs to provide that wasn't known. They chose a flexible architecture that they hoped would have benefits elsewhere too. Certainly Cell was a good choice to power BluRay. So in 2001 they committed to the Cell project. That was decided upon back then. How were they then supposed to launch with a G80 in 2006?? During development they evaluated a number of GPU options and settled upon RSX, based on the most powerful architecture available at the time (only trumped by ATi bringing out a new technological advance). In hindsight, Sony could have gone with more GPU... well, actually they couldn't, because they were at the maximum chip size with RSX. A larger GPU would have meant terrible yields. So they could have gone with a more advanced technology than the outdated 7xxx series... only they couldn't, because they had the most advanced they could get. They could have commissioned a custom GPU using magical new technologies with unknown returns. Maybe, to address the issue of programmable performance, they could have asked nVidia to include some versatile, custom vector processing units, building on the ideas of VU0 and VU1 in the Emotion Engine...

Cell didn't work out as well as hoped, but you can't really begrudge the design choice. It was a very flexible processor that did prove very useful and effective in some cases, and muddled through in the worst cases. RSX appears something of a rush job, but Sony didn't go cheap there, and they at least had something that worked, instead of going with some crazy idea from Toshiba (whatever that might have been).

What is interesting is how those early decisions affected Sony's choices later, and it points to the idea of waiting and using off-the-shelf components. Why try to predict what computing needs will be 5 years in advance and try to design something for that, when instead you can know exactly what computing needs will be one year in advance and just buy a suitable set of processors to chuck in your box? Hell, we're seeing that now with the threat of streaming games. Just waiting, and then either deciding to throw in some CPU and GPU, or going with a streaming platform, gives flexibility. Maybe Sony have seen OnLive, decided it's the future, and their next console will just be a stop-gap measure to ride out a few years until everyone can get OnLive?
 
And what custom part could Sony get for spending a few hundred million on a custom GPU instead of Cell? They couldn't fabricate a larger chip and make it affordable,

Overall the PS3 was packing more combined CPU/GPU transistors than the 360 (because of Cell). However, it still arguably struggles to keep up today despite that advantage, so something must be wrong somewhere.

If they had gone for a smaller CPU, say Xenon, and spent those extra transistors on a bigger GPU, say a customised R580, then the overall transistor count would have been almost identical, but I'm betting Xenon + customised R580 in a console would have had no issue keeping up with and exceeding Xenon + Xenos.

I'm not saying Sony made bad decisions given the information available at the time, but in hindsight I think there were better, technically feasible options that could have been used had they made different decisions earlier on.
 
If they had gone for a smaller CPU, say Xenon, and spent those extra transistors on a bigger GPU, say a customised R580, then the overall transistor count would have been almost identical, but I'm betting Xenon + customised R580 in a console would have had no issue keeping up with and exceeding Xenon + Xenos.

I dunno, R580 was a big chip (about twice the size of Xenos, and 50% bigger than RSX even without all the additional functionality), used lots of power to hit X1900 speeds (far too much for a console) and needed a 256-bit bus. It just seems too big and hungry any way you swing it.

A variant of the X1950 (RV570) might have been a good fit, but that probably launched too late.

Needing a last minute shoe-in, I'm not sure Sony could really do anything other than launch with RSX even with the benefit of hindsight.
 
I dunno, R580 was a big chip (about twice the size of Xenos, and 50% bigger than RSX even without all the additional functionality),

It wasn't quite that big. Only 384M transistors, compared with 304M for RSX and 337M for Xenos + daughter die. R580 + Xenon comes in at the same transistor count as Cell + RSX.
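The budget comparison checks out against the figures quoted above, plus commonly cited CPU counts (the Cell and Xenon numbers below are my own assumptions, not from this thread, and should be treated as approximate):

```c
/* Transistor counts in millions. RSX/Xenos/R580 are the figures quoted
 * above; Cell (~234M) and Xenon (~165M) are commonly cited specs. */
#define RSX_M    304
#define XENOS_M  337   /* parent + eDRAM daughter die */
#define R580_M   384
#define CELL_M   234   /* assumption, not from the thread */
#define XENON_M  165   /* assumption, not from the thread */

/* Cell + RSX = 538M vs Xenon + R580 = 549M: within roughly 2% of each
 * other, i.e. essentially the same transistor budget. */
```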

used lots of power to hit X1900 speeds (far too much for a console) and needed a 256-bit bus. It just seems too big and hungry any way you swing it.

Yes, agreed. This is where the customisation would have come in. The clock speed could maybe have been dropped to 500MHz, the ROPs halved, and the memory bus reduced to 128-bit or maybe 192-bit, which would cost them, but balanced against the money saved on Cell R&D it may have worked out cheaper overall. Regardless, though, memory bandwidth would be the biggest challenge.
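The bandwidth concern is easy to quantify, since peak bandwidth scales linearly with bus width: halving the bus halves the bandwidth. A sketch, using illustrative RSX-era GDDR3 clocks rather than any confirmed figures:

```c
/* Peak bandwidth in GB/s for a double-data-rate bus: bus width in
 * bytes times the effective (double-pumped) data rate in GT/s. */
static double bandwidth_gbs(int bus_bits, double effective_gts)
{
    return (bus_bits / 8.0) * effective_gts;
}
/* At an illustrative 1.4 GT/s effective: 256-bit gives ~44.8 GB/s,
 * 128-bit gives ~22.4 GB/s. */
```

So cutting an R580-class part from 256-bit to 128-bit would surrender half its memory bandwidth, which is why the bus reduction is the painful part of this hypothetical.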

Needing a last minute shoe-in, I'm not sure Sony could really do anything other than launch with RSX even with the benefit of hindsight.

I totally agree; I don't think they could have made a better decision at that point. My argument is more that the decision process from the start, going with Cell over a more traditional CPU and a bigger GPU, was the wrong one.

There's no point in overthinking it, really. Given the transistor advantage the PS3 holds over the Xbox, they could simply have gone for a 4-core Xenon and a 64-shader variant of Xenos and called it a day.
 
The 8800 GTS 320MB consumed literally double the idle power of a 7800 GTX, and 40W more at full load.

Then there is the extra heat. It's all well and good saying Sony could have had some kind of G80 part, and by all means they could have, but please remember that all the G80-derived PC cards that had the same power and thermal envelope as a 7800 GTX were also slower than said 7800 GTX.

Only the 8800 cards were faster, but aside from having to change everything in the machine for the extra power and heat requirements, there was also the extra cost and availability: trying to get enough cards ready for the PC market and millions more for Sony would have been a big ask of Nvidia.

From the graphics I see the PS3 doing, I would say it's been easily ahead of the 360 for the last couple of years.
 