PS3 hardware design choices - good or bad? *spawn

Sony already took huge risks and spent a lot of money developing a very complex, expensive CPU and optical drive, so why not finish the job and spend even more money to develop a great GPU too?
They already were losing (and have lost) billions thanks to PS3. Something like a $200 loss on each console sold in the beginning. You want them to have lost more??

And that ignores the impossibility of what you ask. You can't design and manufacture en masse a new cutting edge GPU in <6 months. Your demands are unreasonable.

So yes, it was Sony's fault for not providing developers with an adequate GPU so they could use Cell for more interesting tasks than just helping the GPU.
Why does the PS3 have to have graphics comparable to whatever console MS put out? It doesn't, at least not in principle, like Wii. With any hardware, it's down to the devs to choose how to use it. They could have produced simpler-looking games that played better or did something new. They decided to release the same games on XB360 and PS3, and so to use PS3's CPU to support its graphics.

Or putting it another way, if MS didn't release a console and PS3 was working with lesser graphics than it is now but richer game experiences, no-one would be complaining. It's only the retrospective comparison with XB360 and how the industry went that makes Cell look like a bad choice. That's the problem with choice - you never know until after you made it whether you made the right one or not. There was nothing fundamentally wrong with Sony's thought processes or strategy. It just didn't pan out for them.
 
Why does the PS3 have to have graphics comparable to whatever console MS put out?

...

Or putting it another way, if MS didn't release a console and PS3 was working with lesser graphics than it is now but richer game experiences, no-one would be complaining...

I'll start out by saying I think the PS3 design was great for what it was. Because Cell was inherently flexible, it could make up for whatever shortcomings the chosen GPU had, and it had the potential to showcase bold new gameworld physics and animation. Though only a handful of games made full use of Cell for those purposes, they still showcase the capabilities of the concept and design.

The two things I'd say Sony screwed up on were overvaluing the BRD market and not developing their dev tools and libraries in time for early adoption.

Another nitpick might be their online offering, but by being free it helped offset any disadvantages compared to XBL.

Overall the hardware design choices were solid and forward looking. As a console should be, especially in this day and age of rapid tech progression.

_______

Having said that, I think you're being a bit naive here.

Remember, not only was PS3 a year (or more in some regions) later than XB360, it was also significantly more expensive, mostly due to non-gaming hardware decisions (BR).

If anything, I'd say customers were expecting not only graphics parity, but graphics superiority AND richer gaming experiences (better physics, animation).

Take a trip down memory lane on some old posts here and you can see that quite clearly.

But even without the richer gaming experiences, I'd say the vast majority at least expected graphical superiority, and if better/richer gaming experiences came with it, then all the better.

The race between MS and Sony was too tight and fierce at the time to let an obviously less powerful machine hit the market.

These days, it seems they are racing each other to see how quickly either one of them can reach the bottom of the spec race ...
 
I would say a lot of this debate centers around the fact that the Xbox 360 was undercooked in terms of readiness for release and the PS3 was overcooked. They both ought to have been released about the same time and they each had their own paradigms, customish GPU vs custom CPU.

Personally, the only major mistake IMO with the PS3 design, given the way things went, is that they went with an 8 SPE design cut down to 7 when in reality it probably would have made better sense to use a 6 SPE design without the deactivated SPE.

I think Kutaragi said something about using 8 SPUs because 8 was a beautiful number in computing. I hope he was joking and that they had better reasons than that. :D

Going with your 6 SPU idea, it would have saved leakage on the deactivated SPU (assuming they do what AMD did until recently), saved power on an active SPU, and probably saved power on part of the ring bus. Within the same TDP they may have been able to increase the clock speed to 3.3 or 3.4 GHz, which would have meant early games relying mostly on the PPU would have had a boost (and perhaps some even now). Might have helped in some of those DF face-offs.
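To put rough numbers on that, here's a minimal back-of-envelope sketch. The SPE power share and the f·V² scaling model are purely illustrative assumptions on my part, not measured Cell figures:

```python
# Back-of-envelope: clock headroom from a 6 SPE Cell within the same TDP.
# All figures here are illustrative guesses, not measured numbers.

BASE_CLOCK_GHZ = 3.2   # shipping Cell clock
SPE_POWER_SHARE = 0.5  # assumed: half the chip's power goes to the 8 SPEs

# Dropping from 8 physical SPEs (7 active + 1 fused off but still leaking)
# to 6 saves roughly 2/8 of the SPE power budget.
saved_fraction = SPE_POWER_SHARE * 2 / 8  # 12.5% of total chip power

# If voltage has to scale roughly with frequency, dynamic power goes as
# f * V^2 ~ f^3, so the frequency headroom is the cube root of the savings.
new_clock = BASE_CLOCK_GHZ * (1 + saved_fraction) ** (1 / 3)
print(f"~{new_clock:.2f} GHz in the same TDP")  # prints ~3.33 GHz
```

Which lands right in that 3.3-3.4 GHz range, at least under those assumptions.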
 
Nvidia thought R580 was 352 mm^2 (but maybe they were exaggerating for PR purposes):

http://www.nvnews.net/vbulletin/showthread.php?t=65835

Having looked into it further, there seems to be some confusion over the size of R580's die. Some sites are saying 315 mm^2 and others 352 mm^2, so I'm not sure which to believe!

G71 swelled from just under 200 mm^2 to 240 mm^2 as RSX on the PS3, and that's despite losing 8 ROPs and half its video memory bus to Cell. R580 for the PS3 could have been creeping up on 360 or 400 mm^2 (depending on which R580 figure you use) with another 40 mm^2 added on.
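Spelling that arithmetic out, with the ~40 mm^2 adder just being the observed G71-to-RSX growth applied unchanged (an assumption, since an R580-based part might have needed more or less PS3-specific logic):

```python
# Die-size estimate for a hypothetical R580-based PS3 GPU, using the
# figures above. The adder is the observed G71 -> RSX growth
# (FlexIO link to Cell plus other PS3-specific logic).
G71_MM2, RSX_MM2 = 200, 240
ps3_adder = RSX_MM2 - G71_MM2  # ~40 mm^2

for r580_mm2 in (315, 352):  # the two disputed R580 die sizes
    print(f"R580 at {r580_mm2} mm^2 -> ~{r580_mm2 + ps3_adder} mm^2 as a PS3 part")
# ~355 and ~392 mm^2: "creeping up on 360 or 400"
```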

Was RSX based on G71 though or a die shrunk G70? The internals were modified between G70 (302m transistors) and G71 (278m transistors) to reduce pipeline length.

So if RSX was simply a die-shrunk G70 and not G71 then its die size pretty much equals an unchanged G70 - minus some transistors for the ROPs and plus some for the PS3-specific stuff.

I'm pretty sure Xenos wasn't based on R600, but on an older architecture that never made its way into the PC space as it was deemed too complex or too difficult to manufacture or something like that. ATI showed it off to MS when they were looking for something for the 360, and it was adapted and updated for 90nm and Microsoft's next console. It later fed into the development of ATI's unified shader stuff in the PC space.

I remember it being something like this:
R400 (canned) -> R500 (Xenos) -> R600
R420 (X800) -> R520 (X1800) -> R580 (X1900) -> R600

It's not that MS got R600 early, it's that they spent years working with ATI on getting an alternative line of technology to R520 and R580 because it far better suited their needs. I doubt G80 would have offered superior performance per Watt to what MS got, and even if it had, Sony would have to have been working closely with Nvidia for a long time to get it.

Yes, that's my understanding too, so fair point. It certainly wouldn't have been possible for NV to provide a primitive/cut-back version of G80 to a console 2 years before it launched, which was effectively what ATI did with Xenos by leveraging the R400 design and carrying it forward into the PC space as R600.

However, G80 was much closer to PS3's launch, so I'm betting that if Sony had put the money in (money saved from not using Cell) then it could have had a customised solution that leveraged a lot of the advances found in G80. And you may be correct that such a GPU might not have had the performance/watt of Xenos even then. However, the PS3 has a higher power budget than the 360 anyway, so as long as that custom GPU had higher overall performance, but still fit within the Cell+RSX PS3's power envelope when used with a lower-power CPU like Xenon, that's all that really matters.
 
I'm still waiting for someone to provide time lines that show that G80 was mature enough to be considered as a viable option to be used in PS3 at any point of development.

R600 being in Xenos is completely irrelevant as ATI are not Nvidia and have different time scales, so using the fact that the 360 has an R600 derivative as an argument to back up the claim that G80 could have been done for PS3 is laughable at best.

PS3 was shown off for the first time in May 2005 at E3, which means that specs were final. It was announced to be using a GPU based on Nvidia's new 7800GTX, dev kits were running GeForce 6800s in SLI, and this was later changed to RSX.

Now, in September 2006, rumours were starting to surface of Nvidia's G80 and what specs it might have; nothing was official and Nvidia wasn't giving anything away. The 8800GTX then released in November of that year.

Doing research on G80 shows that it took Nvidia 4 years to design, as it was a completely new architecture for them. So it was released in November 2006, meaning that Nvidia could have started work on it as early as 2002.

So personally, looking at the time frame, G80 or any derivative of it would not have been an option as Nvidia were still designing the damn thing, maybe even still designing it in 2005/early 2006 depending on how much bug fixing they did, etc.

There would have been no way in hell Nvidia could have carried on working on the desktop version of G80 while sparing engineers to spend a good 12+ months making a custom chip for Sony, then spending even longer testing and bug fixing, then making enough to meet Sony's demand...

G80 simply was not ready... Could Sony have got a custom chip based on it? Perhaps, but IMO the architecture was still in the design phase and simply was not ready.

ATI IMO rushed R600 as they needed to get a GPU out the door to combat G80 and to get some sales, which resulted in ATI getting a complete ass whooping in the process.
 
I'm still waiting for someone to provide time lines that show that G80 was mature enough to be considered as a viable option to be used in PS3 at any point of development.
You're somewhat misunderstanding. The theory is that instead of spending many millions on making Cell, Sony could have given that money to nVidia to make a G80-type GPU ready in 2005/6 for PS3, just as ATi managed to get unified shaders (US) for XB360 a year or two before they had the tech in their desktop parts. If that's possible, which I seriously doubt (as you say, RnD is the limiting factor, and just throwing money at a problem doesn't solve it), then PS3 could have been graphically more capable at the same price. Or somesuch.
 
Those kinds of "what if?"s are pretty pointless. What if they bet on GPU, gave nVidia a bunch of money to develop a breakthrough design before they were really ready, and got something as broken as the original GeForce FX? The PS3 would have been in an even worse position. Even with hindsight being 20/20, it's pretty obnoxious revisionism to insist Sony should have made different choices and assume ideal outcomes.
 
I agree. PS3 only looks 'bad' because ATI came up trumps on Xenos. If that had turned out badly, there'd be zero complaints. People are making comparisons now that couldn't have been made during the design phase. Unless you have awesome industrial espionage feeding back intel on what your rivals are doing and what sort of technological performance they are chasing, you can only design your own product to your own spec. PS3 as an all-round product is pretty good. There's a good mix of programmable shader power and pure, programmable performance for devs to do whatever they want with. Ken provided them with a lot of resources. The devs may have preferred less flexible, more easily accessible graphics performance... well, that's leading us to what appear to be changes in the way the consoles are being designed next-gen. And some people are disappointed at the lack of bizarre, custom hardware! So you can't please all the people. ;)
 
You're somewhat misunderstanding. The theory is that instead of spending many millions on making Cell, Sony could have given that money to nVidia to make a G80-type GPU ready in 2005/6 for PS3, just as ATi managed to get unified shaders (US) for XB360 a year or two before they had the tech in their desktop parts. If that's possible, which I seriously doubt (as you say, RnD is the limiting factor, and just throwing money at a problem doesn't solve it), then PS3 could have been graphically more capable at the same price. Or somesuch.

But it's all what-ifs; none of us really know what timescales this applies to. It could be just a simple thing like Nvidia saying no because G80 wasn't ready: not enough engineers, not enough time.

You mention 2005/06, but as none of us know exactly how far Nvidia was with G80 at that time, or any time for that matter, it's just flat out impossible to say whether it could have been possible.

What if Sony had contacted Nvidia, say, 2 years earlier in 2003/2004 for a custom solution; would it have been doable in 2 years? I doubt it, as according to my previous post Nvidia only started G80 in 2002, maybe even early 2003, so by 2003/04 they were possibly still in the heavy design stages and nowhere near close to having anything that could be considered final enough for Sony to use.

If you were Sony, would you have wanted a GPU scraped together based on an unfinished, unproven architecture? I know I wouldn't...

You also have to look at performance: an 8600GTS with 32 US and a core clock that's 125 MHz faster than RSX was beaten by a desktop 7800GTX in pretty much every situation!

You would have needed at least 64 US to have made the move to G80 worthwhile; the 9600 GT was the only card to come with 64 US and that didn't arrive until Feb 2008.

Sony made the best choice going with G70: it was proven, cheap, and available, and offered decent performance. The top-flight G80-based cards were much faster, but all the cards that used G80 and operated at 7800GTX power levels and thermals were a lot slower.

A 256 MB 7800GTX consumes 85 W at full pelt; a 512 MB 9600 GT requires 95 W at full load. The high-end G80 cards all pull over 100 W, with the 8800 Ultra pulling 175 W at max capacity.

Given the time scale, where Nvidia was with G80, and the power and thermal limits, I seriously doubt Nvidia could have offered anything better than G70.

What ATI was doing at the time and their production time scale are irrelevant to what Sony and Nvidia were doing.


Those kinds of "what if?"s are pretty pointless. What if they bet on GPU, gave nVidia a bunch of money to develop a breakthrough design before they were really ready, and got something as broken as the original GeForce FX? The PS3 would have been in an even worse position. Even with hindsight being 20/20, it's pretty obnoxious revisionism to insist Sony should have made different choices and assume ideal outcomes.

Exactly!
 
R600 being in Xenos is completely irrelevant ...

Especially when you consider that it wasn't! R400 (2003/2004 ish, canned) -> R500 (2005, Xbox) -> R600 (2007).

With Nvidia there was no "R400" for them to develop / evolve a unified shader console design from in a relatively short period.

You can't take away from MS that they looked ahead, chose an unproven architecture and did a great job though.

Was RSX based on G71 though or a die shrunk G70? The internals were modified between G70 (302m transistors) and G71 (278m transistors) to reduce pipeline length.

I guess RSX could be based on G70 rather than G71. Given the timescales, and both of them being designed for 90 nm, I'd just assumed Nvidia had done a sister version of G71, but I could easily be wrong.
 
But it's all what-ifs; none of us really know what timescales this applies to. It could be just a simple thing like Nvidia saying no because G80 wasn't ready: not enough engineers, not enough time.
Oh, I agree with you. We in fact know Sony didn't want a "GPU scraped together based on an unfinished, unproven architecture" because they reportedly evaluated at least an option from Toshiba and maybe an internally developed idea, and they seemingly changed at the last minute to a more conventional option.

That doesn't stop some from believing that just throwing money at creating a new GPU could have seen some special, revolutionary chip created in time, better than anything nVidia could (and did) develop in the same time period without Sony prodding them to do better.
 
Oh, I agree with you. We in fact know Sony didn't want a "GPU scraped together based on an unfinished, unproven architecture" because they reportedly evaluated at least an option from Toshiba and maybe an internally developed idea, and they seemingly changed at the last minute to a more conventional option.

That doesn't stop some from believing that just throwing money at creating a new GPU could have seen some special, revolutionary chip created in time, better than anything nVidia could (and did) develop in the same time period without Sony prodding them to do better.

What if Sony had elected to contribute an equivalent amount of engineering and financial resources towards collaborating with Nvidia to produce a GPU instead of collaborating with Toshiba/IBM to produce Cell?
 
I continue to believe that PS3 as a design ended up being less than the sum of its parts. It was a system whose individual components, instead of complementing each other, ended up bottlenecking each other and preventing any one of them from ever really reaching its full potential. OTOH, it's a testament to the quality of those individual components that when developers have had the budget, time, talent and incentive to work around the bottlenecks and leverage the system's strengths, they have been able to achieve the results that they have.

My main criticism remains that several factors unrelated to "what would make the best games machine" informed several of the major decisions during the PS3's design process. I believe that had those peripheral factors not existed, the PS3 would have been a different, better console.
 
What if Sony had elected to contribute an equivalent amount of engineering and financial resources towards collaborating with Nvidia to produce a GPU instead of collaborating with Toshiba/IBM to produce Cell?
It's not just money that advances technology; there's understanding. G80 was only possible with time spent researching how to implement US. Consider, say, 10 billion spent on graphics RnD over the next 10 years at nVidia resulting in a DX17 GPU. If some multibillionaire were to dump 10 billion on nVidia's doorstep tomorrow, would nVidia be able to release that DX17 part in just one year instead? Just buying a load of engineers and laboratories isn't enough. The money spent on RnD is actually buying time and expertise to learn how to make things better. The limiting factor is human intelligence in learning how things work and how to change them for the better.

I'm not a GPU engineer and don't really understand what the limiting factors are (we have got GPU engineers on this board though. Fingers crossed they show up and contribute!), and how quickly ideas can be explored, iterated, and produced. But I see nothing in the history of processors to suggest that something spectacular can always be created just by investing more. A custom part is a variation of existing RnD AFAICS. The best Sony could have done IMO was to commission ATi for a GPU, same as MS, and they'd have got a Xenos-style GPU. Which would have resulted in a system like XB360... not really any better off than PS3, and without the potential to use the tech elsewhere (which never manifested itself for STI). MS certainly got better value for money, but that's not something a business can predict with any certainty.
 
Oh, I agree with you. We in fact know Sony didn't want a "GPU scraped together based on an unfinished, unproven architecture" because they reportedly evaluated at least an option from Toshiba and maybe an internally developed idea, and they seemingly changed at the last minute to a more conventional option.

That doesn't stop some from believing that just throwing money at creating a new GPU could have seen some special, revolutionary chip created in time, better than anything nVidia could (and did) develop in the same time period without Sony prodding them to do better.

But what would they have got for the money? Nvidia were busy with G80 and, looking at the time scale, G80 was not mature enough to be worked on by a separate team for use by Sony. ATI could have done something, but their contract with MS could have caused issues.

Toshiba should have been the last choice as, let's face it, they're hardly GPU giants.

Maybe they could have gone the PS2 route and had a chip similar to the GS but much more beefy?

Also, wasn't RSX downclocked from its original 550 MHz to 500 MHz at the last minute?

And just how much faster is Xenos? I'm a PC gamer and as such don't really keep up to date with the consoles, but I constantly hear that Xenos is the more powerful of the two, but by how much? 5%? 10% more?

I still believe that going for G70 was not that bad of a choice; they could have ended up with a lot worse, and anything remotely better would have cost a fortune or delayed the machine.
 
That's exactly what I'm saying in my other posts in this thread!

Slightly beefier G70 design, maybe some extra shaders+higher clocks.

Some kind of Xenos variant.

Some kind of R580 variant, cut-down clocks and maybe a shader-to-texture ratio of 2:1 instead of 3:1.

Some kind of R570 variant; I loved my X1950 Pro, it rocked!!!

Completely wild design from Toshiba.

Some gimped G80 chip.

Beefed up super version of PS2's GS.

From those options, a variant of either R580 or R570 would have been very interesting IMO; not running US, but they could have been very fast and powerful options.
 
All the debate over whether a G80 or R580 variant could have been possible for PS3 is taking us away from the original point: that PS3 would likely have been more competitive with the 360 had Sony never attempted to use Cell and just gone with a more traditional CPU and large GPU setup.

The simplest way to demonstrate that is to look at the transistor budget of both consoles. PS3 has about 35 million more transistors to play with than the Xbox 360. If they had put the same CPU in there as what the 360 was using (be that Xenon or something else, as Xenon couldn't have existed without Cell) then they'd have had an extra 35m transistors to play with in the GPU, as well as a year's worth of process maturity, which could equate to higher clock speeds. So they could simply have gone to ATI (early in the development process) and got a beefier, faster version of Xenos, which would have resulted in an unambiguously faster console.
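For what it's worth, here's that budget comparison spelled out using the transistor counts commonly quoted at the time; the exact figures vary a bit by source, so treat them as ballpark assumptions:

```python
# Transistor budgets in millions, using commonly quoted figures.
# Counts vary by source, so these are ballpark assumptions.
ps3 = {"Cell": 234, "RSX": 300}
x360 = {"Xenon": 165, "Xenos": 232, "Xenos eDRAM daughter die": 105}

ps3_total = sum(ps3.values())    # ~534M
x360_total = sum(x360.values())  # ~502M
print(f"PS3 advantage: ~{ps3_total - x360_total}M transistors")
# ~32M, i.e. roughly the "about 35 million" figure cited above
```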
 
Slightly beefier G70 design, maybe some extra shaders+higher clocks.
That's what they planned on, but yields weren't high enough to have it in launch quantities. That's why the chip is what it is.
Some kind of Xenos variant.
IIRC, MS owns a lot of IP related to that chip.

IMO the problem isn't whether they could have gotten more bang for their buck, since so much software is cross-platform, but whether they could have gotten similar bang for much less buck.
 
Except, transistor-wise, Sony could have afforded to go with Xenos and Cell, which would have been by far the most interesting and best console. Development-cost-wise I have no idea, and I'm not sure we'll ever really get the figures for what Sony paid Nvidia and Microsoft paid ATI. They could be similar as well. Which just means ATI did a better job.
 