Why did Sony use a G70 RSX instead of something better? *spawn

eastmen

Legend
Supporter
I really wonder what Cell + a GeForce 8-series GPU could have done. That $500 budget could have gone to a Cell + GeForce 8 and 512MB of DDR alongside the 256MB of RDRAM. It could have been an amazing console if they hadn't gone with that weird GPU and forced Blu-ray into the console.
 
I'm also sure that if Nvidia had been given more time and freedom in the design, it would have been better. Maybe some kind of hybrid G80 tech could have happened, like Xenos.
 
Last edited:
I'm also sure that if Nvidia had been given more time and freedom cash in the design, it would have been better. Maybe some kind of hybrid G80 tech could have happened, like Xenos.

There, I fixed it. :p OK, that's a little tongue in cheek.

It was a minimal-effort product due to limited time to implement, as well as NV not getting paid as much as they would have demanded for anything more. While actual G80-based products didn't release until basically the same time as the PS3, had Sony been willing to pay as much as NV would have demanded for using that generation of GPU core, then NV would have been more than happy to put a G80-based GPU core in the PS3. Similar to how AMD was more than happy to use what could arguably be called RDNA2-based GPU cores (they share features of RDNA1 and RDNA2, so it would be correct to call them either RDNA1-based or RDNA2-based) in consoles prior to them appearing in an official AMD consumer product.

Doing that, however, is costly, especially as it would likely have required NV to divert resources from a consumer launch of G80 in order to have it integrated into the PS3 in time for launch. And whatever NV would have demanded would have been incredibly excessive for a consumer console that hoped to eventually make a profit.

So basically, not a practical solution for Sony, as they already knew the PS3 was going to be late, far over budget, and would already be generating a significant per-console loss for multiple years. Spending big on having a 3rd-party GPU integrated into the PS3 at that point was likely a bridge too far for whoever was in charge. By that point Ken Kutaragi was likely facing increased scrutiny and opposition within Sony due to Japanese corporate politics, and he would have had to tread more carefully within the company so that they would at least allow him to save face by not immediately removing him either prior to or at the launch of the PS3. As it was, while he was removed as president, he was allowed to stay on for about half a year. In terms of Japanese corporate culture, he was allowed to stay that long in order to save face, due to how well he'd served the company in the past, but his leadership during the development of the PS3 meant that they had to remove him from the company.

Regards,
SB
 
RE: RSX

The Nvidia PS3 deal was announced in December 2004, half a year before the console's unveiling in May 2005, with launch initially targeted at fall 2005 or spring 2006. Even to an outsider it is obvious there wasn't time to do any real custom work off Nvidia's roadmap, even had Nvidia been willing to do so (and history suggests they are rather unwilling to do that for licensees).

Another matter: if I recall, in 2005 Nvidia was a little busy spreading FUD about the unified shader concept so as not to look behind ATI's Xenos and the (delayed) R600, so it's not like they had something ready.

If we are discussing alternate history, with Sony and Nvidia doing real custom work starting years earlier and aiming for a late-2006 launch, there should have been the possibility of at least some custom 8600, let's say. Cell + a little G80 + a unified pool of maybe even 768MB of GDDR3 would have been cute. I wonder how Cell would have worked with GDDR3 instead of XDR. Were there any technical merits of XDR memory that better suited the Cell architecture? Against that, a shared bus would perhaps have stolen Cell's bandwidth in such a scenario.
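To put rough numbers on that bandwidth question, here's a back-of-the-envelope sketch in Python. The figures are the commonly cited PS3 ones (Cell's XDR as two 32-bit channels at 3.2Gbit/s per pin, RSX's GDDR3 as a 128-bit bus at 1.3Gbit/s per pin), and the unified-GDDR3 scenario is purely hypothetical:

```python
# Back-of-the-envelope peak bandwidth comparison (what-if sketch).
# Figures are the commonly cited PS3 numbers, not official specs.

def peak_bw_gb_s(bus_bits: int, gbit_per_pin: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_bits * gbit_per_pin / 8

xdr = peak_bw_gb_s(64, 3.2)     # Cell <-> XDR:  25.6 GB/s, dedicated
gddr3 = peak_bw_gb_s(128, 1.3)  # RSX <-> GDDR3: 20.8 GB/s, dedicated

# In a hypothetical unified 768MB GDDR3 pool, Cell and the GPU share one
# bus, so every CPU access eats into the GPU's bandwidth budget.
print(f"XDR (actual PS3):       {xdr:.1f} GB/s for Cell alone")
print(f"GDDR3 (actual PS3):     {gddr3:.1f} GB/s for RSX alone")
print(f"Unified pool (what-if): {gddr3:.1f} GB/s shared by both chips")
```

At a 128-bit width, at least, the shared-bus worry looks justified: the unified pool would give both chips combined less bandwidth than Cell alone got from XDR, though a wider (and costlier) bus would narrow the gap.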
 

Tom's Hardware, in a 2006 article on G80, states that Nvidia had spent 4 years on it, so that puts its start around 2002.

So by the time Sony started on, or had some idea of, what the PS3 was going to be, Nvidia should have been well into the G80 design phase and thus would have had the time to work with Sony.

So time was not the reason. I feel it was likely marketing: did Nvidia really want to market this awesome new architecture for PC and also have it in a console too?

I also vaguely remember that Cell needed XDR for its bandwidth and latency.
 
It's not as if Nvidia did not have other products in the pipeline at the time. Nvidia GPUs had tight launch schedules and were released every year, so time was likely a major factor in the roadmap with regard to development resources. The Geforce 5800 Ultra launched in March 2003, the Geforce 6800 Ultra in April 2004, the Geforce 7800 GTX in June 2005 and the Geforce 8800 in November 2006.
 

And yet they managed it with OG Xbox....
 
It's not as if Nvidia did not have other products in the pipeline at the time. Nvidia GPUs had tight launch schedules and were released every year, so time was likely a major factor in the roadmap with regard to development resources. The Geforce 5800 Ultra launched in March 2003, the Geforce 6800 Ultra in April 2004, the Geforce 7800 GTX in June 2005 and the Geforce 8800 in November 2006.

Not to mention, the first console-sized G8x variant didn't arrive until spring 2007, on the new 80nm process.

There wasn't a pre-existing design to piggyback off, like Sony did with RSX (and MS did with the GF3/GF4 mash-up, NV2A).

Nvidia would have had to move a far greater amount of resources off other products, delaying them, while simultaneously accelerating their roadmap (if that's even possible) to bring a G8x chip into much higher-volume production earlier than they did with G80. And they would have had to do this at very short notice. (Again, Sony claimed to be targeting a spring 2006 launch.)

Even assuming Nvidia could have brought their G8x architecture forward for Sony, and changed G80 to be rather different for the PS3, and done so relatively last minute (let's just pretend), why would you want to hurt your lineup of high-margin, industry-dominating PC GPUs like that?
 
And yet they managed it with OG Xbox....

NV2A was an incremental advance over GF3, which had been released 8 months earlier.

Geforce 4 released 2 or 3 months after NV2A, and was an even more advanced development of Geforce 3 than NV2A was.

G80 was new; there was no GF3-style, already-released chip to piggyback off.

Edit: Sorry Shifty, if you move the convo elsewhere I'll carry on there, otherwise I'll just drop it.
 

And good as all that is, Nvidia still managed it with the OG Xbox, and ATI even managed it with the Xbox 360.

It's likely that Sony couldn't afford it. G80 has a larger memory bus than G70, which was itself halved for the PS3, and even halving G80's bus would still have required more VRAM than RSX had, which would have made an already expensive console even more expensive.
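To put rough numbers on the VRAM point, here's a sketch assuming 32-bit GDDR3 chips at the 512Mbit density typical of 2006-era boards (one chip per 32 bits of bus):

```python
# Why bus width drives minimum VRAM: each 32-bit slice of the bus needs
# its own memory chip. Assumes 512Mbit (64MB) GDDR3 chips, typical of 2006.

CHIP_BUS_BITS = 32
CHIP_MB = 64  # 512Mbit per chip

def min_vram_mb(bus_bits: int) -> int:
    """Smallest VRAM pool you can build at a given bus width."""
    chips = bus_bits // CHIP_BUS_BITS
    return chips * CHIP_MB

print(min_vram_mb(384))  # G80 (8800 GTX): 12 chips -> 768 MB
print(min_vram_mb(192))  # halved G80-style bus: 6 chips -> 384 MB
print(min_vram_mb(128))  # RSX: 4 chips -> 256 MB
```

So even a halved G80-style bus implies half again as much VRAM as RSX's 256MB, plus the board area and routing for the wider bus.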

Power and thermals would also have played a part in the choice, and could possibly have required the PS3 console itself to be redesigned to accommodate a larger cooler.

There are so many factors that could have been the reason why G80 wasn't used, other than 'there just wasn't enough time for Nvidia'.
 
Maybe G70 was easier to implement than G80?

Sony also didn't have a lot of time to switch from dual Cells to Nvidia.

Btw, was ATI in an exclusive contract with Microsoft?


Anyway, an "Iwata Asks" kind of article from Sony about the PS3 would be amazing.

Dang, I miss Iwata-san and his Iwata Asks articles.
 
Xenos was said to be based on an R400 PC project that had been shelved. It would have been on 130nm at the time? It was 90nm in the 360.
 
Last edited:
And good as all that is, Nvidia still managed it with the OG Xbox, and ATI even managed it with the Xbox 360.

NV20 was a pre-existing chip, as was Xenon's basis (I believe it was R500, not R400, my dude!). Probably ~2+ years of design and integration time. Sony didn't really have this.

It's likely that Sony couldn't afford it. G80 has a larger memory bus than G70, which was itself halved for the PS3, and even halving G80's bus would still have required more VRAM than RSX had, which would have made an already expensive console even more expensive.

Cost is a really big factor with massive chips, that's true. It took Nvidia months and a process shrink to make a smaller G8x for the mainstream. The first smaller, more affordable G8x devices didn't arrive until well into 2007.

Power and thermals would also have played a part in the choice, and could possibly have required the PS3 console itself to be redesigned to accommodate a larger cooler.

There are so many factors that could have been the reason why G80 wasn't used, other than 'there just wasn't enough time for Nvidia'.

Time was absolutely the defining factor, but you are correct in saying there are other factors that would be applicable too. Features vs die area has always been a thing (every good thing in your life has an expensive overhead).

Truth be told, I honestly believe Sony got the best GPU for the PS3 that they could under the circumstances, and it was second only to Xenos. And while I was dismissive of the PS3 initially due to its inefficient silicon mass, I was wrong. In the right hands it's a BANGER, every bit as much as the X360.
 
NV20 was a pre-existing chip, as was Xenon's basis (I believe it was R500, not R400, my dude!). Probably ~2+ years of design and integration time. Sony didn't really have this.

There was an aborted experimental R400 project that was replaced by a more straightforward continuation of R3x0 named R420. That initial R400 project seemed to have been the basis for Xenos.
 
Last edited:
Did Nvidia ever really do a custom solution for anyone?


I was just posing the question: the Xbox 360 launched in 2005 at $300-$400, but Sony launched in 2006 with $500-$600 as its price point. One of the major reasons for that price increase was Blu-ray. Would Sony have been better off spending that additional $100-$200 (maybe even more) on actually better silicon?

Would simply increasing RAM to 256/512 or even 512/512 have been enough to give visually better-looking games than the 360? Maybe they could have put in a larger 7000 series?
 

There wasn't really a larger 7000-series GPU.

The G70 in the PS3 did have its memory bus halved to 128-bit and its ROP count dropped from 16 to 8, which had quite an impact on performance.
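For a rough sense of what those cuts cost, here's a sketch using commonly cited clocks (treat the exact figures as assumptions; RSX's shipping clocks in particular were never officially detailed):

```python
# Rough RSX vs. GeForce 7800 GTX comparison using commonly cited clocks.

def mem_bw_gb_s(bus_bits: int, effective_mhz: int) -> float:
    """Peak local memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz / 1000

def pixel_fill_gpix_s(rops: int, core_mhz: int) -> float:
    """Peak pixel fill rate in Gpixels/s."""
    return rops * core_mhz / 1000

# 7800 GTX: 256-bit GDDR3 at 1200MHz effective, 16 ROPs at 430MHz
print(mem_bw_gb_s(256, 1200), pixel_fill_gpix_s(16, 430))  # 38.4 GB/s, ~6.9 Gpix/s

# RSX: 128-bit GDDR3 at 1300MHz effective, 8 ROPs at ~500MHz
print(mem_bw_gb_s(128, 1300), pixel_fill_gpix_s(8, 500))   # 20.8 GB/s, 4.0 Gpix/s
```

Roughly half the local bandwidth and about 40% less peak fill rate than the PC part, before counting RSX's extra path to XDR over FlexIO.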

People have recently found a way to overclock the RSX core and VRAM in the PS3 via custom firmware, and it actually shows really good performance scaling in the comparison videos on YouTube.
 
I always interpreted G70's use in the PS3 as a last-minute decision after certain limitations with Cell in GPU workloads came to light. The final PS3 was kind of a Frankenstein's monster of fixes to solve problems as they arose: a traditional GPU substituting for a second Cell or a Cell-based IC for graphics, the addition of the GDDR3 memory to actually make use of the proper GPU, FlexIO, etc.

G80 would've been too big and power-hungry, obviously, and the 8600 core kind of unambitious, while G70 was definitely further along and ready for earlier integration.
 
Has anyone from within Sony ever talked about the Cell-based GPU, the issues it presented, and what specifically caused it to be cancelled?
 