Why did Sony partner with Nvidia instead of ATI for the PS3?

Acert93

While we know very little about the RSX and Xenos chips (features and real-world performance), the few tantalizing details we have gleaned have me wondering: why did Sony not partner with ATI? What technologies did Nvidia offer that made an offering from ATI less enticing? Was it an issue of technology, cost, strategy, or something else?

I ask this question because two features of Xenos seem, at least on the surface, like "perfect matches" for CELL.

1. Unified Shaders. The benefit is flexibility. The GPU could be treated as a traditional GPU, or an ambitious developer could offload more of the vertex work to the SPEs and use the GPU for pixel shading, without leaving any of the GPU's shader units idle. So games that use the SPEs for heavy physics, AI, and other game-related tasks would still have the power of a next-gen GPU, and developers looking for a lot of eye candy could dedicate the SPEs to vertex work and use the GPU shader units for pixel shading. (See the rough utilization sketch after this list.)

2. eDRAM. The benefit is dedicated bandwidth for the backbuffer, leaving the main memory pool's bandwidth to be utilized by CELL and for the transfer of textures and large meshes. With the ambitious goal of 1080p, anti-aliasing, HDR, and other bandwidth-intensive framebuffer tasks are going to consume a lot of system bandwidth (a rough traffic estimate also follows the list). Trading some logic for guaranteed high levels of AA and HDR would pair very well with 1080p. (This may very well be possible with the RSX without eDRAM, but obviously eDRAM would not hurt... I am also assuming that the eDRAM implementation in Xenos will do what ATI set out to accomplish.)
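To make the idle-unit argument in point 1 concrete, here is a minimal back-of-the-envelope sketch in Python. The unit counts (an 8 vertex / 24 pixel fixed split versus 32 unified units) are purely hypothetical round numbers for illustration, not specs for either chip:

# Back-of-the-envelope model of the "idle shader units" argument.
# All unit counts are hypothetical round numbers, NOT confirmed
# specs for RSX or Xenos.

VERTEX_UNITS = 8
PIXEL_UNITS = 24
UNIFIED_UNITS = VERTEX_UNITS + PIXEL_UNITS  # same silicon budget

def fixed_split_throughput(vertex_fraction):
    """Shader ops/cycle a fixed split sustains when `vertex_fraction`
    of the workload is vertex work; whichever pool runs dry first
    caps the whole pipeline."""
    if vertex_fraction == 0.0:
        return PIXEL_UNITS          # vertex units sit idle
    if vertex_fraction == 1.0:
        return VERTEX_UNITS         # pixel units sit idle
    return min(VERTEX_UNITS / vertex_fraction,
               PIXEL_UNITS / (1.0 - vertex_fraction))

for vf in (0.0, 0.1, 0.25, 0.5):
    fixed = fixed_split_throughput(vf)
    print(f"vertex fraction {vf:.2f}: fixed {fixed:5.1f} ops/cycle, "
          f"unified {UNIFIED_UNITS} ({fixed / UNIFIED_UNITS:.0%} of unified)")

The vertex-fraction-zero row is exactly the scenario above: if the SPEs absorb all the vertex work, a fixed split strands its vertex units while a unified design keeps every unit busy on pixel shading.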
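And a rough feel for the framebuffer-traffic numbers behind point 2. Every figure here (overdraw, sample count, frame rate, buffer formats) is an assumption for illustration, not a published spec for either console:

# Rough framebuffer-traffic estimate for 1080p with AA.  Every figure
# is an assumption for illustration, not a published PS3/Xbox 360 spec.

WIDTH, HEIGHT = 1920, 1080   # 1080p
FPS = 60
COLOR_BYTES = 4              # 32-bit color; FP16 HDR would double this
Z_BYTES = 4                  # 32-bit depth/stencil
OVERDRAW = 2.5               # average shaded fragments per pixel (a guess)
AA_SAMPLES = 4               # 4x multisampling

pixels = WIDTH * HEIGHT
# Per shaded fragment, roughly: Z read + Z write + color write, each
# multiplied by the AA sample count for a multisampled buffer.
bytes_per_frame = pixels * OVERDRAW * AA_SAMPLES * (2 * Z_BYTES + COLOR_BYTES)
print(f"~{bytes_per_frame * FPS / 1e9:.0f} GB/s of framebuffer traffic")

Under these assumptions that works out to roughly 15 GB/s of pure framebuffer traffic, a large bite out of a memory bus in the low tens of GB/s before a single texture or vertex is fetched; that is the traffic eDRAM would move off the shared pool.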

So I ask: What did Nvidia bring to the table that Sony found more enticing than ATI? Any ideas?
 
Re: Why did Sony partner with Nvidia instead of ATI for the PS3?

Acert93 said:
So I ask: What did Nvidia bring to the table that Sony found more enticing than ATI? Any ideas?

A kick ass GPU? A better deal? Nvidia wanting revenge? Toshiba's RS GPU totally blew goats?
 
Let's remember who owns the R500 IP. If I had to guess, I would say that since Sony didn't have time to order a full-custom design, they would have to settle for a PC derivative from either manufacturer. At this point, it could be anything: maybe Nvidia's PC derivative suited their needs better than ATI's; maybe Nvidia, fearing being shut out of the next-gen console market, priced RSX aggressively; maybe ATI simply didn't have enough resources at this point to commit to yet another major project; maybe Sony didn't feel comfortable buying from the same shop as their competitors; maybe some combination of the above. Who knows.
 
It sounds like from ATI they could have gone custom (though there wasn't enough time) or with an R520-based product. The next product after that, the R600, seems to be based off the R400 and won't be out till late 2006 from what I understand, which would be too late.

So they most likely opted for the RSX, since the R520, while it will be very fast, most likely won't have the same features as the Nvidia part.
 
A few random thoughts:

- Nvidia's massive amount of engineering resources in one location, whereas ATI is spread out across two coasts and at least three GPU design centers.
- Nvidia wasn't busy with two other console contracts at the time.
- Nvidia has the closest thing there is to perfect drivers; not actually perfect, but probably an edge over ATI.
- Jen-Hsun and Ken share very common visions (not that Dave Orton of ATI doesn't, but Ken's and Jen-Hsun's ideas seem to mesh almost perfectly).
- Nvidia's proven track record with a console GPU (as much as I prefer ATI over Nvidia, ATI doesn't really have a console GPU under its belt yet; the ArtX Flipper in the GameCube obviously doesn't count).
- The Cg toolset.
 
I think Jawed is probably correct, though I think there is more to it than that. nVidia has great developer relations, especially in the PC space. Sony already has good relationships with console developers, but if they want to advance into Microsoft's turf they need to win over the PC-turned-console developers. nVidia gives them access to those people. Another possibility is that IBM and Sony are somewhat poised to invade the PC space with the Cell processor. They may never actually do it, but I could see nVidia being more willing to cooperate with such a plan than ATI is.

As far as technical reasons go, it's probably a bit of a tough call. Maybe originally they planned to have more memory throughput and didn't care about the eDRAM, or maybe nVidia has something in store that will make it a moot point. Perhaps they are planning on sharing some of the vertex shader work with Cell and didn't want a unified shader architecture. It's pretty tough to know without more info about the RSX.

Nite_Hawk
 
Megadrive1988 said:
Nvidia has the closest thing there is to perfect drivers; not actually perfect, but probably an edge over ATI.

Uh oh.


Megadrive1988 said:
Jen-Hsun and Ken share very common visions (not that Dave Orton of ATI doesn't, but Ken's and Jen-Hsun's ideas seem to mesh almost perfectly).

I find that statement intriguing... would you mind elaborating, MD?
 
As much as I prefer ATI over Nvidia, ATI doesn't really have a console GPU under its belt yet; the ArtX Flipper in the GameCube obviously doesn't count.

Why? The people from ArtX all work at ATI now. In fact, aren't most of ArtX still working as a team within ATI?
 
StarFox said:
As much as I prefer ATI over Nvidia, ATI doesn't really have a console GPU under its belt yet; the ArtX Flipper in the GameCube obviously doesn't count.

Why? The people from ArtX all work at ATI now. In fact, aren't most of ArtX still working as a team within ATI?


By the time ATI acquired ArtX, the Flipper was already completed; correct me if I'm wrong.
 
DigitalSoul said:
StarFox said:
As much as I prefer ATI over Nvidia, ATI doesn't really have a console GPU under its belt yet; the ArtX Flipper in the GameCube obviously doesn't count.

Why? The people from ArtX all work at ATI now. In fact, aren't most of ArtX still working as a team within ATI?


By the time ATI acquired ArtX, the Flipper was already completed; correct me if I'm wrong.

Yes. But all the people who worked on the Flipper still work at ATI. All their [ArtX's] experience with it is collectively ATI's. So it might as well be the same as if ATI had worked on the Flipper from the start...
 
StarFox said:
As much as I prefer ATI over Nvidia, ATI doesn't really have a console GPU under its belt yet; the ArtX Flipper in the GameCube obviously doesn't count.

Why? The people from ArtX all work at ATI now. In fact, aren't most of ArtX still working as a team within ATI?
Even ATI's CEO is from ArtX, so I'm not following Megadrive's reasoning. Regardless, I doubt this factored into Sony's decision-making. Execution, from a hardware perspective, is not that different in the PC and console spaces.
 
Sony didn't have enough time for a custom design, so their only choice from ATI would have been the R520 (or possibly the refresh?).

How would that have looked, especially when you consider that Xenos is far better suited to the console environment?

Nvidia was their only option.
 
Does ATi even have the resources to produce an R520 for PS3, R520+AMR for PC, and GPUs for Revolution and Xbox 360?
 
trinibwoy said:
Does ATi even have the resources to produce an R520 for PS3, R520+AMR for PC, and GPUs for Revolution and Xbox 360?

Don't forget the R580(?), RV520, R600 and the rest; both ATI and Nvidia have multiple PC chips in the pipeline, each requiring a design team.
 
I'm curious: since the Cell/RSX combo has enough power to generate two HD outputs, what can it do if it's running only one? How many system resources are needed to generate an additional 720p output, for example, and how does that compare to the performance penalty of single-screen AA? It would be interesting if they can do both at the same time and not run into a bottleneck.
 
Tacitblue said:
I'm curious: since the Cell/RSX combo has enough power to generate two HD outputs, what can it do if it's running only one? How many system resources are needed to generate an additional 720p output, for example, and how does that compare to the performance penalty of single-screen AA? It would be interesting if they can do both at the same time and not run into a bottleneck.

Old GF4 440 MX cards can do that... in desktop apps. The problem with games is that with two 720p images you are pretty much doubling your bandwidth needs. I am not sure we will know how well the PS3 can handle this until we get more info.
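To put rough numbers on the doubling claim (all figures are illustrative assumptions, not PS3 specs; the per-pixel render-traffic number in particular is just a guess):

# Illustrative arithmetic for the "doubling" claim; the per-pixel
# render-traffic figure is a guess, not a PS3 spec.

WIDTH, HEIGHT, FPS = 1280, 720, 60
SCANOUT_BYTES_PER_PIXEL = 4    # 32-bit front-buffer refresh
RENDER_BYTES_PER_PIXEL = 24    # guess: Z + color traffic incl. overdraw

def traffic_gb_per_sec(num_outputs):
    pixels_per_sec = WIDTH * HEIGHT * FPS * num_outputs
    scanout = pixels_per_sec * SCANOUT_BYTES_PER_PIXEL   # just showing it
    render = pixels_per_sec * RENDER_BYTES_PER_PIXEL     # actually drawing it
    return scanout / 1e9, render / 1e9

for n in (1, 2):
    scan, render = traffic_gb_per_sec(n)
    print(f"{n} x 720p: scanout ~{scan:.2f} GB/s, rendering ~{render:.2f} GB/s")

Scanout itself is tiny, which is why an old GF4 MX can drive two desktops; it is the render traffic for a second full game view that actually doubles the cost.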

The plus side of the PS3's dual outputs is that this may filter back to the PC, where I am guessing more users have two monitors than two HDTVs. I just wish they had done three... three outputs are much better than two for gaming because most games center on the middle of the screen. Now, two screens for four-player split screen would be pretty smooth. Party game!
 