Why did Sony partner with Nvidia instead of ATI for the PS3?

I wonder how large NVIDIA's RSX team is? It probably isn't very large, since all it has to do is adapt an existing NVIDIA design to use the FlexIO bus.

I bet ATI could have found the resources to do the PS3 GPU. It certainly would have been a PR win for ATI if all three next gen consoles had chosen ATI!

I think the main reason Sony chose NVIDIA is that, by coincidence, NVIDIA's next big product cycle arrives at roughly the same time as the PS3 launch. So the RSX is the best technology that was available, given the compressed time frame in which the PS3 GPU had to be developed.

There are some other less important, but still quite nice, benefits to Sony:

NVIDIA has been shipping Shader Model 3.0 PC graphics cards for quite some time. PS3 Alpha kits can use off-the-shelf video cards that closely match the features (if not the performance) of the RSX.

NVIDIA is probably hungry for the business, and may have given Sony a better deal than ATI was willing to offer.
 
Two reasons, I think:

First, as Jawed said, Cg and OpenGL (Nvidia has great GL drivers).

Second: price. I think Nvidia was pretty desperate to be in at least one of the next gen consoles. It would make Nvidia look bad if all next gen consoles were powered by ATI. So I'm sure Sony got a really, really sweet deal out of Nvidia.
 
Megadrive1988 said:
Nvidia has the closest thing there is to perfect drivers.

NVIDIA's SM3.0 compilers/optimisers are far, far from perfect! :(
 
NVIDIA are pretty much the only player in the workstation business. Or at least, they are in a much, much better position than ATI.

Cell is meant to be used on workstations.

Sony has bought Alias|Wavefront.

All professional 3D applications work better on Nvidia hardware than on ATI hardware, simply because of the drivers.

I think that in the end, when it comes to consoles, the XGPU and the RSX are pretty similar in final performance, so I'm quite sure that Sony had to look at other aspects of its business, where Nvidia is a much stronger partner than ATI.

With NVIDIA, it all ties in together nicely.
 
_phil_ said:
Sony has bought Alias|Wavefront.

??? You take drugs, LB?

Hey, that's what I heard around here from "reputable" sources... If they haven't bought them, they were very interested. I'll google it now.

And to answer your question: yes. It would be rude not to answer a question :D
 
:D

A/W used to be middleware for every console, every generation, just like Discreet, Winamp, coffee makers, aspirin, pizza, and some other legal or less legal substances. :)
 
_phil_ said:
:D

A/W used to be middleware for every console, every generation, just like Discreet, Winamp, coffee makers, aspirin, pizza, and some other legal or less legal substances. :)

:LOL: Oh :oops:

I thought 3D Studio Max was the standard in the video game market, while Maya was more of a movie industry application.
 
They're ALL becoming middleware developers because of COLLADA, the interchange data format for 3D.
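
Just to make the point concrete: COLLADA is plain XML, so any tool or engine can read another tool's export. Here's a minimal sketch in Python; the tiny document embedded in it is a heavily stripped-down, illustrative fragment, not a complete or valid .dae file:

```python
# Minimal sketch: COLLADA is just XML, so reading another tool's export only
# needs an XML parser. The fragment below is simplified for illustration.
import xml.etree.ElementTree as ET

TINY_DAE = """
<COLLADA version="1.4.1">
  <library_geometries>
    <geometry id="box" name="box">
      <mesh>
        <source id="box-positions">
          <float_array id="box-positions-array" count="12">
            -1 -1 0   1 -1 0   1 1 0   -1 1 0
          </float_array>
        </source>
      </mesh>
    </geometry>
  </library_geometries>
</COLLADA>
"""

def read_positions(dae_text):
    """Pull the raw vertex positions out of every <float_array> in the file."""
    root = ET.fromstring(dae_text)
    meshes = {}
    for geom in root.iter("geometry"):
        for arr in geom.iter("float_array"):
            floats = [float(v) for v in arr.text.split()]
            # Group the flat list into (x, y, z) triples.
            meshes[geom.get("id")] = list(zip(floats[0::3], floats[1::3], floats[2::3]))
    return meshes

print(read_positions(TINY_DAE))
# {'box': [(-1.0, -1.0, 0.0), (1.0, -1.0, 0.0), (1.0, 1.0, 0.0), (-1.0, 1.0, 0.0)]}
```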

Regarding ATI not having enough time to develop a PS3 GPU, why would that be? The IP they're developing for MS is surely ATI's, just licensed for MS to use, and they can more or less use it elsewhere(?). They've already used eDRAM in Flipper, and they were already working on unified shaders. It's just a case of integrating those same techs into a Cell-friendly architecture.

The idea of a separate back-buffer processor (embedded logic) is the only aspect I can see being unique to Xenos, and maybe ATI are restricted in providing this technology to other parties?

I'd assume, like others here, that nVIDIA were chosen because of their high-end presence and OGL support, plus developer relations and middleware. What's ATI's middleware like by comparison?
 
Shifty Geezer said:
The idea of a separate back-buffer processor (embedded logic) is the only aspect I can see being unique to Xenos, and maybe ATI are restricted in providing this technology to other parties?

I don't think you understand Xenos then.
 
So what else is unique that wasn't ATI work anyway? What's unique to Xenos that won't be appearing in other ATI chips?
 
shaderguy said:
I wonder how large NVIDIA's RSX team is?

The team supposedly comprised roughly 50 people, at least according to David Roman at NVidia. So a small-ish team, but then again not tiny by any means.
 
Acert93 said:
Tacitblue said:
I'm curious: since the Cell/RSX combo has enough power to generate 2 HD outputs, what can it do if it's running only one? How many system resources are needed to generate an additional 720p output, for example, and how does that compare to the performance penalty of single-screen AA? It would be interesting if they could do both at the same time and not run into a bottleneck.

Old GF4 440 MX cards can do that... in desktop apps. The problem with games is that with two 720p images you are pretty much doubling your bandwidth needs. I am not sure we will know how well the PS3 will handle this until we get more info.
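
Rough back-of-envelope numbers, purely illustrative assumptions (32-bit colour plus depth/stencil, 60 fps, an assumed overdraw of 3), not actual RSX figures:

```python
# Back-of-envelope framebuffer traffic; all numbers here are assumptions
# for illustration only (colour + depth/stencil per pixel, 60 fps, overdraw 3).
WIDTH, HEIGHT = 1280, 720        # one 720p render target
BYTES_PER_PIXEL = 4 + 4          # 32-bit colour + 32-bit depth/stencil
FPS = 60
OVERDRAW = 3                     # assumed average number of times each pixel is touched

def gb_per_sec(num_outputs):
    pixels = WIDTH * HEIGHT * num_outputs
    return pixels * BYTES_PER_PIXEL * OVERDRAW * FPS / 1e9

print(f"one 720p output : {gb_per_sec(1):.1f} GB/s of framebuffer traffic")
print(f"two 720p outputs: {gb_per_sec(2):.1f} GB/s of framebuffer traffic")
# Whatever the real per-frame cost is, a second independent 720p view doubles it.
```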

The plus side of the PS3's dual outputs is that this may filter back to the PC, where I am guessing more users have 2 monitors than 2 HDTVs. I just wish they had done 3... 3 outputs are much better than 2 for gaming because most games center the action in the middle of the screen. Still, 2 screens for 4-player split screen would be pretty smooth. Party game!

I think that the RSX merely DISPLAYS its output in dual fashion but does not RENDER two independent fields of view. If that's true, it's just like any other PC card with dual output capabilities and not a statement of RSX power...
 
I think that the RSX merely DISPLAYS its output in dual fashion but does not RENDER two independent fields of view. If that's true, it's just like any other PC card with dual output capabilities
I can display separately rendered views on my GF6800GT just fine, thank you very much.

If dual output on PC cards just doubled one picture, no one would ever bother using it; the whole point of using dual-head is to get a larger desktop.

It's obvious that games will take a performance hit regardless of any bandwidth requirements - a 32:9 field of view could very well double visible geometry; it's not just a simple fillrate increase.
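
To illustrate the geometry side (a toy sketch with a made-up scene, not real workload numbers): stretch the same vertical FOV across a 32:9 view and roughly twice as many objects survive frustum culling, so they all have to be transformed and processed, not just filled:

```python
# Toy illustration: a 32:9 view (two 16:9 screens side by side, same vertical
# FOV) lets roughly twice as many objects survive frustum culling.
# The scene contents and FOV below are made up purely for illustration.
import math, random

random.seed(1)
# Scatter "objects" (x, z) in front of a camera looking down +Z.
objects = [(random.uniform(-200, 200), random.uniform(1, 50)) for _ in range(100_000)]

def visible_count(aspect, vertical_fov_deg=60):
    half_v = math.radians(vertical_fov_deg / 2)
    half_width_per_z = math.tan(half_v) * aspect   # horizontal frustum extent grows with aspect
    return sum(1 for x, z in objects if abs(x) <= z * half_width_per_z)

single = visible_count(16 / 9)   # one 16:9 view
dual   = visible_count(32 / 9)   # same vertical FOV spread across two screens
print(f"objects in a 16:9 frustum: {single}")
print(f"objects in a 32:9 frustum: {dual}  (~{dual / single:.1f}x as many to transform and cull)")
```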
 
Fafalada said:
I think that the RSX merely DISPLAYS its output in dual fashion but does not RENDER two independent fields of view. If that's true, it's just like any other PC card with dual output capabilities
I can display separately rendered views on my GF6800GT just fine, thank you very much.

If dual output on PC cards just doubled one picture, no one would ever bother using it; the whole point of using dual-head is to get a larger desktop.

It's obvious that games will take a performance hit regardless of any bandwidth requirements - a 32:9 field of view could very well double visible geometry; it's not just a simple fillrate increase.

Most dual-head setups either twin the image (NVIDIA calls it "clone" mode) or extend it. Most DON'T display separate images unless they are separate desktops (of the 2D variety). But I would love to learn more about this capability (to render games to separate outputs, which is the question at hand) if it exists. ;)
 
blakjedi said:
Most dual-head setups either twin the image (NVIDIA calls it "clone" mode) or extend it. Most DON'T display separate images unless they are separate desktops (of the 2D variety). But I would love to learn more about this capability (to render games to separate outputs, which is the question at hand) if it exists. ;)

They all have separate outputs for each screen, and have since the first consumer card that supported dual monitors (the G400); I shipped a game back then that used it.

If you haven't seen the option, I suggest you play with your card a bit more...
 
Are you (blakjedi) talking about separate backbuffers/frontbuffers for each screen vs. one buffer shared between two screens?

Which is currently used?
 