(Interview) NVidia on PlayStation 3, Xbox 360, & Wii

So Nvidia couldn't handle a separate console GPU contract? Is that because of staffing, or (as hinted at above by Jawed) because the RSX didn't diverge too greatly from their already established roadmap?

How do these things work? Sure MS/Sony/Nintendo pay for what they get but the realities of that are... what?

I'd assume that you work to what your capacity already is rather than trying to expand short term, right? To put it another way, ignoring the realities - could ATI have handled a Sony contract (whatever that may have been: custom or "off the shelf") as well as the 360 contract? (After all, they've managed the 360 and Wii!)
 
It's well known nVidia had a less than satisfactory deal with MS with the Xbox, whereas the deal with Sony is probably going to make more sense economically to them, with Sony and partners manufacturing the RSX and all.

Did they? I thought MS had the worst end of the deal - didn't they have to buy all the chips from nvidia, and nvidia didn't drop the price when the process shrunk?

I thought that's why the Xbox lost so much money: because Intel and nvidia didn't let them fab the chips somewhere else, so they had to buy directly from them.

I remember it was a big deal when nvidia first announced that they were letting Sony fab the chips themselves
 
On the topic, I think Nvidia didn't design a specific GPU because Sony didn't offer them enough money, nor did they have enough time to do so.
Anyway, this PR talk brings nothing new to the table.
 
Jawed said:
With nearly $500M in lost profit caused by FX (and the fact the architecture was a dead end) they had to rip up their roadmap, which is why they couldn't afford to do a fully custom console GPU, for anyone.

Pretty much. My point said nothing about the reasoning as to why they did not want to deviate from their roadmap (but you're correct, the effects of the GeForce FX were no doubt influential for nVidia at the time companies were looking for partners), just that it would have been too expensive for them to do so, and that they were not in a position to accommodate a special case relative to their own plans. This says nothing about Huang's opinion on the complexity or merit of MS's requirements, just that he thought and thinks that they were too different from their own.
 
I'd also say that they didn't design a new GPU because the current architecture generally works.

US would be good and make sense with the versatility, but two things count against it in PS3 IMO: 1) nVidia haven't got it up to scratch, and 2) could it be too much versatility? With Cell able to take up GPU tasks, a GPU that could take up CPU tasks and be more flexible might be considered too much for devs to get their heads round. Keep the GPU as much of a known quantity as possible and leave Cell in charge of worrying developers. The other option, eDRAM, has pros and cons too. PS3 seems to be doing all right without it so far, from what we've seen.

Given the added benefits of cost scaling due to cross-over development, it seems a sound decision. For XB360 things were different. MS wanted a cheaper memory setup without the BW to support HiDef+the trimmings, so chose an eDRAM solution to work with that.

Much as I'm a fan of exotic architectures, I can't see there'd be a huge benefit to PS3 by going with the expense of something new and totally custom.
 
Jawed said:
With nearly $500M in lost profit caused by FX (and the fact the architecture was a dead end) they had to rip up their roadmap, which is why they couldn't afford to do a fully custom console GPU, for anyone. That's why Sony got something off the shelf with some tweaks.
Jawed

Exactly. And to the others, no.. I didn't misunderstand his comments. He clearly stated that they couldn't afford to go too far away from their existing roadmap in order to design and supply a console GPU of the type that MS wanted.

We could not afford to build the graphics for the 360. Our most important asset is our people. If we use our people on a project where the economic return is not good enough, and there are other projects we could be working on, then we're going to lose money. We were a lot smaller company than ATI at the time. Maybe ATI could afford it and we couldn't.

Clearly a question of economics. Did they want to design the GPU for the 360? Doing so would have meant diverting resources away from the 6800, 7800, etc. in order to fill that goal. They believed it was more profitable for them to continue in the direction they were going and pass on the 360 order. They then gave the PS3 a GPU based upon the tech that was already in their roadmap - which is clearly different from what MS wanted for the 360 and what they were able to get from ATI.

ATI is excited about unified shaders. If you pull back, how do you see if your people are making the right decisions?

For each one of our generations, we need to have a vision of what we want to do. It costs hundreds of millions of dollars to come up with a new architecture.

Does that mean something other than the fact that nVidia didn't have the hundreds of millions of dollars required to change their roadmap to one of unified shaders in time to meet the requirements of MS and the 360 GPU?

As far as nVidia getting screwed on the Xbox deal.. HUH? nVidia made millions on the Xbox deal with MS because MS was stupid. There are also reports that nVidia is going to make even more money on the PS3 deal because they had Sony over a barrel... the Cell couldn't act as a GPU, and ATI was already committed to both MS and N by that time. I believe I read a report less than a month ago that said nVidia was getting $100 or so for every RSX sold to Sony.

I'd say nVidia made out far better this generation than any of the other players.
 
Shifty Geezer said:
US would be good and make sense with the versatility
But US is not required to support key SM3 features like performant vertex texturing or performant pixel shader dynamic branching. This isn't an argument about traditional versus unified architectures.

We should see how a "traditional" architecture works with G80 (obviously it has geometry shader, streamout and constant buffer features that are not traditional), where the performance of both VT and DB should be in a wholly different league. Fingers-crossed.
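
To illustrate roughly why branch granularity matters for DB performance, here's a toy sketch: assume a shader with a cheap common path plus an expensive path that only some pixels need, and assume the branch is resolved per batch of pixels, so any batch containing even one expensive pixel ends up paying for both paths. The batch sizes, cost weights and the "shadow edge" mask below are made-up numbers for illustration only, not actual G70/G80 figures.

```python
import random

def batched_branch_cost(mask, batch_size, cheap=1.0, expensive=4.0):
    """Average per-pixel cost when a dynamic branch is resolved per batch:
    if any pixel in a batch takes the expensive path, the whole batch runs
    both paths (cheap + expensive); otherwise the batch runs only the cheap path."""
    total = 0.0
    for start in range(0, len(mask), batch_size):
        batch = mask[start:start + batch_size]
        total += (cheap + expensive if any(batch) else cheap) * len(batch)
    return total / len(mask)

random.seed(0)
n = 1 << 16                      # 65,536 "pixels"
mask = [False] * n               # True = pixel needs the expensive shader path
for _ in range(100):             # ~100 small coherent regions (think shadow edges),
    start = random.randrange(n - 64)   # roughly 10% of the screen in total
    for i in range(start, start + 64):
        mask[i] = True

# Purely illustrative batch sizes - not actual hardware figures.
for label, size in [("~1000-pixel batches", 1024), ("32-pixel batches", 32)]:
    print(f"{label:20s} avg cost per pixel: {batched_branch_cost(mask, size):.2f}")
```

With coarse batches nearly the whole screen ends up paying for the expensive path; with fine batches most of it skips it - which is roughly the kind of gap being hoped for between G7x-class and G80-class branching.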

It's kind of a shame that PS3 isn't another 6 months later than it is - maybe then it could have had a version of G80 in it (say half-size, around 250m transistors)...

Personally I don't buy the "but Cell can help with the graphics" argument - as much of the graphics as possible should be done by a GPU, with hardware tailored for the job. Indeed he's specific about this: CPUs doing what they're good at and GPUs doing what they're good at.

Not only does an SM4 architecture (like G80) add new features, but it takes a huge amount of workload off the CPU (and main memory, XDR in the case of PS3).

It seems to me that a G80-derivative for PS3 would have been an easy win, which makes the timing (and G70-derivative RSX) very puzzling. Bearing in mind that Vista is about a year late (compared to plans back in 2003/4-ish), NVidia should have been planning for an SM4 GPU in time for late 2005... Even if G80 (as it is now) is more advanced (as a 2006 GPU) than it would have been as a 2005 GPU, I can't see how NVidia wouldn't have already been underway with something like G80 when Sony approached NVidia.

So it seems like the FX fallout put NVidia so far behind that NVidia was unable to even offer an early-G80-derivative (i.e. not feature complete, say no integer support and no FP16-HDR+AA) to Sony.

Arguably there's a problem that G80 (or a derivative) contains features that are essentially M$-confidential, so maybe NVidia was contractually prevented from offering "SM4-like" features (well beyond what was prolly on the table during OpenGL discussions, back then?) to Sony - i.e. that the best NVidia could offer Sony was an SM3-like GPU without any SM4-like capabilities...

Jawed
 
I've always read that MS got screwed by the Nvidia deal; that and the hard drive are the main reasons they lost so much on the Xbox 1. That is why MS wanted IP rights this time, so they could make their own chips and save a lot of money vs last generation. From what I read, they paid the same per chip from day one till the last day, despite die shrinks that made the chips much, much cheaper to produce.
 
Jawed said:
With nearly $500M in lost profit caused by FX (and the fact the architecture was a dead end) they had to rip up their roadmap, which is why they couldn't afford to do a fully custom console GPU, for anyone. That's why Sony got something off the shelf with some tweaks.
I thought the NVIDIA CEO was talking about the difficulties of working for 2 companies at the same time, not about doing customization work for one company. Also the one who pays for GPU customization is a client, not NVIDIA. Unified shaders are not a console customization, since ATI is said to have a plan to move to them in their PC GPU line earlier than NVIDIA. eDRAM is a customization, but it's not very likely that Sony wanted to take that path for the PS3 GPU solution.
 
one said:
I thought the NVIDIA CEO was talking about the difficulties of working for 2 companies at the same time, not about doing customization work for one company.
He doesn't have to tell the whole truth. The impact of the FX fiasco was that NVidia couldn't afford to do any customisation - they had their hands full making the 6 series.

Put another way, if G80 had been further along than it turned out to be, derivatives of it could perhaps have easily suited M$ and Sony, with little customisation.

Also, arguably, ATI spent rather more on Xenos than we've eked-out. There's a theory that R520's debugging was hampered by lack of people, because Xenos also had problems that needed to be solved and was a higher priority. You could argue that AMD is buying ATI (cos ATI is cheap) only because ATI slipped up with R520, exacerbated by the Xenos deal's priority.
Also the one who pays for GPU customization is a client, not NVIDIA.
It's not just a money problem but a people problem. These guys don't grow on trees.

Jawed
 
quest55720 said:
I've always read that MS got screwed by the Nvidia deal; that and the hard drive are the main reasons they lost so much on the Xbox 1. That is why MS wanted IP rights this time, so they could make their own chips and save a lot of money vs last generation. From what I read, they paid the same per chip from day one till the last day, despite die shrinks that made the chips much, much cheaper to produce.

If I remember well, it was actually Nvidia who got screwed by MS, but not intentionally. Since the XBOX wasn't doing well, Nvidia (and I think Intel as well) were furious with MS because they were losing lots of money or not gaining the expected returns.

I don't think Nvidia would have continued supporting MS after the losses and the "bad" deal they had with the XBOX.
 
Nesh said:
If I remember well, it was actually Nvidia who got screwed by MS, but not intentionally. Since the XBOX wasn't doing well, Nvidia (and I think Intel as well) were furious with MS because they were losing lots of money or not gaining the expected returns.

I don't think Nvidia would have continued supporting MS after the losses and the "bad" deal they had with the XBOX.

I think you got it backwards (mostly). You think nvidia and intel lost money on this? There was a dispute with nvidia that MS won in arbitration. However, nvidia and intel made a ton of money off the Xbox 1. I don't think they were furious with MS, since the money they made was based on chips produced and not royalty-structured. Nvidia also supplied multiple chips for the Xbox. It was a really good deal for them.
 
Nesh said:
If I remember well, it was actually Nvidia who got screwed by MS, but not intentionally. Since the XBOX wasn't doing well, Nvidia (and I think Intel as well) were furious with MS because they were losing lots of money or not gaining the expected returns.

I don't think Nvidia would have continued supporting MS after the losses and the "bad" deal they had with the XBOX.

Intel and nVidia didn't pay for the xbox to be manufactured. Microsoft did; Microsoft lost money on the xbox. I have NO idea HOW you can even think Intel or nVidia got screwed since they got PAID for every console manufactured. It is entirely impossible for either Intel or nVidia to lose money on xbox sales unless they themselves gave MS a deep initial discount, which they never did. MS paid nVidia for design work, and Intel already had the tech handy since it was a current processor. The only reason nVidia got mad was because MS wanted them to lower the price of their chips when the manufacturing cost went down, but good ol' nVidia decided they would rather earn more per chip than keep earning the same amount.
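
To put some rough numbers on the die-shrink point (every figure below is invented for illustration, not an actual NV2A/Xbox cost): a shrink scales die area down by roughly the square of the feature-size ratio, so the same wafer yields more chips and the cost per chip falls - while the contracted price per chip reportedly stayed flat.

```python
# Toy arithmetic for why a die shrink cuts per-chip manufacturing cost.
# Every figure here is hypothetical, not an actual NV2A/Xbox number.

wafer_cost = 4000.0           # assumed cost of one processed wafer ($)
usable_wafer_area = 30000.0   # assumed usable area of a wafer (mm^2)
yield_rate = 0.7              # assumed fraction of dies that work
die_area_150nm = 140.0        # assumed die area on the original process (mm^2)

def cost_per_good_die(die_area_mm2):
    dies_per_wafer = usable_wafer_area // die_area_mm2
    return wafer_cost / (dies_per_wafer * yield_rate)

# A 150nm -> 130nm shrink scales linear dimensions by ~130/150,
# so die area scales by roughly (130/150)**2 ~= 0.75.
die_area_130nm = die_area_150nm * (130 / 150) ** 2

print(f"cost per good die before shrink: ${cost_per_good_die(die_area_150nm):.2f}")
print(f"cost per good die after shrink:  ${cost_per_good_die(die_area_130nm):.2f}")
# If the contracted price per chip stays fixed while this cost drops,
# the supplier's margin grows - which is the dispute described above.
```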
 
Okay... So back to basics:

MS wanted a GPU that utilized a unified architecture because that is what was being supported/promoted/demanded by DX10. ATI didn't have any such GPU in the works, and neither did nVidia.

However, ATI had the resources available to not only produce their current line of PC GPUs but also forward-think towards unified architecture. nVidia was too focused on recovering their losses from the FX series to spend hundreds of millions of dollars on R&D in order to meet the requirements of the 360.

So essentially... ATI took a big hit while nVidia was providing a better product for the past 2-3 years with the 6000-7000 series of cards that outperformed ATI's PC offerings, but ATI got 1) the 360 contract and 2) took the R&D hit on developing a unified architecture before nVidia had done such a thing... and it appears from everything that I've read that nVidia knows they WILL have to go in that direction.. they just didn't have the assets available to go that route when MS came knocking on their doors for a 360 GPU.

The end result of this, if I can figure it out correctly, is that the 360 GPU should be far more advanced, and should also be far more compatible with the PC graphics (DX10) that will arrive in the next 2-3 years.

While in the meantime, nVidia has been stomping ATI PC sales with their current offerings, but haven't yet (or hadn't at the time) invested the R&D into building an entirely new architecture, which they still need to do (or maybe are doing now with the G80, but sure weren't interested in doing when MS approached them for the 360 GPU).

Essentially, nVidia is using the cash flow from the 6000-7000 series in order to fund the R&D to pay for their next generation of GPUs, because their PC cards were superior. ATI already spent the R&D on their next generation of PC cards because they had the available resources to do it when MS approached them for the 360.

What this means, of course, is that nVidia's dominance in the PC marketplace should also be affected (negatively) by their decision not to "look forward" back when MS first approached them. And while ATI took a hit over the past 2 years (or so), their market position should be much improved in the foreseeable future because they've already made the investment.

Is that an incorrect reading of the information we have available?
 
That was a great interview; he didn't slag anybody seriously, yet he highlighted things to come within his own camp.

Very civilized.
 
RancidLunchmeat said:
Okay... So back to basics:

MS wanted a GPU that utilized a unified architecture because that is what was being supported/promoted/demanded by DX10. ATI didn't have any such GPU in the works, and neither did nVidia.

However, ATI had the resources available to not only produce their current line of PC GPUs but also forward-think towards unified architecture. nVidia was too focused on recovering their losses from the FX series to spend hundreds of millions of dollars on R&D in order to meet the requirements of the 360.

So essentially... ATI took a big hit while nVidia was providing a better product for the past 2-3 years with the 6000-7000 series of cards that outperformed ATI's PC offerings, but ATI got 1) the 360 contract and 2) took the R&D hit on developing a unified architecture before nVidia had done such a thing... and it appears from everything that I've read that nVidia knows they WILL have to go in that direction.. they just didn't have the assets available to go that route when MS came knocking on their doors for a 360 GPU.

The end result of this, if I can figure it out correctly, is that the 360 GPU should be far more advanced, and should also be far more compatible with the PC graphics (DX10) that will arrive in the next 2-3 years.

While in the meantime, nVidia has been stomping ATI PC sales with their current offerings, but haven't yet (or hadn't at the time) invested the R&D into building an entirely new architecture, which they still need to do (or maybe are doing now with the G80, but sure weren't interested in doing when MS approached them for the 360 GPU).

Essentially, nVidia is using the cash flow from the 6000-7000 series in order to fund the R&D to pay for their next generation of GPUs, because their PC cards were superior. ATI already spent the R&D on their next generation of PC cards because they had the available resources to do it when MS approached them for the 360.

What this means, of course, is that nVidia's dominance in the PC marketplace should also be affected (negatively) by their decision not to "look forward" back when MS first approached them. And while ATI took a hit over the past 2 years (or so), their market position should be much improved in the foreseeable future because they've already made the investment.

Is that an incorrect reading of the information we have available?



ATI also has "two console horses" to bet on. There is a very decent chance that Xbox 360 and Wii sales combined could be more than PS3. Even EA thinks so. This is a nice win for ATI. This may also pay off even bigger down the road, assuming both MS and Nintendo stay with ATI for the 360 and Wii successors.
 
a688 said:
Intel and nVidia didn't pay for the xbox to be manufactured. Microsoft did; Microsoft lost money on the xbox. I have NO idea HOW you can even think Intel or nVidia got screwed since they got PAID for every console manufactured. It is entirely impossible for either Intel or nVidia to lose money on xbox sales unless they themselves gave MS a deep initial discount, which they never did. MS paid nVidia for design work, and Intel already had the tech handy since it was a current processor. The only reason nVidia got mad was because MS wanted them to lower the price of their chips when the manufacturing cost went down, but good ol' nVidia decided they would rather earn more per chip than keep earning the same amount.
I didn't even think that. I read it somewhere years ago. Or probably it was something else and it changed in my mind, since it has been such a long time since I read it.

That's why I said "if I remember well" rather than just stating it as a fact.
 
RancidLunchmeat said:
So essentially... ATI took a big hit while nVidia was providing a better product for the past 2-3 years with the 6000-7000 series of cards that outperformed ATI's PC offerings, but ATI got 1) the 360 contract and 2) took the R&D hit on developing a unified architecture before nVidia had done such a thing... and it appears from everything that I've read that nVidia knows they WILL have to go in that direction.. they just didn't have the assets available to go that route when MS came knocking on their doors for a 360 GPU.

Unification at the hardware level does not seem to be in nVidia's near-term future (which, when MS came knocking, would have been their longer-term future). I remember when Kirk said that they didn't think unification was right for now; people argued that he was only saying that for RSX's sake, and that their own DX10 chips would be unified. But that doesn't seem to be the case, reportedly - they're still sticking with 'dedicated' shaders for now, or so it seems. I doubt that for G80 onwards it's a case that they could not have gone unified - I'm sure nVidia are happy with their roadmap and would consider it a 'better' approach than what MS was looking for at that point. Furthermore, an investment in unified hardware at that point would have taken longer for them to leverage more widely than it would have for ATi (it's not as close in nVidia's future for other products), so the return on that investment may not have made economic sense for them. Particularly if they thought that, by the time they went unified themselves, things would be even more different from what MS was looking for.
 
RancidLunchmeat said:
Okay... So back to basics:

MS wanted a GPU that utilized a unified architecture because that is what was being supported/promoted/demanded by DX10. ATI didn't have any such GPU in the works, and neither did nVidia.

Wouldn't it be fair to say that the R400 is the genesis of the work that you now see in the R520 and in Xenos, and that ATI didn't start from square one when Microsoft came calling, but actually had already done significant R&D in the area? I can hardly believe that MS called up ATI and said "We want US" and ATI said "Oh, we haven't done any work in that area. But, hey, we'll start if that's what you want, for $$$".
 