Nvidia, Sony, Cell, and RamBus *PTSD Flashbacks*

  • Thread starter Deleted member 11852

Deleted member 11852

Guest
The original Xbox deal led to long and expensive lawsuits, whereas the PS3 ended up with an overall crappy architecture: two different types of memory that were difficult to handle, and a GPU with subpar pixel shader performance whose limitations had to be worked around using the Cell's SPEs.
Blaming Nvidia for PS3's complex architecture is one hell of a stretch.
 
Those 3 parameters could have been OK if they had used modern tech instead of a 1999 CPU, and if they hadn't allocated some precious die area to hardware BC and added complexity because of it.
The CPU cores are decent and there was not much choice. Now, the system could have used a couple of extra cores. I believe a standard, PC-like NUMA setup would have served them well, especially as they had 2 chips anyway.
 
Blaming Nvidia for PS3's complex architecture is one hell of a stretch.
You mean it would have been completely impossible to develop an XDR memory controller for their northbridge/GPU and get a UMA overall?
Sure, there are more entities at fault, but while Microsoft got a first-ever unified shader architecture together with eDRAM from ATi, Nvidia handed Sony a pre-existing and rather old G71 with halved ROPs (because there was only a 128-bit bus to the memory regardless), fixed up a FlexIO link to Cell and called it a day.

Prediction: people will accuse me of saying that implementing an XDR memory controller for an existing GPU core is super easy, and I'll say beforehand that I stated no such thing, ever. It might have been hard, very hard. Just not impossible.
 
You mean it would have been completely impossible to develop an XDR memory controller for their northbridge/GPU and get a UMA overall?

I said nothing like that. Cell's memory architecture is not one that lends itself to unified memory architectures. The XDR memory sub-system is a feature of Cell. The decision to go with Cell, and the baggage it carries, is on Sony and not Nvidia.

Sure, there are more entities at fault, but while Microsoft got a first-ever unified shader architecture together with eDRAM from ATi, Nvidia handed Sony a pre-existing and rather old G71 with halved ROPs (because there was only a 128-bit bus to the memory regardless), fixed up a FlexIO link to Cell and called it a day.

So your perception of the deal is that Nvidia just gave Sony a chip and Sony didn't know what its capabilities were during negotiations? That none of this is on Sony, that they were innocents in this, tricked by evil Nvidia?
 
You mean it would have been completely impossible to develop an XDR memory controller for their northbridge/GPU and get a UMA overall?
Sure, there are more entities at fault, but while Microsoft got a first-ever unified shader architecture together with eDRAM from ATi, Nvidia handed Sony a pre-existing and rather old G71 with halved ROPs (because there was only a 128-bit bus to the memory regardless), fixed up a FlexIO link to Cell and called it a day.

Prediction: people will accuse me of saying that implementing an XDR memory controller for an existing GPU core is super easy, and I'll say beforehand that I stated no such thing, ever. It might have been hard, very hard. Just not impossible.

The Nvidia part was a late hot fix to Sony's PS3 architecture because their own rasterizer chip together with Cell didn't perform. What do you think Cell's vector engine was about? Just a fancy co-processor? :)

I'm sure that if they hadn't sunk so much money into the whole Cell adventure (including the fab), they would have shelved the whole design the moment they realized that they had flunked it. But careers and money were invested in it, so they rode the pale horse...
 
So your perception of the deal is that Nvidia just gave Sony a chip and Sony didn't know what its capabilities were during negotiations? That none of this is on Sony, that they were innocents in this, tricked by evil Nvidia?

My only perception is the same as everyone else's: the PS3 came out later and brought a substantially less advanced, less flexible and worse-performing GPU than the X360.


Regardless, the PS3 got an Nvidia GPU. The PS4 did not.

The Nvidia part was a late hot fix to Sony's PS3 architecture because their own rasterizer chip together with Cell didn't perform.
That's what most people say, yes.
But how late? The X360 went from the start of development to release in less than 3 years. That was really fast. So exactly how late was Nvidia to the party, for it to be excused for providing an old GPU architecture to Sony?
 
So late that Sony needed a working GPU immediately and couldn't wait for any tweaking/modifications, that's how late it was.

Sources on specific dates? Was it a year? 500 days? 200 days?

"So late" is so generic that it really doesn't mean anything. Maybe nvidia had 2 years to do it, and simply chose not to because they decided it was so late to do it.
 
Sources on specific dates? Was it a year? 500 days? 200 days?

"So late" is so generic that it really doesn't mean anything. Maybe nvidia had 2 years to do it, and simply chose not to because they decided it was so late to do it.
There's only so much I can say, and I'm not sure I should have written my previous post.

As for Sony & MS not working with NV, I heard it was more about costs than anything else, including fees for a shrink and such, that didn't please either MS or Sony.
 
Sources on specific dates? Was it a year? 500 days? 200 days?

"So late" is so generic that it really doesn't mean anything. Maybe nvidia had 2 years to do it, and simply chose not to because they decided it was so late to do it.
Make it 'too late'. Sony went to nVidia and all nVidia had was their bleeding-edge, very expensive G80, due out at the same time as PS3, or the existing G70, which was working and could be tweaked. What would a PS3 with G80 have cost? Not really feasible. So it'd need to be a custom part on the latest design. It's not as though nVidia are incapable, because they gave MS NV2A for XB. Ergo we've two possibilities. 1) nVidia couldn't be arsed to provide anything better, and Sony were good with that, not negotiating a better deal. What would the reasons for that be? That Sony left it too late and weren't in a strong position to negotiate? 2) nVidia didn't have enough time. What would the timeline be on that? Whatever it is, that answers your 'how late' question.
 
I said nothing like that. Cell's memory architecture is not one that lends itself to unified memory architectures. The XDR memory sub-system is a feature of Cell. The decision to go with Cell, and the baggage it carries, is on Sony and not Nvidia.
XDR was a bad decision; IBM ultimately moved to DDR2 with a later revision of the Cell. The main PS3 failure, for the system as a whole, is neither the GPU nor, in hindsight, Cell: it is the memory amount and type. As every PC shows, you need a lot more RAM than VRAM (the gap is no longer as big now, with really high resolutions and asset quality), yet the PS3 went with 2 tiny pools of fast memory (256 MB XDR + 256 MB GDDR3).
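
To make the "difficult to handle" part concrete, here is a minimal C sketch of the placement decision a split-pool design forces on an engine for every resource. This is not PS3 SDK code: the pool names and pool_alloc() are hypothetical stand-ins, only the 256 MB + 256 MB split is the real layout.

```c
/* Minimal sketch, assuming a hypothetical engine allocator: on a
 * split-memory design every resource needs an explicit placement
 * decision; on a unified-memory console the distinction does not exist. */
#include <stddef.h>
#include <stdlib.h>

typedef enum {
    POOL_MAIN_XDR,     /* 256 MB system memory next to Cell */
    POOL_VIDEO_GDDR3   /* 256 MB video memory next to RSX   */
} mem_pool;

/* Stub so the sketch is self-contained; a real engine would carve
 * from the chosen pool instead of just calling malloc(). */
static void *pool_alloc(mem_pool pool, size_t size)
{
    (void)pool;
    return malloc(size);
}

void *alloc_buffer(size_t size, int gpu_reads_it, int spus_touch_it)
{
    /* A GPU-only buffer wants to live in video memory; one the SPEs
     * chew on wants to live in XDR. A buffer both sides touch forces
     * a compromise or a copy -- per-resource bookkeeping a unified
     * pool never asks for. */
    if (gpu_reads_it && !spus_touch_it)
        return pool_alloc(POOL_VIDEO_GDDR3, size);
    return pool_alloc(POOL_MAIN_XDR, size);
}
```

The X360's single 512 MB pool simply never poses that question.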

So you're perception of the deal is that Nvidia just gave Sony a chip and Sony didn't know what it's capabilities were during negotiations? That none of this is on Sony, they were the were innocents in this, tricked by evil Nvidia?
Imho there is no argument for attacking Nvidia wrt the PS3. As for the Xbox, MSFT signed a bad contract; actually, Nvidia was brave to face bullying from a way bigger company asking for a rebate not accounted for in the existing contracts.
If Nvidia had failed any of its contractual obligations there would have been lawsuits, and there have been lawsuits, for them and for plenty of companies that have nothing to do with gaming, some won, some lost. Now it is pretty clear where the pretty specific "Nvidia is a bad business partner" narrative comes from, when there is no factual evidence of broken contractual obligations in the aforementioned case.
It is imho something other than a rational conversation, so I won't engage further... but I agree with you ;)
 
Did they even approach ATi?
Even if they did, who says ATI would have been able to provide something Sony would accept? Maybe Microsoft had some sort of exclusivity deal.
 
XDR was a bad decision; IBM ultimately moved to DDR2 with a later revision of the Cell.
When PS3 was being taped out, XDR was a smart decision and it suited the low-latency burst access of the SPEs. Cell would not have been in a happy marriage with DDR2, and DDR2 would have meant a much more complex PCB.
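
For context, the SPE programming model is built around software-managed DMA into each SPE's 256 KB local store, which is exactly the bursty access pattern low-latency XDR flatters. Below is a rough sketch of the canonical double-buffered streaming loop, using the mfc_* intrinsics from IBM's Cell SDK (spu_mfcio.h); the 16 KB chunk size is the MFC's per-transfer maximum, while process_chunk() is just an illustrative placeholder.

```c
/* Rough sketch of the classic SPE double-buffered streaming loop.
 * mfc_get() and the tag-status calls are spu_mfcio.h intrinsics from
 * IBM's Cell SDK; process_chunk() is an illustrative placeholder. */
#include <spu_mfcio.h>

#define CHUNK 16384  /* 16 KB: the MFC's maximum single DMA transfer */

static char buf[2][CHUNK] __attribute__((aligned(128)));

static void process_chunk(char *data, unsigned n) { /* work goes here */ }

void stream(unsigned long long ea, int nchunks)
{
    int cur = 0;

    /* Kick off the first burst from main (XDR) memory into local store,
     * tagged with the buffer index so we can wait on it selectively. */
    mfc_get(buf[cur], ea, CHUNK, cur, 0, 0);

    for (int i = 0; i < nchunks; i++) {
        int next = cur ^ 1;

        /* Prefetch chunk i+1 into the other buffer while we still own
         * chunk i, hiding the DMA latency behind computation. */
        if (i + 1 < nchunks)
            mfc_get(buf[next], ea + (unsigned long long)(i + 1) * CHUNK,
                    CHUNK, next, 0, 0);

        /* Block only on the current chunk's tag, then process it. The
         * faster those bursts complete, the less the SPU sits idle. */
        mfc_write_tag_mask(1 << cur);
        mfc_read_tag_status_all();
        process_chunk(buf[cur], CHUNK);

        cur = next;
    }
}
```

None of which excuses the tiny pool sizes, but it does explain why XDR looked sensible when Cell was designed.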
 
When PS3 was being taped out, XDR was a smart decision and it suited the low-latency burst access of the SPEs. Cell would not have been in a happy marriage with DDR2, and DDR2 would have meant a much more complex PCB.
Yet it married it. The gain in memory space was worth the loss in SPU throughput; the SPUs were overwhelmingly better at the tasks they were suited for than competing designs anyway. IBM did it for supercomputers; Sony did not need to reach such a big memory space, but they could have bought more main RAM, DDR2 being the standard of the time.
 
Even if they did, who says ATI would have been able to provide something Sony would accept. Maybe Microsoft had some sort of exclusivity deal.
Not saying. Doubt there'd be an exclusivity deal. ATi still had to do some work with Wii.

Just trying to establish whether or not anyone had time to do magic. :]
 
Yet it married it. The gain in memory space was worth the loss in SPU throughput; the SPUs were overwhelmingly better at the tasks they were suited for than competing designs anyway. IBM did it for supercomputers; Sony did not need to reach such a big memory space, but they could have bought more main RAM, DDR2 being the standard of the time.
The XCell 8i, which added support for DDR2 in 2008, required a number of modifications to the processor and the memory controller to do so. That was not an option for Sony in the PS3's timeframe.
 
Not saying. Doubt there'd be an exclusivity deal. ATi still had to do some work with Wii. Just trying to establish whether or not anyone had time to do magic. :]

That's my understanding. Nvidia was almost a last-minute decision. In terms of exclusivity, I was thinking of ATI's unified shader architecture. If Microsoft tied that up, then the only ATI option would have been ATI's previous-generation cores, and were they any better than what became RSX? ¯\_(ツ)_/¯
 
The XCell 8i, which added support for DDR2 in 2008, required a number of modifications to the processor and the memory controller to do so. That was not an option for Sony in the PS3's timeframe.
The Cell was not done right in the first place, that is what I meant. It was a bad trade-off to compromise the memory space for a CPU carrying accelerators. Likewise, 8 vs 6 SPUs was another bad one: it raised cost, made bandwidth even more of an issue, etc.
Anyway, Cerny spoke about a choice between one big, fast memory pool and a big, slow one backed by a tiny, extremely fast one. I don't remember the exact wording, but I believe the PC take on the problem, whereas it was born out of necessity, has had time to evolve and show its worth. We are in an era which requires a beefy amount of VRAM, and I suspect the PC way is the right way, looking at the memory solutions and their associated costs.
Now I guess I'm not rigorous enough in my thinking, as you could get NUMA and a SoC at the same time, like the late 360 revisions. So I should restrict my pov to the memory model, not discrete CPU and GPU vs SoC. I ultimately favor the former, as tying different IPs together in one chip can lock up the path to price reductions. There is also the fact that existing GPUs more often than not target performance levels that are as relevant for PC as for console, which implies that you can use an off-the-shelf part for the GPU and focus your effort on a tinier, lesser chip carrying mostly the CPU cores.
 
Let's agree to disagree about pretty much all your thoughts on Cell. :yep2:
 