Nvidia, Sony, Cell, and RamBus *PTSD Flashbacks*

Discussion in 'Console Industry' started by DSoup, May 17, 2016.

  1. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    10,972
    Likes Received:
    5,795
    Location:
    London, UK
    Blaming Nvidia for PS3's complex architecture is one hell of a stretch.
     
    #1 DSoup, May 17, 2016
    Last edited: May 17, 2016
    Akumajou, rockaman and liolio like this.
  2. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,723
    Likes Received:
    193
    Location:
    Stateless
    The CPU cores are decent and there was not much choice. Now, the system could have used a couple of extra cores. I believe a standard, PC-like NUMA setup would have served them well, especially as they had two chips anyway.
     
  3. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,977
    Likes Received:
    4,566
    You mean it would have been completely impossible to develop an XDR memory controller for their northbridge/GPU and get a UMA overall?
    Sure, there are more entities at fault, but while Microsoft got the first-ever unified shader architecture together with eDRAM from ATi, Nvidia handed Sony a pre-existing and rather old G71 with halved ROPs (because there was only a 128-bit bus to the memory regardless), fixed up a FlexIO link to Cell and called it a day.

    Prediction: people will accuse me of saying that implementing an XDR memory controller for an existing GPU core is super easy, and I'll say beforehand that I stated no such thing, ever. It might have been hard, very hard. Just not impossible.
     
    BRiT likes this.
  4. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    10,972
    Likes Received:
    5,795
    Location:
    London, UK
    I said nothing like that. Cell's memory architecture is not one that lends itself to unified memory architectures. The XDR memory sub-system is a feature of Cell. The decision to go with Cell, and the baggage it carries, is on Sony and not Nvidia.

    So you're perception of the deal is that Nvidia just gave Sony a chip and Sony didn't know what its capabilities were during negotiations? That none of this is on Sony, that they were innocents in this, tricked by evil Nvidia?
     
    Akumajou, xpea and liolio like this.
  5. Nisaaru

    Regular

    Joined:
    Jan 19, 2013
    Messages:
    878
    Likes Received:
    196
    The Nvidia part was a late hotfix to Sony's PS3 architecture because their own rasterizer chip together with Cell didn't perform. What do you think Cell's vector engine was about? Just a fancy co-processor? :)

    I'm sure if they hadn't sunk so much money into the whole Cell adventure (including the fab) they would have shelved the whole design the moment they realized that they had flunked it. But careers and money were invested in it, so they rode the pale horse...
     
  6. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,977
    Likes Received:
    4,566
    1 - *your
    2 - My only perception is the same as everyone else's: the PS3 came out later and brought a substantially less advanced, less flexible and worse-performing GPU than the X360.


    Regardless, the PS3 got an Nvidia GPU. The PS4 did not.

    That's what most people say, yes.
    But how late? The X360 went from the start of development to release in less than three years. That was really fast. So exactly how late was Nvidia to the party, for it to be excused for providing an old GPU architecture to Sony?
     
  7. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,986
    Likes Received:
    847
    Location:
    Planet Earth.
    So late that Sony needed a working GPU immediately and couldn't wait for any tweaking/modifications. That's how late it was.
     
  8. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,977
    Likes Received:
    4,566
    Sources on specific dates? Was it a year? 500 days? 200 days?

    "So late" is so generic that it really doesn't mean anything. Maybe nvidia had 2 years to do it, and simply chose not to because they decided it was so late to do it.
     
  9. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,986
    Likes Received:
    847
    Location:
    Planet Earth.
    There's only so much I can say, and I'm not sure I should have written my previous post.

    As for Sony & MS not working with NV, I heard it was more about costs than anything else (including fees for a shrink and such), which didn't please either MS or Sony.
     
    xpea likes this.
  10. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,714
    Likes Received:
    11,167
    Location:
    Under my bridge
    Make it 'too late'. Sony went to nVidia and all nVidia had was their bleeding-edge, very expensive G80, due out at the same time as PS3, or the existing G70, which was working and could be tweaked. What would a PS3 with G80 have cost? Not really feasible. So it'd need to be a custom part on the latest design. It's not as though nVidia are incapable, because they gave MS NV2A for XB. Ergo we've two possibilities:

    1) nVidia couldn't be arsed to provide anything better and Sony were good with that, not negotiating a better deal. What would the reasons for that be? That Sony left it too late and weren't in a strong position to negotiate?

    2) nVidia didn't have enough time. What would the timeline be on that?

    Whatever it is, that answers your 'how late' question.
     
    RancidLunchmeat likes this.
  11. AlBran

    AlBran Ferro-Fibrous
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,720
    Likes Received:
    5,815
    Location:
    ಠ_ಠ
    Did they even approach ATi?
     
  12. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,723
    Likes Received:
    193
    Location:
    Stateless
    XDR was a bad decision; ultimately IBM moved to DDR with a later revision of the Cell. The main PS3 failure is not the GPU, nor Cell, a posteriori; for the system as a whole it is the memory amount and type. As every PC shows, you need a lot more RAM than VRAM (the gap is no longer as big now, with really high resolutions and asset quality), yet the PS3 went with two tiny pools of fast memory.

    Imho there is no argument for attacking Nvidia wrt the PS3. As for the Xbox, MSFT signed a bad contract; actually, Nvidia was brave to face bullying from a far bigger company asking for a rebate not accounted for in the existing contracts.
    If Nvidia had failed any of its contractual obligations there would have been lawsuits, and there have been lawsuits, for them and plenty of companies that have nothing to do with gaming, some won, some lost. Now it is pretty clear where the rather specific "Nvidia is a bad business partner" line comes from, when there is no factual evidence of broken contractual obligations in the aforementioned case.
    It is, imho, something other than a rational conversation, so I won't engage further... but I agree with you ;)
     
  13. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    10,972
    Likes Received:
    5,795
    Location:
    London, UK
    Even if they did, who says ATI would have been able to provide something Sony would accept? Maybe Microsoft had some sort of exclusivity deal.
     
  14. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    10,972
    Likes Received:
    5,795
    Location:
    London, UK
    When PS3 was being taped out, XDR was a smart decision and it suited the low-latency burst access of the SPEs. Cell would not have been in a happy marriage with DDR2, and DDR2 would have meant a much more complex PCB.
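
    For context on what 'burst access' means here: the SPEs have no cache and reach main memory only through explicit DMA into their 256 KiB local stores, i.e. exactly the kind of large sequential transfer XDR handles well. A minimal SPU-side sketch using the Cell SDK's spu_mfcio.h intrinsics (fetch_chunk, the tag, and the buffer size are illustrative, not from any shipped code):

        #include <spu_mfcio.h>

        #define TAG 0
        /* 16 KiB is the largest single MFC transfer; 128-byte alignment
           gives the best DMA performance. */
        static char local_buf[16384] __attribute__((aligned(128)));

        /* Pull one chunk from an effective address in XDR main memory
           into this SPE's local store. */
        void fetch_chunk(unsigned long long ea)
        {
            /* Queue a 16 KiB burst transfer, tagged so we can wait on it. */
            mfc_get(local_buf, ea, sizeof(local_buf), TAG, 0, 0);

            /* Block until the tagged transfer completes; real code would
               double-buffer so the next burst overlaps computation. */
            mfc_write_tag_mask(1 << TAG);
            mfc_read_tag_status_all();
        }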
     
    Akumajou and temesgen like this.
  15. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,723
    Likes Received:
    193
    Location:
    Stateless
    Yet it married it. The gain in memory space was worth the loss in SPU throughput; the SPUs were overwhelmingly better at the tasks they were suited for than competing designs. IBM did it for supercomputers; Sony did not need to reach such a big memory space, but they could have bought more main RAM, DDR2 being the standard of the time.
     
  16. AlBran

    AlBran Ferro-Fibrous
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,720
    Likes Received:
    5,815
    Location:
    ಠ_ಠ
    Not saying. Doubt there'd be an exclusivity deal. ATi still had to do some work with Wii.

    Just trying to establish whether or not anyone had time to do magic. :]
     
    rockaman likes this.
  17. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    10,972
    Likes Received:
    5,795
    Location:
    London, UK
    The XCell 8i, which added support for DDR2 in 2008, needed a number of modifications to the processor and the memory controller to do it. That was not an option for Sony.
     
  18. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    10,972
    Likes Received:
    5,795
    Location:
    London, UK
    That's my understanding. Nvidia was almost a last-minute decision. In terms of exclusivity, I was thinking of ATI's unified shader architecture. If Microsoft tied that up, then the only ATI option would have been ATI's previous-generation cores, and were they any better than what became RSX? ¯\_(ツ)_/¯
     
    rockaman likes this.
  19. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,723
    Likes Received:
    193
    Location:
    Stateless
    The Cell was not done right in the first place, that is what I meant. Compromising the memory space for a CPU embarking accelerators was a bad arbitration. Likewise, 8 vs 6 SPUs was another bad one: it raised cost, made bandwidth even more of an issue, etc.
    Anyway, Cerny spoke about a choice between a big, fast memory pool and a big, slow one backed by a tiny, extremely fast one. I don't remember the exact wording, but I believe the PC take on the problem, whereas it was born out of necessity, has had its time to evolve and show its worth. We are in an era which requires a beefy amount of VRAM, and I suspect the PC way is the right way, looking at the memory solutions and their associated costs.
    Now I guess I'm not rigorous enough in my thinking, as you could get NUMA and a SoC at the time, like the late 360 revisions. So I should restrict my pov to the memory model and not discrete CPU and GPU vs SoC. I ultimately favor the former, as having different IPs on one chip is self-locking the path to price reduction. There is also the fact that existing GPUs more often than not target performance that is as relevant for PC as for console, which implies that you can use an off-the-shelf part for the GPU and focus your effort on a tinier, lesser chip embarking mostly the CPU cores.
     
    #19 liolio, May 19, 2016
    Last edited: May 19, 2016
    rockaman likes this.
  20. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    10,972
    Likes Received:
    5,795
    Location:
    London, UK
    Let's agree to disagree about pretty much all your thoughts on Cell. :yep2:
     