Next gen RAM choices *spawn

Discussion in 'Console Technology' started by aaronspink, Jul 1, 2012.

  1. upnorthsox

    Veteran

    Joined:
    May 7, 2008
    Messages:
    2,106
    Likes Received:
    380

    Different eras though in bandwidth, sizes, and memory prices. For the PS2 and GC, their edram was their ram; the 360 was a bit of a tweener where MS wasn't sure if the bandwidth to main memory was enough. A PS4 with 2-4GB at 192GB/s of bandwidth makes no sense to complicate the memory structure with edram. A 720 with 8GB of DDR4 at 77GB/s of bandwidth is again a bit of a tweener bandwidth-wise, but that may be enough that edram is more of a hindrance than a help.

    This coming gen is really tough for mem choices. All have risks and none is a slam dunk.
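    A quick back-of-the-envelope sketch of where those bandwidth figures come from (peak bandwidth = bus width in bytes × effective data rate; the 6 GT/s GDDR5 and 2.4 GT/s DDR4 rates are assumptions picked to match the rumoured numbers above):

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gt_s: float) -> float:
    """Peak theoretical bandwidth in GB/s: bytes per transfer times
    effective transfer rate in giga-transfers per second."""
    return bus_width_bits / 8 * data_rate_gt_s

# 256-bit GDDR5 at an assumed 6 GT/s -> the rumoured PS4 figure
print(bandwidth_gb_s(256, 6.0))  # 192.0 GB/s
# 256-bit DDR4 at an assumed 2.4 GT/s -> roughly the rumoured 720 figure
print(bandwidth_gb_s(256, 2.4))  # 76.8 GB/s
```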
     
  2. french toast

    Veteran

    Joined:
    Jan 5, 2012
    Messages:
    1,667
    Likes Received:
    9
    Location:
    Leicestershire - England
    If PS4 is only 2GB... then that might not be the best solution, however if it's 4GB with a massive 192GB/s then that will easily be the best imo.
     
  3. upnorthsox

    Veteran

    Joined:
    May 7, 2008
    Messages:
    2,106
    Likes Received:
    380

    A quick scan of AMD and Nvidia cards shows a 256-bit bus on the 3870 @ 192mm^2. It might go lower, but @ 166mm^2 other cards are on a 128-bit bus.

    I don't think anyone should be counting on two shrinks. 14nm will need double or triple patterning, finFETs, and possibly EUV. That's not a shrink but pretty much a complete redesign.
     
  4. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,411
    Location:
    Wrong thread
    I think the PS2 had "normal" non-embedded ram for its main memory, with only the GS having 4MB of embedded ram, but absolutely, things change. With the 360 there was also the issue of possibly launching with 256MB, and the memory benefits of resolving MSAA as you copy off the edram, so that may have factored into things a little too.

    Yeah, definitely. With additional background OS tasks, even the role of the system itself is less clear. 8GB of DDR4 + 32MB of EDRAM would probably trump 2GB of GDDR5, but there are so many other questions, like bus width, OS reservations, and possibly accessing the edram with the CPU.

    I'm intrigued by the idea of 12GB in Durango dev kits, but 8 planned for the final system. Does 12GB in dev kits (as opposed to 16) indicate split memory pools with one pool doubled to 8GB, or is it 4 + 8 in a single pool, with MS just trying to shave a few dollars off early dev kits?

    Even without two full nodes, 128-bit buses might work if you had a separate CPU and GPU with split memory pools. Say 4GB low latency DDR4 for the eight core CPU, 4GB GDDR5 for the GPU and a small amount of edram on the GPU. Perhaps 32nm IBM for the GPU and 28nm GF for the 8 core CPU. Each chip could shrink at its own pace and not be limited by a large bus. That's two chips, two off chip memory buses and three memory pools total though, so maybe that's getting a bit messy.
     
  5. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,296
    Location:
    Helsinki, Finland
    IBM PowerPC A2 has 32 MB of EDRAM included in the SoC (die photo: http://www.power.org/events/2010_ISSCC/Wire_Speed_Presentation_5.5_-_Final4.pdf). It's in mass production (used already in many supercomputers). The chip is fabbed at 45 nm (same process as the newest XCGPU). So including the EDRAM would be definitely possible now... at least if you can have as good margins as you do with supercomputers :)
     
  6. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,411
    Location:
    Wrong thread
    Thanks! So cost must be the issue then.

    I'm guessing that either it's cheaper to keep making the edram daughter die on 65nm, cheaper to make the XCGPU with another fab who can't do the edram (isn't it someone other than IBM who actually manufactures the current XCGPU?) or the cost of engineering a suitable on-chip emulator for the current off-chip bus (as they had to do for the CPU FSB) is off-putting. Or maybe it's some of all three.

    I'm still secretly hoping that the WiiU uses a CGPU with embedded memory, despite all the rumours suggesting otherwise. IBM being able to do it and being the only named fab so far means I've not totally given up yet!
     
  7. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    So basically, EDRAM is good if you want your system to suck. Thx.

    Where are you getting all these conjectures? I still don't know why it's OK for one (the poorer) manufacturer to not care about costs, but the other can't do x, y, and z because of them... Also, so you're saying graphics are all that matters (since apparently, if one has them, cost reductions are not needed for that player)?

    I also fail to see why it's different an iota from the 360, since that was a very bleeding edge performance device at the time. More so than PS4 will be, almost certainly (as Xenos was a top tier GPU of its era, while Pitcairn is more mid range)!


    Maybe, but to be honest we're just theorizing. Nobody but MS and Sony knows the true intricacies. I just think it's a pretty reasonable theory that EDRAM was a net performance and cost loss for 360. I'm also open to the possibility that I'm wrong, but I'll never know.

    I'm not sure what you're veering off to, but all we know is PS3 has reduced price more and is currently at price parity (depending how you look at it). Maybe the $129 Xbox doesn't exist because the EDRAM is too expensive? TBH we're just going in circles here.


    Apparently. But those are rumored specs and you seem to have an aversion to those :razz:


    There are pluses and minuses; I tend to think the minuses of EDRAM outweigh the pluses, but I'm no expert (BTW, don't we have an EDRAM thread?). Also I don't see why dual RAM pools mean more chips.

    I like this sort of speculation. I cannot wait to learn about the final Durango system design to see exactly what they've done. Although Sony's design seems quite traditional, almost boring (but that doesn't mean low performance!).

    It's always been how they tackle the bandwidth that's interesting; Sony's solution seems to be just a bog standard 256-bit bus. MS=?
     
  8. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,724
    Likes Received:
    195
    Location:
    Stateless
    Well wrt the 360 and the edram, I can't help but think of the comment from a member here that at some point MS was not heading for HD.
    At SD resolution the size of the edram would have been more than enough in a lot of cases.
    MS decided otherwise and went HD with twice the main memory and the same daughter die, which ultimately allowed them to extend their product life cycle.

    I'm not sure how the 360 would have performed with a 256-bit bus and GDDR3 or XDR; 44GB/s is not that much bandwidth. Now if the trade-off had been 1GB of GDDR3 (if doable, and I'm not sure it was) vs 512MB, it could have proved interesting. As few games use virtual texturing, and it only arrived at the end of this gen, the extra memory may have yielded MS a greater (more easily perceived) competitive advantage than better handling of transparencies. Definitely, better textures are pretty obvious to anyone.

    But if it's 512MB (256-bit bus) vs 512MB + edram, I'm not sure MS made the wrong decision.
    EDIT
    Not to mention that edram saves quite some memory.
     
    #68 liolio, Jul 4, 2012
    Last edited by a moderator: Jul 4, 2012
  9. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,411
    Location:
    Wrong thread
    no u? :eek:

    Who said Sony don't care about costs? Who said graphics are all that matters?

    Pretty strong rumours indicate that AMD got the contract early this year (February), and that they are handling everything processor related. Sony's R&D will be slashed compared to Cell and PS3 and there probably wouldn't be time to work with IBM even if Sony wanted to throw lots of money their way. Even a large APU would be cheaper than RSX + Cell.

    You can't see why it's an iota different from the 360? From the top of my head I'd pick:

    - The requirement of sampling from buffers making the 360's edram solution outdated
    - Using a (larger) APU instead of separate (smaller) CPU and GPU
    - Sony cutting back R&D
    - Sony not having the time to work on a customised part like MS did
    - No IBM involvement
    - Process shrinks and their cost benefits slowing down

    ... as being notable differences. There are probably other differences too.

    I think the balance of evidence shows it was a net win for the 360. You have talented developers flat out stating on this board that the edram is crucial to the 360 performing the way it does, and you have measurements showing that MS shrank the GPU way below the size demanded by a 256-bit bus (twice) in the time in which they were combating RROD and bringing the 360 into profitability. We also know that profitability through shrinking was MS's business plan from the beginning.

    We also know that MS could make the current XCGPU as a single IBM 45nm chip if they wanted (and Nintendo are doing this for WiiU), but they chose to keep making part of it on an older and undoubtedly cheaper node, at least for the moment.

    I never said that dual ram pools mean more chips - they don't have to. A 256-bit bus would have demanded double the minimum number of memory chips for the 360 though, which would have meant a larger and more complex motherboard and worse $/MB for the platform in the long run.
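    The chip-count point can be sketched with simple arithmetic; assuming 32-bit-wide GDDR3/GDDR5 devices (a common organization for these parts), the minimum chip count follows directly from the bus width:

```python
def min_chips(bus_width_bits: int, chip_width_bits: int = 32) -> int:
    """Minimum number of DRAM devices needed to populate a bus,
    assuming each device drives chip_width_bits of it."""
    return bus_width_bits // chip_width_bits

print(min_chips(128))  # 4 devices minimum on a 128-bit bus
print(min_chips(256))  # 8 devices minimum - double the parts, board area and routing
```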

    I'm looking forward to seeing what Nintendo do too!

    Sony moving from the exotic PS3 to what might appear to be a pretty capable PC graphics chip with a couple of AMD CPU modules bolted on would be a huge change, but it might put them on the right side of MS wrt PC multiplatform games, which at a stroke would give their games strength in numbers. If they were the only vendor offering fully unified memory and practically unlimited buffer sizes that would be a win too. I hope they go for 4GB of ram.
     
  10. french toast

    Veteran

    Joined:
    Jan 5, 2012
    Messages:
    1,667
    Likes Received:
    9
    Location:
    Leicestershire - England
    Wouldn't a split pool also need 2 memory controllers?... That can't be good either, can it?
     
  11. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,411
    Location:
    Wrong thread
    Yeah, I think it might have been ERP, pointing out that SD with 4xMSAA would fit perfectly in the edram. They definitely had one eye on HD resolutions with the hardware tiling support but it looks like focus shifted rapidly towards HD in 2004/2005. The 360 has survived and prospered despite some pretty huge changes in MS strategy and changes in the industry itself, some starting back before the 360 was even released.
     
  12. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    You basically said as long as Sony is going for a high performance product, they won't have to worry about shrinks or reducing costs. What else would be your logic there? Your own quotes.


    I don't understand this at all; it's not reconcilable. How would the benefits not be there? Sony does not want to shrink but it's crucial for MS? Sony does not need to cost reduce but MS does? Sony can wave a magic wand and be above a "decade long cost reduction slog" against their competitor, but it's crucial MS not use a 256-bit bus? You're not being consistent at all.

    I took you as saying "As long as Sony takes the graphics high road, they don't need to compete on price". That is where I got the graphics statement from. If you are arguing from some other point, please feel free to explain it...


    Again, none of these points apply to MS? Why so much double standard? Why does Sony not have time, BTW? They've had six years.

    I'm just not getting your points at all. Why can you not see the inconsistency in what you're writing? You're basically laying out giant arguments for why EDRAM is necessary for MS, but then you turn around and completely ignore those arguments for Sony. For example, you said it is necessary for MS to use a <256-bit bus so they can shrink. Why is it not necessary for Sony?



    We have sebbbi liking the EDRAM, and Humus (?) wrote some devastatingly critical posts against it, which at the time everybody seemed to accept without objection... we don't have a whole ton of devs commenting about such things here at all. I'm sure EDRAM is better than none in the 360 setup; the question is whether it was better than alternative setups, namely a PS3-style system, with the EDRAM budget allocated to other things like more RAM or shaders.

    And?

    I wasn't arguing for a 256-bit bus in the 360 and never have, rather a dual pool 128-bit bus like the PS3 arrangement, and crucially, the EDRAM budget being allocated elsewhere (as always, one can never look at system design in a vacuum, just as I always point out the question is never "do you want backwards compatibility in a console?", to which everybody answers "yes!" - the real question is "do you want BC, or that cost instead allocated to more RAM/CPU/GPU?"). Forgive me if that wasn't clear.
     
  13. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,411
    Location:
    Wrong thread
    I didn't say that at all, you're just making that up.

    My quotes show I clearly said nothing of the sort.

    - It's great that the PS4 APU needs lots of bandwidth, because it means it's high performance.
    - If the benefits of EDRAM won't be there for Sony, they shouldn't use it.
    - If they don't intend to get into a decade long cost reduction slog against MS, then a bus limiting shrinking won't be the factor it would have been.

    There is no contradiction.

    You're doing a funny mix of ignoring key points and re-interpreting others, then trying to create false dichotomies. That's not my fault and I can't fix it.

    I don't know what MS's plans are. A different strategy might result in different factors being more or less important. This generation, cost reduction through shrinking was crucial to the success of MS, but not to the Wii, which cost reduced less aggressively. The magic wands are your idea.

    Then you dun messed up big son!

    I've explained several times now, but each time you do that funny thing I pointed out above.

    Oh wow! You said there wasn't an iota of difference between the 360 and PS4 so I responded with several possibly important differences and this is the response?

    I can't have a discussion under these conditions - you keep switching the terms of the exchanges and dropping important points when they don't fit your narrative.

    No I'm not! Oh my wow.

    It *was* necessary for MS with the 360, because that was their business plan! I don't know what MS are going to do this time! If they want smaller chips than are required for a 256-bit bus (either at launch, soon after, or years later) then they're going to have to either limit performance or use EDRAM. I'd assumed another 360-like approach, but I don't know this.

    Edit: I also pointed out why it may not be necessary for Sony to shrink beyond the limits of a 256-bit bus, but you have ignored this and simply restated the question. There is literally nothing I can do about you refusing to acknowledge an answer (at all, in any way), except to not keep answering it.

    Two 128-bit GDDR3 buses would require twice as many memory chips as the 360S uses, and all the other stuff that goes along with that.

    We aren't getting anywhere, we should probably call it a day and agree to disagree about ... most things. ;)
     
    #73 function, Jul 4, 2012
    Last edited by a moderator: Jul 4, 2012
  14. Safi

    Newcomer

    Joined:
    Jul 8, 2012
    Messages:
    3
    Likes Received:
    0

    What if the 8GB of RAM sits behind the GDDR5, with the GDDR5 acting as a fast cache?
    example:

    APU===GDDR5 UMA===DDR3--HDD

    The DDR3 is cheaper than an SSD and faster. If Microsoft/Sony fund a custom memory design you'll have a large pool of memory and fast RAM.

    Does that not solve the memory issue?

    I'm no expert :roll:

    Sorry for the double post, I messed up the quote. Can't edit.
     
  15. Acert93

    Acert93 Artist formerly known as Acert93
    Legend

    Joined:
    Dec 9, 2004
    Messages:
    7,782
    Likes Received:
    162
    Location:
    Seattle
    Option #2. Go with a low end GPU and bandwidth isn't an issue. e.g. Cape Verde has like 70GB/s.

    Look at some of the rumors: (1) 1TFLOPs (2) less GPU than the PS4 (<1.8TFLOPs) (3) 6670 class GPU via IGN (4) MS leak showing 4-6x faster with a very low ALU count.

    If these rumors are true, a wide DDR4 bus may be more than sufficient to feed such a GPU, especially if 720p is the target resolution.

    Something to chew on.
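    One rough way to sanity-check that balance is bytes of bandwidth available per FLOP; a sketch using the rumoured round numbers from this thread (~1 TFLOPs GPU, ~70-77 GB/s of memory bandwidth - the metric is just an illustration, not a hard rule):

```python
def bytes_per_flop(bandwidth_gb_s: float, tflops: float) -> float:
    """Rough balance metric: bytes of memory traffic available per FLOP."""
    return bandwidth_gb_s / (tflops * 1000)

# Cape Verde class card: ~72 GB/s feeding ~1.0 TFLOPs
print(round(bytes_per_flop(72, 1.0), 3))  # 0.072
# A wide DDR4 setup at ~77 GB/s feeding a similar ~1 TFLOPs GPU
print(round(bytes_per_flop(77, 1.0), 3))  # 0.077 - a comparable balance
```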
     
  16. french toast

    Veteran

    Joined:
    Jan 5, 2012
    Messages:
    1,667
    Likes Received:
    9
    Location:
    Leicestershire - England
    Let's hope you are wrong...that sounds more like a good fit for the wuu.
     
  17. ultragpu

    Banned

    Joined:
    Apr 21, 2004
    Messages:
    6,242
    Likes Received:
    2,306
    Location:
    Australia
    I wish the WiiU had 8GB of DDR4 and a 1TF GPU; then its games wouldn't look below the likes of GOWA or TLOU. But all in all, the GPU spec should be much higher than a 6670 for the 720.
     
  18. Prophecy2k

    Veteran

    Joined:
    Dec 17, 2007
    Messages:
    2,468
    Likes Received:
    379
    Location:
    The land that time forgot
    You'd like to hope so, wouldn't you? I guess if the leaked doc holds any merit, the proposed dual-APU setup (one for games and one for BC/services) would limit their GPU options, given an overall fixed silicon budget and thus eventual retail price point.
     
  19. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    Aren't you the guy chewing me out for suggesting Mars/Cape Verde? Lol.

    Now you're getting it...

    And yeah, I've already brought up that 256-bit DDR4 could feed an HD7770. That would require MS to not waste money on EDRAM though, and I just can't see them doing that even if it's unnecessary, LOL. They'll put EDRAM in just to waste people's time.
     
  20. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,724
    Likes Received:
    195
    Location:
    Stateless
    Aren't the HD77xx dies too tiny for a 256-bit bus?
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.