The pros and cons of eDRAM/ESRAM in next-gen

Discussion in 'Console Technology' started by Shifty Geezer, Jan 8, 2012.

  1. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
    The existence of the crossbar that routes data to eSRAM block controllers or memory channels was already disclosed.
    In part, this enables a mostly transparent mapping of accesses to the eSRAM or main memory based on properties assigned via page table after the initial setup.
    Upping the peak numbers in terms of the number of eSRAM blocks and memory clients does impact the GPU's internal crossbar, and then the on-die interconnect that routes accesses to the necessary endpoints.
    AMD's APU read and write paths are very wide for this class of chip.
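
    To illustrate the mapping described above — a purely hypothetical sketch, not the actual Durango hardware or any disclosed API; the page size, table layout, and names are all illustrative assumptions — a crossbar could steer each access by a flag held in the page table entry:

```python
PAGE_SIZE = 4096  # assumed mapping granularity

# Page number -> True if that page was assigned to eSRAM at setup time.
page_table = {
    0x0: True,   # e.g. a render target placed in eSRAM
    0x1: False,  # an ordinary DDR3-backed page
}

def route_access(address):
    """Return which endpoint the crossbar would send this access to."""
    page = address // PAGE_SIZE
    in_esram = page_table.get(page, False)  # unmapped pages default to DRAM
    return "eSRAM block controller" if in_esram else "DDR3 memory channel"

print(route_access(0x0100))  # falls in page 0 -> eSRAM block controller
print(route_access(0x1100))  # falls in page 1 -> DDR3 memory channel
```

    The point is that once pages are tagged at setup, software issues ordinary loads and stores and the routing happens transparently.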

    There was no realistic/worthwhile eDRAM manufacturer, and no mass-level production of a stacked large SRAM or DRAM chip for an SOC of this size and TDP. This console generation came a few years early for stacked DRAM/2.5D, which isn't quite the same thing but appears more tractable than getting high-power chips into a stack configuration.

    The PS Vita gets away with a stacked WideIO interface, but assuming Durango is ~100W TDP, there's a good chance that stacked solution burns as much power at peak as, or less than, the larger chip does at deepest idle, given that the Vita TV is rated at less than 3W max as a total unit.
     
  2. Rockster

    Regular

    Joined:
    Nov 5, 2003
    Messages:
    973
    Likes Received:
    129
    Location:
    On my rock
    AMD's GPU architectures are designed to be scalable up and down. Do you think they spend 3-5 years developing multiple variations within the same family to meet various price points? Of course not. And I never said the design wasn't good, just under-specced based on what we know now, and that they should have aimed higher. ROP/CU counts and ESRAM size could definitely have been adjusted during development without altering the design investment, and would not have needed to be set in stone from the outset.

    Clearly it's not something that can be changed as late as system RAM modifications, but one year prior to planned tape-out isn't unreasonable.
     
  3. Lalaland

    Regular

    Joined:
    Feb 24, 2013
    Messages:
    864
    Likes Received:
    693
    I don't see space for altering the design one way or the other without increasing the already formidable size of the chip, and that would start to make the chip uneconomical to manufacture. Sony was rumoured to have been planning 4GB of GDDR5 for ages, so I doubt MS felt uncomfortable with their choice of 8GB DDR3 + ESRAM. Time has been kind to Sony, as all signs point to 8GB of GDDR5 being a late 2012/early 2013 development, which left MS with no time to redesign unless they wanted to risk giving Sony a 360-esque head start in the market.
     
  4. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,406
    Location:
    Wrong thread
    You are conflating time to manufacture with time to develop.

    This is wrong.

    Under specced is determined - in the real world - by market performance.

    It is uncertain how much of the relative PS4/Bone performance in the market is due to "specs". Even on B3D, many of the most vocal "spec" cheerleaders haven't got a fucking clue what they're looking at when it's in action.

    You have absolutely no idea how altering significant aspects of the design during development would have affected costs or - crucially - time to deliver.

    What are you basing that on? And when do you think tape out was?
     
  5. Rockster

    Regular

    Joined:
    Nov 5, 2003
    Messages:
    973
    Likes Received:
    129
    Location:
    On my rock
    Forgive me. You are right. Of course it's impossible for the design to be anything other than what it is. My mistake.
     
  6. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,406
    Location:
    Wrong thread
    Because that is the only possible alternative to sprinkling faery dust over the wafers, right?
     
  7. MrFox

    MrFox Deludedly Fantastic
    Legend

    Joined:
    Jan 7, 2012
    Messages:
    6,488
    Likes Received:
    5,996
    If I understand correctly from the last few years of leaks, and more recent interviews about their design goals... there didn't seem to be any real deliberation between GDDR5 and DDR3. It's as if MS never considered GDDR5, and Sony never considered DDR3.

    It looks like their respective decisions about main memory were made very early. Sony did say they considered an internal SRAM buffer, but it was never a compromise against GDDR5; it was a compromise of bus width (and interestingly, that would have locked them to 4GB max, so they truly expected 4Gb chips to be ready when they planned it). If we follow the leaks, MS was planning DDR3/4 and was targeting 4GB early on, so it wasn't as if the entire reason for DDR3 was to reach 8GB.
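
    The 4GB ceiling mentioned above falls out of simple arithmetic. A rough sketch, assuming a 256-bit bus and 32-bit GDDR5 devices in clamshell mode (the exact configuration is an assumption for illustration):

```python
def max_capacity_gb(bus_bits, chip_bits, chip_gbit, clamshell=True):
    """Max memory reachable on a fixed-width bus with a given chip density."""
    chips = (bus_bits // chip_bits) * (2 if clamshell else 1)
    return chips * chip_gbit / 8  # gigabits -> gigabytes

# With 2Gb GDDR5 chips: 16 chips x 2Gb = 32Gb = 4GB, the cap in question.
print(max_capacity_gb(256, 32, 2))  # 4.0

# 4Gb chips arriving in time doubled that to 8GB on the same bus width.
print(max_capacity_gb(256, 32, 4))  # 8.0
```

    Once the bus width is fixed in silicon, the only way to more capacity is denser chips, which is why the timing of 4Gb GDDR5 mattered so much.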

    I don't think the 4Gb GDDR5 chips being ready on time was a surprise to either company. Perhaps the surprise was the price point, which had been very volatile for the last three years, and it was a huge risk. Many people here who are very well informed about it were definitely surprised, though. The cost estimates in the "prediction" thread show this very clearly; there's a large discrepancy compared to the current cost estimates from iSuppli and others.
     
  8. Rockster

    Regular

    Joined:
    Nov 5, 2003
    Messages:
    973
    Likes Received:
    129
    Location:
    On my rock
    Can't believe I'm letting myself get dragged into this. But what the heck, it's a new year.

    Fairy dust is always an option. Alternatively — which is all I ever suggested — someone could have stepped up and said let's go big or go home, and targeted a more robust spec. Oh no, the horror. I'm sure their Virtuoso platform would have melted under the pressure, and all the RTL work been done for naught, since clearly it would have called for a complete rewrite. I'm not trying to trivialize, but don't overcomplicate things either. It was a matter of choice where they landed, not a technical limitation.

    Read any post I have made. Never said it was simple. Never said it could be slapped on at the last minute. Only that hindsight being 20/20, they could have and IMO should have slightly increased their size and power budgets to avoid the position they are in now. The End.
     
  9. pMax

    Regular

    Joined:
    May 14, 2013
    Messages:
    327
    Likes Received:
    22
    Location:
    out of the games
    ...Samsung, Hynix and others will come out with 128Gb memory chips this year, thanks to 3D geometries (see IEEE Spectrum magazine).

    Yeah, Sony and MS could have done the same; what the hell would it have taken?
    ------------------------
    hmmm... by chance, do you remember where they disclosed that?
     
    #289 pMax, Apr 2, 2014
    Last edited by a moderator: Apr 2, 2014
  10. Shortbread

    Shortbread Island Hopper
    Legend

    Joined:
    Jul 1, 2013
    Messages:
    5,632
    Likes Received:
    4,921
    That sounds good and all, but what about the price point? Does this "more rounded" hardware come with Kinect at a greater price point?

    I would think MS's original XB1 design plans always revolved around Kinect inclusion, and how to design affordable hardware around it.

    I'm not saying MS made the correct choices (seeing Kinect as the future)... however, it was the choice they made, knowing Kinect was their vision for future gaming.
     
  11. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
    A launch in late 2015/2016 at the earliest, maybe, if it was just to take advantage of the new memory types in volume. It would also depend on which techs you had in mind. Some of the early high-density types are mobile DRAM with constrained bandwidth.

    I kind of wished (very early on) there would have been a meeting in the middle on this, with 2.5D integration coming a bit earlier and the consoles taking a little longer, just to see where relaxing the bandwidth constraint would have taken things.


    This wouldn't help with any scheme where the SOC is in a 3D stack with extra memory; the DRAM standards assume massively lower thermal levels and much higher-yield chips.

    http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview

    One random aside I noted is that this design pushes the number of controllers as high as the two-time high-water mark for AMD's high-end GPUs (R600, Hawaii). If there were a doubled eSRAM, Durango would have had the broadest memory controller array of any single AMD chip.
     
  12. Nisaaru

    Veteran

    Joined:
    Jan 19, 2013
    Messages:
    1,133
    Likes Received:
    403
    Do you really get the impression the XB1 is optimized for price as a total product?
    We have the cheap 8GB DDR3, yet a die size comparable to Sony's.
    The HDMI-In concept looks really iffy to me (TV moving away from analog/cable, TVs have integrated DVB receivers anyway, AV receivers handle all kinds of input sources).
    Then we have a case and cooling system that are too large for the power and noise profile. So unnecessary misc costs, shipping, less shelf space at the retailer, and so on.

    The message here is so unfocused. Like it was designed by committee.
     
  13. Shortbread

    Shortbread Island Hopper
    Legend

    Joined:
    Jul 1, 2013
    Messages:
    5,632
    Likes Received:
    4,921
    I'm pretty sure that during earlier sales (before all the price cuts), MS was profiting $30-35 per unit. If MS had specced any higher, with Kinect included, pricing would definitely have been higher, with very little wiggle room for profit (if MS swallowed some of the additional cost).

    But to answer your question: the XB1 is optimized within a reasonable price (given the BOM data around the web), at the profit margins MS wanted, at a price (right or wrong) they felt consumers would pay.

    The XB1 is designed to meet the goals MS envisioned for it: an "all-inclusive hub" for TV and gaming needs. So eliminating HDMI-in and other I/O ports may not be beneficial for MS's goal of the XB1 being the ultimate component within the living room space.
     
  14. (((interference)))

    Veteran

    Joined:
    Sep 10, 2009
    Messages:
    2,499
    Likes Received:
    70
    That absolutely was their design plan.

    They were expecting PS4 to be more powerful as they knew they'd have a significant proportion of their BOM in Kinect.
     
  15. DSoup

    DSoup Series Soup
    Legend Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    16,780
    Likes Received:
    12,697
    Location:
    London, UK
    Well it was the PR folks from EA, voted the worst company in America. Hmmm.
     
    #295 DSoup, Apr 3, 2014
    Last edited by a moderator: Apr 3, 2014
  16. Pixel

    Veteran

    Joined:
    Sep 16, 2013
    Messages:
    1,008
    Likes Received:
    477
    I see what you guys are saying.

    They definitely wanted to expand the market beyond core gamers. Profits from the Xbox 360 were not huge last generation. Maybe that's why the $400 million NFL deal, and the non-core-gamer-focused reveal back in May. That's why Kinect was included (to replicate Nintendo's success in the non-core-gamer market). That's why the NFL/Skype-heavy early promo for Xbox One. That's why all the set-top box plan rumors. During the X360 reveal, Allard revealed how they planned to capture casual non-core gamers with the 360. Here they want to do it again.

    Thats why there were to
     
  17. drbaltazar

    Newcomer

    Joined:
    Mar 15, 2014
    Messages:
    14
    Likes Received:
    0
    Location:
    french part of america
    MS always do best bang for your $! So ESRAM dual port quad channelled was the best solution. Same for DDR3. From what I understand DDR3 are a better choice invsome situation. And in some other GDDR5 is better suited. The issue here is probably this. Game dev are used to the ps4 way. XB1 way? Probably not. I rarelly see a DDR3 graphic card in the market. Yet this is pretty much what MS did. Quad channel ESRAM dual ported all in SOC last I checked was a rarity (if it even existed before XB1 on the consumer world) this thing is so sideways that it might take a while to take advantage of all its capability. I recall when we went from one CPU core to two and then to 64 bit. Outch! Parallelisation of any kind isnt easy (unless maybe you call your self MS, who knows) and game dev are very conservative! Unless they have a sae test proving them a is their better way then s, they tend to use the standard old ways usually.

    ___

    MOD : fixed faulty formatting. If you want to maintain posting rights, observe these simple points.
    1) Add a space after a full stop '.', question mark '?', or exclamation mark '!', and use a capital letter following it.
    2) Capitalise acronyms, so ddr3 = DDR3.
    These are important grammatical rules that make parsing and understanding a lot easier. Otherwise it's hard to tell whether an acronym is a typo, or where one idea ends and the next begins.
    3) capitalise the pronoun 'I'. This isn't necessary to understanding but it irks. ;)
    Typos and spelling mistakes aren't too bad (I add enough of my own to my posts), but the core understanding of your posts has to be communicated effectively for it to actually be a discussion, and that means supporting the readers' parsing of your text with a few very simply applied rules.

    Is $!so a thing? Some odd abbreviation we're supposed to understand? A type of ESRAM? A typo?

    Ahhh, so you're saying Microsoft provide the best bang for buck, which is reason to believe ESRAM is a good choice.

    On your last sentence, the way it's written makes it hard to interpret. Is 'sae' a typo for 'same' or 'sane'? Or an acronym for an 'SAE' test? I get you're comparing 's' to 'a' but it's not very clear.

    I hope you understand and can adapt. :)
     
    #297 drbaltazar, Apr 3, 2014
    Last edited by a moderator: Apr 3, 2014
  18. (((interference)))

    Veteran

    Joined:
    Sep 10, 2009
    Messages:
    2,499
    Likes Received:
    70
    ? Are you talking about Mattrick?

    I think bkilian hit the nail on the head with his Xbox.org is now run by 'MBAs with $ signs in their eyes' post.
    http://forum.beyond3d.com/showpost.php?p=1696487&postcount=1313

    Pretty much, MS probably wanted to see Xbox finally become profitable rather than the huge money sink it had mostly been the past two gens.
     
  19. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    The 360 wasn't a money sink; it probably ended up turning a tidy profit overall, though the early years were certainly bumpy and the $1B RROD charge didn't help.

    So: first console a money sink because of inexperience and a bad business model, second console profitable. Doesn't seem too bad.
     
  20. (((interference)))

    Veteran

    Joined:
    Sep 10, 2009
    Messages:
    2,499
    Likes Received:
    70
    Do we know how much profit they made overall, taking into account the RROD write-off, R&D costs, and hardware subsidizing in the first year or two?

    In any case, they probably want to see much bigger profits from the XB1, and to get them much faster this time around too.
     