The pros and cons of eDRAM/ESRAM in next-gen

Discussion in 'Console Technology' started by Shifty Geezer, Jan 8, 2012.

  1. TheAlSpark

    TheAlSpark Moderator Moderator Legend

    Ah k. Understood. :)

    Hm... maybe they became superfluous once the yield rates were just as good as another SKU's.

    edit: oh nevermind, GDDR5M is supposed to be x16/x8 width instead.

    ----

    mm...

    On the other hand, power consumption measurements for both systems seem rather comparable at the moment. :s
     
  2. liolio

    liolio Aquoiboniste Legend

    I think that whereas both Yukon and Durango featured eSRAM/eDRAM, it may not have been for the same reasons.
    Yukon was a tiny chip with a 128-bit bus. 4GB was absolutely necessary, and 4Gb GDDR5 chips might not have been available. So it made sense to pair DDR4 with eDRAM/eSRAM; going up to a 256-bit bus to secure 4GB of (GDDR5) RAM might have gotten in the way of price reduction.
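    The bus-width arithmetic above can be sketched quickly. This is a minimal illustration under standard GDDR5 assumptions (each chip has a 32-bit interface, so bus width fixes the chip count; clamshell mode doubles the chips per channel); the function name and figures are illustrative, not specs from the thread.

```python
def max_capacity_gb(bus_bits: int, chip_density_gbit: int,
                    clamshell: bool = False) -> float:
    """Maximum capacity reachable for a given bus width and chip density."""
    chips = bus_bits // 32                # one 32-bit GDDR5 chip per channel
    if clamshell:
        chips *= 2                        # two chips share each channel
    return chips * chip_density_gbit / 8  # Gbit -> GB

# A 128-bit bus with 2 Gbit parts tops out at 1 GB:
print(max_capacity_gb(128, 2))                   # -> 1.0
# Securing 4 GB on 128 bits needs clamshell 4 Gbit parts:
print(max_capacity_gb(128, 4, clamshell=True))   # -> 4.0
# A 256-bit bus with clamshell 4 Gbit parts reaches 8 GB:
print(max_capacity_gb(256, 4, clamshell=True))   # -> 8.0
```

    This is why a wider bus is one way to buy capacity: it forces more chips onto the board, at the cost of board complexity and price.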

    For Durango, contrary to Sony, they made those 8GB an absolute priority, and 4Gb memory chips were not a given for the launch period => eSRAM.

    I would not call Sony's move luck; they kept their options open. 4GB was fine: MSFT planned to use that amount in Yukon and, GPU performance aside, it was not supposed to do less than what Durango does. For reference, a tablet running Windows 8.1 runs games such as Diablo III, Far Cry and plenty of others with 2GB; the amount of reserved RAM is pretty enormous, and there is 1GB left whose fate has yet to be decided. 1GB is enough to run Windows 8.1 according to recent MSFT policy changes.
    Overall MSFT may have an advantage on paper, but at $499 vs $399, and with most likely the same relative deficit in game performance. I don't think it would have impacted services either: you don't run (or need) that many things while you play, and when you don't, both systems have resources to spare as far as media consumption is concerned.
    Given the reception of the new Kinect and the initial strategy for the system, it would not have been better received even if Sony had launched with less RAM.
    I think it was a reasonable bet on Sony's part, as 4GB was in itself a sane amount of RAM for their system. Doubling the RAM was too good an opportunity to pass up: it left MSFT without any PR point, and it may not have cost them that much (less than x2, maybe x1.5?).
    If it did cost them a significant amount of money, it means they could also have launched slightly cheaper with 4GB, or included a PS Eye. It goes both ways.

    I think they paid a beefy tribute for those extra 4GB of RAM. I have a hard time deeming it worth it.
     
    Last edited by a moderator: May 23, 2014
  3. Scott_Arm

    Scott_Arm Legend

    I believe bkillian said that 4GB was the amount of memory planned for Durango and it changed part way through development. Not sure how late.

    As for Sony being lucky with GDDR5, I only mean that forecasting and predictions, especially years in advance, are rarely accurate. That would be the same for any company. You're more often than not going to be wrong, and Sony happened to be right. I'm sure they did a lot of research, like everyone else; I'm not downplaying that. They had planned on 4GB of RAM and were able to change it very late because the price ended up being favourable. That was not something in their control, and maybe something they weren't even able to forecast when they were initially planning their design.
     
  4. 3dilettante

    3dilettante Legend Alpha

    I've seen a number of comparisons that put the PS4 measurably higher, but there's no clear breakdown as to the largest contributors.

    The most recent I've seen is this:
    http://www.nrdc.org/energy/game-consoles/files/video-game-consoles-IP.pdf

    Except for connected standby, the PS4 burns more power.
    The figures appear to include all peripherals, which means the PS4 camera and Kinect are part of the total. I'm pretty sure Kinect's use of infrared LEDs is more power-hungry than Sony's stereoscopic solution, so the Xbox is tens of watts more efficient under load even with Kinect.

    The power draw in non-gaming use for these consoles is really high for some reason.
     
  5. MrFox

    MrFox Deludedly Fantastic Legend

    The low end gtx750ti says hello :wink:

    So far as I can tell, practically all brands of the GTX 750 use the HC03 bin, the exact same memory chip as the PS4's. I think that seals the deal for a cost advantage. The PCB has provisions for clamshell, and they still went for 4x 4Gbit instead of 8x 2Gbit parts. It's the higher density that provided the cost savings.

    Considering they used the 6.0Gbps parts too, I would guess it means Samsung has pretty good yields for speed; otherwise they'd have to dump the 5.0Gbps parts on someone, and even the low-end cards don't use them.
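    The two board options described above can be compared with a small sketch (densities are taken from the post; the `config` helper and its return shape are illustrative). Both layouts reach 2GB on a 128-bit bus; the 4Gbit option simply halves the package count.

```python
def config(chips: int, density_gbit: int, clamshell: bool = False):
    """Return (bus width in bits, capacity in GB) for a chip arrangement."""
    channels = chips // 2 if clamshell else chips  # 32-bit channel per chip,
    bus_bits = channels * 32                       # shared in clamshell mode
    capacity_gb = chips * density_gbit / 8
    return bus_bits, capacity_gb

print(config(4, 4))                   # four 4 Gbit parts  -> (128, 2.0)
print(config(8, 2, clamshell=True))   # eight 2 Gbit parts -> (128, 2.0)
```

    Fewer packages at the same capacity and bus width is the cost argument: the board routes half as many chips, and the higher-density part evidently wasn't priced at a premium.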
     
  6. Dave Baumann

    Dave Baumann Gamerscore Wh... Moderator Legend

    A few GPUs using it still makes it far from a commodity RAM.
     
  7. liolio

    liolio Aquoiboniste Legend

    I think 3dilettante, in his answer to one of my posts, raised a relevant point: the scale of the project.

    MSFT went and designed its own audio DSP and integrated it into a super complex SoC; they worked on integrating the eSRAM too, they worked on Kinect, etc. At that scale (a huge project) there was no room for a more flexible approach. They shipped on time and so far there are no issues with the hardware; that is quite a feat in itself.
    Lots of things could have gone wrong.

    Sony set a less intensive job for itself, mostly integrating IP from AMD, with AMD actually in the lead.
    I mean, for Sony, putting the PSV innards together might have been a greater involvement for their hardware teams than putting the PS4 together. Not that I imply it is easy to put together a 300+ mm2 SoC, but the scale of the project is still more limited than Durango's.

    With regard to those 4Gb memory modules, I don't know when the professionals heard about them. I asked many times if they were coming, as I found it weird for DDR3 to make progress while GDDR5 "stalled" with no replacement solution in sight (delay after delay for DDR4, and it seems more exotic solutions were never really a possibility for a mass-produced device in 2013). I repeatedly got the same answer: no roadmap or info on such a product / it was not public data, or not data that someone in the know could have released.
    I would not believe that AMD, Nvidia, Sony and MSFT heard about it as late as we did (just ahead of the PS4 unveiling), but when did they hear about it? :?:
     
    Last edited by a moderator: May 23, 2014
  8. liolio

    liolio Aquoiboniste Legend

    Can you tell us since when AMD knew those 4Gb modules were coming? Or is that under NDA, or sensitive info given the context?
     
  9. Dave Baumann

    Dave Baumann Gamerscore Wh... Moderator Legend

    I can tell you that we launched a product in the <$150 space using this density in October last year; that's already public (R7 260X).
     
  10. liolio

    liolio Aquoiboniste Legend

    Great product by the way :)
     
  11. 3dilettante

    3dilettante Legend Alpha

    They may have had a decent guess or advance knowledge when they gave Bonaire a 128-bit bus, given how very capacity constrained the design would look from a future-proofing perspective if GDDR5 topped out at 2Gbit.

    Sony likely had a good idea or was driving (edit: or helping drive) 4Gb as well. The volumes of GDDR5 are such that manufacturers don't have continuous production runs with the assumption that there will be a market that will absorb them. The DRAM manufacturers would be more conservative about releasing new GDDR5 devices unless someone made it worth their while.

    There were also rumors posted in this forum about 8Gbit chips being developed and specially ordered by a large customer.

    edit: beaten
     
  12. MrFox

    MrFox Deludedly Fantastic Legend

    I never claimed otherwise. I'm just saying that the higher 4Gbit density doesn't seem to have a negative impact on price; I supported this by pointing to products using 4Gbit parts in situations where the PCB was clearly designed for either 2Gbit or 4Gbit parts.
     
  13. eastmen

    eastmen Legend Subscriber

    If Sony is already making money on PS4s sold at $400, then MS will be making money on the Xbox One sold at $400.

    The APU is similar in size, and the DDR RAM in the Xbox is cheaper by a sizable amount.

    So I don't see a problem. MS is already at 5M sold; they just have to keep putting out tons of games.
     
  14. Strange

    Strange Veteran

    you mean shipped.

    Also, stopping production is going to increase price per unit. Not to mention inventory costs.

    We don't have confirmation that the XB1 costs less to manufacture than the PS4. If anything, there are reports that the XB1 is actually more expensive. So I don't think the PS4 making money automatically means the XB1 is also making money.
     
  15. Dave Baumann

    Dave Baumann Gamerscore Wh... Moderator Legend

    Stopping production?
     
  16. DSoup

    DSoup Series Soup Legend Subscriber

    I don't think they should stop there. They should send surplus One's back to the plant to be disassembled and have the components returned for a refund. :yep2:
     
  17. kotakaja

    kotakaja Banned

    Lately there are many papers from ISSCC 2014,
    especially on eSRAM/SRAM technology.

    I will post two technical digests of 6T SRAM from both Samsung and TSMC;
    both use 16/14nm processes.

    Why? Because it is a perfect time to ask numerous questions related to the Xbox One eSRAM,
    especially the big question: is the Xbox One SRAM 6T or something else?
    Of course, the first thing to do is to estimate the area of the Xbox One SRAM.

    For reference
    Jaguar + Surrounding area = ~ 26 mm2
    http://info.nuje.de/Jaguar_CU.jpg

    X1 vs PS4 die shot
    http://images.anandtech.com/doci/7546/diecomparison.jpg

    PS4 GPU Area is ~80mm2, PS4 total = 328mm2
    http://www.extremetech.com/extreme/...u-reveals-the-consoles-real-cpu-and-gpu-specs

    Based on the above (using that data we can extrapolate the SRAM area), or using a ruler tool (etc.):
    http://i.imgur.com/f8SYdYH.jpg
    The Xbox One SRAM comes as two areas in total, certainly smaller than the PS4 GPU or Xbox One GPU area.
    Xbox One SRAM area, per area: ~40 mm2 (absolutely less than 45 or 50 mm2).

    And we believe each area holds 16MB and the type is 6T.
    The question is: based on what?
    Most of the people who say the Xbox One SRAM is 6T
    did not provide any comparison to the SRAM technology of TSMC, Samsung or others,
    or even a paper or technical digest for comparison.

    Now I have provided the latest in 6T SRAM from both Samsung & TSMC.
    Based on these data, the density of TSMC 16nm is basically the same as Samsung 14nm FF.

    Samsung 128mbit (16MB)
    6T SRAM, 14nm FF
    -----> area : 75.6 mm2 <----
    http://i.imgur.com/7exhOXk.jpg
    http://i.imgur.com/FqRO9Il.jpg

    TSMC 128mbit (16MB)
    6T SRAM , 16nm
    ----> area : 42.6 mm2 <----
    http://i.imgur.com/n1BsRob.jpg
    http://i.imgur.com/XHZwWrQ.jpg

    Now, is it possible that the Xbox One SRAM is 6T, using less area than both examples, while still being 28nm? You be the judge... :D
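    A quick density check on the figures quoted above makes the comparison concrete (the macro areas are the ones in the post; the ~40 mm2 Xbox One figure is the poster's per-16MB-block estimate, not an official number):

```python
def density_mbit_per_mm2(mbit: int, area_mm2: float) -> float:
    """Storage density implied by a quoted SRAM macro area."""
    return mbit / area_mm2

samsung_14ff = density_mbit_per_mm2(128, 75.6)  # ~1.69 Mbit/mm2 (14nm FF, 6T)
tsmc_16nm    = density_mbit_per_mm2(128, 42.6)  # ~3.00 Mbit/mm2 (16nm, 6T)
xbox_28nm    = density_mbit_per_mm2(128, 40.0)  # ~3.20 Mbit/mm2 (28nm, per block)
```

    On these numbers the 28nm block would be denser than both FinFET 6T macros, which is exactly the tension the question is pointing at.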
     
  18. TheAlSpark

    TheAlSpark Moderator Moderator Legend

    Transistor density can change according to design specs related to I/O, speed/performance & power characteristics.
     
  19. kotakaja

    kotakaja Banned

    But it is impossible for density at 28nm to be better than at 14nm (14nm is itself already >2x; a real 14nm would of course be ~4x, but currently "14nm" is more of a marketing term).
    I can provide more examples.
     
  20. Strange

    Strange Veteran

    http://www.gamerevolution.com/news/...ox-one-production-due-to-excess-supply--25469

    back in late April.

    And I don't think that's surprising. Unless they're selling at least 500k a month, reducing production was surely going to happen. We projected ~1 million consoles per month way back in September, remember?
     