3D Xpoint Memory (Intel Optane)

Discussion in 'PC Industry' started by hoom, Aug 2, 2015.

  1. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,170
    Location:
    La-la land
In what situation do you find data centers with extremely low write loads AND a gigantic number of CPU sockets?

Sounds like a very niche combination to support, especially as, if you fuck up, you can wreck your memory modules permanently. Also, it seems this tech has ~10x DRAM access latency, so I would think it wouldn't be great as main memory in a read-heavy scenario anyway.
     
  2. Pixel

    Regular

    Joined:
    Sep 16, 2013
    Messages:
    942
    Likes Received:
    397
It would be great for consumer-level motherboards. Systems would boot up much faster than with an SSD.
     
  3. Pixel

    Regular

    Joined:
    Sep 16, 2013
    Messages:
    942
    Likes Received:
    397
Also, the latency and memory bandwidth would cripple video cards. Just looking at memory bandwidth: 6 GB/s for first-generation XPoint memory DIMMs, with an eventual goal of 25 GB/s for future 3D-stacked XPoint memory.
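To put those figures in perspective, here is a minimal sketch of the bandwidth gap. The GDDR5 figure is my assumption for a ~2015 high-end discrete card (roughly R9 290X class); the XPoint figures are the per-DIMM numbers quoted above.

```python
# Rough bandwidth comparison; the GPU figure is an assumption
# (~R9 290X class card), the XPoint figures are quoted in the post.
GDDR5_HIGH_END_GBPS = 320    # assumed high-end discrete card, 2015
XPOINT_GEN1_GBPS = 6         # quoted: first-generation XPoint DIMM
XPOINT_FUTURE_GBPS = 25      # quoted: goal for 3D-stacked XPoint

gen1_gap = GDDR5_HIGH_END_GBPS / XPOINT_GEN1_GBPS
future_gap = GDDR5_HIGH_END_GBPS / XPOINT_FUTURE_GBPS
print(f"Gen-1 XPoint has ~{gen1_gap:.0f}x less bandwidth than GDDR5")
print(f"Future XPoint would still have ~{future_gap:.1f}x less")
```

Even at the stated 25 GB/s goal, XPoint would be more than an order of magnitude short of what a contemporary card's GDDR5 delivers, which is the crux of the "cripple video cards" point.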
     
  4. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,170
    Location:
    La-la land
Even with regular SATA SSDs, OS boot time is not the major component of system startup time; rather, it's the POST... Or well, it is with my mobo anyway. Maybe more consumer-oriented, mass-market mobos than my ASUS RoG-series board have UEFI firmware that boots quicker, I'm not sure. :)

My older Nehalem-based socket 1366 board has a BIOS with quite a substantial POST. Windows bootup, even Windows 7, is relatively quick in comparison from that box's quite old Intel X25-E SSD (although it now runs Windows 10, and thus boots even quicker).
     
    Pixel likes this.
  5. BlackAngus

    Newcomer

    Joined:
    Apr 2, 2003
    Messages:
    122
    Likes Received:
    15
I was thinking more of it being in addition to, rather than a replacement for, current GDDR implementations. For instance, could it be used as a texture cache sitting closer to GDDR memory, much faster than fetching data from the HDD?

Also, the 6 GB/s number appears to be per DIMM; with multiple DIMMs/memory banks, wouldn't it be possible to have much more memory/bandwidth available?

    http://www.kitguru.net/components/m...nt-ssds-will-feature-up-to-6gbs-of-bandwidth/
     
    Pixel likes this.
  6. Pixel

    Regular

    Joined:
    Sep 16, 2013
    Messages:
    942
    Likes Received:
    397
Nice link, I've never seen this photo before.

[IMG]

Hmm, I think there is little reason to place XPoint physically on cards as a texture cache, at least for video games. Games are generally designed around the XB1/PS4, so cards don't need much more memory than they currently have. Right now many games cache textures in system memory, or directly in VRAM if the card has a lot of VRAM. Current PCIe 3.0 is good enough for streaming textures in all these XB1/PS4 ports, and PCIe 4.0 is coming in 2017. HBM2 will further increase the amount of memory on cards as the technology gets cheaper. Why not just have the XPoint memory elsewhere in the system and not burden card makers with additional costs?
     
    #66 Pixel, Nov 21, 2015
    Last edited: Nov 21, 2015
  7. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,583
    Likes Received:
    703
    Location:
    Guess...
The ideal for a gaming PC, as far as I can see, is a 3D XPoint SSD (main storage) -> DDR4 (system memory) -> HBM2 (graphics). That's one crazy fast memory pipeline.
     
  8. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,170
    Location:
    La-la land
    Why not HMC or HBM RAM for the CPU as well? Solder on 16GB, and you won't have to upgrade - there's totally no need to for regular schmoes, or even hardcore gamers. :)
     
  9. BlackAngus

    Newcomer

    Joined:
    Apr 2, 2003
    Messages:
    122
    Likes Received:
    15
XPoint would provide much larger quantities of pretty fast memory, compared to HBM's smaller quantities of very fast memory, at a much lower price point.
Setting aside the XB1/PS4 point, would having an order of magnitude more pretty-fast memory alongside HBM allow the PC to do anything it couldn't easily do today? (Much larger, fully destructible worlds, much higher texture quality, etc.?)
     
  10. hoom

    Veteran

    Joined:
    Sep 23, 2003
    Messages:
    2,812
    Likes Received:
    403
Indeed.
That rumored 250W APU on 14nm FinFET with HBM2, a new-gen GPU and Zen CPU cores could be absolutely epic if the execution is good.
     
  11. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,583
    Likes Received:
    703
    Location:
    Guess...
While cool, that would still be much weaker than a 95W CPU + 250W GPU discrete alternative.
     
  12. Rurouni

    Veteran

    Joined:
    Sep 30, 2008
    Messages:
    944
    Likes Received:
    206
Why not compare it to a combined 250W for CPU+GPU? I think the performance shouldn't differ too much. The current-gen APU problem is mostly memory bandwidth, and also the fact that it targets sub-100W. If someone actually made a 250W APU with HBM, I believe it would compare favorably against a combined CPU + GPU at 250W.
For comparison, an R7 250 with 6 CUs @ 1 GHz has a 75W max TDP, while the Kaveri A10-7850K with 8 CUs @ 720 MHz has a 95W max TDP. Of course, a straight comparison would be hard, because one has on-board RAM (thus faster memory bandwidth) while the other not only has the CPU but also the north bridge. The APU will also reduce the CPU clock under a high GPU load, which a CPU + GPU combination wouldn't do. But my point is that overall, an APU should at least perform at a similar level to a CPU+GPU within the same power budget. Heck, it would probably perform better, at the expense of flexibility in the power budget.
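As a quick sanity check on those figures, here is a minimal sketch (the pairing is illustrative, not a claim about any real product):

```python
# TDP figures quoted above
R7_250_TDP_W = 75      # discrete card: 6 CUs @ 1 GHz
A10_7850K_TDP_W = 95   # Kaveri APU: 8 CUs @ 720 MHz, plus CPU and NB

combined_w = R7_250_TDP_W + A10_7850K_TDP_W  # 170 W for the pairing
apu_budget_w = 250                           # hypothetical big APU
headroom_w = apu_budget_w - combined_w       # budget left for more CUs/clocks
print(f"Pairing: {combined_w} W, leaving {headroom_w} W of headroom at 250 W")
```

So a hypothetical 250W APU would have roughly 80W of budget beyond that pairing to spend on extra CUs or higher clocks, which supports the "similar or better at equal power" argument.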
     
  13. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,583
    Likes Received:
    703
    Location:
    Guess...
    The point was to illustrate that discrete CPU+GPU will always be able to afford a higher power budget than an APU. Thus no matter how awesome your APU is, there's always going to be a more powerful discrete option.
     
    Pixel likes this.
  14. hoom

    Veteran

    Joined:
    Sep 23, 2003
    Messages:
    2,812
    Likes Received:
    403
Or you could compare with quad CrossFire, I guess :razz:

I have been kind of hoping that top-end discrete may go under 250W on 14nm.
     
  15. Gubbi

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,494
    Likes Received:
    806
    I wouldn't be so sure. The graphics AIB market has been eroded from below by integrated graphics to the point where Intel now has almost three quarters of the market (in units). HBM/HMC will allow the CPU/APUs to kill the next tier of AIBs.

    Cheers
     
    fuboi likes this.
  16. hoom

    Veteran

    Joined:
    Sep 23, 2003
    Messages:
    2,812
    Likes Received:
    403
I thought Intel had over 80% back in the day, when they had the really, really bad onboard graphics, so arguably 75% is down.
     
  17. Pixel

    Regular

    Joined:
    Sep 16, 2013
    Messages:
    942
    Likes Received:
    397
I think there will always be system builders who spend $250+ on video cards; how far can APUs and integrated GPUs push their performance into these higher price brackets?
     
  18. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,572
    Likes Received:
    4,476
    That was also before AMD had integrated graphics. AMD likely ate into some portion of that integrated market share while at the same time losing discrete market share to Nvidia with the end result of them having lower market share than in the past.

    That was also when Intel's integrated basically destroyed most of the IHVs making cheap graphics chips for notebook computers (like Trident Microsystems, for example). Those low end graphics chip makers dominated the mobile space with extremely cheap chips until Intel came along. The relatively more expensive and robust ATI chips and later Nvidia chips were never a challenge to either those low end chip makers or Intel's integrated for the vast majority of the notebook market (where cost and not performance was the key determining factor).

    I'm also fairly certain it was never as high as 80% of all PC graphics. 80% of mobile graphics might have happened, but not overall PC graphics shipments. But I could be wrong.

    Regards,
    SB
     
    #78 Silent_Buddha, Nov 23, 2015
    Last edited: Nov 23, 2015
    Grall likes this.
  19. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,572
    Likes Received:
    4,476
I'm a bit late to see this as I've been a bit busy, but the technology looks really promising.

    Enterprise Drive - http://www.anandtech.com/show/11209...ep-dive-into-3d-xpoint-enterprise-performance
    Low capacity consumer "caching" drive - http://www.anandtech.com/show/11210/the-intel-optane-memory-ssd-review-32gb-of-kaby-lake-caching

Obviously these are first-generation products (no idle power states on the consumer drive, for example), and quite expensive compared to mature NAND-based drives.

Performance, however, is impressive for a first-generation implementation, especially access latency. It'll be interesting to see whether this can mature to the point where it's suitable (in price per GB) for the consumer market, and if it can, how long it'll take Intel to get there.

Also interesting is that it doesn't suffer from NAND's dependence on multiple channels for high performance. Thus, low-capacity drives with only a few chips don't suffer large performance penalties compared to larger-capacity drives with more chips.
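That channel-dependence point can be illustrated with a toy latency model. The latency figures below are assumptions for illustration, not measurements: with a single outstanding 4 KiB read, extra channels cannot help, so throughput is set by media latency alone.

```python
BLOCK_BYTES = 4096  # one 4 KiB random read

def qd1_throughput_mbps(latency_us):
    """Throughput at queue depth 1: only one request is in flight at
    a time, so channel parallelism is irrelevant and per-request
    latency dominates. Bytes per microsecond equals MB/s."""
    return BLOCK_BYTES / latency_us

# Assumed end-to-end read latencies (illustrative only)
nand_qd1 = qd1_throughput_mbps(80)    # NAND: ~80 us  -> ~51 MB/s
xpoint_qd1 = qd1_throughput_mbps(10)  # XPoint: ~10 us -> ~410 MB/s
print(f"QD1 4K reads: NAND ~{nand_qd1:.0f} MB/s, XPoint ~{xpoint_qd1:.0f} MB/s")
```

NAND drives hide that per-request latency by spreading deep queues across many channels and dies, which is exactly why small, few-chip NAND drives underperform; a low-latency medium doesn't need that parallelism to feel fast.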

I would like to see a hybrid HDD with an integrated Optane cache, similar to Seagate's SSHDs.

    Regards,
    SB
     
    Gubbi likes this.
  20. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,170
    Location:
    La-la land
The Intel Optane SSD DC P4800X is apparently released now; the 375 GB 2.5" U.2 version is listed at one retailer by the search site I'm using, and it costs 'only' about €2200. :runaway:

Jesus! Wasn't Optane supposed to be at roughly price parity with flash, or am I entirely mistaken here?
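Working the numbers from that listing (the flash figure is my ballpark assumption for early-2017 consumer NVMe pricing, not a quoted price):

```python
P4800X_PRICE_EUR = 2200   # quoted retail listing above
P4800X_CAPACITY_GB = 375
NAND_EUR_PER_GB = 0.40    # assumed ballpark for NVMe flash, early 2017

optane_eur_per_gb = P4800X_PRICE_EUR / P4800X_CAPACITY_GB  # ~5.87 EUR/GB
premium_vs_flash = optane_eur_per_gb / NAND_EUR_PER_GB     # roughly 15x
print(f"~{optane_eur_per_gb:.2f} EUR/GB, about {premium_vs_flash:.0f}x flash")
```

So at that listing, the P4800X sits around an order of magnitude above flash on price per gigabyte, nowhere near parity.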
     
    Lightman, iroboto and BRiT like this.