How is Sony going to implement 8GBs GDDR5 in PS4? *spawn

Discussion in 'Console Technology' started by Sonic, Feb 21, 2013.

  1. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,516
    Likes Received:
    24,424
    Say what?
     
  2. Brad Grenz

    Brad Grenz Philosopher & Poet
    Veteran

    Joined:
    Mar 3, 2005
    Messages:
    2,531
    Likes Received:
    2
    Location:
    Oregon
    Rumors don't say, Mark Cerny said on stage the PS4 has a hardware video encoder built in.
     
  3. Ninjaprime

    Regular

    Joined:
    Jun 8, 2008
    Messages:
    337
    Likes Received:
    1
So, if you look at those numbers, they appear to be saying 2GB of GDDR5 at 4GHz on a 256-bit bus is 8.7W, right? For the PS4's 8GB at 5.5GHz on the same bus, assuming linear scaling with capacity gives 8.7 * 4 = 34.8W, and scaling the clock from 4GHz to 5.5GHz (34.8 * 5.5/4) lands at almost 48W. Of course, clock scaling usually isn't linear...
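The back-of-envelope estimate above can be sketched as follows. This is only a sketch of the post's own reasoning: the linear-scaling assumption is explicitly doubted in the post itself, and the 8.7W figure is the quoted Samsung number, not a datasheet value.

```python
# Naive linear scaling of the quoted 8.7W Samsung figure.
# Assumptions (from the post, not a datasheet): power scales
# linearly with both capacity and clock speed.
base_power_w = 8.7        # 2GB GDDR5, 256-bit bus, 4GHz
capacity_scale = 8 / 2    # 2GB -> 8GB
clock_scale = 5.5 / 4.0   # 4GHz -> 5.5GHz

estimate_w = base_power_w * capacity_scale * clock_scale
print(f"{estimate_w:.1f} W")  # almost 48W, as in the post
```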
     
  4. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
More RAM isn't linear either: how many chips, how many volts, how high is it clocked?
     
  5. Quaz51

    Regular

    Joined:
    May 18, 2002
    Messages:
    916
    Likes Received:
    1
    Location:
    France
That's an old chip, not the PS4's GDDR5 chip.
     
  6. Ninjaprime

    Regular

    Joined:
    Jun 8, 2008
    Messages:
    337
    Likes Received:
    1
Exactly. Now, they might have a 30nm-"class" shrink, which may account for 20-25% savings, but taking into account other factors I think that puts you back where you started.

*I say "class" because IIRC 40nm "class" was actually 46nm and 30nm "class" was 39nm, just off the top of my head. RAM marketers are a tricky bunch. :wink:
     
  7. MrFox

    MrFox Deludedly Fantastic
    Legend

    Joined:
    Jan 7, 2012
    Messages:
    6,488
    Likes Received:
    5,996
What you calculated would be 1024-bit 8GB GDDR5, and I have no doubt that would consume 48W. It's the interface that consumes the power, and it dwarfs the DRAM array.

If we accept the two numbers that Samsung provided for 1.35V, 46nm, 4GHz:
    2GB 128bit = 4.3W
    2GB 256bit = 8.7W

    Logically, using the 128bit example twice would be:
    4GB 256bit = 8.6W

It means capacity has zero impact on power, and bus width scales it linearly, so by extension, with 4Gb chips instead of 2Gb chips it's still 8.6W.
With your linear scaling from 4GHz to 5.5GHz you get 11.8W.
Remove your 20% for 30nm and it's 9.4W.

Sure, it looks too low, but those are the numbers I got from Samsung. Is there any other source for GDDR5 power consumption?
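That chain of adjustments can be sketched as follows. This is a sketch using the thread's own figures: treating capacity as free, the linear clock scaling, and the 20% shrink saving are all assumptions stated in the posts above, not measured values.

```python
# MrFox's estimate: capacity assumed free; power set by bus width,
# scaled linearly with clock, then reduced 20% for an assumed
# 30nm-class shrink. All figures are from the thread.
power_256bit_4ghz_w = 8.6                      # two 128-bit halves at 4.3W each
at_5_5ghz_w = power_256bit_4ghz_w * 5.5 / 4.0  # linear clock scaling, ~11.8W
after_shrink_w = at_5_5ghz_w * 0.8             # minus 20% for the shrink, ~9.4W
print(f"{at_5_5ghz_w:.2f} W -> {after_shrink_w:.2f} W")
```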
     
  8. Ninjaprime

    Regular

    Joined:
    Jun 8, 2008
    Messages:
    337
    Likes Received:
    1
I've been trying to figure this out as well. So far all I can come up with is that 4GB GTX 680 cards use 5-8 watts more than similar 2GB models, so I suspect that the number of chips/density scales pretty close to linearly.
     
  9. MrFox

    MrFox Deludedly Fantastic
    Legend

    Joined:
    Jan 7, 2012
    Messages:
    6,488
    Likes Received:
    5,996
Don't trust whole-system benchmarks when comparing different cards unless the 4GB version has absolutely identical benchmark results; otherwise it will burn more power just because the GPU is better utilized by the larger memory amount, and a 1% higher frame rate will skew the result. The difference between the 680 2GB and 4GB was 399W versus 395W for the whole system at maximum load. Benchmarks are a tiny bit higher with the 4GB, so the whole system is working a bit harder. This is 1% statistical noise.

So far there are clear indications that higher-density chips have a negligible impact on power.
As for the number of chips, the PS4 would be 16 chips in clamshell. The Samsung example also has to be 16 chips in clamshell for its 128-bit bus.
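The clamshell chip-count arithmetic works out as follows. A sketch: the x16 clamshell behavior (two x32 chips sharing one 32-bit channel, each driving 16 bits) is standard GDDR5; the 256-bit bus and 8GB total are the thread's figures for the PS4.

```python
# Chip-count arithmetic for the rumored PS4 memory config.
# In clamshell mode, each x32 GDDR5 chip drives only 16 bits,
# doubling the chip count for a given bus width.
bus_width_bits = 256
bits_per_chip_clamshell = 16                         # x32 chip in x16 clamshell mode
chips = bus_width_bits // bits_per_chip_clamshell    # 16 chips
gb_per_chip = 8 / chips                              # 0.5GB = 4Gb per chip
print(chips, gb_per_chip)                            # 16 chips of 4Gb (0.5GB) each
```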
     
  10. DJ12

    Veteran

    Joined:
    Oct 20, 2006
    Messages:
    3,105
    Likes Received:
    198
  11. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
That's not GDDR5 but an alternative, as-yet-nameless Rambus tech. It could be called DDR3 Hyperboost or something. The clarification that Sony are using GDDR5 means Rambus's tech is off the cards.
     
  12. MrFox

    MrFox Deludedly Fantastic
    Legend

    Joined:
    Jan 7, 2012
    Messages:
    6,488
    Likes Received:
    5,996
The Hynix list above also says GDDR5M is only sampling in Q3, so it's impossible for it to be ready in time.
It's just plain GDDR5, nothing special except for the fact that it's the best possible type of memory available. You win again, rationality!!
     
  13. Lalaland

    Regular

    Joined:
    Feb 24, 2013
    Messages:
    864
    Likes Received:
    693
    Only if you let it....

Have there been any prior memory techs with the same latency/bandwidth contrast as GDDR5/DDR3, and did it affect general-purpose code much? I remember around the DDR/DDR2 transition there was some 'henny-penny, the sky is falling' stuff about the higher latencies of DDR2 over DDR1; did anyone actually see a real-world decrease in performance?

As for mitigation, from what I've read it's all on the coder to select better data types and to ensure they have the right data in the right places to avoid misses. Does anyone know if the compiler can help here, or is it really down to the dev team alone?
     
  14. Lucid_Dreamer

    Veteran

    Joined:
    Mar 28, 2008
    Messages:
    1,210
    Likes Received:
    3
Is it possible that Sony used the name GDDR5 only because they wanted the public to clearly see a difference from Durango's 8GB of DDR3 RAM? If they said they were using 8GB of stacked DDR3 RAM, wouldn't the average person say the RAM was the same? It would be a lot harder to differentiate their RAM from the competition's, wouldn't it?
     
  15. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
    I would think it would be cooler to say 8GBs of Awesomesauce RAM rather than tell a very specific lie and then get called on it.
     
  16. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,535
    Likes Received:
    144
    No, it is not possible. In effect, it is rather worrying that you would take it as a possibility.
     
  17. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,709
    Likes Received:
    145
When Mark Cerny said GDDR5 on stage, the cameraman should have zoomed in on Cerny's face (plus an extra spotlight or nightshade's favorite, god rays) to make sure every one of us imprinted the image in our minds. :twisted:

Hmm... probably more effective and general as a collection of libraries, tools, design patterns and best practices.

Let's hope we hear more at GDC. In the PS3's early days, we got lots of presentations on how to exploit the SPUs.
     
  18. jihadjoe

    Newcomer

    Joined:
    Feb 26, 2013
    Messages:
    1
    Likes Received:
    0
  19. keenism

    Newcomer

    Joined:
    Feb 26, 2013
    Messages:
    27
    Likes Received:
    0
    Location:
    Behind You
I keep searching for a nice write-up comparing/contrasting DDR3 with GDDR5. I've learned a lot, but not what I was really after... latency! In fact the best result was a link directly here, and I'll quote this great post from bobblehead:

    "GDDR5 uses similar signaling to GDDR3. Pseudo open drain and pull up termination, but at lower voltage (1.2-1.5V rather than 1.8V). In order to push the interface faster there is additional overhead on the sending and receiving sides and logical changes to the interface. That overhead adds to the base latency. The DRAM core has roughly the same latency as DDR3 but the GDDR5 IO layer imposes that extra latency penalty. For that extra cost you gain the ability to send data a lot faster. As a result, GDDR5 latency is a bit higher than DDR3 latency in absolute terms, but it's not a huge difference."

Perhaps people are blowing latency out of proportion; humanity seems to have a nice record of doing so...

P.S. I've been lurking here for around two years and would like to thank all the people who take the time explaining, breaking down... and explaining again all of the esoteric information I wouldn't understand otherwise.
     
  20. Strange

    Veteran

    Joined:
    May 16, 2007
    Messages:
    1,698
    Likes Received:
    428
    Location:
    Somewhere out there
Ya, those 4Gb chips didn't exist a few days ago. Relatively new addition.
I wonder if they support 5.5Gbps. These are also rated at 1.5V.

I would suppose yes, but I don't have anything to confirm that at the moment.
     


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.