How is Sony going to implement 8GB of GDDR5 in the PS4? *spawn

I've posted it in other threads, but I should have kept it here.
Better late than never.

The chips that the PS4 would be using seem to be at the low end of the voltage range.

Instead of using 6 Gbps speeds (192 GB/s × 8 ÷ 256 pins = 6 Gbps per pin),
they're using 5.5 Gbps speeds (176 GB/s × 8 ÷ 256 pins = 5.5 Gbps per pin).
The 5.5 Gbps, 4Gb-density chips from Hynix (H5GC4H24MFR-T3C) require less voltage than the 6 Gbps ones (H5GQ4H24MFR-R2C).
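
If you want to sanity-check that arithmetic, here's a quick Python sketch (the bandwidth figures and the 256-bit bus width are the rumored specs discussed above; nothing else is assumed):

```python
# Per-pin data rate implied by total bandwidth and bus width:
# bandwidth (GB/s) * 8 bits/byte / bus width (pins) = Gbps per pin.

def per_pin_rate_gbps(bandwidth_gb_s: float, bus_width_bits: int) -> float:
    """Per-pin data rate in Gbps for a given total bandwidth and bus width."""
    return bandwidth_gb_s * 8 / bus_width_bits

print(per_pin_rate_gbps(192, 256))  # 6.0 Gbps -> the 6 Gbps parts (H5GQ4H24MFR-R2C)
print(per_pin_rate_gbps(176, 256))  # 5.5 Gbps -> the 5.5 Gbps parts (H5GC4H24MFR-T3C)
```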

To be specific, there seem to be three voltage ratings: 1.6V, 1.5V, and 1.35V.

1.6V is only available on two older high-performance chips, and everything else is either 1.5V or 1.35V.
The one from Hynix that completely fits Sony's bill is rated at 1.35V.
Low power :D

http://www.skhynix.com/inc/pdfDownload.jsp?path=/datasheet/Databook/Databook_1Q%272013_GraphicsMemory.pdf

That would be very clever: using the better-than-ever efficiency of today's PC hardware through intelligent implementation instead of brute-forcing everything, as PCs typically did in the good ol' days.

Additionally, what Sony is attempting is very interesting, and the console is going to be very capable, especially when you take into account that it's closed hardware. Just imagine the possibilities; no wonder developers are excited about it and people like Carmack praised the machine so much.
 
But 8GB in dev kits implies that 4Gb-density chips would already be in use. Maybe that's the case (early samples?), but isn't it also possible that the dev kits use a combination of DDR3 and GDDR5 in a discrete setup (which would be the source of the APU+discrete rumours)?
I meant the final dev kits that aren't out yet; these must be the exact same hardware as the final console (but hopefully with more RAM, which now won't be possible for a while). If 4Gb is sampling now (Q1), it's in time for these dev kits.
 
That would be very clever: using the better-than-ever efficiency of today's PC hardware through intelligent implementation instead of brute-forcing everything, as PCs typically did in the good ol' days.

I'm not sure what's so clever about it. This is a configuration that has been used on PC GPUs for years and is in fact in circulation right now.

In fact, I'd go as far as to say this pretty much is a brute-force approach. They've taken the very highest memory density available at the end of this year (it's not even available yet) and put it in the maximum possible configuration. You literally can't have any more memory on a 256-bit bus and likely won't be able to until at least a year or so after the PS4's launch. This is admirably brute force IMO!
 
That almost makes me wish they went for a 512-bit bus. If you already have the chips, how much extra would the interface cost? Double the ROPs and TMUs and you'd have a fill-rate and texturing monster.
 
That almost makes me wish they went for a 512-bit bus. If you already have the chips, how much extra would the interface cost? Double the ROPs and TMUs and you'd have a fill-rate and texturing monster.

You can't double the TMUs without doubling the CUs, so you're basically talking about doubling the entire GPU there, which clearly isn't possible.

I'm not even sure they'd need to double the ROPs either; the PS4 already has plenty.

Doubling the memory interface would no doubt add considerable complexity and cost to the console, but yeah, it would have made for one incredible memory interface.
 
So the TMUs are still tied directly to the CUs? Yeah, that'd take some modification then, which could be why they didn't do it. Fill rate would be nice but would only help with AA and alpha particles. Not sure the complexity of going 512-bit is really in the interface as much as it is in needing 16 chips on the board, though someone can correct me if I'm wrong there. Sure, it limits shrinks, but that could be broken out to an interposer at some point.
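
To put numbers on the "fill-rate monster" idea, here's a rough sketch using the commonly reported PS4 GPU figures (32 ROPs at 800 MHz); treat it as illustrative, not official:

```python
# Theoretical pixel fill rate = ROP count * core clock.
# 32 ROPs @ 800 MHz are the commonly reported PS4 figures, used here for scale.

def fill_rate_gpix_s(rops: int, clock_mhz: int) -> float:
    """Peak fill rate in Gpixels/s."""
    return rops * clock_mhz / 1000

print(fill_rate_gpix_s(32, 800))  # 25.6 Gpix/s as reportedly specced
print(fill_rate_gpix_s(64, 800))  # 51.2 Gpix/s with doubled ROPs
```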
 
In fact, they could have designed it around 16 1Gb GDDR5 chips in clamshell mode (2GB, according to the earliest rumors), then upped it to 4GB with 2Gb chips, and eventually, when 4Gb chips came into play, upped it to 8GB.
It doesn't make sense that they'd design it for 1Gb chips initially, since those have been available for at least two years already (and the announcements for the 2Gb chips, which we still haven't seen for 4Gb, were out in 2009/2010).
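
For reference, here's how the capacity scales on a fixed 16-chip clamshell board as chip density improves (the densities and chip count are the ones discussed above):

```python
# Total capacity of a fixed 16-chip clamshell layout at each chip density.
# Densities are in Gbit per chip; 8 Gbit = 1 GB.

CHIPS = 16
for density_gbit in (1, 2, 4):
    total_gb = CHIPS * density_gbit / 8
    print(f"{CHIPS} x {density_gbit}Gb chips = {total_gb:.0f} GB")
# 16 x 1Gb = 2 GB (earliest rumors)
# 16 x 2Gb = 4 GB (later rumors)
# 16 x 4Gb = 8 GB (announced spec)
```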
 
To be specific, there seem to be three voltage ratings: 1.6V, 1.5V, and 1.35V.

1.6V is only available on two older high-performance chips, and everything else is either 1.5V or 1.35V.
The one from Hynix that completely fits Sony's bill is rated at 1.35V.
Low power :D

http://www.skhynix.com/inc/pdfDownload.jsp?path=/datasheet/Databook/Databook_1Q%272013_GraphicsMemory.pdf
Nice find. As far as I know, 1.6V isn't really an official GDDR5 voltage; that's just an overvolt spec to get the chips running reliably at that speed (kinda like DDR3-1600, where lots of modules still require more than 1.5V).
Nice to see some improvement there though: initially 6 Gbps required 1.6V, now only 7 Gbps does, and soon they can deliver 7 Gbps chips at 1.5V too (and low-power 1.35V versions up to 5.5 Gbps). Right in time for new graphics cards at the end of the year too :).
 
Good find; at 1.35V it should be only about 10W for the whole 8GB.
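
A back-of-the-envelope check on that 10W figure (note: the average current draw per chip below is my own ballpark assumption, not a datasheet number):

```python
# Rough power estimate for 16 GDDR5 chips at 1.35V.
# ASSUMPTION: ~0.46 A average active draw per chip is a guessed ballpark,
# not from the Hynix datasheet; real draw varies with activity and speed bin.

VOLTAGE = 1.35           # volts, the low-power rating discussed above
CURRENT_PER_CHIP = 0.46  # amps, assumed average draw (hypothetical)
CHIPS = 16               # 16 x 4Gb chips in clamshell = 8 GB

power_w = VOLTAGE * CURRENT_PER_CHIP * CHIPS
print(f"~{power_w:.1f} W for all {CHIPS} chips")  # ~9.9 W, in line with ~10W
```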

I'm not sure how the clamshell configuration works, but the PS4 would be using these x32 chips in x16 mode. I think it's the same chips either way; there aren't chips specifically marked as x16.

What I'm thinking is that maybe they didn't initially plan for a clamshell configuration for the console: they might have had a PCB designed to allow clamshell for the dev kits (8GB) and all chips on one side for the console (4GB), so they would source the exact same part number for either the dev kit or the console. But now, with 8GB, that would require a complete rework of the casing because they'd need airflow under the board. Hence the delay for the enclosure.
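
For what it's worth, here's my reading of the clamshell arithmetic (the 256-bit bus and 4Gb density are from the specs above; the x16-mode behavior is my understanding of how clamshell works):

```python
# One-sided vs clamshell arrangements on the same 256-bit bus.
# In clamshell mode each x32 chip drives only 16 of its data pins,
# so two chips share one 32-bit channel and total capacity doubles.

BUS_BITS = 256
DENSITY_GBIT = 4  # 4Gb parts

for label, bits_per_chip in (("one-sided (x32)", 32), ("clamshell (x16)", 16)):
    chips = BUS_BITS // bits_per_chip
    capacity_gb = chips * DENSITY_GBIT / 8
    print(f"{label}: {chips} chips -> {capacity_gb:.0f} GB")
# one-sided (x32): 8 chips -> 4 GB
# clamshell (x16): 16 chips -> 8 GB
```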

You can use flip-chip instead, so then everything is on one side. It also gets you halfway to a TSV stack.
 
I'm not sure what's so clever about it. This is a configuration that has been used on PC GPUs for years and is in fact in circulation right now.

In fact, I'd go as far as to say this pretty much is a brute-force approach. They've taken the very highest memory density available at the end of this year (it's not even available yet) and put it in the maximum possible configuration. You literally can't have any more memory on a 256-bit bus and likely won't be able to until at least a year or so after the PS4's launch. This is admirably brute force IMO!
Are you sure about that? If you read what Carmack wrote, you can realise how the design choices behind the PS4 are the correct ones.

The paradigm behind the PS4's design seems to be ease of programming plus power, all of it accompanied by a unified memory system and a few customized chips, like an audio chip and one that helps with background downloads, plus some other little details we don't know about yet.
 
Is there a chance they could have yield issues which would impact how many units are available to consumers?

These chips are available in Q1 this year.
Unless they delay it by more than one quarter, I don't think Sony is in too much trouble.
 
These chips are available in Q1 this year.
Unless they delay it by more than one quarter, I don't think Sony is in too much trouble.

If it pans out, then based on what we know, Sony might actually benefit from memory this time, in a similar way to how MS benefited from timing the GPU shift with the 360.
 
Writing 1MB per second to the hard drive is hardly a problem worth engineering around. If any smartphone or point-and-shoot camera can manage to effortlessly encode and save 1080p video with their meager RAM and storage speeds, the PS4 won't even feel it.
While I agree that 1MB/s is peanuts, comparing a device like a cellphone (or camera) to another platform is difficult, because the cellphone and camera have dedicated DSPs to handle video encoding; it's not a general-purpose CPU doing the work. As a simple example, cameras can process RAW to JPG at a rate of 3-10 fps, but even an i7 can't do that.

Having said that, I believe the 2/20 presentation mentioned having a separate processor for uploading/downloading/some OS functionality; it's possible that the PS4 has a DSP or dedicated hardware outside of the APU for video encoding.
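
For scale on that 1MB/s figure, a quick sketch (the ~8 Mbps 1080p bitrate is purely my assumption for illustration, not anything Sony has stated):

```python
# What a ~1 MB/s write stream implies for always-on gameplay recording.
# ASSUMPTION: ~8 Mbps is a plausible 1080p H.264 bitrate, chosen for illustration.

bitrate_mbps = 8
write_mb_s = bitrate_mbps / 8                     # 1.0 MB/s to disk
hours_per_100gb = 100 * 1024 / write_mb_s / 3600  # hours of footage per 100 GB
print(f"{write_mb_s:.1f} MB/s -> ~{hours_per_100gb:.0f} hours per 100 GB")
```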
 
It doesn't make sense that they'd design it for 1Gb chips initially, since those have been available for at least two years already (and the announcements for the 2Gb chips, which we still haven't seen for 4Gb, were out in 2009/2010).
I could see that. Design the PS4 using 1Gb chips and rely on that for the early specification. When/if higher-density chips have a solid roadmap/feasibility for launch, update the specification accordingly, eventually allowing 8GB of memory.
 
While I agree that 1MB/s is peanuts, comparing a device like a cellphone (or camera) to another platform is difficult, because the cellphone and camera have dedicated DSPs to handle video encoding; it's not a general-purpose CPU doing the work. As a simple example, cameras can process RAW to JPG at a rate of 3-10 fps, but even an i7 can't do that.

Having said that, I believe the 2/20 presentation mentioned having a separate processor for uploading/downloading/some OS functionality; it's possible that the PS4 has a DSP or dedicated hardware outside of the APU for video encoding.

Rumors say there's a dedicated video decoder (and encoder?), and possibly an ARM chip (with TrustZone?) for upload/download management.
 
Are you sure about that? If you read what Carmack wrote, you can realise how the design choices behind the PS4 are the correct ones.

I was referring specifically to its memory setup. I.e., there's nothing clever about using the highest-density memory chips available in the largest chip configuration possible for a 256-bit memory interface. It's basically what the 4GB versions of the 680 and 670 are doing today.
 
Is it likely Sony have gone with Rambus GDDR5?

The figures you're quoting don't really work when you consider the PS4's 'hibernation' mode.

Is it really going to need 30+ watts even when it's off?

(I assume the only way hibernation can work is by keeping the memory powered up to some degree.)
 