How is Sony going to implement 8GBs GDDR5 in PS4? *spawn

But that was before we knew about background recording and upload of gameplay footage etc. That sh*t doesn't come for free, and it certainly doesn't leave a lot for the OS if it's only 512MB.

Obviously I'm not basing that on anything other than assumption, but it seems a no-brainer that the kind of live social/online infrastructure Sony is building around the OS is going to require some memory ;)

Maybe they could stream it to the HDD instead of RAM? Just an idea, probably crazy.
 
A hardware encoder chip that streams it directly to storage using the ARM (or whatever) controller chip, which can run without any interference to or from the rest of the system. As the system can also download games while 'off', it's clear this shouldn't be too hard.

The really interesting question is where and how much buffering will take place. It's even possible that they use a small bit of Flash (how much would you need for 15 minutes?)
 
512MB is the only amount ever mentioned as being reserved. Just because people want more to be reserved doesn't make it fact.

Given the number of features and/or the potential for new features in the future, expecting more than 512MB shouldn't really be seen as a slight against the PS4.
 
A hardware encoder chip that streams it directly to storage using the ARM (or whatever) controller chip, which can run without any interference to or from the rest of the system. As the system can also download games while 'off', it's clear this shouldn't be too hard.

The really interesting question is where and how much buffering will take place. It's even possible that they use a small bit of Flash (how much would you need for 15 minutes?)

Depends on the bitrate. I really don't know what's reasonable for 1080p H.264, but 10Mbit/s doesn't seem too bad from googling; that would be around 1125MB of space needed.
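For anyone who wants to play with the numbers, the arithmetic is just bitrate × duration ÷ 8. The bitrates here are only the guesses from this thread, not known specs:

```python
# Quick sanity check of the space needed for a rolling recording buffer.
# Bitrate and duration figures are the assumptions being discussed, not specs.

def buffer_size_mb(bitrate_mbps: float, minutes: float) -> float:
    """Return megabytes needed to hold `minutes` of video at `bitrate_mbps`."""
    megabits = bitrate_mbps * minutes * 60  # total megabits recorded
    return megabits / 8                     # 8 bits per byte

print(buffer_size_mb(10, 15))   # 10 Mbit/s for 15 minutes -> 1125.0 MB
print(buffer_size_mb(4.5, 15))  # a web-media-style bitrate -> 506.25 MB
```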
 
Given the number of features and/or the potential for new features in the future, expecting more than 512MB shouldn't really be seen as a slight against the PS4.
The Xbox 360 and PS3 had 512MB in total, and I'm fairly sure they could do everything we saw last night if they made an app for it.

Nothing seemed that heavy.

How much memory do you realistically expect all that to take up?

I didn't take it as a slight; it just amuses me how conclusions are jumped to when the evidence points to different outcomes.

They may reserve a GB, but nothing we saw last night will come close to that in my opinion, and in 2 to 3 years, if they haven't needed it, I fully expect the reservation to be lowered, as has happened this generation.
 
Depends on the bitrate. I really don't know what's reasonable for 1080p H.264, but 10Mbit/s doesn't seem too bad from googling; that would be around 1125MB of space needed.

If it's designed for web media, expect more like 4-5Mbit.

Here's a couple of YouTube 1080p videos:

If we assume H.265, more like 2-3Mbit.

Video
ID : 1
Format : AVC
Format/Info : Advanced Video Codec
Format profile : High@L4.0
Format settings, CABAC : Yes
Format settings, ReFrames : 1 frame
Codec ID : avc1
Codec ID/Info : Advanced Video Coding
Duration : 5mn 46s
Bit rate mode : Variable
Bit rate : 4 096 Kbps
Maximum bit rate : 11.2 Mbps
Width : 1 920 pixels
Height : 1 080 pixels
Display aspect ratio : 16:9
Frame rate mode : Constant
Frame rate : 25.000 fps
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Bits/(Pixel*Frame) : 0.079
Stream size : 169 MiB (95%)
Tagged date : UTC 2012-09-11 14:11:07


Video
ID : 1
Format : AVC
Format/Info : Advanced Video Codec
Format profile : High@L4.0
Format settings, CABAC : Yes
Format settings, ReFrames : 1 frame
Format settings, GOP : M=1, N=50
Codec ID : avc1
Codec ID/Info : Advanced Video Coding
Duration : 4mn 49s
Bit rate mode : Variable
Bit rate : 5 122 Kbps
Maximum bit rate : 8 182 Kbps
Width : 1 920 pixels
Height : 1 080 pixels
Display aspect ratio : 16:9
Frame rate mode : Constant
Frame rate : 25.000 fps
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Bits/(Pixel*Frame) : 0.099
Stream size : 177 MiB (97%)
Tagged date : UTC 2012-01-24 11:38:59
 
They probably upped it a bit from 512MB. You're talking about 4GB more RAM, so it wouldn't be a stretch to imply that they could reserve a bit more.


Taking a gross overestimate, 8Mbps = 60MB/minute. 15 minutes will amount to only 900MB. Not too much.
Audio is negligible.
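As a sanity check, the same arithmetic reproduces the stream sizes in the MediaInfo dumps above. The durations are rounded to the second, so expect the results to be approximate:

```python
# Cross-check: average bitrate * duration should match MediaInfo's stream size.

def stream_size_mib(bitrate_kbps: float, seconds: float) -> float:
    """Stream size in MiB given an average bitrate (Kbps = 1000 bit/s)."""
    bytes_total = bitrate_kbps * 1000 / 8 * seconds
    return bytes_total / 2**20

# First clip: 4096 Kbps over 5mn 46s -> ~169 MiB, matching the reported 169 MiB
print(round(stream_size_mib(4096, 5 * 60 + 46)))
# Second clip: 5122 Kbps over 4mn 49s -> ~176 MiB vs the reported 177 MiB
# (the small gap comes from the rounded duration)
print(stream_size_mib(5122, 4 * 60 + 49))
```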
 
IIRC the latency of GDDR5 should be quite comparable to DDR3 (I've never seen hard numbers). Yes, there's some more overhead associated, but in the end it isn't really all that significant, assuming a decent memory controller.

Hardly. GDDR5 latency is much, much worse than DDR3's. Looking at them side by side makes GDDR5 look incredibly slow in terms of latency. The GPU won't care, but the CPU cores are going to have a hard time with it.
 
Hardly. GDDR5 latency is much, much worse than DDR3's. Looking at them side by side makes GDDR5 look incredibly slow in terms of latency. The GPU won't care, but the CPU cores are going to have a hard time with it.

But isn't that the main reason to have an OoOE CPU?
 
OoOE CPUs can usually schedule around 10-15ns of latency. DDR3 is between 50 and 100ns; GDDR5 is more.

Cheers

And the cache that sits between the GDDR5 and the CPU doesn't help at all? (This is a n00b question; I really don't know much about this.)
 
And the cache that sits between the GDDR5 and the CPU doesn't help at all? (This is a n00b question; I really don't know much about this.)

OOO CPUs are usually constructed with ROBs large enough to schedule around latencies comparable to a hit in the LLC (last-level cache). An LLC miss will stall *any* CPU core after 15-40 cycles.

One of the advantages of OOO is that execution continues as long as there are ROB resources available, so multiple cache misses can be encountered, with the corresponding memory requests starting and proceeding in parallel.
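A toy model of that point, for the n00bs in the thread. The 100-cycle latency, miss counts, and outstanding-miss limit are made-up illustrative numbers, not Jaguar or GDDR5 specs:

```python
import math

# Toy model of why OoO helps with long memory latency: independent cache
# misses can overlap (memory-level parallelism), dependent ones cannot.
MEM_LATENCY = 100  # cycles per miss to main memory (illustrative)

def dependent_chain(misses: int) -> int:
    """Each load needs the previous result (pointer chasing): fully serial."""
    return misses * MEM_LATENCY

def independent_loads(misses: int, max_outstanding: int) -> int:
    """An OoO core keeps up to `max_outstanding` misses in flight at once."""
    batches = math.ceil(misses / max_outstanding)
    return batches * MEM_LATENCY

print(dependent_chain(10))       # 1000 cycles: every miss waits for the last
print(independent_loads(10, 8))  # 200 cycles: misses overlap in two batches
```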

The long latency of GDDR5 will hurt the cores, but nothing like the PS360 in-order crap.

Cheers
 
The Sony APU seems pretty big; maybe they managed to fit a 512-bit bus in there, so they could go for lower-clocked GDDR5 to get that 176GB/s bandwidth. But still, 8GB of GDDR5 is a lot of chips one way or another.
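The trade-off is just pins × per-pin data rate ÷ 8. The per-pin rates below are illustrative picks that hit the 176GB/s figure, not confirmed specs:

```python
# Peak GDDR5 bandwidth: bus width (pins) * per-pin data rate / 8 bits per byte.
# The data rates are hypothetical examples, not known PS4 numbers.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and per-pin rate."""
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gbs(256, 5.5))   # 176.0 GB/s with fast chips on a 256-bit bus
print(bandwidth_gbs(512, 2.75))  # 176.0 GB/s with half-speed chips on 512-bit
```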
 
Taking a gross overestimate, 8Mbps = 60MB/minute. 15 minutes will amount to only 900MB. Not too much.
Thinking about that, you're probably right. Writing to RAM would avoid the HDD thrashing that constant recording could induce. I have one question though: how is a rolling 15 minutes of compressed video structured? Once you've recorded 15 minutes, can you just wrap the write pointer back to the start of the buffer and overwrite the beginning? How does the video codec handle that?

Hardly. GDDR5 latency is much, much worse than DDR3's. Looking at them side by side makes GDDR5 look incredibly slow in terms of latency. The GPU won't care, but the CPU cores are going to have a hard time with it.
'Hard time' is probably an exaggeration. The cache is going to deal with most memory requests. Long ago we had someone present measurements of how effective cache was, and misses were surprisingly few and far between. Techniques learnt out of necessity this gen structure data types better too, so cache misses should be rarer still. The end result is that, for a lot of algorithms, data flow should be smooth. Only when the cache fails will there be a massive stall, but all in all that should be something devs can factor in when balancing 8 cores. I'd describe GDDR5's impact as a painful, sporadic inconvenience rather than giving the cores a hard time.
 
Thinking about that, you're probably right. Writing to RAM would avoid the HDD thrashing that constant recording could induce. I have one question though: how is a rolling 15 minutes of compressed video structured? Once you've recorded 15 minutes, can you just wrap the write pointer back to the start of the buffer and overwrite the beginning? How does the video codec handle that?

You would need to buffer slightly more than 15 minutes of footage because of how H.264 encodes.

To expand: if you trash the reference frame of a string of frames, then every frame that relies on that reference frame is useless.
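A sketch of what a rolling buffer that respects that constraint might look like. This is purely illustrative (not how Sony actually does it): the trick is to evict only whole GOPs, so no surviving frame is left referencing a discarded keyframe.

```python
from collections import deque

class RollingRecorder:
    """Rolling video buffer that evicts whole GOPs (keyframe + dependents)."""

    def __init__(self, capacity_bytes: int):
        self.capacity = capacity_bytes
        self.gops = deque()  # each entry: (frames, total size in bytes)
        self.used = 0

    def push_gop(self, frames: list[bytes]) -> None:
        """Append one closed GOP, evicting the oldest GOPs to stay in budget."""
        size = sum(len(f) for f in frames)
        self.gops.append((frames, size))
        self.used += size
        # Evict oldest whole GOPs until we fit; evicting partial GOPs would
        # orphan frames whose reference frame has been overwritten.
        while self.used > self.capacity and len(self.gops) > 1:
            _, old_size = self.gops.popleft()
            self.used -= old_size
```

So the "slightly more than 15 minutes" above falls out naturally: you hold full GOPs until dropping the oldest one still leaves you with at least 15 minutes of playable footage.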
 