PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Do you think the PS4's APU will have a unified address space and some kind of L3, or will the GPU have to fetch CPU data from the CPU's L2, and the CPU from the GPU's L2?


Does this sound like Unified Address Space?



From a poster on Ars Technica, a day after the PlayStation Meeting and before VGLeaks posted the info about the GPU – compute, queues and pipelines:

http://arstechnica.com/civis/viewtopic.php?p=23922967#p23922967

Blacken00100 said:
Posted: Thu Feb 21, 2013 1:38 am

So, a couple of random things I've learned:

-It's not stock x86; there are eight very wide vector engines and some other changes. It's not going to be completely trivial to retarget to it, but it should shut up the morons who were hyperventilating at "OMG! 1.6 JIGGAHURTZ!".

-The memory structure is unified, but weird; it's not like the GPU can just grab arbitrary memory like some people were thinking (rather, it can, but it's slow). They're incorporating another type of shader that can basically read from a ring buffer (supplied in a streaming fashion by the CPU) and write to an output buffer. I don't have all the details, but it seems interesting.

-As near as I'm aware, there's no OpenGL or GLES support on it at all; it's a lower-level library at present. I expect (no proof) this will change because I expect that they'll be trying to make a play for indie games, much as I'm pretty sure Microsoft will be, and trying to get indie developers to go turbo-nerd on low-level GPU programming does not strike me as a winner.



[Image: gpu_queues.jpg – GPU compute queues diagram]
 
But that already exists in current GCN with its ring buffer and 2 ACEs, right?

Yep, I think so. Those data structures should be in some sort of shared memory between the CPU and GPU.

For compute work, it has 7 more ring buffers per queue, likely to improve utilization. One or more of these buffers should (always) have data for the GPU to work on.

The GPU should be able to fetch data via DMA, or access memory directly too.
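
To make the streaming idea concrete, here is a minimal single-producer/single-consumer ring buffer sketch in C. The slot count, payload size and names are hypothetical illustrations, not Sony's or AMD's actual queue format; the point is just that the CPU publishes work with a release store and the consumer on the GPU side drains it without locks.

```c
#include <stdatomic.h>
#include <stdint.h>
#include <string.h>

#define RING_SLOTS 8        /* power of two, so indices can be masked   */
#define SLOT_BYTES 256      /* hypothetical payload size per work item  */

typedef struct {
    _Atomic uint32_t head;  /* next slot the producer (CPU) writes      */
    _Atomic uint32_t tail;  /* next slot the consumer (GPU side) reads  */
    uint8_t slot[RING_SLOTS][SLOT_BYTES];
} ring_t;

/* CPU side: stream one work item into the ring. Returns 0 if full. */
int ring_push(ring_t *r, const void *item, size_t len)
{
    if (len > SLOT_BYTES)
        return 0;
    uint32_t head = atomic_load_explicit(&r->head, memory_order_relaxed);
    uint32_t tail = atomic_load_explicit(&r->tail, memory_order_acquire);
    if (head - tail == RING_SLOTS)      /* consumer has fallen behind   */
        return 0;
    memcpy(r->slot[head & (RING_SLOTS - 1)], item, len);
    /* Release: the payload must be visible before the new head is.    */
    atomic_store_explicit(&r->head, head + 1, memory_order_release);
    return 1;
}

/* Consumer side: drain one work item. Returns 0 if the ring is empty. */
int ring_pop(ring_t *r, void *out, size_t len)
{
    uint32_t tail = atomic_load_explicit(&r->tail, memory_order_relaxed);
    uint32_t head = atomic_load_explicit(&r->head, memory_order_acquire);
    if (tail == head)                   /* nothing queued               */
        return 0;
    memcpy(out, r->slot[tail & (RING_SLOTS - 1)], len);
    atomic_store_explicit(&r->tail, tail + 1, memory_order_release);
    return 1;
}
```

With several such rings per queue, the scheduler can pick whichever one has work pending, which is presumably the point of the extra ring buffers per compute queue.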
 
Hire some white hats to pummel the system with every trick they can think of, I can hope.
They hired Philip Reitinger in 2011, he's now Senior Vice President and Chief Information Security Officer at Sony.
http://en.wikipedia.org/wiki/Phil_Reitinger
Phil Reitinger was the Deputy Under Secretary of the National Protection and Programs Directorate (NPPD) and Director of the National Cyber Security Center (NCSC) at the United States Department of Homeland Security. Reitinger led the Department's integrated efforts to reduce risks across physical and cyber infrastructures and to help secure federal networks and systems by collecting, analyzing, integrating and sharing information among interagency partners.
I would think this guy can secure a system and a network. :LOL:
 
They hired Philip Reitinger in 2011, he's now Senior Vice President and Chief Information Security Officer at Sony.
http://en.wikipedia.org/wiki/Phil_Reitinger
I would think this guy can secure a system and a network. :LOL:

I guess, but one would assume that Sony had some guy in charge of security before. What I really meant by "hire a few white hats" is the Google Chrome approach of paying people who find bugs and exploits. A $50,000 reward is cheap money to Sony, but to some hacker who might be tempted by the "dark side", that's still a lot of coin.
 
They also talked to Geohot (out of court that is) a little while back. I think after they settled? There was an article about that. He probably gave them some tips :)

Preventing memory bus glitching would go a long way, I think, however they could go about doing that. I imagine once everything goes on an interposer, that kind of attack becomes harder to pull off.
 
How will Sony implement security this time round?

Probably...

Custom ARM processor to supervise I/O, including background download/upload.

Unspecified amount of Flash RAM accessible only to the OS (for firmware, and working storage).

Constant online checks via PSN.

H/w security features to guard against h/w level attacks.

EDIT:
If DS4 has biometric sensors, then it will/can be used to authorize payment too. ;-)
 
Probably...

Custom ARM processor to supervise I/O, including background download/upload.

Unspecified amount of Flash RAM accessible only to the OS (for firmware, and working storage).

Constant online checks via PSN.

H/w security features to guard against h/w level attacks.

EDIT:
If DS4 has biometric sensors, then it will/can be used to authorize payment too. ;-)


And maybe they can use the RFID reader to scan your credit card.
 
Reading about the VCE unit in Radeons, it seems there are two encoding methods: a less intensive real-time one that only uses the resources of the VCE block, and a "hybrid mode" that encodes faster than real time but also uses GPU resources.

Do you think they will have to use the latter for Gaikai/streaming? Then some GPU resources will be drained from game developers to do so. How many FLOPS does this encoding method use?

[Image: HD7970-206.jpg – AMD VCE presentation slide]


PS: looking for benchmarks of the VCE, I see they're hard to find (it wasn't exposed in the Catalyst drivers, at least in the beginning). Intel Quick Sync is much, much faster in this encoding benchmark of the Trinity APU:

http://www.tomshardware.com/reviews/a10-4600m-trinity-piledriver,3202-15.html
 
According to the net, Gaikai implemented an iPad client before. Probably depends on the video quality to a certain extent.
 
According to the net, Gaikai implemented an iPad client before. Probably depends on the video quality to a certain extent.

Gaikai uses x264 CPU encoding, but prior to encoding it downscales the image (the PS4's scan-out engine allows downscaling). Full x264 encoding is very CPU intensive, so we can hope for the PS4 (and Xbox) to make use of the VCE. But seeing that it seems quite a weak performer, I fear it will drain GPU compute power to do so.

Patsu, and this is when we start to miss those fantastic SPEs...
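
For a sense of what the software path costs, this is roughly how a low-latency encode is set up against the real x264 C API. The preset, tune and profile strings are genuine x264 options; the resolution, frame rate and bitrate are illustrative guesses at what a streaming service might pick, not confirmed Gaikai settings.

```c
#include <x264.h>

/* Open a low-latency H.264 software encoder, roughly as a game
 * streaming service might: downscaled frame, capped bitrate,
 * zero-latency tuning. Returns NULL on failure. */
x264_t *open_stream_encoder(int width, int height)
{
    x264_param_t p;

    /* "ultrafast" + "zerolatency" trade quality for CPU time/latency. */
    if (x264_param_default_preset(&p, "ultrafast", "zerolatency") < 0)
        return NULL;

    p.i_width   = width;    /* e.g. 1280x720 after a scan-out downscale */
    p.i_height  = height;
    p.i_fps_num = 60;       /* assumed target frame rate                */
    p.i_fps_den = 1;
    p.rc.i_rc_method = X264_RC_ABR;
    p.rc.i_bitrate   = 8000;  /* ~8 Mbps; x264 takes kbit/s here        */

    /* Baseline profile keeps the stream decodable on weak clients.     */
    if (x264_param_apply_profile(&p, "baseline") < 0)
        return NULL;

    return x264_encoder_open(&p);
}
```

Even with "ultrafast", sustaining 60 fps of this on a Jaguar-class CPU is a real cost, which is why offloading to fixed-function hardware like the VCE is attractive.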
 
Hm? It should only encode for Remote Play on Vita.

If it's a client, it should decode the video images in the VCE. If an iPad can run Gaikai client, the PS4 should be able to do so easily.

We should also be able to see the SPUs in action for the PS3 client.
 
Hm? It should only encode for Remote Play on Vita.

If it's a client, it should decode the video images in the VCE. If an iPad can run Gaikai client, the PS4 should be able to do so easily.

We should also be able to see the SPUs in action for the PS3 client.

For decoding you have the UVD decoder in the PS4 (to play videos and such); I know the iPad uses GPU acceleration for that, and the PS3 uses the SPEs. The VCE is for encoding only (the Gaikai server part, not the client). Apart from encoding to stream games to the Vita, it will be used to stream your games from the PS4 to Ustream while you play them. To make it fast they will surely use Gaikai's software and method (the server part). Or that's what I understand.

So, if the VCE needs OpenCL assistance (the GPU) to encode, what I meant is that a single SPE would have been enough to encode at ultra-fast speed without needing any help.
 
Going by HD camcorders, H.264 real-time encoding isn't demanding or expensive in custom silicon. If AMD's solution isn't that capable, they should stick in some licensed silicon that is.
 
None of the PS4 streaming features require faster than real-time encoding. That's the whole point of the DVR and spectating: the footage is encoded as it happens. Benchmark comparisons are pretty meaningless, as are any GPGPU enhancements to speed up the process. The VCE just needs to be fast enough not to drop a significant number of frames while encoding directly from the framebuffer, including any potential resizing, or multiplexing to composite in audio commentary or your PS4 Eye video feed. From there it's either dumping a temporary MP4 to the hard drive or streaming live to Ustream and/or your PSN friends over your upstream internet connection.

Clients for Remote Play or Gaikai features only need to be able to decode a relatively low bitrate h.264 video stream (say 8 Mbps, well below a good quality Blu-ray encode). Today that includes basically every device in existence.
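
The arithmetic behind "fast enough" is simple; a quick sketch of it in C (the 8 Mbps figure is the one assumed above, and 60 fps is an assumption, not a confirmed spec):

```c
#include <stdio.h>

int main(void)
{
    const double fps         = 60.0;  /* assumed target frame rate       */
    const double bitrate_bps = 8e6;   /* the 8 Mbps stream assumed above */

    /* Real time means each frame must be captured, scaled, encoded
     * and muxed before the next one arrives. */
    double frame_budget_ms = 1000.0 / fps;            /* ~16.7 ms/frame */
    double bytes_per_frame = bitrate_bps / fps / 8.0; /* ~16.7 KB/frame */

    printf("per-frame time budget: %.1f ms\n", frame_budget_ms);
    printf("per-frame output size: %.1f KB\n", bytes_per_frame / 1000.0);
    return 0;
}
```

As long as each frame is encoded inside that ~16.7 ms window, wall-clock transcoding benchmarks say nothing about suitability for live streaming.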
 
But seeing that it seems quite a weak performer, I fear it will drain GPU compute power to do so.

The "VCE/full mode" is a low power system for real-time H264 encoding. The fact that the VCE takes 3:17 to transcode a 1+ hour video is 'not a bad thing'...

In theory the VCE seems a good match for "PS4 share", but it's possible that Sony are using something else.
 
None of the PS4 streaming features require faster than real-time encoding. That's the whole point of the DVR and spectating: the footage is encoded as it happens. Benchmark comparisons are pretty meaningless, as are any GPGPU enhancements to speed up the process. The VCE just needs to be fast enough not to drop a significant number of frames while encoding directly from the framebuffer, including any potential resizing, or multiplexing to composite in audio commentary or your PS4 Eye video feed. From there it's either dumping a temporary MP4 to the hard drive or streaming live to Ustream and/or your PSN friends over your upstream internet connection.

Clients for Remote Play or Gaikai features only need to be able to decode a relatively low bitrate h.264 video stream (say 8 Mbps, well below a good quality Blu-ray encode). Today that includes basically every device in existence.

Yes, and by the PowerPoint I posted it seemed like the VCE was capable of real-time encoding at 1080p60, but I have found no review on the net able to confirm this (quite the contrary, what I found is that it's quite shitty without OpenCL).
 