PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Kaveri unveiled, with both GDDR5 and DDR3 possible, though mutually exclusive (the GDDR5 interface is 128 bits wide with 2 channels, versus the PS4's 256-bit, 4-channel setup):

http://www.brightsideofnews.com/news/2013/3/5/amd-kaveri-unveiled-pc-architecture-gets-gddr5.aspx
I would bet it's just DDR4-3200 support. Would perfectly explain the "800MHz QDR" BSN is talking about and makes more sense than GDDR5 with just 3.2 Gbps. If you plug Kaveri into an FM2 socket you have DDR3 support, in FM3 you get DDR4.
 
I would bet it's just DDR4-3200 support. Would perfectly explain the "800MHz QDR" BSN is talking about and makes more sense than GDDR5 with just 3.2 Gbps. If you plug Kaveri into an FM2 socket you have DDR3 support, in FM3 you get DDR4.

Yeah, that makes more sense. Do you know if this is fabbed on GlobalFoundries' 28nm process? I thought it was late, and that's why the PS4 changed to Kabini.
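
For a rough sense of scale, here is the peak-bandwidth arithmetic under the figures being thrown around (3.2 Gbps per pin on a 128-bit Kaveri interface, versus 5.5 Gbps per pin on the PS4's announced 256-bit GDDR5 bus). Back-of-the-envelope numbers, not official specs:

```python
# Back-of-the-envelope peak memory bandwidth comparison.
# Peak bandwidth (GB/s) = data rate per pin (Gbps) * bus width (bits) / 8

def peak_bandwidth_gbs(gbps_per_pin, bus_width_bits):
    return gbps_per_pin * bus_width_bits / 8

# Kaveri rumour: "800 MHz QDR" = 3.2 Gbps per pin on a 128-bit interface
kaveri = peak_bandwidth_gbs(3.2, 128)   # 51.2 GB/s
# PS4 as announced: GDDR5 at 5.5 Gbps per pin on a 256-bit interface
ps4 = peak_bandwidth_gbs(5.5, 256)      # 176.0 GB/s

print(f"Kaveri (rumoured): {kaveri:.1f} GB/s")
print(f"PS4 (announced):   {ps4:.1f} GB/s")
```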
 
Yes, and from the PowerPoint I posted it seemed like VCE was capable of realtime encoding at 1080p60, but I have found no review on the net that can confirm this (quite the contrary: what I found is that it is quite shitty without OpenCL).

It's unfortunate that the Tom's Hardware review you linked doesn't say how long the 681MB 1080p AVC file they use in their benchmark actually is. Based on the file size I would guess it's 5-8 minutes or so, and the encode completes in 2 minutes and 54 seconds using the VCE.
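
As a rough sanity check on that guess, you can work backwards from the file size with an assumed bitrate; at typical 1080p AVC bitrates the clip would indeed land in roughly the 6-9 minute range:

```python
# Rough clip-length estimate: duration (s) = file size (bits) / bitrate (bits/s).
FILE_SIZE_MB = 681

for bitrate_mbps in (10, 12, 15):   # plausible 1080p AVC bitrates (assumed)
    minutes = FILE_SIZE_MB * 8 / bitrate_mbps / 60
    print(f"at {bitrate_mbps} Mbps -> about {minutes:.1f} minutes")
# at 10 Mbps -> ~9.1 min, at 12 Mbps -> ~7.6 min, at 15 Mbps -> ~6.1 min
```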
 
None of the PS4 streaming features require faster than real-time encoding. That's the whole point of the DVR and spectating: the footage is encoded as it happens. Benchmark comparisons are pretty meaningless, as are any GPGPU enhancements to speed up the process. The VCE just needs to be fast enough not to drop a significant number of frames while encoding directly from the framebuffer, including any potential resizing, or multiplexing to composite in audio commentary or your PS4 Eye video feed. From there it's either dumping a temporary MP4 to the hard drive or streaming live to Ustream and/or your PSN friends over your upstream internet connection.

Clients for RemotePlay or Gaikai features only need to be able to decode a relatively low-bitrate h.264 video stream (say 8Mbps, well below a good quality Blu-ray encode). Today that includes basically every device in existence.
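
To put that 8 Mbps figure in perspective, here's the back-of-the-envelope data-rate arithmetic (the Blu-ray number is a ballpark, not a spec):

```python
# Data-rate arithmetic: what the client actually has to pull down and decode.
def megabytes_per_minute(mbps):
    return mbps / 8 * 60

stream_mbps = 8    # assumed RemotePlay/Gaikai-class stream
bluray_mbps = 30   # ballpark for a good-quality Blu-ray AVC encode

print(f"{stream_mbps} Mbps stream:  {megabytes_per_minute(stream_mbps):.0f} MB per minute")
print(f"{bluray_mbps} Mbps Blu-ray: {megabytes_per_minute(bluray_mbps):.0f} MB per minute")
# 8 Mbps -> 60 MB/min (1 MB/s); 30 Mbps -> 225 MB/min
```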

There was also an idea that you could remote control the session you're watching, so that you could help someone pass a difficult section.
 
To decode you have the UVD decoder in the PS4 (to play videos and such); I know the iPad uses GPU acceleration for that, and the PS3 uses the SPEs. The VCE is for encoding only (the Gaikai server part, not the client). Apart from using encoding to stream games to Vita, it will be used to stream your games from the PS4 to Ustream while you play them. To make it fast they will surely use the Gaikai software and method (the server part). Or that's what I understand.

So, if the VCE needs OpenCL (GPU) assistance to encode, what I meant is that a simple SPE would have been enough to encode at ultra-fast speed without needing any assistance.

Haha I stand corrected. UVD for decode. VCE for encode.

If the user uses RemotePlay and Ustream at the same time, it is possible that the Ustream will be handled by Vita instead. I noticed that Ustream may include a live PIP of the player's face.

Now assuming that the PS4 wants to handle RemotePlay, the 15-minute share and/or Ustream all at the same time, then the VCE should be the "DVR" and Ustream encoder. Both can drop frames and quality to give more priority to the RemotePlay game. "Shitty video" is not a technical term, but the encoded video should have a lower level and profile to accommodate Internet bandwidth and server file size restrictions. I wouldn't expect a commercial movie experience here.
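
Just to illustrate the idea (this is a made-up toy model, not anything Sony has described): a scheduler could give the RemotePlay encode its share of the frame budget first and simply drop the DVR/Ustream frame whenever there isn't enough time left.

```python
# Toy model of priority-based encoding: RemotePlay gets the frame budget first,
# the DVR/Ustream encode only runs if there is time left, otherwise its frame
# is dropped. All numbers are invented for illustration.

FRAME_BUDGET_MS = 1000 / 60   # ~16.7 ms per frame at 60 fps

def schedule_frame(remoteplay_cost_ms, dvr_cost_ms):
    """Return True if the low-priority DVR frame also fits in this interval."""
    remaining = FRAME_BUDGET_MS - remoteplay_cost_ms
    return dvr_cost_ms <= remaining

# RemotePlay encode cost fluctuates; the DVR frame costs a fixed 6 ms here.
for i, rp_cost in enumerate([9.0, 14.0, 10.0, 15.5]):
    ok = schedule_frame(rp_cost, 6.0)
    print(f"frame {i}: RemotePlay encoded, DVR {'encoded' if ok else 'frame dropped'}")
```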

I am more curious about RemotePlay vs the foreground app. Will they both render on the same queue, or will one of them take the high-priority VShell queue?

EDIT: These are just my guesses/thoughts. Take them with a pinch of salt.

I think the video quality will be affected more by the network behavior, say when watching HD Netflix while trying to do all of the above. Your network may be shared by other folks doing their own things too.
 
There was also an idea that you could remote control the session you're watching, so that you could help someone pass a difficult section.

I think this would be a separate technology altogether. Both users will need to have a copy of the game. It would be like Co-op except that the host doesn't play. They should not need VCE or UVD here.
 
I think this would be a separate technology altogether. Both users will need to have a copy of the game. It would be like Co-op except that the host doesn't play. They should not need VCE or UVD here.

Will they? My understanding of that feature was it took advantage of the live streaming.
 
Will they? My understanding of that feature was it took advantage of the live streaming.

Doubtful. Even if they organize the meetup in the live streaming theater, the game has to adjust for lag to support a remote player. I would think this is standard co-op affairs.

Today, almost every weekday, I am already helping Dark Souls players clear their bosses as a summoned Phantom.

If the enemies are other invading players in bad networks, I will need to adjust for lag too (i.e., swing weapon before the enemies come in range). This is where experience counts a little.

To reduce lag, they may host the game on a server though (e.g., host both on Gaikai).

EDIT: Ok, I see you meant Gaikai streaming. In that case, yes, they could switch the player session and have the host become a spectator, and then pass it back. In this way, the game may not need to be co-op aware. Brilliant!
 
Haha I stand corrected. UVD for decode. VCE for encode.

If the user uses RemotePlay and Ustream at the same time, it is possible that the Ustream will be handled by Vita instead. I noticed that Ustream may include a live PIP of the player's face.

Now assuming that the PS4 wants to handle RemotePlay, the 15-minute share and/or Ustream all at the same time, then the VCE should be the "DVR" and Ustream encoder. Both can drop frames and quality to give more priority to the RemotePlay game. "Shitty video" is not a technical term, but the encoded video should have a lower level and profile to accommodate Internet bandwidth and server file size restrictions. I wouldn't expect a commercial movie experience here.

I am more curious about RemotePlay vs the foreground app. Will they both render on the same queue, or will one of them take the high-priority VShell queue?

EDIT: These are just my guesses/thoughts. Take them with a pinch of salt.

I think the video quality will be affected more by the network behavior, say when watching HD Netflix while trying to do all of the above. Your network may be shared by other folks doing their own things too.

Well, "shitty" i meant for getting a "quality of service" for all the stuff they want to make with streaming ( in theory ) without having to be assisted by shaders.
 
It will generally be shitty because our home networks are not professionally run broadcast networks. Even if you try to encode a high quality video using the GPU, it may choke while Ustreaming because the bitrate is too high for your network and the Internet.
 
Sure, but on the other hand the people who care about features like these are also more likely to have good internet connections. And you don't need that much anyway to get it to be 'good enough'.
 
Doubtful. Even if they organize the meetup in the live streaming theater, the game has to adjust for lag to support a remote player. I would think this is standard co-op affairs.

Today, almost every weekday, I am already helping Dark Souls players clear their bosses as a summoned Phantom.

If the enemies are other invading players in bad networks, I will need to adjust for lag too (i.e., swing weapon before the enemies come in range). This is where experience counts a little.

To reduce lag, they may host the game on a server though (e.g., host both on Gaikai).

EDIT: Ok, I see you meant Gaikai streaming. In that case, yes, they could switch the player session and have the host become a spectator, and then pass it back. In this way, the game may not need to be co-op aware. Brilliant!

No, he was right. Sony was describing basically a form of PS4-to-PS4 Remote Play. If I get stuck and you're online, the game is still running on my machine with a video stream being transmitted to you, and you can take control without having the game.
 
Yeah, I realized that after giving it more thought. "Live streaming" implies "Gaikai streaming" in Scott_Arm's post; I thought it was "Ustreaming".

I suspect it's done via Gaikai. They simply switch the player session around (from spectate to playing and vice versa). Potentially, any game should just work.

My guess is for best/acceptable performance, we may need to run this off Sony servers instead of our own PS4 though.
 
Sure, but on the other hand the people who care about features like these are also more likely to have good internet connections. And you don't need that much anyway to get it to be 'good enough'.

Average upload speeds are not that great... http://www.netindex.com/upload/

Yeah... par for the course.

In independent VCE/UVD reviews, the authors have no specific context to qualify the output quality and performance except against standard H.264 profiles for offline playback (e.g., Blu-ray movies, PC x264 performance).

For live home streaming video, the implementers shouldn't go beyond average home network specs. Anything beyond that is likely a waste of time and resources. Good enough for most users should be fine.
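
One way to read "don't go beyond the average network specs": pick the stream bitrate from the measured upload speed with some headroom, rather than chasing a fixed quality target. A hypothetical sketch; the bitrate ladder and 75% headroom factor are invented for illustration:

```python
# Pick a streaming bitrate from the measured upload bandwidth, leaving headroom
# for everything else on the connection. Ladder and headroom factor are invented.

BITRATE_LADDER_MBPS = [1.5, 2.5, 4.0, 6.0, 8.0]   # hypothetical quality tiers
HEADROOM = 0.75                                   # use at most 75% of upstream

def pick_bitrate(upload_mbps):
    usable = upload_mbps * HEADROOM
    fitting = [b for b in BITRATE_LADDER_MBPS if b <= usable]
    return max(fitting) if fitting else None       # None = don't even try

for upload in (1.0, 3.0, 5.0, 10.0):
    choice = pick_bitrate(upload)
    print(f"{upload:>4} Mbps upstream -> {choice if choice else 'too slow to stream'}")
```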
 
For Gaikai/RemotePlay, VCE in realtime is enough; no need for performance-stealing GPU-enhanced encoding.

For Ustream, it should also be encoded in realtime, so the PS4 only stores the compressed video (no way it would keep 15 minutes of uncompressed video around "just in case").
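
The arithmetic backs that up: fifteen minutes of uncompressed 1080p60 would run to hundreds of gigabytes, while the same footage encoded at a modest (assumed) bitrate fits in under a gigabyte.

```python
# Why the 15-minute buffer has to be stored compressed.
SECONDS = 15 * 60
WIDTH, HEIGHT, FPS = 1920, 1080, 60
BYTES_PER_PIXEL = 4                     # e.g. an 8-bit RGBA framebuffer

uncompressed_gb = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS * SECONDS / 1e9
print(f"Uncompressed 1080p60, 15 min: {uncompressed_gb:.0f} GB")      # ~448 GB

BITRATE_MBPS = 8                        # assumed AVC encode bitrate
compressed_gb = BITRATE_MBPS * 1e6 / 8 * SECONDS / 1e9
print(f"Compressed at {BITRATE_MBPS} Mbps, 15 min: {compressed_gb:.1f} GB")  # ~0.9 GB
```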

However, Sony does have its own video technology; it might be cheaper for them not to license the VCE block and to include some of their own IP instead. After all, AMD says there's plenty of Sony IP in the APU.

How about opening a thread to speculate on the technologies that Sony brought in? It should be even funnier than current threads... :p /jk
 
Yeah, I realized that after giving it more thought. "Live streaming" implies "Gaikai streaming" in Scott_Arm's post; I thought it was "Ustreaming".

I suspect it's done via Gaikai. They simply switch the player session around (from spectate to playing and vice versa). Potentially, any game should just work.

My guess is for best/acceptable performance, we may need to run this off Sony servers instead of our own PS4 though.

That might seem more seamless, but you're basically sentencing the guy who actually owns the game to the compromised cloud experience rather than the local game he paid for. No one claimed the remote player would have as good an experience as the host, but the technology to seamlessly hand off a local game to a cloud server probably isn't feasible in the near term, and a peer-streamed gameplay session is probably good enough for limited use in this fashion. In any case, peer-streamed remote control is what they announced at the Feb 20 event, not the cloud arrangement you're describing. How good that is will obviously depend on how good your upstream connection is, how close your friend lives, etc.

I feel like I should also make clear based on the discussions happening in this thread right now that we can safely assume using certain streaming features will prevent you from using others. Sony hasn't promised you can Ustream a game at the same time as you are using remote play.

Also, there is never a case where this gameplay video will be uncompressed. All use cases involve automatic real-time compression using the VCE.
 
Gaikai WAN peer streaming is interesting, but the experience may not be consistent, as you mentioned. It would imply 1 spectator and 1 player by default, which is more demanding than straight RemotePlay -- unless the host can't watch his helper play. Has it been done before? I need to rewatch their presentation to see what I have missed.

To hand off a game session, it would be like transfarring, or "resume PS3 game on Vita": the game's saved state (a save point) is transferred to another machine (with the game) to play. A Gaikai server sounds robust enough.
 
For Gaikai/RemotePlay, VCE in realtime is enough; no need for performance-stealing GPU-enhanced encoding.

For Ustream, it should also be encoded in realtime, so the PS4 only stores the compressed video (no way it would keep 15 minutes of uncompressed video around "just in case").

However, Sony does have its own video technology; it might be cheaper for them not to license the VCE block and to include some of their own IP instead. After all, AMD says there's plenty of Sony IP in the APU.

How about opening a thread to speculate on the technologies that Sony brought in? It should be even funnier than current threads... :p /jk

Nah, there isn't plenty of Sony IP; IMHO it's only the sound chip.
 
For Ustreaming a Gaikai (or RemotePlay) session, the easiest way is to push the responsibility to the client since the Ustream UI also shows a player video chat window -- assuming the network is robust and the client is powerful enough.

Yes, my expectation is that at launch we will "only" get a subset of what's announced. The OS may not multitask everything pre-emptively and freely. Perhaps, like the Vita, there will be some restrictions for various reasons.
 