Game Streaming Platforms and Technology (xCloud, PSNow, GeforceNow, Luna) (Rip: Stadia)

Daft question alert:

With the talk of VRR for next gen it got me thinking how this might apply to game streaming.

Can you encode a video stream with variable frame times, and if so, can a VRR screen display each frame as it arrives to try and avoid frame-time fluctuations? The time from server to client will fluctuate, so the quicker a frame can leave, the better the chance it arrives in time; if it doesn't, VRR would mean it can still be displayed without waiting for the next natural refresh?
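Roughly what I have in mind, as a made-up sketch (fixed 60 Hz panel, invented jitter numbers, not any real streaming API): with vsync the frame waits for the next refresh tick after it arrives, while VRR can refresh as soon as the frame is ready (within the panel's supported range).

```python
# Hypothetical sketch: compare when a jittery streamed frame gets shown on a
# fixed 60 Hz display (wait for the next vsync) vs. a VRR display that can
# start a refresh as soon as the decoded frame is ready. All numbers invented.
import math
import random

REFRESH_MS = 1000.0 / 60.0   # fixed refresh interval (~16.7 ms)
random.seed(1)

# Frames leave the server every ~16.7 ms and pick up 0-8 ms of network jitter.
arrivals = [i * REFRESH_MS + random.uniform(0.0, 8.0) for i in range(8)]

for arrival in arrivals:
    vsync_present = math.ceil(arrival / REFRESH_MS) * REFRESH_MS  # next tick
    vrr_present = arrival  # VRR (inside its range) refreshes on arrival
    print(f"arrived {arrival:6.1f} ms | vsync shows at {vsync_present:6.1f} ms"
          f" | VRR shows at {vrr_present:6.1f} ms")
```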

I lack a deep understanding of both vsync and VRR despite watching many videos on them.

What if we invert that and the cloud server actually delays its own frame start time, effectively allowing extra time for controller response, and then uses dynamic resolution or VRS to generate the frame in whatever budget it now has. The higher your latency, the more the server reduces quality, so it can hang on as late as possible for input.
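As a back-of-the-envelope sketch of that inversion (all timings are invented, this isn't how any service actually schedules frames): the later the server starts the frame to wait for input, the smaller the render budget, so it scales resolution to fit.

```python
# Invented numbers only: delay the frame start to absorb more controller input,
# then pick a dynamic-resolution scale so rendering still fits the leftover
# budget. Render cost is assumed to scale with pixel count.
FRAME_INTERVAL_MS = 1000.0 / 60.0   # 60 fps server frame budget
ENCODE_MS = 3.0                     # assumed encode cost per frame
FULL_RES_RENDER_MS = 12.0           # assumed render cost at 100% resolution

def plan_frame(extra_input_wait_ms: float):
    """Return (render budget in ms, per-axis resolution scale) for one frame."""
    render_budget = FRAME_INTERVAL_MS - ENCODE_MS - extra_input_wait_ms
    area_scale = max(0.1, min(1.0, render_budget / FULL_RES_RENDER_MS))
    return render_budget, area_scale ** 0.5  # per-axis scale = sqrt(area scale)

for wait_ms in (0.0, 2.0, 4.0, 6.0):
    budget, axis_scale = plan_frame(wait_ms)
    print(f"wait {wait_ms:3.1f} ms for input -> {budget:4.1f} ms to render, "
          f"resolution scale {axis_scale:.2f} per axis")
```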

Again, possibly daft questions, but it seems cloud will benefit from the sum of many small gains, so every little helps.

I asked a similar question a while ago.

https://forum.beyond3d.com/posts/2063371/
 

In a video stream, each frame has a timestamp that tells the decoder when to display it, so in theory VRR is entirely possible.
Network latency is a bigger problem, as it's very difficult to anticipate fluctuations in latency. The server can try to predict this, but at the timescale of networks, by the time you've detected congestion it's probably already several frames too late.
Note that streaming services normally don't use B-frames (bidirectional motion vectors) in order to reduce latency, so their bandwidth requirements will be higher than a normal YouTube or Netflix video.
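As an illustration of that point (not what any of these services actually run, just a generic libx264 example): disabling B-frames means no frame ever has to wait on a future frame before it can be encoded and sent.

```python
# Illustrative only: a generic ffmpeg/libx264 invocation for a low-latency
# stream with B-frames disabled. Source and destination are placeholders.
import subprocess

cmd = [
    "ffmpeg",
    "-re", "-i", "input.mp4",   # placeholder source, read at its native rate
    "-c:v", "libx264",
    "-bf", "0",                 # no B-frames: nothing references a future frame
    "-tune", "zerolatency",     # also drops lookahead/buffering added for quality
    "-preset", "veryfast",
    "-f", "mpegts",
    "udp://127.0.0.1:5000",     # placeholder destination
]
subprocess.run(cmd, check=True)
```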
 
Had a go at the demo of whatever Gods and Monsters is now called on Stadia. That worked pretty well on my laptop. Input latency was low. Image quality was ok but a bit variable.

It's quite slick, all in all. Can't imagine actually buying a game on there though. :D
 
I understand what they're doing and why, it's the next step in their data mining process. It's also a large step for their ad platform, allowing games to be played inside actual ads. I don't understand why anyone would opt to use it.



All at "no cost"...


Sort of off topic but I think worth a share.

An interesting video on this sort of data accumulation

Surveillance capitalism

 
A wise man I know personally would always say to me, "In a hundred years it won't matter."

Tommy McClain
 
I see this as something that could greatly reduce the bandwidth needed to stream video in the future (a rough sense of the potential saving is sketched after the links below). Right now it runs at 5 fps on an RTX 2080 Ti and still has a lot of errors, but things are moving fast.


More examples:
https://cgv.cs.nthu.edu.tw/InstColorization_data/Supplemental_Material/sup.html

White paper:
https://cgv.cs.nthu.edu.tw/InstColorization_data/InstaColorization.pdf
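To put a rough number on the bandwidth idea (raw, uncompressed 4:2:0 frames only; a real encoder already spends less on chroma, so the actual saving would be smaller): if the client recolours the image itself, the server only needs to send the luma plane.

```python
# Back-of-the-envelope only: raw YUV 4:2:0 frame sizes with and without the
# chroma planes, i.e. what could be skipped if the client recolours the image.
width, height = 1920, 1080
luma = width * height                       # Y plane, 1 byte per pixel
chroma = 2 * (width // 2) * (height // 2)   # Cb + Cr, each at quarter resolution
total = luma + chroma
print(f"full 4:2:0 frame: {total} bytes, luma only: {luma} bytes "
      f"({luma / total:.0%} of the original)")
```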


Interesting. I had been wondering whether the local device could apply advanced upscaling techniques to improve video streaming quality.

What's interesting is that it already exists today. The 2019 Nvidia Shield TV has built-in "AI Upscaling", which is apparently pretty good.



It was recently updated to handle extra resolutions and 60 fps (on the Shield TV Pro model), as well as GeForce Now:

https://blogs.nvidia.com/blog/2020/08/26/shield-upgrade-25/


I have also seen a video of a person using it with xCloud.


If these upscaling algorithms/hardware get better and start making their way into more and more simple media streaming devices/TVs, I could see them having a huge impact on game streaming.
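As a rough sketch of what the client-side step looks like (plain bicubic standing in for whatever learned upscaler a Shield-style device actually uses, and a random array standing in for the decoded frame):

```python
# Rough sketch only: take a lower-resolution decoded frame and upscale it on
# the client before display. Bicubic is a stand-in for a learned upscaler.
import numpy as np
import cv2  # opencv-python

# Stand-in for a decoded 720p frame from the video stream.
decoded_720p = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)

# Client-side upscale to the display's native 4K resolution.
upscaled_4k = cv2.resize(decoded_720p, (3840, 2160), interpolation=cv2.INTER_CUBIC)

print(decoded_720p.shape, "->", upscaled_4k.shape)
```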
 
I just hope that this will not become the “tv tv tv” of this generation; that they kneecapped their systems' potential because there was a very slight chance that streaming would take off this generation.
By the time it becomes popular, if it ever does, the refresh for the Pro consoles will be due, or maybe even the next generation. My 2 cents
 