Daft question alert:....
Compression is more of an issue: noise-like detail is hard to compress, and quality depends on motion vectors, which are currently also being downplayed as an issue.
Daft question alert:
I lack deep understanding of both vsync and VRR despite watching many videos on them, but with the talk of VRR for next gen it got me thinking about how this might apply to game streaming.
Can you encode a video stream with variable frame times, and if so, can a VRR screen present frames as they arrive to smooth out frame-time fluctuations? Transit time from server to client will fluctuate, so the sooner a frame can leave, the better its chance of arriving in time; and if it arrives late, VRR would let it be displayed without waiting for the next fixed refresh.
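Broadly yes: a VRR client can present a frame whenever it's ready rather than on a fixed tick. A minimal simulation of the difference, with made-up numbers for transit jitter and an assumed panel whose minimum frame time corresponds to 120 Hz:

```python
import random

VSYNC_PERIOD_MS = 1000.0 / 60.0     # fixed 60 Hz refresh slot
VRR_MIN_FRAME_MS = 1000.0 / 120.0   # assumed panel minimum frame time (120 Hz cap)

random.seed(1)

def simulate(frames=8):
    """Frames leave the server on a fixed 60 Hz cadence but arrive with
    network jitter. Compare when a fixed-refresh client can show them
    versus a VRR client that presents on arrival. Times in ms."""
    last_vrr = -1000.0
    for i in range(frames):
        sent = i * VSYNC_PERIOD_MS
        arrival = sent + 30.0 + random.uniform(-6.0, 6.0)   # jittery transit

        # Fixed refresh: the frame waits for the next vsync boundary.
        vsync_show = (int(arrival // VSYNC_PERIOD_MS) + 1) * VSYNC_PERIOD_MS

        # VRR: show immediately, limited only by the panel's minimum
        # frame time since the previous presentation.
        vrr_show = max(arrival, last_vrr + VRR_MIN_FRAME_MS)
        last_vrr = vrr_show

        print(f"frame {i}: arrives {arrival:6.1f}   "
              f"vsync shows {vsync_show:6.1f}   vrr shows {vrr_show:6.1f}")

simulate()
```

Under fixed vsync a late frame slips to the next whole 16.7 ms boundary, while the VRR client's presentation time simply tracks arrival, so network jitter shows up as small frame-time variation rather than a stutter.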
What if we invert that, and the cloud server deliberately retards its own frame start time, effectively allowing extra time for controller input to arrive, then uses dynamic resolution or VRS to generate the frame in whatever budget remains? The higher your latency, the more the server reduces quality, so it can hang on as late as possible to get your input.
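That inversion amounts to a latency-aware frame scheduler. A sketch of the idea, with every number (render cost, encode time, frame deadline, resolution floor) made up for illustration: the server samples input as late as the link allows, then scales resolution so the frame still makes its deadline.

```python
BASE_RENDER_MS = 14.0          # assumed render time at full resolution
MIN_RENDER_MS = 4.0            # assumed floor before the image gets useless
ENCODE_MS = 2.0                # assumed encode time per frame
FRAME_DEADLINE_MS = 33.3       # one frame per ~30 fps tick
FULL_RES = (3840, 2160)

def plan_frame(one_way_latency_ms):
    """Pick the latest input-sampling time and a render resolution that
    still lets the frame reach the client by its deadline."""
    render_budget = FRAME_DEADLINE_MS - one_way_latency_ms - ENCODE_MS
    render_budget = max(render_budget, MIN_RENDER_MS)
    # Render cost scales roughly with pixel count, so shrink the frame's
    # area until it fits the remaining budget (never scale above 1.0).
    area_scale = min(1.0, render_budget / BASE_RENDER_MS)
    axis_scale = area_scale ** 0.5
    res = (int(FULL_RES[0] * axis_scale), int(FULL_RES[1] * axis_scale))
    render_ms = BASE_RENDER_MS * area_scale
    # Latest moment input can be sampled; a negative value would mean the
    # link is too slow to hit this deadline at all.
    sample_at = FRAME_DEADLINE_MS - one_way_latency_ms - ENCODE_MS - render_ms
    return sample_at, res

for latency in (5.0, 15.0, 25.0):
    sample_at, res = plan_frame(latency)
    print(f"{latency:4.1f} ms link -> sample input at t+{sample_at:5.1f} ms, "
          f"render at {res[0]}x{res[1]}")
```

With these numbers, a 5 ms link lets the server wait about 12 ms before sampling input and still render at full resolution, while a 25 ms link forces immediate sampling and roughly a two-thirds scale per axis.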
Again, possibly daft questions, but it seems cloud streaming will be helped by the sum of many small gains, so every little helps.
It has begun, the streaming wars.
Mmmmmm. Begun, the streaming wars have.
So having read a bit more about Facebook Game Streaming, it's a very odd thing. They're focusing on streaming free-to-play mobile games to mobile devices.
https://arstechnica.com/gaming/2020...ffering-focuses-on-free-to-play-mobile-games/
It's a move about discovery, accessibility, and engagement.
I understand what they're doing and why: it's the next step in their data-mining process. It's also a large step for their ad platform, allowing games to be played inside actual ads. I don't understand why anyone would opt to use it.
All at "no cost"...
I see this instance-aware colorization work as something that could greatly reduce the bandwidth needed to stream video in the future. Right now it runs at 5 fps on an RTX 2080 Ti and still produces a lot of errors, but things are moving fast.
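For a sense of the ceiling on that saving, here is the raw-sample arithmetic behind streaming luma only and recolorizing client-side. Note this is only the uncompressed figure; real encoders already spend far less bitrate on chroma than on luma, so savings on an actual encoded stream would be smaller.

```python
W, H = 1920, 1080

# Raw 8-bit YUV 4:2:0: a full-resolution luma plane plus two
# quarter-resolution chroma planes.
luma_bytes = W * H
chroma_bytes = 2 * (W // 2) * (H // 2)
total_bytes = luma_bytes + chroma_bytes

saving = 1.0 - luma_bytes / total_bytes
print(f"luma + chroma: {total_bytes / 1e6:.2f} MB/frame")
print(f"luma only:     {luma_bytes / 1e6:.2f} MB/frame "
      f"({saving:.0%} fewer raw samples)")
```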
More examples:
https://cgv.cs.nthu.edu.tw/InstColorization_data/Supplemental_Material/sup.html
White paper:
https://cgv.cs.nthu.edu.tw/InstColorization_data/InstaColorization.pdf