Kahawai: High-Quality Mobile Gaming Using GPU Offload

Alucardx23

Microsoft’s new rendering research could bring better visuals to Xbox One, mobile devices

During its initial Xbox One reveal and subsequent E3 demonstrations, Microsoft mentioned a capability that we haven’t heard much about since. According to the company, the Xbox One could theoretically be paired with offsite cloud rendering platforms at some point in the future to deliver a game experience superior to anything the console could handle on its own. Since then, Redmond has been silent on what form the technology might take, or whether we’d ever see a version of it in the wild — until now. A new paper, published in collaboration with Duke University and the University of Washington, details a joint rendering system, dubbed Kahawai, that pairs a client and server GPU together for simultaneous rendering....

http://www.cs.duke.edu/~lpcox/mobi093f-cuervoA.pdf

 
...Kahawai implements two separate techniques for collaborative rendering: (1) a mobile device can render each frame with reduced detail while a server sends a stream of per-frame differences to transform each frame into a high detail version, or (2) a mobile device can render a subset of the frames while a server provides the missing frames.

Furthermore, a 50-person user study with our prototype shows that Kahawai can deliver the same gaming experience as a thin client under excellent network conditions
So the quality of experience is akin to game streaming. That is, higher latency than purely local rendering. The abstract reads more like a bandwidth-optimised game streaming system than cloud enhancement.
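
For anyone curious what technique (1) amounts to in practice, here's a rough sketch of the delta-encoding idea as I read the abstract (my own toy simplification in Python, with a fake renderer standing in for the real thing, not Kahawai's actual pipeline):

Code:
# Minimal sketch of delta encoding (my own assumptions, not Kahawai's code).
# Client and server deterministically produce the same low-detail frame from
# shared game inputs; the server also renders a high-detail frame and ships
# only the per-pixel difference, which compresses far better than full frames.
import numpy as np

H, W = 720, 1280  # assumed frame size for the sketch

def render(seed, detail):
    # Stand-in for a real renderer: the same seed yields the same image on both sides.
    rng = np.random.default_rng(seed if detail == "low" else seed + 1)
    return rng.integers(0, 256, size=(H, W, 3), dtype=np.uint8)

def server_delta(seed):
    lo = render(seed, "low").astype(np.int16)
    hi = render(seed, "high").astype(np.int16)
    return hi - lo  # residual; in the real system this goes through a video encoder

def client_reconstruct(seed, delta):
    lo = render(seed, "low").astype(np.int16)  # cheap local render
    return np.clip(lo + delta, 0, 255).astype(np.uint8)

frame = client_reconstruct(42, server_delta(42))  # matches the server's high-detail frame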
 
So the quality of experience is akin to game streaming. That is, higher latency than purely local rendering. The abstract reads more like a bandwidth-optimised game streaming system than cloud enhancement.

Yep, it seems to be the same. Their main target seems to be optimizing bandwidth requirements.

"We show that our delta encoding technique, when compared with an H.264 thin client, provides much better visual quality when bandwidth is low – less than 0.6 Mbps. Even more impressive, we show that an H.264 thin client requires up to six times as much bandwidth as client-side I-frame rendering to achieve high visual quality. Finally, a 50-person user study with our Kahawai prototype demonstrates that collaborative rendering provides the same user experience as a thin client."
 
So the quality of experience is akin to game streaming. That is, higher latency than purely local rendering. The abstract reads more like a bandwidth-optimised game streaming system than cloud enhancement.
Given the input lag, I shall pass on this technology, except for AI and mobiles, where I find it very useful. I value IQ, and it took me some time to calibrate the TV and make sure that all the details are visible and easy on the eyes...
 
So the quality of experience is akin to game streaming. That is, higher latency than purely local rendering. The abstract reads more like a bandwidth-optimised game streaming system than cloud enhancement.
Should still be better than pure streaming. If PlayStation Now is working for you, this should be, too.
 
Why wouldn't newer solutions use H.265 for streaming?
Less bandwidth needed for the same end result.
 
If PlayStation Now is working for you, this should be, too.
Except that cellular networks are often fucking awful from a latency perspective (upwards of hundreds of ms roundtrip, so essentially unplayable unless it's a casual puzzler title or such), so you'd have to be gaming via a wifi access point... and then what's the point, really? Why are you on wifi and gaming on a phone? A dedicated gaming system will bring a better experience in such a situation.

Also, with data caps being what they are these days, I wouldn't want to waste them streaming gaming graphics. Seriously, I'm not getting this. Cloud augmentation or whatever you want to call it isn't a good idea. It's better to spend that effort on simply making good games, because graphics on their own don't do that.
 
Could you do this with audio?
Not the same concept, I know, but if the idea is to improve games, any external processing helps.
If all audio data is stored server-side and all audio processing is also done server-side, then the client is only responsible for sending requests for game events. All the audio is combined at the server and sent downstream to the client console. Surely audio streams don't require super high bandwidth.
I have no knowledge of how audio processing works inside a game engine, so this may be a ridiculous suggestion. Also, audio overhead onboard a console during gameplay might be inconsequential (another thing I don't know about). If not, however, it might help free up some power and memory in the console (and might be an effective anti-piracy measure, as well). Feel free to derisively mock this post if necessary.
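
To back up the "not super high bandwidth" bit with rough numbers: uncompressed 48 kHz 16-bit stereo is about 1.5 Mbps, and a decent codec gets that down to the 100–200 kbps range, so the downstream really would be modest. What I have in mind looks roughly like this (a back-of-the-envelope sketch under my own assumptions, nothing from the paper):

Code:
# Rough sketch of server-side audio mixing (my own assumptions, not from the paper).
# The client only reports game events; the server mixes every active sound into one
# stereo stream and sends it downstream in small packets.
import numpy as np

SAMPLE_RATE = 48_000                 # assumed output format: 48 kHz stereo
PACKET_SAMPLES = SAMPLE_RATE // 50   # 20 ms packets

def mix_packet(active_sounds):
    # active_sounds: list of (samples, gain) pairs for every sound the game events triggered;
    # each samples array is assumed to be at least one packet long, shape (PACKET_SAMPLES, 2)
    mixed = np.zeros((PACKET_SAMPLES, 2), dtype=np.float32)
    for samples, gain in active_sounds:
        mixed += gain * samples[:PACKET_SAMPLES]
    return np.clip(mixed, -1.0, 1.0)  # this packet would then be compressed (AAC/Opus) and sent

silence = np.zeros((PACKET_SAMPLES, 2), dtype=np.float32)
packet = mix_packet([(silence, 1.0)])

The catch, of course, is the same latency problem as on the video side: a sound effect can't play until a round trip to the server has completed.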
 
Why wouldn't newer solutions use H.265 for streaming? Less bandwidth needed for the same end result.
At the moment, cost. H.265 is hugely computationally intensive when encoding and affordable dedicated hardware is likely some way off.
 
Should still be better than pure streaming.
Input response is probably no better. You'll have to factor in latency for retrieving the fill-in/detail frames from the server, so you won't be able to refresh ASAP on the client. The only clear advantages are lower BW and offline functionality. Overall experience is likely no different; certainly not game-changingly different.

Also the focus on mobile suggests this is more about running a Live streamed games platform for mobiles than being the fruition of MS's cloud augmentation ideas. Way back then we made the clear distinction between cloud augments and streamed gaming, and this is a flavour of streamed gaming. Well, that's debatable, but it's cosmetic whether we call it game streaming or not. ;)
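
To put some rough numbers on the input-response point (my own back-of-the-envelope figures, not anything from the paper): at 60 fps a frame lasts about 16.7 ms, so even a fairly friendly network adds a few frames of extra input lag before the composited frame can go to the display.

Code:
# Back-of-the-envelope latency budget (assumed numbers, not measurements from the paper).
FRAME_MS = 1000 / 60        # ~16.7 ms per frame at 60 fps
rtt_ms = 40                 # assumed round trip to the rendering server
server_render_ms = 10       # assumed time to render and encode the delta/fill-in frames
decode_compose_ms = 5       # assumed time to decode and merge on the client

added_delay_ms = rtt_ms + server_render_ms + decode_compose_ms
added_frames = added_delay_ms / FRAME_MS   # ~3.3 extra frames of input lag in this example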
 
At the moment, cost. H.265 is hugely computationally intensive when encoding and affordable dedicated hardware is likely some way off.

Dedicated hardware has been around in low-cost SoCs from AmLogic for well over a year. Same thing for most of Qualcomm's SoCs released in 2014. Snapdragon 810 even has an encoding block.
Mediatek's MT6x95 also have H.265 decoder+encoder blocks and they also have a low-cost tablet SoC, MT8127, with the decoding block.

x86 IHVs are getting really late to the party, in this case.
Carrizo isn't out yet and the only discrete graphics card with dedicated hardware so far is nVidia's GM206 (GTX 960).
 
Dedicated hardware has been around in low-cost SoCs from AmLogic for well over a year. Same thing for most of Qualcomm's SoCs released in 2014. Snapdragon 810 even has an encoding block. Mediatek's MT6x95 also have H.265 decoder+encoder blocks and they also have a low-cost tablet SoC, MT8127, with the decoding block.
Decode yes, but I've seen nothing with realtime high-profile H.265 encode, which is what you need for streaming. Similarly, it took a while for realtime H.264 hardware to become standard as well. I imagine the H.265 royalties are significantly higher than H.264's as well.
 
20 cents a unit doesn't sound bank-breaking for hundreds-of-dollars CE devices. I can see why codec enablement could be moved to an online activation, to avoid paying for a license that's never used.
 
It matters when you have hundreds of licenses to pay for. You'll need a lot more than just H.265 to sell your device. All phones can do... virtually all current codecs, from MPEG to H.264. All of them cost some cents to include. And the rest as well (S3TC for example; ARM wants its share, MS does, as does Apple and probably Oracle now). It adds up quite fast.
 
20 cents a unit doesn't sound bank-breaking for hundreds-of-dollars CE devices. I can see why codec enablement could be moved to an online activation, to avoid paying for a license that's never used.
Ah, but that's 20 cents for the royalty; you still have to pay for the hardware that includes H.265 encode/decode, and I doubt hardware vendors are throwing that in for free. Also, I doubt paying an H.265 royalty means you don't have to pay that H.264 royalty, that AAC royalty and a bunch more. 20 cents is further erosion of profit margins to support a video encode standard that has no market presence.

Blu-ray, HD-DVD and 720p/1080p streaming drove H.264 adoption, what will drive H.265? Sure, it's better than H.264 but necessity tends to drive adoption and H.265 is really only necessary for 8K UHD.
 
I agree, but you did mention royalties explicitly. Royalties aren't crippling; the prohibitive costs will be everything else. As you say, realtime H.265 encoding is likely not a cost-effective option yet, plus this is more of a proof of concept at the moment.
 
Also the focus on mobile suggests this is more about running a Live streamed games platform for mobiles than being the fruition of MS's cloud augmentation ideas

Isn't mobile GPU power going to surpass the Xbox One in a couple of years? Lol ::runsaway::
 