I don't doubt the truth of what they are doing, but I do doubt the relevance. It's a selected workload providing an ideal case; it's not representative of real game workloads on the whole, IMO.
Are you guys new to tech demos or something? They always include PR jibber-jabber. That can be ignored, as always. The point is that there is good info there. We now have an idea of how many updates such a tech demo is sending each second, even if we still don't know how much data is being represented or what connection is being used. We have an idea of what kind of scaling is possible in the cloud, and a better idea of how the workload is being split up: big workloads run in the cloud and are streamed in gradually based on what's in the viewing window.
If we look at the technical accomplishments, the cloud only needs to send positional data to the client (the fact that it's an XB1 is immaterial to the discussion). As the objects are free-roaming in space, that's pretty simple data that I'm sure can be compressed in impressive ways. Simulating anything that requires more info to represent it locally is going to hit the bandwidth barrier.
They are most likely sending nothing but updates to the local gravitational acceleration field. That's a spatial distribution only. The local box can then run the physics for the viewing window based on the incoming field data.
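To illustrate why a field update can be so much cheaper than per-object state, here's a back-of-envelope sketch. Every number in it (grid size, quantization, update rates, asteroid count) is a made-up assumption, not anything from the demo:

```python
# Back-of-envelope: streaming a coarse acceleration field for the viewing
# window vs. naive per-object state. All numbers are hypothetical.

GRID = 16                 # 16x16x16 cells covering the visible volume
BYTES_PER_CELL = 3        # one quantized 3D acceleration vector (1 byte/axis)
UPDATES_PER_SEC = 10      # the field changes slowly, so a low rate may do

field_bits_per_sec = GRID**3 * BYTES_PER_CELL * 8 * UPDATES_PER_SEC
print(f"field stream: {field_bits_per_sec / 1e6:.2f} Mbit/s")   # 0.98 Mbit/s

# Compare: naive per-asteroid positions at the same rate
ASTEROIDS = 30_000
BYTES_PER_ASTEROID = 12   # three 32-bit floats
obj_bits_per_sec = ASTEROIDS * BYTES_PER_ASTEROID * 8 * UPDATES_PER_SEC
print(f"per-object stream: {obj_bits_per_sec / 1e6:.1f} Mbit/s")  # 28.8 Mbit/s
```

With these invented numbers the field fits in a ~1.5 Mbit/s link while raw per-object updates are an order of magnitude over it, which is the whole point of sending the field and simulating locally.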
Any character info will need position, rotation, and an animation reference, and probably behaviour too, so the local console can continue the simulation in realtime. We're left with what we were saying earlier: the cloud can run simulations of the larger world and just provide (ahead of requirement) details on the immediate locale for the console to pick up local simulation.
Right... and this is a demo of the scale that is an upper limit on what can be done with persistent physical worlds. Hence, it is useful information. Character AI and anims are totally different; this demo doesn't show those aspects. It wasn't meant to.
And once again, it's talking about persistent worlds in the cloud with squillions of players, not realtime complex data like particle physics simulations being streamed to the box (which would probably be best served sent as a video feed and composited). That's no different from server-based MMOs, except that the worlds are distributed over compute nodes instead of a single rack machine.
Persistent worlds (not NPCs, but the environments) can be animated using physics. Calculations can be ongoing in the cloud, with results streamed in when the player gets close enough to actually see them. That is exactly what is going on here. Sure, you won't have a great game if all the box can do is use its computing muscle to move rocks around in space and nothing else...
Nobody is talking about realtime particle FX. Why are you bringing up aspects that nobody ever suggested were being done, as if that's somehow detrimental to the purpose of the tech demo? You're reaching.
Assuming XB1's cloud is Azure, the server setup is (as of November, so this could have changed a lot):
Rpeak is 167 TFLOPS. So the 'infinite power of the cloud' is ~140 XB1s. MS may have increased that massively; 100x more servers would be 14,000 XB1s. Clearly the available resources for each console are going to be very limited, although that's not really the topic here. The topic is what workloads can fit. We just have to be wary about taking PR presentations as a point of reference. The real people to ask are those working in cloud computing now who don't have an agenda to sell their CE product.
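For the record, the "~140 XB1s" figure falls out of simple division. The per-console TFLOPS value below is my assumption (roughly the XB1 GPU peak); pick a different number and the console count shifts accordingly:

```python
# Sanity-checking the "~140 XB1s" figure from the Azure Rpeak number.
AZURE_RPEAK_TFLOPS = 167
XB1_TFLOPS = 1.2   # assumed per-console peak; the exact figure is debatable

consoles = AZURE_RPEAK_TFLOPS / XB1_TFLOPS
print(round(consoles))          # 139, i.e. "~140 XB1s"
print(round(consoles * 100))    # with 100x more servers: ~14,000 XB1s
```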
We already know what resources are available in the cloud: three X1 CPUs' worth and three times the RAM allotment per real-world console. It's stupid to conflate a tech demo with a 'PR presentation' just to dismiss it. The presentation is immaterial. What matters is the tech and what it is doing. This tech demo gives us a starting point as an upper limit for what can be computed in the cloud in terms of scaling physics-based actions to be rendered locally.
Yes I am, unless you can show me an update method that only requires a 3-bit update for however many thousands of asteroids. Because if you can't, then it's clear that they're using something a lot faster than 1.5 Mbit.
Those 3 bits, BTW, aren't per asteroid; they're for the entire field.
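To show where a "3 bits" figure plausibly comes from: divide the assumed 1.5 Mbit/s link by a per-asteroid update rate. The asteroid count and rate here are hypothetical, chosen only to make the division come out at the quoted value:

```python
# Deriving "3 bits per update" from an assumed link and update rate.
LINK_BPS = 1_500_000    # the assumed 1.5 Mbit/s connection
ASTEROIDS = 50_000      # hypothetical asteroid count
RATE_HZ = 10            # hypothetical updates per asteroid per second

bits_per_update = LINK_BPS / (ASTEROIDS * RATE_HZ)
print(bits_per_update)  # 3.0 -- far too few bits to encode a position
```

That's the crux of the objection: if every asteroid needed individual updates at that rate, 3 bits each couldn't carry a position, so either the link is faster or the updates aren't per-asteroid.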
But I guess you'll continue to believe that there's some voodoo magic going on in the background. I've come to expect this from you (dual APUs, ray tracing chips, TBDR).
1) I NEVER said any of the bolded. Stop lying about what others are saying. If you're a compulsive liar then let me know ahead of time and I'll add you to the ignore list. This is now the 3rd time I've had to correct you on this.
2) You are asserting that none of the tech demo actually happened, even though it was done in front of the press. That's cute. Give me proof. Extraordinary claims require extraordinary evidence... all you've shown for evidence thus far is that you have no idea what you are talking about or what is happening in such a tech demo. Hardly grounds for a compelling argument to support your assertions and numerous accusations.
3) What is being 'updated' isn't the entire global field, just what is shown in the viewer. Also, the starting point for your lil 'math' there assumes a 1.5 Mbit/s connection (because that's what MS said is useful). They NEVER said that was enough for the tech demo or anything related to the cloud computing aspects of the platform. Remember, this isn't latency-sensitive stuff. It takes the user time to move through asteroid fields and traverse 35k ly's worth of space in the demo, and while the user moves, more data is streaming in.
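The streaming-ahead point is easy to quantify: over the seconds it takes a player to reach the next region, even a modest link moves a fair amount of data. The traversal time below is an illustrative assumption, not anything stated about the demo:

```python
# Prefetch budget while the player traverses to the next region.
LINK_BPS = 1_500_000     # assumed 1.5 Mbit/s connection
TRAVERSE_SECONDS = 30    # hypothetical time to reach the next visible region

budget_bytes = LINK_BPS * TRAVERSE_SECONDS // 8
print(f"{budget_bytes / 1e6:.1f} MB prefetched")  # 5.6 MB
```

So a demo that streams ahead of requirement has megabytes of budget per region, which is a very different constraint from pushing every update in realtime.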
4) It's rather arrogant to presume that because YOU can't work out how it might work off the top of your head, the tech demo that devs spent however much time making must therefore be completely fake. You don't know as much as you've convinced yourself you do.