Server based game augmentations. The transition to cloud. Really possible?

I don't understand what level of physics is grinding the local machine down to 1-2 fps. Are they fully simulating material properties (is it concrete?) along with structural properties like tensile strength, so that it breaks under stress and/or lack of support, and crumbles? :???:
 

sleep(n);
 

Maybe they are using the CPU for physics.
 

I get that they are using the CPU (and possibly GPU) for physics, but there are many levels of physics simulation. At the minimum you just have basic weight/mass/shape/friction for each segment, so you can simulate gravity and each segment's reaction when interacting with other segments. At the other end you have full material simulation, where metal will bend and deform under too much load or without support, and concrete will crack, crumble and break.
 

Been a while since I genuinely LOLed, I tip my hat to you sir.

As for what they were simulating, to what detail, and why it so crushed those boxes, we need MS to tell us. My understanding of physics is that you can very easily spend all of your cycles on it, particularly if you have tens of thousands of objects all colliding with each other at once.
 
It's not like they are solving an N-body problem with gravity; one object doesn't interact with the other 10,000 unless there is a collision. Surely they can eliminate most of the calculations with rough intersection tests on bounding volumes (parallelepipeds)?
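The culling idea above is usually called broad-phase collision detection. A minimal sketch, with made-up boxes for illustration (real engines layer spatial hashing or sweep-and-prune on top, rather than the naive all-pairs loop shown here):

```python
# Broad-phase sketch: cheap AABB overlap tests cull object pairs before
# any expensive narrow-phase collision work is done.

def aabbs_overlap(a_min, a_max, b_min, b_max):
    """Two axis-aligned boxes overlap iff they overlap on every axis."""
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))

def candidate_pairs(boxes):
    """Return index pairs whose AABBs overlap (O(n^2) here, for clarity)."""
    pairs = []
    for i in range(len(boxes)):
        for j in range(i + 1, len(boxes)):
            if aabbs_overlap(boxes[i][0], boxes[i][1], boxes[j][0], boxes[j][1]):
                pairs.append((i, j))
    return pairs

boxes = [
    ((0, 0, 0), (1, 1, 1)),
    ((0.5, 0.5, 0.5), (2, 2, 2)),   # overlaps the first box
    ((5, 5, 5), (6, 6, 6)),         # far away, culled
]
print(candidate_pairs(boxes))  # -> [(0, 1)]
```

Only the surviving pairs go on to the expensive narrow-phase tests, which is how an engine avoids 10,000 x 10,000 interaction checks.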
 
Tens of thousands of objects with mass/shape/material (friction) should not reduce a modern console to single-digit frame rates - not unless it's simulating physical reactions at a molecular/sub-atomic level. I recall at the PS4 reveal they demonstrated Havok simulating 1,000,000 (a million) objects falling around a cityscape, running mostly on the GPU using compute.

Obviously the level of simulation is crucial. Had they shown a suspension bridge collapsing, with the load exceeding the tensile strength of suspension cables that stretch, thin and snap, that would have been visibly (and understandably) impressive, but it's hard to tell what we are looking at in the Microsoft video.
 

Very true. What it comes down to is that with physics simulation you can go nuts and eat cycles all day long for not much gain. Unless MS explicitly tell us what they are simulating, all we can say is that their demo seemed to work; without the conditions and parameters of that test, we basically know no more than we did before the demo.
 

In the video, there's a chunk count displayed near 60K.
The chunks are colliding and bouncing off each other, and new ones are being created out of collisions.

The objects also look irregularly shaped (concave). From what I remember, the complexity of simulating concave objects vs convex objects is vastly different; a common implementation uses multiple convex objects to form one concave object, which brings the complexity up another order.

They are not particle debris.

The whole purpose of the demo is to show offloading physics onto cloud compute. If one's gonna argue this is some sort of conspiracy theory... honestly, with the amount of money MSR is spending, they can do much, much better than this.

Oddly, I don't think the compute power is the issue; it's how they can compress and transfer that amount of dynamic data over to the client.
 
Complexity is very important. Collision meshes are a gazillion times harder to calculate collisions for (including proper response) than basic geometric colliders. Games typically construct physics around boxes, spheres and capsules, AFAIK for that very reason. The complexity of the simulation could be anything from 30,000 colliding spheres to 300,000 colliding vertices. However, the structure of the fragments looks very geometric to me, so they should be making each chunk out of combined cubes and the like.
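To illustrate why primitive colliders are so much cheaper: a sphere-sphere test is a single distance comparison, whereas a mesh-mesh test needs many per-triangle intersection checks for the same pair. A toy sketch (the coordinates are made up):

```python
# Sphere-sphere collision: one squared-distance comparison per pair.
# Compare squared values to avoid the sqrt.

def spheres_collide(c1, r1, c2, r2):
    """Spheres collide iff the distance between centers <= sum of radii."""
    d2 = sum((a - b) ** 2 for a, b in zip(c1, c2))
    return d2 <= (r1 + r2) ** 2

print(spheres_collide((0, 0, 0), 1.0, (1.5, 0, 0), 1.0))  # True: 1.5 < 2.0
print(spheres_collide((0, 0, 0), 1.0, (3.0, 0, 0), 1.0))  # False: 3.0 > 2.0
```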
 

Havok lets you simplify the meshes used for collision and reaction calculations compared to the objects being rendered, but the trick with cloud augmentation is that the complexity of the objects also has to be sent between client and cloud.

It's great if the cloud can calculate reactions, collisions and the splintering of items into many smaller items at great precision, but if the results can't be sent to the client device for rendering quickly, then that precision is unnecessary.
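A quick back-of-envelope on that transfer cost. All numbers below are assumptions for illustration (per-object state size and tick rate are guesses, not anything MS has stated):

```python
# Rough bandwidth estimate for streaming rigid-body state cloud -> client.
# Every figure here is an assumption, chosen only to show the scale.

objects       = 30_000   # active chunks being simulated remotely
bytes_per_obj = 28       # e.g. position (3 floats) + orientation quaternion (4 floats)
updates_hz    = 30       # network update rate

bytes_per_sec = objects * bytes_per_obj * updates_hz
mbits_per_sec = bytes_per_sec * 8 / 1e6
print(f"{mbits_per_sec:.1f} Mbit/s uncompressed")  # 201.6 Mbit/s
```

Even with aggressive quantisation and delta compression, that is far beyond a typical home connection, which is why the transport question matters as much as the compute.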


Likewise, the things flying about didn't look particularly complicated in geometric terms. Now that could scale up, but you're still limited by sending/receiving geometric details for what could be dozens/hundreds/thousands/millions of objects.

It's tricky, for sure.
 

Well, my point is that when multiple BVs are used for a single concave object, having 10 objects colliding is in reality more like computing 100 objects colliding.
From what I remember, sphere collision is actually simpler than AABB collision, and convex hulls are widely used.
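The blow-up from convex decomposition is easy to count. Assuming each concave object is approximated by k convex hulls (a hypothetical k, for illustration), every object pair costs k*k hull-pair tests instead of one:

```python
from math import comb

# Counting narrow-phase work after convex decomposition: an object pair
# made of k hulls each needs k*k hull-pair tests instead of 1.

def hull_tests(objects, hulls_per_object):
    pairs = comb(objects, 2)            # all object pairs
    return pairs * hulls_per_object ** 2

print(hull_tests(10, 1))   # 45 tests: 10 simple convex objects
print(hull_tests(10, 10))  # 4500 tests: same 10 objects, 10 hulls each
```

So 10 decomposed objects really do cost on the order of 100 convex objects' worth of collision work, as claimed above.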

I went back and looked at the new video again: new chunks are being created out of collisions, and the later part of the video has the spheres breaking up. The text captioning says that the cloud map is bigger than the local map as well.
I don't think the chunks look any less detailed than any AAA title that claims to have destructible physics; I'd argue it's quite impressive for a real-time demo.

Some comparisons:

Unreal, notice how new barrels sink into the main pile in the later part:
https://www.youtube.com/watch?v=Vjbpv0TjVis

CryEngine, this one is very "geometric" and the original mesh doesn't collapse either.
https://www.youtube.com/watch?v=IR5Blw1orZ8

Havok, seemingly all convex objects:
https://www.youtube.com/watch?v=sS0Fqx_zxf8
 

I basically agree with you, but my point is not that this is a fake; it's that they haven't told us what they are doing, so we can't know if this is a practical implementation for a real-world title or just a really nice demo. How are they modelling the objects? Are the chunks dynamically created or static? Is the geometry as complex as the collision model? What is being sent to and from the cloud, how much is being sent, and how sensitive is it to latency? All questions we had before the demo and still have now. Were there any side sessions that covered this in more detail?

Edit: It is very impressive though
 
Yep. We can take it as read that the physics as implemented is too slow on a single high-end PC, however it was implemented. This isn't a demo of how massive server networks can be faster than a PC. ;) The question is how the data gets from the cloud to the local machine, which MS haven't talked about. Again. They have yet to show a working real-world situation of a console or other machine at home on a broadband network getting such data from the cloud.

If they can really do it, they should release a demo and silence the critics. Or better yet, a game that uses the cloud for realtime in-game destruction! :D

We can't really factor this demo into this discussion until we know the network situation.
 

Don't disagree; I am way more interested in how the data was transmitted.

The comment on complexity was to address the difference between the demo choking at 30K chunks and Havok doing 1M rigid bodies easily in theirs.
 
Wouldn't the servers have to run a complete simulation of the local game? A complete copy of the game logic running on the server could enhance the effects and push them to the local game client, which would still run everything, just at a level of complexity that it can handle.

With prediction added, the server could even push stuff before it happens, making it possible to get real-time effects without sync problems.

Since everything on the local client runs independently of the server, only the quality would suffer if there are dropped packets, low bandwidth or high latency; even if the servers were overloaded, the game would still run smoothly.
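The degrade-gracefully idea above is essentially dead reckoning: the client keeps extrapolating from the last snapshot it received, so a late or dropped packet only costs accuracy, never frame rate. A minimal sketch with a hypothetical (x, y, vx, vy) state layout:

```python
# Dead-reckoning sketch: predict object state forward from the last
# cloud snapshot, so the client never stalls waiting on the network.

def extrapolate(snapshot, dt):
    """Advance each object by its last known velocity for dt seconds."""
    return [(x + vx * dt, y + vy * dt, vx, vy) for (x, y, vx, vy) in snapshot]

last_snapshot = [(0.0, 0.0, 2.0, 0.0)]   # one object moving +x at 2 units/s
time_since_update = 0.5                  # server packet is late or dropped

predicted = extrapolate(last_snapshot, time_since_update)
print(predicted)  # [(1.0, 0.0, 2.0, 0.0)]
```

When the next snapshot does arrive, the client would blend or snap toward it, which is where the visible quality loss shows up.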

I have no idea if it's even possible but it's funny to think about how the "cloud" could work :)
 
My question would be... is the cloud demo "the same" physics simulation as the one running on the CPU alone?

I mean... anybody can develop a badly performing simulation and make it faster by... whatever means.
 