Server-based game augmentations. The transition to cloud. Really possible?

No it doesn't. It doesn't touch on game streaming at all, except as a reference to bandwidth concerns.

The references to latency and bandwidth concerns make up the first thousand words of the article immediately following the already lengthy intro. Latency concerns are discussed only in the context of rendering considerations (which again is a cloud streaming context). The bandwidth concerns are likewise discussed in the comparison to a cloud streaming solution which has nothing to do with MS's stated intentions.

Right. And the article explains this, saying the data sets have to be small because there's not enough BW to do otherwise, and that's how the tasks that can be done in the cloud are limited.

The early chunks of the article are roughly formatted as follows:

1) Intro conveying PR claims from MS.
2) Latency concerns written in a manner as to suggest cloud rendering isn't plausible.
3) Same as above, but for bandwidth.

That's misleading in the sense that it lays out claims and then immediately attacks them in the context of a *totally* unrelated approach that has nothing to do with MS's aforementioned claims. It's a strawman, detailed as it may be. Don't present the claims, assert your article is going to judge them, and then dive into the challenges/criticisms of totally different solutions, followed up with the suggestion that MS was 'admitting' to lying to ppl in that context. That's why you have ppl like GameNameFame and the legions of fools over on GAF making posts about how they imagine DF is agreeing with Jonathan Blow when that is very, very far from DF's actual conclusion. I realize those ppl are being lazy and foolish, but I contend much of the blame also lies with the author misleading them.

And the article attempts to identify what those low-latency jobs are. If there are plenty more low-latency jobs you can think of, please suggest them here! That's what this thread is for.

I have (I even had sexy gifs)! :) And plenty of others have done so too. My issue with that particular part was more so that they made this expanded list of stuff going on per frame, then took about half of the items that could be sent to the cloud, grouped them all together, and presented it as if almost nothing in their list was doable.

I don't mean to fault them too much for not thinking up the ideas I've had, they had some (as I noted) that I didn't consider either. And that's good! But they totally missed MS's point at the same time (to free up local resources by moving things to the cloud). THAT is problematic enough in and of itself.

Had the article been about general cloud gaming in next gen titles it'd be a much better article imho. In that context it would make more sense to go through the potential of general cloud gaming (not just MS's computing model) and talk about its challenges and how MS's approach seeks to avoid them as a compromise and/or where Sony/Gaikai's own solutions fit in and a broader medium-term forecast of where things could go.

Anyhow, the thread isn't really about their article and I've shared my criticisms of it enough as is. It's much more fun to discuss new ideas for this kinda thing. :D
 
The whole 300k server number is just marketing talk for saying they've hooked it up to the Azure backend, which Microsoft uses for hosting a large variety of global cloud services, I'm sure. I know some large global companies use it for their central database storage and such.

Based on what Whitten says in the reveal it seems to directly suggest their 300k figure is only for XBL itself. The cloud computing stuff is offered to devs through Azure, which isn't the same thing. Mary Jo Foley said on Windows Weekly last week that she asked the Azure ppl and that's also what she was told. If Whitten was just using the number of Azure servers he couldn't have said XBL today runs on 15k servers (which it does). He talks about the various expected XBL Gold services on offer with the new, 300k-wide XBL setup and then speaks to cloud computing in a separate, additive manner.
 
What if developers started their own cloud servers and ditched Microsoft, Sony and Nintendo?
Think about it.

EA network gaming service.
Activision network gaming service.
Sega-Sammy network gaming service.
Capcom network gaming service.

Access to their whole back library and currently available games.

Developers don't even have to maintain the cloud network; they can just contract out to a private cloud provider and focus on creating the content.
They can then warehouse their content or lease it out to private cloud gaming publishers.
 

Yes, they could pay Microsoft, Amazon or Google to do that. Not sure how the financials work out. Probably better than trying to build your own data centers across the world, especially when they have no internal expertise in doing that kind of thing.
 
The references to latency and bandwidth concerns make up the first thousand words of the article immediately following the already lengthy intro. Latency concerns are discussed only in the context of rendering considerations (which again is a cloud streaming context). The bandwidth concerns are likewise discussed in the comparison to a cloud streaming solution which has nothing to do with MS's stated intentions.
MS haven't stated intentions other than they'll provide a lot of power that can do amazing things. They left it to the public to imagine what that'd result in. Latency in the article is discussed to show that many rendering jobs cannot be moved to the cloud. That has nothing to do with server-based rendering and game streaming which is obviously possible as it exists in Gaikai and PS3 remote play.

2) Latency concerns written in a manner as to suggest cloud rendering isn't plausible.
3) Same as above, but for bandwidth.
It's not talking about cloud rendering. It's talking about what sort of jobs can be moved to the cloud. They cannot be time-critical jobs due to latency and BW concerns. Everything else is plausible - the article explicitly states this and even offers suggestions on using the cloud like procedural content creation.

I have (I even had sexy gifs)! :)
I'm unconvinced. Firstly, you assume the shot missile will impact with the building and the calculations can be computed ahead of schedule and sent down. Any number of actions could interrupt the missile's trajectory and change the requirements instantly, such as a jeep intersecting. Secondly, the explosion will be creating lots of particles with their own meshes and textures. That means data from the server. Perhaps lots of repeated stone fragments using the same textures would suffice, in which case I agree potentially the server could process some non-critical physics, but it's still sounding pretty niche. I'm also not sure the local saving is actually worth much. But this thread is more about what could be moved, whether it makes sense to or not. ;)

But they totally missed MS's point at the same time (to free up local resources by moving things to the cloud). THAT is problematic enough in and of itself.
The article was doing the same as this thread - trying to see what can be moved to the cloud.

Had the article been about general cloud gaming in next gen titles it'd be a much better article imho.
The motivation for creating that article was clearly with a view to filling in the basics about MS's cloud claims for those not informed enough to understand them. DF articles have done that with all extreme claims like Unlimited Detail. "A rule of thumb we like to use is that [for] every Xbox One available in your living room we'll have three of those devices in the cloud available," was a clear PR phrase meant to suggest 4x console power per user. To the uninformed, that sounds like 4x the rendering power and 4x better games.

The relevance to this platform-agnostic discussion however is that it outlines the issues affecting cloud-computing that determine what can and cannot be done with it. If you look at the investigation information about latency and bandwidth, those facts are true for all cloud-based services in future regardless of who's providing them.

In that context it would make more sense to go through the potential of general cloud gaming (not just MS's computing model) and talk about its challenges and how MS's approach seeks to avoid them...
How does it? How does MS's approach avoid latency and BW concerns?
and/or where Sony/Gaikai's own solutions fit in and a broader medium-term forecast of where things could go.
Gaikai isn't about cloud computing but remote gaming. It has little to do with the subject.
 
Yes, they could pay Microsoft, Amazon or Google to do that. Not sure how the financials work out. Probably better than trying to build your own data centers across the world, especially when they have no internal expertise in doing that kind of thing.

Owning data centers is a bad idea unless you have a truly enormous need.
It's one of those markets where it's a race to the bottom.
MS's play in the space is mostly about software as a service.
 
Sure, but so what? I mean, streaming in anims can take several mins for all the player cares. So long as the player isn't directly interacting immediately with the objects in the scene playing out these cloud-computed anim sequences it's not something players will notice. Some could even be streamed in and 'pseudo interactive' via triggering. I gave an example of an avalanche earlier where as you get close to a mountain perhaps the cloud computes a large scale physics-based avalanche/destruction animation that is streamed in and waiting to be triggered by the player.



Not necessarily true. There is a LOT more physics going on in a game than I think you realize. If my character jumps into the air, then while he is in the air the game physics should know where/how he will land and can compute what impact that could have physically on the game world. For instance:

[gif: 8782102157_8df154b868_o.gif]


Note that the player could potentially be in the air for many frames before the effects of the landing need to be displayed. Depending on how long an object (player, car hurtling through the air, etc.) is in the air, the impact animation on whatever it collides with would be there waiting to be displayed by the time the collision takes place.
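That airtime window is easy to put numbers on. Here's a rough sketch (the function names, the simple ballistic model, and the 150 ms round-trip figure are all my own illustrative assumptions, not anything from the article):

```python
import math

GRAVITY = 9.81  # m/s^2

def time_to_landing(height_m, upward_velocity_ms):
    """Positive root of h + v*t - (g/2)*t^2 = 0: seconds until impact."""
    disc = upward_velocity_ms ** 2 + 2 * GRAVITY * height_m
    return (upward_velocity_ms + math.sqrt(disc)) / GRAVITY

def can_offload(height_m, upward_velocity_ms, round_trip_s):
    """The impact effect can be cloud-computed only if the result can
    make the server round trip before the object actually lands."""
    return time_to_landing(height_m, upward_velocity_ms) > round_trip_s

# Falling from a 2 m apex gives ~0.64 s of airtime -- comfortably more
# than an assumed 150 ms round trip to the server and back.
print(can_offload(2.0, 0.0, 0.150))  # True
```

Of course the prediction can be invalidated mid-flight (the jeep scenario raised earlier), so any such result has to be discardable with a local fallback.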

Also, don't underestimate physics-based animations like the one below:

[gif: 8782065639_6ea9b6dcec_o.gif]


The reason this scene here with the bridge is so amazing visually is due to the richness of the detailed, realistic, physics-based animation playing out on screen. Sure, something like this wouldn't be interactive, but again, say in the game the player can influence the trajectory of the boat slamming into the bridge, what objects are on the bridge, etc. The boat would take a good chunk of time to actually initiate said collision, so that kinda thing should be doable in the cloud too. This again would be loosely interactive/dynamic. If ya don't like that terminology you are welcome to call it what you find more appropriate.

Hmmm...anyone happen to know how big a typical, fully scripted in-engine cutscene is in terms of memory? That is basically nothing but physics/animation scripting usually afaik, so maybe that can help us get more of an idea specifically on the really complex anims like the scene above.



I originally thought it couldn't directly affect player combat/interactivity too but someone else here made a good counterpoint to that assumption. You could collect data on player strategic tendencies, compute counter strategies, stream them into the game to the enemies, and that would directly affect their AI. It wouldn't be instant reactions, but in the real world it wouldn't be instant either and you would still be doing something dynamic and interactive. Just with some small delay which given the context would be appropriate anyhow.

Those are cutscenes, and even if they are real gameplay, they cannot be used in server compute for many reasons explained in the DF article.

Even MS said non-interactive...
 
Based on what Whitten says in the reveal it seems to directly suggest their 300k figure is only for XBL itself. The cloud computing stuff is offered to devs through Azure, which isn't the same thing. Mary Jo Foley said on Windows Weekly last week that she asked the Azure ppl and that's also what she was told. If Whitten was just using the number of Azure servers he couldn't have said XBL today runs on 15k servers (which it does). He talks about the various expected XBL Gold services on offer with the new, 300k-wide XBL setup and then speaks to cloud computing in a separate, additive manner.

I believe that she's wrong. Xbox Live is basically running on that platform co-located in the data center. Windows Azure is the part of the network that is open for customers to develop on. Xbox Live is basically an application running in that environment. When Azure has had downtime in the past, it's impacted Xbox Live, Xbox Music etc.

I also watched that podcast, or whatever you want to call it, and Paul Thurrott is a complete idiot. He gets RemoteFX almost totally and completely wrong as it applies to the Xbox One. The quality of the information on that show seems to be very poor.
 
Well, that is what I am curious about...depending on the amount of data involved, depending on your internet bandwidth, depending on which services you run besides the game on your X1/household that eat some of the bandwidth, I wonder if you may end up needing even longer loading screens to e.g. get your prebaked lighting data.


I don't think bandwidth will be much of a concern if you are computing stuff that can be done on loading. I am purely talking about preset animations, explosions, and particle effects that are not interactive.

That data is not going to take too much of a hit on your bandwidth. Latency will make it impossible to do real-time effects, hence the prebaked or faked effects.

As you mentioned, world AI is probably the most interesting thing about this, but I was thinking: can you just have a bit longer load screen and compute all this non-interactive AI?

I am contributing, by pointing out your lack of knowledge and fanboy hypocrisy.



You're arguing that the destruction from explosives in BF can somehow be decoupled from the host and clients, sent to MS' servers, calculated, and the results shot back and there be no effect/delay from all the players' view? Are you serious? That's not a serious statement. That's just not based in reality.

Let's say you fire this hypothetical rocket and it won't hit anything for 10 seconds...until somebody drives a jeep from around a corner that you didn't see. Now a completely different set of results is required, delaying your mystical offloaded explosion physics. This does not work for any real system. Multiple people have told you this. DF told you this. sebbbi told you this. It's fantasy.

Do you have anything actual, factual to bring to a TECH discussion besides fantastical arguments not based in reality? Again, just link to one legit article that can back up 1/10th of your wild conjectures.

I agree. Even his argument about setting up a chain of actions ending with the bridge getting crashed into by the boat is not realistic. It is pure fantasy.

Mostly because if preset actions resulting in something completely non-interactive are going to happen, it's just easier and better to do it during a loading screen.
 
MS haven't stated intentions other than they'll provide a lot of power that can do amazing things. They left it to the public to imagine what that'd result in.

Nope. Several of their execs have been pretty explicit in explaining that this is for latency insensitive tasks and Matt Booty (who DF relied on heavily in the article) gave out several direct, explicit examples. So did Greenwalt at the panel.

Latency in the article is discussed to show that many rendering jobs cannot be moved to the cloud. That has nothing to do with server-based rendering and game streaming which is obviously possible as it exists in Gaikai and PS3 remote play.

...which is misleading as it has nothing to do with MS's claims. MS was clear that this is for latency-insensitive stuff. When MS has been clear on this front it's not fair to frame things as if they claimed to be doing rendering over the network directly. Like I said, it's the structure of the article that is misleading. You can disagree all ya want. One of you mods closed a thread here that totally misunderstood DF's conclusions and there's one on GAF too, full of ppl who were misled because they were too lazy to keep reading. Some of that is the author's fault.
It's not talking about cloud rendering. It's talking about what sort of jobs can be moved to the cloud. They cannot be time-critical jobs due to latency and BW concerns.

Wat? The examples they are using are explicitly referencing graphics rendering. Hence the note about RAM bandwidths and comparing that to download speeds today.

Everything else is plausible - the article explicitly states this and even offers suggestions on using the cloud like procedural content creation.

Right, as I noted they get credit for that idea. My quibble with their content was that they totally missed MS's point on how it affects local capabilities. The rest of my issues with it are structural. I'm done with this issue. It's useless to the actual thread and many of us here can likely do a better job going through some of the design challenges than a single DF writer on the clock.




Maybe it's better to start trying to accumulate info from modern games that have accessible documentation.

KZ:SF Demo
75MB for animations
6MB for AI data
5MB for physics meshes (compared to 315MB for total meshes)

http://www.guerrilla-games.com/presentations/Valient_Killzone_Shadow_Fall_Demo_Postmortem.pdf


BF3 multiplayer levels
200MB-250MB for meshes (total, not just physics meshes)
Heightfields for terrain are procedurally generated (helpful?)

Here's a whole presentation on their destruction system btw:
http://www.slideshare.net/DICEStudio/siggraph10-arrdestruction-maskinginfrostbite2
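For a rough sense of scale, here's how long those budgets would take to pull down in full over a typical connection (the 1 MB/s, i.e. 8 Mb/s, link speed is my assumption, not anything from the slides):

```python
# KZ:SF demo data budgets quoted above, in MB.
budgets_mb = {"animations": 75, "AI data": 6, "physics meshes": 5}

LINK_MB_PER_S = 1.0  # assumed 8 Mb/s downlink

for name, size_mb in budgets_mb.items():
    print(f"{name}: {size_mb / LINK_MB_PER_S:.0f} s to stream in full")
```

So even the entire animation set streams in a bit over a minute, and the physics meshes in a few seconds, which is the kind of timescale the 'trigger it later' ideas in this thread are playing with.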
 
I agree. Even his argument about setting up a chain of actions ending with the bridge getting crashed into by the boat is not realistic. It is pure fantasy.

Mostly because if preset actions resulting in something completely non-interactive are going to happen, it's just easier and better to do it during a loading screen.
It is better to do it on the fly. Physics/destruction might be augmented by cloud computing. The small and immediate would always have to be handled locally. It could scale depending on your connection. You get canned or simpler animations offline.
 
What about some particle systems? Could the simulations of some of these systems be done in the cloud? They aren't always latency sensitive. They could easily scale with bandwidth (more bandwidth, more particles). The server could contain environment information so it could perform collision detection against certain things such as static structures. The only data being passed back from the cloud would be position data, an identifier to let the local console know what assets to use for each particle, and maybe an expire time if connectivity is lost.

Let's say for each particle the data passed down to the console would be three 32-bit floating point variables for position, maybe a 16-bit short for the identifier, and a 16-bit short for the expire time. Each particle would be 128 bits. Say an average 3:1 (too high? looked around and this seemed to be a common ratio) lossless compression using LZ; then each particle would be ~43 bits. Given an average bandwidth in the developed world of 8Mb/s, this would allow for 186,000 particle updates per second. Perhaps with some interpolation and better efficiencies like culling, this might be something that could save local CPU time. Thoughts?
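That arithmetic is easy to sanity-check in a few lines (same assumptions as above: a ~43-bit compressed particle and an 8 Mb/s link, both of which are guesses):

```python
# Per-particle payload as described above.
bits_per_particle = 3 * 32 + 16 + 16  # position + identifier + expire = 128 bits
compressed_bits = 43                  # ~128/3 under an assumed 3:1 LZ ratio
link_bits_per_s = 8_000_000           # assumed 8 Mb/s average downlink

updates_per_s = link_bits_per_s // compressed_bits
print(updates_per_s)        # 186046 -- the ~186k figure above
print(updates_per_s // 60)  # 3100 particle updates per 60 Hz frame
```

Roughly 3,100 cloud-driven particle updates per 60 Hz frame before interpolation or culling, which is the budget any such scheme would be working inside.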
 
Particle systems are absolutely the last thing you want done on a cloud.
For the most part they're unnecessary eye candy.
 
Those are cutscenes, and even if they are real gameplay, they cannot be used in server compute for many reasons explained in the DF article.

Even MS said non-interactive...

Strange thing...you need to actually *read* my post to be able to comprehend what I'm saying. THEN you can think on the points raised and reply. Yes, the ordering is important. It's pretty obvious you didn't read the post you quoted so take the time to go back and do so.




I believe that she's wrong. Xbox Live is basically running on that platform co-located in the data center. Windows Azure is the part of the network that is open for customers to develop on. Xbox Live is basically an application running in that environment. When Azure has had downtime in the past, it's impacted Xbox Live, Xbox Music etc.

I think you misunderstand what I'm saying. I know their data centers house both XBL and Azure. My point was those are two separate entities. The former for online gaming as a platform and the latter for cloud computing for various clients, etc.

Ok, so when MS says they are expanding XBL to 300k servers does that mean the cloud computing takes place on those 300k servers or does that mean they just have a lot more social/profile data in the cloud and want every game with dedicated XBL servers? My impression is that all the various profile stuff and MP servers for XBL will be on the 300k servers, but MS is offering devs Azure resources separate from that.

I also watched the that podcast, or whatever you want to call it, and Paul Thurrott is a complete idiot. He gets RemoteFX almost totally and completely wrong as it applies to the Xbox One. The quality of the information on that show seems to be very poor.

I would agree. I was pretty amazed how ignorant they were in lots of areas. However I don't think Mary Jo was lying either, so if she said she asked the Azure ppl I'll believe her. She has a track record with them (she is cited in the DF article fwiw). Paul has a spotty track record though. I got the vibe he was bitter that his 'insider info' was...shaky at best and now the whole world knows it.
 
Particle systems are absolutely the last thing you want done on a cloud.
For the most part they're unnecessary eye candy.

That is exactly why they could be a good fit for pushing out to the cloud. With more CPU time dedicated to them, more complex physics could be applied to them. There could be graceful degradation for disconnects or bandwidth changes. And they aren't always latency sensitive.
 
...

I think you misunderstand what I'm saying. I know their data centers house both XBL and Azure. My point was those are two separate entities. The former for online gaming as a platform and the latter for cloud computing for various clients, etc.

Ok, so when MS says they are expanding XBL to 300k servers does that mean the cloud computing takes place on those 300k servers or does that mean they just have a lot more social/profile data in the cloud and want every game with dedicated XBL servers? My impression is that all the various profile stuff and MP servers for XBL will be on the 300k servers, but MS is offering devs Azure resources separate from that.

Well, maybe there will be 300k servers for Xbox Live. That's a huge number. I don't think she would have lied. Maybe a misunderstanding? Why would you need to use Azure if you had 300k physical servers for Xbox Live? I'm pretty sure Xbox Live is running on Azure, despite her saying she was informed it is not. It makes a lot more sense. I'm pretty sure I found a Microsoft website that said it was when I was first reading through some of this stuff a few days ago, but I can't find it now.


I would agree. I was pretty amazed how ignorant they were in lots of areas. However I don't think Mary Jo was lying either, so if she said she asked the Azure ppl I'll believe her. She has a track record with them (she is cited in the DF article fwiw). Paul has a spotty track record though. I got the vibe he was bitter that his 'insider info' was...shaky at best and now the whole world knows it.

She may be pretty reliable. I don't know her. Paul Thurrott maybe has some insiders, but he doesn't seem to be particularly reliable or informed enough to understand the information he's given.
 
Well, maybe there will be 300k servers for Xbox Live. That's a huge number. I don't think she would have lied. Maybe a misunderstanding? Why would you need to use Azure if you had 300k physical servers for Xbox Live? I'm pretty sure Xbox Live is running on Azure, despite her saying she was informed it is not. It makes a lot more sense. I'm pretty sure I found a Microsoft website that said it was when I was first reading through some of this stuff a few days ago, but I can't find it now.

I couldn't find anywhere saying it ran on Azure online. Like I said, my impression was their data centers house Azure for cloud computing alongside their XBL servers. Same data center, but different servers for their cloud services. Remember, that video you posted earlier in the thread wasn't about Azure, it was about their data centers overall. Hopefully someone asks MS and gets a clarification either way.

Here is Mary Jo specifically laying it out for us:

Mary Jo Foley said:
Microsoft officials also mentioned Windows Azure during today's Xbox One reveal. Xbox Live does not run on Windows Azure; it runs on its own servers in Microsoft's datacenters. When Xbox Live launched in 2002, Xbox Live required 500 servers. It now requires 15,000. By the time Xbox One launches this holiday season, Microsoft officials said it will be running across 300,000 servers.


We do know that the Halo game team at Microsoft has used a new cloud-programming model, codenamed "Orleans," which was developed by Microsoft Research. And during today's Xbox One reveal, the Redmondians noted that users will be able to store their movies, music, games and saves "in the cloud," which I am assuming means on Windows Azure.


Update No. 2: In an Xbox One frequently asked questions (FAQ) document, Microsoft officials also noted that the cloud (again, no mention of Azure specifically) will also allow for automatic game updating, game enhancements and saved preferences. VentureBeat's Dean Takahashi has more from the under-the-hood panel on how the cloud also will enable offloading tasks and freeing up more local console resources.


The aforementioned Wired piece states definitively that "Xbox One gives game developers the ability to access Microsoft’s Azure cloud computing platform." Microsoft officials didn't say that today during the Xbox reveal event. However, Microsoft didn't say anything about the developer story for Xbox One today, presumably because that is going to be a big part of the messaging at the company's Build 2013 conference at the end of June.

http://www.zdnet.com/microsofts-xbox-one-whats-windows-got-to-do-with-it-7000015684/
 
I wouldn't expect the things they choose to push to the cloud to be that bandwidth hungry. 8Mb/s (1 MB/s) is asking a lot.

I think ppl may significantly underestimate exactly how latency insensitive this stuff can be. Like I mentioned, triggered events can be set up that can have massive latency without the player noticing if done right. It's more a design challenge than a technical one.

Not sure if you saw my other post or not, but the physics meshes in the entire KZ:SF demo only took up 5MB and total anims were only 75MB. Wanted to point that out to ya since I know you were interested in your waving flag starting point.
 
Strange thing...you need to actually *read* my post to be able to comprehend what I'm saying. THEN you can think on the points raised and reply. Yes, the ordering is important. It's pretty obvious you didn't read the post you quoted so take the time to go back and do so.


I didn't think I needed to reiterate what others have said. Clearly, you do not understand anything that was in the DF article. I imagine you are having a hard time understanding these concepts.

Here, it clearly explains why your theories are nonsense.

I am contributing, by pointing out your lack of knowledge and fanboy hypocrisy.



You're arguing that the destruction from explosives in BF can somehow be decoupled from the host and clients, sent to MS' servers, calculated, and the results shot back and there be no effect/delay from all the players' view? Are you serious? That's not a serious statement. That's just not based in reality.

Let's say you fire this hypothetical rocket and it won't hit anything for 10 seconds...until somebody drives a jeep from around a corner that you didn't see. Now a completely different set of results is required, delaying your mystical offloaded explosion physics. This does not work for any real system. Multiple people have told you this. DF told you this. sebbbi told you this. It's fantasy.

Do you have anything actual, factual to bring to a TECH discussion besides fantastical arguments not based in reality? Again, just link to one legit article that can back up 1/10th of your wild conjectures.



It is better to do it on the fly. Physics/destruction might be augmented by cloud computing. The small and immediate would always have to be handled locally. It could scale depending on your connection. You get canned or simpler animations offline.


I imagine you mean non-interactive physics, destruction and animation? I see extremely limited application for those.
 