David Scammel said:Crackdown 3's single-player campaign can be played entirely offline w/ limited destruction. 100% destruction only available in online co-op
It's also being co-developed by Sumo Digital. Multiplayer map is completely different to single-player map
A pair of tweets from Videogamer.com journalist David Scammel on Crackdown 3. It seems they've gone for a remote-hosted session approach with a client, much as in non-P2P FPS games or MMOs, but with responsive physics as opposed to canned animations (as in BF4's 'Levolution'). I've not seen any firm articles on whether any cloud compute is involved in single player at all. The mention of a wholly separate map for online co-op suggests it's a very different thing to the main campaign map. Whether that means different in size and scope or just in layout is something I hope we get more detail on, though I wouldn't be surprised if they don't address it at this stage of development.
So it's not cloud in the way I imagined it when I first heard of the concept (essentially a remote co-processing resource); rather, it's a whole game instance you connect to. Still, I look forward to them revealing how they're handling the transmission of all that data.
EDIT: Further tweets
David Scammel said:“We do not use that [cloud processing] in the campaign game. We wanted to create a very different experience for multiplayer.” - Dave Jones
"We wanted players to have the campaign game they always loved, and if they want to play it offline then they absolutely should be able to."
"Obviously they have to go online for co-op but that's still the same campaign game.” “The 100% destructible environments is limited to MP"
David Scammel also clarified that his earlier tweets said 'co-op' where he meant 'multiplayer'.
Further EDIT:
http://www.videogamer.com/xboxone/c..._bandwidth_of_a_regular_multiplayer_game.html
Videogamer said:"We are optimising for between 2 and 4 megabits," Reagent Games producer David Jones told VideoGamer at Gamescom today. "That is our goal."
Jones added: "It's not significantly more [bandwidth than a regular multiplayer game], it's maybe two to four times as much. A standard is maybe around 1mbit, so it's a little bit more, yes."
So 2-4 Mbit of bandwidth for multiplayer. Given that the simulation all happens server side, I'm going to guess this is download rather than upload bandwidth, since only the normal sorts of data need to be sent upstream (i.e. player velocity/location, weapon projectiles, etc.) rather than the more complex data sets you'd need if it were some kind of cloud hybrid compute. Still, it will be interesting to see if they can combat lag/warp in debris physics, although it would be kind of cool to see a collapsing building do a Zack Snyder-esque fast-slow-fast motion effect.
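For a sense of scale, here's a back-of-envelope sketch of how many debris state updates could fit in that downstream budget. The tick rate and bytes-per-update figures are purely my own assumptions for illustration, not anything Reagent has stated:

```python
# Back-of-envelope: debris updates that fit in the quoted downstream budget.
# All figures besides the 4 Mbit/s target are assumptions for illustration.

BANDWIDTH_BPS = 4_000_000   # 4 Mbit/s, the upper end of the quoted target
TICK_RATE_HZ = 30           # assumed network update rate
BYTES_PER_UPDATE = 24       # assumed: quantised position (3x4B) + orientation (3x4B)

bits_per_tick = BANDWIDTH_BPS / TICK_RATE_HZ
updates_per_tick = bits_per_tick / (BYTES_PER_UPDATE * 8)

print(f"{bits_per_tick / 8:.0f} bytes per tick")       # ~16667 bytes
print(f"~{updates_per_tick:.0f} debris updates/tick")  # ~694 objects per tick
```

So even under these generous assumptions you'd only move state for a few hundred objects per tick, which is why you'd expect heavy prioritisation or compression on top.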
Edit 3, this time it's personal:
http://arstechnica.com/gaming/2015/...-the-cloud-to-make-whole-cities-destructible/
So Ars Technica got an interview too. They saw an interesting overlay showing which servers are simulating which bits of debris, and the devs described the dynamic scaling that lets them spin up extra servers to support a game instance if the load gets too great.
Ars Technica said:Another overlay shows precisely which bits of debris are being powered by which server, some chunks of concrete pasted green and others blue. These objects are located on different servers that are all powering the same game, allowing for greater detail when necessary. Should you enact so much destruction that you need even more power, a new server will automatically come into play and distribute the processing workload further.
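The scheme Ars describes can be sketched as a simple load-balancing loop: each debris chunk is owned by whichever server has headroom, and a fresh server "spins up" once every existing one is saturated. This is a toy illustration of the idea only; the class names, tiny capacities, and first-fit placement are all my own assumptions, not Reagent's actual architecture:

```python
# Toy sketch of debris distributed across servers, with automatic spin-up
# when all existing instances are full. Capacities are tiny for the demo.

class DebrisServer:
    def __init__(self, name, capacity=2):
        self.name = name
        self.capacity = capacity   # max debris chunks this instance simulates
        self.chunks = []

    def has_room(self):
        return len(self.chunks) < self.capacity

def assign_chunk(servers, chunk):
    """Place a debris chunk on the first server with headroom,
    spinning up a new instance if every server is saturated."""
    for server in servers:
        if server.has_room():
            server.chunks.append(chunk)
            return server
    new_server = DebrisServer(f"server-{len(servers)}")
    servers.append(new_server)
    new_server.chunks.append(chunk)
    return new_server

servers = [DebrisServer("server-0")]
for i in range(5):
    assign_chunk(servers, f"chunk-{i}")

print([(s.name, len(s.chunks)) for s in servers])
# [('server-0', 2), ('server-1', 2), ('server-2', 1)]
```

The overlay Ars saw (chunks painted green or blue per owning server) would correspond to colouring each chunk by which `DebrisServer` it landed on here.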