Server based game augmentations. The transition to cloud. Really possible?

Discussion in 'Console Technology' started by Shifty Geezer, May 22, 2013.

  1. Strange

    Veteran

    Joined:
    May 16, 2007
    Messages:
    1,540
    Likes Received:
    197
    Location:
    Somewhere out there
    Probably lots of computation offloaded "intra city", which was a catastrophe.
    The cloud updates really slowly too. If you've tried to build a superstructure that required cooperation (actually easily doable by one player who knows how to exploit income, but anyway), you know it takes forever for it to change states and complete.


    Plus
    Who puts a T intersection and traffic signals on a highway???
    Maxis :roll:

    (warning, large file)
    https://drive.google.com/file/d/0B0DB02tYXggEaFRLVEY1bDhfZnM/edit?usp=sharing
     
    #1101 Strange, Jan 16, 2014
    Last edited by a moderator: Jan 16, 2014
  2. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,367
    Likes Received:
    3,959
    Location:
    Well within 3d
    Let's not go insulting ants now. They are marvelously evolved creatures with specializations and innate behaviors that make them effective as individuals and resilient as a collective. They've also had millions of years for bug-fixing.

    My original concerns about the Glassbox engine's reliance on generic agents for everything seem to have been validated.
    For so many functions, even the stripped-down agents were excessive and injected unwanted behaviors into the simulation: see everything related to power, water, and sewage.
    At the other end, agents were too generic to really produce sane behaviors, and the looming problem for any game that relies on emergent behavior is that tweaking things by fiddling with rules of poorly limited scope is indirect, usually unsatisfactory, and incredibly fragile.

    There were mentions of exporting Glassbox to other games, which probably meant city simulation agents didn't have space in their design for city-specific traits that would interfere with the portability of said engine.

    For things relevant to the player, or the furtherance of their enjoyment, almost certainly EA was spewing crap.
    Analysis of the network traffic for the game broke down the broadcast city data to a limited set of data values corresponding to general totals for the city's various resources and utilities, as well as values serving as a snapshot for the known set of things that can be traded.
    For that functionality, the server served primarily as a message box for neighboring players to (eventually) download and then synthesize into simulation behavior wholly locally.
    There are global elements that should synthesize more general data for market prices and leaderboards and the like, but the server side's general dysfunction kept those off or broken for a very long time, and their impact on the game is somewhat limited, or the game code explicitly overrode them to keep them from wrecking the actual simulation (floors for various prices, for example), or they behaved in ways so nonsensical that no style of play could use them.
    I did see complaints that in some cases the poorly thought-out design did let the global data influence things, normally when commodity prices and the frequently prohibitive space or specialist building costs rendered whole swaths of city-building choices game-breaking.
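
    As a rough sketch of the kind of exchange that traffic analysis describes (every field name and number below is hypothetical, not taken from the actual packets), the whole interaction amounts to posting a small bundle of city totals to a shared message box and folding neighbours' totals into the purely local simulation:

        from dataclasses import dataclass, asdict

        @dataclass
        class CitySnapshot:
            # Hypothetical per-city totals of the sort the traffic analysis describes:
            # aggregate resource/utility figures plus a snapshot of tradeable goods.
            city_id: str
            population: int
            power_surplus_mw: float
            water_surplus_kl: float
            tradeables: dict            # e.g. {"coal": 120, "alloy": 40}

        # The "server" is little more than a message box holding the latest snapshot per city.
        message_box: dict = {}

        def publish(snapshot: CitySnapshot) -> None:
            """Upload one city's totals -- a few hundred bytes at most."""
            message_box[snapshot.city_id] = asdict(snapshot)

        def poll_neighbours(my_city_id: str) -> list:
            """(Eventually) download everyone else's totals."""
            return [snap for cid, snap in message_box.items() if cid != my_city_id]

        def cover_power_shortfall(shortfall_mw: float, neighbours: list) -> float:
            """Wholly local synthesis: draw on neighbours' reported surplus."""
            remaining = shortfall_mw
            for snap in neighbours:
                taken = min(remaining, snap["power_surplus_mw"])
                remaining -= taken
            return remaining   # whatever the region couldn't cover

    Nothing in that exchange needs a data centre's worth of compute; the only genuinely shared state is the mailbox, and it is only eventually consistent.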

    The more in-depth speculation is that the servers spent a lot of their effort validating user inputs and the checkpoint saves--just frequently badly. Hence the problems with cities being declared invalid or rolled back to arbitrary or nonsensical points in time. In that case, a vast amount of computation was there to make sure the player played the game in a way Maxis and EA declared acceptable, and then that code and their horrendous cloud implementation crapped the bed.

    Given how poorly the server side has done, one has to wonder whether the trouble is in extracting the actually relevant functionality out of that mess. Perhaps with that effort they might make it more functional, because so much of this disaster is entirely the result of the unnecessary dislocation of data to a remote system.

    Intra city would mean computation within a plot, which the cloud does almost nothing for except inject spurious values that make it worse.

    Intercity computation would be where the multiplayer aspect comes in, most of which has the data content and computational complexity of a few spreadsheet cells put into an email sent every few minutes, except slower.

    The likely bigger load was the mass scale save file management and player action/DLC validation.
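
    A toy sketch of what that speculated validation might look like (the rule table, costs, and action names are invented for illustration, not Maxis's actual checks):

        # Invented action costs; the real validation rules are unknown.
        ACTION_COSTS = {"place_road": 10, "place_building": 500, "bulldoze": 5}

        def validate_checkpoint(prev_funds: int, actions: list, reported_funds: int) -> bool:
            """Replay the reported actions against the rule table and check the books balance."""
            funds = prev_funds
            for action in actions:
                if action not in ACTION_COSTS:      # unknown or DLC-gated action
                    return False                    # -> city declared invalid
                funds -= ACTION_COSTS[action]
            return funds == reported_funds          # mismatch -> roll back to last accepted save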
     
  3. Strange

    Veteran

    Joined:
    May 16, 2007
    Messages:
    1,540
    Likes Received:
    197
    Location:
    Somewhere out there

    Sorry, I meant intercity. Words got mixed up in my mind for a while.
     
  4. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    9,063
    Likes Received:
    2,675
  5. London-boy

    London-boy Shifty's daddy
    Legend Subscriber

    Joined:
    Apr 13, 2002
    Messages:
    22,315
    Likes Received:
    6,743
    How can anyone trust someone who writes LOT'S?
    Seriously???
     
  6. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    9,063
    Likes Received:
    2,675
    :smile2:
    Maybe his keyboard is broken.
     
  7. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    9,063
    Likes Received:
    2,675
  8. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,765
    Likes Received:
    8,158
    Location:
    London, UK
    Maybe I misunderstood Titanfall, but aren't the AI bots, which I'm guessing are the key code running on Azure's complete platform, just cannon fodder? What other compute is done in the cloud? Excepting the usual server hosting for communication between users.
     
  9. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,209
    Likes Received:
    5,634
    It's a good test of Azure's stability and scaling for a fairly large multiplayer launch. You're right in that the AI bots are pretty primitive. The Titan AI is pretty good when you have them in guard or follow mode, but the grunts are incredibly simple.

    A good explanation of the grunts is that they're like the creeps in DOTA2. Instead of farming gold from them, you farm seconds off of your titan timer.
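
    To make the analogy concrete (the numbers here are made up, not Respawn's actual tuning), the grunts effectively feed a countdown:

        TITAN_BUILD_TIME = 240.0   # hypothetical seconds until your titan is ready
        GRUNT_KILL_BONUS = 4.0     # hypothetical seconds shaved off per grunt kill
        PILOT_KILL_BONUS = 12.0    # pilot (player) kills are worth more

        def titan_timer_remaining(elapsed: float, grunt_kills: int, pilot_kills: int) -> float:
            """Grunts are farmed like creeps: each kill advances the titan timer."""
            reduction = grunt_kills * GRUNT_KILL_BONUS + pilot_kills * PILOT_KILL_BONUS
            return max(0.0, TITAN_BUILD_TIME - elapsed - reduction)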
     
  10. taisui

    Regular

    Joined:
    Aug 29, 2013
    Messages:
    674
    Likes Received:
    0
    Being able to scale dynamically across multiple data centers over the globe based on live demand w/o the complexity of capacity planning and logistics is what the cloud's all about.
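
    In pseudo-code terms (this is the general shape of demand-driven scaling, not Azure's actual orchestration API), the whole idea is just to keep instance counts tracking live player counts per region:

        PLAYERS_PER_INSTANCE = 60   # assumed capacity of one game-server instance

        def desired_instances(players_online: int) -> int:
            return -(-players_online // PLAYERS_PER_INSTANCE)   # ceiling division

        def rebalance(players_by_region: dict, running_by_region: dict) -> dict:
            """How many instances to start (+) or retire (-) in each region right now."""
            return {region: desired_instances(players) - running_by_region.get(region, 0)
                    for region, players in players_by_region.items()}

        # rebalance({"eu-west": 5400, "us-east": 12100}, {"eu-west": 80, "us-east": 150})
        # -> {"eu-west": 10, "us-east": 52}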
     
  11. -tkf-

    Legend

    Joined:
    Sep 4, 2002
    Messages:
    5,633
    Likes Received:
    37
    That would be the old cloud, the cloud we have been using for a long time.

    The new thing for games, though I actually thought that was how Live already worked, is Titanfall's dynamic server adaptation: more people spawns more servers. No need to plan ahead, but I doubt that they didn't plan ahead. Either Azure isn't busy enough, or it is so big that Titanfall hardly takes any effort.

    And it's far from the cloud power we hoped for.
     
  12. taisui

    Regular

    Joined:
    Aug 29, 2013
    Messages:
    674
    Likes Received:
    0
    ...and that's different from what I said?
     
    #1112 taisui, Mar 13, 2014
    Last edited by a moderator: Mar 13, 2014
  13. -tkf-

    Legend

    Joined:
    Sep 4, 2002
    Messages:
    5,633
    Likes Received:
    37
    You forgot the last line.
     
  14. taisui

    Regular

    Joined:
    Aug 29, 2013
    Messages:
    674
    Likes Received:
    0
    So... just what exactly is the cloud power that you are hoping for?
     
  15. Renegade_43

    Newcomer

    Joined:
    Jul 23, 2005
    Messages:
    36
    Likes Received:
    10
  16. Lalaland

    Regular

    Joined:
    Feb 24, 2013
    Messages:
    616
    Likes Received:
    303
    Very interesting. Did any of the side sessions go into detail about how this was done? Real-time physics is something I thought was too latency-sensitive to offload to the cloud.

    Does anyone know which session this was from? I'd like to watch more to see if they elaborate. The YT channel has no info on which talk it is and I don't recognise the speakers.
     
  17. warb

    Veteran

    Joined:
    Sep 18, 2006
    Messages:
    1,057
    Likes Received:
    1
    Location:
    UK
  18. Lalaland

    Regular

    Joined:
    Feb 24, 2013
    Messages:
    616
    Likes Received:
    303
    Cheers warb!

    Kind of hard to scrub through but there doesn't seem to be any other detail, hopefully there will be a more detailed session later on.
     
  19. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0
    What Phil Spencer said:

     
  20. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,029
    Location:
    Under my bridge
    It's a great, optimistic demo, but sadly not all that useful until we know the specifics. I suppose the data details can be kept secret if part of the development is finding ways to pack massive amounts of particle data (Vec3 position + rotation for each block, assuming each piece is premodelled and not computed in real time, which would require mesh data to be included as well), and this software solution will be part of MS's cloud advantage. 30,000 particles at 32-bit floats per value would be 192 bits * 30k ≈ 5.5 Mb of data per frame, or, at 30 fps, roughly a 170 Mbps connection. MS would need a way to condense that data to less than 1/10th of that to be viable for real users.
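
    Spelled out (the 30 fps figure and the six floats per block are assumptions, as above):

        particles        = 30_000
        floats_per_block = 6           # Vec3 position + Vec3 rotation
        bits_per_float   = 32
        fps              = 30          # assumed update rate

        bits_per_frame = particles * floats_per_block * bits_per_float   # 5,760,000 bits
        mbit_per_frame = bits_per_frame / 2**20                          # ~5.5 Mb per frame
        mbps_required  = bits_per_frame * fps / 1_000_000                # ~173 Mbps raw

        print(f"~{mbit_per_frame:.1f} Mb/frame, ~{mbps_required:.0f} Mbps at {fps} fps")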

    Two problems with this demo and understanding its application in real scenarios are the data connection in use (were these computers communicating over the internet, or in another room connected via Ethernet?), and that it wasn't a like-for-like comparison. Stationary particles won't need to be sent over the connection, so the BW requirement is much reduced. They should have kept it a fair test.
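
    The fairer version would only send blocks that actually moved since the last update, something like this sketch (the movement threshold is arbitrary):

        MOVE_EPSILON = 1e-4   # below this, a block counts as stationary and isn't sent

        def changed_blocks(prev_positions: dict, curr_positions: dict) -> dict:
            """Send only the blocks whose position changed since the last frame."""
            return {block_id: pos
                    for block_id, pos in curr_positions.items()
                    if block_id not in prev_positions
                    or max(abs(a - b) for a, b in zip(pos, prev_positions[block_id])) > MOVE_EPSILON}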

    Absolutely, but that doesn't tell us the timeline. This cloud computing is going to happen. Just in 1 year, or 3, or 5, or 10, or 20? Pretty demos fail to address the very obvious technical limitations, and MS continue not to explain how they solve them. Given that this was just an aside, I don't think the tech is at all ready, and MS aren't approaching devs to use this. It's just a glimpse of what the cloud can do over some idealised connection. MS will need to explain how this is applicable to real-world internet connections if such demos are to be taken as realistic.
     