Server based game augmentations. The transition to cloud. Really possible?

Discussion in 'Console Technology' started by Shifty Geezer, May 22, 2013.

  1. Le Photographeur

    Newcomer

    Joined:
    Mar 22, 2013
    Messages:
    19
    Likes Received:
    0
    OK, so technically it does not seem to be a problem, then?
    Because what I'm wondering is: perhaps this is something that doesn't need to be mandatory but "could work" even as an option?

    Is it possible to code games with "if" situations? So for example: "IF a second console is connected, offload X, Y, Z to it", etc.?
    So games would be coded to what the standalone machines are capable of, but COULD get improvements if a second unit is connected - improvements like we see in the PC space today from having a monster GPU, etc.
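    A minimal sketch of that "IF a second console is connected" idea (all function and task names here are invented for illustration; no real console exposes this API):

    ```python
    def detect_helper_unit():
        """Stand-in for a platform call that probes for a linked console."""
        return None  # standalone by default


    def plan_frame_work(helper=None):
        # Baseline workload every machine can handle on its own.
        local_tasks = ["gameplay", "core_rendering", "audio"]
        offloaded = []
        if helper is not None:
            # Optional extras that only run when a second unit is present.
            offloaded = ["extra_particles", "higher_res_shadows"]
        return local_tasks, offloaded


    local, extra = plan_frame_work(detect_helper_unit())
    print(local, extra)
    ```

    The point is just that the baseline workload never depends on the helper; the extras are strictly additive, so the game still ships against the standalone machine's budget.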

    If something like this could work, then obviously it would not be something everybody would do, but I could imagine a scenario where many people would get a second unit just for improving gfx.
    And when the price of the machine goes down, even more people would double dip (IF the improvements are there).

    Sure, this would never be standard, but if games can be coded with this in mind, and the OS is prepared for it so that it's as easy as plug and play, then at least it would make more people curious.

    At least, this is what I speculate :)
     
  2. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    16,666
    Likes Received:
    5,692
    It's certainly possible, but not very probable. It all comes down to economics.

    If the investment required is higher than the potential return, then no one will do it.

    For example, on PS2 it was possible to link four PS2s together to have one game render across four screens for a very large display. How many games supported that? :)

    Let's say 8 out of 10 console owners have the optional "thing," whatever it is. That might get implemented; there's a reasonable chance you could sell a copy of a game to one of them that would not have gotten sold otherwise. 5 out of 10? Now it's starting to get iffy. Will the additional support for the optional "thing" get enough people who wouldn't otherwise have bought the game to buy it, justifying the money and time invested to implement it? 2 out of 10? Highly unlikely. Less than 1 out of 10? Not likely, other than as a proof of concept by the console manufacturer, and most definitely not by any third parties.
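    The economics argument above, sketched with made-up numbers: supporting the optional "thing" only pays off when the extra copies it sells cover the implementation cost.

    ```python
    def worth_implementing(adoption_rate, install_base, conversion_rate,
                           revenue_per_copy, implementation_cost):
        # Owners who have the "thing", times the fraction of them who buy
        # the game *because* of the feature, times revenue per copy.
        owners_of_thing = install_base * adoption_rate
        extra_sales = owners_of_thing * conversion_rate
        return extra_sales * revenue_per_copy > implementation_cost


    # Same invented assumptions, only the adoption rate changes.
    base = dict(install_base=5_000_000, conversion_rate=0.01,
                revenue_per_copy=20.0, implementation_cost=500_000)
    print(worth_implementing(0.8, **base))  # 8/10 adoption: pays off
    print(worth_implementing(0.2, **base))  # 2/10 adoption: does not
    ```

    Every number here is invented; the shape of the inequality is what matters - the break-even adoption rate scales directly with implementation cost.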

    Now if it was mandatory, 10 out of 10 would have it, and you'd definitely see developers experiment with it. Anything less than that and you'll start losing developer support as they decide it might not be worth the additional cost, development time, etc. versus the potential additional sales.

    Regards,
    SB
     
  3. Betanumerical

    Veteran

    Joined:
    Aug 20, 2007
    Messages:
    1,602
    Likes Received:
    56
    Location:
    In the land of the drop bears
    I wouldn't want to do graphics on a separate console communicating over a 100 Mbit or 1 Gbit link; both are woefully slow in comparison to the internal buses.

    Realistically you're going to get around 8 MB/s out of 100 Mbit and around 80 MB/s out of 1 Gbit.

    Per frame in a 30 FPS game that is a tiny amount of bandwidth, and for a 60 FPS game it's even worse.
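    The per-frame arithmetic, using the realistic throughput figures from the post (for scale, a single uncompressed 1080p 32-bit frame is about 8.3 MB):

    ```python
    # Per-frame budget = link throughput / frame rate.
    def mb_per_frame(link_mb_per_s, fps):
        return link_mb_per_s / fps


    for link_name, mb_s in [("100 Mbit (~8 MB/s)", 8), ("1 Gbit (~80 MB/s)", 80)]:
        for fps in (30, 60):
            print(f"{link_name} at {fps} FPS: {mb_per_frame(mb_s, fps):.2f} MB/frame")
    ```

    Even the 1 Gbit case moves under 3 MB per frame at 30 FPS, which is why only compact results, not framebuffers or working sets, make sense over the link.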

    Though this doesn't mean you can't do other things on the other console that don't require as much bandwidth, but there are problems with that too, as mentioned above.
     
  4. liquidboy

    Regular Newcomer

    Joined:
    Jan 16, 2013
    Messages:
    416
    Likes Received:
    77
    wifi-direct improves this scenario http://www.ign.com/boards/threads/x...nderstated-feature-of-the-xbox-one.453139381/

    Btw, I expect a future IllumiRoom-like device that sits on a coffee table, beams a rendered 'light' target onto the TV/wall, and communicates with the XB1 via Wi-Fi Direct.
     
    #944 liquidboy, Jul 21, 2013
    Last edited by a moderator: Jul 21, 2013
  5. astrograd

    Regular

    Joined:
    Feb 10, 2013
    Messages:
    418
    Likes Received:
    0
    I agree, but I think parsing things in terms of 'interactive' and 'non-interactive' misses a lot of stuff that is interactive but delayed. You can even (presumably) pair those delayed interactive moments (where you trigger something) with fully interactive actions, like shooting a rocket at a building where the initial explosion/destruction effect is done locally but the building takes a few seconds to actually start to topple. That's a window where the cloud can potentially be augmenting the destruction parameters, doing the computations, and sending the results to the console for display.
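    That "trigger now, resolve later" pattern can be sketched as a toy frame timeline (the class and effect names are invented; the point is graceful degradation if the cloud result is late):

    ```python
    class DeferredCloudJob:
        """Local effect plays immediately; a detailed cloud result is only
        needed some frames later, with a canned fallback if it never arrives."""

        def __init__(self, frames_until_needed, frames_until_done):
            self.needed = frames_until_needed  # when the collapse must start
            self.done = frames_until_done      # when the cloud result lands

        def result_for_frame(self, frame):
            if frame < self.needed:
                return "local_explosion_effect"    # fully interactive, local
            if self.done <= frame:
                return "cloud_collapse_result"     # cloud made it in time
            return "canned_collapse_fallback"      # degrade gracefully


    job = DeferredCloudJob(frames_until_needed=90, frames_until_done=60)
    print(job.result_for_frame(10))    # local effect while the cloud works
    print(job.result_for_frame(120))   # detailed cloud result seconds later
    ```

    The few-second window between the trigger and the visible consequence is exactly where network latency can hide.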

    I think if you even just take a single random screenshot of a game, odds are you could find a TON of latency-insensitive stuff going on in that scene. For instance, in a game like Watch Dogs:

    [screenshot: Watch Dogs water channel scene]

    Those gorgeous waves are physics driven, but the distortions as player-controlled boats drive through them are a local effect. All the static geometry bounding that channel is non-interactive, and since the waves use a fluid-simulation effect based on the shape of the bounding geometry, the physics calculation for most of those waves doesn't need to be updated within a single frame. If you want ripples to spread outward after the player passes through the water, those ripples are time-delayed, since it takes time for them to propagate.

    There seems to be a lot of stuff that adds to the dynamism of a scene that doesn't need to be handled locally until the player takes the time to walk up to it and do something.

    I'd also add that the asteroid tech demo doesn't need to be pulling in 500k updates per second to account for all the asteroids at once. It takes some finite amount of time to move around in that kind of demo, and most likely you won't be viewing all of the asteroids in any given shot. You can have delays in the updates. With the local box handling the near-field objects, you can gradually stream in the data for the far-field objects in the time it takes you to actually move close enough to notice their dynamics.
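    A sketch of that far-field streaming idea: with a fixed per-frame update budget, spend it on the nearest asteroids and let distant ones go stale until the player gets close (numbers and the budget are invented):

    ```python
    def pick_updates(asteroid_distances, budget):
        """Return indices of the `budget` nearest asteroids to update this frame."""
        order = sorted(range(len(asteroid_distances)),
                       key=lambda i: asteroid_distances[i])
        return order[:budget]


    distances = [500.0, 10.0, 2500.0, 40.0, 900.0]
    print(pick_updates(distances, budget=2))  # -> [1, 3] (the two closest)
    ```

    The far-field asteroids still get updates eventually, just spread across many frames; the player's travel time is what buys the latency slack.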
     
    #945 astrograd, Jul 21, 2013
    Last edited by a moderator: Jul 21, 2013
  6. LightHeaven

    Regular

    Joined:
    Jul 29, 2005
    Messages:
    538
    Likes Received:
    19
    Why are you directly comparing the internal bus to the network bandwidth? That doesn't make much sense to me, since in games the output is usually far less demanding, memory- and bandwidth-wise, than what's required to create that output.

    The other console/server would still have lots of internal bandwidth to work on the task; the slower link would only be used to transfer a much smaller buffer.
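    Rough numbers for this point (all sizes are illustrative, not measured): the helper box can churn through a large working set on its fast internal bus while only a compact result crosses the slow link.

    ```python
    working_set_mb = 2048   # geometry, textures, intermediate buffers (local)
    result_mb = 4           # e.g. a compressed lightmap update (transferred)
    link_mb_per_s = 80      # ~1 Gbit ethernet in practice

    transfer_time_s = result_mb / link_mb_per_s
    ratio = working_set_mb / result_mb
    print(f"result crosses the link in {transfer_time_s * 1000:.0f} ms")
    print(f"internal traffic is {ratio:.0f}x larger than what is transferred")
    ```

    So the internal-bus vs network comparison only bites if you try to ship the working set itself; shipping a small result is a very different proposition.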
     
  7. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,439
    Likes Received:
    1,235
    EA's Frank Gibeau talks about the cloud as part of a much larger interview. It sounds at least like they aren't dismissing it offhand, NeoGAF-style.

    http://venturebeat.com/2013/07/24/f...-and-respawns-titanfall-interview-part-two/2/

     
  8. humbertklyka

    Newcomer

    Joined:
    Aug 24, 2006
    Messages:
    36
    Likes Received:
    1
    Seems like NVIDIA is doing some research into cloud usage for indirect lighting which is being presented at SIGGRAPH:

    http://graphics.cs.williams.edu/papers/CloudLight13/

    Seems like it's quite doable, even if there will be significant lag when there are sudden changes.

    Unfortunately it's hard to evaluate the visual difference compared to running only a local solution.
     
  9. gurgi

    Regular

    Joined:
    Jul 7, 2003
    Messages:
    605
    Likes Received:
    1
    This raises another question. What happens in a home, roommate, or dorm situation with multiple xboxes trying to play titanfall or something else utilizing compute for more than AI pathing and matchmaking?
     
  10. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,614
    Likes Received:
    60
    It depends. Peer-to-peer networking can often be trickier to handle than a regular client-server setup. In a complex setting like a university, the network admins may set policies on their routers to contain/partition the traffic.
     
  11. dumbo11

    Regular

    Joined:
    Apr 21, 2010
    Messages:
    425
    Likes Received:
    1
    If a company were to sell an 'xbox +', it would come with two ethernet ports: plug one directly into the original console and the other into the network.

    But my own feeling is that an 'xbox +' should be a full onlive/gaikai-style server rather than a 'compute resource'.
     
  12. DaveNagy

    Newcomer

    Joined:
    Jan 18, 2013
    Messages:
    51
    Likes Received:
    0
    Fantastic find. That paper seems like a direct response to many of the questions posed by this thread. I hadn't thought about the amortization angle where multiple clients can be taking advantage of (and contributing their "cloud allowance" toward) the same set of lighting calculations.

    The big disconnect here is that nVidia envisions a cloud of GPUs, while Microsoft's cloud is composed of CPUs, memory, and storage. Would a software (i.e. CPU) version of these sorts of lighting calculations be worth the effort? I know that GPUs are vastly more parallel, but could several-to-many interconnected multi-CPU servers start to compete in that space? (On a per-watt, per-transistor, or per-hardware-dollar basis?)
     
  13. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    6,879
    Likes Received:
    491
    The only relevant one was the irradiance-map solution (the voxel approach is basically remote rendering, and the photon approach needs high bandwidth). This is what they said about irradiance mapping in closing:

    "Irradiance mapping requires the lowest bandwidth of our algorithms, with latency lower than voxels due to utilization of client computational resources. It also integrates easily into existing game engines. Unfortunately, irradiance maps require a global geometric parameterization. While decades of research have provided a multitude of parameterization techniques, these do not address problems specific to global illumination: handling light leaking where texels lie below walls or keeping world-space samples close in texture space for efficient clustering into basis functions. We see the authoring burden of parameterization as one reason developers are moving towards other techniques, e.g., light probes."
     
  14. Solarus

    Newcomer

    Joined:
    Jan 12, 2009
    Messages:
    156
    Likes Received:
    0
    Location:
    With My Brother
    Can this be done on Azure? I looked up the specs for Azure machines and I couldn't find anything, but they do use AMD processors http://harutama.hatenablog.com/entries/2010/10/30 (it's on slide 8). But that's old, so it's possible they've since moved to Intel machines with really powerful GPUs.
     
  15. joker454

    Veteran

    Joined:
    Dec 28, 2006
    Messages:
    3,819
    Likes Received:
    139
    Location:
    So. Cal.
    Just curious, has Microsoft said whether their cloud approach for video game use is purely CPUs rather than GPUs? I'd have thought going with CPUs would be better for their purposes anyway, giving them more general virtual computing resources, but I wonder if they've confirmed that.
     
  16. Xenus

    Veteran

    Joined:
    Nov 2, 2004
    Messages:
    1,316
    Likes Received:
    6
    Location:
    Ohio
    It's not that they've said it's purely CPU, but their Azure backbone is pretty much all CPU-based. That may change in the future, but for now it's what they have.
     
  17. -tkf-

    Legend

    Joined:
    Sep 4, 2002
    Messages:
    5,632
    Likes Received:
    36
    Further up I posted a link that gave some more recent information on the hardware.
     
  18. LightHeaven

    Regular

    Joined:
    Jul 29, 2005
    Messages:
    538
    Likes Received:
    19
    MS has some big compute servers (I think that's what they're called) that are either 8-core machines with 60 GB of RAM or 16-core machines with 120 GB. They have very high network bandwidth and are apparently super efficient.

    I don't know if MS has opened these servers to 3rd parties yet, but a preliminary deployment of 500 servers already ranks in the Top500 list of supercomputers, and in the top 50 for efficiency.

    I would say it can most definitely be done. Dunno if they have enough of those servers to make it feasible for millions of concurrent players, though.
     
  19. Cjail

    Cjail Fool
    Veteran

    Joined:
    Feb 1, 2013
    Messages:
    2,027
    Likes Received:
    210
    Never mind.
     
  20. DaveNagy

    Newcomer

    Joined:
    Jan 18, 2013
    Messages:
    51
    Likes Received:
    0
    Well, MS made those hand-wavy comments about how every XB1 would have access to the equivalent of three more XB1's worth of CPU and memory up in the cloud. I took that to mean that there might be a few hundred GFLOPS of CPU on tap up there, but definitely not several TFLOPS of GPU resources. They certainly would have made a big deal about that, if it had been the case. And of course, Azure wasn't built primarily for gaming/graphics. It's essentially a bunch of Web/DB/Application servers.
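    For scale, the arithmetic behind "a few hundred GFLOPS", using the commonly quoted XB1 figures (8 Jaguar cores at 1.75 GHz, up to 8 single-precision FLOPs per core per cycle, ~1.31 TFLOPS for the local GPU) - treat these as ballpark numbers:

    ```python
    cores, ghz, flops_per_cycle = 8, 1.75, 8
    cpu_gflops = cores * ghz * flops_per_cycle   # ~112 GFLOPS locally
    cloud_gflops = 3 * cpu_gflops                # "three more XB1s" => ~336 GFLOPS
    gpu_gflops = 1310                            # local GPU, for comparison

    print(cpu_gflops, cloud_gflops, gpu_gflops)
    ```

    So even taken at face value, the promised cloud CPU capacity is a fraction of the local GPU's raw throughput, which fits the skepticism about there being several TFLOPS of GPU up there.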

    It would seem that Sony/Gaikai's idea of the "cloud" is much closer to that of nVidia's: Lots of GPU in the cloud, able to do actual (complete) game rendering, or failing that, at least do render-assist stuff like these "bonus" irradiance calculations that the nVidia paper talks about.

    Sadly, I take the grand plans from Sony-Gaikai (great cyberpunk corp name!) that we've heard about with a grain of salt. Building that up from scratch, and then monetizing it, sounds like a tall order given Sony's resources.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.