Can the cloud benefit Ray Tracing?

Discussion in 'Console Technology' started by Alucardx23, Oct 2, 2019.

  1. Alucardx23

    Regular

    Joined:
    Oct 7, 2009
    Messages:
    519
    Likes Received:
    67
    I have been looking online for ways developers might take advantage of the networked hardware in a cloud gaming server to make ray tracing calculations more cost-effective. I found some information, but not a lot of detail on how it would actually work. Can anyone share some info on how this might work, or any other links that describe how it can be done?

    "When ray-tracing is distributed across multiple machines on the cloud, server side and thus tapping into massive resources, it can deliver the full blown capacity of ray-tracing rendering in realtime. The constraints studios face when building games are lifted. No longer must they navigate the restrictions of various hardware, making sacrifices when executing their gameplay vision, forced to deliver a compromised version of the original concept."

    "Imagine rendering one area in a game for 1,000 players, 1,000 times over. As you can imagine, this is extremely time-intensive and expensive. However, this is where cloud-based ray-tracing outperforms other ray-tracing methods. Aspects can be rendered once for an unlimited amount of people as ray-tracing becomes shared across many players, making it a highly cost-effective option."

    https://www.hadean.com/blog/cloud-based-ray-tracing-what-is-it-and-how-does-it-work

    "As ray tracing is a highly parallel algorithm, Intel MIC Architecture will help provide big gains in performance by increasing the number of available cores for highly parallel applications in the high performance computing (HPC) market and for datacenters. This leads us to the topic of this paper: bringing together the Intel MIC Architecture along with a cloud‐based gaming model to enable more advanced and realistic image rendering with real‐time ray tracing."

    "Distributing tile‐based rendering across all servers. This method splits the task of rendering a ray‐traced image into small tiles (like 32x32 or 64x64 pixels) and assigns them to a specific server. The benefit of this approach is that it has very low latency. The machines will work together to finish this one frame as fast as possible. The drawback is that a smart algorithm is required for accurate load balancing between the servers. Some tiles might be calculated much faster than others (e.g., displaying the sky without any geometry is very fast). Therefore it could happen that all but one of the machines are already done with their work but have to sit idle until the last one is finished. To solve that, there are approaches like task stealing, where an idle thread can grab work from the pipeline of another busy thread, that should perform well."

    http://wolfrt.de/pdf/Cloud-based_Ray_Tracing.pdf
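    The tile-splitting and task-stealing scheme the paper describes can be sketched roughly like this. The tile size, worker count and the round-robin assignment are placeholders of mine for illustration, not values or code from the paper:

```python
# Sketch: split a frame into tiles, distribute them across workers,
# and let idle workers steal tiles so nobody sits idle at the end.
from collections import deque
from threading import Thread, Lock

TILE = 64                 # tile edge in pixels (paper suggests 32x32 or 64x64)
WIDTH, HEIGHT = 1920, 1080

def make_tiles(width, height, tile):
    """Split the frame into (x, y, w, h) tiles, clamping at the edges."""
    return [(x, y, min(tile, width - x), min(tile, height - y))
            for y in range(0, height, tile)
            for x in range(0, width, tile)]

class Worker:
    def __init__(self, tiles):
        self.queue = deque(tiles)   # this worker's own tile queue
        self.lock = Lock()
        self.done = []

    def pop_own(self):
        with self.lock:
            return self.queue.popleft() if self.queue else None

    def steal(self):
        # An idle worker grabs a tile from the *back* of a victim's queue.
        with self.lock:
            return self.queue.pop() if self.queue else None

def run_worker(me, others, render):
    while True:
        tile = me.pop_own()
        if tile is None:
            # Own queue empty: try stealing from another worker.
            for victim in others:
                tile = victim.steal()
                if tile is not None:
                    break
            if tile is None:
                return              # nothing left anywhere: frame finished
        me.done.append(render(tile))

def render_frame(num_workers=4, render=lambda t: t):
    tiles = make_tiles(WIDTH, HEIGHT, TILE)
    # Static round-robin assignment; stealing fixes imbalance at runtime.
    workers = [Worker(tiles[i::num_workers]) for i in range(num_workers)]
    threads = [Thread(target=run_worker,
                      args=(w, [o for o in workers if o is not w], render))
               for w in workers]
    for t in threads: t.start()
    for t in threads: t.join()
    return sum(len(w.done) for w in workers)
```

    The sky-tile imbalance the paper mentions is exactly what the stealing path handles: a worker whose tiles were cheap empties its queue early and drains the queues of the slower ones instead of idling.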
     
  2. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,891
    Likes Received:
    11,483
    Location:
    Under my bridge
    Before you can answer whether the cloud can benefit ray tracing, you need to answer how the cloud can benefit real-time gaming at all. Regardless of how you generate the visual results, how do you get them to the user? Streamed video, or some huge streamed datasets? If the latter, how can you provide those assets over the network in a timely fashion? Let's say you have a bunch of realtime light sources and you try rasterising their light, and then ray-tracing it. Either way, you need to send the updated lightmaps for all affected objects and scenery to the user. How do you do that without the cloud-computed light effects lagging horribly behind the light sources?

    My reaction to that blog is it's all hypothetical. Until they show it in action, it's a general concept they have with none of the real-world implementation issues solved.

    Edit: The second link, Intel's paper, is using game streaming and sending the game video.
     
    #2 Shifty Geezer, Oct 2, 2019
    Last edited: Oct 2, 2019
    Alucardx23 and JPT like this.
  3. JoeJ

    Regular Newcomer

    Joined:
    Apr 1, 2018
    Messages:
    513
    Likes Received:
    614
    Here's some related work on streaming lightmaps / lit textures and model data in a client-server model:


    I think the question should be more about lighting in general, not necessarily restricted to ray tracing. (It does not matter how we calculate the lighting, only what to share, how to transfer it, how to deal with lag, etc.)
    If we take an MMO game as an example, I see these server-side options (in the order they would be practical, depending on lag):

    1. Compute indirect diffuse lighting, which can be shared between all players (e.g. lightmaps). Large lag is acceptable; direct lighting is still done on powerful clients.
    2. Volumetric lighting, which can also be shared (volumetric data, so it needs good compression?).
    3. Direct specular lighting (e.g. sharp reflections; distant stuff could still be shared by a group of nearby players).
    4. All lighting on the server. In this case, does it still make sense to transfer scene and lighting data to clients that render a lag-free image, or can we just transfer the final frame like Stadia does, accepting some latency? The ideas above could still be used server-side.
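    One toy way to make that ordering concrete: given a round-trip latency, pick which terms a server could feasibly compute and share. All the thresholds here are invented for illustration; they are not measured numbers.

```python
# Toy decision function for the ordering above: the slower a lighting term
# changes perceptually, the more server-side lag it tolerates.
# (term, max tolerable lag in ms before server-side computation looks wrong)
SERVER_CANDIDATES = [
    ("indirect diffuse (lightmaps)", 500),  # GI changes slowly; big lag is fine
    ("volumetric lighting",          250),
    ("shared distant specular",      100),
    ("all lighting (thin client)",    50),  # effectively full game streaming
]

def server_side_terms(rtt_ms):
    """Return the lighting terms a server at this round-trip time could share."""
    return [term for term, budget in SERVER_CANDIDATES if rtt_ms <= budget]
```

    So at a poor 300 ms round trip only the slowly-varying diffuse GI survives, while at LAN-like latencies everything, including full streaming, is on the table.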

    Personally I feel somewhat resistant to any cloud-based gaming. But it can save power and money, so let it happen.
    It will be interesting to see whether that holds for single-player games too. Which draws less power: a small number of large servers, or a large number of consoles, most of them turned off most of the time?
     
    Alucardx23 and Shifty Geezer like this.
  4. Alucardx23

    Regular

    Joined:
    Oct 7, 2009
    Messages:
    519
    Likes Received:
    67
    I think that the original Crackdown 3 tech demo gives a pretty good idea of what will be possible on the cloud. The video below seems like an evolution of that, and it would be pretty cool to see that type of destruction in something like a 1,000-player battle royale.



    Regarding ray tracing, I understand that rays are calculated with the camera's point of view as the origin, so that only the rays which actually reach the camera are computed, which is more efficient. But in a cloud scenario, wouldn't it be more efficient to calculate the rays from the light sources and then share the result with the client instances?
     
    Mitchings likes this.
  5. Alucardx23

    Regular

    Joined:
    Oct 7, 2009
    Messages:
    519
    Likes Received:
    67
    There are around 100 million PS4s sold; to simplify the math, let's say all of them are 1.8 TF. That adds up to 180,000,000 TF of power, but all of that power cannot work together: it is split into 100 million pieces. The average gamer only uses their console around 6 hours every week. Think about the amount of plastic just sitting there. That means the average console is idle 162 hours a week, 648 hours a month, and 7,776 hours a year, and in the best case it is hopefully off instead of consuming energy in standby mode. In a server environment you could cover the same number of people with a lot less hardware, and that hardware would be used almost 100% of the time. I'm not even going into the monetary and environmental cost of manufacturing and distributing the games and consoles.
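    For what it's worth, the arithmetic above checks out (using a 4-week month, as the post does):

```python
# Checking the utilisation figures in the post, using its own inputs.
CONSOLES = 100_000_000            # PS4s sold
TFLOPS_EACH = 1.8
HOURS_PLAYED_PER_WEEK = 6

total_tflops = CONSOLES * TFLOPS_EACH             # 180,000,000 TF, but split
idle_per_week = 7 * 24 - HOURS_PLAYED_PER_WEEK    # 162 h
idle_per_month = idle_per_week * 4                # 648 h (4-week month)
idle_per_year = idle_per_month * 12               # 7,776 h

# Upper bound on server consolidation: if play sessions never overlapped,
# one always-on server GPU could cover 168 / 6 = 28 players' weekly hours.
# Real peak-hour overlap makes the achievable ratio much smaller.
consolidation_upper_bound = (7 * 24) / HOURS_PLAYED_PER_WEEK
```

    The 28x figure is only a ceiling; everyone playing in the same evening hours is exactly the overlap a real datacenter has to provision for.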

    It gets worse when you realize that not every game requires the full power of the console to run. As an example, let's say the One X has a GPU capable of running 15 instances of Cuphead; in a home environment that hardware will only ever be turned on to play a single instance of the game. The same amount of power in a server environment could cover 15 people who want to play that game at the same time.

    https://www.limelight.com/resources/white-paper/state-of-online-gaming-2018/
     
    #5 Alucardx23, Oct 3, 2019
    Last edited: Oct 3, 2019
    JoeJ likes this.
  6. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,891
    Likes Received:
    11,483
    Location:
    Under my bridge
    Final Crackdown toned down the cloud physics considerably. I don't think anyone has yet demonstrated a working, effective way to sync large datasets over the network in realtime.
    How do you sync it? There's no point highlighting the power of the cloud. We can consider that infinite, ignoring economic factors. The limiting factor is how to get that power on a local machine. The only fairly effective way used thus far is game streaming.
    Probably cloud RT would fill in lighting data for local rendering, which could then ray-trace locally on top of it. Maybe. But as above, how do you get that data down and synced? JoeJ's link above looks somewhat promising, but since both game streaming and cloud processing face the same issues, why not just go full streaming? In which case, thin clients and server-side tracing are the order of the day.
     
    Alucardx23 likes this.
  7. Alucardx23

    Regular

    Joined:
    Oct 7, 2009
    Messages:
    519
    Likes Received:
    67
    Reading your previous posts, I understand your opinion to be that the original vision for Crackdown 3 would have been easier to realise on a cloud streaming service. That is what I'm referring to when I mention the cloud in this thread: services like Stadia and GeForce Now. Sorry for not making that clear; now that you mention it, I didn't specify which cloud-processing scenario I was referring to. The bandwidth available inside a cloud gaming datacenter is an order of magnitude higher than the average internet connection, so it should make all of this easier to accomplish.
     
  8. JoeJ

    Regular Newcomer

    Joined:
    Apr 1, 2018
    Messages:
    513
    Likes Received:
    614
    That is only necessarily true for sharp specular. For diffuse (and glossy specular) you can also start rays from the surface. Think of object-space lighting, which would fit the idea of transferring lightmaps / lit textures better anyway. (Glossy specular would require a directional data structure like SH, or a better alternative, so the client can look up its own local direction.)
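    A minimal sketch of that directional lookup: the server could store low-order spherical harmonics per texel (here just bands 0 and 1, four coefficients) and the client evaluates them in its own local direction. The basis constants are the standard real SH normalisations; the storage layout is illustrative, not from any of the linked work.

```python
# Server stores 4 SH coefficients per lit texel; the client reconstructs
# radiance in its own view/normal direction, so shared lighting data still
# yields view-dependent results.

def sh_basis_l1(d):
    """Real spherical-harmonic basis, bands 0 and 1, for unit direction d = (x, y, z)."""
    x, y, z = d
    return (
        0.282095,        # Y_0^0  = 1 / (2 * sqrt(pi))
        0.488603 * y,    # Y_1^-1 = sqrt(3 / (4*pi)) * y
        0.488603 * z,    # Y_1^0  = sqrt(3 / (4*pi)) * z
        0.488603 * x,    # Y_1^1  = sqrt(3 / (4*pi)) * x
    )

def eval_sh_l1(coeffs, direction):
    """Client-side lookup: dot the stored coefficients with the basis."""
    return sum(c * b for c, b in zip(coeffs, sh_basis_l1(direction)))
```

    Four floats per texel is why band-1 SH is a popular compromise: it is barely more data than a flat lightmap but already gives the client a direction-dependent signal to evaluate lag-free.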

    Yes, if full streaming is possible for enough people with good latency, it's likely the easier way to go, or at least to get started. (The idea of shared diffuse GI with specular calculated on the clients really makes sense, but it still requires very powerful clients.)
    But the sync issues will just move entirely to the server side if we then ask for combining multiple servers to get a really large, or better, a rich world.
    Graphics-wise I don't see a big problem at all, but physics is harder.
    Sync issues really seem to be a major and ever-increasing challenge: multithreading, chiplets, networks... it's everywhere, at all scales. I guess we'll spend a lot of time on this anyway.

    The thing is, besides the ecological advantages, I would expect new games from new tech. Replacing more smoke and mirrors with proper simulation would be my personal goal, to give more options to both players and game designers.
    You want to take away my own local machine, the one I control rather than some company? OK, fine, but then give me something in return :)
     
    Shifty Geezer and Alucardx23 like this.
  9. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,891
    Likes Received:
    11,483
    Location:
    Under my bridge
    Okay. If you want to know about cloud rendering, go look up render farms, and take this discussion to the Graphics/Rendering Tech forum, as it's not specific to (console) gaming. Hollywood has been working this angle for decades, and they'll know all about the bottlenecks and pitfalls of distributing a scene over multiple servers.
     
    Alucardx23 likes this.
  10. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend Veteran

    Joined:
    Oct 14, 2008
    Messages:
    7,236
    Likes Received:
    1,346
    What I imagine is that the cloud would calculate the lighting and the game would download it while loading. So instead of "baking lightmaps" at load time like some games do, it would just download them from the server.

    But then I don't understand the benefit compared to baking the RT lights and shadows into the game data during development.

    Anyway, this will only work well for certain kinds of games. Something like Zelda: Link's Awakening could use it, I think.
     
  11. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    7,090
    Likes Received:
    3,161
    Location:
    Pennsylvania
    There isn't really any benefit. If you were doing something like a time-of-day change in an open-world game, updating all the lightmaps every 10 minutes or so as the sun moves could potentially be a lot of data, not to mention that transitioning everything smoothly would be difficult to achieve.
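    A back-of-envelope illustration of "a lot of data": every size here is an invented placeholder, before any compression.

```python
# Rough cost of shipping refreshed lightmaps for a time-of-day change.
LIGHTMAP_RES = 4096          # one 4K lightmap atlas (placeholder)
BYTES_PER_TEXEL = 4          # e.g. RGBA8, uncompressed
ATLASES = 8                  # atlases covering the visible world (placeholder)

raw_bytes = LIGHTMAP_RES ** 2 * BYTES_PER_TEXEL * ATLASES   # ~0.5 GiB per refresh
update_interval_s = 10 * 60                                 # full refresh every 10 min
avg_bandwidth_mbps = raw_bytes * 8 / update_interval_s / 1e6
```

    Even averaged over the 10-minute window that is roughly 7 Mbit/s of sustained lightmap traffic, and it arrives as a half-gigabyte burst rather than a steady trickle; smoothing the transition per frame would multiply it further, which is the difficulty described above.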
     
  12. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,891
    Likes Received:
    11,483
    Location:
    Under my bridge
    It only has value being computed during gameplay if the scene is being generated/modified during gameplay. Any static environments are better off being baked and included in games. Cloud rendering is very valuable for production and generating these lightmaps quickly.
     
    Alucardx23, PSman1700 and Malo like this.
  13. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend Veteran

    Joined:
    Oct 14, 2008
    Messages:
    7,236
    Likes Received:
    1,346
    Could be useful for a Minecraft "exhibition mode" or something.

    Those with RTX hardware could make something with local RT, but when other people load your map in "exhibition mode", where they can only experience the map and can't destroy anything, the RT would also be enjoyable by all gamers, including on smartphones, because the RT was done by the cloud and downloaded along with the map.
     
