Path tracing denoising

OCASM


Both NVIDIA and Otoy are also working on this, with similar results, although they take the neural-network approach. With Brigade finally being released next year (and integrated into Unity for free), the era of ray-traced games is finally (hopefully) upon us.
 
This is cool stuff, and I am sure we will path trace games eventually, however...

10x visual quality over AAA games is debatable. Most current game environments are mostly static and have mostly static lighting. And I will argue that this will not radically change, because real-life environments are mostly stable too (walls don't collapse in your office/home all the time, and you don't move light sources that often at high speed). Mostly static environments allow the use of (directional) lightmaps. Baked lighting looks as good as these real-time techniques in the environments commonly used in games, but only uses a fraction of the performance budget. Path tracing allows more dynamic environments and gives more freedom to game design. That's great. However, you need to rebuild your ray tracing acceleration structures if you make big changes to the environment, making some types of changes much harder to implement than in mesh rasterization based systems.
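To make the acceleration structure point concrete, here is a minimal sketch (hypothetical, heavily simplified to 1-D bounds) of why small object movement can be handled by cheaply refitting a BVH, while big structural edits degrade it until you pay for a full rebuild:

```python
# Hypothetical, heavily simplified BVH with 1-D (min, max) bounds.
# Refitting is cheap and keeps the tree topology; large scene changes
# leave the refitted boxes so loose that only a rebuild restores quality.

class Node:
    def __init__(self, bounds, left=None, right=None, prim=None):
        self.bounds = bounds          # (min, max) interval
        self.left, self.right, self.prim = left, right, prim

def union(a, b):
    return (min(a[0], b[0]), max(a[1], b[1]))

def refit(node):
    """Bottom-up refit: O(n), topology unchanged."""
    if node.prim is not None:                     # leaf node
        node.bounds = node.prim["bounds"]
    else:
        refit(node.left)
        refit(node.right)
        node.bounds = union(node.left.bounds, node.right.bounds)
    return node.bounds

# Two primitives; the second one will move.
p0 = {"bounds": (0.0, 1.0)}
p1 = {"bounds": (2.0, 3.0)}
root = Node(None, Node(None, prim=p0), Node(None, prim=p1))
refit(root)                    # root.bounds == (0.0, 3.0), tight fit
p1["bounds"] = (9.0, 10.0)     # object moved far away
refit(root)                    # root.bounds == (0.0, 10.0): still valid,
                               # but very loose, so traversal degrades
```

After the move, the refitted root box is valid but loose, so ray traversal visits far more nodes than necessary; that is the point where a real engine has to schedule the expensive rebuild mentioned above.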

Fast path tracing (with a good reconstruction filter) will simplify the rendering code a lot. But a 100x improvement is too much. A 100x improvement would mean cutting your 20-person rendering team down to 1 person, and that 1 person finishing all the rendering tasks 5x faster than the old 20-person rendering team. Doesn't sound plausible.

100x more efficient art production isn't going to happen either. Big productions such as Assassin's Creed have 100+ artists. Can you replace their art team with just 1-2 people? Polygon model authoring still takes as much time as before. You still can't model environments with huge distances without any geometric LOD. Without LOD, rays would basically hit random polygons in the billion-polygon soup, thrashing the GPU caches. You still need to draw/capture all your PBR material textures. PBR already reduced the number of artist hacks required a lot. Path tracing is a step further, but nowhere close to a 100x productivity improvement on the art side.
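To illustrate the LOD argument, here is a hypothetical distance-based LOD picker (names and thresholds invented for the example) that halves the triangle budget as objects recede; without something like this, every ray pays for full-detail geometry no matter how far away it hits:

```python
# Hypothetical sketch of geometric LOD selection by distance. Rays hitting
# distant objects should intersect coarse proxies, not the full mesh,
# otherwise traversal touches far more memory than the caches can hold.

def select_lod(distance, base_triangles, lod_count=5, switch_distance=10.0):
    """Step to a coarser level each time distance doubles past switch_distance;
    each level carries half the triangles of the previous one."""
    level = 0
    d = distance
    while d > switch_distance and level < lod_count - 1:
        d /= 2.0
        level += 1
    return level, base_triangles >> level

select_lod(5.0, 1_000_000)    # -> (0, 1000000): full detail up close
select_lod(80.0, 1_000_000)   # -> (3, 125000): coarse mesh far away
```

The thresholds are made up, but the shape of the problem is real: without some discrete or continuous LOD scheme, a billion-polygon world costs the same per ray at every distance.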
 
The improvement is so massive because artists would no longer have to wait minutes or even hours to see the results of their changes to the levels. Less time wasted = money saved. I guess that's the logic there.

Also, part of the reason why levels are so static is that artists (and consumers) want the lighting to be high fidelity at all times. With path tracing you can have that plus dynamic environments. Win-win.

It seems a presentation was given yesterday, and the performance improvement is great.



And if you have no problem playing at a lower resolution, you could use that in games right now. Oh, Brigade is going to be great.
 
We all know that dynamic lighting saves time versus baking lightmaps (no matter which technique is used). We have been using dynamic lighting (lower quality than baked) in Trials games for ages, partly for this reason. My ballpark estimate was that this improved productivity by 2x or so for some people on our art/content team...

However, a 100x improvement in total art team productivity doesn't have any backing from actual game project data. A big portion of artists would see very little gain from path tracing: 3d modelers, animators and texture artists (already creating PBR textures) do not see any notable difference in their jobs. Lighting artists (who create the final lighting for levels) see a huge difference, but we are talking about a few people per project here. Level designers don't see that big a difference either, because accurate lighting is only relevant in polish passes (gameplay passes use gray boxes instead of actual models), and that's when level artists (and lighting artists) take over.

Rendering programmers will of course see a huge difference, but it would be stupid to assume that optimizing your ray intersection tests or BVH update algorithms wouldn't take any time compared to optimizing a rasterizer-based pipeline. There will be fewer hacks for sure, but 60 fps path tracing on consoles would require lots of hacks too (denoising and reprojection aren't without issues either, so you likely need lots of hacks to hide their artifacts). A GPU-based path tracing pipeline is mostly highly optimized compute shaders and requires lots of tweaking and tuning to work efficiently on each platform.

The movie industry is different from real-time rendering. In real-time rendering, performance tradeoffs simply can't be solved by adding more hardware. The concept of trading hardware cost against software cost doesn't exist in gamedev, as the consumer pays for the hardware, not you. If you use advanced techniques that require more GPU cycles, you need to work around the performance issues and spend more time optimizing the code to reach your goals. The reality is that a slower algorithm needs more optimization effort to be usable in consumer products.
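As a sketch of where the denoising/reprojection hacks come from, here is the basic temporal accumulation step (a generic exponential moving average, not any specific engine's implementation): history is blended with the current noisy frame, and wherever the history must be rejected, the pixel falls back to full noise:

```python
# Generic sketch of temporal accumulation in a reprojection-based denoiser.
# Disocclusions and camera cuts invalidate the history, and those are
# exactly the spots where visible artifacts (and the extra hacks) appear.

def accumulate(history, current, history_valid, alpha=0.1):
    """Per-pixel exponential moving average; reset when history is unusable."""
    if not history_valid:          # disocclusion: no reprojected history
        return current             # fall back to the raw noisy sample
    return (1.0 - alpha) * history + alpha * current

# Converges toward the true signal (here, 1.0) over many frames...
x = 0.0
for noisy in [1.2, 0.8, 1.1, 0.9, 1.0, 1.0]:
    x = accumulate(x, noisy, history_valid=True)

# ...but a single rejected frame throws away all accumulated samples:
x = accumulate(x, 1.4, history_valid=False)   # x == 1.4, full noise again
```

The blend factor and rejection test are where the tuning lives: too aggressive and you get ghosting, too conservative and disoccluded regions sparkle with noise.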

I just can't see how a single magic bullet would solve game production costs (a 100x cheaper art budget). Path tracing didn't make movies much cheaper to produce either. We have more consistent quality nowadays (even at lower budgets), but there was no dramatic 100x drop in movie production cost.
 
It's true that for many jobs the switch to path tracing wouldn't make much of a difference. Morgan is a graphics researcher, so I think it's pretty obvious he's talking specifically about the rendering side of things. Obviously real-time path tracing requires extensive tweaking, but what doesn't? Having to tweak one thing is easier than tweaking 20, not to mention making sure all of them work fine together. It also means far less setup to prepare scenes for lighting, like in the case of lightmap UVs, which can be quite a pain in the ass. Sure, maybe he's exaggerating, but in some areas of the pipeline the benefits of real-time path tracing are quite massive.
 
That is awesome progress indeed, but I tend to agree with what sebbbi says; those numbers seem quite exaggerated...

Regarding workflow, modeling and level editing basically do not change. And performance-wise, even with very optimized BVH trees, updating the world is still quite expensive when we talk about polygon-based worlds, which is the main reason we don't see many, if any, polygon-based real-time ray-traced worlds with lots of moving objects.

I am also a bit skeptical, although I do respect the progress. But those worlds, if they render at 60fps 1080p with path tracing, probably render at 300fps with rasterization and still have headroom for many moving objects. =)

Please don't get me wrong: I am all for path-tracing, of course! And it's a great achievement anyway!
 
So Otoy's Unity talk just ended, and if I recall correctly it was mentioned that full real-time unbiased path tracing is currently achievable using 8 Volta cards. Running on just 2 high-end cards (don't know if Volta), they can achieve 1 frame per second using AI lights and AI denoising. The plan is to have it running at 30fps at the same quality by the end of the year on that same hardware configuration. I'll believe it when I see it.
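Taking the quoted numbers at face value, the end-of-year target implies roughly a 30x speedup on unchanged hardware within months, which is why the skepticism seems warranted:

```python
# Sanity check on the claim above: same 2-card setup, 1 fps today,
# 30 fps target at the same quality by the end of the year.
current_fps = 1.0
target_fps = 30.0
speedup_needed = target_fps / current_fps
print(f"{speedup_needed:.0f}x speedup required from software alone")
```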
 