The Complete List of PC Ray Traced Titles with Classification

Various areas are too dark. They would be receiving much more light with full GI as seen in Cyberpunk.
Post-processing always makes this very hard to actually judge, I find. Ever since I spent time talking to Epic about it, I've realised that the exposure settings of the game camera are at times even more important than the number of real GI bounces. Exposure can make a scene brighter or darker in recesses regardless of how many real light bounces should or could be occurring. It messes up the whole perception of it.
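To illustrate (a minimal sketch of my own, not any engine's actual code): auto-exposure scales the whole frame after lighting, so exposure compensation can lift a dark recess without a single extra bounce being traced.

#include <cmath>

// Sketch only: scene-referred radiance from the lighter (however many GI
// bounces it used) gets multiplied by an exposure factor before tonemapping.
// Exposure compensation in EV is a power of two, so +1 EV doubles the
// apparent brightness of a recess with zero extra light transport.
float applyExposure(float radiance, float exposureCompensationEV)
{
    const float exposure = std::exp2(exposureCompensationEV);
    const float exposed = radiance * exposure;
    // Simple Reinhard curve standing in for whatever tonemapper a game uses.
    return exposed / (1.0f + exposed);
}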
 
Added UE5 titles with Software Lumen-only implementations (compute-based global illumination + compute-based reflections + screen-space global illumination elements); a rough console-variable sketch follows the list.

Fort Solis (Software Lumen)
Jusant (Software Lumen)
RoboCop Rogue City (Software Lumen)
Immortals of Aveum (Software Lumen)
Ark Survival Ascended (Software Lumen)
Lords of the Fallen (Software Lumen)
Ant Ausventure (Software Lumen)
The Talos Principle 2 (Software Lumen)
Brothers: A Tale of Two Sons Remake (Software Lumen)
Slender The Arrival (Software Lumen)
Stray Souls (Software Lumen)
Quantum Error (Software Lumen)
Mortal Online 2 (Software Lumen)
Cepheus Protocol (Software Lumen)
Satisfactory (Software Lumen)
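For context, Software Lumen roughly corresponds to a handful of UE5 console variables. The names and values below are from memory and may differ between engine versions, so treat them as a sketch rather than a recipe:

r.DynamicGlobalIlluminationMethod 1       ; 1 = Lumen GI (compute based)
r.ReflectionMethod 1                      ; 1 = Lumen reflections (compute based)
r.Lumen.HardwareRayTracing 0              ; 0 = Software Lumen, no RT hardware used
r.Lumen.ScreenProbeGather.ScreenTraces 1  ; keep the screen-space trace element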
 
Black Myth won't be using RTXGI; it will use ReSTIR GI + RTXDI + reflections, as well as caustics, just like the UE5 game Desordre.
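For anyone wondering what ReSTIR actually buys: the core trick is weighted reservoir resampling over many candidate light samples per pixel, with reuse across neighbouring pixels and frames. A minimal, engine-agnostic sketch of the reservoir update (my own illustration, not the RTXDI/ReSTIR GI code):

#include <random>

// Weighted reservoir sampling: stream in candidates with resampling weights
// and keep one survivor with probability proportional to its weight. ReSTIR
// builds on this by merging reservoirs across neighbouring pixels and frames.
struct Reservoir {
    int   sampleIndex = -1;   // surviving candidate
    float weightSum   = 0.0f; // total weight streamed so far
    int   numSeen     = 0;    // number of candidates considered

    void update(int candidateIndex, float weight, std::mt19937& rng)
    {
        weightSum += weight;
        ++numSeen;
        std::uniform_real_distribution<float> u01(0.0f, 1.0f);
        if (u01(rng) * weightSum < weight)
            sampleIndex = candidateIndex; // keep this candidate
    }
};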
Awesome. I was expecting a longer time between drinks for something as ambitious as CP77's path/ray tracing, and then we got AW2 and now this. I had been a bit worried that with the AI boom Nvidia might neglect gaming a bit, but it seems my worries may have been premature, at least.
 
Hopefully there are still senior people inside Nvidia who are passionate about graphics. Not everyone gets sweaty at the thought of AI.
 
What is left after real time Pathtracing in games? The next step is AI and creating a frame without the graphics pipeline.
 
I think a closer step before that is creating certain elements in the frame with generative AI. We've seen some great AI-generated videos as of late; the models are very competent right now and basically create videos from scratch.

I think we can add high-fidelity physics, particles (fire, smoke, sparks, debris), cloth simulation, fluid simulation, hair and fur simulation, etc. to the frame using gen AI. That would be a great step up in visuals for games. And most of these things don't really affect the environment that much, so they can be easily added without involving the CPU.

Certain weather simulations are also possible, like high-quality heavy rain, snow, fog, and scenes with lots of airborne particles (leaves, sand, dirt).

Algorithms such as RTX HDR and RTX Vibrance also show potential for adding advanced post-processing to each frame, like per-object motion blur, light shafts, subsurface scattering on all applicable elements, etc.

Gen AI for audio is also full of great possibilities for increasing the richness of the sound experience in games.
 
Real time path tracing isn’t a solved problem though. Long road ahead on that. Animation needs tons of work still too. “AI” will likely play a role in all of it.
I think it is. Cyberpunk and Alan Wake 2 are running just fine. The path-traced GI solution in Alan Wake 2 costs around 20% performance on the "low" setting on Lovelace.
 
Not sure what you mean. Cyberpunk and Alan Wake are nowhere near “endgame” for path tracing.
 
What is especially missing in Cyberpunk? I think path tracing works fantastically in Cyberpunk. During the day it goes to the "end of the world", and at night it is transformative:
Day: https://imgsli.com/MjQ3OTEz
Night: https://imgsli.com/MjQ3OTEy

There are lots of areas for improvement that will require better hardware and more efficient APIs:

Geometric detail
Materials quality
Elimination of temporal and denoising artifacts
Higher resolution reflections
Volumetrics quality

We can claim real-time path tracing nirvana when games start looking like offline renders. Cyberpunk does not.
 
In multiple spots, Alan Wake 2's PT looks better than this CGI trailer.

Of course there is still room for improvement in games, but CGI quality can be achieved.
 
NVIDIA has released an update for RTXGI:
The latest addition, Neural Radiance Cache (NRC), is an AI-driven RTX algorithm to handle indirect lighting in fully dynamic scenes, without the need to bake static lighting for geometry and materials beforehand.

Adding flexibility for developers, NVIDIA is introducing Spatial Hash Radiance Cache (SHaRC), which offers similar benefits as NRC but without using a neural network, and with compatibility on any DirectX or Vulkan ray tracing-capable GPU.
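To make the SHaRC idea a bit more concrete, here is a deliberately simplified spatial-hash radiance cache (my own sketch, not NVIDIA's implementation): world positions are quantized to a grid, hashed into cells, each cell keeps a running average of incoming radiance, and a later lookup can stand in for tracing further bounces.

#include <cmath>
#include <cstdint>
#include <unordered_map>

// Simplified spatial-hash radiance cache (illustration only, not SHaRC).
struct CacheCell { float r = 0, g = 0, b = 0; uint32_t samples = 0; };

class RadianceCache {
public:
    explicit RadianceCache(float cellSize) : cellSize_(cellSize) {}

    // Fold a new radiance sample into the cell containing (x, y, z).
    void accumulate(float x, float y, float z, float r, float g, float b)
    {
        CacheCell& c = cells_[keyFor(x, y, z)];
        ++c.samples;
        // Incremental mean keeps the stored value bounded over time.
        c.r += (r - c.r) / c.samples;
        c.g += (g - c.g) / c.samples;
        c.b += (b - c.b) / c.samples;
    }

    // Query the cached radiance, or nullptr if nothing was stored here yet.
    const CacheCell* lookup(float x, float y, float z) const
    {
        auto it = cells_.find(keyFor(x, y, z));
        return it == cells_.end() ? nullptr : &it->second;
    }

private:
    uint64_t keyFor(float x, float y, float z) const
    {
        const int64_t ix = (int64_t)std::floor(x / cellSize_);
        const int64_t iy = (int64_t)std::floor(y / cellSize_);
        const int64_t iz = (int64_t)std::floor(z / cellSize_);
        // Cheap 3D hash; a real cache would also fold in the surface normal.
        return (uint64_t)(ix * 73856093LL) ^ (uint64_t)(iy * 19349663LL) ^ (uint64_t)(iz * 83492791LL);
    }

    float cellSize_;
    std::unordered_map<uint64_t, CacheCell> cells_;
};

NRC swaps the hash table for a small neural network trained online against traced results, which is the difference the quoted text is pointing at.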
 
What is left after real time Pathtracing in games? The next step is AI and creating a frame without the graphics pipeline.
Tracing in games still has a lot of limitations (a minimal sketch of the typical hybrid setup follows this list):

Usually the main view or first hit is rasterized.
The traced world is simplified.
Tracing depth is very shallow.

In many cases transparent surfaces are not traced, and thus there are no refractions with IoR.
Tracing of volumetrics, scattering, etc.
Caustics.

No motion blur / DoF.
Handling all different primitive/material/volumetric types in a way that they affect each other.
Etc.
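A skeleton of that typical hybrid setup (my own illustration, not any shipping engine): primary visibility comes from the raster G-buffer, secondary rays are traced against a simplified proxy scene, and the bounce count is capped very low; the missing effects listed above are usually separate raster or post passes.

#include <array>

// Hybrid-frame skeleton: rasterized first hit, shallow traced bounces against
// a simplified scene. Placeholder stubs are included so the sketch compiles.
struct GBufferSample { std::array<float, 3> position{}, normal{}; int materialId = 0; };
struct Radiance { float r = 0, g = 0, b = 0; };

constexpr int kMaxBounces = 2; // "very shallow" tracing depth

GBufferSample rasterizePrimaryHit(int /*x*/, int /*y*/) { return {}; }                                // first hit is rasterized
Radiance traceAgainstProxyScene(const GBufferSample&, int /*bounce*/) { return {0.1f, 0.1f, 0.1f}; }  // simplified traced world

Radiance shadePixel(int x, int y)
{
    const GBufferSample primary = rasterizePrimaryHit(x, y);
    Radiance total;
    for (int bounce = 0; bounce < kMaxBounces; ++bounce) {
        const Radiance bounceLight = traceAgainstProxyScene(primary, bounce);
        total.r += bounceLight.r;
        total.g += bounceLight.g;
        total.b += bounceLight.b;
    }
    // Transparency/refraction, volumetrics, caustics, motion blur and DoF are
    // typically handled by separate raster or post passes, not by these rays.
    return total;
}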
 
I know it's a linear game, but if the trailer doesn't lie, Marvel 1943 will graphically be a generation above Cyberpunk in terms of lighting, material quality, geometric density and VFX (3D smoke).
 