GART: Games and Applications using RayTracing

I thought that Nvidia's implementation of ray tracing, RTX (not DXR), uses OptiX: https://developer.nvidia.com/optix-denoiser
Pretty sure it's not, because OptiX aims for interactive updates, not the realtime rates games need.

For games they hand-crafted the denoisers, as presented here:

This is about things like using an elliptic kernel to denoise shadows from an area light, for example.
This also results in specialized denoising passes per effect, or even per light, so the cost can add up and become expensive. (I leave it to you to watch it and listen for whether they mention tensor cores :) IIRC they do not, but I'm not sure.)
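
To make that concrete, here is a minimal sketch (my own toy, not NVIDIA's actual filter) of what an elliptic shadow-denoise kernel could look like: a Gaussian footprint stretched along a chosen screen-space direction, averaging a noisy 1-sample shadow mask. All names and parameters are illustrative assumptions; a production filter would also be guided by depth and normals:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Image {
    int w, h;
    std::vector<float> px;
    float at(int x, int y) const {          // clamp-to-edge fetch
        x = std::max(0, std::min(w - 1, x));
        y = std::max(0, std::min(h - 1, y));
        return px[y * w + x];
    }
};

// dirX/dirY: unit axis the ellipse is stretched along (e.g. the penumbra
// direction); radMajor/radMinor: filter radii along/across it, in pixels.
float denoiseShadowTexel(const Image& shadow, int cx, int cy,
                         float dirX, float dirY,
                         float radMajor, float radMinor)
{
    int r = (int)std::ceil(std::max(radMajor, radMinor));
    float sum = 0.0f, wsum = 0.0f;
    for (int dy = -r; dy <= r; ++dy)
        for (int dx = -r; dx <= r; ++dx) {
            float u =  dx * dirX + dy * dirY;   // offset along the major axis
            float v = -dx * dirY + dy * dirX;   // offset across it
            float d2 = (u * u) / (radMajor * radMajor)
                     + (v * v) / (radMinor * radMinor);
            if (d2 > 1.0f) continue;            // outside the elliptic footprint
            float w = std::exp(-4.0f * d2);     // Gaussian falloff inside it
            sum  += w * shadow.at(cx + dx, cy + dy);
            wsum += w;
        }
    return sum / wsum;
}

int main() {
    Image noisy{8, 8, std::vector<float>(64)};
    for (int i = 0; i < 64; ++i)
        noisy.px[i] = (i % 2) ? 1.0f : 0.0f;    // alternating stripes as stand-in noise
    std::printf("denoised center: %f\n",
                denoiseShadowTexel(noisy, 4, 4, 1.0f, 0.0f, 4.0f, 1.5f));
}
```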

The denoising done in Quake II RTX is very different from that, as it works on the whole lighting at once. That's faster, but may miss low-frequency details.
Here I'm sure it's regular compute shader code, because it's open source.


I can't remember where the initial assumption that tensor cores would do the denoising came from (I thought the same at first). Was it misleading marketing, or just everyone getting all this new stuff wrong?
However, IMO the existence of tensor cores in gaming GPUs remains questionable. Likely the shrunken units on the GTX 1660 would be enough; no killer application has shown up yet.
But maybe DirectML will change this... (Actually, CUDA may be the only way to fully utilize the tensor cores - not sure.)

To come back to the thread topic:
Somebody here mentioned Navi has low-precision dot product instructions. If true, I would guess that's part of the new specialized function units, and the functionality should be similar to NV's tensor cores (build your matrix multiply from some dot products).
I would further guess performance is slower than NV's, but less chip area is taken, so more general-purpose performance in exchange. (The common assumption seems to be that tensor cores take much more area than RT cores?)
That's quite interesting; it seems a similar situation as with RT.
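
To illustrate what "matrix multiply from dot products" means, here's a toy in plain C++ (the int8/int32 types and 4-wide size are my assumptions, not Navi's actual ISA): each output element of a small matrix product is one 4-wide dot product, which is exactly the inner step a dot4-style instruction - or a tensor core, at the fragment level - would accelerate in hardware:

```cpp
#include <cstdint>
#include <cstdio>

// 4-wide int8 dot product with int32 accumulate - roughly the shape of a
// "dot4"-style instruction, done in one hardware step instead of a loop.
int32_t dot4_i8(const int8_t a[4], const int8_t b[4], int32_t acc) {
    for (int k = 0; k < 4; ++k) acc += int32_t(a[k]) * int32_t(b[k]);
    return acc;
}

// C = A * B for 4x4 int8 matrices: 16 dot products, one per output element.
// Tensor cores effectively compute a whole small fragment like this at once.
void matmul4x4(const int8_t A[4][4], const int8_t B[4][4], int32_t C[4][4]) {
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j) {
            const int8_t col[4] = { B[0][j], B[1][j], B[2][j], B[3][j] };
            C[i][j] = dot4_i8(A[i], col, 0);
        }
}

int main() {
    int8_t I[4][4] = {{1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,0,1}};
    int8_t M[4][4] = {{1,2,3,4},{5,6,7,8},{9,10,11,12},{13,14,15,16}};
    int32_t C[4][4];
    matmul4x4(I, M, C);                       // identity * M == M
    std::printf("C[1][2] = %d (expect 7)\n", C[1][2]);
}
```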

Edit: Damn - I need to find out how to enable spelling autocorrect in my browser again... :D
 

Yes, you're right. I wasn't aware that they are doing handcrafted denoising in current games. Maybe my initial thought that Tensor Core AI was used for denoising came from earlier Nvidia presentations about OptiX and the Turing architecture, because they talked about it like it was done that way. It will be interesting to see how future implementations of AI denoising work. Someone mentioned OptiX aims for interactive updates rather than realtime for games, but DLSS works that way in realtime games too, right?

Anyway, thanks guys for your input. I must admit I didn't dive much into the actual implementation of RTX in games, so I thought denoising worked the same way as DLSS (via Tensor Cores).
 
I think it's inevitable; we just need to wait a year or two. DirectML should be the tool they need to customize their own NN denoising, while being applicable to all GPU vendors and low-level enough to remove most of the overhead in NN libraries. I'll be watching this space to see when this finally happens.
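
To be clear about what such a customized denoiser would actually run per layer, here's a plain C++ sketch of the core building block - one 3x3 convolution with a ReLU. It's purely illustrative (single channel, placeholder weights), not DirectML API code; a trained denoiser stacks many such layers with learned weights:

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// One 3x3 convolution + ReLU over a single-channel image, clamping reads
// at the image borders.
void conv3x3_relu(const std::vector<float>& in, std::vector<float>& out,
                  int w, int h, const float k[3][3], float bias)
{
    out.assign(in.size(), 0.0f);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float acc = bias;
            for (int ky = -1; ky <= 1; ++ky)
                for (int kx = -1; kx <= 1; ++kx) {
                    int sx = std::clamp(x + kx, 0, w - 1);
                    int sy = std::clamp(y + ky, 0, h - 1);
                    acc += k[ky + 1][kx + 1] * in[sy * w + sx];
                }
            out[y * w + x] = std::max(acc, 0.0f);   // ReLU
        }
}

int main() {
    const int w = 4, h = 4;
    std::vector<float> img(w * h, 1.0f), res;
    const float blur[3][3] = {{1/9.f,1/9.f,1/9.f},
                              {1/9.f,1/9.f,1/9.f},
                              {1/9.f,1/9.f,1/9.f}};   // placeholder weights
    conv3x3_relu(img, res, w, h, blur, 0.0f);
    std::printf("res[5] = %f (expect 1.0)\n", res[5]);
}
```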
 

I agree, it's the original plan, and I don't think Nvidia will drop Tensor Cores from their GPUs...
 
Making traditional objects and tracing them is most likely the easiest choice.

Specifically, I was thinking more in terms of taking old games that have a lot of normal-mapped low-poly assets (something like Doom 3 comes to mind) and using the info from existing materials as a displacement map, since ray intersections are pretty flexible and granular (in a software render pipeline, anyway). I guess another big question would be how destructive the original high-poly asset -> normal map generation was, and whether there'd be a ton of discontinuities at the polygon edges if you tried to reconstruct a surface from the base mesh and texture offsets, or clipping artifacts caused by animation and skinning. But yeah, if one needed to re-rig and animate the assets, then you may as well just pay artists to manually refine and retopologize the meshes as well.
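
As a naive sketch of that idea (my own toy, with an assumed data layout): a tangent-space normal (nx, ny, nz) implies height gradients dh/dx = -nx/nz and dh/dy = -ny/nz, and cumulative sums turn those into a height field. A real solution would solve a Poisson problem; this row/column integration also makes the discontinuity worry easy to reproduce, since the result depends on the integration path:

```cpp
#include <cstdio>
#include <vector>

struct Normal { float x, y, z; };   // tangent-space normal, z > 0

// Integrate the gradients implied by the normals: first down the first
// column, then across each row.
std::vector<float> normalsToHeights(const std::vector<Normal>& n, int w, int h)
{
    std::vector<float> height(w * h, 0.0f);
    for (int y = 1; y < h; ++y) {
        const Normal& p = n[y * w];
        height[y * w] = height[(y - 1) * w] - p.y / p.z;   // dh/dy = -ny/nz
    }
    for (int y = 0; y < h; ++y)
        for (int x = 1; x < w; ++x) {
            const Normal& p = n[y * w + x];
            height[y * w + x] = height[y * w + x - 1] - p.x / p.z;  // dh/dx
        }
    return height;
}

int main() {
    // A constant tilted normal should integrate to a simple ramp.
    std::vector<Normal> nm(4 * 4, Normal{0.2f, 0.0f, 0.98f});
    auto hgt = normalsToHeights(nm, 4, 4);
    std::printf("h(3,0) = %f\n", hgt[3]);    // ~3 * (-0.2/0.98)
}
```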
 
KitGuru: Control GamePlay
July 22, 2019
I got to play quite a bit of the game in an early pre-release build, so today I would like to share some gameplay footage and first impressions with all of you.
https://vimeo.com/349439613


RTX ray-tracing impressions:

Before digging into the game details, we’ll go over what RTX is doing and how it impacts Control’s visuals. So far, each major RTX title has done things slightly differently, with Battlefield V focusing on reflections, Shadow of the Tomb Raider focusing on shadows and Metro Exodus opting for full global illumination. From what we can tell, Control is ray-tracing reflections and shadows, so you will notice RTX throughout the environment.

RTX can be a subtle effect at times, but based on what I got to play, Control might have the best implementation to date. This will vary from area to area, but the section of the map I had access to featured lots of glass windows and metallic surfaces, which really show off ray-tracing well, as light bounces around from various sources and casts realistic shadows. In some areas, the change to RTX lighting really helped bring out additional detail in the environment, particularly on objects scattered through the room and on walls/floors.

We have reached a point with PC gaming where resolution is no longer a limiting factor for fidelity. There have been some stumbles in getting ray-tracing to look its best, but developers appear to be catching on quickly. So far, Control looks like the best showcase for the technology to date, and as more developers get involved, we will be seeing more improvements in future games, like Call of Duty: Modern Warfare, Cyberpunk 2077 and Watch Dogs Legion.
https://www.kitguru.net/components/...sions-for-control-on-pc-with-rtx-ray-tracing/
 
Luxion Will Support NVIDIA’s RTX Ray Tracing And Denoising Acceleration In KeyShot 9
July 29, 2019

We wrote earlier about Blender’s support for RTX, which is going to include use of both the RT and Tensor cores. For myriad reasons, this move is notable, but we admit that we’re even more surprised at Luxion’s adoption of the tech. The reason is simple: Blender has supported GPUs for quite some time, whereas it’s brand-new territory for Luxion. How times can change.

It's one thing to adopt GPU rendering after having focused on the CPU for an eternity. We've seen it with many others, including Arnold, which released its first GPU-powered plugin a few months ago. But Luxion didn't just go that far; it jumped right into the deep end with RTX support.
https://techgage.com/news/luxion-keyshot-9-nvidia-rtx-support/
 
Indie game "Bright Memory" to support RTX reflections within 2 months.
This time we bring you the latest trailer of Bright Memory (Early Access) with RTX ray tracing, showing the ray-tracing effects in different in-game environments. The water and all metal objects have been given ray-tracing properties, which comprehensively strengthens the graphics quality. We ran into quite a few performance-stability problems in Unreal Engine 4, but in the end we managed to keep performance reliable while preserving the visual quality of RTX ray tracing, thanks to NVIDIA's support with testing of game performance, RTX features and hardware.

The RTX technology is still at an experimental stage in Bright Memory (Early Access). The features will be released on STEAM and GOG soon, and players who have already purchased Bright Memory can experience the latest visual effects after updating. We are learning the most advanced graphics technology in the video game industry, and we believe it will bring great improvements to our game when it is officially released. Bright Memory (RTX version) is planned for release on STEAM and GOG within 2 months, and we will make some adjustments to the price. The full version is well into development. Please look forward to it!

 
The next trend for TWIMTBP games. Look, we have rain... everywhere... in every game... indoors as well! Must show off RT reflections!

Kind of reminds me of how, for a while, all games had a plasticky shine to them in order to show off the trendy 3D tech of the time.

Sorry if I sound cynical, but it seems that's the only time RT is currently really noticeable and a potential improvement over current rendering methods, at the level of RT power we have available (outside of redone simplistic old games with low rendering loads).

Looking forward to the tech advancing beyond this as well as more performant hardware.

Regards,
SB
 
August 13, 2019
Grimmstar is an action-packed space fighter/simulator with Action RPG and Fleet Management twists. NVIDIA RTX lighting provides incredibly stunning and vibrant ray-traced visuals as you are pursued by an overwhelming force through multiple open-world star systems, fighting for humanity's survival. Command the last fleet of mankind while you defend the stranded remnants of the human race with your fully modular fighter, growing your forces along the way with the hope of eventually taking down the planet-devouring behemoth, the Grimmstar.
 
Turning Up the Lights: Interactive Path Tracing Scenes from a Short Film
August 13, 2019
One problem: real-time renderers can’t afford to trace nearly as many rays as offline film-quality renderers, and path tracing typically requires many rays per pixel. RTX games shoot a handful of rays per pixel in the milliseconds available per frame; movies take minutes or hours to shoot thousands of rays. With low ray counts, path tracing gives characteristically grainy images. Essentially, each pixel has a different slice of information about the scene’s lighting. Ray traced games use sophisticated denoisers to remove this graininess, but path tracing complex dynamic content with a small number of rays presents additional denoising challenges.
...
With path tracing, rays need to find their way to lights in the scene in order to model their illumination. It’s hard to choose the right rays for lighting in anything but the simplest scenes. Choosing the right rays is really hard in a scene with thousands of light sources. It’s really really hard in a scene with thousands of moving light sources.
...
Therefore, the researchers decided to tackle a scene with thousands of moving light sources.

They set out to render scenes from a short film that had previously only ever been rendered using offline renderers. The short film, Zero Day, shown above, was created by artist Mike Winkelmann. It holds many challenges for real-time physically-based rendering. The scenes in Zero Day are lit by 7,203 – 10,361 moving emissive triangles; there is a lot of fast-moving geometry (so the lighting changes a lot from frame to frame, which makes it hard to reuse information from previous frames); and there is a wide variety of material types.

The combination of shadows and reflections from 1,000s of fast-moving lights with shiny materials exceeded the capabilities of current real-time denoising algorithms. The team dug in and reinvented ray sampling algorithms and deep-learning image denoisers. Many ideas were tried; some worked, and some did not. In the end, they made breakthroughs in both ray sampling and denoising.

The Measure 1 scene, rendered here with direct lighting (soft shadows) from 7,203 dynamic emissive triangles using 9 rays per pixel. The researchers render this at 20 frames per second using a new light importance sampling algorithm and a prototype deep learning denoiser on a Turing RTX 2080 Ti.
...

The Measure 1 scene, path tracing direct and one bounce of indirect lighting from 7,203 moving emissive triangles using 4 paths per pixel (17 rays per pixel), denoised with a prototype deep learning denoiser. The video appears brighter than the one using only direct lighting due to the reflected light illuminating surfaces that would otherwise be in shadow.
...

Rendering two scenes from the short film interactively with both direct lighting (4 paths per pixel, 9 rays per pixel) and 1-bounce path tracing (4 paths per pixel, 17 rays per pixel). The demo sets a new standard for interactive path tracing, and further inventions are rapidly improving performance and quality.

https://news.developer.nvidia.com/t...active-path-tracing-scenes-from-a-short-film/
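
As an illustration of why low ray counts look grainy (my own toy, not from the article): each pixel is a Monte Carlo estimate whose error shrinks only as 1/sqrt(N) in the number of samples, so going from 1 to 16 rays only cuts the noise by 4x:

```cpp
#include <cmath>
#include <cstdio>
#include <initializer_list>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> u(0.0, 1.0);
    // Each "pixel" estimates an integrand with known mean 0.5 using N
    // samples ("rays"); RMS error over many pixels shows the 1/sqrt(N) law.
    for (int N : {1, 4, 16, 64, 256, 1024}) {
        const int pixels = 10000;
        double mse = 0.0;
        for (int p = 0; p < pixels; ++p) {
            double sum = 0.0;
            for (int s = 0; s < N; ++s) sum += u(rng);
            double err = sum / N - 0.5;
            mse += err * err;
        }
        std::printf("N=%5d  rms error=%.4f\n", N, std::sqrt(mse / pixels));
    }
}
```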


The team published a paper based on their initial discoveries—retargeting an offline light importance sampling rendering solution to fit real-time rendering constraints. They showed that building a 2-level bounding volume hierarchy (BVH) over the lights (a “light BVH”) was an effective way to choose light sources to trace rays to. This paper was published earlier this month at the High Performance Graphics 2019 conference in Strasbourg, France.
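
Here's a minimal sketch of that light-BVH sampling idea (the structure and importance metric are my simplifications; the paper's metric also uses orientation cones and cluster bounds): walk a binary tree of light clusters, choosing each child with probability proportional to a cheap importance estimate, and accumulate the selection pdf:

```cpp
#include <cstdio>
#include <random>
#include <vector>

struct LightNode {
    float cx, cy, cz;            // cluster centroid
    float power;                 // total emitted power in the cluster
    int left = -1, right = -1;   // child node indices, -1 at a leaf
    int lightIndex = -1;         // valid at leaves
};

// Cheap importance estimate: cluster power over squared distance to the
// shading point. A placeholder for the paper's full metric.
float importance(const LightNode& n, float px, float py, float pz) {
    float dx = n.cx - px, dy = n.cy - py, dz = n.cz - pz;
    return n.power / (dx * dx + dy * dy + dz * dz + 1e-4f);
}

// Walk from the root, picking each child with probability proportional to
// its importance; pdf accumulates the selection probability so the chosen
// light's contribution can be divided by it (standard importance sampling).
int sampleLightBVH(const std::vector<LightNode>& nodes,
                   float px, float py, float pz,
                   std::mt19937& rng, float& pdf)
{
    std::uniform_real_distribution<float> u(0.0f, 1.0f);
    pdf = 1.0f;
    int cur = 0;                 // root
    while (nodes[cur].left >= 0) {
        float wl = importance(nodes[nodes[cur].left],  px, py, pz);
        float wr = importance(nodes[nodes[cur].right], px, py, pz);
        float pl = wl / (wl + wr);
        if (u(rng) < pl) { pdf *= pl;        cur = nodes[cur].left;  }
        else             { pdf *= 1.f - pl;  cur = nodes[cur].right; }
    }
    return nodes[cur].lightIndex;
}

int main() {
    // Root with two leaf clusters: a strong nearby light and a weak far one.
    std::vector<LightNode> tree = {
        {0, 0, 0, 30.f, 1, 2, -1},
        {1, 0, 0, 20.f, -1, -1, 0},   // light 0: close, bright
        {9, 0, 0, 10.f, -1, -1, 1},   // light 1: far, dim
    };
    std::mt19937 rng(7);
    float pdf;
    int picked = sampleLightBVH(tree, 0.f, 0.f, 0.f, rng, pdf);
    std::printf("picked light %d with pdf %.3f\n", picked, pdf);
}
```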
 
So in tracking those indie RTX titles, we have several:
-Stay in the Light
-Bright Memory
-In The Black
-Grimmstar
-Enlisted

And several AAA games:
-Control
-Cyberpunk 2077
-Call of Duty: Modern Warfare
-Watch Dogs Legion
-Doom Eternal
-Wolfenstein: Youngblood
-Vampire: The Masquerade - Bloodlines 2
-Atomic Heart
-MechWarrior 5
-Assetto Corsa Competizione

Also a couple of Chinese titles:
-Sword and Fairy 7
-Justice
-Dragon Hound
 