DirectX Ray-Tracing [DXR]

PICA PICA does its reflection ray tracing at half res.

I'm sure many games will do RTAO at a quarter-res, just like SSAO.
 
It's because reflections are a single ray; AO requires multiple. Doing it with fewer rays per pixel at more pixels leaves more raw data for the post filtering to work with. If they can spend the memory on a larger buffer, they should even supersample the thing. As shifty said, with ray tracing you mostly pay for the rays; how you store their results is mostly just about having the memory for it.
 
Resolution is still very important in ray tracing. You have to trace rays for every pixel you want to shade, so more pixels = more rays. That cost should be reduced as much as possible. It benefits the performance of the filtering as well.
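To put rough numbers on that (my own back-of-the-envelope sketch; reading "half res" as half in each dimension is an assumption, it could also mean half the pixels):

```cpp
#include <cstdio>

int main() {
    // Total rays per frame = pixels traced * rays per pixel.
    // Assumption: "half res" = 1/4 the pixels, "quarter res" = 1/16.
    const long long fullRes     = 1920LL * 1080LL;     // 2,073,600 pixels at 1080p
    const long long reflections = (fullRes / 4) * 1;   // 1 ray/pixel  -> ~518k rays
    const long long ao          = (fullRes / 16) * 4;  // 4 rays/pixel -> ~518k rays
    std::printf("reflection rays/frame: %lld\n", reflections);
    std::printf("AO rays/frame:         %lld\n", ao);
    return 0;
}
```

Under that reading, half-res reflections at 1 ray/pixel and quarter-res AO at 4 rays/pixel land on about the same ray budget, which is presumably the point.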
 
Otoy's Unity talk:


Skip to 26:45 to watch the magic: unbiased path tracing with multiple bounces at 1080p, 50spp, 1 second, running on two (undisclosed) GPUs. Basically a generation ahead of the RTX demos.
Really surprised he thinks they might be able to reach 30fps real-time rendering, without visible noise, by the end of the year or early next year (I assume again with those two next-gen GPUs, but he's not clear). I would have expected full path tracing to be much later than that.
Still, considering that is an internal build (focused on offline rather than real-time rendering) using next-gen dual GPUs with early drivers (multi-GPU support and optimisation would still be weak at this stage), it's not bad that it is even achieving 1-second renders.
 
It's a neat demo but not sure what you mean by "a generation ahead" here. The demo is running 2 orders of magnitude slower than any of the DXR demos even assuming those GPUs are Titan V's and not prototypes of something newer. Indeed it is getting significantly fewer rays/second than any of the DXR demos, although that is of course to be expected to some extent with path tracing.

Not trying to downplay path tracing demos, but these sorts of demos running that far from real time are nothing new really. Very useful for the target market (production), but the perf/quality tradeoff doesn't make sense for games and won't for a long time, even with DXR.
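Rough numbers behind that, for anyone curious (my own sketch; the segments-per-path figure is a pure guess):

```cpp
#include <cstdio>

int main() {
    // Octane demo as described in the talk: 1080p, 50 spp, 1 s/frame.
    const double pixels   = 1920.0 * 1080.0;
    const double spp      = 50.0;
    const double segments = 3.0;   // assumed average ray segments per path (a guess)

    const double samplesPerSec = pixels * spp;             // ~104M samples/s
    const double raysPerSec    = samplesPerSec * segments; // ~311M rays/s under that guess

    // A DXR demo at 30-60 fps spends ~16-33 ms per frame, so a
    // 1000 ms frame is roughly 30-60x slower on frame time alone.
    std::printf("samples/s: %.0f\n", samplesPerSec);
    std::printf("rays/s (x%.0f segments/path): %.0f\n", segments, raysPerSec);
    return 0;
}
```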
 
Brace yourselves...

Another Metro: Exodus ray-tracing video! :D

This time they cared enough to make a side by side comparison.
 
Metro's overall style and assets are too "dirty" for a proper RT showcase. I bet the first wave of RT titles will feature an overload of sterile/polished surfaces and directional light sources to "stick it" in your face with all the reflections and shadows.

"Superhot RT" would actually be the perfect halo-type of retail GPU bundle release. :p
 

Just listened to the start of the Otoy talk, with
"It's basically a physics engine for light. There's [sic] no shortcuts. It just renders reality as it should be."
So if we modelled the two-slit experiment we'd get...? :devilish: ;-) :p

Listening to the rest of it now.
FWIW: Remembered there's some early info on the light map bakery here:

Interesting to see a mention of Imagination/PowerVR's Wizard (running at 2Watts) at 18:25, which is amusing given the description, in the preceding minute or so, of the multi-monster-GPU systems being employed.
 

Consider this, re: the "generation ahead" question:

The RTX demos:
- Mostly rasterization with some ray-tracing.
- Only a subset of GI effects.
- Still fakes multiple light bounces with AO.
- Only a handful of samples per pixel (reflections in PICA PICA are done at half res).

Octane 4:
- 100% path tracing.
- True global illumination.
- Unbiased, actually calculates several bounces.
- 50spp at full 1080p.

Limit Octane to the uses and quality of the RTX demos and it could be running at 60fps on current hardware (rough numbers below).

Also consider that Brigade is something they've been working on for years so naturally it's far more optimized than standard ray/path tracing techniques.
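For what it's worth, here's that scaling claim as a back-of-the-envelope sketch (assuming cost scales linearly with pixel count and spp, and ignoring fixed per-frame costs and what 1 spp does to quality):

```cpp
#include <cstdio>

int main() {
    // Assumption: frame time scales linearly with pixels and spp.
    const double baseMs   = 1000.0;  // Octane demo: 1080p, 50 spp, 1 s/frame
    const double sppCut   = 50.0;    // 50 spp -> 1 spp
    const double resCut   = 4.0;     // half res in each dimension = 1/4 the pixels
    const double scaledMs = baseMs / (sppCut * resCut);
    std::printf("scaled frame time: %.1f ms (~%.0f fps)\n",
                scaledMs, 1000.0 / scaledMs);  // 5.0 ms, ~200 fps in theory
    return 0;
}
```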

AFAIK path tracing can't simulate the two-slit experiment.
 
I think Andrew knows what the PICA PICA DXR demo is doing to get the frame rate it did ;)

PICA PICA has a pure path-tracing mode for testing purposes (ground truth), so we know how much slower path tracing is versus the hybrid approach PICA PICA uses, and the visual tradeoff it makes to get that rendering speed.

Also, as Simon pointed out, the idea that a path tracer is physically correct or unbiased is pure PR. All renderers (outside a university physics department) choose a variety of simplifications to meet the demands of their users. A path tracer goes for fairly high quality for most materials in real-world scenes; it chooses to throw away various things that usually don't matter.

The slit experiment is one example where you could see this, but in fact every real material is wrong and biased due to the assumption of instantaneous energy transfer in almost every renderer. One of the common sayings is that unbiased renderers being "energy conserving" is a good thing. However, in reality surfaces aren't energy conserving in an instant: many materials absorb and re-emit energy over time (the reason black things get hotter than white things), and that isn't modelled in any path tracer I know of, since it's mostly apparent outside the visible spectrum. But if you rendered in IR wavelengths or chose a fluorescent material, it would look totally wrong in a path tracer.

Or to put it simply: it is all smoke and mirrors even in offline path tracers... demos like PICA PICA choose a variety of techniques to keep the framerate high. Having RT allows different strategies with different quality/time tradeoffs for both realtime and offline renderers, but it doesn't magically make it physically correct.
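For reference, what a typical path tracer actually solves is the steady-state rendering equation; there is no time variable and each wavelength is handled independently, which is exactly why delayed re-emission and fluorescence fall outside the model:

```latex
L_o(\mathbf{x}, \omega_o) = L_e(\mathbf{x}, \omega_o)
  + \int_{\Omega} f_r(\mathbf{x}, \omega_i, \omega_o)\,
    L_i(\mathbf{x}, \omega_i)\,(\omega_i \cdot \mathbf{n})\,\mathrm{d}\omega_i
```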
 
Sure, no renderer matches reality 100%, but Octane gets pretty close. The point is that while on the surface it runs much slower than the DXR demos, the reality is that it's also doing FAR more. I think that by applying some optimizations like reduced resolution, fewer samples per pixel and fewer light bounces, you could get performance comparable to the DXR demos while still being quite ahead of them, not only in quality but in rendering features.

Could be wrong though. I guess we'll see by the end of the year.
 

I think if they applied that temporal neural-net AI denoise filter they would be way ahead of anything else here.

Look at the Brigade engine from 4 years ago.
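Otoy hasn't said how their filter works, so purely as illustration: the temporal half of these denoisers boils down to blending each noisy frame into a history buffer (real ones add motion reprojection, history clamping and a spatial or neural pass on top). A minimal sketch:

```cpp
#include <cstddef>
#include <vector>

// Minimal temporal accumulation, the first stage of most real-time
// denoisers. Not Otoy's actual filter -- just the generic idea.
void temporalAccumulate(std::vector<float>& history,       // accumulated result
                        const std::vector<float>& current, // new noisy frame
                        float alpha = 0.1f)                // weight of new frame
{
    for (std::size_t i = 0; i < history.size(); ++i)
        history[i] = (1.0f - alpha) * history[i] + alpha * current[i];
}
```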
 
Apparently that Brigade demo was running on 80 GPUs at the time. They've now fully integrated Brigade into Octane proper. Supposedly the priority now is to optimize it to get it running at real-time speeds.
 