GART: Games and Applications using RayTracing

The RX 6800 XT averages 110 FPS at 1440p with RT, so there is no CPU limitation at 4K, at least for AMD GPUs.
Moreover, SAM is all about pushing latency-sensitive data (descriptors, constants, and the like) into video memory, so if an application is doing something stupid and is PCIe-latency limited, moving that latency-limited data to video memory will help.
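For illustration, here is a minimal D3D12 sketch of that idea (my own sketch, not anything from AMD's driver: it assumes Windows, a discrete GPU with Resizable BAR/SAM enabled, a valid `device`, and omits all error handling). A CUSTOM heap with memory pool L1 gives you a CPU-visible buffer that lives in VRAM, instead of the usual system-memory UPLOAD heap the GPU would have to read across PCIe:

```cpp
// Sketch: a CPU-writable constant buffer placed directly in video memory.
// Requires linking d3d12.lib; assumes a Resizable-BAR-capable system.
#include <windows.h>
#include <d3d12.h>

ID3D12Resource* CreateCpuVisibleVramBuffer(ID3D12Device* device, UINT64 size)
{
    D3D12_HEAP_PROPERTIES props = {};
    props.Type            = D3D12_HEAP_TYPE_CUSTOM;
    props.CPUPageProperty = D3D12_CPU_PAGE_PROPERTY_WRITE_COMBINE; // CPU writes, never reads
    props.MemoryPool      = D3D12_MEMORY_POOL_L1;                  // L1 = local video memory

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width            = size;      // e.g. a 64 KB constant arena
    desc.Height           = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 1;
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

    ID3D12Resource* buf = nullptr;
    device->CreateCommittedResource(&props, D3D12_HEAP_FLAG_NONE, &desc,
                                    D3D12_RESOURCE_STATE_GENERIC_READ, nullptr,
                                    IID_PPV_ARGS(&buf));
    return buf; // Map() it once and write constants/descriptors straight into VRAM
}
```

Without Resizable BAR only a 256 MB window of VRAM is CPU-mappable, which is why SAM matters for this pattern.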

I'm talking about the slide above. They have the 6800 XT, 2080 Ti, 2080 Ti OC and 3090 all capping out at the same max FPS. That doesn't make any sense unless it's CPU limited. SAM then gives the 6800 XT a boost, but 6800 XT SAM and 6800 XT SAM + Auto OC score the same. It very much looks like that particular benchmark run is CPU limited.
 
They have the 6800 XT, 2080 Ti, 2080 Ti OC and 3090 all capping out at the same max FPS
Please watch the video.
The 6800 XT can't be CPU limited at 110 FPS in 1440p and then suddenly become CPU limited at ~80 FPS in 4K; that doesn't make any sense. A CPU cap doesn't move with resolution, so if ~80 FPS were a CPU wall it would show up at 1440p as well.
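To spell out the arithmetic with a toy frame-time model (the 110 and 80 FPS figures are from the discussion; the model itself is just illustrative):

```cpp
// Back-of-the-envelope check: frame rate ~= 1000 / max(cpu_ms, gpu_ms),
// and cpu_ms barely changes with resolution. If ~80 FPS at 4K were a CPU
// wall, the same wall would have to cap the 1440p run too.
#include <algorithm>
#include <cstdio>

int main() {
    const float cpu_wall_ms  = 1000.0f / 80.0f;  // 12.5 ms CPU time implied by an 80 FPS cap
    const float gpu_1440p_ms = 1000.0f / 110.0f; // ~9.1 ms GPU time observed at 1440p

    // With a 12.5 ms CPU wall, 1440p could never exceed 80 FPS:
    float fps_1440p = 1000.0f / std::max(cpu_wall_ms, gpu_1440p_ms);
    std::printf("1440p under an 80 FPS CPU wall: %.0f FPS (observed: 110)\n", fps_1440p);
}
```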
 
Please watch the video.
The 6800 XT can't be CPU limited at 110 FPS in 1440p and then suddenly become CPU limited at ~80 FPS in 4K; that doesn't make any sense.

Ok, these results just look weird to me. The 2080 Ti can't even beat itself when overclocked; +100/+1000 should be a noticeable gain. And the 3090 can't beat either of them. They basically have exactly the same performance profile. What would the bottleneck be?

Also, the 2080 Ti and the 3090 lose just 3 FPS going from 1440p to 4K? And the 1% lows ... increase?
 
Ok, these results just look weird to me. The 2080 Ti can't even beat itself when overclocked; +100/+1000 should be a noticeable gain. And the 3090 can't beat either of them. They basically have exactly the same performance profile. What would the bottleneck be?

Maybe they didn't actually run the benchmarks and just made up the numbers. It is a year for conspiracies, after all.
 
AMD doing well in WoW: Shadowlands RT
Faulty test; this channel is worthless at best, with trash numbers and trash results. This is grasping at straws at this point. Wait for proper benchmarks from trusted sources; it's not useful to post results from dubious sources just to prove a point.
 
Faulty test; this channel is worthless at best, with trash numbers and trash results. This is grasping at straws at this point. Wait for proper benchmarks from trusted sources; it's not useful to post results from dubious sources just to prove a point.
I'm assuming you have some evidence to support your claims other than AMD performing better than expected?
I find it curious how you become aggravated and start accusing others just because someone posts a link that doesn't seem to fit your chosen narrative.
 
I'm assuming you have some evidence to support your claims other than AMD performing better than expected?
I find it curious how you become aggravated and start accusing others just because someone posts a link that doesn't seem to fit your chosen narrative.

Something looks weird with the results. There should be a bigger hit going from 1440p to 4K with the 2080 Ti and the 3090. It also doesn't make sense that the 1% lows would actually increase going from 1440p to 4K. The 3090 really should beat the 2080 Ti as well; there's no reason for it not to, based on the specs and benchmarks from every other RT game. Something is off. I wouldn't say it's malicious, but I'd like to see another site verify it.
 
I'm assuming you have some evidence to support your claims other than AMD performing better than expected?
You know better than to post some random stuff from YouTube; any idiot with a channel (without even having the cards) can do the same. In fact, YouTube is full of these random benchmarks, and you know most of them are trash.
 
Something looks weird with the results. There should be a bigger hit going from 1440p to 4K with the 2080 Ti and the 3090. It also doesn't make sense that the 1% lows would actually increase going from 1440p to 4K. The 3090 really should beat the 2080 Ti as well; there's no reason for it not to, based on the specs and benchmarks from every other RT game. Something is off. I wouldn't say it's malicious, but I'd like to see another site verify it.
It's clear something is limiting NVIDIA's performance hard at 1440p with RT, and it could be limiting it at 4K too, but at least the RX 6800 XT's performance seems to be in the correct range based on this video, which goes through various areas of the game with RT on.
 
Hitman 3 gameplay trailer ... IOI is working with Intel to support RT effects.
December 8, 2020
 
Digital Foundry deep dive on RT in CP2077.


You have to squint really hard to see some of the differences, but I thought one shot in particular really highlighted the difference between RT and light probes + cubemaps for GI and reflections.

[attached comparison screenshot: cp2077.png]
 
Does anyone know what Cyberpunk is doing for its RT GI implementation? Is it per-pixel or something else?
Diffuse illumination (area lights / sky irradiance) seems to be per-pixel (Metro-like); otherwise it would have failed in the cases below, where rasterization probes and fake lights were not placed:
https://imgsli.com/MzQ0Njc
https://imgsli.com/MzQ0NjY
https://imgsli.com/MzQ0Njg
https://imgsli.com/MzI1NTc
https://imgsli.com/MzQ0NzE
https://imgsli.com/MzQ0NzA
https://imgsli.com/MzQ0NzI
https://imgsli.com/MzQ0NzM
Also, the diffuse AO shadows (part of the diffuse illumination system, I suppose) capture even the smallest geometry details (trash cans, garbage, etc.), so it's definitely not just probes.
The ultimate real-time GI system would combine these two methods: Metro and Quake II RTX style tracing for the first bounce, to capture small objects and high-frequency indirect lighting, followed by probes updated in real time with RT for the second and subsequent bounces, where you don't need all the small indirect shadows, since the second-bounce lighting will be super diffuse anyway.
Though Minecraft RTX already does something similar; instead of probes it uses a per-vertex irradiance cache, the same thing as probes, just even lower frequency.
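A toy CPU-side sketch of that combination, where every type and helper is a hypothetical stand-in rather than any engine's actual API: trace the first bounce per pixel so small occluders still contribute, then take the remaining bounces from a low-frequency probe grid:

```cpp
// Hybrid diffuse GI sketch: per-pixel first bounce + probe-grid tail.
#include <cstdio>

struct Vec3 { float x, y, z; };
static Vec3 add(Vec3 a, Vec3 b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }

// Stubbed stand-ins for the real renderer pieces (all hypothetical):
static Vec3 traceFirstBounce(Vec3 origin, Vec3 dir) {
    return add(origin, dir);              // pretend the ray hits at unit distance
}
static Vec3 directLightAt(Vec3 /*p*/)   { return { 0.8f, 0.7f, 0.6f };  } // lights + sky
static Vec3 sampleProbeGrid(Vec3 /*p*/) { return { 0.1f, 0.1f, 0.15f }; } // RT-updated probes

// Trace the first bounce explicitly so small occluders (trash cans, garbage)
// still cast indirect shadows; take the second and later bounces from probes,
// which is fine because second-bounce lighting is already very diffuse.
static Vec3 diffuseGI(Vec3 pixelPos, Vec3 bounceDir) {
    Vec3 hit = traceFirstBounce(pixelPos, bounceDir);
    return add(directLightAt(hit), sampleProbeGrid(hit));
}

int main() {
    Vec3 gi = diffuseGI({ 0, 0, 0 }, { 0, 1, 0 });
    std::printf("GI: %.2f %.2f %.2f\n", gi.x, gi.y, gi.z);
}
```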
 