> And that result is still in parity vs the 2080 Ti, Nvidia's first-gen RT vs AMD's first-gen RT.

The 6900 XT is on par with the 2080 Ti here because it has way better raster performance, obviously, not because of the nonsense "first gen RT" comparison.
I responded that there are other games that imply the real performance difference is neither just 30% worse nor 300% worse. Someone came back two weeks later and posted a 74% result.
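As an aside, "X% faster" and "Y% slower" are not the same number, which is where a lot of the 30%-vs-300% confusion tends to come from. A quick Python sketch; the FPS figures are made up purely for illustration, not taken from any review:

Code:
# "Faster" and "slower" percentages are not symmetric.
# FPS numbers below are made up for illustration, not from any benchmark.
nvidia_fps = 87.0
amd_fps = 50.0

faster = (nvidia_fps / amd_fps - 1) * 100   # lead of the faster card
slower = (1 - amd_fps / nvidia_fps) * 100   # deficit of the slower card

print(f"{faster:.0f}% faster")   # -> 74% faster
print(f"{slower:.0f}% slower")   # -> 43% slower (not 74%)

So a card can be "74% faster" while the other is only ~43% slower; both framings describe the same gap.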
The church scene shows the same performance delta as Control and Cyberpunk. Something you thought would be more of an outlier than the norm: https://forum.beyond3d.com/posts/2203009/
I came back because I saw the numbers and remembered your post.
> unfair criticism towards RE8 which was being massacred for its better than expected performance

It was never "massacred" because of that.
> In Metro Exodus, the 3090 is 67% faster than the 6900 XT @1440p Ultra RT, according to the HUB test. I expect the difference to grow beyond 80% @4K.

The 3090 has a much bigger lead in RT over the 3080 than the 6900 XT has over the 6800 XT (and the 3090 has a significant lead at 4K even in rasterization!). So it's pretty easy to find huge leads even in relatively better-performing titles. In that context, you should look at the 6800 XT in comparison to the 3080/3070. And there we see that the 6800 XT is close to the 3070/2080 Ti at 1440p, which is notably better performance than seen in Cyberpunk/Control.
Who said Metro was bucking the RT performance delta trend again?
> The 3090 has a much bigger lead in RT over the 3080 than the 6900 XT has over the 6800 XT (and the 3090 has a significant lead at 4K even in rasterization!). So it's pretty easy to find huge leads even in relatively better-performing titles. In that context, you should look at the 6800 XT in comparison to the 3080/3070. And there we see that the 6800 XT is close to the 3070/2080 Ti at 1440p, which is notably better performance than seen in Cyberpunk/Control.

Benchmarking RT is highly scene-dependent; what Eurogamer tests is different from what HUB or other outlets test, which yields different results.
> The best part was at the end when he offered that Nvidia was "blown out of the water" by FSR at Computex and that DLSS is dead. Lol what? Did he just get back from the future where FSR has already been tested?

Once more, they are SEVERELY AMD-biased, downplaying RT and DLSS to the best of their abilities. Even now, they just can't admit that their choice of the 5700 XT over the RTX 2070 Super was a grave mistake.
> RT puts more stress on the CPU too, right? And the more complex the scene, the more CPU power is required?

Yes. It's a bit more complex than that though - the complexity of the scene is less impactful than the number of dynamic objects in it.
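A rough way to see why dynamic objects dominate the CPU cost: static geometry keeps its acceleration structure across frames, while every dynamic object forces per-frame BVH work. Here is a toy cost model in Python; the function and all constants are invented for illustration, and real costs vary by engine and driver:

Code:
# Toy per-frame CPU cost model for RT acceleration-structure updates.
# All constants are invented for illustration; real costs vary by engine/driver.

def rt_cpu_cost_ms(num_static, num_dynamic,
                   tlas_cost_per_instance=0.001,  # TLAS rebuild touches every instance
                   blas_refit_cost=0.02):         # only dynamic objects need a BLAS refit
    tlas = (num_static + num_dynamic) * tlas_cost_per_instance
    blas = num_dynamic * blas_refit_cost
    return tlas + blas

# A big but mostly static scene is cheaper than a small, highly dynamic one:
print(rt_cpu_cost_ms(num_static=10_000, num_dynamic=50))    # ~11 ms
print(rt_cpu_cost_ms(num_static=1_000, num_dynamic=2_000))  # ~43 ms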
> I like HWUB but that video wasn't their best work.

IMO they have excellent data collection, good-to-mediocre analysis, and the most cringeworthy editorialization I've seen at this (high-production-value) YouTube tier.
> Benchmarking RT is highly scene-dependent; what Eurogamer tests is different from what HUB or other outlets test, which yields different results.

That can be true, and it can also be the case that on average Metro Exodus is less reliant on RT than some other titles, leading to better average performance on AMD cards. And the "gap" between AMD and Nvidia depends on which product SKU you are comparing, since performance scaling on AMD and Nvidia is different.
Generally speaking, the more RT you use in any scene, the bigger the difference between AMD and NVIDIA. This remains true irrespective of the game or the RT implementation.
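That falls straight out of a simple two-term frame-time model: if a frame is raster work plus RT work, and only the RT part runs much slower on one vendor, the overall gap grows with the RT share of the frame. A Python sketch; the 2x RT slowdown is an assumed illustrative figure, not a measurement:

Code:
# Toy frame-time model: total = raster + RT, and only the RT part differs
# between vendors. The 2.0x RT slowdown is assumed for illustration.

def relative_gap(rt_share, rt_slowdown=2.0):
    # rt_share: fraction of the faster card's frame time spent on RT work.
    fast = (1 - rt_share) + rt_share                  # baseline, normalized to 1.0
    slow = (1 - rt_share) + rt_share * rt_slowdown    # same frame on the slower card
    return slow / fast - 1                            # relative deficit

for share in (0.1, 0.3, 0.5, 0.8):
    print(f"RT share {share:.0%}: {relative_gap(share):.0%} slower")
# RT share 10%: 10% slower ... RT share 80%: 80% slower

With a 2x slowdown the gap simply equals the RT share, which is why light RT implementations barely separate the vendors while RT-heavy scenes do.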
The 67% figure seems like a lot in isolation, but we have Cyberpunk benchmarks showing the 3090 87% faster at 1440p (https://www.kitguru.net/components/...x-3080-ti-review-ft-gigabyte-inno3d-palit/21/). So if the most RT-heavy scenes in Metro are still less demanding than the most RT-heavy scenes in Cyberpunk, then Metro can still be an "outlier" without doing anything to disprove the basic relationship you describe.
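Plugging two different RT shares into the toy relative_gap() model from a few posts up reproduces exactly this pattern; the shares here are assumptions for illustration, not measured numbers:

Code:
# Reusing relative_gap() from the earlier sketch (rt_slowdown=2.0, so gap == share).
# The RT shares below are assumed for illustration, not measured.
print(f"{relative_gap(0.67):.0%}")  # a lighter, Metro-like RT share    -> 67% gap
print(f"{relative_gap(0.87):.0%}")  # a heavier, Cyberpunk-like share   -> 87% gap

Both numbers come out of the same relationship; the only thing that changed is how much of the frame is RT-bound.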
> Enhanced Edition runs surprisingly well with all the additional RT stuff when comparing against the non-enhanced edition with its more limited RT implementation.

I dunno, it seems to run about the same in my benchmarks, maybe slightly slower. I'd say it shows the progress that was made in applying RT to games over the last couple of years - being smarter in how you use the RT h/w. The fact that they've dropped non-RT h/w support is not necessarily relevant here.