AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

So is the resolution for RT defined independently from game resolution?

Sort of. Depends on the game and how they distribute the rays. Battlefield V has a ray budget that it divides between areas of the screen where it thinks it needs them, but it still must be tied to resolution so you can get close to 1 sample per pixel where needed.
 
No, of course not. Big Navi is some 60% faster in pixel fill and more than twice as fast in some triangle-throughput tests, less so in others.

Thank you for the information!
Is it possible that you could share the information with us in a table? Maybe you could write an article about it? ;)
 
@Flappy Pannus Meanwhile here they are advocating for DLSS to be always enabled when possible, ray tracing or not. I just don't see the bias. People can complain about which titles they benchmarked etc, but I think calling them biased towards AMD is a step too far.



Like I said, I don't think people should be calling them biased, there's no point. The review is just poor, and their reasoning is faulty.

I mean this tweet in particular - they think DLSS is a thing you should always turn on if you have Nvidia - but in the review, they don't mention it at all. So it's a feature you should always enable because it delivers great performance, but... they don't mention it in a review where they actually benchmark five games that have it? What happened to going for maximum performance, then? Hell, Control and Death Stranding look better with DLSS than native; even if there were no performance improvement, you're arguably getting a better experience on Nvidia in image quality alone.

I get being annoyed with insufferable fanboys, but it seems like they're tripping over themselves trying to play defense at this point.

And DLSS absolutely does 'fix' the performance impact of RTX if you're already at the refresh-rate limit of your display; not all of us have 144 Hz+ monitors. The benefit (and, some would say, detriment) of PC gaming is the options you have. You might want medium ray tracing with performance DLSS at a lower base resolution if you feel the RT effects are transformative enough; DLSS gives you that option without sacrificing resolution too much. Again, it doesn't mean Nvidia wins because of it if their price/performance isn't up to snuff in other titles (the 6800 non-XT looks fantastic, and it may end up being the card I get), and DLSS + RTX is still too sparse and the future too unknown to disregard the RX series based on it. But you at least have to address it.
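For context, DLSS 2.x modes trade internal render resolution for performance; a rough sketch of the publicly documented per-axis scale factors (Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5, Ultra Performance ≈ 0.333; treat the exact values here as approximate):

```python
# Approximate DLSS 2.x internal render resolutions per quality mode.
# Scale factors are per axis; numbers are the commonly documented ones
# and should be treated as approximate, not authoritative.

SCALES = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_resolution(width, height, mode):
    """Resolution the game actually renders at before DLSS upscaling."""
    s = SCALES[mode]
    return round(width * s), round(height * s)

# Performance mode at 4K renders internally at 1080p:
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "quality"))      # (2560, 1440)
```

So "performance DLSS at a lower base resolution" means the GPU is shading roughly a quarter of the output pixels, which is where the RT headroom comes from.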

Here's a good way to cover RTX/DLSS in a game btw:

 
Sort of. Depends on the game and how they distribute the rays. Battlefield V has a ray budget that it divides between areas of the screen where it thinks it needs them, but it still must be tied to resolution so you can get close to 1 sample per pixel where needed.

Yeah, but I meant: can I have RT at 1440p and play at 2160p? Like, if the game allows this combination. I guess we'll see when it gets released.
 
Yeah, but I meant: can I have RT at 1440p and play at 2160p? Like, if the game allows this combination. I guess we'll see when it gets released.

Well, typically if you lower ray tracing from "high" to "medium," for example, it will lower the number of rays, so essentially ray tracing at a "lower resolution."
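A quick sketch of why a lower RT quality preset behaves like RT at a lower resolution (the preset mapping is illustrative, not from any specific game): tracing an effect at half the output resolution per axis cuts the ray count by 4x.

```python
# Sketch: lowering RT quality by tracing effects at a fraction of the
# output resolution. The "high" = full-res, "medium" = half-res mapping
# is a hypothetical example, not any particular game's presets.

def rt_ray_count(width, height, scale):
    """Primary RT rays per frame when the effect runs at `scale` of
    output resolution per axis (1 ray per traced pixel)."""
    return int(width * scale) * int(height * scale)

native = rt_ray_count(3840, 2160, 1.0)   # "high": trace at full 4K
medium = rt_ray_count(3840, 2160, 0.5)   # "medium": trace at 1080p
print(native // medium)                  # 4x fewer rays
```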
 
Maybe there are driver bugs or something along those lines preventing ray tracing from working? Or perhaps it's a Minecraft/Control type of performance situation? Those games are barely playable at 4K without DLSS. In other news, Godfall has RT enabled only on AMD hardware at the moment.

https://www.tomshardware.com/news/the-amd-radeon-rx-6800-xt-and-rx-6800-review

I fully expect Nvidia to be well ahead of AMD in this game. That said, my point is that there is not a single reason why they listed an RTX 2060 from Nvidia for RT gaming but not at least the 6800 XT.
 
Games like Cyberpunk 2077 will be great for performance reviews that build an RT matrix and test RT effects separately and in combination, especially once AMD's driver/accelerator issues are resolved.

I don’t think any reviewer has stepped up to do even the most basic exploration of the performance hit of individual RT effects in Control so I wouldn’t hold my breath. This is the sorta thing we would expect from Anandtech if they still did GPU reviews.
 
So it turns out CD Projekt Red re-confirmed that the game will only support NVIDIA RT hardware at launch.

I would love to see a technical explanation of what's going on. Godfall only works on AMD for now, and Cyberpunk will only work on Nvidia at launch, yet both are supposedly running on hardware-agnostic DXR. DXR does not support extensions, as far as I know, which means the performance of RT on each hardware platform is so divergent that they need to develop two rendering paths to optimize for each. That does not bode well for RT.

Edit: Also, what's the point of a hardware agnostic API if you have to fully develop two different render paths anyway. At that point, why not just allow each vendor to provide their own RT API and get the best performance out of each.
 
I don’t think any reviewer has stepped up to do even the most basic exploration of the performance hit of individual RT effects in Control so I wouldn’t hold my breath. This is the sorta thing we would expect from Anandtech if they still did GPU reviews.

I'd even like to see that for screen space effects. I would not be shocked if AMD had a lower hit from screen space reflections than Nvidia because of Infinity Cache. RT will be really interesting. We might see big performance differences per setting (reflections, GI, AO, shadows) on AMD vs Nvidia.
 
I would love to see a technical explanation of what's going on. Godfall only works on AMD for now, and Cyberpunk will only work on Nvidia at launch, yet both are supposedly running on hardware-agnostic DXR. DXR does not support extensions, as far as I know, which means the performance of RT on each hardware platform is so divergent that they need to develop two rendering paths to optimize for each. That does not bode well for RT.

Edit: Also, what's the point of a hardware agnostic API if you have to fully develop two different render paths anyway. At that point, why not just allow each vendor to provide their own RT API and get the best performance out of each.
Could be as simple as a whitelist, with excuses being made about QA. In fact, that's the most likely spin.
 
Edit: Also, what's the point of a hardware agnostic API if you have to fully develop two different render paths anyway. At that point, why not just allow each vendor to provide their own RT API and get the best performance out of each.
It's likely more about optimizing each of the RT implementations to better suit the target device? Like reducing the number of rays or the resolution of reflections on AMD, for example.

Why assume it has anything to do with the api? Could be as simple as having insufficient time to do QC on AMD hardware that launched 2 seconds ago.
Sorry, I meant about the comments made a lot lately that if it's DXR then it must support either vendor. Not a limitation of the API itself but rather a false assumption that any game would automatically work for both. We now have 2 high profile games that are vendor locked.
 
I'd even like to see that for screen space effects. I would not be shocked if AMD had a lower hit from screen space reflections than Nvidia because of Infinity Cache. RT will be really interesting. We might see big performance differences per setting (reflections, GI, AO, shadows) on AMD vs Nvidia.

Yeah that would be really interesting. I was playing with Nvidia’s profiler in Doom and in pixel shaders near the end of the frame the hit rates in L2 are very high. I assume those late shaders are running screen space effects like DOF and SSR. Given the already high L2 hit rates I’m not sure a big L3 cache will help much. But there’s certainly the potential for drastic performance differences depending on whether the workload is IC friendly.
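A back-of-envelope way to frame the cache question: does the working set of a screen-space pass fit in cache at all? The buffer formats and cache sizes below are assumptions for illustration (a few MB of L2 on an Ampere-class GPU vs 128 MB Infinity Cache on Navi 21), not measured data.

```python
# Back-of-envelope working-set estimate for a screen-space pass (e.g. SSR).
# Buffer formats are typical but assumed; cache sizes: ~6 MB L2 (assumed
# Ampere-class figure) vs 128 MB Infinity Cache (Navi 21).

MB = 1024 * 1024

def buffer_mb(width, height, bytes_per_pixel):
    """Size of one full-screen buffer in MiB."""
    return width * height * bytes_per_pixel / MB

# SSR at 1440p typically reads color, depth, and normals:
color = buffer_mb(2560, 1440, 8)     # RGBA16F color buffer
depth = buffer_mb(2560, 1440, 4)     # 32-bit depth
normals = buffer_mb(2560, 1440, 4)   # e.g. RG16F packed normals
working_set = color + depth + normals

print(f"~{working_set:.1f} MiB")     # well beyond a few MB of L2,
print(working_set < 128)             # but inside 128 MB Infinity Cache
```

Which would support the intuition above: a pass whose working set misses a conventional L2 entirely could still be largely cache-resident on Navi 21, while a pass with already-high L2 hit rates gains little.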
 
Sorry, I meant about the comments made a lot lately that if it's DXR then it must support either vendor. Not a limitation of the API itself but rather a false assumption that any game would automatically work for both. We now have 2 high profile games that are vendor locked.

It works on Nvidia hardware without problems. AMD should have released DXR drivers in the last twelve months. I wouldn't blame developers for blacklisting AMD as long as their drivers aren't on par with Nvidia's. On the other hand, blacklisting Nvidia is just political, because Turing has been out for two years...
 