AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

How much of this is down to whether or not the RT is inline?

But both GPUs should support DXR 1.0 and DXR 1.1. I find it very weird. AMD has a shadows library they've built on top of DXR 1.1 (I assume) for Godfall, and then they have to patch in Nvidia support later. Cyberpunk was supposedly DXR without any custom Nvidia proprietary libraries, but it won't support AMD without a patch. Why? The only thing I can think of is that performance is so divergent across different RT workloads that they're worried about getting flamed for bad performance on one platform vs. the other. So basically the solution will be one RT implementation for Nvidia and one RT implementation for AMD. I suppose we'll be able to tell if there are subtle visual differences. Or maybe they'll just look pretty much the same but operate differently under the hood. Benchmarking RT will be very interesting going forward.
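For what it's worth, whether a card exposes DXR 1.0 or 1.1 is a queryable device capability, separate from whatever a given game ships support for. A minimal sketch of the check, assuming an already-created ID3D12Device and omitting error handling:

Code:
#include <d3d12.h>

// Sketch: ask the D3D12 runtime which DXR tier the device exposes.
// TIER_1_0 covers DXR 1.0 (TraceRay via raytracing pipeline state objects);
// TIER_1_1 adds the DXR 1.1 features, including inline raytracing (RayQuery).
D3D12_RAYTRACING_TIER QueryRaytracingTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &options5, sizeof(options5))))
        return options5.RaytracingTier; // NOT_SUPPORTED, TIER_1_0, or TIER_1_1
    return D3D12_RAYTRACING_TIER_NOT_SUPPORTED;
}

So "supports DXR 1.0 and 1.1" only means both tiers report as available; it says nothing about how well a particular workload runs on either vendor.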
 
AMD was hesitant from the start about showcasing RX 6xxx ray tracing. Perhaps their drivers and/or acceleration mechanism have issues and just need more work. It could also be AMD making the call regarding RT in Cyberpunk.
 
In the case of Cyberpunk, even the PS5 and Series S/X versions will run in backwards-compatibility mode and get the actual next-gen patch sometime next year, so it didn't come as a shock to me that ray tracing won't be supported on PC RDNA2 out of the gate either.

I think it's the right call, whether from AMD or CDPR. First impressions are important, and there's a risk that even AMD performing better in newer benchmarks wouldn't break the common perception, formed by all the previous benchmarks, of AMD having bad RT performance.
 
I don’t think any reviewer has stepped up to do even the most basic exploration of the performance hit of individual RT effects in Control, so I wouldn’t hold my breath. This is the sorta thing we would expect from Anandtech if they still did GPU reviews.

Wait, Anandtech isn't doing GPU reviews anymore?
 
But both GPUs should support DXR 1.0 and DXR 1.1. I find it very weird. AMD has a shadows library they've built on top of DXR 1.1 (I assume) for Godfall, and then they have to patch in Nvidia support later. Cyberpunk was supposedly DXR without any custom Nvidia proprietary libraries, but it won't support AMD without a patch. Why? The only thing I can think of is that performance is so divergent across different RT workloads that they're worried about getting flamed for bad performance on one platform vs. the other. So basically the solution will be one RT implementation for Nvidia and one RT implementation for AMD. I suppose we'll be able to tell if there are subtle visual differences. Or maybe they'll just look pretty much the same but operate differently under the hood. Benchmarking RT will be very interesting going forward.

I was thinking not in terms of support but, as you say, severe performance divergence. Is there a connection between inline RT and cache locality for properly leveraging RDNA2's Infinity Cache?
 
Not really surprised. So much for the "DXR supports anything" mantra.

I now wonder what this means for ray tracing performance as reviewed. If support is so divergent that you can't even run things properly despite following the spec, then... well, the performance implications between the two are obvious.

Either way, this also bodes very, very poorly for future cross-platform support. "You have to significantly change the entire ray tracing codepath to get performant cross-vendor support, or even for it to work at all" isn't a confidence booster for use of the feature. Not that I'm totally surprised; the CoD developers' account of just how finicky the BVH behavior was for shadows alone in Modern Warfare was a trip. I wouldn't be surprised if a very different traversal and collision-detection hardware setup performed significantly differently.

Well, I can be hopeful it'll mean everyone will just ditch triangles and use SDFs.
 
I would expect any API, architecture, and driver idiosyncrasies to get more thoroughly unpacked if/when we see CAD renderers implement DXR, as well as general hobbyist poking and prodding with hand-rolled RT benchmarks. Right now there's only a small handful of AAA games (which use RT pretty sparsely), not to mention that this is happening amidst a console generation transition, two GPU launches, plus the covid-related disruptions. Growing pains were inevitable here.
 
I would expect any API, architecture, and driver idiosyncrasies to get more thoroughly unpacked if/when we see CAD renderers implement DXR, as well as general hobbyist poking and prodding with hand-rolled RT benchmarks. Right now there's only a small handful of AAA games (which use RT pretty sparsely), not to mention that this is happening amidst a console generation transition, two GPU launches, plus the covid-related disruptions. Growing pains were inevitable here.

Yeah, some seem to expect the hardware RT in the consoles to just go unused for the rest of the gen.
 
This is the sorta thing we would expect from Anandtech if they still did GPU reviews.

I wish we had more synthetic benchmark numbers. Since Anandtech stopped, only Heise.de is doing synthetic benchmarks.

OK, you guys almost made me believe this.
They're still doing GPU reviews!
https://www.anandtech.com/comments/...rgb-optical-mechanical-keyboard-review/730442

Ryan Smith said:
I have the cards. Just not the time. AMD and NVIDIA have been nothing but spectacular in making sure I have all the hardware I need (which ironically, is kind of part of the problem).
 
Even AMD said before release that any DXR title should run. That's not the case, so I guess the drivers just aren't there yet. Nothing more.

No reason to say the DXR standard suddenly means nothing...

It's entirely possible Nvidia has been allowing out-of-spec behavior without throwing any warnings. This would hardly be the first time for either vendor to do something like that.
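If that's the suspicion, the usual way to smoke it out is the D3D12 debug layer plus GPU-based validation, which flags API misuse even when the driver silently accepts it. A minimal sketch, purely illustrative; it has to run before device creation and is far too slow for shipping builds:

Code:
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Sketch: enable the D3D12 debug layer and GPU-based validation so
// out-of-spec usage gets flagged by the runtime rather than silently
// tolerated by a permissive driver. Call before D3D12CreateDevice.
void EnableD3D12Validation()
{
    ComPtr<ID3D12Debug> debug;
    if (SUCCEEDED(D3D12GetDebugInterface(IID_PPV_ARGS(&debug))))
    {
        debug->EnableDebugLayer();

        ComPtr<ID3D12Debug1> debug1;
        if (SUCCEEDED(debug.As(&debug1)))
            debug1->SetEnableGPUBasedValidation(TRUE);
    }
}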

And considering the utter disparity between Watch Dogs: Legion's RT performance and Miles Morales' on the same console, with fantastically similar settings, min-spec targets, and use of RT, it appears the spec itself, while intended to run as a shared API, gives very little idea of good performance behavior between the two architectures.

Further, it appears incredibly likely that AMD's non-fixed-function traversal gives opportunities the spec does not, and it's also highly likely at least some of this behavior can't be replicated at all on Nvidia. To me, these latter two points just add up to what Cyberpunk indicates as well: developers may need to rebuild large parts of their RT pipelines to get good performance on both. That's a big task for a very expensive feature with rather limited uses. One of the biggest limits to visual quality is already dev time; the question I'd ask of hardware RT is simply, "is it worth it?"
 
Even AMD said before release that any DXR title should run. That's not the case, so I guess the drivers just aren't there yet. Nothing more.

No reason to say the DXR standard suddenly means nothing...
CDPR didn't say anything other than that CP2077's RT won't work on AMD cards yet, and that AMD support will come soon. RT works in other DXR titles, so that means CDPR is locking to Nvidia only, not that RT support is missing from the drivers.
 
Updated system requirements. Looks like the 6800/6800 XT/3070/2080 Ti would be enough for "Ultra" RT at 1440p.

TBH I think this is very "optimistic" about HW performance. We will have to wait 3 weeks to find out. One thing is for sure: I won't play the game until I can run it at ultra.

Edit: Image was too big.

Edit 2: Looks like it will be very optimized for Nvidia...

Curious they don't even mention the ambient sky lighting. That is the one I'm most interested in seeing. Has there been any clarification on what framerates that table is targeting?
 
How is the 6800 binned? I assume the 6800 is just a cut-down (binned) 6800 XT/6900 XT die with 60 of its 80 CUs enabled.
But in order to disable 20 CUs (10 DCUs), you'll be left with an asymmetric CU-per-SE configuration, since 10 (DCUs) isn't an exact multiple of 4 (SEs).

I remember CUs per SE needing to be symmetrical back in the GCN days; is this no longer a limitation with RDNA2?
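The arithmetic behind the question, assuming the commonly reported Navi 21 layout (80 CUs = 40 dual compute units spread across 4 shader engines):

Code:
// Sketch of the binning arithmetic; the 80 CU / 4 SE Navi 21 layout is
// the commonly reported configuration, used here as an assumption.
constexpr int shader_engines = 4;
constexpr int full_dcus = 80 / 2;  // 40 DCUs on the full die
constexpr int cut_dcus  = 60 / 2;  // 30 DCUs on the RX 6800

// 40 / 4 = 10 DCUs per SE on the full die, but 30 / 4 = 7.5, so the
// 10 disabled DCUs can't be spread evenly: some SEs keep 8, others 7.
static_assert(full_dcus % shader_engines == 0, "full die divides evenly");
static_assert(cut_dcus % shader_engines != 0, "cut die cannot be symmetric");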
 
At this point I'm surprised they didn't recommend an IGP for playing at 720p.
Indeed, the CPU recommendations seem off: https://www.cpubenchmark.net/compare/Intel-i7-4790-vs-AMD-Ryzen-5-3600/2226vs3481 (or they have a multi-threading problem and only care about single-thread performance).
But at least the GPU pairings are similar in performance:
https://www.videocardbenchmark.net/compare/Radeon-RX-590-vs-GeForce-GTX-1060/4025vs3548
https://www.videocardbenchmark.net/compare/GeForce-RTX-2060-vs-Radeon-RX-5700-XT/4037vs4111
 