AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

I’m looking forward to a DF deep dive on Dirt 5. It seems the RT shadows are all but unnoticeable, à la Tomb Raider. Definitely not a showcase for AMD RT.
Wait what?
I thought the point of RT is to bring graphics closer to realism, not flashy lights, fancy mirrors, and signs yelling "THIS IS RTRT!11".
I don't see how the fact that we could do pretty convincing shadows before takes anything away from actually correct RT shadows. Most of the things done with RT can be "faked" pretty well anyway.
 
They have limited ray tracing, actually. One could argue they barely have it, compared to what we already have/had when that comment was made.

Spider-Man: Miles Morales has some of the best ray-traced visuals out there, on consoles, with hardware vastly inferior to a 6800, not even talking about a 6800 XT. Not saying things couldn't look even better, but "barely have it" doesn't seem to add up with the evidence.
 
Wait what?
I thought the point of RT is to bring graphics closer to realism, not flashy lights, fancy mirrors, and signs yelling "THIS IS RTRT!11".
I don't see how the fact that we could do pretty convincing shadows before takes anything away from actually correct RT shadows. Most of the things done with RT can be "faked" pretty well anyway.
I don't think I'm alone in preferring the flashy lights, fancy mirrors, etc. that will be present in Cyberpunk instead of a highly optimized, visually compromised version.
It comes down to preference, and you will always have the option of turning off RT and reverting back to what you prefer.
 
Spider-Man: Miles Morales has some of the best ray-traced visuals out there, on consoles.

Like you say, on consoles. Since consoles are not equipped with Turing hardware, that's valid. But some were comparing between AMD and NV, which doesn't really apply to consoles (in an AMD dGPU discussion).
You'd have to see a potential Ampere (in PS5) version of the game, and then we can discuss that game.

"barely have it" doesn't seem to add up with the evidence.

There are a lot of compromises which I personally don't think fit with what ray tracing is supposed to be doing, like reflections within reflections in windows, for example. But yeah, it's ray tracing. Even the One X and PS4 Pro have it in Crysis.
 
FWIW, I don't think Infinity Cache is so great (performance per watt is not impressive). It's falling apart at 4K. The unknown is whether most rendering becomes more Infinity Cache friendly as time goes by, but nothing out there right now indicates that things will get better.
If you happened to watch Scott's interview with HotHardware, he said the goal of IC is not just performance; it was a tradeoff between die area, performance, and power.
He specifically said they would have needed a wider bus to get the same bandwidth for more performance, and the power needed by a wider bus and more memory chips means a higher TBP. He also added that the memory controllers + PHY would occupy a significant footprint on the chip.
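That tradeoff is easy to sanity-check with a back-of-envelope model (all numbers below are illustrative assumptions, not AMD's official figures): effective bandwidth is roughly the hit-rate-weighted blend of on-die cache bandwidth and external memory bandwidth.

```python
# Back-of-envelope sketch (illustrative numbers, not AMD's figures):
# a large on-die cache lets a narrow bus behave like a much wider one.
def effective_bandwidth(hit_rate, cache_bw_gbs, mem_bw_gbs):
    """Hit-rate-weighted blend of cache and external memory bandwidth."""
    return hit_rate * cache_bw_gbs + (1.0 - hit_rate) * mem_bw_gbs

narrow_bus = 512.0   # 256-bit GDDR6 @ 16 Gbps
cache_bw = 1900.0    # assumed peak Infinity Cache bandwidth
wide_bus = 768.0     # hypothetical 384-bit bus: more chips, more power

# Even a ~58% hit rate makes the narrow bus + cache deliver more
# effective bandwidth than the wider, hotter bus would on its own.
print(effective_bandwidth(0.58, cache_bw, narrow_bus))  # ~1317 GB/s
```

The same model also explains the 4K complaint: bigger working sets lower the hit rate, pulling the effective number back toward the raw 512 GB/s of the narrow bus.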
A downclocked N22/N23 in mobile form would be very efficient, going by the attached efficiency chart.
[attachment: upload_2020-11-21_16-31-19.png]
And according to a banned member, Navi 2x is getting a lot of interest from laptop OEMs for its efficiency, which is what Scott also mentioned.
They asked some questions, which he dodged, regarding a cut-down IC for low-power form factors, but it seems obvious.


One additional point from that call was that Pro variants will come and, per Linux commits, they will carry a 2048-bit HBM interface.
 
Like you say, on consoles. Since consoles are not equipped with Turing hardware, that's valid. But some were comparing between AMD and NV, which doesn't really apply to consoles (in an AMD dGPU discussion).
You'd have to see a potential Ampere (in PS5) version of the game, and then we can discuss that game.



There are a lot of compromises which I personally don't think fit with what ray tracing is supposed to be doing, like reflections within reflections in windows, for example. But yeah, it's ray tracing. Even the One X and PS4 Pro have it in Crysis.

No, sorry, this is a strawman. You said that consoles barely have ray tracing, and I showed you an example of great visuals running on console hardware. There are optimizations and compromises? Yes, and without optimizations and compromises like DLSS, some games (not even reaching the same visuals as Spider-Man) will run like crap at 4K even on the latest Nvidia hardware. What matters is the visual result. But you seem to consider ray tracing OK only if it runs on Nvidia hardware, which is completely silly.
 
I don't think I'm alone in preferring the flashy lights, fancy mirrors, etc. that will be present in Cyberpunk instead of a highly optimized, visually compromised version.
It comes down to preference, and you will always have the option of turning off RT and reverting back to what you prefer.
You clearly didn't get the point at all, or chose to ignore it.
I believe RT is supposed to be great for bringing graphics closer to realism, which includes doing correct shadows too. It can include those fancy mirrors and flashy lights too, but it's not only about them.
You're downplaying the RT shadows in Dirt 5 because convincing enough shadows could be "faked" without RT too, but you choose to ignore that most of the other things RT is used for can be faked pretty convincingly as well.
 
You said that consoles barely have ray tracing.

High-end RDNA2 GPUs have about 1/4th of the RT performance of Ampere. At half the power, console RT performance isn't going to be better than, say, a 6800 XT's.
And that's why you see the results you do on them.

Anyway, this isn't the topic for console comparisons to begin with, aside from that being impossible since Ampere isn't in consoles.

I believe RT is supposed to be great for bringing graphics closer to realism

While I agree with that, games should also look as amazing as possible. Realism alone isn't going to provide that, I think. RT does both, kinda.

you seem to consider ray tracing OK only if it runs on Nvidia hardware, which is completely silly.

Absolutely not; it was consoles you were talking about. I'm sure a 6800 XT/6900 XT will fare much better against NV's products in all regards than the consoles do. Remember, it's RT + settings that count, not just ray tracing. Lowering everything to low/medium to get some RT doesn't sound too exciting, since normal rasterization takes such a hit.
If, say, the 6800 XT is comparable to the 3070 in RT performance, or even the 3060 Ti, hell, even somewhat lower, that's a whole different story from 2060 RT perf.

It's called hybrid rendering for a reason.
 
If they were 'portable', then why is everything falling apart before our very eyes? Why do some games show graphical problems on another vendor? Why can performance be wildly different between vendors from game to game? Moreover, why are we now devolving into a state where games are locking ray tracing features behind specific vendors, all while using the very same APIs?!

The most likely and charitable explanation is that one vendor had more time to refine and improve their implementation of DXR while the other had considerably less time.
 
High-end RDNA2 GPUs have about 1/4th of the RT performance of Ampere. At half the power, console RT performance isn't going to be better than, say, a 6800 XT's.
And that's why you see the results you do on them.

Anyway, this isn't the topic for console comparisons to begin with, aside from that being impossible since Ampere isn't in consoles.
How exactly did you come to that conclusion?
 
There you go, assuming that what's written for NVidia is how it will continue to be. Games are not synthetics, and code written for one GPU can easily be an order of magnitude off in performance on another architecture.

This is how you spent quite a long time saying that consoles would not have ray tracing. And then they did. You make far too many assumptions.

I've shown earlier today how mesh shading requires substantially different optimisation on NVidia versus XSX.


Why talk about assumptions when his sources are devs? Do you just straight up not believe him?
 
High-end RDNA2 GPUs have about 1/4th of the RT performance of Ampere. At half the power, console RT performance isn't going to be better than, say, a 6800 XT's.
And that's why you see the results you do on them.

When you link a reliable source for this information, we can discuss it. What I see is an unfounded argument. I am certain that peak RT performance is higher on Nvidia hardware; I am also quite sure 1/4 is a figure you pulled from... you know where.
 
The most likely and charitable explanation is that one vendor had more time to refine and improve their implementation of DXR while the other had considerably less time.

They seem to forget that this is AMD's first generation of RT hardware, while NV is over two years ahead on their second-gen RT hardware.

you pulled from... you know where.

Just read the discussions in this thread. But I know what some people's problem is; perhaps relocate to the section of gaming you like the most.
 
Dirt 5? Both the RX 6800 and 6800 XT lose relatively less performance from RT than NVIDIA parts.
There are barely any RT shadows at all in Dirt 5; it uses RT sparingly and selectively, yet the hit to AMD hardware is similar to that on NVIDIA hardware. AMD GPUs are just starting off from a higher fps position. This alone speaks volumes about AMD's RT capabilities.
 
If they were 'portable' then why is everything falling apart before our very own eyes ?
That's certainly an API issue, for sure.
AMD had just 3 years to write a driver for it, after all; an unimaginable thing to accomplish in such a short amount of time :yep2:

we could do pretty convincing shadows before, too.
Please do pretty convincing area-light shadows from multiple light sources, like those visible at the very beginning of SoTR, with "fakes".
You would certainly be a rock star at the next SIGGRAPH if you manage to accomplish this.

I’m looking forward to a DF deep dive on Dirt 5
It would require profiling just to see whether there is any tracing at all, or whether the RT cores are just sitting idle 99% of the time while the heaviest workload is simply building the BLAS for the car model.
WoT managed to do this type of shadows (though they look more diffuse) for many models in software, so the ray tracing itself likely takes a minimal amount of time, and you don't even need complex spatiotemporal variance-guided filtering, since the shadows are hard anyway and simple accumulation would do the trick.
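The "simple accumulation would do the trick" point can be sketched as an exponential moving average over noisy 1-sample-per-pixel shadow results. This is a generic illustration, not code from any shipping game:

```python
import random

def accumulate(history, sample, alpha=0.1):
    """Exponential moving average: blend the new noisy sample into history."""
    return (1.0 - alpha) * history + alpha * sample

# One pixel whose true visibility is 0.9 (90% of shadow rays unoccluded),
# sampled with a single noisy ray per frame. No variance-guided filtering:
# the running average alone converges toward the true value.
random.seed(0)
h = 0.0
for _ in range(300):
    s = 1.0 if random.random() < 0.9 else 0.0
    h = accumulate(h, s)
print(round(h, 2))  # settles near the true visibility of ~0.9
```

A higher alpha converges faster but keeps more noise; real denoisers also reset or clamp the history on disocclusion, which this sketch omits.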
 
I thought the general consensus was that AMD disabled one entire SE.
That makes sense from a technical point of view, however it seems counterintuitive for getting the most yield out of the binned chip.
Defects are more likely to appear over a bigger area (4 SEs) than a smaller one (1 SE).

To justify disabling one SE, there would have to be 3 or more DCUs with defects on a single SE while the other three SEs are intact.
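The area intuition can be made concrete with the standard Poisson defect model: the probability that a region of area A is defect-free is exp(-D*A) for defect density D. The numbers below are hypothetical:

```python
import math

def defect_free(defect_density_per_cm2, area_cm2):
    """Poisson yield model: P(zero defects) over the given area."""
    return math.exp(-defect_density_per_cm2 * area_cm2)

D = 0.1        # hypothetical defects per cm^2
se_area = 0.8  # hypothetical area of one shader engine, cm^2

# The full 4-SE array is defect-free noticeably less often than one SE,
# so a random defect is far more likely to land somewhere in 4 SEs.
print(defect_free(D, 4 * se_area))  # ~0.726
print(defect_free(D, 1 * se_area))  # ~0.923
```

This is exactly why disabling a whole SE purely for salvage looks wasteful: a single defective DCU can usually be fused off on its own, without sacrificing the rest of the SE.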
 
Wait what?
I thought the point of RT is to bring graphics closer to realism, not flashy lights, fancy mirrors, and signs yelling "THIS IS RTRT!11".
I don't see how the fact that we could do pretty convincing shadows before takes anything away from actually correct RT shadows. Most of the things done with RT can be "faked" pretty well anyway.

Whoa there, cowboy. The point was that AMD hasn't demonstrated anything convincing so far. It has nothing to do with shadows vs. reflections.
 