AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

The 3DMark technical guide says “This test uses features from DirectX Raytracing Tier 1.1 to create a realistic ray-traced depth of field effect.”

DXR 1.1 supports both inline raytracing and the dynamic shader-based approach, but the guide doesn’t explicitly state which one the test uses. Same for Minecraft.
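For anyone unsure what that distinction means in practice, here is a minimal host-side sketch of the two DXR dispatch paths (my own illustration, assuming a Windows SDK with DXR support; the pipeline and command-list objects are placeholders the caller would have created):

```cpp
#include <d3d12.h>

// DXR 1.0-style "dynamic" path: a raytracing state object (raygen/hit/miss
// shaders) plus shader tables, launched with DispatchRays.
void DispatchDynamicPath(ID3D12GraphicsCommandList4* cmdList,
                         ID3D12StateObject* rtPipeline,
                         const D3D12_DISPATCH_RAYS_DESC& rays)
{
    cmdList->SetPipelineState1(rtPipeline); // state object with its shader tables
    cmdList->DispatchRays(&rays);
}

// DXR 1.1 "inline" path: no state object or shader tables; an ordinary compute
// (or pixel) shader uses RayQuery<> in HLSL against the same acceleration structure.
void DispatchInlinePath(ID3D12GraphicsCommandList* cmdList,
                        ID3D12PipelineState* computePso,
                        UINT x, UINT y, UINT z)
{
    cmdList->SetPipelineState(computePso);
    cmdList->Dispatch(x, y, z);
}
```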

Forget 3DMark, this benchmark is useless. You don't know how many intersections are used, nor how and which denoising techniques are used. This is a totally useless benchmark. It's crap...

This benchmark tells you no story...
 
The visual inconsistencies and performance problems just show that ray tracing APIs were hardly ever portable between different hardware vendors ...

Seeing as how DXR-enabled games are seemingly relying on vendor-specific HW/driver behaviour, it's an overall plus that they dodged a bullet here by not implementing their competing vendor's ray tracing extension on Vulkan ...
 
The visual inconsistencies and performance problems just show that ray tracing APIs were hardly ever portable between different hardware vendors ...

Seeing as how DXR-enabled games are seemingly relying on vendor-specific HW/driver behaviour, it's an overall plus that they dodged a bullet here by not implementing their competing vendor's ray tracing extension on Vulkan ...

The DirectX ray tracing APIs are really simple, so inherently very portable. It’s basically buildBVH() and castRays(). All the interesting stuff happens in drivers and hardware.

Whatever is happening with individual game support, it’s not Microsoft’s fault.
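To make that concrete, here is a rough sketch (my illustration, not from the post) of how the "buildBVH() and castRays()" view maps onto the actual D3D12 entry points; resource and descriptor setup is omitted, and the desc structs are assumed to be filled in by the caller:

```cpp
#include <d3d12.h>

void BuildAndTrace(ID3D12GraphicsCommandList4* cmdList,
                   const D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_DESC& blasDesc,
                   const D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_DESC& tlasDesc,
                   const D3D12_DISPATCH_RAYS_DESC& raysDesc)
{
    // "buildBVH()": the app only supplies geometry and instances; the BVH layout
    // and build algorithm are entirely the vendor's implementation detail.
    cmdList->BuildRaytracingAccelerationStructure(&blasDesc, 0, nullptr);

    // The TLAS build reads the BLAS, so a UAV barrier is needed in between.
    D3D12_RESOURCE_BARRIER uav = {};
    uav.Type = D3D12_RESOURCE_BARRIER_TYPE_UAV;
    uav.UAV.pResource = nullptr; // barrier on all UAV accesses
    cmdList->ResourceBarrier(1, &uav);
    cmdList->BuildRaytracingAccelerationStructure(&tlasDesc, 0, nullptr);

    // "castRays()": traversal and intersection happen in the driver/hardware;
    // the API only defines what the shaders observe.
    cmdList->DispatchRays(&raysDesc);
}
```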
 
The DirectX ray tracing APIs are really simple, so inherently very portable. It’s basically buildBVH() and castRays(). All the interesting stuff happens in drivers and hardware.

Whatever is happening with individual game support, it’s not Microsoft’s fault.

If they were 'portable', then why is everything falling apart before our very own eyes? Why do some games show graphical problems on another vendor? Why can performance be wildly different between games on different vendors? Moreover, why are we now devolving into a state where different games using the very same APIs are locking ray tracing features behind specific vendors?!

I can only offer two assumptions so far. What is 'simple' supposed to mean in your case? Because it sure doesn't look to be the case from their driver's implementation perspective. Are developers relying on a specific driver behaviour only available from a single vendor? Even DXR has "undefined behaviour" or other possible edge cases in its own spec that can be implementation-dependent on specific HW/drivers ...
 
Every discussion here is worthless when we don't have the basics to understand where the limits of each architecture are.

I'll be honest, I gave up on synthetics being meaningful a long time ago.

The only really useful synthetic right now, in my opinion, is where we can see NVidia's tile-deferred pixel shading. It's a beautiful thing and I struggle to see how AMD or Intel can be competitive over the long run without doing the same.

FWIW, I don't think Infinity Cache is so great (performance per watt is not impressive). It's falling apart at 4K. The unknown is whether most rendering becomes more Infinity Cache friendly as time goes by, but nothing out there right now indicates that things will get better.

In my opinion, if you aren't doing what NVidia is doing with tile-deferred pixel shading, you're doomed. Shade smarter not harder. Relying upon Infinity Cache in this race is not smart.
 
From now on, though, NVidia is the minority platform.

Ray tracing enabled ones, maybe. But NV ain't stupid; they're playing in the biggest market, the PC gaming market, and want to capture as much as possible there. Just like Sony does in the console space.
It's no different than the consoles having quite different hardware in terms of SSD, IO, audio and even controls.

I'll be honest, I gave up on synthetics being meaningful a long time ago.

I had this view in an Apple thread, where it didn't get too well received. I agree with your post though :)

Ultra settings and ultra ray-traced reflections...

Wow, it's been a while since I played BFV (or any BF); it still looks so nice with the RT.
 
If they were 'portable', then why is everything falling apart before our very own eyes? Why do some games show graphical problems on another vendor? Why can performance be wildly different between games on different vendors? Moreover, why are we now devolving into a state where different games using the very same APIs are locking ray tracing features behind specific vendors?!

I can only offer two assumptions so far. What is 'simple' supposed to mean in your case? Because it sure doesn't look to be the case from their driver's implementation perspective. Are developers relying on a specific driver behaviour only available from a single vendor? Even DXR has "undefined behaviour" or other possible edge cases in its own spec that can be implementation-dependent on specific HW/drivers ...

Ubisoft released Watch Dogs on PS5/Xbox Series/nVidia with ray tracing and there aren't any problems. Couldn't it be just a driver problem on AMD's side?
 
Ubisoft released Watch Dogs on PS5/Xbox Series/nVidia with ray tracing and there aren't any problems. Couldn't it be just a driver problem on AMD's side?

PS5/XSX have different APIs compared to PCs. There's nearly zero driver work involved for console vendors so they can't exactly be compared to AMD ...

How do you isolate a "portability issue" from a "driver problem"? Have you considered that API design can be the source of the driver problems?
 
If they were 'portable', then why is everything falling apart before our very own eyes?

APIs are just interfaces. They don’t guarantee performance or correctness. Those are determined by Nvidia’s and AMD’s implementations. All of your questions should be directed there, not at the API.

A good portable API provides simple and sensible data structures and interfaces. It should obviously be opinionated (that's the entire reason it exists) but unbiased. DXR seems to check all those boxes.

Are developers relying on a specific driver behaviour only available from a single vendor?

Not sure what you mean. Developers interact with the DXR API, not drivers.

Even DXR has "undefined behaviour" or other possible edge cases in its own spec that can be implementation-dependent on specific HW/drivers ...

Yep, just like every software library that ever existed: if you do dumb things like accessing arrays out of bounds, then stuff will break. It’s not a graphics or DXR or IHV-specific thing.
 
Minecraft got a DXR 1.1 update and the 3DMark ray tracing feature test is DXR 1.1 only. In both cases Ampere is twice as fast.
Yes - AMD RT is just slower in like-for-like scenarios. It is somewhat frustrating to read the assumptions here that this is somehow purposeful sabotage or something along those lines. AMD RT hardware is less refined and accelerates less than the NV solution, plain and simple. Hence why the performance diverges more the more RT there is, and the more incoherent the rays are. I cannot stress enough that this is not sabotage, but just the way things are, from what I have heard from devs. This is not even a bad thing necessarily, as the DXR spec allows for growth and adding more acceleration over time. It also means we will get games targeting AMD's slower RT which will do neat things with the lower, more coherent ray budget. I find that interesting.
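As a side note on what "incoherent rays" means here, this is a toy sketch of my own (not from the post): neighbouring camera rays point in almost the same direction and walk the same BVH nodes, while random diffuse-bounce rays scatter across the scene, which is where traversal throughput and memory locality get stressed:

```cpp
#include <cmath>
#include <random>

struct Vec3 { float x, y, z; };

// Coherent: primary rays for adjacent pixels differ only slightly in direction,
// so a wavefront of them tends to visit the same BVH nodes and triangles.
Vec3 PrimaryRayDir(int px, int py, int width, int height, float tanHalfFov)
{
    float u = (2.0f * (px + 0.5f) / width  - 1.0f) * tanHalfFov;
    float v = (1.0f - 2.0f * (py + 0.5f) / height) * tanHalfFov;
    float len = std::sqrt(u * u + v * v + 1.0f);
    return { u / len, v / len, -1.0f / len };
}

// Incoherent: a diffuse bounce picks a random direction on the hemisphere around
// the surface normal, so adjacent pixels trace wildly different rays and the
// traversal hardware loses cache/memory locality.
Vec3 RandomHemisphereDir(const Vec3& n, std::mt19937& rng)
{
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);
    float z = u01(rng);                      // cos(theta), uniform in [0,1)
    float r = std::sqrt(1.0f - z * z);
    float phi = 6.2831853f * u01(rng);
    Vec3 d { r * std::cos(phi), r * std::sin(phi), z };
    // Flip into the hemisphere around n (crude, but enough for illustration).
    float dot = d.x * n.x + d.y * n.y + d.z * n.z;
    if (dot < 0.0f) { d.x = -d.x; d.y = -d.y; d.z = -d.z; }
    return d;
}
```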
 
How do you isolate a "portability issue" from a "driver problem"? Have you considered that API design can be the source of the driver problems?

In theory, yes. Can you point to something in DXR that would unfairly punish AMD? Check the docs that you linked earlier. It’s a super simple API. All the complexity is in AMD’s drivers.
 
Yes - AMD RT is just slower in like-for-like scenarios. It is somewhat frustrating to read the assumptions here that this is somehow purposeful sabotage or something along those lines. AMD RT hardware is less refined and accelerates less than the NV solution, plain and simple. Hence why the performance diverges more the more RT there is, and the more incoherent the rays are. I cannot stress enough that this is not sabotage, but just the way things are, from what I have heard from devs. This is not even a bad thing necessarily, as the DXR spec allows for growth and adding more acceleration over time. It also means we will get games targeting AMD's slower RT which will do neat things with the lower, more coherent ray budget. I find that interesting.

Thanks for pointing this out.
 
I'll be honest, I gave up on synthetics being meaningful a long time ago.

The only really useful synthetic right now, in my opinion, is where we can see NVidia's tile-deferred pixel shading. It's a beautiful thing and I struggle to see how AMD or Intel can be competitive over the long run without doing the same.

But didn't AMD beat Nvidia this time in tile-deferred pixel shading, given that they are faster than Nvidia pretty much everywhere in normal rasterization games?

Also, tile-deferred pixel shading is tied to culling and to how the GPU handles small and big polygons, and whether they are strip or list polygons... So many things have an influence.

The best example is tessellation. Everybody asked why AMD ran badly in Crysis. They looked at synthetic tessellation benchmarks and saw that AMD is only bad at tessellation. Then they checked Crysis again and bingo: Crysis was totally over-tessellated. How would you find that out if you didn't have any hints about where AMD's weakness is?

Of course synthetics don't show you all the information, but they give you really strong hints about what could go wrong in games.
 
Yes - AMD RT is just slower in like-for-like scenarios. It is somewhat frustrating to read the assumptions here that this is somehow purposeful sabotage or something along those lines. AMD RT hardware is less refined and accelerates less than the NV solution, plain and simple. Hence why the performance diverges more the more RT there is, and the more incoherent the rays are. I cannot stress enough that this is not sabotage, but just the way things are, from what I have heard from devs. This is not even a bad thing necessarily, as the DXR spec allows for growth and adding more acceleration over time. It also means we will get games targeting AMD's slower RT which will do neat things with the lower, more coherent ray budget. I find that interesting.
When you bring evidence to back up your assumptions, I guess you'll have an argument.
 
When you bring evidence to back up your assumptions, I guess you'll have an argument.
I will not name a source or the specifics of it, but I was told that, crossing the aisle to AMD on a like-for-like raster GPU, you get nearly 1/4th the performance for incoherent rays. For direct evidence we have to wait till I get a Big Navi GPU and run different tests with different effect types and ray types in a suite of games and Unreal Engine 4, with profiling.

It should be very interesting to see how performance degrades on, e.g., a 2070 S vs. a Big Navi equivalent when changing the roughness threshold for stochastic RT reflections in various titles. A three-engine test, for example (BF V, WDL, UE4).
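For readers wondering what a "roughness threshold for stochastic RT reflections" refers to, here is a minimal sketch of the usual idea (my own illustration, with hypothetical parameter names): surfaces rougher than the cutoff fall back to cheaper reflections, so raising the threshold traces more, and more divergent, reflection rays:

```cpp
#include <random>

// Decide per pixel whether to spawn an RT reflection ray. Raising
// roughnessThreshold makes more (rougher, more divergent) surfaces eligible,
// which is exactly the knob that should separate the two architectures.
bool ShouldTraceReflectionRay(float roughness, float roughnessThreshold,
                              float stochasticProbability, std::mt19937& rng)
{
    if (roughness > roughnessThreshold)
        return false; // too rough: use SSR / cubemaps / probes instead

    // Stochastic part: only a fraction of eligible pixels trace each frame,
    // and a denoiser reconstructs the rest.
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);
    return u01(rng) < stochasticProbability;
}
```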
 
I will not name a source or the specifics of it, but I was told that, crossing the aisle to AMD on a like-for-like raster GPU, you get nearly 1/4th the performance for incoherent rays. For direct evidence we have to wait till I get a Big Navi GPU and run different tests with different effect types and ray types in a suite of games and Unreal Engine 4, with profiling.

It should be very interesting to see how performance degrades on, e.g., a 2070 S vs. a Big Navi equivalent when changing the roughness threshold for stochastic RT reflections in various titles. A three-engine test, for example (BF V, WDL, UE4).
There you go, assuming that what's written for NVidia is how it will continue to be. Games are not synthetics and code written for one GPU can easily be an order of magnitude off in performance for another architecture.

This is how you spent quite a long time saying that the consoles would not have ray tracing. And then they did. You make far too many assumptions.

I've shown earlier today how mesh shading requires substantially different optimisation on NVidia versus XSX.
 
There you go

And there you go, trying to assassinate the one DF member we have on this forum. Isn't it enough with the other devs leaving this place a long time ago?
DF have the facts; he just hinted at that. It's a hardware difference.

This is how you spent quite a long time saying that the consoles would not have ray tracing. And then they did. You make far too many assumptions.

They have limited ray tracing actually. One could argue they barely have it, as opposed to what we already have/had when that comment was made.
 