No DX12 Software is Suitable for Benchmarking *spawn*

Six years ago, DX12 arrived with the promise of reducing CPU load while also increasing scene complexity. Sadly, we've achieved little of that. There are still games that lose fps under DX12 to this day. Even when a game ships with DXR, developers often can't get DX12 to perform better than DX11. I recently tried Deliver Us The Moon without DXR: DX11 gave me 72fps in one scene, but DX12 gave me just 64fps on my 2080Ti - a 12% difference in fps I could have used elsewhere. Ghostrunner behaves the same, Control too, Resident Evil and several others.

This is really bad. In the PC space we already lose fps to Hyper-Threading, CPU security patches, and now DirectX 12 as well. Those frames would really come in handy when we enable Ray Tracing, where every little bit helps.

There are other games where DX12 is equal to DX11, or slightly better: Metro Exodus, The Division 2, Hitman 3 and Shadow of the Tomb Raider, for example. This needs to become the norm.

 
DXR straight up requires D3D12's binding model and even extends it, because rays can index an arbitrarily large number of geometries, shaders, or resources. D3D11 doesn't have the concept of bindless resources, so exposing DXR is not an option over there ...
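To make the contrast concrete, here's a minimal sketch (C++ against d3d12.h) of the kind of unbounded descriptor range that DXR leans on; the MakeBindlessSrvTable helper and the matId indexing in the comments are made-up illustrations, not any particular engine's scheme:

```cpp
#include <d3d12.h>
#include <climits> // UINT_MAX

// Builds a root parameter holding one unbounded SRV range - the "bindless"
// pattern DXR relies on. NumDescriptors = UINT_MAX means the range has no
// fixed upper bound, something D3D11's 128-slot SRV model cannot express.
D3D12_ROOT_PARAMETER1 MakeBindlessSrvTable(D3D12_DESCRIPTOR_RANGE1& range)
{
    range = {};
    range.RangeType          = D3D12_DESCRIPTOR_RANGE_TYPE_SRV;
    range.NumDescriptors     = UINT_MAX; // unbounded: t0..t(n) in space0
    range.BaseShaderRegister = 0;
    range.RegisterSpace      = 0;
    range.Flags              = D3D12_DESCRIPTOR_RANGE_FLAG_DESCRIPTORS_VOLATILE;
    range.OffsetInDescriptorsFromTableStart = 0;

    D3D12_ROOT_PARAMETER1 param = {};
    param.ParameterType    = D3D12_ROOT_PARAMETER_TYPE_DESCRIPTOR_TABLE;
    param.ShaderVisibility = D3D12_SHADER_VISIBILITY_ALL;
    param.DescriptorTable.NumDescriptorRanges = 1;
    param.DescriptorTable.pDescriptorRanges   = &range; // must outlive param
    return param;
}

// A DXR hit shader can then pick any resource at runtime, e.g. in HLSL:
//   Texture2D textures[] : register(t0);                // unbounded array
//   float4 c = textures[NonUniformResourceIndex(matId)].SampleLevel(s, uv, 0);
```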
 

That's a good article and I do appreciate TPU's work and the high-level summaries they put together. Really useful reference points IMO. I also think their selection of games is pretty good, at least the 2021 selection, which seems to include most of the big hitters on PC (Kena and The Medium are two more I would have liked to see in there).

I do feel compelled to point out, though, that while the trend they show - the 6800 XT getting relatively faster than the 3080 in rasterized graphics since the next-gen console launch - is unsurprising given the shared RDNA2 architecture, it really does need to be taken in the context of the increasing use of Ray Tracing the further we move into this generation. So while on the one hand it's fair to say the 6800 XT is faster (and may grow faster still) in pure rasterized graphics, it's also fair to say it will grow relatively slower on average over time as a higher percentage of games make use of RT.

In fact, even in TPU's own 2021 test set, 4 of the 9 games tested support RT. They were tested with it off for this comparison, but had it been turned on, I suspect the average would have swung quite heavily back into Nvidia's favour. Especially since the two Nvidia wins in that set are actually 2 of the 5 non-RT games, with another of those 5 being a near draw. So of the 9 tested games, the result would likely have been six 3080 wins, two 6800 XT wins and one draw. That's in a 2021 test suite which includes most of the year's big hitters so far. The Medium would have added another win for the 3080, while Kena would probably have been roughly a draw.
 
AMD traditionally does better in multiplatform titles which aren't using PC engines. That advantage also falls apart once you move out of this category - notice the results of the two UE4 titles in the 2021 set; far more than two UE4 titles were released in 2021, but not all of them are "AAA".

The per-year summaries are also at 4K, which likely skews them in favor of the 3080, since N21 loses steam above 1440p.

And yeah, 2021 results without at least a separate summary with RT enabled are fairly meaningless, especially at this performance tier.
 
For sure Nvidia’s superiority in RT tech is greater than AMD’s superiority in rasterization tech. It’s still not a given that RT usage will increase throughout the gen though, at least not for multiplatform games targeting consoles. DLSS is the real selling point for Nvidia cards IMO.
 
Crysis 2 and 3, Guardians, Far Cry 6, Bright Memory: Infinite, Dying Light 2 (Feb 22), Halo, Deathloop, Battlefield 2042 (and Forza)... the latest multiplatform games with ray tracing support.
 
RT usage has already increased enough to completely invalidate these results for 2021.
As pjbliverpool pointed out above, 4 of the 9 games from 2021 in this benchmark make use of RT h/w.
All 4 of those would demonstrate either parity or an Nvidia advantage in that same benchmark.
Which would likely leave only 2 of the 9 favoring AMD - with at least one of them (Hitman 3) presumably getting some form of RT support alongside the launch of Intel's dGPUs soon.

There is no reason to suspect that RT adoption will slow down in the future.
Quite the contrary: it is highly likely that games will provide additional RT quality levels and/or uses compared to consoles.
Everything we've seen thus far points in this direction.
I mean, RT was added even to the PC version of Far Cry 6 - an AMD-sponsored release - despite being missing on consoles.
 
RT was left out of CoD Vanguard, AFAIK, no? Also a pretty good-selling multiplatform title - and one whose franchise used RT in its last two iterations.
 
I'd say once the next-gen GPUs arrive next year with hopefully significantly more capable RT hardware, it will start to become almost standard/expected for games to have at least some form of RT. Especially as techniques are developed/refined (like RTAO) that can be used almost as drop-in replacements for higher-quality versions of traditional rasterization techniques at virtually no performance loss. Plus, those new PC GPUs are going to need some way to soak up all that extra power over the console baseline, and RT is the perfect way to do that - essentially becoming the new "ultra" settings, but in a way that actually makes a significant difference.
 
True, but that's one example against dozens of titles adding RT this year. It's also hard to say why Vanguard skipped it - it may have been down to the pressure of shipping on schedule more than anything else.

On the "virtually no performance loss" part: there will always be a performance loss from using RT. The only way RT can be free is if a game swaps a very expensive rasterization approach for a faster RT one - but I don't think many games use such rasterization options, since they are simply a no-go on consoles in the first place.
Otherwise I agree with you.
 
I was referring to an increase in the level of RT in a game as opposed to the number of titles which receive RT support. It’s still what I’d consider token support in a lot of these multi-platform, console focused titles. The ones listed by Troyan I feel are examples of that.
 
Level of RT usage will increase, of course. Look at FH5: they've locked RT to photo mode on PC for no reason, and these things will be in the past very soon.
Once RT support is there in the engine, it's a lot easier to increase its usage beyond what's available on consoles. Again, there are many examples of that from this year alone.
 
Neat blog post: https://themaister.net/blog/2021/11/07/my-personal-hell-of-translating-dxil-to-spir-v-part-3/

D3D12 Descriptor aliasing is a scary unintended feature ...

The mistake of VOLATILE-by-default was recognized by the time 1.1 rolled around, and STATIC was now made the default. STATIC is the nicest mode possible for drivers, since it fully allows hoisting of descriptors, but you actually lose robustness guarantees (!). Most big game engines probably looked at this for 2 seconds and noped out.

Engines must like their robust out-of-bounds access behaviour very much! Things are starting to come full circle ...
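For anyone who hasn't read the post, a minimal sketch (C++ against d3d12.h; the register slots and range sizes are invented for illustration) of the root signature 1.1 flags being discussed. With no flag, a 1.1 range defaults to static descriptors, which lets the driver hoist descriptor reads but gives up the out-of-bounds robustness guarantees; the old 1.0-style behaviour has to be requested explicitly:

```cpp
#include <d3d12.h>

// Root signature 1.1: with FLAG_NONE an SRV range defaults to static
// descriptors (data static-while-set-at-execute). The driver may read the
// descriptor early and hoist it, but robustness guarantees are gone.
D3D12_DESCRIPTOR_RANGE1 staticRange = {
    D3D12_DESCRIPTOR_RANGE_TYPE_SRV,
    8,                                    // t0..t7 (illustrative)
    0,                                    // BaseShaderRegister
    0,                                    // RegisterSpace
    D3D12_DESCRIPTOR_RANGE_FLAG_NONE,     // 1.1 default: static
    D3D12_DESCRIPTOR_RANGE_OFFSET_APPEND };

// Opting back into the 1.0 behaviour the quote calls a mistake: descriptors
// may keep changing until the command list actually executes, so the driver
// can't hoist anything - but robust out-of-bounds access is preserved.
D3D12_DESCRIPTOR_RANGE1 volatileRange = {
    D3D12_DESCRIPTOR_RANGE_TYPE_SRV,
    8,                                    // t8..t15 (illustrative)
    8,                                    // BaseShaderRegister
    0,                                    // RegisterSpace
    D3D12_DESCRIPTOR_RANGE_FLAG_DESCRIPTORS_VOLATILE,
    D3D12_DESCRIPTOR_RANGE_OFFSET_APPEND };
```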
 
Microsoft shipped a DX12 beta patch for Flight Simulator 2020, and it runs like absolute garbage on a top-of-the-line 5950X CPU and RTX 3090: fps 50% lower than DX11, with massive stuttering and hitching in scenes with extensive geometry.

 
I keep saying it: DX12 and the whole low-level API push was the worst "evolution" ever. Even Microsoft can't get it right in its most CPU-limited game ever. What an irony :runaway::runaway::runaway:
 