No DX12 Software is Suitable for Benchmarking *spawn*

I was able to play Deus Ex: Mankind Divided at high details thanks to the DX12 port on a very old i5 750 (though overclocked to 3.8 GHz) and an R9 280X. In DX11 mode it was unplayable. A good DX12 port can significantly lower the CPU overhead, which means you will see benefits only in CPU-limited scenarios. If you want more than that, the developers have to completely rethink the game engine and the rendering algorithms, not just make a port by adding a D3D12 backend.
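
To illustrate where that overhead reduction comes from, here is a minimal C++ sketch of the D3D12 pattern (function and variable names like RecordFrame are hypothetical, and it assumes the device, queue, and pipeline state were created elsewhere). In D3D11 every draw has to funnel through the single immediate context, while in D3D12 each worker thread records its own command list and the frame is handed to the queue in one cheap submission:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

// Sketch only: records one frame's draws on several threads.
// A real engine would pool and reuse allocators/lists across frames
// instead of creating them here.
void RecordFrame(ID3D12Device* device, ID3D12CommandQueue* queue,
                 ID3D12PipelineState* pso, unsigned workerCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>> allocators(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);

    for (unsigned i = 0; i < workerCount; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), pso,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Each worker records its own bucket of draw calls in parallel;
    // in D3D11 all of this had to go through the one immediate context.
    std::vector<std::thread> workers;
    for (unsigned i = 0; i < workerCount; ++i) {
        workers.emplace_back([&lists, i] {
            // ... set root signature, bind resources, issue draws ...
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // One cheap submission of the whole frame.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```

The win only shows up when the CPU is the bottleneck in the first place; if the GPU is already the limit, all of this parallel recording buys you nothing.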

You should play Deus Ex on an NVIDIA card. DX12 is 33% slower than DX11 in CPU-limited scenarios. :D
 
You should play Deus Ex on an NVIDIA card. DX12 is 33% slower than DX11 in CPU-limited scenarios. :D
Probably true for a lot of those early DX12 games, which, let's be honest, AMD was more proactively engaged with developers on.
It does raise the question of which games should be considered good DX12 implementations; Gears of War 4 and Sniper Elite 4 are candidates, but neither of those is perfect either.

The Battlefield series with the DICE engine is another that seems more suited to AMD, but it would be interesting to see how it performs on a Titan V, as some unoptimised games (Battlefield V is probably not that bad) do run notably better on it than on Pascal from a trend-behaviour perspective.

On a separate note:
All of this is further compounded by the test scene, game options, and possibly even exclusive fullscreen vs. UWP, as well as whether the test runs on a Ryzen or an Intel platform.
My biggest gripe is that any publication these days should test on both CPU platforms, not just one, because we sometimes see quirky behaviour such as what happened with RoTR. It would also be fairer for consumers, as it gives a more complete picture of the route they want to take when purchasing a complete PC solution.
 
The Battlefield series with the DICE engine is another that seems more suited to AMD,
The Frostbite engine has had an obnoxious DX12 implementation in all of its games so far: BF1, SWBF2, and FIFA 18, and the situation is the same in BFV. Deus Ex still has problems to this day; DX11 is faster and more reliable on all GPUs, at least on good CPUs.

Probably the best implementations so far are Hitman and Sniper Elite 4; both have DX12 boosting fps on all GPUs. The Forza series and Gears 4 come a close second, but those games are DX12-only, so there is no way to know whether DX11 would have benefited them or not.

This is why I do not play games on NVIDIA GPUs.
That way you are bottlenecked more in DX11 games, as AMD's driver overhead is higher there. An NVIDIA GPU works better with low-end CPUs; that has been shown multiple times in the tests carried out by Digital Foundry.
 
I really do not care. I am not going to put a GPU in my gaming PC that requires installing a driver that behaves like spyware (otherwise you have to wait too long to get the latest driver). People are obsessed with spending thousands of bucks on high-end CPUs (though most of them run only low-tier silicon); I am not, and I do not change the basic hardware components every couple of years. The last time I changed MB+CPU+RAM, the system was more than 7 years old. The last gaming GPU I changed was an R9 280 (a 7950 rebrand, if I remember correctly). DX12 helped that old system a lot, and I do not care whether that is only due to AMD driver overhead, a GCN design that is unfriendly to monolithic APIs, both, or whatever...

Anyway, I will give Serious Sam Fusion a try with this new system, though both the Vulkan and DX12 implementations should still be in beta (another never-ending beta? The XAudio 2.9 implementation is also still in beta, and it doesn't run smoothly)... I am currently running a Ryzen 5 1500X; maybe I will upgrade in the near future for development purposes, since getting 16 threads for a modest sum is really attractive (or maybe I will double the RAM if the price is decent... 16 GB is becoming tight when running multiple virtual machines).
It would also be nice to see how much these Spectre fixes really impact the average Joe's system performance: on my Surface Pro 3 (which runs a 4th-generation Intel Core CPU) the security fix was not exactly appreciated, as the I/O has been devastated (maybe it is a ULP SoC thing only... or maybe not), and sadly more microcode updates for security fixes should be coming :|
 
"… by way of Nvidia's PR firm…"?
Yeah, seems like massive issues, just what the PR firm needs to steer people clear of Radeons.
RX 580 vs. GTX 1060
https://www.pcgamesn.com/amd-nvidia-battlefield-5-graphics-performance
And RX Vega 64 vs. GTX 1080 Ti also shows disastrous Radeon performance
https://www.hardwareluxx.de/index.p...ield-v-closed-alpha-erste-gpu-benchmarks.html
[/sarcasm]
What is interesting is that both of those are Ryzen platforms, and the 1080 Ti's poor showing was possibly a big DX12 optimisation-quirk penalty (similar to, but not necessarily identical to, RoTR's), while some results I checked with AMD issues were on Intel platforms where Nvidia seems to be doing OK. It is too early to be conclusive; ideally publications would use the same settings and test on both Intel and AMD platforms, but unfortunately no one is doing that, and to me it is pretty critical for any game/GPU testing going forward.

Multiple platforms create a distinction, IMO, when it comes to development and optimisation, and this has a potential (not always, but occasionally) knock-on effect on the GPUs, especially when DX12 is involved.
 
World of Warcraft has finally received the promised DX12 patch. Blizzard notes that DX12 is only for AMD and Intel GPUs; NVIDIA GPUs will stick to DX11 by default until the DX12 renderer is updated for them.

DX12 increased the performance of AMD GPUs by 2-3% at best. However, NVIDIA GPUs remained dominant with DX11: a GTX 1080 and a GTX 1060 are 23% faster than a Vega 64 and an RX 580, respectively, @1080p, and 18% faster @4K.
https://www.computerbase.de/2018-07/world-of-warcraft-wow-dx12-directx-12-benchmark/

So what are the benefits of the DX12 patch? Currently, if you have a low-end CPU with a high-end AMD GPU, your fps can improve by ~10%. Though a comparable NVIDIA GPU will still achieve more fps @DX11 with that very same low-end CPU.
https://www.computerbase.de/2018-07...sor-benchmarks-auf-einer-rx-vega-64-1920-1080
 
Most of the user base does not update the CPU every 2-3 years, and most of them do not have a 250+ €/$/£ CPU anyway. Most of them change the main components (CPU+MB) every 5-7 years (a Ryzen 5 1600X cannot be called a "low-end CPU"! Otherwise, what about Celerons and Pentiums?). The WoW DX12 backend may not be perfect, but it could still be appreciated by a good share of the user base.
 
Battlefield V Open Beta: PC performance benchmarks
While we'll follow the DX12 path, DX11 was running better and more stutter-free; please keep that in mind as a good tip, as image-quality wise it will not matter.
...
So yes, this performance analysis is more limited than I wanted it to be. Frametimes I might still look into; however, perfmon crashes in the game and FCAT still needs to be checked out (yet I got locked out of the game for 24 hrs again). We tested DX12 mainly, for the aforementioned reasons. We do, however, recommend using DX11 for now; it's far more stutter-free and a notch faster as well, as we measured anything from 5 to 15 percent faster performance on DX11.
https://www.guru3d.com/articles-pag...-graphics-performance-benchmark-review,1.html
 
They tested the game @DX12, even though it has worse performance than DX11 across all GPUs. Right now DX12 in Frostbite is absolute trash, but DICE is working to fix it for DXR and RTX.
 
DICE said they want to get DX12 in BFV working as well as in BF1, which is a huge joke. I really don't get how they can keep running with that monkey implementation for that long. The Mantle implementation back in BF4 was smooth as butter, yet DX12 has been trash since day one. I really don't get how they can be that incompetent. I thought Mantle was developed in cooperation with Johan Andersson, so he should know how low-level APIs work - doesn't he work for DICE anymore, or was it AMD who did all the work?
 
Has anybody ever done DX12 tests on 2C4T CPUs in BF1 multiplayer? Or even 4C4T?
 
DICE said they want to get DX12 in BFV working as well as in BF1,
Right now DX11 is 12% faster than DX12 on both Vega and Pascal (according to Hardware Unboxed). If that statement holds, the best we can hope for is a tie with DX11 at most!


Has anybody ever done DX12 tests on 2C4T CPUs in BF1 multiplayer? Or even 4C4T?
What would be the point? These games require at least 4 cores anyway, and playing with a high-end GPU also requires a relatively high-end CPU. Even if DX12 only helps low-end CPUs, it would still be a moot point for PC gamers.
 
DX12 is supposed to enable lower CPU overhead, so it'd be interesting to see whether that materialises. And even though gaming enthusiasts tend to play games on high-end hardware, it would be interesting at least in the context of integrated chips that share a common power budget between the CPU and GPU cores.
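
A crude way to check whether it materialises is to time only the CPU side of the frame loop. This is a self-contained sketch; BuildAndSubmitFrame is a hypothetical stand-in for whatever the renderer does on the CPU each frame (visibility, state setup, draw submission), with dummy work so the sketch runs as-is:

```cpp
#include <chrono>
#include <cstdio>

// Hypothetical stand-in for the renderer's per-frame CPU work.
// Replace the dummy loop with the real submission path to measure it.
static void BuildAndSubmitFrame() {
    volatile double sink = 0.0;
    for (int i = 0; i < 100000; ++i) sink += i * 0.5;
}

int main() {
    using clock = std::chrono::steady_clock;
    const int frames = 1000;
    double totalMs = 0.0, worstMs = 0.0;

    for (int i = 0; i < frames; ++i) {
        const auto t0 = clock::now();
        BuildAndSubmitFrame();
        const auto t1 = clock::now();
        const double ms =
            std::chrono::duration<double, std::milli>(t1 - t0).count();
        totalMs += ms;
        if (ms > worstMs) worstMs = ms;
    }

    // In a CPU-limited scenario this average tracks the frame rate,
    // so a good DX12 path should push it down versus the DX11 path.
    std::printf("avg CPU ms/frame: %.3f  worst: %.3f\n",
                totalMs / frames, worstMs);
}
```

On a power-shared integrated chip there is a second effect on top: CPU time saved is also power headroom the GPU cores can use.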
 
Back when Mantle was released for BF4, people with a Phenom II X4 got a huge performance boost. One person I knew who played Dragon Age: Inquisition on an i5 650 got a minimum of 30 FPS using Mantle, compared to around 20 FPS average in DX11.

It's a huge flaw of PC gaming that you're expected to have a high-end CPU if you have a high-end GPU. Someone who got an i5 2500 + GTX 670 back in the day could just pop in a GTX 1080 Ti (even a GTX 1060 would be a good improvement, though still bottlenecked), but the CPU would be the bottleneck. Even people with high-end CPUs can be CPU-limited if they have 144 Hz monitors, since at 144 Hz the CPU has less than 7 ms to prepare each frame.
If proper DX12 support in BFV could be even remotely as good as Mantle was for BF4, it would be awesome for everyone.
 
For maybe the first time ever, Shadow of the Tomb Raider appears to have a very solid DX12 implementation, one that is 230% faster than DX11 in one test!
https://www.purepc.pl/karty_graficz...pc_test_wydajnosci_kart_i_procesorow?page=0,3

Though this appears to be the result of a massively broken DX11 renderer, which exhibits massive CPU limitations on all GPUs.


https://www.sweclockers.com/test/26...-och-forza-horizon-4-demo-pa-geforce-rtx-2000
 