No DX12 Software is Suitable for Benchmarking *spawn*

I guess nothing will be built with DX12/Vulkan in mind until 2018 or 2019, possibly later.
Judging from current and past trends, I second that. Now that BF1 is out, this is the last DX12 title for 2016 (maybe even for a good part of the first half of 2017?). The year is done; it is time to look back and compare how DX12 stands against DX11 and DX10.

DX12 was launched in July 2015; 18 months later (December 2016) we will have 13 titles for it: no IQ enhancements in sight, shaky performance enhancements, and fps drops on a lot of hardware for no reason at all.

Compared to DX10: launched in January 2007; 18 months later (by August 2008), 16 titles sported it, most with image quality enhancements (softer, better shadows, better AA support, better post processing) and performance drops as a result. They were limited though and left much to be desired. DX10 was retired prematurely.

Age of Conan: Unchained
Assassin's Creed
BioShock
Call of Juarez
Company of Heroes
Crysis
Devil May Cry 4
Gears of War
Hellgate: London
Lost Planet: Extreme Condition
World in Conflict
Microsoft Flight Simulator X
The Lord of the Rings Online
Universe at War: Earth Assault
Halo 2
Fury

Compared to DX11: launched in October 2009; 18 months later (by March 2011), 17 titles sported it, most with visual enhancements as well; performance dropped in line with these visual effects. DX11 grew in popularity and became the de facto API to this day.

BattleForge
Colin McRae: Dirt 2
S.T.A.L.K.E.R.: Call of Pripyat
The Lord of the Rings Online
Aliens vs. Predator
Battlefield: Bad Company 2
Metro 2033
Civilization V
F1 2010
Lost Planet 2
Medal of Honor
Tom Clancy's H.A.W.X 2
Dungeons & Dragons Online
Dragon Age II
Homefront
Total War: Shogun 2
Crysis 2

So DX12 uptake is slower than even DX10's, despite a much bigger installed base of GPUs, and despite coming at a time when a lot of games, indie and otherwise, are being made. It is also distinctly lacking in the visual enhancements department.
 
You left RE5 off the DX10 list. It didn't really do much of anything outside of running slightly faster, for reasons no one ever understood. It was patched out for some oddball reason when the game's Gold version came to Steam last year.
There's also Saints Row 3, which has, IIRC, settings for DX10 and DX11, or just DX10 and up, and they look the same on DX10.

Not sure what you'd call that one.
 
DX12 was launched in July 2015; 18 months later (December 2016) we will have 13 titles for it: no IQ enhancements in sight, shaky performance enhancements, and fps drops on a lot of hardware for no reason at all.
There's a reason why feature levels 12_0 and 12_1 were backported to DX11. And there is at least one game out there using 12_1 features even though it runs on a custom IHV hack of DX11.
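
For reference, the backported levels can be probed straight through D3D11 device creation. A minimal sketch (error handling trimmed; the retry matters because pre-Windows-10 runtimes reject feature levels they don't know about):

Code:
    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL achieved = {};

    // Ask for the 12_x levels first; the runtime hands back the highest
    // level the driver actually supports.
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
    };

    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   wanted, 4, D3D11_SDK_VERSION,
                                   &device, &achieved, &context);
    if (hr == E_INVALIDARG) // older runtime: retry without the 12_x entries
        hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                               wanted + 2, 2, D3D11_SDK_VERSION,
                               &device, &achieved, &context);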
 
The problem is that big studios have existing engines, some of which are still D3D9 based (even though they use D3D11, they are not structured to benefit from it). There's terrible inertia, a lack of trust in employees being able to write new engines, fear that it will turn out badly, cost a lot and provide little...
Basically game companies don't innovate at any level anymore, except for rare exceptions and a few people empowered to take risks because of past success.
Sad but true.
There's a reason why a number of highly skilled people went indie, or left the industry...
 
I will make it easy: if you see a game that has both DX11 and DX12 modes, then it is not a DX12 title... do you think they backport DX12 titles to DX11, or vice versa?

These days most studios already don't have the time to finish the DX11 versions; it's a miracle to see them ship DX12 too...

Battlefield 1 runs extremely well (whatever the DX version): ~90 fps average at 1440p on a last-generation Fury X or 980 Ti, and 50+ fps average at 4K (ultra). As with every release there will be some bugs, but when you compare this and Mafia 3... well, you know what I mean.
 
The problem is that big studios have existing engines, some of which are still D3D9 based (even though they use D3D11, they are not structured to benefit from it). There's terrible inertia, a lack of trust in employees being able to write new engines, fear that it will turn out badly, cost a lot and provide little...
Basically game companies don't innovate at any level anymore, except for rare exceptions and a few people empowered to take risks because of past success.
Sad but true.
There's a reason why a number of highly skilled people went indie, or left the industry...
There was no reason to innovate. DX11 doesn't have anything radically new. Rendering is still mostly vertex + pixel shaders. Tessellation was a dud. Lack of multidraw in DX11 (compared to OpenGL) limited the usage of compute shaders to post-process effects and lighting. Yes, we now have more efficient lighting and post processing, etc., but scene setup and culling are still mostly done by the CPU. Some engines hacked around the DX11 limitations, but hacks have downsides. You can't expect big general purpose engines to choose narrow "hacky" rendering techniques to avoid DX11 limitations.

DX12 has bindless resources, tiled resources and ExecuteIndirect. Tiled resources were already in DX11.2, but that feature was limited to Windows 8. So no sane developer built their resource management around tiled resources. Not being able to sell your game to Windows 7 customers is a huge deal breaker. DirectX 12 centric engines have the same problem. You can't easily emulate bindless resources, tiled resources and/or ExecuteIndirect on DirectX 11.1 (Windows 7). If you want to design an engine that is built on top of these features, games using that engine will be limited to consoles + Windows 10. We need to wait for at least one more year before we get AAA games that get the most out of DirectX 12. The Windows 10 adoption rate isn't yet high enough for big AAA devs to drop Windows 7 support. This is exactly the same situation we had with DirectX 10 + Windows XP, except that this time consoles support DX12 -> console devs would be eager to support DirectX 12 on PC.
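
To make the ExecuteIndirect point concrete, here is a minimal D3D12 sketch (device, cmdList, argBuffer, countBuffer and maxDraws are assumed to already exist) of consuming a GPU-written buffer of draw arguments, e.g. one filled by a culling compute shader — the thing DX11 has no clean equivalent for:

Code:
    // One argument slot per command: a plain (non-indexed) draw.
    D3D12_INDIRECT_ARGUMENT_DESC arg = {};
    arg.Type = D3D12_INDIRECT_ARGUMENT_TYPE_DRAW;

    D3D12_COMMAND_SIGNATURE_DESC sigDesc = {};
    sigDesc.ByteStride       = sizeof(D3D12_DRAW_ARGUMENTS);
    sigDesc.NumArgumentDescs = 1;
    sigDesc.pArgumentDescs   = &arg;

    ID3D12CommandSignature* signature = nullptr;
    device->CreateCommandSignature(&sigDesc, nullptr, IID_PPV_ARGS(&signature));

    // argBuffer: up to maxDraws D3D12_DRAW_ARGUMENTS structs, written on the GPU.
    // countBuffer: the actual draw count, also written on the GPU.
    cmdList->ExecuteIndirect(signature, maxDraws,
                             argBuffer, 0,
                             countBuffer, 0);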

I would have liked to see DirectX 12 on Windows 7. This would have increased the DX12 adoption rate a lot and allowed developers to write pure DX12-centric renderers. Now DirectX 12 is used mostly to buy some extra CPU cycles (mostly helps low end CPUs). This is similar to DirectX 11 compute shader adoption in last-gen games. Compute shaders were only used for additional PC-specific high end effects, while 99% of the pipeline was reused from consoles (DX9 based code & design).
 
Vulkan is on Windows 7.

Yeah, that is a curious question. In my limited experience, big game engines have a heavy frontend with some #ifdefs and a generic backend layer where they implement the effective renderer (i.e. DX, OGL, PS).
While I understand the cost of adding one is not small, I don't see the reason for not adding a Vulkan layer, which would seamlessly take care of the whole Windows machine park (the shader language is somewhat easily convertible already).
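
Something like this, I imagine (hypothetical names, just a sketch of that split): the frontend never changes, and a Vulkan layer is simply one more implementation of the backend interface:

Code:
    // Generic backend layer; the heavy frontend only ever sees this interface.
    struct DrawCall { int mesh; int material; };

    class RenderBackend {
    public:
        virtual ~RenderBackend() = default;
        virtual void beginFrame() = 0;
        virtual void submit(const DrawCall& dc) = 0;
        virtual void present() = 0;
    };

    // Adding Vulkan support = one more implementation, frontend untouched.
    class VulkanBackend : public RenderBackend {
    public:
        void beginFrame() override { /* vkBeginCommandBuffer ... */ }
        void submit(const DrawCall& dc) override { /* vkCmdDraw ... */ }
        void present() override { /* vkQueuePresentKHR ... */ }
    };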
 
Vulkan is on Windows 7.
It would be better to drop DX12 for a platform-agnostic alternative.
Vulkan is great for PC and mobile, but console shaders aren't GLSL. Nobody wants to maintain two copies of all of their shaders. There are some GLSL <-> HLSL translators, but none with the full SM 5.1 feature set. It would be optimal if everybody used SPIR-V as their intermediate code, but Microsoft has already announced that they will release their own new intermediate code format (to replace DX ASM) along with SM 6.0. Luckily both the new format and the tool chain are open source. It shouldn't be that hard to plug a SPIR-V output somewhere in the tool chain to output Vulkan compatible code for PC and mobile.
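
One existing piece of that puzzle is the open source SPIRV-Cross library, which already translates SPIR-V back to HLSL/GLSL. A sketch against its current C++ API (the shader model and module contents are placeholders):

Code:
    #include "spirv_hlsl.hpp" // SPIRV-Cross
    #include <string>
    #include <vector>

    // spirvWords: a compiled SPIR-V module, e.g. produced by glslang.
    std::string spirvToHlsl(std::vector<uint32_t> spirvWords)
    {
        spirv_cross::CompilerHLSL compiler(std::move(spirvWords));

        spirv_cross::CompilerHLSL::Options options;
        options.shader_model = 50; // emit Shader Model 5.0 HLSL
        compiler.set_hlsl_options(options);

        return compiler.compile(); // HLSL source text
    }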
 
Vulkan is great for PC and mobile, but console shaders aren't GLSL. Nobody wants to maintain two copies of all of their shaders. There are some GLSL <-> HLSL translators, but none with the full SM 5.1 feature set. It would be optimal if everybody used SPIR-V as their intermediate code, but Microsoft has already announced that they will release their own new intermediate code format (to replace DX ASM) along with SM 6.0. Luckily both the new format and the tool chain are open source. It shouldn't be that hard to plug a SPIR-V output somewhere in the tool chain to output Vulkan compatible code for PC and mobile.

I'm somewhat sure we will find a translator on AMD GPUOpen really soon. They seem really prompt to provide libraries and tools when needed.
 
There was no reason to innovate. DX11 doesn't have anything radically new. Rendering is still mostly vertex + pixel shaders. Tessellation was a dud. Lack of multidraw in DX11 (compared to OpenGL) limited the usage of compute shaders to post-process effects and lighting. Yes, we now have more efficient lighting and post processing, etc., but scene setup and culling are still mostly done by the CPU. Some engines hacked around the DX11 limitations, but hacks have downsides. You can't expect big general purpose engines to choose narrow "hacky" rendering techniques to avoid DX11 limitations.

DX12 has bindless resources, tiled resources and ExecuteIndirect. Tiled resources were already in DX11.2, but that feature was limited to Windows 8. So no sane developer built their resource management around tiled resources. Not being able to sell your game to Windows 7 customers is a huge deal breaker. DirectX 12 centric engines have the same problem. You can't easily emulate bindless resources, tiled resources and/or ExecuteIndirect on DirectX 11.1 (Windows 7). If you want to design an engine that is built on top of these features, games using that engine will be limited to consoles + Windows 10. We need to wait for at least one more year before we get AAA games that get the most out of DirectX 12. The Windows 10 adoption rate isn't yet high enough for big AAA devs to drop Windows 7 support. This is exactly the same situation we had with DirectX 10 + Windows XP, except that this time consoles support DX12 -> console devs would be eager to support DirectX 12 on PC.

I would have liked to see DirectX 12 on Windows 7. This would have increased the DX12 adoption rate a lot and allowed developers to write pure DX12-centric renderers. Now DirectX 12 is used mostly to buy some extra CPU cycles (mostly helps low end CPUs). This is similar to DirectX 11 compute shader adoption in last-gen games. Compute shaders were only used for additional PC-specific high end effects, while 99% of the pipeline was reused from consoles (DX9 based code & design).

I failed to precisely and concisely express myself, hopefully that doesn't happen when programming ;p

I should have said that during the D3D9 era engines were generally/mostly/most often designed around the API, not the hardware, and it's only with D3D10/11 that engines were/could be designed around (more capable) hardware whose architecture documentation was (more readily) available.

An engine designed around the D3D9 API (or equivalent OpenGL version) is not a good start to make a simple, fast, efficient and elegant engine for today's GPUs.

(Although I failed to mention it, when I say D3D12 I usually mean a recent lower level API, that is D3D12/Vulkan. The latter, working on plenty of OSes and devices, is a viable solution to target "latest" hardware without arbitrary market limitations.)
 
Why not on Intel GPUs? Skylake is FL 12_1. Has everything that Pascal/Maxwell has and more. Skylake's resource binding is tier 3 (NV = 2) and conservative raster is tier 3 (NV = 2).
Yeah, Skylake has the highest tiers of all DX12 GPUs; however, HFTS worked through a hack that enabled it to run through the DX11 API, so it only worked on NV GPUs.
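
(Those tiers are queryable at runtime; a minimal sketch, assuming an existing ID3D12Device* device:)

Code:
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                &options, sizeof(options));

    // Skylake reports D3D12_RESOURCE_BINDING_TIER_3 and
    // D3D12_CONSERVATIVE_RASTERIZATION_TIER_3; Maxwell/Pascal report tier 2.
    D3D12_RESOURCE_BINDING_TIER binding = options.ResourceBindingTier;
    D3D12_CONSERVATIVE_RASTERIZATION_TIER raster = options.ConservativeRasterizationTier;
    D3D12_TILED_RESOURCES_TIER tiled = options.TiledResourcesTier;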
 
More BF1 tests from computerbase:
https://www.computerbase.de/2016-10...ramm-battlefield-1-auf-dem-i7-6700k-2560-1440

BF1_1080p_SP.jpg


When CPU-limited the picture is reversed somewhat:
BF1_1080p_SP.jpg

However, the site joins pcgameshardware and sweclockers in reporting how horrendous the DX12 frame times are, on both NV and AMD, especially during multiplayer.

In all other configurations, on the other hand, DirectX 12 does not come close to the experience under DirectX 11. Nvidia graphics cards mostly run fine in single player, but there are recurring hitches that get in the way. In multiplayer the difference is smaller, but since DirectX 11 is faster there is no reason to switch to DirectX 12.

DirectX 12 is unplayable in multiplayer with graphics cards from AMD. The game stutters without pause, as the frame times illustrate. Players therefore get nothing out of the FPS advantage here.

DirectX 12 remains a good alternative to DirectX 11 only for campaign players with AMD graphics cards on weak processors.

https://www.computerbase.de/2016-10...s-auf-dem-fx-8370-radeon-rx-480-einzelspieler
 
Yeah, Skylake has the highest tiers of all DX12 GPUs; however, HFTS worked through a hack that enabled it to run through the DX11 API, so it only worked on NV GPUs.
DirectX 11.3 supports conservative raster (no hacks needed). However 11.3 is also limited to Windows 10. I don't know why.

So Nvidia has some DX 11.0 hack API that allows conservative raster (on Windows 7)? Wasn't aware of that. Both AMD and Nvidia have DX 11.0 hack APIs for multidraw and UAV overlap. It starts to feel like DX9 all over again. So many IHV-specific API hacks to bring DX10 features to Windows XP. Now the same is true for Windows 7 and DX12 :)
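
For what it's worth, the 11.3 path is a plain cap check, no vendor API involved — a sketch, assuming an existing ID3D11Device* device on Windows 10:

Code:
    D3D11_FEATURE_DATA_D3D11_OPTIONS2 options2 = {};
    device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS2,
                                &options2, sizeof(options2));

    // A non-zero tier means conservative raster works through standard D3D11.3.
    bool hasConservativeRaster =
        options2.ConservativeRasterizationTier !=
        D3D11_CONSERVATIVE_RASTERIZATION_NOT_SUPPORTED;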
 
More BF1 tests showing the usual pattern: NV is faster at DX11 and loses fps at DX12, while AMD gains some fps at DX12:

6mJxhdM.png

http://www.hardwareunboxed.com/battlefield-1-benchmarks-20-gpus-tested-at-1080p-1440p-4k/

bf1-benchmark-1440p-dx11.png

bf1-benchmark-1440p-dx12.png



Unfortunately, DirectX 12 is variable on each vendor, with generally poor low performance values and occasional stuttering. The stutters are buried a bit as multiplayer sessions are stretched over a longer period of time, but still present. Averages are effectively identical for AMD between Dx12 and Dx11 (when using a high-end CPU), and they're better for nVidia with Dx11 than Dx12. Dx12 also has more variance overall, between both vendors, and so we're basing this conclusion on Dx11 performance. Again, even without the variance, the average framerate is effectively equal on AMD and is slightly negatively scaled on nVidia. This will make the performance and dollar arguments a little bit less complicated.
http://www.gamersnexus.net/game-bench/2652-battlefield-1-graphics-card-benchmark-dx11-vs-dx12

We have the benchmark.pl test as an outlier, showing NV with similar fps to AMD in DX12:
http://www.benchmark.pl/testy_i_rec...wydajnosci-kart-graficznych/strona/26943.html
 