DX12 increases VRAM usage in all the DX12 games I have: Tomb Raider, Deus Ex MD, The Division, etc. The increase ranges from 300MB to a full 1GB.

Thx for the bench. What about VRAM usage? I notice that in Deus Ex MD, DX12 uses more VRAM (which is "bad" because the game already eats a lot of VRAM in DX11...)
They seem to be oblivious to the fact that visual settings are bugged under DX12.

ComputerBase posted a benchmark review. Gains at high fps / low resolution for both AMD and NV.
Why don’t developers love DX12?
"But interestingly that doesn’t mean that it is needed by everyone. The extra level of control brings with it a certain amount of extra complexity – and that means that some developers might be reluctant to move to DirectX 12 – indeed, a game which is not really limited by GPU horsepower, and which isn’t bottlenecked by just one CPU thread isn’t usually going to gain much from moving to DirectX 12.
"In those cases DirectX 11 or DirectX 9 likely represents a perfectly acceptable way to design titles. But titles which are highly graphically ambitious almost invariably benefit from DirectX 12.“
In short, game developers have decided that DX12 isn't worth the extra time and effort on all but the most demanding titles at this point, despite the general consensus that DX12 outperforms DX11 in many titles, especially on video cards with full DX12 feature support.
http://www.techradar.com/news/the-forgotten-api-just-what-is-going-on-with-dx12
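To make the "bottlenecked by just one CPU thread" point concrete, here is a minimal C++/D3D12 sketch of what the extra control buys: command lists recorded on several worker threads and submitted in one batch, something DX11's single immediate context cannot do. RecordSceneChunk is a hypothetical placeholder for per-thread draw recording, and error handling is omitted.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

// Hypothetical placeholder: record one slice of the frame's draw calls.
void RecordSceneChunk(ID3D12GraphicsCommandList* /*cl*/, int /*chunk*/) {}

void BuildFrame(ID3D12Device* device, ID3D12CommandQueue* queue, int numThreads)
{
    std::vector<ComPtr<ID3D12CommandAllocator>> allocs(numThreads);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(numThreads);
    std::vector<std::thread> workers;

    for (int i = 0; i < numThreads; ++i) {
        // One allocator per thread: allocators are single-threaded by design.
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        // Each thread records its slice of the frame independently,
        // the part DX11 serializes through the immediate context.
        workers.emplace_back([&, i] {
            RecordSceneChunk(lists[i].Get(), i);
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // One cheap submission on the main thread.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```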
Everybody seems to be missing the obvious answer: you still need to support DX11 on PC. There are too many Windows 7 and Windows 8 customers out there, so DX12 doesn't allow you to cut DX11 support. With DX11 + DX12, you have to support and maintain two versions of the game (including two sets of IHV driver bugs). Once DX11 becomes obsolete (= Windows 7 is no longer popular), DX12 or Vulkan will become the preferred API for AAA cross-platform development, since these APIs match the consoles much more closely in resource management and feature set. Having to support DX11 on PC is already causing suboptimal design choices on consoles (much as DX9 on consoles slowed down the compute shader transition on PC).
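A rough sketch of the resource-management difference that point turns on (assuming valid device pointers; error handling omitted): in D3D11 the driver decides where a buffer lives and manages its lifetime, while in D3D12 the application picks the heap and owns the lifetime, which is much closer to how console APIs work.

```cpp
#include <d3d11.h>
#include <d3d12.h>

// D3D11: the driver chooses memory and tracks lifetime and residency.
ID3D11Buffer* MakeBuffer11(ID3D11Device* dev, UINT size)
{
    D3D11_BUFFER_DESC bd = {};
    bd.ByteWidth = size;
    bd.Usage = D3D11_USAGE_DEFAULT;
    bd.BindFlags = D3D11_BIND_VERTEX_BUFFER;
    ID3D11Buffer* buf = nullptr;
    dev->CreateBuffer(&bd, nullptr, &buf);
    return buf;
}

// D3D12: the application picks the heap type and must keep the resource
// alive until the GPU has finished with it, the console-style model.
ID3D12Resource* MakeBuffer12(ID3D12Device* dev, UINT64 size)
{
    D3D12_HEAP_PROPERTIES heap = {};
    heap.Type = D3D12_HEAP_TYPE_DEFAULT;   // GPU-local memory

    D3D12_RESOURCE_DESC rd = {};
    rd.Dimension = D3D12_RESOURCE_DIMENSION_BUFFER;
    rd.Width = size;
    rd.Height = 1;
    rd.DepthOrArraySize = 1;
    rd.MipLevels = 1;
    rd.Format = DXGI_FORMAT_UNKNOWN;
    rd.SampleDesc.Count = 1;
    rd.Layout = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

    ID3D12Resource* res = nullptr;
    dev->CreateCommittedResource(&heap, D3D12_HEAP_FLAG_NONE, &rd,
        D3D12_RESOURCE_STATE_COMMON, nullptr, IID_PPV_ARGS(&res));
    return res;
}
```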
"But interestingly that doesn’t mean that it is needed by everyone. The extra level of control brings with it a certain amount of extra complexity – and that means that some developers might be reluctant to move to DirectX 12 – indeed, a game which is not really limited by GPU horsepower, and which isn’t bottlenecked by just one CPU thread isn’t usually going to gain much from moving to DirectX 12.
"In those cases DirectX 11 or DirectX 9 likely represents a perfectly acceptable way to design titles. But titles which are highly graphically ambitious almost invariably benefit from DirectX 12.“
In short, game developers have decided that DX12 isn’t worth the extra time and effort on all but the most demanding titles at this point. This despite the general consensus that DX12 is better on many titles over DX11, especially for those with true DX12 empowered video cards.
http://www.techradar.com/news/the-forgotten-api-just-what-is-going-on-with-dx12
I think the big problem is more that with a low-level API, it becomes very easy to shoot yourself in the foot. This is particularly true when you're dealing with manual memory management. Unless you have a lot of resources to test on many different platforms and fix problems (usually nasty, hard-to-figure-out ones involving drivers), you're probably going to end up actually losing performance compared to DX11, as well as ending up with severe problems on a couple of platforms. This isn't even mentioning API complexity issues.
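One concrete example of that foot-gun, as a sketch rather than production code: resource state transitions. The function below assumes a texture that was just rendered to and is about to be sampled.

```cpp
#include <d3d12.h>

// In DX11 the driver tracked this read-after-write hazard automatically.
// In DX12 the application must transition the state itself; omitting the
// barrier below may happen to work on one GPU/driver and corrupt or hang
// on another, exactly the hard-to-reproduce class of bug described above.
void MakeTextureReadable(ID3D12GraphicsCommandList* cmdList,
                         ID3D12Resource* renderTarget)
{
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
    barrier.Transition.pResource   = renderTarget;
    barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
    barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
    barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
    cmdList->ResourceBarrier(1, &barrier);
    // Only now is it safe to sample renderTarget from a pixel shader.
}
```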
In addition, Huddy does not see DirectX 12 as useful in every case. The possibilities DirectX 12 offers a developer, who gains more control over the hardware through low-level access, increase complexity considerably. According to him, that alone is enough to explain developers' hesitance. Moreover, there is little point in taking on DirectX 12 if it is unnecessary from a performance standpoint.

Huddy therefore recommends that all developers who do not need the performance gains of DirectX 12 continue to work with DirectX 9 or 11. Graphically demanding games, according to Huddy, always benefit, but that presupposes the developers have mastered DirectX 12. Practice has recently looked quite different, with DirectX 12 causing more problems than it brought advantages.
It seems to me that developers who lack the necessary expertise or incentive to use DX12 will tend to use off-the-shelf graphics engines anyway—CryEngine, Unity, Unreal Engine, Frostbite, etc.—which now come with Vulkan and/or DX12 support.
The huge problem is new hardware. The development of a AAA title takes at least as long as it takes the IHVs to bring out a new generation of hardware, which might bring new capabilities and might require different programming solutions. In DX11 it is mostly up to the IHV's driver team to handle this; in DX12 the game developer is forced to make those changes, which means after-launch support becomes much more of an effort. And if consoles also go to a three-year refresh cycle, you will have to adjust already-released games to the new console hardware. That is a much smaller task under DX11 when you look at the game developer's workload.

Low-level APIs were great when hardware refresh cycles were slower and when developing a game took much less time and fewer resources. If you start with GCN 1.1 and Maxwell, those are the first cards that need to work with your DX12 application; then you add anything sold today, anything coming out this year, anything coming out in 2019, plus consoles. That is a lot of work. With DX11 it is not so much your problem.
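For a flavor of what that per-hardware workload looks like, here is a minimal sketch: under DX12 the application itself queries and branches on capabilities that a DX11 driver would have handled internally. The branching shown is illustrative, not a complete allocator.

```cpp
#include <d3d12.h>

// Under DX12 the application, not the driver, branches on what each
// hardware generation can do -- one piece of the post-launch workload
// described above.
void ConfigureForHardware(ID3D12Device* dev)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    dev->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                             &opts, sizeof(opts));

    if (opts.ResourceHeapTier == D3D12_RESOURCE_HEAP_TIER_1) {
        // Older GPUs: buffers, textures and render targets must live in
        // separate heaps, so the memory allocator needs a different path.
    }
    // New GPU generations can add tiers and caps here after the game has
    // shipped, and it is the game, not the driver, that must adapt.
}
```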
Certainly bodes well for AMD, both for CPU and GPU. Performance at 6 cores / 6 threads was noticeably higher than at 4 cores / 8 threads.

Sniper Elite 4 is perhaps the best DX12 implementation to date; all video cards receive a performance boost from DX12, even Kepler-based GPUs. Fury cards receive the largest boost, but that's because they have very bad DX11 performance to begin with. Also, all GPUs receive a moderate boost from Asynchronous Compute, except GCN1, Kepler and Maxwell GPUs.
http://www.pcgameshardware.de/Sniper-Elite-4-Spiel-56795/Tests/Direct-X-12-Benchmark-1220545/
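For anyone wondering what Asynchronous Compute means at the API level, a minimal D3D12 sketch (assuming a valid device): a second command queue of type COMPUTE whose work the GPU may overlap with the graphics queue. Whether it actually overlaps is up to the hardware, which is why some architectures gain from it in the results above and others do not.

```cpp
#include <d3d12.h>

// A second queue of type COMPUTE: command lists submitted here may run
// concurrently with the direct (graphics) queue; fences synchronize the
// two wherever they share resources.
ID3D12CommandQueue* MakeComputeQueue(ID3D12Device* dev)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ID3D12CommandQueue* queue = nullptr;
    dev->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue;
}
```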
@CarstenS any chance these tests could be re-run, at least for GCN1, with the new Radeon Software 17.2.1 drivers? Apparently those drivers re-enable async compute on GCN1: https://forum.beyond3d.com/posts/1965805/