No DX12 Software is Suitable for Benchmarking *spawn*

Discussion in 'Architecture and Products' started by trinibwoy, Jun 3, 2016.

  1. ieldra

    Newcomer

    Joined:
    Feb 27, 2016
    Messages:
    149
    Likes Received:
    116
    ComputerBase posted a benchmark review. There are gains at low resolution / high fps for both AMD and NV.

    At high resolution, where performance-critical code on the GPU matters more, DX11 takes back the lead for NV.
     
  2. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,791
    Likes Received:
    2,602
    DX12 increases VRAM usage in all the DX12 games I have: Tomb Raider, Deus Ex MD, The Division, etc. The increase ranges from 300MB to a full 1GB.

    In the case of The Division, I registered a 500MB increase in the same area under DX12.
    They seem to be oblivious to the fact that visual settings are bugged under DX12.
     
    pharma and Razor1 like this.
  3. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,791
    Likes Received:
    2,602
    Just tried The Division at the Ultra preset (it's a step down from maxed settings), which sacrifices Shadows (in the form of PCSS), Reflections, Object Detail, Post AA, and Ambient Occlusion (it runs at High instead of Ultra, or HBAO+). DX12 was about 2 fps faster than DX11 in the internal benchmark and in an actual gameplay walkthrough.

    It seems we have another Rise of the Tomb Raider situation on our hands, where DX12 offers worse image-quality options than DX11 (no VXAO in DX12). For now The Division's DX12 path has no HBAO+, no PCSS, and for NV users no HFTS! We'll see if that remains the case down the line, but I'm not holding my breath.


    EDIT: GTX 1080 DX11 vs DX12 at Ultra: DX12 provides a 5 fps boost during gameplay!
     
    #823 DavidGraham, Dec 17, 2016
    Last edited: Dec 27, 2016
  4. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,791
    Likes Received:
    2,602
    Why don’t developers love DX12?

    "But interestingly that doesn’t mean that it is needed by everyone. The extra level of control brings with it a certain amount of extra complexity – and that means that some developers might be reluctant to move to DirectX 12 – indeed, a game which is not really limited by GPU horsepower, and which isn’t bottlenecked by just one CPU thread isn’t usually going to gain much from moving to DirectX 12.

    "In those cases DirectX 11 or DirectX 9 likely represents a perfectly acceptable way to design titles. But titles which are highly graphically ambitious almost invariably benefit from DirectX 12.“

    In short, game developers have decided that DX12 isn't worth the extra time and effort on all but the most demanding titles at this point. This is despite the general consensus that DX12 performs better than DX11 in many titles, especially on truly DX12-capable video cards.

    http://www.techradar.com/news/the-forgotten-api-just-what-is-going-on-with-dx12
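
    The "just one CPU thread" bottleneck the article refers to is mostly draw-call submission. Under DX12 an engine can record command lists on several worker threads and submit them in one go; here is a rough sketch of the idea (the function names and the per-thread split are illustrative only, not from the article):

    Code:
        #include <d3d12.h>
        #include <thread>
        #include <vector>

        // Each worker records its own command list (one allocator per thread),
        // which is the part D3D11 could not parallelize well.
        void RecordChunk(ID3D12GraphicsCommandList* cl /*, draw range ... */)
        {
            // ... record the draws for this chunk ...
            cl->Close();
        }

        void SubmitFrame(ID3D12CommandQueue* queue,
                         std::vector<ID3D12GraphicsCommandList*>& lists)
        {
            std::vector<std::thread> workers;
            for (auto* cl : lists)
                workers.emplace_back(RecordChunk, cl);
            for (auto& w : workers)
                w.join();

            // One cheap submission on the main thread.
            std::vector<ID3D12CommandList*> raw(lists.begin(), lists.end());
            queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
        }

    DX11's deferred contexts tried to offer something similar, but most of the submission cost still ended up on the driver's single immediate context, which is why games that aren't CPU-submission-bound see little gain from DX12.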
     
    Razor1 and pharma like this.
  5. keldor

    Newcomer

    Joined:
    Dec 22, 2011
    Messages:
    74
    Likes Received:
    107
    I think the big problem is more that with a low-level API, it becomes very easy to shoot yourself in the foot. This is particularly true when you're dealing with manual memory management. This means that unless you have a lot of resources to test on many different platforms and fix problems (usually nasty, hard-to-figure-out ones involving drivers), you're probably going to end up actually losing performance compared to DX11, as well as ending up with severe problems on a couple of platforms. And that isn't even mentioning API complexity issues.
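
    As a rough illustration of the memory-management foot-gun (a minimal sketch with hypothetical names, assuming a D3D12 renderer that recycles upload buffers):

    Code:
        #include <windows.h>
        #include <d3d12.h>

        // Under D3D12 the app must prove the GPU has finished with a buffer before
        // recycling its memory; under D3D11 the driver tracked this for you.
        void RecycleUploadBuffer(ID3D12CommandQueue* queue, ID3D12Fence* fence,
                                 UINT64 submittedFenceValue, HANDLE fenceEvent)
        {
            queue->Signal(fence, submittedFenceValue);   // mark the last submission
            if (fence->GetCompletedValue() < submittedFenceValue)
            {
                // Skipping this wait and overwriting the buffer anyway is the
                // classic foot-gun: it "works" with one driver's timing and
                // corrupts data or crashes with another.
                fence->SetEventOnCompletion(submittedFenceValue, fenceEvent);
                WaitForSingleObject(fenceEvent, INFINITE);
            }
            // Now the upload memory can safely be overwritten or released.
        }

    A real renderer would track fence values per frame instead of blocking, but the point is that all of this bookkeeping is now the application's job.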
     
    DavidGraham likes this.
  6. Putas

    Regular Newcomer

    Joined:
    Nov 7, 2004
    Messages:
    392
    Likes Received:
    59
    These articles are just stating the obvious; we knew DX11 was going to coexist alongside DX12.
     
  7. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,288
    Location:
    Helsinki, Finland
    Everybody seems to be missing the obvious answer: you still need to support DX11 on PC. There are too many Windows 7 and Windows 8 customers out there. DX12 doesn't allow you to cut DX11 support. With DX11 + DX12, you have to support and maintain two versions of the game (including two sets of IHV driver bugs). Once DX11 becomes obsolete (= Windows 7 is no longer popular), DX12 or Vulkan will become the preferred API for AAA cross-platform development, since these APIs match the consoles much more closely in resource management and feature set. Having to support DX11 on PC is already causing suboptimal design choices on consoles (similarly to how DX9 on consoles slowed down the compute shader transition on PC).

    Right now it seems that Vulkan will have the upper hand in the near future, since it supports Windows 7 and 8. Most AAA games have already dropped support for Fermi (GeForce 480) and Terascale 3 (Radeon 6970), so Vulkan's minimum requirements should no longer be a problem for AAA games. It could be the sole API on PC. I'd guess that the biggest problem is the shaders written in HLSL: big AAA games/engines have huge amounts of HLSL code. But tools to solve that issue are already in pretty good shape. It will be interesting to see how this pans out. Will we have another Vista & DX10 situation (most developers just waiting for the OS problem to solve itself), or will we see more developers jumping on the Vulkan bandwagon?
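
    To put the "two versions of the game" point in concrete terms, here's a minimal sketch of the kind of rendering abstraction an engine ends up carrying while DX11 stays in the support matrix (all names here are hypothetical, not from any particular engine):

    Code:
        #include <cstdint>
        #include <memory>

        struct BufferDesc { uint32_t sizeBytes; bool gpuWritable; };

        // Engine-facing interface: every feature has to be expressible on both
        // backends, so DX11 limitations leak into the DX12/console path too.
        class GfxDevice {
        public:
            virtual ~GfxDevice() = default;
            virtual void* createBuffer(const BufferDesc& desc) = 0;
            virtual void  submitFrame() = 0;
        };

        // Two implementations: two sets of engine bugs and two sets of IHV
        // driver bugs to chase for as long as Windows 7/8 users matter.
        class D3D11Device final : public GfxDevice {  // driver manages memory/hazards
        public:
            void* createBuffer(const BufferDesc&) override { return nullptr; }
            void  submitFrame() override {}
        };

        class D3D12Device final : public GfxDevice {  // app manages heaps/fences/barriers
        public:
            void* createBuffer(const BufferDesc&) override { return nullptr; }
            void  submitFrame() override {}
        };

        std::unique_ptr<GfxDevice> createDevice(bool dx12Available) {
            if (dx12Available) return std::make_unique<D3D12Device>();
            return std::make_unique<D3D11Device>();
        }

    Every rendering feature then has to be implemented, debugged and worked around twice until the DX11 path can finally be dropped.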
     
    Razor1, Alexko, Rodéric and 4 others like this.
  8. seahawk

    Regular

    Joined:
    May 18, 2004
    Messages:
    511
    Likes Received:
    141
    That would mean MS would allow Vulkan on Xbox.
     
  9. troyan

    Newcomer

    Joined:
    Sep 1, 2015
    Messages:
    120
    Likes Received:
    181
    In one of the Q&A sessions about Vega's memory configuration a AMD employee said the same thing:
     
  10. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,933
    Likes Received:
    1,629
    http://www.pcgameshardware.de/DirectX-12-Software-255525/News/ist-nicht-immer-sinnvoll-1220210/
     
  11. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,496
    Likes Received:
    910
    It seems to me that developers who lack the necessary expertise or incentive to use DX12 will tend to use off-the-shelf graphics engines anyway—CryEngine, Unity3D, Unreal Engine, Frostbite, etc.—which now come with Vulkan and/or DX12 support.
     
  12. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    Their LLAPI support still requires expertise to use; it isn't just plug-and-play, "here ya go" :), because not everyone uses the same shaders. Better to stay away from LLAPIs if the developers aren't comfortable with them.
     
    pharma likes this.
  13. seahawk

    Regular

    Joined:
    May 18, 2004
    Messages:
    511
    Likes Received:
    141
    The huge problem is new hardware. The development of a AAA title takes at least as long as it takes the OEMs to bring out a new generation of hardware, which might bring new capabilities and might require different programming solutions. In DX11 it is mostly up to the OEM's driver team to handle this. In DX12 the game developer is forced to make those changes, which means post-launch support becomes much more of an effort. And if consoles also move to a 3-year refresh cycle, you will need to adjust already-released games to the new console hardware. That is a much smaller task on DX11 when you look at the workload of the game developer.

    LLAPIs were great when hardware refresh cycles were slower and when the development of a game took much less time and resources. If you start with GCN 1.1 and Maxwell, those are the first cards that need to work with your DX12 application; then you add anything sold today, anything coming out this year, plus anything coming out in 2019, plus consoles. That is a lot of work. With DX11 it is not so much your problem.
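
    A rough sketch of what "the game developer is forced to make those changes" looks like in practice: under D3D12 the per-architecture branching lives in application code, whereas under D3D11 most of these decisions stayed inside the IHV driver. The branch contents below are hypothetical; only the feature query is real API.

    Code:
        #include <d3d12.h>

        void ChooseRenderPath(ID3D12Device* device)
        {
            D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
            device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                        &options, sizeof(options));

            if (options.ResourceBindingTier >= D3D12_RESOURCE_BINDING_TIER_3)
            {
                // e.g. a bindless-style descriptor path tuned for newer GPUs
            }
            else
            {
                // fallback path; every architecture released after the game
                // ships may need another branch, re-tuning and re-testing here.
            }
        }

    Which tiers you branch on and what each path does is entirely up to the game, and that is the after-launch maintenance burden being described.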
     
    pharma, DavidGraham and Razor1 like this.
  14. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    And that is why the market has diverged instead of coming together. There is no easy way around it; the work just has to be done.
     
  15. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,791
    Likes Received:
    2,602
  16. Infinisearch

    Veteran Regular

    Joined:
    Jul 22, 2004
    Messages:
    739
    Likes Received:
    139
    Location:
    USA
    That's nice to see... I wonder if there will be any papers on their implementation.
     
    #836 Infinisearch, Feb 14, 2017
    Last edited: Feb 14, 2017
  17. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    7,045
    Likes Received:
    3,114
    Location:
    Pennsylvania
    Certainly bodes well for AMD, both for CPU and GPU. Performance at 6 cores / 6 threads was noticeably higher than at 4 cores / 8 threads.
     
  18. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,186
    Likes Received:
    1,841
    Location:
    Finland
    @CarstenS any chance these tests could be re-run, at least for GCN1, with the new Radeon Software 17.2.1 drivers? Apparently those drivers re-enable async compute on GCN1: https://forum.beyond3d.com/posts/1965805/
     
  19. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,933
    Likes Received:
    1,629
    #839 pharma, Feb 15, 2017
    Last edited: Feb 15, 2017
    Lightman likes this.
  20. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,001
    Likes Received:
    4,574
    Impressive showing by the Fiji cards in Sniper Elite 4.
    And once again the multi-GPU performance scaling in DirectX 12 is nothing short of impressive, just as I've been saying for months given my personal experience in ROTR, Deus Ex Mankind Divided and AotS.

    AMD is claiming up to 100% mGPU scaling, and just look at these results:

    [image: Sniper Elite 4 multi-GPU scaling results]


    The R9 290 can be found for less than €150 on eBay. Who would have thought a €300 mGPU combo could ever match a recent €650 single card at 4K?
     