No DX12 Software is Suitable for Benchmarking *spawn*

Discussion in 'Architecture and Products' started by trinibwoy, Jun 3, 2016.

  1. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Judging from current and past trends, I second that. Now that BF1 is out, this is the last DX12 title for 2016 (maybe even for the first half of 2017?). The year is done; it is time to look back and compare how DX12 stands against DX11 and DX10.

    DX12 was launched July 2015; 18 months later (December 2016) we will have 13 titles for it, with no IQ enhancements in sight, shaky performance gains, and fps drops on much hardware for no reason at all.

    Compared to DX10, launched January 2007: 18 months later (till August 2008), 16 titles sported it, most with image quality enhancements (softer, better shadows, better AA support, better post processing) and performance drops as a result of them. They were limited though, and left much to be desired. DX10 was retired prematurely.

    Age of Conan: Unchained
    Assassin's Creed
    BioShock
    Call of Juarez
    Company of Heroes
    Crysis
    Devil May Cry 4
    Gears of War
    Hellgate: London
    Lost Planet: Extreme Condition
    World in Conflict
    Microsoft Flight Simulator X
    The Lord of the Rings Online
    Universe at War: Earth Assault
    Halo 2
    Fury

    Compared to DX11, launched October 2009: 18 months later (till March 2011), 18 titles sported it, most with visual enhancements as well, and performance costs according to those visual effects. DX11 grew in popularity and became the de facto API to this day.

    BattleForge
    Colin McRae: Dirt 2
    S.T.A.L.K.E.R.: Call of Pripyat
    The Lord of the Rings Online
    Aliens vs. Predator
    Battlefield: Bad Company 2
    Metro 2033
    Civilization V
    F1 2010
    Lost Planet 2
    Medal of Honor
    Tom Clancy's H.A.W.X 2
    Dungeons & Dragons Online
    Dragon Age II
    Homefront
    Total War: Shogun 2
    Crysis 2

    So, DX12 uptake is slower than even DX10's, despite enjoying a much bigger install base of GPUs and arriving at a time when a lot of games, indie and otherwise, are being made. It is also distinctly lacking in the visual enhancements department.
     
  2. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    You left RE5 off the DX10 list. It didn't really do much of anything with it beyond running slightly faster, for reasons no one ever understood. It was patched out for some oddball reason when the game's gold version came to Steam last year.
    There's also Saints Row 3, which has, IIRC, settings for DX10 and DX11 (or just DX10 and up), and it looks the same on DX10.

    Not sure what you'd call that one.
     
  3. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Yep, however both RE5 and SR3 were released well after the first 18 months of the DX10 era. I only count games from that period.
     
  4. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    Gotcha.
     
  5. MDolenc

    Regular

    Joined:
    May 26, 2002
    Messages:
    696
    Likes Received:
    446
    Location:
    Slovenia
    There's a reason why feature levels 12_0 and 12_1 were backported to DX11. And there is at least one game out there using 12_1 features even though it runs on a custom IHV hack of DX11.
     
  6. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,080
    Likes Received:
    997
    Location:
    Planet Earth.
    The problem is that big studios have existing engines, some of which are still D3D9-based (even though they use D3D11, they are not structured to benefit from it). There's terrible inertia: a lack of trust that employees can write new engines, and fear that it will turn out badly, cost a lot, and provide little...
    Basically, game companies don't innovate at any level anymore, except for rare exceptions and a few people empowered to take risks because of past success.
    Sad but true.
    There's a reason why a number of highly skilled people went indy, or left the industry...
     
  7. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    I will make it easy: if you see a game that has both DX11 and DX12 modes, then it is not a DX12 title... do you think they backport DX12 titles to DX11, or vice versa?

    These days most studios already don't have the time to finish the DX11 version; it's a miracle to see them ship DX12 too...

    Battlefield 1 runs extremely well (whatever the DX version): ~90 fps average at 1440p on a Fury X or 980 Ti from last generation, and 50+ fps average at 4K (ultra). As with every release there will be some bugs, but compare this with Mafia 3... well, you know what I mean.
     
    #727 lanek, Oct 19, 2016
    Last edited: Oct 19, 2016
  8. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,296
    Location:
    Helsinki, Finland
    There was no reason to innovate. DX11 doesn't have anything radically new. Rendering is still mostly vertex + pixel shaders. Tessellation was a dud. Lack of multidraw in DX11 (compared to OpenGL) limited the usage of compute shaders to post process effects and lighting. Yes, we have now more efficient lighting and post processing, etc but scene setup and culling are still mostly done by CPU. Some engines hacked around the DX11 limitations, but hacks have downsides. You can't expect big general purpose engines to choose narrow "hacky" rendering techniques to avoid DX11 limitations.

    DX12 has bindless resources, tiled resources and ExecuteIndirect. Tiled resources were already in DX11.2, but that feature was limited to Windows 8, so no sane developer built their resource management around tiled resources. Not being able to sell your game to Windows 7 customers is a huge deal breaker. DirectX 12 centric engines have the same problem. You can't easily emulate bindless resources, tiled resources and/or ExecuteIndirect on DirectX 11.1 (Windows 7). If you want to design an engine built on top of these features, games using that engine will be limited to consoles + Windows 10. We need to wait at least one more year before we get AAA games that get the most out of DirectX 12. The Windows 10 adoption rate isn't yet high enough for big AAA devs to drop Windows 7 support. This is exactly the same situation we had with DirectX 10 + Windows XP, except that this time consoles support DX12 -> console devs would be eager to support DirectX 12 on PC.
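    To make the ExecuteIndirect point concrete, here is a minimal CPU-side sketch of the idea behind GPU-driven submission. `DrawIndexedArgs` mirrors the layout of `D3D12_DRAW_INDEXED_ARGUMENTS` from d3d12.h (redefined here only so the snippet compiles anywhere), and `buildArgBuffer` is a hypothetical stand-in for what a GPU culling compute shader would do when appending draw records into a UAV buffer:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Mirror of D3D12_DRAW_INDEXED_ARGUMENTS (d3d12.h); redefined here so the
// sketch is self-contained. Field order and sizes match the real struct.
struct DrawIndexedArgs {
    uint32_t IndexCountPerInstance;
    uint32_t InstanceCount;
    uint32_t StartIndexLocation;
    int32_t  BaseVertexLocation;
    uint32_t StartInstanceLocation;
};
static_assert(sizeof(DrawIndexedArgs) == 20, "must match the d3d12.h layout");

// Hypothetical stand-in: a GPU culling shader would append one record per
// visible mesh into a UAV buffer with this layout; here we emulate it on CPU.
std::vector<DrawIndexedArgs> buildArgBuffer(const std::vector<bool>& visible,
                                            uint32_t indexCountPerMesh) {
    std::vector<DrawIndexedArgs> args;
    for (uint32_t i = 0; i < visible.size(); ++i) {
        if (!visible[i]) continue;            // culled: no draw record emitted
        args.push_back({indexCountPerMesh, 1, 0, 0, i});
    }
    // Upload, then consume with one ExecuteIndirect(cmdSig, count, buf, ...)
    return args;
}
```

    Without multidraw, DX11 forces the CPU to issue one DrawIndexedInstanced call per surviving mesh; with ExecuteIndirect the whole buffer is consumed in a single API call, which is why the feature matters for scene setup and culling, not just post processing.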

    I would have liked to see DirectX 12 on Windows 7. That would have increased the DX12 adoption rate a lot and allowed developers to write pure DX12-centric renderers. Right now DirectX 12 is used mostly to buy some extra CPU cycles (mostly helping low-end CPUs). This is similar to DirectX 11 compute shader adoption in last-gen games: compute shaders were only used for additional PC-specific high-end effects, while 99% of the pipeline was reused from the consoles (DX9-based code & design).
     
  9. Rikimaru

    Veteran

    Joined:
    Mar 18, 2015
    Messages:
    1,060
    Likes Received:
    426
    Vulkan is on Windows 7.
    It would be better to drop DX12 for a platform agnostic alternative.
     
  10. pMax

    Regular

    Joined:
    May 14, 2013
    Messages:
    327
    Likes Received:
    22
    Location:
    out of the games
    Yeah, that is a curious question. In my limited experience, big-a** game engines have a heavy frontend with some #ifdefs and a generic backend layer where they implement the actual renderer (i.e. DX, OGL, PS).
    While I understand the cost of adding one is not small, I don't see why they wouldn't add a Vulkan layer, which would seamlessly cover the whole Windows install base (the shader language is somewhat easily convertible already).
     
  11. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,296
    Location:
    Helsinki, Finland
    Vulkan is great for PC and mobile, but console shaders aren't GLSL. Nobody wants to maintain two copies of all of their shaders. There are some GLSL <-> HLSL translators, but none with the full SM 5.1 feature set. It would be optimal if everybody used SPIR-V as their intermediate code, but Microsoft has already announced that they will release their own new intermediate code format (to replace DX ASM) along with SM 6.0. Luckily both the new format and the tool chain are open source. It shouldn't be that hard to plug a SPIR-V output somewhere into the tool chain to output Vulkan-compatible code for PC and mobile.
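    As a concrete footnote on living with both intermediate formats: they are easy to tell apart at load time by their well-known magic numbers (SPIR-V modules begin with the word 0x07230203; Microsoft's shader containers, legacy DXBC and the DXIL-carrying ones alike, begin with the fourcc 'DXBC'). A loader-side sketch, where the enum and function names are mine:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

enum class ShaderBlobKind { SpirV, DxContainer, Unknown };

// Identify a shader blob by its leading magic word. SPIR-V may be stored in
// either endianness, so we also accept the byte-swapped magic.
ShaderBlobKind identifyShaderBlob(const uint8_t* data, size_t size) {
    if (size < 4) return ShaderBlobKind::Unknown;
    uint32_t magic = uint32_t(data[0]) | uint32_t(data[1]) << 8 |
                     uint32_t(data[2]) << 16 | uint32_t(data[3]) << 24;
    if (magic == 0x07230203u || magic == 0x03022307u)
        return ShaderBlobKind::SpirV;
    if (magic == 0x43425844u)                 // fourcc "DXBC"
        return ShaderBlobKind::DxContainer;
    return ShaderBlobKind::Unknown;
}
```

    A cross-platform engine that compiles HLSL once and emits both targets still needs a check like this at the mouth of its pipeline cache.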
     
  12. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    I'm fairly sure we'll find a translator on AMD's GPUOpen really soon. They seem really prompt about providing libraries and tools when needed.
     
    #732 lanek, Oct 19, 2016
    Last edited: Oct 19, 2016
  13. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,080
    Likes Received:
    997
    Location:
    Planet Earth.
    I failed to precisely and concisely express myself, hopefully that doesn't happen when programming ;p

    I should have said that during the D3D9 era engines were generally/mostly/most often designed around the API, not the hardware, and it's only with D3D10/11 that engines were/could be designed around (more capable) hardware whose architecture documentation was (more readily) available.

    An engine designed around the D3D9 API (or equivalent OpenGL version) is not a good start to make a simple, fast, efficient and elegant engine for today's GPU.

    (Although I failed to mention it, when I say D3D12 I usually mean recent lower level API, that is D3D12/Vulkan. The latter working on plenty of OS and devices, is a viable solution to target "latest" hardware without arbitrary market limitations.)
     
  14. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Yeah, The Division has HFTS (Hybrid Frustum Traced Shadows), which uses Conservative Rasterization (only available on NV GPUs there) to calculate the effect. It's pretty neat.
     
    #734 DavidGraham, Oct 19, 2016
    Last edited: Oct 19, 2016
  15. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,296
    Location:
    Helsinki, Finland
    Why not on Intel GPUs? Skylake is FL 12_1. Has everything that Pascal/Maxwell has and more. Skylake's resource binding is tier 3 (NV = 2) and conservative raster is tier 3 (NV = 2).
     
  16. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Yeah, Skylake has the highest tiers of all DX12 GPUs; however, HFTS worked through a hack that enabled it to run through the DX11 API, and as such it only worked on NV GPUs.
     
    sebbbi likes this.
  17. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    More BF1 tests from computerbase:
    https://www.computerbase.de/2016-10...ramm-battlefield-1-auf-dem-i7-6700k-2560-1440

    [benchmark chart]

    When CPU limited the picture is reversed somewhat:
    [benchmark chart]
    However, the site joins pcgameshardware and sweclockers in noting how horrendous the DX12 frame times are, on both NV and AMD, especially in multiplayer.

    https://www.computerbase.de/2016-10...s-auf-dem-fx-8370-radeon-rx-480-einzelspieler
     
    ieldra likes this.
  18. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,296
    Location:
    Helsinki, Finland
    DirectX 11.3 supports conservative raster (no hacks needed). However 11.3 is also limited to Windows 10. I don't know why.

    So Nvidia has some DX 11.0 hack API that allows conservative raster (on Windows 7)? I wasn't aware of that. Both AMD and Nvidia have DX 11.0 hack APIs for multidraw and UAV overlap. It starts to feel like DX9 all over again: so many IHV-specific API hacks to bring DX10 features to Windows XP. Now the same is true for Windows 7 and DX12 :)
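    For anyone unfamiliar with what conservative rasterization actually changes: a standard rasterizer covers a pixel only when the triangle covers the pixel's center sample, while an overestimating conservative rasterizer covers every pixel the triangle touches at all. The usual trick is to evaluate each edge function at the pixel-square corner where that function is largest. A software sketch of both tests (CCW winding assumed; all names are mine, and this is an illustration of the concept, not of how the hardware implements it):

```cpp
#include <cassert>

// Edge function E(x,y) = a*x + b*y + c; E >= 0 on the interior side of the
// edge for a counter-clockwise triangle.
struct Edge { double a, b, c; };

Edge makeEdge(double ax, double ay, double bx, double by) {
    double a = ay - by, b = bx - ax;       // left-hand normal of segment A->B
    return {a, b, -(a * ax + b * ay)};
}

// Standard rasterization: sample the center of the 1x1 pixel cell.
bool centerCovered(const Edge e[3], int px, int py) {
    double x = px + 0.5, y = py + 0.5;
    for (int i = 0; i < 3; ++i)
        if (e[i].a * x + e[i].b * y + e[i].c < 0) return false;
    return true;
}

// Overestimating conservative rasterization: evaluate each edge at the cell
// corner that maximizes it, so any cell the triangle touches is covered.
bool conservativeCovered(const Edge e[3], int px, int py) {
    for (int i = 0; i < 3; ++i) {
        double x = px + (e[i].a > 0 ? 1.0 : 0.0);
        double y = py + (e[i].b > 0 ? 1.0 : 0.0);
        if (e[i].a * x + e[i].b * y + e[i].c < 0) return false;
    }
    return true;
}
```

    Techniques like HFTS depend on the overestimate: a primitive must be registered in every cell it touches, not only in cells whose center sample it happens to cover.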
     
  19. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Yeah, it works on Windows 7 indeed.
     
    pharma likes this.
  20. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    More BF1 tests showing the usual pattern: NV is faster in DX11 and loses fps in DX12, while AMD gains some fps in DX12:

    [benchmark chart]
    http://www.hardwareunboxed.com/battlefield-1-benchmarks-20-gpus-tested-at-1080p-1440p-4k/

    [benchmark chart]
    [benchmark chart]


    http://www.gamersnexus.net/game-bench/2652-battlefield-1-graphics-card-benchmark-dx11-vs-dx12

    The benchmark.pl test is an outlier, showing NV with similar fps to AMD in DX12:
    http://www.benchmark.pl/testy_i_rec...wydajnosci-kart-graficznych/strona/26943.html
     
    Lightman likes this.
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.