No DX12 Software is Suitable for Benchmarking *spawn*

Discussion in 'Architecture and Products' started by trinibwoy, Jun 3, 2016.

  1. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Alpha Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    11,052
    Likes Received:
    6,733
    Location:
    Cleveland
Those pictures are absolutely hard to read on the B3D Dark theme, what with the black-on-black text.
     
  2. Alessio1989

    Regular Newcomer

    Joined:
    Jun 6, 2015
    Messages:
    554
    Likes Received:
    264
    This is why I do not play games on NVIDIA GPUs.
     
  3. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    843
Probably true for a lot of those early DX12 games, which, let's be honest, AMD was more proactively engaged with developers on.
It does raise the question of which games should be considered good DX12 implementations, perhaps Gears of War 4 and Sniper Elite 4, though neither of those is perfect either.

The Battlefield series on DICE's engine is another that seems better suited to AMD, but it would be interesting to see how it performs on a Titan V, as some unoptimised games (Battlefield V is probably not that bad) run notably better on it than on Pascal, going by the trend so far.

Just to add, separately:
All of this is further compounded by the scene, the game options, and even things like exclusive fullscreen/UWP/etc., as well as whether it is on a Ryzen or Intel platform.
My biggest gripe is that publications these days really should test on both CPU platforms and not just one, because we sometimes see quirky behaviour such as what happened with RoTR, and it would be fairer for consumers as it gives a more complete picture of the route they want to take when purchasing a complete PC solution.
     
  4. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,427
    Likes Received:
    1,812
The Frostbite engine has had an obnoxious DX12 implementation in all of its games so far: BF1, SWBF2, and FIFA 18. The situation is the same in BFV too. Deus Ex still has problems to this day; DX11 is faster and more reliable on all GPUs, at least with good CPUs.

Probably the best implementations so far are Hitman and Sniper Elite 4; both have DX12 boosting fps on all GPUs. The Forza series and Gears 4 come a close second. But those games are DX12-only, so there is no way to know whether DX11 would have benefited them or not.

That way you end up more bottlenecked in DX11 games, as AMD's driver overhead is higher there. An NVIDIA GPU works better with low-end CPUs; that was shown multiple times in the tests carried out by Digital Foundry.
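To illustrate the bottleneck point with a toy model (my own sketch, not from any of the benchmarks above; all numbers are made up): frame rate is gated by whichever of the CPU or GPU takes longer per frame, so extra per-frame driver overhead on the CPU side only costs fps once the CPU becomes the longer side.

```python
# Toy frame-time model: fps is limited by the slower of the CPU side
# (game logic + driver overhead) and the GPU side, assuming they overlap.
# All numbers below are illustrative, not measurements.

def fps(cpu_ms, driver_ms, gpu_ms):
    """Frames per second given per-frame CPU, driver, and GPU costs."""
    return 1000.0 / max(cpu_ms + driver_ms, gpu_ms)

# Fast CPU, GPU-bound: a heavier driver changes nothing.
print(fps(cpu_ms=4.0, driver_ms=2.0, gpu_ms=10.0))   # 100.0
print(fps(cpu_ms=4.0, driver_ms=5.0, gpu_ms=10.0))   # 100.0

# Slow CPU, CPU-bound: the heavier driver directly costs fps.
print(fps(cpu_ms=10.0, driver_ms=2.0, gpu_ms=10.0))  # ~83.3
print(fps(cpu_ms=10.0, driver_ms=5.0, gpu_ms=10.0))  # ~66.7
```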
     
  5. Alessio1989

    Regular Newcomer

    Joined:
    Jun 6, 2015
    Messages:
    554
    Likes Received:
    264
I really do not care. I am not going to put a GPU in my gaming PC that requires installing a driver-as-spyware (otherwise you wait too long for the latest driver). People are obsessed with spending thousands of bucks on high-end CPUs (though most of them run low-tier silicon anyway); I am not, and I do not change the basic hardware components every couple of years. The last time I changed MB+CPU+RAM, the system was more than 7 years old. The last gaming GPU I changed was an R9 280 (a 7950 rebrand, if I remember correctly). DX12 helped that old system a lot. I do not care whether that is only due to AMD driver overhead, GCN being unfriendly to monolithic APIs, both, or whatever...

Anyway, I will give Serious Sam Fusion a try with this new system, though both the Vulkan and DX12 implementations should still be in beta (another never-ending beta? The XAudio 2.9 implementation is still beta too, and it doesn't run smoothly)... I am currently running a Ryzen 5 1500X; maybe I will upgrade in the near future for development purposes, since getting 16 threads for a modest sum is really attractive (or maybe I will double the RAM if the price is decent... 16 GB is becoming tight when running multiple virtual machines).
It would also be nice to see how much these Spectre fixes really impact the average Joe's system performance: on my Surface Pro 3 (which runs a 4th-generation Intel Core CPU), the security fix wasn't much appreciated, as I/O has been devastated (maybe it's a ULP SoC thing only... or maybe not), and sadly, more microcode updates for security fixes should be coming :|
     
    #1205 Alessio1989, Jul 6, 2018
    Last edited: Jul 6, 2018
    ToTTenTranz likes this.
  6. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    843
What is interesting is that both of those are Ryzen platforms, and the 1080 Ti's poor performance may be hit by a big DX12 optimisation quirk (similar to, but not necessarily identical to, RoTR), while some of the AMD issues I checked were on Intel platforms where NVIDIA seems to be doing OK. It is too early to be conclusive; ideally publications would use the same settings and test on both Intel and AMD platforms, but unfortunately no one is doing that, and to me that is pretty critical for any game/GPU testing going forward.

Multiple platforms create a distinction, IMO, when it comes to development and optimisation, and this can have a knock-on effect (not always, but occasionally) on the GPUs, especially when DX12 is involved.
     
  7. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,427
    Likes Received:
    1,812
World of Warcraft has finally received the promised DX12 patch. Blizzard notes that DX12 is only for AMD and Intel GPUs; NVIDIA GPUs will stick to DX11 by default until the DX12 renderer is updated for them.

DX12 increased the performance of AMD GPUs by 2-3% at best, while NVIDIA GPUs remained dominant with DX11: a GTX 1080 and a GTX 1060 are 23% faster than a Vega 64 and an RX 580, respectively, at 1080p, and 18% faster at 4K.
    https://www.computerbase.de/2018-07/world-of-warcraft-wow-dx12-directx-12-benchmark/

So what are the benefits of the DX12 patch? Currently, if you have a low-end CPU with a high-end AMD GPU, your fps can improve by ~10%. Though a comparable NVIDIA GPU will still achieve more fps in DX11 with that very same low-end CPU.
    https://www.computerbase.de/2018-07...sor-benchmarks-auf-einer-rx-vega-64-1920-1080
     
    #1207 DavidGraham, Jul 19, 2018
    Last edited: Jul 19, 2018
    pharma, Lightman and Malo like this.
  8. Alessio1989

    Regular Newcomer

    Joined:
    Jun 6, 2015
    Messages:
    554
    Likes Received:
    264
Most of the user base does not update the CPU every 2-3 years, and most of them do not have a 250+ €/$/£ CPU anyway. Most change the main components (CPU+MB) every 5-7 years (a Ryzen 5 1600X cannot be called a "low-end CPU"! Otherwise, what about Celerons and Pentiums?). WoW's DX12 backend may not be perfect, but it could still be appreciated by a good share of the user base.
     
  9. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,633
    Likes Received:
    1,374
    Battlefield V Open Beta: PC performance benchmarks
    https://www.guru3d.com/articles-pag...-graphics-performance-benchmark-review,1.html
     
    Lightman and Alexko like this.
  10. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,390
    Likes Received:
    802
    Interesting that Vega 64 is behind the GTX 1070 at 1080p, but right between the 1080 and 1080 Ti at 4K. I mean, this behaviour is not new, but it's particularly pronounced here. I'll be curious to see where they land once the dust settles, in the final version of the game, after a driver update or two.
     
  11. Magnum_Force

    Newcomer

    Joined:
    Mar 12, 2008
    Messages:
    96
    Likes Received:
    64
    Also look at the Fury vs Vega 56 results at 1080P. The Fury is what... only 15% behind the Vega 56?
     
  12. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,427
    Likes Received:
    1,812
They tested the game in DX12, even though it performs worse than DX11 across all GPUs. Right now DX12 in Frostbite is absolute trash, but DICE is working to fix it for DXR and RTX.
     
  13. BoMbY

    Newcomer

    Joined:
    Aug 31, 2017
    Messages:
    61
    Likes Received:
    24
DICE said they want to get DX12 in BFV working as well as in BF1, which is a huge joke. I really don't get how they can have kept running with that monkey implementation for so long. The Mantle implementation back in BF4 was smooth as butter, yet DX12 has been trash since day one. I really don't get how they can be that incompetent. I thought Mantle was developed in cooperation with Johan Andersson, so he should know how low-level APIs work; doesn't he work for DICE anymore, or was it all AMD who did the work?
     
  14. Malo

    Malo YakTribe.games
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    6,557
    Likes Received:
    2,591
    Location:
    Pennsylvania
    Has anybody ever done DX12 tests on 2C4T CPUs in BF1 multiplayer? Or even 4C4T?
     
  15. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,427
    Likes Received:
    1,812
Right now DX11 is 12% faster than DX12 on both Vega and Pascal (according to Hardware Unboxed). If that holds, the best we can hope for is a tie with DX11!



What would be the point? These games require at least 4 cores anyway, and playing with a high-end GPU also requires a relatively high-end CPU. Even if DX12 only helps low-end CPUs, it would still be a moot point for PC gamers.
     
    #1215 DavidGraham, Sep 6, 2018
    Last edited: Sep 6, 2018
    Heinrich4 and Malo like this.
  16. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,390
    Likes Received:
    802
    DX12 is supposed to enable lower CPU overhead, so it'd be interesting to see whether that materialises. And even though gaming enthusiasts tend to play games on high-end hardware, it would be interesting at least in the context of integrated chips that have a common power budget for the CPU and GPU cores.
     
    DavidGraham likes this.
  17. Svensk Viking

    Regular

    Joined:
    Oct 11, 2009
    Messages:
    491
    Likes Received:
    43
Back when Mantle was released for BF4, people with a Phenom II X4 got a huge performance boost. One person I knew who played Dragon Age: Inquisition on an i5 650 got a minimum of 30 FPS using Mantle, compared to an average of around 20 in DX11.

It's a huge flaw of PC gaming that you're expected to have a high-end CPU if you have a high-end GPU. Someone who got an i5 2500 + GTX 670 back in the day could just pop in a GTX 1080 Ti (even a GTX 1060 would be a good improvement, though still bottlenecked), but the CPU would hold it back. Even people with high-end CPUs can be CPU-limited if they have 144 Hz monitors.
    If proper DX12 support for BFV could be even remotely as good as Mantle was for BF4, it would be awesome for everyone.
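The 144 Hz point is just frame-time arithmetic (my own numbers, not from the post): to sustain the refresh rate, the whole frame, CPU work included, has to fit in roughly 1000/144 ≈ 6.9 ms, a much tighter budget than the 16.7 ms a 60 Hz panel allows.

```python
# Frame-time budgets: to sustain a refresh rate, every frame (CPU work
# included) must finish within 1000 / hz milliseconds.

def frame_budget_ms(hz):
    """Per-frame time budget in milliseconds for a given refresh rate."""
    return 1000.0 / hz

def cpu_capped_fps(cpu_frame_ms):
    """Max fps a given per-frame CPU cost allows, however fast the GPU is."""
    return 1000.0 / cpu_frame_ms

print(round(frame_budget_ms(60), 2))   # 16.67 ms per frame at 60 Hz
print(round(frame_budget_ms(144), 2))  # 6.94 ms per frame at 144 Hz

# A CPU needing 10 ms per frame sustains 60 Hz fine but caps a 144 Hz panel:
print(cpu_capped_fps(10.0))  # 100.0
```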
     
    #1217 Svensk Viking, Sep 6, 2018
    Last edited: Sep 6, 2018
    Kej, ToTTenTranz, entity279 and 3 others like this.
  18. Rootax

    Regular Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    930
    Likes Received:
    426
    Location:
    France

Maybe the game is bandwidth-limited in some scenarios on AMD cards? So the Fury having more raw bandwidth than the Vega 56 helps a little?
     
    keldor likes this.
  19. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,427
    Likes Received:
    1,812
    pharma, Lightman and Heinrich4 like this.
  20. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Alpha Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    11,052
    Likes Received:
    6,733
    Location:
    Cleveland
    Damn, those translucent images are hard to read in the one true B3D theme (Dark).
     
    Garrett Weaving and trinibwoy like this.

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.