No DX12 Software is Suitable for Benchmarking *spawn*

Discussion in 'Architecture and Products' started by trinibwoy, Jun 3, 2016.

  1. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,151
    Likes Received:
    571
    Location:
    France
Ah yes, indeed. Still "sad" for people who have already finished the game.
     
  2. Svensk Viking

    Regular

    Joined:
    Oct 11, 2009
    Messages:
    500
    Likes Received:
    55
It's a pity DX12 isn't supported on Kepler. I tried it anyway and it starts, but the game freezes at a black screen after loading the mission.

DX12 already worked wonders in Hitman 1 for this old Phenom II X4.


    Actually, Hitman 2 DX12 does work on Kepler now. The latest Nvidia driver mentioned a fix for Hitman 2 crashing, so I tried the game again and now it works in DX12.
     
    #1262 Svensk Viking, Mar 27, 2019
    Last edited: May 11, 2019
    pharma and Lightman like this.
  3. Alessio1989

    Regular Newcomer

    Joined:
    Jun 6, 2015
    Messages:
    582
    Likes Received:
    285
    CaptainGinger and Lightman like this.
  4. PizzaKoma

    Newcomer

    Joined:
    Apr 29, 2019
    Messages:
    39
    Likes Received:
    64
  5. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    6,972
    Likes Received:
    3,050
    Location:
    Pennsylvania
    What would we be benchmarking against? Different Nvidia models?
     
  6. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,166
    Likes Received:
    1,836
    Location:
    Finland
    And future Intel models and future AMD models.
     
  7. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    6,972
    Likes Received:
    3,050
    Location:
    Pennsylvania
    Of course. The point was there's no comparison yet, no way to quantify the efficiency of a GPU doing VRS compared to others.
     
  8. Florin

    Florin Merrily dodgy
    Veteran

    Joined:
    Aug 27, 2003
    Messages:
    1,644
    Likes Received:
    214
    Location:
    The colonies
    That’s useful enough by itself, right?
     
  9. troyan

    Newcomer

    Joined:
    Sep 1, 2015
    Messages:
    120
    Likes Received:
    181
Wolfenstein: Youngblood and the Modern Warfare remake will use VRS. UL is just catching up with game developers.
     
    pharma and Lightman like this.
  10. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,827
    Likes Received:
    4,450
    It's about as useful as the PCIe 4.0 test.

    But didn't Intel already show this test in their Ice Lake Gen11 GPU presentation?
     
  11. Lightman

    Veteran Subscriber

    Joined:
    Jun 9, 2008
    Messages:
    1,802
    Likes Received:
    473
    Location:
    Torquay, UK
I know where you're coming from, but these tests show real improvements which, once game engines start tapping into them, will give us more eye candy. In the case of PCIe 4.0 we have been down this road before with PCI, AGP x1/x2/x4/x8, PCIe 1.0, 2.0...
If we were to go back now to AGP x4 with our modern game engines, I think you would spot the difference ;)

For now it is obviously a feature that benefits professionals working with huge data sets, while for gamers it's an unimportant logo on the box. Within the next five years it will benefit games too.
I personally won't upgrade my Threadripper motherboards for PCIe 4.0, as I have enough lanes not to be bothered by it. But when I do decide to upgrade, it will be nice to be able to verify that it works as intended, and tests like this are one way of doing so.

But... PCIe 5.0 might arrive before I get around to it!
     
    pharma and BRiT like this.
  12. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,827
    Likes Received:
    4,450


The thing is, this test doesn't seem to be made to simulate realistic scenarios; it's made to show the tech's potential in an ideal one. We already have real in-game implementations of variable rate shading, and they don't show nearly as much of a performance uplift as this test implies:
    https://techreport.com/review/34269...ading-with-wolfenstein-ii-the-new-colossus/3/
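A back-of-envelope, Amdahl-style estimate shows why a synthetic feature test can overstate real-world VRS gains. The fractions below are invented for illustration (they are not measured from the 3DMark test or from Wolfenstein II): a synthetic test can coarsen nearly the whole screen and be almost entirely pixel-shader bound, while a real game only coarsens part of the frame, and pixel shading is only part of frame time.

```python
def vrs_speedup(coarse_fraction: float, shading_share: float, rate: int = 4) -> float:
    """Estimated whole-frame speedup from variable rate shading.

    coarse_fraction: fraction of pixels shaded at the coarse rate
    shading_share:   fraction of frame time spent in pixel shading
    rate:            shader-invocation reduction (a 2x2 coarse rate -> 4)
    """
    # Pixel-shading work after VRS, relative to full-rate shading
    shading_scale = (1 - coarse_fraction) + coarse_fraction / rate
    # Whole-frame time: non-shading work is untouched by VRS
    frame_scale = (1 - shading_share) + shading_share * shading_scale
    return 1 / frame_scale

# Near-ideal synthetic case: 95% of pixels coarsened, 90% shader-bound
print(round(vrs_speedup(0.95, 0.9), 2))   # ~2.79x
# More game-like case: 40% coarsened, 50% shader-bound
print(round(vrs_speedup(0.40, 0.5), 2))   # ~1.18x
```

Under these assumed numbers the ideal case nearly triples the frame rate while the game-like case gains under 20%, which is roughly the shape of the gap being discussed.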

     
  13. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,151
    Likes Received:
    571
    Location:
    France
Actually, this test let me see that my X99 + Vega combo has terrible PCIe performance, even though it's running at PCIe 3.0 x16 :neutral:
     
  14. Per Lindstrom

    Newcomer Subscriber

    Joined:
    Oct 16, 2018
    Messages:
    19
    Likes Received:
    15
Yes, only 5.2 GB/s here. It's a little better with HBCC enabled, around 7 GB/s then.

    http://www.3dmark.com/pcie/26027
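For context on those numbers, a quick sketch of the theoretical one-direction bandwidth per PCIe generation, using the published per-lane transfer rates and line encodings. These are spec maxima; real achievable throughput is lower due to protocol overhead, which is part of why even a healthy 3.0 x16 link reports well under ~15.75 GB/s in a test like this.

```python
def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Theoretical one-direction payload bandwidth in GB/s for a PCIe link."""
    # (raw GT/s per lane, line-encoding efficiency) per generation
    specs = {
        1: (2.5, 8 / 10),     # 8b/10b encoding
        2: (5.0, 8 / 10),
        3: (8.0, 128 / 130),  # 128b/130b encoding
        4: (16.0, 128 / 130),
        5: (32.0, 128 / 130),
    }
    rate, eff = specs[gen]
    # GT/s * efficiency gives Gbit/s of payload per lane; divide by 8 for GB/s
    return rate * eff / 8 * lanes

print(round(pcie_bandwidth_gbs(3, 16), 2))  # ~15.75 GB/s
print(round(pcie_bandwidth_gbs(4, 16), 2))  # ~31.51 GB/s
```

So 5.2 GB/s is roughly a third of the 3.0 x16 theoretical maximum, which does look low rather than merely overhead-limited.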
     
    Lightman and Rootax like this.
  15. CaptainGinger

    Newcomer

    Joined:
    Feb 28, 2004
    Messages:
    92
    Likes Received:
    47
Does anyone have any insight into why the PCIe speed test requires DX12?
     
  16. fellix

    fellix Hey, You!
    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,486
    Likes Received:
    397
    Location:
    Varna, Bulgaria
    Lightman, BRiT and CaptainGinger like this.
  17. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,151
    Likes Received:
    571
    Location:
    France
    Glad it's not a bug on my setup then. Still strange...
     