No DX12 Software is Suitable for Benchmarking *spawn*

Discussion in 'Architecture and Products' started by trinibwoy, Jun 3, 2016.

  1. CSI PC

    Veteran

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    Sigh, it says a lot when a debate comes down to a 'strawman' accusation.
    You do know that Eurogamer, in comparing the PS4 to the PS4 Pro, actually said the games they tested are more playable at the higher framerate than at 30fps, and worth it in quite a few of them?
    Again, with 4K monitors you say detail over performance, but then those buyers should not be purchasing 60Hz monitors.
    If you say they should lower settings to get close to 60fps, then that just reinforces my point about the 33fps-38fps test results being used to show how well the Fury X performs (we do not know whether any of the post-processing effects reduce performance on particular cards, so this could change at lower settings).

    By my logic, I am just pointing out that it is meaningless to draw conclusions from a 4K result in a game at a setting that will never be used; as you said, you would cap to 30Hz, and then the Fury X/1070/1080 are all at parity.
    Nice of you to throw in the strawman accusation (usually I see this thrown around in tech and engineering audio forums, where it comes from entrenched positions) rather than discussing this in a better way.
    In the same way, I do not bother arguing about Fury X and 1070 performance at 1080p, as neither is really designed for that, even though that scenario is more likely than your 4K point, since some want to get as close to 144Hz as possible in their games.
     
    #861 CSI PC, Feb 17, 2017
    Last edited: Feb 17, 2017
  2. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    @CSI PC, this is not about that, it's about celebrating the Fury X's 3fps lead over the 1070 and the earth-shattering difference it would make. At any rate, we have had several discussions about this phenomenon before: AMD cards being affected less at maximum resolution but falling on their faces at lower resolutions due to some limitation of their architecture. 3dilettante explained it very well, since it dates back to the HD 4870 era, though I can't seem to locate that post right now.

    Though I do disagree with you on this statement:
    As modern games push the visual front harder (Watch_Dogs 2, Dishonored 2, Ghost Recon Wildlands, etc.), these cards have become unable to run them at max settings at 1440p, so they are demoted to 1080p instead.
     
    #862 DavidGraham, Feb 17, 2017
    Last edited: Feb 17, 2017
    CSI PC and pharma like this.
  3. It says a lot more when you blatantly use one:

    Anyone who reads the sentence above will understand it for what it is: a strawman.



    No, not all games. It depends on which ones have frame pacing working correctly. E.g. Final Fantasy XV has this problem: the Xbox One version generally runs at a lower framerate than the PS4/Pro version but plays better because its frame pacing works correctly.

    Digital Foundry knows quite a bit better than to reduce their argument to "moar framez = moar better".
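    As an aside, the "frame pacing working correctly" point can be illustrated numerically: evenly spaced ~33.3ms frames feel smoother than frames that hit the same average rate by alternating short and long deltas. A minimal sketch with hypothetical frame-time values (not measured data):

    ```python
    # Frame pacing sketch: two hypothetical capture runs with the same mean
    # frame rate can feel very different depending on delta consistency.

    def pacing_stats(frame_times_ms):
        """Return (average fps, max deviation from the mean frame time in ms)."""
        mean = sum(frame_times_ms) / len(frame_times_ms)
        fps = 1000.0 / mean
        jitter = max(abs(t - mean) for t in frame_times_ms)
        return fps, jitter

    # Well-paced 30fps: every frame takes one 33.3ms interval.
    paced = [33.3] * 8
    # Badly paced "30fps": frames alternate 16.7ms / 50.0ms, same average.
    unpaced = [16.7, 50.0] * 4

    print(pacing_stats(paced))    # ~30 fps, ~0 ms jitter
    print(pacing_stats(unpaced))  # ~30 fps, ~16.65 ms jitter
    ```

    Same average framerate in both runs, but the second delivers frames at wildly uneven intervals, which is the judder frame-pacing fixes are meant to remove.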



    Why don't you list all these 4K PC monitors in the market that only work at 30Hz?



    And I'm pointing out that you're awfully wrong about this setting never being used. As someone who has played other games in the series, I can say there's little reason a person would be bothered by playing single-player Sniper Elite 4 at 30 FPS, or in the low 30s using some smart vsync mode.
     
  4. CSI PC

    Veteran

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    And so do you at times, but I do not shout "Strawman!".
    I work with a team where everyone has at least a PhD in maths, physics, engineering, or computing science, and do you know how many times they have said "Strawman, I'm not arguing the point anymore!" in the years I have worked with them?
    Never, which is why I do not bother calling others out with that accusation when I see illogical or strawman arguments.
    BTW, you were the one making the case for locking to 30fps, while ignoring the technical reasons why 60Hz/60fps is usually better (and hence mainstream on PC, and why many prefer 144Hz even over 60Hz), or the logical fact that at 30Hz there is no difference between a Fury X and a 1070 or 1080 in Sniper Elite 4.
    But we are digressing.
     
    #864 CSI PC, Feb 17, 2017
    Last edited: Feb 17, 2017
  5. OMG you and your friends are so very smart!
     
    Ike Turner likes this.
  6. CSI PC

    Veteran

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    And again, as I keep pointing out, logically there is then no difference between the Fury X/1070/1080.
    So your original post has no merit.
    Now you're moving the goalposts to using either FreeSync or G-Sync with a variable low-to-mid-30s framerate, because they are the only solutions fitting your 'smart' mode for smoothness and latency...
     
    #866 CSI PC, Feb 17, 2017
    Last edited: Feb 17, 2017
  7. pMax

    Regular

    Joined:
    May 14, 2013
    Messages:
    327
    Likes Received:
    22
    Location:
    out of the games
    [IMG]

    ...I apologize, but I had to do it ;-)
     
  8. CSI PC

    Veteran

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    Yeah, I agree with that, and it is one reason I find it frustrating that monitor manufacturers push ever higher refresh rates rather than upping build quality and component selection and putting out, say, 90Hz to 120Hz monitors.
    It is a nightmare these days to reach 144fps at near-max settings; even the custom EVGA 1080 cannot, managing 92-109fps at 1080p in Sniper Elite 4.
    It is a dilemma whether to go 60Hz 4K, or as close to 144fps as one can get at 1440p with its latency advantages.
    Both need settings lowered a fair chunk in quite a few modern games, and as you say, and as shown with Sniper Elite 4, the reality is that the EVGA 1080 is not enough even at 1080p if pushing for 144fps or anywhere close to it.
    But to be fair, it was never expected these cards would be needed for just 1080p at 60fps to 85fps; the Fury X also used to be at a disadvantage at this resolution, and it was accepted back then that its scope was higher resolutions.
    Cheers
     
    #868 CSI PC, Feb 17, 2017
    Last edited: Feb 17, 2017
  9. CSI PC

    Veteran

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    I never realised I said that or was making that point :)

    In my context, maybe you should ask Rebellion why they developed Sniper Elite 3 and Sniper Elite 4 aiming for 60fps where possible on consoles, more so on PS4 (Sniper Elite 3) and then PS4 Pro, rather than just locking them to 30fps as they had to for Xbox/Xbox One.
    But then it is about context, what one uses the information for, and its scope and limitations, not about renaming the thread to say 4K is no good for DX12 software and suitable benchmarking.
    Case in point: it would be interesting to know the DX12 performance with the additional post-processing/bolt-on effects turned down, as they are not usually optimised to the same extent as the game/rendering engine.
    But hey, if people want to go all nuts about how the Fury X is beating a 1070 at 4K, at around 33-38.5fps, while ignoring that the 980 Ti is also beating the 1070 with very similar performance to the Fury X, then go for it. Again, this is using PCGamesHardware, which IMO has some of the best game testing and benchmarking, and uses custom AIB cards.

    Cheers
     
    #869 CSI PC, Feb 17, 2017
    Last edited: Feb 17, 2017
  10. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,423
    Likes Received:
    10,316
    And here I thought the technological part of B3D was more about looking into the technical or architectural points of any given piece of hardware, including strengths and weaknesses. Whether it is playable in a game or not is certainly of interest, but it isn't the be-all end-all.

    The fact that the Fury X, a 4 GB card, loses less performance when more stress is put on the card should be of interest to everyone in this part of the forum.

    Does this mean the Fury X is a better card than the 1070? Who the hell cares? (It's not).

    Instead of looking into why it loses less performance at higher GPU loads than the 1070 in this particular instance, people are just dismissing it out of hand, on a forum where all data points should be evaluated to determine the architectural and technical capabilities of each piece of hardware.

    I had hoped that the NeoGAF style of posting (my thing is better than your thing, or that data is meaningless because I don't like it) wasn't infecting this forum, but I guess I must be wrong.

    Regards,
    SB
     
  11. CSI PC

    Veteran

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    To me, your original post and Ike's were not about the technical aspect or the results' validity, scope, and limitations; otherwise you would have commented on how the 1070 is weaker than both the 980 Ti and the Fury X and mulled over why that was.
    Instead we had a post about 4GB HBM with a smiley, and Ike saying we should include 4K and that a two-year-old Fury X leapfrogs the 1070.
    That is how it comes across to me, anyway. While interesting from a bandwidth/design-limitation standpoint, it is worth noting the 1070 was never designed for 4K (if I remember correctly the old Nvidia slide positioning its GPUs against gaming resolutions), nor the fact that for 4K to be viable, both the Fury X and 1070 would need to be capped at 30fps, resulting in identical performance.

    Cheers
     
    #871 CSI PC, Feb 17, 2017
    Last edited: Feb 17, 2017
    DavidGraham and pharma like this.
  12. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Funny thing: in a thread about DX12 vs DX11 performance, people resort to selectively concentrating on a specific card at a specific resolution in a specific game, then complain about other people resorting to NeoGAF's "mine is bigger than yours" arguments, or complain about the dismissal of a specific nuance (in this instance, the card's 4K performance) while at the same time dismissing its consistently horrendous VR performance as mere "deficient developer support or faulty testing". I guess the cloak of double standards is indeed invisible to its wearer.

    This thread has had its fair share of derailment at this point; I suggest discussing Fury's 4K performance in a separate thread.
     
    #872 DavidGraham, Feb 17, 2017
    Last edited: Feb 17, 2017
    CSI PC, lanek and pharma like this.
  13. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    I don't know exactly why this discussion has taken this direction... it must be the end of the week.
     
    pharma and DavidGraham like this.
  14. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Malo, Lightman, pharma and 1 other person like this.
  15. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Just tried an obscure DX12 title called The Turing Test. Apparently it was released in Dec 2016 and supports both DX11 and DX12 on UE4. I tried running the game with both APIs on my 1070, but I can't get the damn game to disable V-Sync at all, so I am locked to 60fps for now. One observation: at 1440p, DX12 runs at 60% GPU usage to achieve 60fps, while DX11 jumps to 70% GPU usage to achieve the same rate.
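    For what it's worth, a rough back-of-envelope reading of those utilization numbers, under the big (and often wrong) assumption that GPU usage scales roughly linearly with frame rate below the vsync cap:

    ```python
    # If 60fps costs 60% of the GPU under DX12 and 70% under DX11,
    # a naive linear extrapolation suggests the uncapped ceilings:
    dx12_ceiling = 60 / 0.60  # = 100.0 fps
    dx11_ceiling = 60 / 0.70  # ≈ 85.7 fps
    print(dx12_ceiling, round(dx11_ceiling, 1))
    ```

    Utilization rarely scales that cleanly (CPU limits, boost clocks, and bottleneck shifts all interfere), so treat this only as a hint that an uncapped benchmark would be worth running.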
     
  16. gamervivek

    Regular

    Joined:
    Sep 13, 2008
    Messages:
    805
    Likes Received:
    320
    Location:
    india
    I am pretty sure it was the other way round with the 48xx series, especially at the 4890's release, when it came up against the GTX 275 and the common talking point was how the Nvidia card did better at higher resolutions.
     
  17. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,889
    Likes Received:
    4,536
    This might work ... the developer specified the following to disable vsync:

    https://steamcommunity.com/app/499520/discussions/1/343785380901874446/#c343785380902335779
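    For UE4 titles in general there is also the engine-level route via the `r.VSync` console variable; this is a generic UE4 technique and may or may not be what the linked post describes:

    ```ini
    ; Generic UE4 approach (assumption -- not taken from the linked post).
    ; Typically added to the packaged game's Engine.ini under
    ; Saved/Config/WindowsNoEditor/ in the game's config directory.
    [SystemSettings]
    r.VSync=0
    ```

    Some titles override this at runtime, in which case the setting has no effect.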
     
    BRiT likes this.
  18. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Tried that, but failed. The game is still locked.
     
    BRiT and pharma like this.
  19. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
  20. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964