No DX12 Software is Suitable for Benchmarking *spawn*

Almost everyone with a PS4 Pro and a 4K TV plays their games in 4K mode, and they make up the majority of people with a 4K 60Hz display.
Besides, people with a 4K monitor are interested in the higher detail. If their concern were response times and framerates, they'd go with a 1080p 120Hz panel, which is a lot cheaper.

So by your logic, a lower framerate that's good enough for one game must always be good enough for all other games.
Strawman right back at you.
Sigh, it says a lot when a debate comes down to a 'strawman' accusation.
You do know that Eurogamer, in comparing the PS4 to the PS4 Pro, actually say the games they tested are more playable at the higher fps than at 30fps, and that it is worth it in quite a few of them?
Again, on 4K monitors: if you say detail matters over performance, then they should not be purchasing 60Hz monitors...
If you say they should lower settings to get close to 60fps, then that just reinforces my point about the 33-38fps test results being used as evidence of how well the Fury X performs (we do not know whether any of the post-processing effects disproportionately reduce performance on certain cards, so this could change with lower settings).

By my logic, I am just pointing out that it is meaningless to conclude much from a 4K result in a game when it is at a setting that will never be used; as you said, you would cap at 30Hz, and then the Fury X/1070/1080 are all at parity.
Nice of you to throw in a strawman accusation (usually I see this thrown around in tech-engineering audio forums, where it comes from entrenched positions) rather than discussing this in a better way.
In the same way, I do not bother arguing about Fury X and 1070 performance at 1080p, as they are not really designed for that, even though that scenario is more likely than your 4K point, since some want to get as close to 144Hz as possible in their games.
 
@CSI PC, this is not about that, it's about celebrating the Fury X's 3 fps lead over the 1070 and the earth-shattering difference it would make. At any rate, we have had several discussions about this phenomenon before: AMD cards being affected less at maximum resolution but falling flat on their faces at lower resolutions due to some limitations of their architecture. 3dilettante explained it very well, since it dates back to the HD 4870 era, though I can't seem to locate that post right now.

Though I seem to disagree with you in that statement:
Fury X and 1070 performance at 1080p, as they are not really designed for that
As modern games push the visual front harder (like Watch_Dogs 2, Dishonored 2, Ghost Recon Wildlands, etc.), these cards have become unable to run said games at max settings and 1440p, so they are demoted to 1080p instead.
 
Sigh, it says a lot when a debate comes down to a 'strawman' accusation.
It says a lot more when you blatantly use one:

By your logic, it should be fine then that games are never optimised for PC and 30fps lock is great as it is just as playable as 60/144Hz for all games.

Anyone who reads the sentence above will understand it for what it is: a strawman.



You do know that Eurogamer, in comparing the PS4 to the PS4 Pro, actually say the games they tested are more playable at the higher fps than at 30fps?
No, not all games. It depends on which ones have frame pacing working correctly. E.g. Final Fantasy XV has this problem: the Xbox One version has a generally lower framerate than the PS4/Pro but plays better because its frame pacing works correctly.

Digital Foundry knows better than to reduce their argument to "moar framez = moar better".



Again, on 4K monitors: if you say detail matters over performance, then they should not be purchasing 60Hz monitors...
Why don't you list all these 4K PC monitors on the market that only work at 30Hz?



I am just pointing out that it is meaningless to conclude much from a 4K result in a game when it is at a setting that will never be used
And I'm pointing out that you're awfully wrong about this setting never being used. As someone who's played other games in the series, I can say there's little reason a person would be particularly bothered by playing single-player Sniper Elite 4 at 30 FPS, or in the low 30s using some smart vsync mode.
 
It says a lot more when you blatantly use one:
And so do you at times, but I do not go "Strawman!!!".
I work with a team where everyone has at least a PhD in maths, physics, engineering or computing science, and do you know how many times they have gone "Strawman, not arguing the point anymore!" in the years I have worked with them?
Never, and that is why I do not bother calling out others with that accusation when I see illogical or strawman arguments.
BTW, you were the one making the case for locking to 30fps while choosing to ignore the technical reasons why 60Hz/60fps is usually better and hence mainstream for PC (and why many prefer 144Hz even over 60Hz), as well as the logical fact that at a 30Hz cap there is no difference between a Fury X, 1070 or 1080 in Sniper Elite 4.
But we are digressing.
 
And I'm pointing out that you're awfully wrong about this setting never being used. As someone who's played other games in the series, I can say there's little reason a person would be particularly bothered by playing single-player Sniper Elite 4 at 30 FPS, or in the low 30s using some smart vsync mode.
And again, as I keep pointing out, logically there is then no difference between the Fury X/1070/1080.
So your original post has no merit.
Now you're moving the goalposts to using either FreeSync or G-Sync with a variable low-to-mid-30s framerate, because they are the only solutions fitting your 'smart' mode for smoothness and latency...
 
And so do you at times, but I do not go "Strawman!!!".
I work with a team where everyone has at least a PhD in maths, physics, engineering or computing science, and do you know how many times they have gone "Strawman, not arguing the point anymore!" in the years I have worked with them?
Never,

[image attachment]


...I apologize, but I had to do it ;-)
 
@CSI PC, this is not about that, it's about celebrating the Fury X's 3 fps lead over the 1070 and the earth-shattering difference it would make. At any rate, we have had several discussions about this phenomenon before: AMD cards being affected less at maximum resolution but falling flat on their faces at lower resolutions due to some limitations of their architecture. 3dilettante explained it very well, since it dates back to the HD 4870 era, though I can't seem to locate that post right now.

Though I seem to disagree with you in that statement:

As modern games push the visual front harder (like Watch_Dogs 2, Dishonored 2, Ghost Recon Wildlands, etc.), these cards have become unable to run said games at max settings and 1440p, so they are demoted to 1080p instead.
Yeah, I agree with that, and it is one reason I find it frustrating that monitor manufacturers push ever higher refresh rates rather than improving build quality and component selection and putting out, say, 90Hz to 120Hz monitors.
It is a nightmare these days to reach 144fps with near-max settings; even the custom EVGA 1080 cannot, managing 92-109fps at 1080p in Sniper Elite 4.
It is a dilemma whether to go 60Hz 4K, or as close to 144fps as one can get at 1440p with the latency advantages that brings.
Both need settings lowered a fair chunk in quite a few modern games, and as you say, and as shown with Sniper Elite 4, the reality is the EVGA 1080 is not enough even at 1080p if you are pushing for 144fps or anywhere close to it.
But it is fair to say that originally it was never expected these cards would be needed for just 1080p at 60-85fps; also, the Fury X used to be at a disadvantage at this resolution, and it was accepted back then that its scope was higher resolutions.
Cheers
 
Thread title should be slightly renamed to "No DX12 Software is Suitable for Benchmarking at 4K..because...feelings ¯\_(ツ)_/¯" I guess?
I never realised I said that or was making that point :)

In my context, maybe you should ask Rebellion why they developed Sniper Elite 3 and Sniper Elite 4 aiming for 60fps where possible on consoles, more so on the PS4 (Sniper Elite 3) and then the PS4 Pro, rather than just locking them to 30fps as they had to do for the Xbox/Xbox One.
But then it is about context, what one uses the information for, and its scope and limitations, not about renaming the thread to say 4K is no good for DX12 software and suitable benchmarking.
Case in point: it would be interesting to know DX12 performance with the additional post-processing/bolt-on effects turned down, as they are not usually optimised to the same extent as the game/rendering engine.
But hey, if people want to go nuts over how the Fury X beats a 1070 at 4K with performance around 33-38.5fps, while ignoring that the 980 Ti also beats the 1070 with performance very similar to the Fury X, then go for it - again using PCGamesHardware, which IMO has some of the best game testing and benchmarking and also uses custom AIB cards.

Cheers
 
I never realised I said that or was making that point :)

In my context, maybe you should ask Rebellion why they developed Sniper Elite 3 and Sniper Elite 4 aiming for 60fps where possible on consoles, more so on the PS4 (Sniper Elite 3) and then the PS4 Pro, rather than just locking them to 30fps as they had to do for the Xbox/Xbox One.
But then it is about context, what one uses the information for, and its scope and limitations, not about renaming the thread to say 4K is no good for DX12 software and suitable benchmarking.
Case in point: it would be interesting to know DX12 performance with the additional post-processing/bolt-on effects turned down, as they are not usually optimised to the same extent as the game/rendering engine.
But hey, if people want to go nuts over how the Fury X beats a 1070 at 4K with performance around 33-38.5fps, while ignoring that the 980 Ti also beats the 1070 with performance very similar to the Fury X, then go for it - again using PCGamesHardware, which IMO has some of the best game testing and benchmarking and also uses custom AIB cards.

Cheers

And here I thought the technological part of B3D was more about looking into the technical or architectural points of any given piece of hardware, including strengths and weaknesses. Whether it is playable in a game or not is certainly of interest, but it isn't the be-all end-all.

The fact that the Fury X, a 4 GB card, loses less performance when more stress is put on the card should be of interest to everyone in this part of the forum.

Does this mean the Fury X is a better card than the 1070? Who the hell cares? (It's not).

Instead of looking into why it loses less performance at higher GPU loads than the 1070 in this particular instance, people are just dismissing it out of hand - on a forum where all data points should be evaluated to determine the architectural and technical capabilities of each piece of hardware.

I had hoped that the NeoGAF style of posting (my thing is better than your thing, or that data is meaningless because I don't like it) wasn't infecting this forum, but I guess I must be wrong.

Regards,
SB
 
And here I thought the technological part of B3D was more about looking into the technical or architectural points of any given piece of hardware, including strengths and weaknesses. Whether it is playable in a game or not is certainly of interest, but it isn't the be-all end-all.

The fact that the Fury X, a 4 GB card, loses less performance when more stress is put on the card should be of interest to everyone in this part of the forum.

Does this mean the Fury X is a better card than the 1070? Who the hell cares? (It's not).

Instead of looking into why it loses less performance at higher GPU loads than the 1070 in this particular instance, people are just dismissing it out of hand - on a forum where all data points should be evaluated to determine the architectural and technical capabilities of each piece of hardware.

I had hoped that the NeoGAF style of posting (my thing is better than your thing, or that data is meaningless because I don't like it) wasn't infecting this forum, but I guess I must be wrong.

Regards,
SB

To me, your original post and Ike's were not about the technical aspect, nor the results' viability and scope limitations; otherwise you would have commented on how the 1070 is weaker than both the 980 Ti and the Fury X and mulled over why that was.
Instead we had a post about 4GB HBM with a smiley, and Ike saying we should include 4K and that a two-year-old Fury X leapfrogs the 1070.
That is how it comes across to me anyway, and while it is interesting from a bandwidth/design-limitation perspective, it is worth noting the 1070 was never designed for 4K (if I remember correctly the Nvidia slide from ages ago positioning the GPUs against gaming resolutions), nor the fact that for 4K to be viable on both the Fury X and the 1070 they would both need to be capped at 30fps, resulting in the same performance.

Cheers
 
Funny thing: in a thread about DX12 vs DX11 performance, people resort to selectively concentrating on a specific card at a specific resolution in a specific game, then complain about other people resorting to NeoGAF-style "mine is bigger than yours" arguments, or complain about others dismissing a specific nuance (in this instance, the card's 4K performance) when at the same time they dismissed its consistently horrendous VR performance as mere "deficient developer support or faulty testing". I guess the cloak of double standards is indeed invisible to its wearer.

This thread has had its fair share of derailment at this point; I suggest discussing the Fury's 4K performance in a separate thread.
 
Just tried an obscure DX12 title called The Turing Test; apparently it was released in Dec 2016 and supports both DX11 and DX12 on UE4. I tried running the game on both APIs on my 1070, but I can't get the damn game to disable V-Sync at all, so I am locked to 60fps for now. One observation: at 1440p, DX12 runs at 60% GPU usage to achieve 60fps, whereas with DX11 GPU usage jumps to 70% to achieve the same rate.
 
@CSI PC, this is not about that, it's about celebrating the Fury X's 3 fps lead over the 1070 and the earth-shattering difference it would make. At any rate, we have had several discussions about this phenomenon before: AMD cards being affected less at maximum resolution but falling flat on their faces at lower resolutions due to some limitations of their architecture. 3dilettante explained it very well, since it dates back to the HD 4870 era, though I can't seem to locate that post right now.

Though I seem to disagree with you in that statement:

As modern games push the visual front harder (like Watch_Dogs 2, Dishonored 2, Ghost Recon Wildlands, etc.), these cards have become unable to run said games at max settings and 1440p, so they are demoted to 1080p instead.

I am pretty sure it was the other way round with the 48xx series, especially with the 4890's release when it came up against the GTX 275, and the common talking point was how the Nvidia card did better at higher resolutions.
 
Just tried an obscure DX12 title called The Turing Test; apparently it was released in Dec 2016 and supports both DX11 and DX12 on UE4. I tried running the game on both APIs on my 1070, but I can't get the damn game to disable V-Sync at all,
This might work ... the developer specified the following to disable vsync:

If anyone would like to push their max fps beyond the cap of 120, follow these steps. However, be warned that this is experimental and can produce unstable results:

  • Navigate to \Steam\steamapps\common\The Turing Test\TheTuringTest\Config\DefaultEngine.ini
  • Update the [/script/engine.engine] section with the following:

    bSmoothFrameRate=false
    MinSmoothedFrameRate=5
    MaxSmoothedFrameRate=240

  • Disable VSYNC in games
  • Open the console using ` and use the command t.maxfps to set the max framerate.
https://steamcommunity.com/app/499520/discussions/1/343785380901874446/#c343785380902335779
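
For anyone trying this, here is a minimal sketch of what the edited section of DefaultEngine.ini could look like afterwards; only the three smoothing keys come from the developer's instructions above, any other keys already present in that section should be left as they are, and the 200fps value in the console command below is just an illustrative pick:

    [/script/engine.engine]
    ; frame-rate smoothing disabled and its clamp widened, per the developer's post
    bSmoothFrameRate=false
    MinSmoothedFrameRate=5
    MaxSmoothedFrameRate=240

Then, with vsync turned off in the game's settings, opening the console with ` and entering e.g. t.maxfps 200 should set the framerate cap.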
 