No DX12 Software is Suitable for Benchmarking *spawn*

I am saying it is better to use 1440p with uncapped frames, along with showing 4k.
I would say not even the 1080 FE is truly designed, in terms of the HW spec/technology implemented, as a single-card solution for 4k when playing games at enthusiast settings - all of the games they benchmarked show sub-optimal fps at 4k (let alone what a frame analysis would show).
Sure you can make it work, but you really need to see a broad range of resolutions, and ideally 1440p.
I assume they used 4k to overcome forced VSYNC?
Although I thought that was now resolved.
Cheers


Well, 4k is still valid; it's always good to see what happens when different parts of the cards are pushed. Yeah, theoretically we can figure things out, but still, there will be some variance that just can't be noticed when you do the math by hand. Overall, it's best to look at as much data as possible and then figure out what is best on a per-user basis.
 
I am saying it is better to use 1440p with uncapped frames, along with showing 4k.
I would say not even the 1080 FE is truly designed, in terms of the HW spec/technology implemented, as a single-card solution for 4k when playing games at enthusiast settings - all of the games they benchmarked show sub-optimal fps at 4k (let alone what a frame analysis would show).
Sure you can make it work, but you really need to see a broad range of resolutions, and ideally 1440p.
I assume they used 4k to overcome forced VSYNC?
Although I thought that was now resolved.
Cheers

Yes.

They tested it at 1440p earlier, and after the patch/AMD driver it was hitting the vsync limit with the Fury X as well.

http://www.overclock3d.net/reviews/...erformance_retest_-_the_game_has_been_fixed/5
 
Thanks, as we thought.
Fingers crossed they will revisit the game now that the patch to uncap frames has been released - assuming the game itself has also been patched; annoying if it isn't:
http://www.pcper.com/news/Graphics-...pport-unlocked-frame-rates-and-G-SyncFreeSync
Or they buy a decent fast-refresh monitor :)

Cheers
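As a minimal sketch of the forced-vsync point above (hypothetical frame rates, simple clamp model): with vsync on, a displayed frame can never complete faster than one refresh interval, so any card faster than the refresh rate reports roughly the same capped average, which is why the earlier 1440p numbers stopped being useful.

```python
# Rough illustration (hypothetical numbers): how a forced 60 Hz vsync cap
# hides the difference between two cards that are both faster than refresh.
# Simple clamp model: a displayed frame can never take less than one refresh
# interval; strict double-buffered vsync actually quantises to multiples of it.

REFRESH_HZ = 60.0
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms

def capped_fps(uncapped_fps: float) -> float:
    """Average fps you would measure with the simple vsync clamp applied."""
    frame_ms = 1000.0 / uncapped_fps
    return 1000.0 / max(frame_ms, REFRESH_MS)

for card, fps in [("Card A (hypothetical)", 95.0), ("Card B (hypothetical)", 70.0)]:
    print(f"{card}: {fps:.0f} fps uncapped -> {capped_fps(fps):.0f} fps with vsync forced")
# Both cards land on ~60 fps, so the benchmark can no longer separate them -
# hence testing at 4k, or uncapping the frame rate, to get out from under the cap.
```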
 
Well, 4k is still valid; it's always good to see what happens when different parts of the cards are pushed. Yeah, theoretically we can figure things out, but still, there will be some variance that just can't be noticed when you do the math by hand. Overall, it's best to look at as much data as possible and then figure out what is best on a per-user basis.
Yeah, but currently 1440p is more ideal than 4k, as from a spec perspective no current cards really have the HW for it; although I agree that to get a feel you use both.
For Nvidia, IMO they need a Pascal version of the 980 Ti, with its greater ROP/GPC/stream architecture, to be the first of their cards that could be said to be designed for single-card 4k, along probably with HBM2 or 384-bit GDDR5X.
For now though, if you had to choose just one resolution for a benchmark, 1440p with a fast-refresh monitor would make more sense.
Cheers
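For a rough sense of scale behind the 1440p-vs-4k argument, a quick back-of-envelope on raw pixel counts only (per-frame cost does not scale perfectly linearly with pixels, so treat it as a lower bound):

```python
# Back-of-envelope: raw pixels per frame at common resolutions, and the
# pixel throughput needed for 60 fps. Shading/bandwidth cost does not scale
# perfectly linearly with pixel count, so this is only a rough lower bound.

resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4k":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels/1e6:.2f} MPix/frame "
          f"({pixels/base:.2f}x 1080p), "
          f"{pixels*60/1e6:.0f} MPix/s needed for 60 fps")
# 4k is ~2.25x the pixels of 1440p and 4x 1080p, which is why a card that is
# comfortable at 1440p can drop well below 60 fps at 4k with the same settings.
```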
 
Hitman episode 3 comparison of the Fury X, 980 Ti and 1080.


Something interesting from 2:15 onwards till the end. ;)

I think I will wait for publications to bench/analyse it; they have had different figures to RandomDude a few times in the past.
Anyone know what he is using to capture the fps in DX12?
Also 4k.....
Where is the 1440p that is more applicable for these cards?
Chapter 2 was just as frustrating, as that was 1080p on his site :)

The swing of performance between inside and outside for AMD and NVIDIA is interesting; it says more about the game engine in general than about the benchmark.
Sort of similar situation to The Division.
Cheers
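On the fps-capture and frame-analysis point: per-frame present times in DX12 titles are commonly captured with an ETW-based tool such as PresentMon (what RandomDude actually used is not stated). A minimal sketch of why frame-time percentiles matter more than a plain average; the frame times and the summarize helper below are illustrative only:

```python
# A minimal sketch of the kind of "frame analysis" mentioned above: averages
# hide stutter, so look at percentiles / 1% lows of per-frame times as well.
# Per-frame times are assumed to come from a capture tool such as PresentMon;
# this just takes a plain list of frame times in milliseconds.

from statistics import mean

def summarize(frame_times_ms: list[float]) -> dict[str, float]:
    times = sorted(frame_times_ms)
    n = len(times)
    avg_fps = 1000.0 / mean(times)
    p99_ms = times[min(n - 1, int(0.99 * n))]          # 99th percentile frame time
    worst_1pct = times[int(0.99 * n):] or [times[-1]]  # slowest 1% of frames
    one_pct_low_fps = 1000.0 / mean(worst_1pct)
    return {"avg_fps": avg_fps, "p99_ms": p99_ms, "1%_low_fps": one_pct_low_fps}

# Hypothetical run: mostly ~20 ms frames (50 fps) with occasional 45 ms spikes.
sample = [20.0] * 97 + [45.0] * 3
print(summarize(sample))
# A decent average fps can coexist with ugly 1% lows, which is something a
# plain average-fps bar chart (capped or not) will never show.
```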
 
You're saying 40-55FPS isn't enough for a 3rd-person stealth game?
Context: we are talking about a benchmark.
Also, technically the hardware spec is not really designed/focused for 4k; the context being ROPs/GPC/SM/memory bandwidth.
But then, for practical reasons, the answer is still no when more 1440p monitors are sold than 4k ones, and critically people want consistent, smooth, low-input-lag gameplay - it goes beyond just this game.
Maybe that will change with a 1080ti/Titan or big Vega.

If you are happy playing this game at an average of 42fps then that is great, but many of us would not be.
Cheers
 
Also, technically the hardware spec is not really designed/focused for 4k; the context being ROPs/GPC/SM/memory bandwidth.


What? The Fury X, 980 Ti and GTX 1080 were definitely championed as "GPUs for 4K" during each one's release.
 
If you believe that, great.

Sure, yeah, they are perfect for playing most modern games at close to top settings at 4k and 60fps with no frame latency issues, stuttering, frame swings, or input lag.
Marketing is great :)

Anyway, the GTX 1080 is the closest, but in general it is still not there, as its ROP/GPC/stream architecture is more similar to a 980's than a 980 Ti's.
The first true 4k "champion" GPUs in reality will be the 1080 Ti/Titan and big Vega.

Cheers
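To put rough numbers on the "more similar to a 980 than a 980 Ti" point, a back-of-envelope using approximate published reference specs (ROPs, GPCs, bus width, boost clock, bandwidth); theoretical fill rate here is just ROPs x clock, so read it as a layout comparison rather than a performance prediction:

```python
# Back-of-envelope comparison using approximate published reference specs
# (real cards boost differently). Theoretical pixel fill is just ROPs * clock
# and ignores what actually limits real games.

cards = {
    #              ROPs  GPCs  bus(bit)  boost(MHz)  bandwidth(GB/s)
    "GTX 980":    (64,   4,    256,      1216,       224),
    "GTX 980 Ti": (96,   6,    384,      1075,       336),
    "GTX 1080":   (64,   4,    256,      1733,       320),
}

for name, (rops, gpcs, bus, clk, bw) in cards.items():
    fill_gpix = rops * clk / 1000.0  # theoretical GPixels/s
    print(f"{name}: {rops} ROPs / {gpcs} GPCs / {bus}-bit, "
          f"~{fill_gpix:.0f} GPix/s fill, {bw} GB/s")
# The 1080 shares the 980's 64-ROP / 4-GPC / 256-bit layout and leans on clock
# speed and GDDR5X for its gains, whereas the 980 Ti's wider 96-ROP / 384-bit
# back end is the sort of configuration the "Pascal 980 Ti" argument is about.
```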
 
...

If you are happy playing this game at an average of 42fps then that is great, but many of us would not be.
Cheers

I think you need to invest in a G-Sync/FreeSync system to see that 40FPS+ is perfectly fine for any TPP or slow-paced FPP game out there. I play Witcher 3 maxed with no HairWorks at 1440p on my 290X and it's perfect since I bought a FreeSync monitor (35FPS-60FPS in game). This was my way of skipping the Fury X as an upgrade not worth spending money on, and I cannot be happier with my decision!
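A small sketch related to the FreeSync point above: adaptive sync only smooths frames that land inside the monitor's variable-refresh window, so a frame-time log can be checked for how much of a run actually stays in range (the window bounds and frame times below are hypothetical):

```python
# Minimal sketch: adaptive sync only smooths frames that land inside the
# monitor's VRR window, so it is worth checking how much of a run stays there.
# Window bounds and frame times below are hypothetical.

VRR_MIN_HZ, VRR_MAX_HZ = 35.0, 90.0  # hypothetical FreeSync range

def fraction_in_window(frame_times_ms: list[float]) -> float:
    in_range = [t for t in frame_times_ms
                if VRR_MIN_HZ <= 1000.0 / t <= VRR_MAX_HZ]
    return len(in_range) / len(frame_times_ms)

# Hypothetical log hovering around 45-55 fps with a few ~30 fps dips.
log = [22.0] * 80 + [18.5] * 15 + [33.0] * 5   # ms per frame
print(f"{fraction_in_window(log):.0%} of frames inside the VRR window")
# Frames that drop below the window's minimum refresh fall back to vsync or
# LFC behaviour, which is where the "40 fps feels fine" experience breaks down.
```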
 
Sure, yeah, they are perfect for playing most modern games at close to top settings at 4k and 60fps with no frame latency issues, stuttering, frame swings, or input lag.

In the age of Adaptive Sync, 60 FPS is overrated.
 
Oh, yes, notably lower-quality textures on the machine on GeForces for whatever reason
Sort of reminds me of when Gregster tested BF4 and the textures were lower on the Nvidia comparison; it turned out to be a mistake at his end, although that did not stop it spreading everywhere that Nvidia was lowering visual quality for performance gains, until he could eventually clear the air.
Cheers
 