No DX12 Software is Suitable for Benchmarking *spawn*

I think you need to invest in a G-Sync/FreeSync system to see that 40+ FPS is perfectly fine for any TPP or slow-paced FPP game out there. I play Witcher 3 maxed with no HairWorks at 1440p on my 290X and it has been perfect since I bought a FreeSync monitor (35-60 FPS in game). This was my way of skipping the Fury X as an upgrade not worth spending money on, and I could not be happier with my decision!
True,
but how many out there actually invest in a solution that locks them into either AMD or Nvidia?
And try comparing G-Sync or FreeSync running at 30-40 fps to the same tech running at 80-90 fps on a 120-144 Hz monitor.

Most going with either VRR solution will use it in conjunction with a 144 Hz monitor, or close to that; although I appreciate it would also be useful with 4K at 60 Hz, the response will still not be equal.
Cheers
 
This is amusing.

1440p must be used because more people have it than 4K monitors. There are even more people with 1080p monitors than 1440p monitors, so that should be more important, right?

But what if games are capped at 60 FPS on those monitors? That doesn't tell us anything.

Well, high refresh rate monitors exist! But what if the number of 1440p monitors with high refresh rates equals the number of 4K monitors in actual gaming use? And how about the 21:9 monitor owners? Shouldn't those be represented? And how about 1600p monitors?

Arbitrarily picking a resolution just because it makes the hardware you like look better doesn't serve a purpose.

4K is relevant. 1440p is relevant. 1600p is relevant. 1080p is relevant. The 21:9 variations are also relevant. Hell, a 21:9 1440p monitor pushes a pixel count well beyond a 16:9 1440p display, on the way toward a 4K display, and those are quite popular with enthusiast gamers.
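For reference, the raw pixel counts are easy to tabulate (a quick Python sketch; performance doesn't scale perfectly linearly with pixel count, but it is a reasonable first-order guide):

Code:
# Pixel counts, in megapixels, for the resolutions under discussion.
resolutions = {
    "1080p (16:9)":  (1920, 1080),
    "1440p (16:9)":  (2560, 1440),
    "1440p (21:9)":  (3440, 1440),
    "1600p (16:10)": (2560, 1600),
    "4K (16:9)":     (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} MP")
# 1080p: 2.07, 16:9 1440p: 3.69, 21:9 1440p: 4.95, 1600p: 4.10, 4K: 8.29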

For example, I wish more reviewers would test various resolutions with a 60 FPS cap. Why? Because I'd like to see power consumption numbers of a realistic gaming scenario for the vast majority of gamers.

1080p@60Hz, 1440p@60Hz, and 4K@60Hz are far more representative of what most gamers will be running their games at. Hence power consumption numbers at greater than 60 FPS are interesting and relevant to a small minority of gamers, but not terribly relevant to most gamers, IMO.
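A trivial sleep-based limiter is all the tooling such a test needs; a rough Python sketch (render_frame here is a hypothetical stand-in for the actual per-frame GPU work, not any reviewer's real harness):

Code:
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def run_capped(render_frame, duration_s=60.0):
    """Render for duration_s seconds, sleeping off any leftover budget
    so the GPU never runs ahead of TARGET_FPS."""
    start = time.perf_counter()
    frames = 0
    while time.perf_counter() - start < duration_s:
        frame_start = time.perf_counter()
        render_frame()  # the actual GPU work goes here
        spare = FRAME_BUDGET - (time.perf_counter() - frame_start)
        if spare > 0:
            time.sleep(spare)  # idle (low power) instead of rendering ahead
        frames += 1
    return frames / (time.perf_counter() - start)  # effective FPS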

Regards,
SB
 
This is amusing.

1440p must be used because more people have it than 4K monitors. There are even more people with 1080p monitors than 1440p monitors, so that should be more important, right?

But what if games are capped at 60 FPS on those monitors? That doesn't tell us anything.
SB
Then a reviewing publication should also have a decent 1440p 120-144 Hz refresh monitor...
How many buy a 980/980 Ti/390X/Fury X to play on a 1080p monitor at 60 Hz?
Either way, you're going to need a high refresh rate product to overcome V-Sync locked at 60 fps.
However, it is worth noting that capped fps should become less frequent in UWP games, since Microsoft seems to have accepted the complaints from gamers.
Yeah, I appreciate it also means previously released UWP games will need patching.

Anyway, if you only had a choice of one resolution, would it be 1080p, 1440p, or 4K?
That was the original context, due to some keen amateur benchmarkers only doing one resolution for a game or chapter.
Ideally you would have 1440p plus both, or at least one, of the other resolutions.
Cheers
 
How many buy a 980/980 Ti/390X/Fury X to play on a 1080p monitor at 60 Hz?

Looking at the Steam Hardware Survey, where the share of people using monitors above 1080p is below 4%, I'd say many, if not most, who buy high-end cards play on a 1080p monitor / TV with prospects of longevity and of maxing out anti-aliasing or DSR/VSR.

Eventually I'm getting a 21:9 1440p FreeSync monitor, but I've had two R9 290X cards in order to ensure 60 FPS, V-Synced and VSR'd to 2560x1600, on a 1920x1200 24" monitor. And I only want 60 FPS because with V-Sync the next option down is 30 FPS (or worse, 15 FPS), which is indeed a bit low. But as soon as I get my FreeSync monitor, I'll just set FRTC to 65 FPS tops.
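The divisor effect is easy to see numerically; a quick Python sketch, assuming plain double-buffered V-Sync on a 60 Hz panel (triple buffering changes the picture):

Code:
import math

REFRESH_HZ = 60
VBLANK_S = 1.0 / REFRESH_HZ  # ~16.7 ms between refreshes

def vsynced_fps(render_time_s):
    """With double-buffered V-Sync a finished frame can only be shown on
    a vblank, so the rate snaps to refresh/n for the smallest integer n
    of refresh intervals the render time fits into."""
    slots = max(1, math.ceil(render_time_s / VBLANK_S))
    return REFRESH_HZ / slots

for ms in (15.0, 17.0, 34.0, 51.0):
    print(f"{ms:.0f} ms/frame -> {vsynced_fps(ms / 1000):.1f} FPS")
# 15 ms -> 60.0, 17 ms -> 30.0, 34 ms -> 20.0, 51 ms -> 15.0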
 
Hitman episode 3 comparison of Fury X, 980 Ti and 1080.
Tss, tss, look at the dude's jacket on the Fury X in comparison with the GTX 1080 at 2:15. It's definitely not a streaming issue; I wonder whether AMD forced the developers to screw up the carpet on the GTX 1080 just for parity :yep2:
 
Another reason to ban people from this forum: those who think testing at 8MP is not the most preferable choice for the most powerful cards on the market.
Is this directed at me regarding banning?
You missed the next line, which gave the statement you quoted its context:
me said:
That was the original context, due to some keen amateur benchmarkers only doing one resolution for a game or chapter.

Anyway....
Have a guess what resolution Nvidia promotes for the latest Mirror's Edge game?
Yep, 1440p.
And before responding: yes, I appreciate other resolutions can be used, but the point and context is that this suggests they see 1440p as being more relevant.
In the same way, Microsoft has responded by removing the capped framerate from UWP to ensure higher performance at 1080p and 1440p, which they only did after complaints from gamers.
Maybe this suggests, as I said, that the "true" 4K GPUs will be the 1080 Ti/Titan/big Vega.
Cheers

Edit:
Here is the recommendation:
[Image: mirrors-edge-catalyst-nvidia-recommended-graphics-cards-update.png]


I could make many more factual points, but it is academic as we will disagree.
 
Why does it have to be one choice? :rolleyes: What are you, FutureCPO*?

*Future Console Peasant Oafs

:mrgreen:
 
Of course.

You seem to think this forum is something it isn't. This is a technical forum.

It's not about benchmarking for the sake of making a purchasing decision to go with a particular monitor.
I like how you ignored everyone else who also focused on that context, including what they play at...
Anyway, last point: I also outlined why 1440p is better from a technical standpoint than 4K for now.
I could point out all the other posts by others that meet your criteria for banning...
But then that would be petty.
As a reference, my initial post that several decided to argue with was about benchmark testing and 4K being limited; eventually I explained that part of this comes down to hardware architecture and spec, and gave the example that the 1080 is still not ideal because its ROP/SM/GPC structure is that of a 980 rather than a 980 Ti.
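To put rough numbers on that, here is a back-of-envelope Python sketch using the published ROP counts and boost clocks (theoretical fill rate only; real behaviour depends on bandwidth and much else):

Code:
# Theoretical pixel fill rate = ROPs x boost clock, from published specs.
cards = {
    "GTX 980":    (64, 1216),  # (ROPs, boost MHz)
    "GTX 980 Ti": (96, 1075),
    "GTX 1080":   (64, 1733),
}

for name, (rops, mhz) in cards.items():
    print(f"{name}: {rops * mhz / 1000:.1f} Gpixel/s")
# 980: 77.8, 980 Ti: 103.2, 1080: 110.9. The 1080 only leads through
# clock speed; its back-end layout (64 ROPs, 4 GPCs) mirrors the 980,
# not the 96-ROP, 6-GPC 980 Ti.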
But why I am defending myself, I do not know.
OP that kicked this off: https://forum.beyond3d.com/threads/...-benchmarking-spawn.58013/page-4#post-1919394
 
Why does it have to be one choice? :rolleyes: What are you, FutureCPO*?

*Future Console Peasant Oafs

:mrgreen:
It doesn't, unless it is a discussion involving opinions :)
But one aspect not touched on, and a good reason for using multiple resolutions and not just 4K: look at how the AMD and Nvidia performance trends change as they go from 1440p to 4K.
In many instances it can be shown that Nvidia's performance starts to trail off relative to AMD at 4K, compared with 1440p and lower where Nvidia is strong.
The point being that current hardware is not really designed for optimal 4K, although AMD generally does better in that regard.
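A quick way to see that trend in review data is to compare each card's 4K-to-1440p frame rate ratio rather than the raw bars; a Python sketch (the FPS figures are invented purely to show the calculation):

Code:
# Hypothetical benchmark figures; substitute real review numbers.
results = {
    "Card A": {"1440p": 90.0, "4K": 45.0},
    "Card B": {"1440p": 80.0, "4K": 48.0},
}

for card, fps in results.items():
    retention = fps["4K"] / fps["1440p"]
    print(f"{card}: keeps {retention:.0%} of its 1440p frame rate at 4K")
# Card A keeps 50%, Card B keeps 60%: B scales better to 4K even though
# A wins outright at 1440p.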

Cheers
 
Is this directed at me regarding banning?
You missed the next line, which gave the statement you quoted its context:


Anyway....
Have a guess what resolution Nvidia promotes for the latest Mirror's Edge game?
Yep, 1440p.
And before responding: yes, I appreciate other resolutions can be used, but the point and context is that this suggests they see 1440p as being more relevant.
In the same way, Microsoft has responded by removing the capped framerate from UWP to ensure higher performance at 1080p and 1440p, which they only did after complaints from gamers.
Maybe this suggests, as I said, that the "true" 4K GPUs will be the 1080 Ti/Titan/big Vega.
Cheers.

Since this is OT...
If they had a GPU capable of doing 60 fps at 4K on the Hyper setting, they would have it in their recommendation. Actually, the 1080 is doing a lot better at 4K vs the Fury X in that game: http://www.gamersnexus.net/game-ben...st-graphics-card-benchmark-gtx-1080-1070-390x
 
This makes it the third game in which the Fury X's limited VRAM impedes it from running the maximum texture setting, after Rainbow Six Siege and Doom.
 
This makes it the third game in which the Fury X's limited VRAM impedes it from running the maximum texture setting, after Rainbow Six Siege and Doom.

I'm not quite sure I would call the textures of Mirror's Edge "maximum textures". If something is eating so much VRAM, it is not the textures (relative to quality, anyway).
 
I think it has more to do with the post-processing of Ultra rather than memory per se, along with maybe the driver and dynamic VRAM not being optimal (excluding Hyper).

The gap between the 8 GB 390X and the Fury X is static at around 18% in the Fury X's favour at both 1080p and 1440p.
However, at those resolutions the 980 Ti's lead grows from 18% to roughly 45% when using Ultra.
But at 4K with the High setting, the Fury X is actually faster than the 980 Ti.

So it seems it may be a combination of the Ultra post-processing and possibly the driver's dynamic VRAM handling affecting the Fury X and 390X.
Maybe we will get to see some analysis of the memory behaviour on both Nvidia and AMD when it is reviewed again.
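For clarity, the leads above are simple frame rate ratios; a one-liner Python sketch (the FPS values are invented just to illustrate the percentages):

Code:
def lead(a_fps, b_fps):
    """Percentage by which card A leads card B."""
    return (a_fps / b_fps - 1) * 100

# Invented example: 58 FPS vs 40 FPS is the ~45% Ultra-class gap,
# while 47.2 FPS vs 40 FPS is the ~18% one.
print(f"{lead(58.0, 40.0):.0f}%")  # 45%
print(f"{lead(47.2, 40.0):.0f}%")  # 18%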
Cheers
 
I think it has more to do with the post-processing of Ultra rather than memory per se, along with maybe the driver and dynamic VRAM not being optimal (excluding Hyper).

The gap between the 8 GB 390X and the Fury X is static at around 18% in the Fury X's favour at both 1080p and 1440p.
However, at those resolutions the 980 Ti's lead grows from 18% to roughly 45% when using Ultra.
But at 4K with the High setting, the Fury X is actually faster than the 980 Ti.

So it seems it may be a combination of the Ultra post-processing and possibly the driver's dynamic VRAM handling affecting the Fury X and 390X.
Maybe we will get to see some analysis of the memory behaviour on both Nvidia and AMD when it is reviewed again.
Cheers

Well, this "Hyper" setting was not available in the beta, and AMD only got the same build that was used in the beta. It quite reminds me of the Doom situation, in the sense that new options and settings suddenly appear on launch day. I'm pretty sure that now that AMD has it, a new driver will come pretty soon that addresses this issue. Dynamic VRAM is not really an issue with GCN; in fact, it can even use system memory to store data that doesn't need to be retrieved fast (as we do with OpenCL raytracing, which lets us go way past the 3 GB limits of both my GPUs without much loss in samples/sec).
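A minimal illustration of that spill-to-host trick, as a pyopencl sketch (not our renderer's actual code; where the buffer ultimately lives is up to the driver):

Code:
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
mf = cl.mem_flags

# Cold data, e.g. rarely-hit scene geometry: 512 MB we would rather
# not spend device VRAM on.
cold = np.zeros(128 * 1024 * 1024, dtype=np.float32)

# ALLOC_HOST_PTR asks the runtime for host-accessible memory; combined
# with COPY_HOST_PTR the data is placed there rather than in VRAM, and
# the GPU fetches it over PCIe on its (infrequent) accesses.
cold_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.ALLOC_HOST_PTR | mf.COPY_HOST_PTR,
                     hostbuf=cold)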

Anyway, there's a 38% performance loss going from Ultra to Hyper on the 1080... quite a big cost for not much quality gain. Actually, reading some forum threads about this game, results are quite mixed: it seems some 980 Ti users suffer from bad stutters, while some 980 users claim to get perfectly fluid and smooth frametimes. Well, this game and driver will need some fixes.
 