No DX12 Software is Suitable for Benchmarking *spawn*

The R9 290 can be found for less than €150 on eBay. Who would have thought a €300 mGPU combo could ever match a recent €650 single card at 4K?
Unfortunately it's one of the rare big titles nowadays with low VRAM usage; maybe because the game has a primarily PC focus? I just wouldn't buy 4GB cards anymore as a gamer.

The 480 CF at $500 vs the 1080 at $600 is telling as well. Good to see developers utilizing DX12 mGPU.
 
Unfortunately it's one of the rare big titles nowadays with low VRAM usage; maybe because the game has a primarily PC focus? I just wouldn't buy 4GB cards anymore as a gamer.

The 480 CF at $500 vs the 1080 at $600 is telling as well. Good to see developers utilizing DX12 mGPU.

I know it's just this one title and I haven't played the game to see if the graphics are anything out of the ordinary compared to other FPS games (e.g. Battlefield 1).
I'm just happy to see how DX12 seems to be revitalizing multi-GPU solutions.

mGPU has traditionally been great for cost-conscious high-end performance (with a few caveats), but aside from GameGPU almost no one tests SLI/CrossFire performance, and I even remember AnandTech publishing an article stating that mGPU was almost dead in the water.
Regarding the caveats, nowadays almost any AAA game that would make use of mGPU gets a supporting driver within the first couple of weeks after release, and frame pacing + adaptive sync have pretty much solved all the stuttering problems. I can vouch for that.
Even regarding power consumption, with frame-rate control in the latest drivers I'm getting substantially lower consumption by setting a limit to 70 FPS. My PC's consumption at the wall used to be close to 900W in games but now it rarely touches 600W. It's usually in the 550W range. The drivers are doing an excellent job with "clocks-on-demand" here.
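
For anyone curious how a plain frame cap cuts power like that, here's a rough sketch of the general idea (not AMD's actual FRTC/Chill code, and the render call is a hypothetical placeholder): the loop sleeps out whatever is left of each frame's budget, so the GPU sits idle long enough for "clocks-on-demand" to drop it to lower power states.

```cpp
// Hypothetical frame limiter sketch, NOT real driver code.
#include <chrono>
#include <thread>
#include <cstdio>

int main() {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(1'000'000 / 70); // ~14.3 ms per frame for a 70 FPS cap
    auto next_deadline = clock::now() + frame_budget;

    for (int frame = 0; frame < 5; ++frame) {   // a few dummy "frames"
        // simulate_and_render();                // real game work would go here (placeholder)

        // Sleep until the deadline: during this gap the GPU has nothing to do,
        // which is where the power savings come from.
        std::this_thread::sleep_until(next_deadline);
        next_deadline += frame_budget;
        std::printf("frame %d presented\n", frame);
    }
    return 0;
}
```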
 
Why 70 FPS and not 60 FPS?
My monitor has FreeSync between 40 and 75Hz.
I guess I'll just feel like I'm throwing away capabilities if I don't put it above 60 FPS lol.

Regardless, for strategy games like XCOM 2, which I played a couple of weeks ago, I put the limit at 60 FPS.
 
More Sniper Elite 4 testing:


https://www.overclock3d.net/reviews/software/sniper_elite_4_performance_review/8

https://www.computerbase.de/2017-02...ramm-sniper-elite-4-nvidia-geforce-gtx-1060_2
 
Why not post screenshots of the 4K results, where the nearly two-year-old Fury X leapfrogs the 1070...
At that resolution most would be using factory OC'd cards, and we know how that would end up ...
 
This argument is pretty far-fetched. I figure most NV buyers/fans would go for factory OC cards, given the company's tradition of encouraging such models. Resolution has little to do with it.
 
Why not post screenshots of the 4K results, where the nearly two-year-old Fury X leapfrogs the 1070...
I always post the 1080p shots, regardless of who the winner is, since 1080p is by far the most common resolution to date. Besides, most games are still unplayable at 4K (in this game, ComputerBase downgraded the visuals from Ultra to High to obtain playable fps at 4K in all situations). Also, the Fury X used to go toe to toe with the 980 Ti at 4K or exceed it, so seeing it do the same here is not surprising.
 
Even regarding power consumption, with frame-rate control in the latest drivers I'm getting substantially lower consumption by setting a limit to 70 FPS. My PC's consumption at the wall used to be close to 900W in games but now it rarely touches 600W. It's usually in the 550W range. The drivers are doing an excellent job with "clocks-on-demand" here.
That is insane. My PC draws just over 300W with an OC'd 1080. It must get hot in your room?
 
That is insane. My PC draws just over 300W with an OC'd 1080. It must get hot in your room?

Dude this winter has been mighty cold in here. I sometimes have to turn on the heat even when I'm gaming.
 
The problem is the classic "let's show it at max resolution" even though nobody would actually play this game at a 38 fps average...
We saw the same in the past when comparing the 390X to Nvidia; it's kinda meaningless outside of SLI or cards actually accepted for 4K (even the GTX 1080 is not seen as the ideal 4K GPU for PCs, and we need one more generation or a Pascal Titan equivalent).

But for the sake of comparison, PCGamesHardware at 4K has the 980 Ti matching the Fury X and slightly ahead of the 1070 by about 3%, with custom AIB cards in all cases.

At the more playable resolution of 1440p, the custom 1070 and custom 980 Ti are 12% faster than the Fury X.
Playable in this case means 53 to 65 fps.
http://www.pcgameshardware.de/Sniper-Elite-4-Spiel-56795/Tests/Direct-X-12-Benchmark-1220545/
Cheers.
 
The problem is the classic "let's show it at max resolution" even though nobody would actually play this game at a 38 fps average...

Please do explain the problem of playing a game like Sniper Elite 4 at 38 FPS (let's assume vsynced 30 FPS) and why people with 4K monitors will "never" play at that framerate.
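
(For anyone wondering where the "vsynced 30 FPS" comes from: with plain double-buffered vsync on a 60Hz panel a frame can only be shown on a refresh boundary, so a frame that misses the 16.7 ms budget waits for the next refresh. A rough back-of-the-envelope sketch, with purely illustrative numbers:)

```cpp
// Illustrative only: assumes plain double-buffered vsync, no triple buffering or adaptive sync.
#include <cmath>
#include <cstdio>

int main() {
    const double refresh_hz   = 60.0;
    const double refresh_ms   = 1000.0 / refresh_hz;    // 16.7 ms per refresh
    const double avg_frame_ms = 1000.0 / 38.0;           // ~26.3 ms at a 38 FPS average

    // Under vsync each frame occupies a whole number of refresh intervals.
    const double intervals     = std::ceil(avg_frame_ms / refresh_ms); // = 2
    const double effective_fps = refresh_hz / intervals;               // 60 / 2 = 30

    std::printf("Effective vsynced rate: %.0f FPS\n", effective_fps);
    return 0;
}
```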
 
Please do explain the problem of playing a game like Sniper Elite 4 at 38 FPS (let's assume vsynced 30 FPS) and why people with 4K monitors will "never" play at that framerate.
How many FPS games do you lock to 30 fps on a 60Hz monitor?
I take it you are not irritated by the input response time or by playing such games (Sniper Elite games fall into this bracket for me) at that fps, but I am sensitive to it and so are many others.
And then how many people love paying a high price for a good 4K 60Hz monitor only to permanently lock it to 30Hz?

By your logic, it should be fine that games are never optimised for PC, and a 30 fps lock is great since it's just as playable as 60/144Hz for all games.
Cheers
 
How many FPS games do you lock to 30 fps on a 60Hz monitor?
I take it you are not irritated by the input response time or by playing such games (Sniper Elite games fall into this bracket for me) at that fps, but I am sensitive to it and so are many others.
I don't know how many of the other Sniper Elite games you've played, but I played Sniper Elite 3 on my laptop, co-op with a friend, during a weekend away. The laptop could only run it at 30 FPS and I had a blast with it.
It's not a fast-paced game; if you try to play it like one (i.e. "let's play this like I'm Rambo") you're dead in 5 seconds, unless you're on the ultra-easy mode that takes away the whole point of the game.
This is much more of a strategy FPS where you have to observe the terrain, guard positions and routes, vantage points, etc., and advance towards your target little by little. Actual snipers in WW1 and WW2 would stay still in the same spot for days or even weeks, waiting for the right opportunity to take a shot and give away their position.

So yeah, I ask again what exactly is so wrong about playing Sniper Elite at 30 fps.
It sure sounds like your logic is only "it's a first-person shooter so it must run at a gazillion FPS" and you have no idea what kind of game this is.



I am sensitive to it and so are many others.
You are sensitive to it and you think many others are too, so... these 4K results showing better performance on AMD cards should never be taken into account.
And it's totally not because you're butthurt about the results.
Got it.


And then how many people love paying a high price for a good 4K 60Hz monitor only to permanently lock it to 30Hz?
Almost everyone with a PS4 Pro and a 4K TV, playing their games in 4K mode. Which makes it the majority of people with a 4k 60Hz monitor.
Besides, people with a 4K monitor are interested in the higher detail. If their concern was response times and framerates they'd rather go with a 1080p 120Hz panel which is a lot cheaper.



By your logic, it should be fine that games are never optimised for PC, and a 30 fps lock is great since it's just as playable as 60/144Hz.
So by your logic, a lower framerate that's good enough for one game must always be good enough for all other games.
Strawman right back at you.
 
Please do explain the problem of playing a game like Sniper Elite 4 at 38 FPS (let's assume vsynced 30 FPS) and why people with 4K monitors will "never" play at that framerate.
And the most important fact in this context:
At vsynced 30 FPS the Fury X has the same performance as the 1070; it is not faster :)
This is using the PCGamesHardware benchmarks I linked earlier, which IMO do some of the best testing.
So the original point, that the Fury X is faster than a 1070, is then meaningless.
Anyway, it seems you agree you would not play games at 33-38.5 fps on a 60Hz monitor, and if you lock to 30Hz there are still some technical limitations compared to running at 60 fps. More importantly, the fps is capped at 30, so there is no difference between quite a few of the cards (Fury X/1070/1080).

Cheers
 