No DX12 Software is Suitable for Benchmarking *spawn*

But with Pascal? If the GPU went idle, it was most likely actually idle, and not just stalled internally.
Yeah, we have good examples of DX12 titles with extensive async compute implementations in which Pascal competes with or leads the competition: games like Ashes, Gears of War 4, Hitman and Sniper Elite 4.
 
Forza 7, in my informed opinion, seems to be heavily CPU limited in preparing frames for the GPU, thus limiting its utilisation.
CPU limited or frame setup limited? Jaguar cores aren't the fastest out there, and the frame times looked extremely solid, holding a consistent 12 ms throughout the benchmark. A genuinely dynamic load should have shown more variation: there would simply be portions of the track with more trees or cars.
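For what it's worth, that kind of consistency is easy to check from a frametime capture (a PresentMon/FRAPS-style log of per-frame times). A minimal sketch, with made-up numbers:

```python
# Sketch: checking frame-time consistency from a frametime capture
# (a PresentMon/FRAPS-style log of per-frame times in ms).
# The numbers below are made up for illustration.
import statistics

frame_times_ms = [12.1, 11.9, 12.0, 12.2, 11.8, 12.0, 12.1, 11.9]

mean = statistics.mean(frame_times_ms)
spread = statistics.pstdev(frame_times_ms)

# A near-constant ~12 ms trace (tiny spread relative to the mean) suggests
# something is holding the frame rate steady, rather than the load tracking
# scene complexity (more cars/trees should show up as variation).
print(f"mean={mean:.2f} ms, stdev={spread:.3f} ms")
```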

We've yet to see any real examples of games pushing async compute either. Even AOTS only gained 30% or so, and its workload wasn't all that asynchronous. Doom is unknown, but there was that recent quote of 80%+ on some titles. Most titles would be better described as parallel compute than asynchronous compute, as the workload is consistent.
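The bounded gains make sense if you model the overlap directly: fully hiding a compute workload behind graphics only saves the serial cost of that workload. A toy sketch (hypothetical timings, not from any real title):

```python
# Toy model of async compute gains: if compute work can overlap graphics
# work, the frame takes max(g, c) instead of g + c.
# All timings are hypothetical, in ms.

def async_gain(graphics_ms: float, compute_ms: float) -> float:
    serial = graphics_ms + compute_ms
    overlapped = max(graphics_ms, compute_ms)
    return (serial - overlapped) / overlapped  # fractional speedup

# 10 ms of graphics with 3 ms of compute fully hidden behind it:
print(f"{async_gain(10.0, 3.0):.0%}")  # prints 30%
```

So a "30% or so" figure is consistent with a modest compute workload that overlaps well, not with the whole frame being asynchronous.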

Assuming Forza 7 was designed with VR in mind (cars providing a frame of reference being ideal), it may be tuned more for latency than pure FPS. Gaps in execution would be expected, along with the possibility of leaning on async compute with different queue priorities. On a console, using the GPU as a coprocessor for frame setup wouldn't be unreasonable.
 
PCGH tested Total War: Warhammer 2. NVIDIA cards lose a lot of performance in DX12, while AMD cards achieve noticeable gains in DX12. However, NVIDIA's DX11 is far faster than even AMD's DX12, to the point that a GTX 1070 (DX11) is faster than a Vega 64 Liquid (DX12). This is exactly the same situation as the first Total War: Warhammer.

http://www.pcgameshardware.de/Total...60823/Specials/Benchmark-Test-Review-1240653/
 
Destiny 2 GPU Benchmark: Massive Uplift Since Beta
October 25, 2017
We’re seeing some CPU uplift from the beta, resulting in an upward pull on results across the board. This is coupled with major modifications to AMD’s driver implementation, which AMD itself earmarks as offering “up to 43%” performance improvement. Our numbers have largely validated AMD’s, and we believe we’ve pinpointed most of said uplift to shader optimizations with some of the Highest settings. Depth of Field seems to be of particular note, as indicated by our beta graphics optimization guide. Retaining all Highest settings, but dropping DOF to High, our beta testing suggested that 82% of “High” performance could be gained.

It seems as if either AMD or Bungie optimized how some of these Highest settings are handled on AMD hardware, offering some of the near-50% boosts that we’re seeing over beta performance.

NVIDIA's uplift in these tests (versus beta) largely seems to correspond with CPU performance uplift on the whole, which could be caused by a couple of things – game-level changes being one of them.

Our CPU benchmark is forthcoming and targeted for publication within the next 24 hours. Keep an eye out for that.
https://www.gamersnexus.net/game-bench/3096-destiny-2-gpu-benchmark-launch-performance-uplift
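A sketch of the uplift arithmetic behind quotes like AMD's "up to 43%": it's the relative gain of a new result over a baseline. The 60 fps baseline below is hypothetical, just to show the formula:

```python
# "Uplift" as the relative gain of a new result over a baseline.
# The 60 fps baseline is hypothetical; only the formula matters here.

def uplift(baseline_fps: float, new_fps: float) -> float:
    return (new_fps - baseline_fps) / baseline_fps

# If the beta ran at 60 fps and launch runs 43% faster:
print(f"{uplift(60.0, 85.8):.0%}")  # prints 43%
```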
 
Well, here's one game where the 1070 Ti won't surpass the Vega 56 on highest settings.

It's a DX11 title though.



I wonder why Bungie went with DX11. They've been developing on consoles with low-overhead APIs + x86 + GCN GPUs for a while, and they're most probably using DX12 for the Xbone/XboneX versions.
Also, it's their first PC title/port in 16 years, so it's not like they're reusing older PC code.
This could have been the ideal opportunity to release a game with an engine designed for a low-overhead API from the ground up.

Are they afraid of losing customers who are still using Windows 7? If so, why not Vulkan?
 
Quite incredible how much performance Nvidia loses going from 1080p to 1440p in Destiny 2. It will be interesting to see if this can be resolved in future, but the gap between AMD and Nvidia is closer depending upon the settings used.

Separately, though, it does raise a discussion point: what should be the realistic settings for AA/edge smoothing? I notice quite a few sites using settings that produce massive frame rates (140+ fps at 1080p, keeping the same settings for 1440p), but in the real world most gamers would set higher settings/AA/smoothing at resolutions below 4K.
It is a tough call, and I'm not sure there is a definitive answer; IMO there is a need to benchmark a couple of different setups around those settings.

Edit:
And just to say, some sites are still overdoing those settings with some games.
Cheers
 
Do the results scale according to resolution for NV?

I’m rushing out now, but the 1080 Ti scores seemed to fall roughly linearly for the highest settings between 1080p and 1440p. Could be the stereotypical AMD CPU overhead skewing things.
 
So far GameGPU numbers reflect GamersNexus numbers in that GTX 1080 is ahead @1080p, but falls behind both Vegas @1440p and 2160p, ComputerBase and Guru3D paint the same picture as well. However PCGH have the GTX 1080 neck and neck with Vega 64 @1440p, but behind only @2160p.

http://gamegpu.com/action-/-fps-/-tps/destiny-2-test-gpu-cpu
https://www.gamersnexus.net/game-bench/3096-destiny-2-gpu-benchmark-launch-performance-uplift
https://www.computerbase.de/2017-10/destiny-2-benchmark/2/#diagramm-destiny-2-1920-1080
http://www.guru3d.com/articles_pages/destiny_2_pc_graphics_performance_benchmark_review,4.html
http://www.pcgameshardware.de/Desti...Destiny-2-PC-Technik-Test-Benchmarks-1241945/
 

The GamersNexus results are interesting: at 1440p the relative GPU performance/positions change between "highest" and "high", diverging a lot between AMD and Nvidia *shrug*.
Computerbase.de noticed some quirks in resolution performance scaling with Nvidia, along with notable influence not just from the primary visual settings but also from the AA/smoothing used.
But there could also be quirks with the AMD results in certain circumstances in this game.
 
Yep, NVIDIA is suffering badly at higher resolutions; as the resolution increases, NVIDIA GPUs have their fps halved! That's why the 1080 and 1070 fall behind @1440p and 2160p.

I am guessing a new driver will come to fix this glaring difference.
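For reference, here is what pixel-linear scaling would predict. This assumes fps is inversely proportional to pixel count, which ignores bandwidth, geometry and CPU limits; the 100 fps figure is hypothetical:

```python
# What pixel-linear fps scaling would predict. Assumes fps is inversely
# proportional to pixel count, which ignores bandwidth, geometry and
# CPU limits; the 100 fps figure is hypothetical.

RES = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "2160p": 3840 * 2160}

def expected_fps(fps_1080p: float, target: str) -> float:
    return fps_1080p * RES["1080p"] / RES[target]

# 1440p has ~1.78x the pixels of 1080p, so 100 fps at 1080p would drop
# to ~56 fps under pixel-linear scaling. Halving (to 50 fps) is worse
# than that, pointing at something beyond raw pixel cost.
print(f"{expected_fps(100.0, '1440p'):.2f}")  # prints 56.25
```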

 
More Forza 7 Testing from PCPer:

[Images: forza7-avgfps.png, Forza7_2560x1440_PLOT.png]

Taking a look at the first chart, we can see that while the GTX 1080 frame times are extremely consistent, the RX Vega 64 shows some additional variance.

However, the frame time variance chart shows that over 95% of the frame times of the RX Vega 64 come in at under 2ms of variance, which will still provide a smooth gameplay experience in most scenarios. This matches with our experience while playing on both AMD and NVIDIA hardware where we saw no major issues with gameplay smoothness.


https://www.pcper.com/reviews/Graphics-Cards/Forza-Motorsport-7-Performance-Preview-Vega-vs-Pascal
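One plausible reading of that variance metric: frame-to-frame variance as the absolute difference between consecutive frame times, counted against a 2 ms threshold. A sketch with made-up trace data:

```python
# One plausible reading of the "under 2 ms of variance" metric:
# frame-to-frame variance as the absolute difference between consecutive
# frame times, counted against a threshold. Trace data is made up.

def variance_under(frame_times_ms, threshold_ms=2.0):
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(1 for d in deltas if d <= threshold_ms) / len(deltas)

trace = [16.2, 16.5, 16.1, 19.0, 16.3, 16.4, 16.2, 16.6, 16.1, 16.3]
print(f"{variance_under(trace):.0%} of frame-to-frame deltas under 2 ms")
```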
 
Another test of Total War: Warhammer 2, from Hardware.fr, shows the exact same phenomenon: NVIDIA's DX11 is 15% faster than AMD's DX12, with each vendor running the best API available to it. This was confirmed previously by both PCGH and GameGPU.
http://www.pcgameshardware.de/Total...60823/Specials/Benchmark-Test-Review-1240653/
http://www.hardware.fr/articles/971-16/benchmark-totalwar-warhammer-ii.html
http://gamegpu.com/rts-/-стратегии/total-war-warhammer-ii-test-gpu-cpu
 