No DX12 Software is Suitable for Benchmarking *spawn*

Battlefield V Alpha: Our First Gameplay Benchmark Results

"But finally, after two long days of benchmarking, we compiled comparisons between DirectX 11 and DirectX 12 performance, a look at the various graphics quality presets, an analysis of run-to-run variance, and benchmarks for Nvidia’s five highest-end Pascal-based GeForce cards.
...
There’s a post up on EA’s Battlefield V Closed Alpha forum flat-out guiding players toward the DirectX 11 default setting. Specifically, “…we recommend using the defaulted DX11 setting as this build is a work-in-progress and using DX11 will improve performance.” To verify, we compared runs under both APIs using the Ultra quality preset on our MSI GeForce GTX 1070 Ti Gaming 8G."


[Image: Battlefield V Closed Alpha average FPS, 2560x1440, GeForce GTX 1070 Ti, Ultra preset (http://media.bestofmicro.com/D/E/781970/original/Battlefield-V-Closed-Alpha-AvFPSoB-2560x1440-GeForce-GTX-1070-Ti-Ultra.png)]

"Naturally, we expect EA DICE to continue refining Battlefield V’s technical foundation. What’s interesting, though, is that the game is based on a well-established Frostbite engine. It’s not clear what Battlefield V adds that’d negatively affect performance under DirectX 12 to such an extent."
https://www.tomshardware.com/reviews/battlefield-v-gameplay-benchmarks,5677.html
 
Battlefield V Alpha: Our First Gameplay Benchmark Results
For those who will ask the simple question, here's the answer:

Why no AMD cards? To begin, our closed alpha invite came by way of Nvidia’s PR firm, so we were already wary of running benchmarks that’d pit the two companies against each other. Then we spotted the following known issue on EA’s closed alpha forum: “[TRACKED] BFV - PC - Massive performance issue is present with the AMD RX series cards.” We’ll revisit a head-to-head between AMD and Nvidia once the final game becomes available in a few months.
 
Battlefield V Alpha: Our First Gameplay Benchmark Results

"But finally, after two long days of benchmarking, we compiled comparisons between DirectX 11 and DirectX 12 performance, a look at the various graphics quality presets, an analysis of run-to-run variance, and benchmarks for Nvidia’s five highest-end Pascal-based GeForce cards.
...
There’s a post up on EA’s Battlefield V Closed Alpha forum flat-out guiding players toward the DirectX 11 default setting. Specifically, “…we recommend using the defaulted DX11 setting as this build is a work-in-progress and using DX11 will improve performance.” To verify, we compared runs under both APIs using the Ultra quality preset on our MSI GeForce GTX 1070 Ti Gaming 8G."


aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS9EL0UvNzgxOTcwL29yaWdpbmFsL0JhdHRsZWZpZWxkLVYtQ2xvc2VkLUFscGhhLUF2RlBTb0ItMjU2MHgxNDQwLUdlRm9yY2UtR1RYLTEwNzAtVGktVWx0cmEucG5n

"Naturally, we expect EA DICE to continue refining Battlefield V’s technical foundation. What’s interesting, though, is that the game is based on a well-established Frostbite engine. It’s not clear what Battlefield V adds that’d negatively affect performance under DirectX 12 to such an extent."
https://www.tomshardware.com/reviews/battlefield-v-gameplay-benchmarks,5677.html
Actually, PCGamesN beat them to the punch:
https://www.pcgamesn.com/amd-nvidia-battlefield-5-graphics-performance

They compare the GTX 1060 6GB and RX 580, which generally offer about the same performance. In BFV's current state, the RX 580 beats the GTX 1060 by notable margins at both 1080p and 1440p.
 
I thought the new APIs, like DirectX 12 and Vulkan, were mostly beneficial for algorithmic flexibility. id basically said Wolfenstein II couldn't have been made with OpenGL. So, until your game abandons DX11/OpenGL and goes fully into DX12/Vulkan, you won't see the benefits.
 
Curious why just DX11/OpenGL? Doesn't the DX12 version also have to be free of any Vulkan code to see the "benefits"?

I just mean in the sense that DX12 and Vulkan provide flexibility over the previous "high-level" APIs. So the games that will really take advantage of the new low-level flexibility are the ones that are not designed and architected with the old APIs in mind.
 
Even with low-end CPUs, the gains are usually nonexistent to limited.
I was able to play Deus Ex: Mankind Divided at high details thanks to the DX12 port on a very old i5 750 (though overclocked to 3.8GHz) and an R9 280X. In DX11 mode it was unplayable. A good DX12 port can significantly lower the CPU overhead, which means you will only see benefits in CPU-limited scenarios. If you want more, the developers have to completely rethink the game engine and the rendering algorithms, not just make a port by adding a D3D12 backend.
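To make that concrete: the main CPU-side win in D3D12 is that every worker thread can record its own command list and the engine submits them all in one batch, instead of funneling everything through DX11's single immediate context. A rough, stripped-down C++ sketch of that pattern (empty command lists on the default adapter, no rendering, no error handling, thread count picked arbitrarily):

```cpp
// Rough sketch: parallel command-list recording in D3D12, the main source of
// its CPU-overhead advantage over DX11's single immediate context.
// Windows only; link d3d12.lib. Error handling omitted for brevity.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    D3D12_COMMAND_QUEUE_DESC qdesc = {};
    qdesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qdesc, IID_PPV_ARGS(&queue));

    const int workerCount = 4;  // arbitrary number of recording threads
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);
    for (int i = 0; i < workerCount; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Each worker records its share of the frame independently; under DX11 all
    // of this would have to be serialized through the one immediate context.
    std::vector<std::thread> workers;
    for (int i = 0; i < workerCount; ++i) {
        workers.emplace_back([&, i] {
            // ... record barriers, draws, etc. for this thread's chunk ...
            lists[i]->Close();
        });
    }
    for (auto& t : workers) t.join();

    // One cheap submission of everything that was recorded in parallel.
    ID3D12CommandList* raw[workerCount];
    for (int i = 0; i < workerCount; ++i) raw[i] = lists[i].Get();
    queue->ExecuteCommandLists(workerCount, raw);
    return 0;
}
```

An engine built around DX11's submission model can't easily split its frame up like this, which is why a bolted-on DX12 backend rarely shows gains outside CPU-limited scenarios.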
 
I was able to play Deus Ex: Mankind Divided at high details thanks to the DX12 port on a very old i5 750 (though overclocked to 3.8GHz) and an R9 280X. In DX11 mode it was unplayable.
I am not casting doubt on your experience in any way, but I would have loved a more objective evaluation with proper numbers, maybe results from the built-in benchmark, or an in-game scene while standing still. DX11 being unplayable could be the result of a bad driver version.
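For anyone who wants to put numbers on it: capture tools like PresentMon or OCAT can log per-frame times to a file, and reducing that to average FPS plus a high-percentile frame time is trivial. A rough C++ sketch, assuming a hypothetical frametimes.csv with one frame time in milliseconds per line:

```cpp
// Rough sketch: summarize a frame-time log (one value in ms per line) into
// average FPS and 99th-percentile frame time. The file name and format are
// assumptions; adapt them to whatever your capture tool actually writes.
#include <algorithm>
#include <fstream>
#include <iostream>
#include <vector>

int main() {
    std::ifstream in("frametimes.csv");   // hypothetical log file
    std::vector<double> ms;
    for (double v; in >> v; ) ms.push_back(v);
    if (ms.empty()) { std::cerr << "no samples\n"; return 1; }

    double total = 0.0;
    for (double v : ms) total += v;
    double avgFps = 1000.0 * ms.size() / total;

    std::sort(ms.begin(), ms.end());
    double p99 = ms[static_cast<size_t>(0.99 * (ms.size() - 1))];

    std::cout << "frames: " << ms.size()
              << "  avg FPS: " << avgFps
              << "  99th percentile frame time: " << p99 << " ms\n";
    return 0;
}
```

Average FPS alone hides stutter, so the high-percentile frame time is the number that would settle the "playable vs. unplayable" question.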

"…by way of Nvidia's PR firm…"?
Yeah, seems like massive issues for the PR firm to steer people clear of using Radeons.
The issue with Radeons is real, according to PCGH: constant stutters on the RX 580 that are not present on the GTX 1060.

 
Interesting that the other two articles are using Ryzen CPUs, but Tom's doesn't even mention any specifications anywhere that I can see. Given it's Tom's, I assume it's an Intel. You really need full system specs and driver versions, though.
 
Interesting that the other two articles are using Ryzen CPUs, but Tom's doesn't even mention any specifications anywhere that I can see. Given it's Tom's, I assume it's an Intel. You really need full system specs and driver versions, though.
I haven't followed the latest numbers, but with all the security fixes for Spectre and Meltdown in place, Ryzen may be the better testing platform. Would be interesting to see if there is a difference using Threadripper with even more cores. I doubt most games scale well enough, but would be interesting for a DX11/12 comparison.
 
I haven't followed the latest numbers, but with all the security fixes for Spectre and Meltdown in place, Ryzen may be the better testing platform. Would be interesting to see if there is a difference using Threadripper with even more cores. I doubt most games scale well enough, but would be interesting for a DX11/12 comparison.
Well, Battlefield 1 scales very well across multiple cores and loves Ryzen CPUs because of that (in 64-player multiplayer, not the single-player benchmark).
 
Single-module Ryzens don't scale badly at all, but I did encounter a couple of weird side effects with the multi-module (Threadripper, Epyc) ones.

For some odd reason, PCIe DMA transfers don't seem to like the default channel-interleaving pattern (256-byte stride, all channels across all modules) at all.
With cross-module channel interleaving, PCIe 3.0 x16 uploads drop to <9GB/s, and device-to-host downloads even drop to 2-3GB/s with massive jitter.
With multiple GPUs, downloads drop as far as 1-2GB/s per PCIe x16 port, and uploads also suffer significantly. Concurrent uploads and downloads hurt upload performance as well.

Keep the system to a single module, or at least limit interleaving to a single module via the corresponding BIOS setting, and everything appears to be fine:
12GB/s full-duplex, completely stable and jitter-free (which even Skylake still fails to achieve occasionally). Multi-GPU scales up to the available dual-channel memory bandwidth.

Limiting interleaving to a single module does hurt the memory bandwidth available to CPU work, though, as the system then effectively behaves like a dual-channel one.

There is a middle way: setting the stride to 2KB while keeping interleaving cross-module at least brought the download speed back up to ~8-9GB/s, without hurting memory-bandwidth-dependent applications too much.
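If anyone wants to reproduce those transfer numbers on their own box, a simple pinned-memory copy benchmark is enough to see the interleaving effect. A rough C++ sketch against the CUDA runtime API (buffer size and iteration count are arbitrary; any comparable API would do):

```cpp
// Rough sketch: measure PCIe upload (host->device) and download (device->host)
// bandwidth using pinned host memory, similar to the numbers quoted above.
// Build with nvcc, or a regular C++ compiler linked against cudart.
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    const size_t bytes = 256ull << 20;          // 256 MiB per copy (arbitrary)
    const int iters = 20;                       // arbitrary repeat count

    void* host = nullptr;
    void* dev  = nullptr;
    cudaMallocHost(&host, bytes);               // pinned, DMA-capable memory
    cudaMalloc(&dev, bytes);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    // Upload: host -> device
    cudaEventRecord(start);
    for (int i = 0; i < iters; ++i)
        cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    float upMs = 0.f;
    cudaEventElapsedTime(&upMs, start, stop);

    // Download: device -> host
    cudaEventRecord(start);
    for (int i = 0; i < iters; ++i)
        cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    float downMs = 0.f;
    cudaEventElapsedTime(&downMs, start, stop);

    double gb = (double)bytes * iters / 1e9;
    printf("upload:   %.2f GB/s\n", gb / (upMs / 1000.0));
    printf("download: %.2f GB/s\n", gb / (downMs / 1000.0));

    cudaFree(dev);
    cudaFreeHost(host);
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return 0;
}
```

Running the same binary with the different BIOS interleaving settings should show the upload/download asymmetry and jitter described above.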
 
Since when does it even make sense to talk about the performance of a two-day alpha test? Performance will get much more optimized at later stages of development and can still change dramatically. I really don't understand all these alpha and beta tests of unreleased games. It's OK for early-access games, since people can actually play them at that point, but unreleased games? I don't get it.
 
Since when does it even make sense to talk about the performance of a two-day alpha test? Performance will get much more optimized at later stages of development and can still change dramatically. I really don't understand all these alpha and beta tests of unreleased games. It's OK for early-access games, since people can actually play them at that point, but unreleased games? I don't get it.

Page hits?
 