No DX12 Software is Suitable for Benchmarking *spawn*

Is Linux a suitable replacement for benchmarking?

20-Way GPU Gaming Comparison With March 2020 Linux Drivers
March 4, 2020

https://www.phoronix.com/scan.php?page=article&item=march-2020-gaming&num=1
 
Maybe more cards hitting hard memory limits? Apparently, DE is supposed to eat vidmem even before breakfast.
 
The game is definitely VRAM hungry. On max settings at 1440p with no resolution scaling, my 1080 Ti's GPU utilization hovers around the 40% mark with nearly 8 GB of VRAM occupied.
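If anyone wants to log the same numbers while playing, here is a minimal C++ sketch using NVIDIA's NVML library (the interface nvidia-smi is built on). The device index 0, the one-second polling interval, and the fixed 60-sample loop are assumptions on my part, not anything tied to the game.

// Minimal VRAM/utilization logger using NVML (link with -lnvidia-ml).
// Assumes a single NVIDIA GPU at index 0 and a 1-second polling interval.
#include <nvml.h>
#include <cstdio>
#include <chrono>
#include <thread>

int main() {
    if (nvmlInit() != NVML_SUCCESS) {
        std::fprintf(stderr, "NVML init failed\n");
        return 1;
    }

    nvmlDevice_t dev;
    nvmlDeviceGetHandleByIndex(0, &dev);           // first GPU in the system

    for (int i = 0; i < 60; ++i) {                 // roughly one minute of samples
        nvmlMemory_t mem;
        nvmlUtilization_t util;
        nvmlDeviceGetMemoryInfo(dev, &mem);        // used/total VRAM in bytes
        nvmlDeviceGetUtilizationRates(dev, &util); // GPU busy % over the last sample period
        std::printf("GPU %3u%%  VRAM %5.2f / %5.2f GB\n",
                    util.gpu,
                    mem.used  / 1073741824.0,
                    mem.total / 1073741824.0);
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
    nvmlShutdown();
    return 0;
}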
 
According to TPU, it stays within the 8 GB limit until you go to 4K, where it jumps slightly over 8 GB, making pretty much everyone run out of memory, the 2080 Ti aside.
 
Without knowing the architecture of the renderer and which hardware features it’s leveraging, I don’t know how anyone can make claims about how things should be performing.
 
In general, the engine is a forward renderer with a tile-based occlusion pre-pass for the lighting. My guess is that if the engine is using async compute, that could be the main reason Kepler is struggling without a dedicated driver profile, considering that architecture is particularly reliant on fine software tuning of the driver's compiler for sustained resource utilization.
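To make the tile-based pre-pass idea concrete, here is a small CPU-side C++ sketch of generic tiled-forward light culling: a pre-pass builds a per-tile light list, and the shading loop then touches only the lights in each pixel's tile. The tile size, light count, and the screen-space circle test are illustrative assumptions, not id Tech's actual implementation; the culling pass is the kind of work an engine might push to an async compute queue.

// CPU sketch of the tiled-forward idea: cull lights per screen tile in a
// pre-pass, then shade each pixel against only its tile's light list.
#include <cstdio>
#include <vector>

struct Light { float x, y, radius; };              // screen-space position + radius

constexpr int W = 1920, H = 1080, TILE = 16;
constexpr int TX = (W + TILE - 1) / TILE, TY = (H + TILE - 1) / TILE;

int main() {
    std::vector<Light> lights;
    for (int i = 0; i < 256; ++i)                   // made-up light set for the demo
        lights.push_back({float((i * 97) % W), float((i * 57) % H), 60.0f});

    // Pre-pass: build one light list per tile.
    std::vector<std::vector<int>> tileLights(TX * TY);
    for (int ty = 0; ty < TY; ++ty)
        for (int tx = 0; tx < TX; ++tx) {
            float cx = tx * TILE + TILE * 0.5f, cy = ty * TILE + TILE * 0.5f;
            for (int i = 0; i < (int)lights.size(); ++i) {
                float dx = lights[i].x - cx, dy = lights[i].y - cy;
                float reach = lights[i].radius + TILE * 0.7072f;  // light radius + tile half-diagonal
                if (dx * dx + dy * dy <= reach * reach)
                    tileLights[ty * TX + tx].push_back(i);
            }
        }

    // Forward shading loop: each pixel only visits the lights of its tile.
    double touched = 0;
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            touched += tileLights[(y / TILE) * TX + (x / TILE)].size();

    std::printf("avg lights evaluated per pixel: %.2f (out of %zu total)\n",
                touched / (double(W) * H), lights.size());
    return 0;
}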
 

Yeah, I've read that the renderer is still a forward renderer. It seems like they've changed/upgraded their AA solution again (computed asynchronously, accumulating up to 32 samples per pixel), and you can't actually change it or turn it off (except in the console). They stated that turning it off breaks a lot of things, because they rely on temporal accumulation. They've also completely redone the texturing system and gotten rid of megatexture entirely. There are so many places where they've upgraded from Doom 2016 to Wolf 2 to Doom Eternal that I really don't think anyone should be reading into Doom 2016 numbers and extrapolating to Doom Eternal. Looking forward to future tech talks on this one.
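Purely to illustrate the "accumulates up to 32 samples per pixel" part, here is a toy C++ sketch of temporal accumulation on a single pixel value: each frame's jittered sample is blended into a history value with weight 1/N, with N capped at 32. The noise model and the cap handling are assumptions on my part; a real TAA also reprojects the history and clamps it to limit ghosting, which this skips.

// Toy sketch of temporal accumulation: blend each frame's jittered sample
// into a history value with weight 1/N, capping N at 32 samples.
#include <algorithm>
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::normal_distribution<float> jitterNoise(0.0f, 0.25f);

    const float groundTruth = 0.5f;   // value the pixel should converge to
    float history = 0.0f;             // accumulated pixel value
    int   samples = 0;                // how many frames have been blended in

    for (int frame = 1; frame <= 120; ++frame) {
        float current = groundTruth + jitterNoise(rng);    // this frame's noisy sample
        samples = std::min(samples + 1, 32);                // cap the effective sample count
        float alpha = 1.0f / samples;                        // newest sample's blend weight
        history = history + alpha * (current - history);     // running accumulation

        if (frame % 30 == 0)
            std::printf("frame %3d: history = %.4f (error %+.4f)\n",
                        frame, history, history - groundTruth);
    }
    return 0;
}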
 
Source please. I'm interested in what they did (unless they just went back to loading everything a level requires beforehand ^^)
 
Resident Evil 3 Benchmark Test & Performance Analysis - 27 Graphics Cards Compared
March 30, 2020
In this mini-review, we test Resident Evil 3 across a wide selection of graphics cards from all price segments; we test both the DirectX 11 and DirectX 12 modes and present comparisons between both.
...
Resident Evil 3 supports both the DirectX 11 and DirectX 12 API. Just like in previous RE Engine games the DirectX 12 render path is actually a little bit slower than DirectX 11. Even on older cards, like NVIDIA Pascal and AMD Polaris. The only difference is the GTX 1060 3 GB, which seems to take a smaller performance hit on DX12 when it runs out of VRAM at 1440p and 4K—still not the right card for those resolutions. In a reversal of that result, the Radeon RX 5500 XT with its 4 GB VRAM and PCIe x8 interface does a lot worse in DX12 than DX11 when it runs out of memory. My recommendation is to switch to DirectX 11 for your playthrough and never look back.
https://www.techpowerup.com/review/resident-evil-3-benchmark-test-performance-analysis/
 