Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

I really wonder what's going on with RTXGI. It's the same behavior in Warhammer. As you turn it on, it completely destroys performance beyond repair.

(screenshot: rton.png)

(screenshot: rtoff.png)

Note: the 66 FPS with RT off is CPU-limited, as you can see from the 55% GPU utilisation. So with the CPU out of the equation, it actually performs above 100 FPS.
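
To put a rough number on it (assuming FPS scales linearly with GPU utilisation, which is only an approximation): 66 / 0.55 ≈ 120 FPS once fully GPU-bound.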

Does performance get destroyed this much for you too, or is it just my system? This usually screams VRAM bottleneck, but it also seems to happen on low textures. And given how awfully RTXGI runs in The Witcher 3 even on high-end GPUs... it appears that is just the way it behaves. I wonder why, though. RTXGI used to be so efficient.
 
Maybe you should compare the efficient case directly with what's happening here, to try to understand the difference.
Huge world with lots of foliage and similar improvements to visual quality:


This was way before the release of Ada, and RTXGI was a lot more efficient there, as you can see for yourself.
 
RTXGI isn't just on/off; it's possible to adjust probe grid density, rays per probe, etc. It would actually be nice if The Witcher had low and high settings for it on PC.
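
For a sense of what those knobs control, here is a minimal C++ sketch of a DDGI-style settings block. The names are hypothetical, not the actual RTXGI SDK API; the point is the cost model, where the per-frame ray budget scales with probe count times rays per probe.

    // Hypothetical DDGI-style volume settings (illustrative names,
    // not the actual RTXGI SDK struct).
    struct DDGIVolumeSettings {
        int probeCountX = 22;           // probe grid density per axis
        int probeCountY = 8;
        int probeCountZ = 22;
        int raysPerProbe = 144;         // rays traced per probe, per frame
        float probeSpacingMeters = 1.0f;
    };

    // Per-frame ray budget: the number a "low" preset would cut.
    int RaysPerFrame(const DDGIVolumeSettings& v) {
        return v.probeCountX * v.probeCountY * v.probeCountZ * v.raysPerProbe;
    }

With these example numbers that is 22 × 8 × 22 × 144 ≈ 557k rays per frame; halving rays per probe or thinning the grid roughly halves the tracing cost, which is exactly what a low/high toggle could expose.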
 
Wo Long: Fallen Dynasty | Xbox Series S/X vs PS5 | Graphics Comparison Demo | Analista De Bits - YouTube

- PS5: Resolution Mode 1440p/60fps; FPS Mode dynamic 1440p/60fps (commonly 1332p)
- Xbox Series S: 1080p/60fps
- Xbox Series X: Resolution Mode 1440p/60fps; FPS Mode dynamic 1440p/60fps (commonly 1296p)
- This is a demo and the game still has a long way to go to finish its development, so its quality will differ from the final version.
- All versions use temporal reconstruction.
- Xbox Series S has lower quality lighting, shadows, textures, draw distance and some post-processing effects.
- Loading times are slightly faster on PS5.
- The framerate on Xbox Series S is considerably more stable than on the other 2 consoles due to its lower graphical demands. On PS5/XSX, I recommend FPS mode.
- Xbox Series X has a slightly lower average resolution in FPS mode compared to PS5.


The demo is out. From personal experience, it constantly crashes/throws errors on XSX. The workaround for some people is to kill the game, start a new one and skip the tutorial. Unfortunately, it doesn't work for me; I get a popup every minute or so saying something went wrong.
How was the demo graphically?
I believe they did an hour-long stream this week. How were the scale, draw distance, etc. looking?
 
Graphically, ignoring the technical stats that can be misleading (notably minimum resolution: "Only 900p in Forspoken!" with FSR2 and DRS in play), the Wo Long demo looked way worse than Forspoken to me: smaller-scale environments, loading that doesn't use the PS5's custom I/O, far less detailed enemies/environments, and many disappointing PS3-era textures.

But Wo Long obviously has the better art style. I guess that's what matters most here for many.
 
I don't quite know the reason, but at the exact same settings, DX12 seems to perform 35-55% slower than DX11 in The Witcher 3 when CPU-bound;
 

Attachments

  • The Witcher 3 12_17_2022 12_32_21 AM.png (6 MB · Views: 23)
  • The Witcher 3 12_17_2022 12_34_22 AM.png (6.7 MB · Views: 34)
@yamaci17 Yeah, it seems like their DX12 is very poor. To start with, it's D3D11On12, which isn't as good as native D3D12. Then you pile DXR on top of a bad D3D12 implementation, and it's not great.
I'm not really sold on the D3D11On12 thing, to be honest. Wouldn't something complex such as RTGI need native DX12 functions and a proper DX12 version of the engine? I'm really bamboozled on that front: if they truly use a wrapper to get a DX12 renderer, how could they pull off RTGI on DX11? It seems illogical to me. I still think it's possibly just a badly coded native DirectX 12 situation, similar to Battlefront 2 somehow.
 

Control does the same thing; it uses the D3D11On12 wrapper. I remember I wanted to try to profile it, and it was very difficult because of the limited tooling support for D3D11On12.
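
For what it's worth, the wrapper isn't a contradiction: D3D11On12 layers a D3D11 device on top of a real, native D3D12 device and command queue, so the legacy renderer can keep issuing D3D11 calls while DXR work is built on the underlying D3D12 device. A minimal sketch using the real D3D11On12CreateDevice entry point (error handling omitted):

    #include <d3d11on12.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    void CreateWrappedDevice()
    {
        // Create the native D3D12 device and a direct command queue first.
        ComPtr<ID3D12Device> device12;
        D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device12));

        D3D12_COMMAND_QUEUE_DESC queueDesc = {};
        queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
        ComPtr<ID3D12CommandQueue> queue;
        device12->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

        // Layer D3D11 on top: the old renderer keeps issuing D3D11 calls,
        // but they are translated and submitted through the D3D12 queue.
        ComPtr<ID3D11Device> device11;
        ComPtr<ID3D11DeviceContext> context11;
        IUnknown* queues[] = { queue.Get() };
        D3D11On12CreateDevice(device12.Get(), 0, nullptr, 0, queues, 1, 0,
                              &device11, &context11, nullptr);

        // DXR pipelines (the RTGI part) can be created on device12 directly,
        // which is how a mostly-DX11 engine can still drive ray tracing.
    }

The catch is that the translation and resource synchronisation between the two layers cost CPU time, which would fit the CPU-bound numbers being discussed here.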
 
I don't quite know the reason, but at the exact same settings, DX12 seems to perform 35-55% slower than DX11 in The Witcher 3 when CPU-bound;
Your settings might be the same, but the DX12 shot looks much sharper, with better resolve on the distant textures. That doesn't explain the performance gap, but the output is definitely different.
 
I don't quite know the reason, but at the exact same settings, DX12 seems to perform 35-55% slower than DX11 in The Witcher 3 when CPU-bound;
The Witcher 3 Complete Edition has significantly better LOD, NPC detail, shadow-casting lights and more by default, even at the same settings as the OG game. It is actually really obvious if you play them back to back: the new game has far less pop-in and much better distant detail at the same settings.

When you put both at Ultra, or even Low, right next to each other, they are not actually the same graphical presentation. The Complete Edition is much better in multiple regards.

That is why the distant textures in your shot look different: the geo/asset LOD is significantly higher in the new game at the same settings. You are essentially not measuring the same thing. At least that is what I have found.

See my screenshot below, Ultra in the OG vs Ultra in the Complete Edition.
(screenshot: assets_more.00_15_50_wcdqy.png)
 
Isn't the post by yamaci17 comparing the Complete Edition running in DX11 and DX12 mode? I'm sure the difference in sharpness is just TAAU being on in DX11 mode, which is why the image is blurrier.

Here is DX11 mode and DX12 mode in the Complete Edition compared with matching settings (as far as the game allows; the SSAO looks different), and DX12 is definitely performing worse.
 

Attachments

  • The Witcher 3 Screenshot 2022.12.17 - 11.21.49.63.jpg (3.7 MB · Views: 20)
  • The Witcher 3 Screenshot 2022.12.17 - 11.25.15.98.jpg (3.4 MB · Views: 12)
The DX12 version of the Complete Edition uses a different config; the DX11 version of the Complete Edition uses the same config as the old game. Go to Documents/The Witcher 3 to confirm.
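
If memory serves (worth double-checking on your own install), the split looks like this: the two executables read separate settings files under Documents\The Witcher 3, so identical menu settings do not guarantee an identical effective config.

    Documents\The Witcher 3\
        user.settings         <- read by the DX11 executable
        dx12user.settings     <- read by the DX12 executable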
 
My GPU became a heavy bottleneck at native 4K with DX11, and I had to go down to 1440p to make sure I was CPU-limited there. That is on me, sorry for the confusion; I should have compared them both at 1440p.

Here's the exact same comparison at native 1440p. The difference in sharpness/LOD in my earlier shots was all caused by the resolution mismatch.


I'm comparing next-gen DX11 vs next-gen DX12; none of my comparisons are from the OG version. DX12 is still a tiny bit sharper, but I don't see how that costs a 40% CPU-bound performance penalty. If you think the slight difference in sharpness justifies the 40% CPU-bound performance hit, then I'm baffled. :D
 
If you think the slight difference in sharpness justifies the 40% CPU-bound performance hit, then I'm baffled. :D
The DX11 next-gen build uses lower LOD, shorter draw distance and a lower NPC count than the DX12 next-gen build, and it also has fewer shadow-casting lights. That's what Alex is telling you. The DX11 next-gen build uses the old config files from the old-gen version, so its visual makeup is lesser than the DX12 version's.
 