The Witcher 3 : Wild Hunt ! [XO, PS4, NX, PS5, XBSX|S, PC]

Even if it's not perfect, I have to say the swamps look nice with RT & all...

[attached images: swamps1.jpg, swamps2.jpg, swamps3.jpg]
p.s. please @BRiT or any other moderator, delete my post in the DF thread sharing the same news 'cos it makes no sense to have it there for now.
Just moved it into this thread before seeing this post. :)
 
I have to disagree with the "...were immune to this once the shaders were loaded...". I have a 3090, and it's still stuttering after hours in the same region/location. I doubt these are shader-compilation stutters...
 

Out of curiosity, does it do that with RT off?

Regards,
SB
 
[attached image: i6joyw.JPG]



It's not very realistic for me to play this game at 4K on my 4K TV at RT Ultra, maybe with FSR 2 Performance, even though that's the whole point of this update, imho.

So after watching a video of a guy running the game on an RTX 3060, I decided to lock the game at 30 fps (everything Ultra+, RT Ultra) and I am happy with the results. It is what it is, but you get the best of ray tracing and a "classic" 🙄 framerate to boot. :)

 
The game seems to be either CPU or GPU limited, more likely the former. Whatever I do, I get 32 to 33 fps with everything at Ultra+ and RT Ultra, whether the resolution is 4K FSR2 Performance (185-190 W GPU power draw, the max of my GPU at default settings), 1440p FSR2 Performance (158 W) or 1080p FSR2 Performance (145 W). So I locked it at 30 fps for now, hoping for XeSS support and maybe some performance improvements in the future. Knowing this, I play it on the TV rather than on the 1440p monitor; being a larger display, the TV is much more immersive.
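For what it's worth, the reasoning above, i.e. near-identical fps no matter how far the render resolution drops points to a CPU limit, can be sketched as a quick check (the tolerance threshold and helper are my own invention, and the numbers are my rough measurements):

```python
# Rough heuristic: if fps barely moves as render resolution (GPU load)
# drops, the CPU is the bottleneck; if fps scales with resolution, it's
# the GPU. The 5% tolerance is an arbitrary illustrative choice.
def likely_cpu_bound(fps_by_resolution, tolerance=0.05):
    """fps_by_resolution: dict mapping resolution label -> measured fps."""
    values = list(fps_by_resolution.values())
    spread = (max(values) - min(values)) / max(values)
    return spread <= tolerance  # near-flat fps across resolutions

# My (approximate) measurements at Ultra+ / RT Ultra:
measurements = {"4K FSR2 Perf": 32, "1440p FSR2 Perf": 33, "1080p FSR2 Perf": 33}
print(likely_cpu_bound(measurements))  # True: fps is flat, so CPU limited
```

A GPU-bound game would instead show fps climbing steadily as the resolution drops.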
 
Now we know why the Witcher 3 runs bad in RT mode.


Granted it's only been a few locations over the course of an hour, but it runs fine without that DLL. That's assuming deleting the local copy actually alters the functionality, though; this has been a part of Windows since Windows 10, so maybe I'm just redirecting the game to the system DLL (which would explain why RivaTuner still reports it as D3D12 even with the local one deleted). I also don't see any large performance gain with that DLL deleted; for me it was primarily just to get RivaTuner working. Also, that doc link is old and its performance has since been improved, albeit "relatively performant" doesn't mean much.
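For anyone wondering why deleting the game-local copy might just redirect to the system one: Windows resolves DLLs by searching the application's own directory before the system directories, so a game-shipped copy shadows the OS one, and removing it falls back to System32. A toy sketch of that lookup order (the function and paths are mine, not anything the game actually does):

```python
from pathlib import Path

# Simplified model of the default Windows DLL search order: the
# executable's folder is searched before System32, so a game-local
# copy of a system DLL overrides the OS-supplied one, and deleting
# it silently falls back to the system copy.
def resolve_dll(name, game_dir, system_dir):
    for directory in (game_dir, system_dir):  # app dir searched first
        candidate = Path(directory) / name
        if candidate.is_file():
            return candidate
    return None
```

So even with the local file gone, the game keeps loading a copy of the same DLL, just the (possibly newer) system one.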


Outside of RT, though, it's generally... OK. There does look to be something akin to shader stutter, but then again the original can have that too, and some of these stutters are repeatable, so it's not just shader compilation. Nothing on the order of a UE4 DX12 game, at least.

I've tried the DX11 .exe, which is an option, but it's brutal: it completely maxes out my 12400F even without RT and removes the vast majority of graphical options (no DLSS/FSR), while DX12 uses half the CPU and is much smoother. I also tried the original release, and the increased detail and DLSS options in the newer version give it the edge overall. I can get a relatively solid 60 with DLSS Auto and a mix of High/Ultra at 4K on my 3060, which looks better than an intermediate resolution like 1800p, which I need on the original, with its weak AA, to maintain 60 in the same areas.

The only consistent performance issue is microstutter when panning the camera while sprinting on the horse. It's not a raw performance problem (it's a solid 60), but there are frame-pacing issues with this particular action that an FPS lock or external vsync can't solve.
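That's the thing with frame pacing: a "solid 60 fps" average can still judder if individual frames are delivered unevenly. A rough way to quantify it (frame times invented for illustration):

```python
# Two frame-time traces (ms) with the same 16.67 ms average (60 fps),
# but the second alternates short/long frames: classic bad pacing.
def pacing_jitter(frame_times_ms):
    """Mean absolute difference between consecutive frame times (ms)."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas)

smooth = [16.67] * 8
stutter = [12.0, 21.34] * 4            # same average, uneven delivery
print(pacing_jitter(smooth))   # 0.0
print(pacing_jitter(stutter))  # 9.34 ms of jitter despite "60 fps"
```

An fps counter only sees the average, which is why RivaTuner can report a flat 60 while the horse panning still looks choppy.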

So without RT I'd say this is likely still worth it, kinda, but otherwise wait for some patches - which may be a while considering the hill that has to be climbed.

I read somewhere that Ultra+ renders with no distance limit and everything at LOD 0.

If true, that's a fairly pointless combination of settings.

Completely unoptimized options so you can say "Ultra Premium ++" as a feature for a PC game? In THIS economy?!
 