The Witcher 3: Wild Hunt! [XO, PS4, NX, PS5, XBSX|S, PC]

A truly excellent video on the new The Witcher 3 update by NX Gamer. He goes into detail, even using the .ini file as a reference to compare settings between machines, and also covers the wrapper effect (DX11 to DX12).


Edit: the tip to use the Xbox Game Bar app on PC to show performance instead of MSI Afterburner is something I didn't know.
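On the settings-comparison angle, here's a rough sketch of how you can diff two exported settings/.ini files yourself. This is just a generic key/value diff, not NX Gamer's tooling, and the file names are placeholders; whitespace is kept as-is to keep the sketch short:

```cpp
#include <fstream>
#include <iostream>
#include <map>
#include <string>

// Minimal "key=value" .ini reader; good enough to diff two exported settings
// files (e.g. a PC preset vs. values transcribed from a console capture).
std::map<std::string, std::string> LoadIni(const std::string& path) {
    std::map<std::string, std::string> values;
    std::ifstream file(path);
    std::string line, section;
    while (std::getline(file, line)) {
        if (line.empty() || line[0] == ';') continue;           // skip blanks/comments
        if (line.front() == '[') { section = line; continue; }  // remember the section
        auto eq = line.find('=');
        if (eq == std::string::npos) continue;
        values[section + "/" + line.substr(0, eq)] = line.substr(eq + 1);
    }
    return values;
}

int main(int argc, char** argv) {
    if (argc != 3) { std::cerr << "usage: inidiff a.ini b.ini\n"; return 1; }
    auto a = LoadIni(argv[1]), b = LoadIni(argv[2]);
    // Print every key whose value differs (or is missing) in the second file.
    for (const auto& [key, value] : a) {
        auto it = b.find(key);
        if (it == b.end() || it->second != value)
            std::cout << key << ": " << value << " -> "
                      << (it == b.end() ? "<missing>" : it->second) << "\n";
    }
}
```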
 

Yes, that was a great video. Lots of good detail and comparison points in there. I was left quite confused about the performance comparisons he mentioned though, specifically in relation to him saying the consoles are around 33% faster than his 2070. Was that measured in game? With or without RT? Or just a generic performance differential not related to this game? The comparison was never actually shown, so it's difficult to judge. The 2070 should certainly be faster than the consoles with RT enabled in this game. In addition, his console-matched settings (which the performance comparison might have been based on) aren't actually console matched: earlier in this very video he notes that RTXGI and RTAO run at higher settings than on the consoles, something Alex has also confirmed.

But aside from that confusion, yes it was a great video.
 
Videos are improving, but some bias is still in there. An overclocked RTX 2070 shouldn't be far off console performance in raw raster, let alone RT.
It's important to have closely matched settings on all levels when you make comparisons between platforms.

Agreed. The chances of either console being 30% faster than the 2070 in this game with RT enabled are very low. The fact that his listed console-matched settings aren't even console matched, based on his own commentary, doesn't bode well for an apples-to-apples comparison.
 
The chances of the consoles being 30% faster than his 2070 OC are low too. In raw raster they are not far from each other. Consoles more often than not have heavily tweaked settings to get the best performance out of their GPUs, which in dGPU land are low-end, entry-level parts. Some settings sit lower than PC's low preset, and DRS also has its merits, though it seems to work differently on PC.
For those with 5700 XT, 2070, 6600, or 3060 class hardware, it would be welcome to see developers implement these console settings. Not for comparisons, but for users with such hardware.
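Since DRS came up: the core idea is simple enough to sketch. This is a minimal, hypothetical controller (the names and constants are made up, and it's nothing like CDPR's actual implementation) that nudges the render scale toward a GPU frame-time budget:

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical dynamic-resolution controller: scales the render resolution so
// the measured GPU frame time converges on a target (e.g. 16.6 ms for 60 fps).
class DynamicResolution {
public:
    DynamicResolution(float targetMs, float minScale, float maxScale)
        : targetMs_(targetMs), minScale_(minScale), maxScale_(maxScale) {}

    // Call once per frame with the previous frame's GPU time.
    void Update(float gpuFrameMs) {
        // Proportional step: over budget -> drop resolution, under budget -> raise it.
        float error = (targetMs_ - gpuFrameMs) / targetMs_;
        scale_ = std::clamp(scale_ + 0.25f * error, minScale_, maxScale_);
    }

    // Scaled resolution along one axis; pixel count scales with the square of this.
    int ScaledWidth(int outputWidth) const {
        return static_cast<int>(outputWidth * scale_);
    }

private:
    float targetMs_;
    float minScale_;
    float maxScale_;
    float scale_ = 1.0f;
};

int main() {
    DynamicResolution drs(16.6f, 0.5f, 1.0f);  // 60 fps target, never below 50% scale
    float fakeGpuTimes[] = {20.0f, 19.0f, 17.5f, 16.8f, 15.9f, 14.0f};  // pretend measurements
    for (float ms : fakeGpuTimes) {
        drs.Update(ms);
        std::printf("gpu %.1f ms -> render width %d\n", ms, drs.ScaledWidth(3840));
    }
}
```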
 
Are the consoles better or worse on average than a 1080? I'm curious how they stack up against Nvidia's flagship of a few years ago.
 
I'd bet on the consoles any day lol. While quite capable in its day, it's ancient by now (released before the PS4 Pro, even). In raw power it seems close, but Pascal doesn't match up to RDNA2, especially in ray tracing of course. It should be a close match to the RX 5700 non-XT and RTX 2060/Super in raster.
The thing was released just two to three years after the PS4/One S; in that view it's quite a capable GPU for being so old.
 
Certainly better. The 1080 was around 2060-2070 raster performance levels, with no RT or other DX12U features.

The 1080 Ti is a more interesting comparison, but only in games without RT.
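For anyone curious what "other DX12U features" means in practice, this is roughly how a game asks D3D12 which of them the GPU/driver exposes (real CheckFeatureSupport queries, with the surrounding setup simplified); on Pascal these all come back as not supported:

```cpp
#include <cstdio>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available");
        return 1;
    }

    // Variable rate shading tier (DX12 Ultimate wants tier 2)
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &opts6, sizeof(opts6));

    // Mesh shaders and sampler feedback
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7));

    std::printf("VRS tier:              %d\n", opts6.VariableShadingRateTier);
    std::printf("Mesh shader tier:      %d\n", opts7.MeshShaderTier);
    std::printf("Sampler feedback tier: %d\n", opts7.SamplerFeedbackTier);
}
```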
 
In raw raster performance the 1080 Ti should be in the ballpark of the premium consoles; it's right around the 2070/2070 Super. However, it's an old architecture, and ray tracing is a real drain on that thing.
 
So, what exactly happens when a GPU like that runs a game with RT features? Does performance tank compared to a GPU that is "RT capable", or does the game not even run? And I'm assuming software RT like Lumen would work fine on that kind of architecture?

Again, I apologize for being technically incompetent 🙏
 
Going to tank:

Quite nice for what it is. RDNA2/Turing and up have RT hardware that speeds things up considerably. On a 1080 Ti there's no acceleration at all, so it falls back to the 'compute' path, or 'software' RT as it's sometimes called.
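To make the "does it even run" part concrete: on PC a game typically asks D3D12 whether the driver exposes DXR at all and greys out the RT toggles if it doesn't. A rough sketch with real API calls but simplified setup; note that NVIDIA's driver does expose DXR on Pascal cards like the 1080 Ti, just as a compute fallback with no RT cores behind it, which is exactly why it tanks:

```cpp
#include <cstdio>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

// Returns true if the device/driver exposes DXR at all. Whether it is fast
// (dedicated RT hardware, Turing/RDNA2 and up) or slow (compute fallback,
// e.g. Pascal) is not visible from this query alone.
bool DxrAvailable(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device");
        return 1;
    }
    // A game would use a check like this to grey out the RT options in its settings menu.
    std::printf("Ray tracing options %s\n",
                DxrAvailable(device.Get()) ? "available" : "greyed out");
}
```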
 
Sorry for all the weird questions btw 😅 I am constantly floored by advancements in technology. I got interested in the inner workings of things prior to the 8th generation,

so I remember the absolute hype train surrounding the original Titan graphics card and thinking to myself, "no game would ever need that kind of outrageous graphics power for a thousand dollars", especially compared to the 7th gen machines, which were very long in the tooth, and the upcoming consoles.

And now, almost 10 years later, graphics cards really have hit near-diminishing returns in normal graphics rendering 😂
 
Never be sorry for questions and healthy technical discussion lol. Also, I still think raster is improving quite a lot, although it has slowed down compared to the past two or three decades.
 
Going to tank:

Quite nice for what it is. RDNA2/Turing and up have RT hardware that speeds things up considerably. On a 1080 Ti there's no acceleration at all, so it falls back to the 'compute' path, or 'software' RT as it's sometimes called.

And that's just in games that allow you to render RT via software. Most simply block the option in the settings menu altogether.

I tried RT in LEGO Builder's Journey yesterday, along with all other settings maxed at my monitor's native res (3840x1600). Managed to bring my 1070 down to 1 fps lol.
 
Well, enabling Reflex does wonders for me: a lot fewer stutters even in low-framerate situations, and a better CPU spread. I still don't know why, but I can reproduce it. Without Reflex I never see more than about 6 of my cores going full speed, and my CPU power draw is around 80-90 W. With Reflex I sometimes see 8-12 cores (so all of them, I have a 10920X) going full speed, even in CPU-limited situations, and the power draw is around 120-130 W. Weird, but happy...

EDIT: Also, it seems I'm never at 99% GPU load, or only rarely, which could be logical given how Reflex works, by avoiding being GPU bound. But how it affects the CPU is a mystery to me...
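The GPU-load part at least has a plausible mechanical explanation: Reflex is a driver/SDK feature, so the sketch below is not its implementation, but the gist is limiting how far the CPU may queue frames ahead of the GPU. A self-contained toy model with made-up frame times shows the trade-off you're seeing: the GPU dips below 100% busy in exchange for frames reaching the screen sooner after the CPU starts them.

```cpp
#include <chrono>
#include <cstdio>
#include <semaphore>
#include <thread>
#include <vector>

using Clock = std::chrono::steady_clock;

// Toy model of a render pipeline (not the Reflex SDK): the CPU prepares frames and
// submits them to a "GPU" thread. maxFramesInFlight controls how far the CPU may
// run ahead. A deep queue keeps the GPU near 100% busy but each frame waits in the
// queue before it is rendered (latency); capping the queue makes the CPU wait
// instead, so the GPU idles a little and frames come out fresher.
void Simulate(int maxFramesInFlight, int frames) {
    std::counting_semaphore<16> slots(maxFramesInFlight);  // free queue slots
    std::counting_semaphore<16> work(0);                    // frames waiting for the GPU
    std::vector<Clock::time_point> started(frames), finished(frames);

    std::thread gpu([&] {
        for (int i = 0; i < frames; ++i) {
            work.acquire();                                               // wait for a submission
            std::this_thread::sleep_for(std::chrono::milliseconds(16));  // pretend GPU work
            finished[i] = Clock::now();
            slots.release();                                              // slot freed for the CPU
        }
    });

    for (int i = 0; i < frames; ++i) {
        slots.acquire();                                                  // blocks when the queue is full
        started[i] = Clock::now();
        std::this_thread::sleep_for(std::chrono::milliseconds(4));       // pretend CPU work
        work.release();                                                   // submit to the GPU
    }
    gpu.join();

    double latencyMs = 0.0;
    for (int i = 0; i < frames; ++i)
        latencyMs += std::chrono::duration<double, std::milli>(finished[i] - started[i]).count();
    std::printf("%d frame(s) in flight: avg CPU-start to GPU-finish latency %.1f ms\n",
                maxFramesInFlight, latencyMs / frames);
}

int main() {
    Simulate(3, 30);  // CPU queues ahead: GPU stays pegged, latency grows
    Simulate(1, 30);  // CPU is held back: GPU dips below 100%, latency drops
}
```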
 