Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

And to think there was a time when RTXGI was basically free and still massively improved the visual quality like it's doing now in Witcher 3....


I really wonder what the frick Nvidia did to RTXGI that it is so expensive now. First Warhammer, now this. It feels like Nvidia is trying to intentionally destroy performance on any GPU that is not FG capable.
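For anyone wondering where the cost actually comes from: probe-based GI cost is basically probes × rays-per-probe, so it scales multiplicatively with density and quality settings. A quick back-of-envelope sketch — every number here is a made-up assumption, not an actual RTXGI or Witcher 3 config:

```python
# Rough cost model for DDGI-style probe GI. All grid sizes and ray counts
# below are illustrative guesses, not real RTXGI or Witcher 3 settings.

def probe_gi_rays_per_frame(grid, rays_per_probe):
    """Rays cast per frame just to update the irradiance probes."""
    num_probes = grid[0] * grid[1] * grid[2]
    return num_probes * rays_per_probe

# A sparse "cheap" config vs. a denser, higher-quality one.
cheap = probe_gi_rays_per_frame(grid=(16, 8, 16), rays_per_probe=144)
heavy = probe_gi_rays_per_frame(grid=(32, 16, 32), rays_per_probe=288)

print(f"cheap config: {cheap:,} rays/frame")   # 294,912
print(f"heavy config: {heavy:,} rays/frame")   # 4,718,592
print(f"cost ratio:   {heavy / cheap:.0f}x")   # 16x
```

Point being, nothing sinister is required: bump probe density and rays-per-probe a notch each and a "basically free" effect becomes a multi-millisecond one.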

40 fps @ 1440p gaming on a 3080 with DLSS and a 5800x cpu. Seems totally reasonable.
What? That's worse than Cyberpunk with all RT effects. You gotta be kidding me. It's rendering internally at 860p.
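For reference, the internal resolution each standard DLSS preset would give at 1440p output is just arithmetic (the scale factors are Nvidia's published ones):

```python
# Internal render height per standard DLSS 2.x preset at 1440p output.
presets = {
    "Quality":           0.667,
    "Balanced":          0.580,
    "Performance":       0.500,
    "Ultra Performance": 0.333,
}

output_h = 1440
for name, scale in presets.items():
    print(f"{name:>17}: {round(output_h * scale)}p internal")
# Quality: 960p, Balanced: 835p, Performance: 720p, Ultra Performance: 480p
```

860p doesn't line up with any fixed preset, so presumably that's dynamic resolution scaling or a custom ratio landing between Balanced and Quality.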
 
^ yeah, Steam forums are filled with complaints. Reddit as well, various websites, Twitter. It's an extremely poorly performing game, and it stutters on top of that, even when the framerate is high. I would not call the game playable on any system, really
 
Witcher 3 looks like a predictable combo of:
1- trying to stitch new tech onto an old game you haven't worked on in a while
2- vastly underpaying and over-crunching workers, producing decades' worth of talent loss, including multiple well-known graphics devs
3- ray tracing against big open fields with tons of trees
 
Wasn't an external studio working on this patch before they cancelled that and decided to do it in-house? How the performance mode isn't running a locked 60 on consoles is a mystery to me. And PC performance is some kind of a joke.
 
What? That's worse than Cyberpunk with all RT effects. You gotta be kidding me. It's rendering internally at 860p.
Seems like a CPU issue. The performance doesn't change from 1080p to 1440p with RT + DLSS/FSR Quality mode, and increasing the resolution from 1080p to 4K with DLSS only decreases the frame rate by 5 fps, from 78 down to 73. On my rig the 4090 is often underutilized while the CPU sits at only 15-25% usage (low overall usage can still hide a main-thread bottleneck).
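You can put a number on that: if the game were GPU-bound, fps should fall roughly in proportion to pixel count. A crude sketch using my measurements above (it ignores that not all GPU work scales with pixels, so treat it as a ballpark):

```python
# Crude CPU-vs-GPU-bound check: compare measured fps against the scaling
# you'd expect if the GPU were the bottleneck.

def gpu_bound_prediction(base_fps, base_pixels, new_pixels):
    """If fully GPU-bound, fps falls roughly with pixel count."""
    return base_fps * (base_pixels / new_pixels)

base_fps = 78                        # measured at 1080p (DLSS on)
predicted = gpu_bound_prediction(base_fps, 1920 * 1080, 3840 * 2160)
measured = 73                        # measured at 4K (DLSS on)

print(f"GPU-bound prediction at 4K: ~{predicted:.0f} fps")  # ~20 fps
print(f"actually measured:           {measured} fps")
# fps barely moved across a 4x pixel jump -> almost entirely CPU-limited.
```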
 
The Witcher 3 thread is prolly the best place for all this kvetching.

But, while I'm here, maybe they should have just ported the game to UE5.1 with Nanite and Lumen, as a test for their next project...
 
In what way is it more dynamic than HFW?

I'm not comparing it to anything, because I see no point in that.

I'm offering up ideas for others to think about and investigate as to why The Witcher 3 may be having stutters, as the usual suspects don't seem to be responsible. 🤷‍♂️
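If anyone wants to investigate rather than speculate, frame-time captures make stutter easy to quantify. A rough sketch, assuming a PresentMon-style CSV with an msBetweenPresents column (the column name is an assumption — adjust it to whatever your capture tool actually writes):

```python
import csv

# Flag stutter spikes in a frame-time capture: a frame counts as a spike
# if it takes much longer than the median of its neighbours.
def find_stutters(path, window=60, spike_ratio=2.5):
    with open(path, newline="") as f:
        times = [float(r["msBetweenPresents"]) for r in csv.DictReader(f)]
    spikes = []
    for i, ft in enumerate(times):
        local = sorted(times[max(0, i - window):i + window])
        median = local[len(local) // 2]
        if ft > spike_ratio * median:
            spikes.append((i, ft, median))
    return spikes

for frame, ft, median in find_stutters("capture.csv"):
    print(f"frame {frame}: {ft:.1f} ms (local median {median:.1f} ms)")
```

If the spikes cluster around first encounters with new areas or effects, that points at shader compilation or streaming; if they're periodic regardless of content, it's something else.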
 
Witcher 3 looks like a predictable combo of:
1- trying to stitch new tech onto an old game you haven't worked on in a while
2- vastly underpaying and over-crunching workers, producing decades' worth of talent loss, including multiple well-known graphics devs
3- ray tracing against big open fields with tons of trees
The game ain't even that pretty.
 
Witcher 3 may be an old game and engine, but it has a lot of transparencies, which are much tougher on the RT hardware than CP2077's cityscape.

And looking at some of the comparison videos, you can see pop-in in the RT reflections way out into the distance, so it has a large render distance, which will also tank performance.
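The transparency point is worth unpacking, because it's a multiplier, not an add-on: opaque geometry lets the hardware accept the closest hit directly (roughly one shade per ray), while alpha-tested foliage forces an any-hit evaluation for every candidate intersection along the ray. Toy numbers, purely illustrative:

```python
# Toy model of why alpha-tested transparency hurts RT. The hit counts are
# made up for illustration, not profiled from either game.

def shader_invocations(rays, hits_per_ray, alpha_tested):
    # Opaque: hardware accepts the closest hit, ~1 invocation per ray.
    # Alpha-tested: every candidate hit must run any-hit logic to decide
    # whether it's a hole in the texture or actual geometry.
    return rays * (hits_per_ray if alpha_tested else 1)

rays = 1_000_000
city    = shader_invocations(rays, hits_per_ray=1, alpha_tested=False)
foliage = shader_invocations(rays, hits_per_ray=8, alpha_tested=True)

print(f"opaque cityscape: {city:,} invocations")     # 1,000,000
print(f"dense foliage:    {foliage:,} invocations")  # 8,000,000
```

So a forest scene can plausibly cost several times more per ray than Night City's mostly opaque buildings, before you even account for the longer render distance.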
 
Looking worse or not (personal opinions aside), what resolution do HFW's ray tracing solutions render at? The higher the quality, the larger the performance impact.
"Resolution" probably needs to be defined as 'rays per frame' for RT, as it's not pixel based.
 