CYBERPUNK 2077 [PC Specific Patches and Settings]

Yeah, I'm thinking they'll respond with exactly that, so I'm pricing a new 1000W. 80 Plus Platinum versions are almost $400 :oops:

Yeah, PSUs these days are crazy expensive.

I bought one of those new Corsair SHIFT PSUs with the side-exit cables a few months back and cried paying for it.
 
Latest patch brings XeSS performance improvements
April 22, 2023
Good news, gamers! Cyberpunk 2077 will now run even faster on Intel Arc GPUs thanks to XeSS. CDPR's developers implemented the tech and released a patch that boosts performance significantly. This comes in handy if you play with ray tracing features on. We measured up to a 71% uplift, hitting above that sweet 60 FPS marker with the Intel Arc A750 graphics card.
 
About PSUs, some are tricky with how the 12V load is split between different rails. That's why I choose PSUs with one big 12V rail; less trouble for my lazy self.
 
Tried it on the 6800XT machine today. The PT setting works pretty badly, with GPU power dropping to 200-220W, and it's certainly not CPU limited since that goes down too along with the FPS. In fact, power usage decreases with increasing effective resolution (FSR).

Psycho RT now seems to work better than I remember.
 
So an interesting thing about path tracing is that it decreases unit utilisation heavily overall in comparison to traditional rendering, because it is something GPUs are bad at.
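
To make that concrete, here's a toy CUDA kernel (my own sketch, nothing from the actual game) showing the classic megakernel problem: every thread traces one path, paths terminate at different bounce depths, and the warp keeps issuing with most of its lanes masked off while the longest path finishes.

#include <cstdio>
#include <cuda_runtime.h>

// Tiny per-thread xorshift RNG, good enough for the illustration.
__device__ unsigned xorshift(unsigned &s) {
    s ^= s << 13; s ^= s >> 17; s ^= s << 5; return s;
}

__global__ void tracePaths(float *radiance, int numPaths, int maxBounces) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numPaths) return;
    unsigned rng = i * 9781u + 1u;
    float throughput = 1.0f, accum = 0.0f;

    for (int bounce = 0; bounce < maxBounces; ++bounce) {
        // Stand-in for BVH traversal + shading: the expensive, incoherent part.
        accum += throughput * 0.1f;
        throughput *= 0.7f;

        // Russian roulette: each lane exits at a different bounce, so after a
        // few bounces most of the warp is masked off while a handful of long
        // paths keep the whole warp resident. That's the utilisation drop.
        if ((xorshift(rng) & 0xFFFFu) / 65535.0f > throughput) break;
    }
    radiance[i] = accum;
}

int main() {
    const int n = 1 << 20;
    float *d_radiance;
    cudaMalloc(&d_radiance, n * sizeof(float));
    tracePaths<<<(n + 255) / 256, 256>>>(d_radiance, n, 16);
    cudaDeviceSynchronize();
    cudaFree(d_radiance);
    puts("done");
    return 0;
}

Rasterization doesn't have this shape: fragments in a warp run the same shader to completion, so the lanes stay full.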
 
Recently I played Cyberpunk 2077 with path tracing on my 3080 Ti. The monitor is a 1080p one so the resolution is lower, but of course the performance is still not very good. I finally settled on DLSS Performance because it dips into the low 40s FPS too frequently with DLSS Quality or Balanced. With DLSS Performance I think it can be considered playable if you have a VRR monitor.
One interesting thing is that it has more noise and artifacts than the 4090 (at least from what I remember). It's not just resolution (I played on a 4K monitor with my 4090). It's also not just DLSS, as I tried it with DLSS disabled and those artifacts are still there (though a bit less prominent).
After playing it for a while, I think it's not bad, but the resulting artifacts probably make it not that worthwhile on a 3080 Ti. On a 4090 I think it's a no-brainer, as the performance is good enough and there are no obvious artifacts (at least on a 4K monitor). On a 3080 Ti I'd say playing with normal ray tracing, or even with ray tracing disabled, is probably going to be better overall. Of course, if DLSS 3 becomes available on the 3080 Ti someday it might be worthwhile.
 
From the article:

Regular raytracing also enjoys better hardware utilization, across both GPUs. That's because it gets higher occupancy in the first place, even though its cache hitrates and instruction mix are largely similar. With regular raytracing, hardware utilization on RDNA 2 goes from mediocre to a pretty good level.

Decided to try it out again today on the 6800XT for comparison. With PT the power usage remains in the same range (200-210W) from 1024x768(!) all the way up to 3440x1440. Meanwhile, power usage for normal RT Psycho scales normally with resolution.

Had a similar issue with Portal RTX.

 

AMD's RT profiler is so much better than Nvidia's. I wonder what Nvidia is hiding.

"However, one shader engine finished its work after 70 ms, leaving a quarter of the GPU idle for the next 91 ms." That's brutal, is AMD not doing tile based distribution across shader engines?

Between thread divergence and memory stalls there's a lot of hardware idling each frame :( It's the same problem as always, it's very hard to feed these beasts.
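
One mitigation engines use for exactly this is wavefront path tracing: break the megakernel into per-bounce launches and compact the surviving rays in between, so every launch starts with full warps. A minimal Thrust sketch (my assumptions, not CDPR's implementation):

#include <thrust/device_vector.h>
#include <thrust/copy.h>

// Hypothetical per-path state carried between bounce kernels.
struct PathState { unsigned pixel; float throughput; bool alive; };

struct IsAlive {
    __host__ __device__ bool operator()(const PathState &p) const { return p.alive; }
};

// Pack surviving paths densely so the next bounce kernel launches full warps
// instead of warps that are mostly dead lanes. Returns the live-path count.
size_t compactPaths(const thrust::device_vector<PathState> &in,
                    thrust::device_vector<PathState> &out) {
    auto end = thrust::copy_if(in.begin(), in.end(), out.begin(), IsAlive());
    return static_cast<size_t>(end - out.begin());
}

int main() {
    thrust::device_vector<PathState> paths(1 << 20), live(1 << 20);
    size_t alive = compactPaths(paths, live);   // feed 'live' to the next bounce
    (void)alive;
    return 0;
}

The compaction pass isn't free, of course, which is why it only pays off when the per-bounce work is expensive enough.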
 
Curious if Ada sees significantly higher utilization than Ampere due to SER.
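
For anyone unfamiliar, SER lets the shader hand the hardware a coherence hint so threads get reordered by hit type before shading. A crude software analogue (my illustration only, not NVIDIA's actual API) is a key-value sort of hit records by material ID, so each warp shades mostly one material:

#include <vector>
#include <thrust/device_vector.h>
#include <thrust/sort.h>

// Hypothetical hit record produced by the ray-cast pass.
struct HitRecord { float t; unsigned rayIndex; };

// Key-value sort: afterwards, rays that hit the same material sit next to
// each other, so the shading kernel sees warps full of (mostly) one branch.
void shadeCoherently(thrust::device_vector<unsigned> &materialIds,
                     thrust::device_vector<HitRecord> &hits) {
    thrust::sort_by_key(materialIds.begin(), materialIds.end(), hits.begin());
    // ... launch the shading kernel over the reordered hits ...
}

int main() {
    std::vector<unsigned> h_mats = {2, 0, 1, 0};   // dummy material IDs
    thrust::device_vector<unsigned> mats(h_mats.begin(), h_mats.end());
    thrust::device_vector<HitRecord> hits(h_mats.size());
    shadeCoherently(mats, hits);
    return 0;
}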
 
But this path tracing is highly optimized for GPUs. My GPU power in Portal RTX and Cyberpunk Overdrive is nearly at 100%. I think that path tracing is much better suited to current GPUs than traditional rasterized rendering with a bloated software stack trying to achieve similar image quality to path tracing (or ray tracing).
 

Path tracing is certainly a more elegant workload and bypasses a ton of fixed function hardware but current GPUs are nowhere near optimized for it. Lots of divergence and memory requests in path tracing that lead to bubbles in execution.
 
But it doesn't seem to hold at least Lovelace back. My 4090 can go over 500W in Portal RTX and Cyberpunk Overdrive. On the other hand, in Fortnite with UE 5.1 my 4090 can't even go over 400W, and performance is barely better than with path tracing.
 
In Cyberpunk Overdrive, the PT and Compute stages take the vast majority of the ALU runtime. The pixel/vertex shading share is relatively small, so even with sub-optimal SIMD occupancy the GPU will be sucking a lot of power.
 
It's still pretty incredible given the age of the engine. Of course it was tweaked and so on, but around CP2077's release some devs talked about the hell it was to make it do the stuff needed for CP2077, coming from W3 (and the W3 engine had already been tweaked for the Blood and Wine expansion).
 