CYBERPUNK 2077 [PC Specific Patches and Settings]

I hope these adventures in RT on the old engine are transferable to their UE5 development; otherwise I don't see the point. They've already far surpassed what Lumen is capable of.
RTXDI has already been available for months in UE 5.1 through Nvidia's NvRTX branch. Then again, CDPR's UE5-based games are probably still a long way from release, so who knows; maybe Lumen or other methods develop into something more appealing than what Nvidia has to offer.
 
I noticed a very interesting thing in Cyberpunk's path tracing: upscaling widens NVIDIA's lead even further than it already is at native resolution.

So, for example, here at native 4K the 3090 Ti is 50% faster than the 7900 XTX, but the 3090 Ti with DLSS Q runs 2x faster than the 7900 XTX with FSR Q. This repeats across all resolutions and happens with the 40 series as well: the 4090 is 2.8x faster than the 7900 XTX at native 4K, then with DLSS Q it becomes 4x faster than the 7900 XTX with FSR Q.

I am curious if the difference is coming from DLSS or something else; I hope someone gets to test NVIDIA GPUs with FSR to see whether it makes a difference.
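
To put rough numbers on it, here's the back-of-the-envelope arithmetic (a sketch: only the speedup ratios come from the benchmarks above, the function and FPS framing are just illustration):

```python
# Arithmetic on the ratios quoted above; no new measurements here.
# If GPU A is r_native times faster than GPU B at native resolution and
# r_upscaled times faster with upscaling, then upscaling gained A
# r_upscaled / r_native times more relative speedup than it gained B.

def extra_upscaling_gain(r_native: float, r_upscaled: float) -> float:
    """How much more upscaling sped up GPU A than GPU B (as a ratio)."""
    return r_upscaled / r_native

# 3090 Ti vs 7900 XTX: 1.5x at native 4K -> 2x with DLSS Q vs FSR Q
print(extra_upscaling_gain(1.5, 2.0))  # ~1.33, i.e. ~33% extra gain
# 4090 vs 7900 XTX: 2.8x at native 4K -> 4x with DLSS Q vs FSR Q
print(extra_upscaling_gain(2.8, 4.0))  # ~1.43, i.e. ~43% extra gain
```

Whatever the cause, that extra 33-43% of relative speedup is the gap that needs explaining.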

 
I noticed a very interesting thing in Cyberpunk's path tracing: upscaling widens NVIDIA's lead even further than it already is at native resolution.

Perhaps native 4K shifts some of the bottlenecks away from ray calculation and more towards areas like bandwidth, where AMD won't fare as poorly.
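
A toy model of that idea (every cost number below is invented; it only shows how a shared, resolution-sensitive bandwidth cost could dilute a ray-speed advantage at native 4K):

```python
# Toy model of the bottleneck-shift hypothesis above. All costs are
# made up for illustration; only the qualitative shape matters.

def bw_ms_per_mpix(mpix: float) -> float:
    """Bandwidth cost per megapixel, assumed to get worse once the
    working set spills out of on-chip cache (threshold is invented)."""
    return 1.5 if mpix > 6.0 else 0.5

def frame_ms(mpix: float, ray_ms_per_mpix: float) -> float:
    """Frame time = ray work + bandwidth work, both scaling with the
    internal (pre-upscale) pixel count."""
    return (ray_ms_per_mpix + bw_ms_per_mpix(mpix)) * mpix

native = 3840 * 2160 / 1e6   # ~8.3 MP internal at native 4K
dlss_q = 2560 * 1440 / 1e6   # ~3.7 MP internal with quality upscaling

for mpix, label in [(native, "native 4K"), (dlss_q, "upscaled")]:
    fast = frame_ms(mpix, ray_ms_per_mpix=2.0)  # strong RT hardware
    slow = frame_ms(mpix, ray_ms_per_mpix=6.0)  # weaker RT hardware
    print(f"{label}: lead = {slow / fast:.2f}x")
```

In this sketch the lead widens from ~2.14x at native to 2.60x upscaled purely because the cache-unfriendly bandwidth cost, which both hypothetical GPUs pay equally, shrinks at the lower internal resolution.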
 
I am curious if the difference is coming from DLSS or something else; I hope someone gets to test NVIDIA GPUs with FSR to see whether it makes a difference.


Those tests were already posted:

Post in thread 'CYBERPUNK 2077 [PC Specific Patches and Settings]' https://forum.beyond3d.com/threads/...cific-patches-and-settings.62263/post-2296686

No difference between FSR and DLSS.
 
Physically based materials show up much better under physically based lighting. But ray tracing also helped in many places.
 
RTXDI has already been available for months in UE 5.1 through Nvidia's NvRTX branch. Then again, CDPR's UE5-based games are probably still a long way from release, so who knows; maybe Lumen or other methods develop into something more appealing than what Nvidia has to offer.

Yeah, UE5 will probably evolve significantly over its lifetime. Epic seems to be happy with shadow maps and rasterized direct lighting for now, though.
 
Seems it's a double-edged sword, making the assets look faker than ever.

I agree in general, but the assets in those screenshots don't look too bad. You can tell the guy has a real eye for photography. Open-world games usually don't have the highest-fidelity assets, given the sheer volume that needs to be shipped and rendered. Cyberpunk is pretty detailed compared to other open-world games set in a city (GTA, Watch Dogs, Spider-Man).
 
I'm with the minority here that thinks Cyberpunk's assets are generally rather weak. I typically find the materials very unconvincing, but those screens, thanks to the framing/editing/whatever, leave a much better impression than is typical of the game.
 
Seems it's a double-edged sword, making the assets look faker than ever.

I wonder how much of it is the assets having been made with an eye to the old renderer, trying to make the best of it. Perhaps just by being made with a better lighting model from the start, a game's assets might end up more consistent.
 
Seems it's a double-edged sword, making the assets look faker than ever.

OK, glad I'm not the only one seeing that. It makes the visuals a mixed bag, and I'm not sure the graphics end up better overall in my eyes: the contrast between the dated assets and the new lighting clashes significantly to me. With the standard lighting, the lighting is obviously worse, but it doesn't clash as much with the dated assets.

Regards,
SB
 
I wonder how much of it is the assets having been made with an eye to the old renderer, trying to make the best of it. Perhaps just by being made with a better lighting model from the start, a game's assets might end up more consistent.

I'm not sure it would have mattered much. Just as the hardware in most machines wasn't capable of fully path-traced lighting at the time, given the rendering technology available on the software side, the same was true for world assets (texture and geometry quality and density) given the hardware and their engine's rendering technology.

It'll be interesting to see how they improve in all areas with UE5. If they'd continued with their in-house engine, I'm not sure world geometry would have ended up significantly better, and it needs to be significantly better if RT is going to really look good to me. For the lighting to look believable, the geometry also needs to look believable, and vice versa; if one or the other doesn't, it all kind of falls apart for me.

Regards,
SB
 
Tried path tracing last night on my overclocked Asus 3080 Ti that can hold ~1840MHz reliably with all the fans at full tilt. It's also a 5950X at 5GHz, 64GB of B-die CL14 DDR4 at 3800 1:1, and a pair of Sammy 980 Pro NVMe drives.

Holy hell does it hammer framerate.

Normally I run all RT features enabled and the Psycho preset, with all the other adjustable features cranked to the hilt. Raster is 2560x1440 with DLSS Quality and vsync enabled on my 60Hz Dell 2711. I do manually disable some minor stuff like chromatic aberration, motion blur, and lens flare because I'm not a fan of those features... This gives me a benchmark result of 59.xx FPS with a minimum framerate recorded in the high 40's.

Then I turned on the path tracing option, left everything else the way it was, and re-ran the benchmark. Ouch! Came out with something in the low 30's FPS range... Flipped the DLSS setting to Performance, retried the benchmark, and got into the low 40's FPS range with minimums still in the teens. Actual playability was pretty poor; the minimum framerates really do rear their ugly head pretty often.

I'll be staying away from the path tracing option until I decide to spend another grand on a different video card.
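
For reference, here's what that DLSS Quality -> Performance flip does to the internal render resolution at 2560x1440 output, using DLSS's published per-axis scale factors (the script itself is just arithmetic):

```python
# Internal render resolution per DLSS mode for a 2560x1440 output.
# Standard DLSS per-axis scale factors: Quality 66.7%, Balanced 58%,
# Performance 50%, Ultra Performance 33.3%.
modes = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

out_w, out_h = 2560, 1440
for mode, scale in modes.items():
    w, h = round(out_w * scale), round(out_h * scale)
    frac = w * h / (out_w * out_h)
    print(f"{mode:>17}: {w}x{h} ({frac:.0%} of output pixels)")
# Quality -> Performance drops 1707x960 to 1280x720, i.e. from ~44% to
# 25% of output pixels, which is why it claws back so much frame time.
```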
 
Next to DF's showcase, I found this video to be one of the best so far.


Seems it's a double-edged sword, making the assets look faker than ever.
I see it differently. Among Control, multiple Unreal Engine titles, and some other games, Cyberpunk 2077 has some of the best material shading for me. On the other hand, I didn't like the material shading in Doom Eternal.

In addition, you can see here very well how much better and more realistic the wall looks with path tracing. Start at 2:21.

Tried path tracing last night on my overclocked Asus 3080 Ti ... Flipped the DLSS setting to Performance, retried the benchmark, and got into the low 40's FPS range with minimums still in the teens.

There is something wrong. A colleague with a 3080 Ti was able to optimise the game to 30 fps. I think he set the resolution to 1660p with DLSS Balanced. In any case, his minimum FPS in the benchmark area was around 29 fps, not in the teens.
 
Oh, I'm quite sure I could "optimize" it to 30fps by turning some features down; I simply wanted an apples-to-apples comparison of how I normally run it today, but with path tracing enabled.

I bet I could get it to 60fps with path tracing enabled, along with some "optimizing." I'm not really sure I'd be happy with the result...
 