CYBERPUNK 2077 [PC Specific Patches and Settings]

So after moving from a 4070ti to a 7900XTX, my preferred way of playing this is using @Dictator's optimised settings at a native 4472x1872, down-sampled to 3440x1440.

This nets me a locked 60fps with 20-40fps of headroom depending on the scene.
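As a rough sanity check on the cost, here's a back-of-the-envelope sketch in Python using the resolutions quoted above, assuming GPU load scales roughly with pixel count:

```python
# Rough arithmetic for the setup above: render at 4472x1872, output to a 3440x1440 panel.
native_w, native_h = 3440, 1440   # panel resolution
render_w, render_h = 4472, 1872   # internal resolution being down-sampled from

axis_scale = render_w / native_w                      # per-axis scale factor
pixel_ratio = (render_w * render_h) / (native_w * native_h)

print(f"per-axis scale: {axis_scale:.2f}x")           # ~1.30x
print(f"pixel cost:     {pixel_ratio:.2f}x native")   # ~1.69x, i.e. roughly 69% more pixels shaded
```

So the card is shading roughly 1.7x the panel's pixels and still holding a locked 60fps with room to spare.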

I have never seen this game look so clean. As much as I loved using Overdrive mode on my 4070ti, the image quality was, frankly, dog shit, and losing Overdrive mode but gaining this level of image quality is well worth it to me.

Using the LUT Pure 2.0 lighting mod helps mitigate losing Overdrive mode, and I'm going to be using some texture packs too, plus a few other tweaks.

But yea, down-sampling is the way forward.

A quick comparison of native vs down-sampled

But it doesn't do it justice, as motion is where you really see the difference: there's no aliasing, flickering or fizzing... it looks amazing.
 
@davis.anthony It's good to have options available on pc. Being able to brute force your way into image quality or performance is a nice option to have. In terms of the way forward, though, it's a dead end because it'll never be a good fit on console. Developers will have to find ways to save performance as hardware gains slow. I'd argue that 60 fps is a dead end in itself, because it's an image quality problem in its own right. There are huge gains in image quality by going to 120 fps; 60 fps isn't even good enough on CRTs because of flicker. I think we're likely to see more 120 fps performance modes on consoles next gen, and the only way to get there and still allow for other rendering improvements is with smarter upscaling.
 

Until 120fps gets rid of aliasing, flickering and fizzing, 60fps with higher IQ will always have a place.

Running at higher frame rates is not an option and is simply trying to brute force the problem and failing.
 

You are right. Running at extremely high frame rates is a brute force approach. So is super sampling. They mean to address different image quality problems and they do it just by throwing hardware at it. Sample and hold displays are an image quality problem. Display flicker from strobing or CRT decay is an image quality problem. Panning at 24, 30 and 60 Hz is an image quality problem. Rasterization has a set of image quality problems. Ray tracing trades off those problems for a set of new ones. There are just endless problems when you start getting into color spaces, HDR etc. The way forward is never going to be throwing bigger hardware at it, but it is a very nice option in the pc space for people who have the money to take that approach.
 

But down-sampling is easier to achieve than 120fps+

If you're able to run at 120fps at your native resolution then you already have headroom for down-sampling at a 60fps target.

Titanfall 2 had the best idea, dynamic resolution that could go above native resolution if there was headroom available.
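That Titanfall 2 idea is simple to sketch: a dynamic resolution controller normally drops below native to protect frame time, but nothing stops the same loop from going above 1.0x when there's headroom, assuming GPU cost scales roughly with pixel count (which is also why holding 120fps at native implies roughly a 1.41x per-axis supersample at a 60fps target). Here's a toy Python version; the budget, clamp range and smoothing factor are made-up numbers, not anything from Respawn's actual implementation:

```python
# Toy dynamic resolution controller that is allowed to exceed native (scale > 1.0)
# when the GPU is comfortably under budget. Purely illustrative: a real engine would
# work from measured GPU time with filtering and hysteresis.

TARGET_MS = 1000.0 / 60.0            # 60 fps frame-time budget
MIN_SCALE, MAX_SCALE = 0.7, 1.3      # assumed clamp range (1.3x ~ the figure above)

def update_resolution_scale(scale: float, gpu_ms: float) -> float:
    """Nudge the per-axis resolution scale toward the frame-time budget."""
    # GPU cost grows roughly with pixel count, i.e. with scale squared,
    # so estimate the scale that would exactly hit the budget...
    ideal = scale * (TARGET_MS / gpu_ms) ** 0.5
    # ...then move only part of the way there to avoid oscillating.
    new_scale = scale + 0.25 * (ideal - scale)
    return max(MIN_SCALE, min(MAX_SCALE, new_scale))

# Example: a scene with headroom (12 ms at native) settles above 1.0x,
# i.e. the spare performance gets spent on supersampling.
scale = 1.0
for _ in range(20):
    scale = update_resolution_scale(scale, gpu_ms=12.0 * scale * scale)
print(f"settled scale: {scale:.2f}")   # ~1.18x per axis
```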
 
Until 120fps gets rid of aliasing, flickering and fizzing, 60fps with higher IQ will always have a place.

Running at higher frame rates is not an option and is simply trying to brute force the problem and failing.
Well, dunno about 120fps because I rarely play at that framerate; I play more like at 165fps when I can. But 60fps isn't enough, and in fact I'm currently playing games at 30fps with either frame generation or BFI from the TV (plus some other TV settings to smooth the image), 'cos I don't find 30 to 60 fps to be that big of a jump.

That being said, higher framerates like 165Hz help a lot with AA, and in that sense the best monitor I've ever had was a 240Hz Samsung display that I had to return 'cos of dead pixels. I still miss that monitor tbh.

I remember a game where I applied the lowest settings I could and disabled AA and so on to reach 240fps; the jaggies would cut your fingers, they were so pointy in still frames. But once you started moving, at 240fps the jaggies were totally gone. I didn't expect that at all.
 

I've recently replaced a 240Hz OLED and now use a 265Hz OLED, and at maximum refresh they both still have jaggies.

Once you're used to seeing stupidly clean image quality, going back to the issues I mentioned above is very, very difficult.
 
Do you mean jaggies when running the games at an actual 240 or 265Hz, or jaggies in general? 'Cos in order to get rid of jaggies, the game must be running at those framerates. Gotta say that the jaggies didn't disappear on the monitor I mentioned earlier, 'cos it was a driving game and the jaggies were noticeable when stopping the car, but once you started moving, 240fps was wonderful at hiding them.
 
don't find 30 to 60 fps to be that big of a jump.
I'm surprised to hear this, especially from someone who plays at higher framerates. The difference between 30fps and 60fps is highly noticeable to my eyes. It's hard for me to go back to 30fps even from 40fps. I wonder if, just like colors, some people have biologically higher or lower sensitivity to framerates due to how their eyes work.
 
Do you mean jaggies when running the games at an actual 240 or 265Hz, or jaggies in general? 'Cos in order to get rid of jaggies, the game must be running at those framerates. Gotta say that the jaggies didn't disappear on the monitor I mentioned earlier, 'cos it was a driving game and the jaggies were noticeable when stopping the car, but once you started moving, 240fps was wonderful at hiding them.

The games actually ran at those frame rates.
 
I used to really care about how pristine the image quality looks, but then I discovered how little it mattered when it exacerbated the flaws in the graphics: the pop-in and pop-out, the low-fps animation, the shadowless parts of the image, how flat the lighting looks, how fake and incomplete the reflections are, etc.

I can't see myself ever playing Cyberpunk without Path Tracing considering how much more complete it looks in most respects, and I'm not bothered by the minimal noise in the image, considering I'm used to watching movies and video clips with as much noise as Cyberpunk or worse.
 
One fortunate thing about being horribly nearsighted (-14 diopter!) is I lose the ability to resolve fine details at distance, even when wearing my contacts or glasses. As such, on a high enough resolution monitor, typical edge-of-geometry jaggies disappear for me. However, I'm still very sensitive to light and motion, and aliasing of over-bright contrast edges (e.g. an aliased light source or a reflection of one) is still very much visible to me. My single largest complaint with nearly every modern game is the apparent inability to properly anti-alias light-emitting surfaces and high-contrast reflections. A common example would be a high-gloss metallic grate on the floor, reflecting a bright room light, which "blinks" as you move.

Raytracing seems to get this more right than wrong, especially in CP2077. Part of it might be DLSS, part of it might just be the ray/path tracing, dunno exactly but it works really well. I can't imagine trading pathtracing for a higher framerate, just because I'd rather play at 40-50 FPS and not have the jarring, jagged edges.

Oh, and now that I'm spoiled by ray tracing, screen-space reflection and lighting effects are just world-breaking to me. At this point I'd rather have baked lighting than SSR / SSAO.
 
Flickering must go first. Sharpness is not so important for me.

I also don't like ghosting but flickering bothers me much more. Maybe it's also an evolutionary thing that people see flicker so clearly. The teeth of a predator in the bushes. Who knows.
 
I think I could adapt back to low frame rates if I stuck with it, but the adaptation period is so unpleasant. I have several games I play close to 240Hz, and I think 120Hz is still very good. If I go lower than 90Hz I'd say the experience is very bad. I tried playing Alan Wake 2 around 70Hz and it looked like a slideshow and was very unpleasant. But different people adapt to different things. So Cyberpunk with ray tracing at 60Hz for one person, Cyberpunk with supersampling for another, and Cyberpunk with max frame rate for another.

The way forward will be whatever addresses the largest pool of customers, which is consoles and low-end cards. Maybe ray tracing will be viable on PS6 but not at 4K native. Maybe 120 fps will be viable but not at 4K native. Would 60 fps even be viable at 4K native and still leave room for other next-generation visual improvements? Probably not.

So I think a game like Cyberpunk will continue to target 30/60Hz upscaled to 4K when the next generation comes around, and brute-force options will continue to be pay-to-play on PC.

Actually, if image stability were my primary concern, it would be interesting to play a game like Cyberpunk with a high-end GPU on a 1080p screen, super-sampled and still getting 120Hz minimum. Too bad there aren't 24" 1080p OLEDs. I would actually consider that purely as a gaming display, since 1440p usually requires upscaling anyway to hit either high frame rates or high graphics settings.
 
Flickering must go first. Sharpness is not so important for me.

I also don't like ghosting but flickering bothers me much more. Maybe it's also an evolutionary thing that people see flicker so clearly. The teeth of a predator in the bushes. Who knows.

Down-sampling like I am, with a 60fps target, there is no ghosting or flickering at all.

I think you all should just try it.

Use DF's recommended settings and down-sample until you get to 60fps.

Alan Wake 2 looks amazing with it too.
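For anyone wanting to try it, the procedure really is just: create a custom super-resolution (VSR on AMD, DSR/DLDSR on NVIDIA, or an in-game resolution scale slider where available), apply the optimised settings, and step the scale up until 60fps no longer holds. A throwaway Python helper for listing candidate resolutions on a 3440x1440 panel; the scale steps here are arbitrary round numbers I picked, not official presets:

```python
# List candidate down-sampling resolutions for a panel, with their relative
# GPU pixel cost, so you can step up until a locked 60fps no longer holds.
# The scale steps are arbitrary round numbers, not DSR/VSR presets.

def candidates(panel_w: int, panel_h: int, scales=(1.1, 1.2, 1.3, 1.41, 1.5)):
    for s in scales:
        w, h = round(panel_w * s), round(panel_h * s)
        cost = (w * h) / (panel_w * panel_h)   # shaded pixels relative to native
        yield s, w, h, cost

for s, w, h, cost in candidates(3440, 1440):
    print(f"{s:>4}x per axis -> {w}x{h}  (~{cost:.2f}x pixel cost)")
```

The 1.3x step gives the 4472x1872 figure from the opening post, and 1.41x is roughly where a 2x GPU surplus (e.g. 120fps at native with a 60fps target) gets used up.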
 
I'm surprised to hear this, especially from someone who plays at higher framerates. The difference between 30fps and 60fps is highly noticeable to my eyes. It's hard for me to go back to 30fps even from 40fps.
Sorry if I didn't explain myself well. There is a very noticeable difference, of course. I meant that I use BFI from the TV or frame generation and play certain games at an internal 30fps, i.e. on my TV, which is 4K and where 4K 120Hz is not attainable. So for me, going from 30fps + BFI (works OK, but rotating the camera gives the 30fps away) or 30fps + FG (rotating the camera gives some 30fps vibe, but not as much as with BFI) to native 60fps doesn't make much of a difference, but from 82fps to 164fps with frame generation, the difference is very noticeable for me.

I wonder if, just like colors, some people have biologically higher or lower sensitivity to framerates due to how their eyes work.
My 6 y.o. nephew seems to be one of those people. He and my 4 y.o. nephew play a lot of Rocket League on my computer. I had set the game to 60fps to play on my TV 'cos of the 4K resolution, but I use my 165Hz monitor when they play, and the other day they were playing Rocket League at 60fps.

It did occur to me that setting the game to run at 165fps would be nice. When I did so, my 6 y.o. nephew told me, without even knowing what I had changed nor even knowing what framerate is:

"Hey, something changed in the screen. Something changed in the screen, the game plays different. (....seconds later...) My car seems to be faster".:
 
Down-sampling like I am, with a 60fps target, there is no ghosting or flickering at all.
I think it very much depends on the game.

The most recent incarnation of Prey (which is nearing 10 years old, IIRC) had a lot of specular lighting on reflective glossy metal grates, as I described earlier, and it was absolutely horrid for creating a sparkling / twinkling effect. This was present even when combined with either NVIDIA's native DSR supersampling (2x, 3x, 4x) or their "DL intelligent" DSR supersampling (1.78x, 2.25x). I wrote a post somewhere here on B3D about my experience, and no matter the scaling factor, the specular highlights in the reflections still caused visible flickering and shimmering artifacts on high-gloss, well-lit, thin-geometry objects.

For whatever reason, there are still a number of graphical anomalies supersampling simply cannot fix, even when it seems like it should.
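A crude way to see why: when the bright feature is thin compared to even the supersampled grid, the number of samples that land on it still changes from frame to frame as it moves, and because the feature is far brighter than its surroundings, each gained or lost sample produces a big jump after tonemapping. Here's a toy 1D illustration in Python; the highlight width, intensity, motion step and Reinhard-style tonemap are all made-up values, just to show the mechanism:

```python
# Toy 1D illustration: a very bright, very thin specular highlight drifting
# across a single pixel. Even at 4x or 16x supersampling, the number of
# samples that hit it changes between frames, so the pixel value jumps.

HIGHLIGHT_WIDTH = 0.1   # width of the highlight as a fraction of a pixel
INTENSITY = 50.0        # HDR intensity, far above 1.0 before tonemapping

def tonemap(x: float) -> float:
    return x / (1.0 + x)   # simple Reinhard-style curve

def pixel_value(highlight_pos: float, spp: int) -> float:
    # spp evenly spaced sample positions across a unit-wide pixel
    hits = sum(
        1 for i in range(spp)
        if abs((i + 0.5) / spp - highlight_pos) < HIGHLIGHT_WIDTH / 2
    )
    return tonemap(INTENSITY * hits / spp)

for spp in (1, 4, 16):
    frames = [round(pixel_value(0.37 + f * 0.07, spp), 2) for f in range(6)]
    print(f"{spp:>2} spp: {frames}")
# More samples damp the jumps but don't remove them: the highlight only ever
# covers a fraction of the samples, and it's so bright that gaining or losing
# a single hit still swings the tonemapped value visibly -> sparkle/flicker.
```

The practical fixes tend to attack the energy rather than the sampling (specular AA / roughness clamping, pre-filtered reflections, temporal accumulation), which may be part of why the RT plus DLSS path handles these cases better.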

I do enjoy framegen more than I thought I would. Setting a framerate cap of 50Hz and then letting FG fill in the blanks allows me a ton of visual fidelity while also allowing the 4090 to have plenty of processing power left for folding in the background. I can literally achieve 80+ FPS benchmark scores in CP2077 with all my favorite ultra settings, with DLSS quality and framegen, while still having enough left to fold at ~10MPPD in the background. It's stupid how much power is in this card...
 