CYBERPUNK 2077 [XO, XBSX|S, PC, PS4, PS5]

Hmm, at first I thought maybe my CPU was not up to par, but you have a 3900X and see the same drops. I'm thinking this just needs more patches. If I remember correctly, The Witcher 3 got patches every month for like 5 months when it launched.

I dropped RT lighting to medium and can't really tell the difference in IQ. We'll see if that helps any. Just completed the VR dive at Lizzie's. Wow, great start to the story so far. The main characters are super nice looking.
 
I put three to four hours in on PS5 yesterday, starting as a Nomad, which throws you into a massive open landscape from the outset. Apart from some weird middle-distance LOD issues which existed only in the desert (so perhaps a buggy heat haze effect), I've only encountered two bugs.

The first was in the desert: a guy sitting in an invisible car, only the car wasn't just invisible, it wasn't there at all, because I managed to clip him with my car, which the game interpreted as a hit and run and so initiated a police report. The second was a bit of UI that failed to fade away. Fortunately, the game autosaves a lot and I was able to reload from a point a couple of minutes earlier and the UI reverted back to normal.

I'm still learning the game and the mechanics, the majority of which will be familiar to anybody who has played video games before. I expected the UI and menus to be Mass Effect-level bad but they're not. They're not great either, and there is literally no explanation for a lot of stuff, but if you can't work out the UI and menus, you're going to struggle with an RPG, period.

Because of the technical state of the game I went in with super-low expectations and I've been pleasantly surprised. :yes: I've only done a few early missions, including the one demo'd at Gamescom 2018 where you rescue a woman from an apartment. Driving is fine (not great), and gunplay is pretty good as far as I'm concerned. Putting aside the core graphics, the visual presentation of the game, such as the UI with its various readouts from your cyberware implants and the way you interact with the world, feels super good and fluid.
How is the framerate?
 
How is the framerate?
On PS5? Pretty close to 60 in most places.

I'm really liking the game but I am so bad at roleplaying my actual character. I decided to go for a high Intelligence, high Cool build, brains over brawn etc., keeping a low profile and not drawing attention to myself and so on, but the moment I see a nearby crime I immediately become Robocop. :yep2:

It is the same or worse on PS5; I find it funny. For sure Tom Warren's comment is valid, but the game will probably look much better in 2021.
There is something wrong with the Series X picture here - it's way too bright. I just went through this same sequence on PS5.
 
Doubt it. In the first mission you drive a car by yourself in the city, and the drops are certainly there, and they're not minor ones.

The driving hits all platforms, but I guess it depends how fast you're driving. I'm not racing around, I'm taking in the city.
 
if the 3090 is running at 4K and the PS5 can be locked to 1080P then the PS5 might have even better graphics or frame rate.

The game is running at 4K/30 on a 3090 when maxed without DLSS, and a 6900XT won't be far from that. These are about three times more powerful than the PS5 GPU. Don't forget the much more powerful CPUs those 3090/6900XT test systems have.

Looking at other benches, the PS5 GPU seems a bit above a 5700XT/OC (DF's Valhalla bench vs PC GPUs etc.).

Even without RT, this game is taxing. It all depends on whether they are going to include RT, and how much of it. I strongly doubt they will run it at 1080p; that's just insane to even imagine. Maybe 1440p medium, with slight RT if any.
 
Neither does the PS5 pack 6800XT/3080-level rasterization capability. Even rasterization power matters a lot in this game, judging by the benches with RT/DLSS off. In Valhalla, the PS5 seems close to or a bit faster than a 5700XT (an AMD-optimized game). In pure rasterization, one gets an idea.


I think there is hope for the PS5 version. CP2077 is very pretty to begin with and there is a really good mix of ray tracing settings. Maybe they can come up with good optimizations and a hardware-specific compromise. But yeah, it's not going to match a high-end PC unless PCs get that much faster too.

The PS5 (and XSX) should be able to put in reasonable showings without RT. The RX 5700XT can hit almost 40fps at 1440p and Ultra settings without RT in the benchmark below so we should expect even better than that from the consoles.

Naturally, though, they will incorporate some level of RT and that's where I expect things to become more problematic. Here's a 2080Ti running at RT Ultra, 1080p native, at a 41fps average. Obviously there's a lot more RT capability there than in the new consoles, but with some minor cutbacks they might be able to achieve close to the PC experience at 1080p and 30fps. The question is, will they do that, or will they prioritise resolution over graphics like every other current release seemingly has?

[attached benchmark chart]


Then I showed him the PC version. And he's now going to get a gaming PC for the first time in his life.

When I first played The Witcher 3 I recall thinking at the time that it was one of the best looking games available. Then I modded it and it looked significantly better. Then I ran it in 3D Vision (RIP) and it still blows my mind today.

If he really thinks that game is top 5 on PS4 then he has not played God of War, The Last of Us 2, Spider-Man, Horizon, The Order, RDR2, Death Stranding, Ghost of Tsushima, Final Fantasy, just to name a few.

Those games would make Witcher 3 a top 10 game, at best.

I think there's often an element with exclusives of "this game looks better than any multi-platform game because it's only available on my platform". That's not to say the above aren't great looking games. But after the hype that Horizon received, I must admit I wasn't as impressed as I expected to be when I got it on PC (running at Ultra settings at that). It's a lovely looking game, no doubt, but personally I prefer the graphics of The Witcher 3, AC Origins, AC Odyssey and RDR2. Although, granted, I have significant graphical mods on the first three which heavily skew the comparison.

The Witcher 3 has weird character proportions, movement, as well as animations. The colors especially look very cartoony.

That's where mods come in :) I absolutely agree with you that the colours are oversaturated. Same with AC Origins and AC Odyssey. The first thing I did with all three games was apply ReShade mods to give more photorealistic palettes (amongst other things). The difference is night and day IMO.

if the 3090 is running at 4K and the PS5 can be locked to 1080P then the PS5 might have even better graphics or frame rate.

See the 2080Ti benchmark above. The PS5 is unlikely to be able to achieve 30fps at 1080p at the PC's full Ultra level settings. I expect some settings compromises and a higher resolution target as seems to be the trend at the moment.

I forgot about DLSS. I understand it does not scale exactly like this but, if your 2070S can run the game with everything at Ultra at around 4fps, then lowering the resolution by 4 times could produce 16fps, probably higher, correct?

Not if the bottleneck at those higher settings is VRAM capacity. DLSS will reduce VRAM requirements due to the lower native resolution, and so could bring you back within your capacity and thus allow frame rate to scale more than linearly with resolution.
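Purely as a back-of-the-envelope illustration of that resolution-scaling intuition (not how the game actually behaves), here's the naive arithmetic in Python. It assumes the GPU is entirely pixel-bound, which is exactly the assumption that breaks down once VRAM becomes the bottleneck, and the 4fps figure is just the hypothetical number from the post above:

Code:
# Naive estimate: frame time scales linearly with pixel count.
# Ignores VRAM limits, CPU limits and fixed per-frame costs.

def estimated_fps(base_fps: float, base_pixels: int, target_pixels: int) -> float:
    return base_fps * (base_pixels / target_pixels)

uhd = 3840 * 2160   # native 4K
fhd = 1920 * 1080   # 1080p, one quarter of the 4K pixel count

print(estimated_fps(4.0, uhd, fhd))  # -> 16.0, the "4x resolution drop, 4x fps" guess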

I expect the PS5 to be more powerful than your 2070S, no offense to your rig just stating facts.

I think this is an incorrect expectation, at least in RT workloads, which CP2077 on consoles will be. Digital Foundry has already shown both the PS5 and XSX to be a little below a 2060S in RT workloads. Granted, this is just a single data point at present (Watch Dogs Legion), but based on that, there are no grounds for expecting the PS5 to outperform a 2070S in RT workloads.

the 3090 might cost 4 times as much as an entire PS5, its actual performance is not even twice that.

Here's the 3090 at 4K native with no RT achieving 46.7fps to the 5700XT's 18fps. Granted, the PS5 will be a little faster than the 5700XT, but even if it's 25% faster that still leaves the 3090 at comfortably double the non-RT, non-DLSS performance. And of course, when you include RT and DLSS (because you should, since they make up portions of the GPU's potential performance that you're paying for), the comparison favours the 3090 far more heavily; to the point, in fact, where it's likely able to achieve 5-6x the performance of the PS5 when running at native 4K.
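For what it's worth, the "comfortably double" part is just this arithmetic, using the benchmark figures quoted above; the 25% PS5 uplift over the 5700XT is my assumption, not a measured number:

Code:
rtx3090_4k_fps  = 46.7          # 3090, 4K native, no RT (figure quoted above)
rx5700xt_4k_fps = 18.0          # 5700XT, same scenario (figure quoted above)
assumed_ps5_uplift = 1.25       # assumption: PS5 ~25% faster than a 5700XT

ps5_estimate = rx5700xt_4k_fps * assumed_ps5_uplift   # 22.5 fps
ratio = rtx3090_4k_fps / ps5_estimate                 # ~2.08x

print(f"Estimated PS5: {ps5_estimate:.1f}fps, 3090 advantage: {ratio:.2f}x")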

Without RT and DLSS, PS5 should run the exact same settings at 1440p as 3090 4K.

So you're saying that the PS5 without RT and at a much lower resolution can match the 3090 with RT and at a much higher resolution (with DLSS)? I'm not sure what the point of such a comparison is; however, based on these benchmarks, the 3090 would still be faster:

5700XT @1440p Ultra (No RT) - 38.1fps

3090 @ 4K DLSS Performance with Ultra RT - 54.8fps

Below 1440p you should see higher frame rates. Again, not taking into account RT or DLSS. With RT enabled the PS5 would need to drop to 1080p to produce the same quality of graphics but with higher FPS.

No, it wouldn't. Again, see my 2080Ti comparison above. The PS5 would be lucky to hit 30fps at PC Ultra RT settings and 1080p. The 3090 at the same settings at 4K with DLSS is hitting almost 55fps.

DLSS might be the game changer though; I don't expect the consoles to have their alternative ready when the PS5/Series X upgrade drops.

Consoles already have alternatives. Straight upscaling, checkerboard rendering etc... However if you're talking about something that is both as performant, and as high quality as DLSS then I think you probably need to check your expectations. AMD have recently said that the alternative they're working on isn't ML based, which makes sense given their lack of Tensor cores. It'll be interesting to see what they come up with, but given they lack the hardware to accelerate ML code to anywhere near the levels of Turing or Ampere I'll be both surprised and impressed if they manage to match both the performance and quality of DLSS. And that's on their desktop RDNA2 parts.
 
1080p and 30fps

The PS5 is unlikely to be able to achieve 30fps at 1080p at the PC's full Ultra level settings. I expect some settings compromises and a higher resolution target as seems to be the trend at the moment.

Again, I doubt 1080p/30 alone will do any good marketing-wise. Also, 1080p/30fps and still no ray tracing, when RT is what made this game look excellent to begin with. Nighttime with the smoke/fog and RT GI is what creates that Blade Runner feel.
This game is easily the best showcase ever for ray tracing (maxed). Nothing even comes close.

based on that, there are no grounds for expecting the PS5 to outperform a 2070S in RT workloads.

According to tests, the PS5 is close to a 5700XT, somewhat faster. In the game we're talking about (CP2077), the 5700XT is below a 2070S in rasterization performance; it's a match for a 2060S in this game. But seeing as the PS5 is somewhat faster, we are looking at roughly 2070 (non-S) raw rasterization performance, and that's right about where it was placed a while ago based on specs.

It's also unlikely they will go for 40fps (with possible dips). In any case, they will most likely go for a stable 30.

Edit: this game is a CPU-taxing monster. At low resolutions, chasing higher FPS, the CPU might get a real workout. Even 3700Xs do, so at 3.5GHz the PS5 CPU will be hammered?

https://videocardz.com/newz/cyberpu...orce-rtx-30-and-radeon-rx-6000-graphics-cards
 
The lower-level, fast PS5 API should help on the CPU side.

Obviously, but it won't make it faster than a 3700X, which clocks way above 4GHz when needed (besides having a larger cache?). Anyway, I compared it to the 3700X, which gets a real workout in this title. DF has put the PS5 CPU below a 3700X before, but let's assume due to magic it will match a 3700X; it's still a sweat in this game.
It's going to be really interesting what compromises CDPR will make. If they forego ray tracing they can win a lot: they could go for high/ultra and maintain 1080p/30 without RT, but I doubt it. Maybe 1440p medium and try to reach 60?
 
So you're saying that the PS5 without RT and at a much lower resolution can match the 3090 with RT and at a much higher resolution (with DLSS)? I'm not sure what the point of such a comparison is; however, based on these benchmarks, the 3090 would still be faster:

5700XT @1440p Ultra (No RT) - 38.1fps

3090 @ 4K DLSS Performance with Ultra RT - 54.8fps

No, what I said, and what you quoted, is:
“Without RT and DLSS, PS5 should run the exact same settings at 1440p as 3090 4K.”

The 3090 has very weak rasterization to the point of Nvidia blackmailing reviewers that focus on it.

In any case, with no RT or DLSS, the PS5 will outperform the 3090 when the PS5 is at 1440p native compared to the 3090 at 4K native.

DLSS produces quite a lot of shimmering and artifacts in this title, but that is beside the point.
 
Turns out the game only uses around 6 threads max. My Ryzen 2600 only has a max CPU usage of 50%.

So it probably runs better with hyperthreading disabled. Hmm, gonna disable hyperthreading.
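If anyone wants to sanity-check the "only ~6 threads busy" observation on their own machine, a quick and dirty way is to sample per-core load with Python's psutil in a second window while the game is running (the 50% threshold and the 10 samples are arbitrary choices on my part, not anything the game reports):

Code:
# pip install psutil
import psutil

# Take 10 one-second samples of per-logical-core utilisation.
for _ in range(10):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    busy = sum(1 for load in per_core if load > 50.0)
    print(f"logical cores above 50% load: {busy}  |  {per_core}")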
 
Obviously, but it won't make it faster than a 3700X, which clocks way above 4GHz when needed (besides having a larger cache?). Anyway, I compared it to the 3700X, which gets a real workout in this title. DF has put the PS5 CPU below a 3700X before, but let's assume due to magic it will match a 3700X; it's still a sweat in this game.
It's going to be really interesting what compromises CDPR will make. If they forego ray tracing they can win a lot: they could go for high/ultra and maintain 1080p/30 without RT, but I doubt it. Maybe 1440p medium and try to reach 60?
For one of the modes they should target high settings without RT, 30fps and a dynamic resolution above 1440p; 1188p is very blurry, so I wouldn't play a low-resolution RT mode (btw, one of DF's videos suggests the PS5 GPU is at RX 5700, not XT, performance because of the lack of bandwidth, so they also make mistakes ;))
 
No, what I said, and what you quoted, is:
“Without RT and DLSS, PS5 should run the exact same settings at 1440p as 3090 4K.”

The 3090 has very weak rasterization to the point of Nvidia blackmailing reviewers that focus on it.

In any case, with no RT or DLSS, the PS5 will outperform the 3090 when the PS5 is at 1440p native compared to the 3090 at 4K native.

DLSS produces quite a lot of shimmering and artifacts in this title, but that is beside the point.

Yeah..... Nvidia is not blackmailing reviewers lol.
 
The biggest scam in the industry atm.
No, not necessarily. It serves another audience. CP2077 may just be another Crysis scenario: a lot of advanced graphics techniques implemented using methods that are probably inefficient compared to the type of technologies that will be available going forward. I don't expect CP2077's performance to be considered the norm going forward. If it is, the PS5 and XSX won't even make it past 2022.
 
Sorry, but that has to be a mistake, especially the thing about the PC version, as there is little to no difference between Ultra and the 2013 non-Pro PS4.


If he really thinks that game is top 5 on PS4 then he has not played God of War, The Last of Us 2, Spider-Man, Horizon, The Order, RDR2, Death Stranding, Ghost of Tsushima, Final Fantasy, just to name a few.
Those games would make Witcher 3 a top 10 game, at best.
I am not trying to diss Witcher 3, just stating technical facts.
The Witcher 3 has weird character proportions, movement, as well as animations. The colors especially look very cartoony. It’s like comparing Forza Horizon 4 versus Forza 7, with Witcher 3 being Forza 7. Nobody is going to claim Forza 7 has better lighting or graphics.
I never got impressed with The Witcher 3's visuals either. A lot of work clearly went into it, but sacrifices were obviously made to enable the huge world, and it was one of the big projects for that generation.
 