Neither does the PS5 pack 6800XT/3080-level rasterization capability. Even raw rasterization power matters a lot in this game, judging by the benches with RT/DLSS off. In Valhalla (an AMD-optimized game), the PS5 seems close to, or a bit faster than, a 5700XT, which gives you an idea of its pure rasterization performance.
I think there is hope for the PS5 version. CP2077 is very pretty to begin with and there's a really good mix of ray tracing settings. Maybe they can come up with good optimizations and a hardware-specific compromise. But yeah, it's not going to match a high-end PC, especially as PCs keep getting that much faster too.
The PS5 (and XSX) should be able to put in reasonable showings without RT. The RX 5700XT can hit almost 40fps at 1440p and Ultra settings without RT in the benchmark below so we should expect even better than that from the consoles.
Naturally though they will incorporate some level of RT and that's where I expect things to become more problematic.
Here's a 2080Ti running at RT Ultra, 1080p native, at 41fps average. Obviously there's a lot more RT capability there than in the new consoles, but with some minor cutbacks they might be able to achieve close to the PC experience at 1080p and 30fps. The question is whether they'll do that, or prioritise resolution over graphics like seemingly every other current release has.
Then I showed him the PC version. And he's now going to get a gaming PC for the first time in his life.
When I first played The Witcher 3 I recall at the time thinking it was one of the best looking games available. Then I modded it and it looked significantly better. Then I ran it in 3D Vision (RIP) and it still blows my mind today.
If he really thinks that game is top 5 on PS4 then he has not played God of War, The Last of Us 2, Spider-Man, Horizon, The Order, RDR2, Death Stranding, Ghost of Tsushima, or Final Fantasy, just to name a few.
Those games would push The Witcher 3 to a top 10 game, at best.
I think there's often an element with exclusives of "this game looks better than any multi-platform game because it's only available on my platform". That's not to say the above aren't great looking games. But after the hype that Horizon received, I must admit I wasn't as impressed as I expected to be when I got it on PC (running at Ultra settings at that). It's a lovely looking game no doubt, but personally I prefer the graphics of The Witcher 3, AC Origins, AC Odyssey and RDR2. Although granted, I have significant graphical mods on the first 3 which heavily skew the comparison.
The Witcher 3 has weird character proportions, movement, and animations. The colors especially look very cartoony.
That's where mods come in
I absolutely agree with you that the colours are oversaturated. Same with AC Origins and AC Odyssey. The first thing I did with all 3 games was to apply ReShade mods to give them more photorealistic palettes (amongst other things). The difference is night and day IMO.
If the 3090 is running at 4K and the PS5 can be locked to 1080p, then the PS5 might have even better graphics or frame rate.
See the 2080Ti benchmark above. The PS5 is unlikely to be able to achieve 30fps at 1080p at the PC's full Ultra level settings. I expect some settings compromises and a higher resolution target as seems to be the trend at the moment.
I forgot about DLSS. I understand it doesn't scale exactly like this, but if your 2070 S can run the game with everything at Ultra at around 4fps, then lowering the resolution by 4x could produce 16fps, probably higher, correct?
Not if the bottleneck at those higher settings is VRAM capacity. DLSS will reduce VRAM requirements due to the lower native resolution, and so could bring you back within your capacity and thus allow frame rate to scale more than linearly with resolution.
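To make that scaling point concrete, here's a bit of napkin math in Python (purely illustrative: it assumes the quoted 4fps figure is at native 4K and that the GPU is entirely pixel-bound, neither of which is a given):

```python
# Illustrative resolution-scaling estimate. Assumes the quoted 4fps figure is
# at native 4K and that the workload is purely pixel-bound; a VRAM capacity
# bottleneck breaks this simple model entirely.
pixels_4k = 3840 * 2160      # ~8.29 million pixels
pixels_1080p = 1920 * 1080   # ~2.07 million pixels

baseline_fps = 4.0                            # the 2070 S "everything at Ultra" figure above
pixel_ratio = pixels_4k / pixels_1080p        # = 4.0
naive_estimate = baseline_fps * pixel_ratio   # ~16fps if scaling were perfectly linear

print(f"{pixel_ratio:.0f}x fewer pixels -> naive estimate of {naive_estimate:.0f}fps")
```

If the 4fps was partly caused by spilling past VRAM capacity, then relieving that bottleneck at the lower resolution is exactly what lets the real result come in above the naive 16fps estimate.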
I expect the PS5 to be more powerful than your 2070S, no offense to your rig, just stating facts.
I think this is an incorrect expectation, at least in RT workloads, which CP2077 on consoles will be. Digital Foundry has already shown both the PS5 and XSX to be a little below a 2060S in RT workloads - granted this is just a single data point at present (Watch Dogs Legion) - but based on that, there are no grounds for expecting the PS5 to outperform a 2070S in RT workloads.
The 3090 might cost 4 times as much as an entire PS5, but its actual performance is not even twice that.
Here's the 3090 at 4K native with no RT achieving 46.7fps to the 5700XT's 18fps. Granted the PS5 will be a little faster than the 5700XT, but even if it's 25% faster, that still leaves the 3090 at comfortably double the non-RT, non-DLSS performance. And of course, when you include RT and DLSS - because you should, since they make up portions of the GPU's potential performance that you're paying for - the comparison favours the 3090 far more heavily. To the point, in fact, where it's likely to achieve 5-6x the performance of the PS5 running at a native 4K.
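Showing the working on that (napkin math only; the 25% PS5 uplift over the 5700XT is an assumption for illustration, not a measured figure):

```python
# Napkin math for the non-RT, non-DLSS comparison above.
rtx3090_4k_fps = 46.7        # 3090 @ 4K native, no RT (benchmark above)
rx5700xt_4k_fps = 18.0       # 5700XT @ 4K native, no RT (benchmark above)
assumed_ps5_uplift = 1.25    # assumed uplift over the 5700XT, not measured

ps5_estimate = rx5700xt_4k_fps * assumed_ps5_uplift   # ~22.5fps
advantage = rtx3090_4k_fps / ps5_estimate             # ~2.1x

print(f"Estimated PS5: {ps5_estimate:.1f}fps, 3090 advantage: {advantage:.1f}x")
```

Fold RT and DLSS into that and the gap widens considerably, which is where the 5-6x figure comes from.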
Without RT and DLSS, the PS5 should run the exact same settings at 1440p as the 3090 does at 4K.
So you're saying that the PS5 without RT and at a much lower resolution can match the 3090 with RT and at a much higher resolution (with DLSS)? I'm not sure what the point of such a comparison is; however, based on these benchmarks, the 3090 would still be faster:
5700XT @ 1440p Ultra (No RT) - 38.1fps
3090 @ 4K DLSS Performance with Ultra RT - 54.8fps
Below 1440p you should see higher frame rates. Again, that's not taking into account RT or DLSS. With RT enabled, the PS5 would need to drop to 1080p to produce the same quality of graphics but with higher FPS.
No, it wouldn't. Again, see my 2080Ti comparison above. The PS5 would be lucky to hit 30fps at PC Ultra RT settings and 1080p. The 3090 at the same settings at 4K DLSS is hitting almost 55fps.
DLSS might be the game changer though; I don't expect the consoles to have their alternative ready when the PS5/Series X upgrade drops.
Consoles already have alternatives: straight upscaling, checkerboard rendering, etc. However, if you're talking about something that is both as performant and as high quality as DLSS, then I think you probably need to check your expectations. AMD have recently said that the alternative they're working on isn't ML-based, which makes sense given their lack of Tensor cores. It'll be interesting to see what they come up with, but given they lack the hardware to accelerate ML code to anywhere near the levels of Turing or Ampere, I'll be both surprised and impressed if they manage to match both the performance and quality of DLSS. And that's on their desktop RDNA2 parts.
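Since checkerboard rendering came up, here's a toy sketch of the basic idea, nothing like a real console implementation (which adds motion-vector reprojection, ID buffers and so on): each frame shades only half the pixels in an alternating checkerboard pattern and fills the gaps from the previous reconstructed frame.

```python
# Toy checkerboard-rendering reconstruction. Purely illustrative; real
# implementations use motion vectors and ID buffers rather than reusing
# stale pixels directly.
import numpy as np

def checkerboard_mask(h, w, frame_idx):
    """Boolean mask of the pixels shaded this frame (pattern alternates each frame)."""
    yy, xx = np.indices((h, w))
    return (yy + xx) % 2 == frame_idx % 2

def reconstruct(shaded_frame, prev_reconstruction, mask):
    """Keep the newly shaded half, fill the other half from the previous frame."""
    out = prev_reconstruction.copy()
    out[mask] = shaded_frame[mask]
    return out

# Usage: pretend 'scene' is what a full-resolution render would have produced.
h, w = 4, 8
prev = np.zeros((h, w))
for frame_idx in range(2):
    scene = np.full((h, w), float(frame_idx + 1))   # stand-in for this frame's shading
    mask = checkerboard_mask(h, w, frame_idx)       # only ~half the pixels get shaded
    prev = reconstruct(scene, prev, mask)
print(prev)
```

The win is that only about half the pixels are shaded per frame; the cost is artifacts wherever the previous frame's pixels no longer match, which is why DLSS's ML-based reconstruction generally holds up better in motion.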