CYBERPUNK 2077 [XO, XBSX|S, PC, PS4, PS5]

That's a terribly flawed conclusion to draw based simply on an overall 50% CPU usage on your hardware.

I just disabled multi-threading. Got a 5 fps boost, and GPU usage is now hovering at 98% instead of 90%, with CPU usage hovering at 80% instead of 50%.

EDIT:
BTW, what other conclusions could be drawn from an underused CPU and GPU with multi-threading enabled, when disabling multi-threading results in a perf boost and higher CPU + GPU usage?
 
I just disabled multi-threading. Got a 5 fps boost, and GPU usage is now hovering at 98% instead of 90%, with CPU usage hovering at 80% instead of 50%.
Your point being what? Your initial conclusion is incorrect.
 
Your point being what? Your initial conclusion is incorrect.

What other conclusions could be drawn from an underused CPU and GPU with multi-threading enabled, when disabling multi-threading results in a perf boost and higher CPU + GPU usage?

The game didn't spread the load evenly across multiple threads? Thus higher per-thread performance gives a boost compared to having more available threads?

IIRC MSFS 2020 also has this design issue (the infamous "limited by main thread" on the debug perf overlay).
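
A rough sketch of what that looks like, with per-thread utilisation numbers made up purely for illustration: a pegged main thread can hide behind ~50% overall CPU usage when SMT doubles the thread count, and show up as ~80% once SMT is off.

```python
# Illustrative sketch with made-up per-thread numbers: overall CPU usage can sit
# around 50% even when a single "main" thread is completely saturated, which is
# what a main-thread-bound game looks like in a task manager.

def overall_usage(per_thread_usage):
    """Average utilisation across all hardware threads, as a percentage."""
    return sum(per_thread_usage) / len(per_thread_usage)

# Hypothetical 12-thread CPU (6 cores with SMT): main thread pegged at 100%,
# a few workers moderately busy, the rest mostly idle.
smt_on = [100, 70, 60, 55, 50, 45, 40, 35, 30, 25, 20, 15]
print(overall_usage(smt_on))   # ~45% overall, yet the frame rate is capped by
                               # the one thread that is already at 100%

# Same work folded onto 6 physical cores with SMT disabled: overall usage
# climbs towards 80%, but the saturated main thread is still the limit.
smt_off = [100, 90, 85, 80, 75, 70]
print(overall_usage(smt_off))  # ~83% overall
```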
 
What other conclusions could be drawn from an underused CPU and GPU with multi-threading enabled, when disabling multi-threading results in a perf boost and higher CPU + GPU usage?
Your conclusion was that it doesn't use more than 6 threads, which is incorrect, as 12-core and 16-core CPUs show load on all cores when playing Cyberpunk.
 
They should target, for one of the modes, high settings without RT at 30 fps and a dynamic resolution above 1440p. 1188p is very blurry, so I wouldn't play a low-resolution RT mode. (BTW, one of the DF videos suggests the PS5 GPU is at RX 5700, not XT, performance because of a lack of bandwidth, so they make mistakes too ;))

The CP2077 settings/perf sound reasonable. The PS5 is faster than a 5700, with the 5700 XT a bit above it. The lack of bandwidth can be addressed by lowering resolution and/or RT effects.

OK maybe they are. Wow, just saw the tweet that they're withholding product samples from reviewers for focusing on rasterization. That's awful.

Let's not pollute the DF thread with this; there's a separate thread for it now, I think.
 
PS5 and XSX are kneecapped by BC for CP2077. They are running GCN instructions, not the ones for RDNA 2. There's not a lot to discuss in this case until they release the next-gen patches. Which... considering how things are looking, could take a while to arrive.

I did just check; it's XDK;Durango.

So definitely not the GDK and definitely running GCN instructions for both.

Also, just to speak to the poor CPU performance: one second, let me put up a picture.
Eh, just wait for it; I'm having trouble getting the picture off my phone.
[Image: AShA1VA.jpg]


Bottom left-hand corner: Oodle Kraken, whose purpose here is largely texture compression. So this could explain why the CPUs are being stressed so much. Lowering resolution may not be enough to resolve the frame-rate issues while driving, because the game needs to decompress the textures on the CPU.

The PS5 won't have to, though, so it walks away with a huge benefit here. Not sure if they'll use something else for XSX.

The X1X-enhanced CP2077 is 53.8 GB, or 58.3 GB, in size (I'm mixing up the numbers here). But this is _much_ smaller than RDR2, FH4, Gears, and COD, which are all near 100 GB on XSX. And this game is 2 discs!

The PC version is also around 50 GB, which puts some explanation behind the CPU killing: a small install means heavy compression, and the CPU is what has to undo it.
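
A quick back-of-the-envelope for why CPU-side decompression bites hardest while streaming during fast traversal. Every number here is an assumption for illustration, not a measurement of Oodle Kraken or of the game.

```python
# Back-of-the-envelope sketch (all numbers are assumptions, not measurements):
# how CPU-side texture decompression can eat into the frame budget while
# streaming assets during fast traversal (e.g. driving).

def decomp_ms_per_frame(stream_mb_per_s, decode_mb_per_s_per_core, cores, fps):
    """Milliseconds per frame spent decompressing streamed data on the CPU."""
    mb_per_frame = stream_mb_per_s / fps
    decode_rate = decode_mb_per_s_per_core * cores   # aggregate decode MB/s
    return (mb_per_frame / decode_rate) * 1000.0

# Hypothetical values: 200 MB/s of compressed data streamed while driving,
# ~500 MB/s decode throughput per core, 2 cores' worth of spare time, 30 fps.
print(decomp_ms_per_frame(200, 500, 2, 30))  # ~6.7 ms out of a 33.3 ms budget

# A console with a hardware decompressor offloads that work entirely,
# which is the benefit being described for the PS5 above.
```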
 
I forgot about DLSS. I understand it doesn't scale exactly like this, but if your 2070 Super can run the game with everything at ultra at around 4 fps, then lowering the resolution by 4x could produce 16 fps, probably higher, correct?
I expect the PS5 to be more powerful than your 2070 Super; no offense to your rig, just stating facts.

The 3090 might cost 4 times as much as an entire PS5, but its actual performance is not even twice that of the console. Without RT and DLSS, the PS5 should run the exact same settings at 1440p as the 3090 does at 4K. Below 1440p you should see higher frame rates. Again, not taking into account RT or DLSS. With RT enabled, the PS5 would need to drop to 1080p to produce the same quality of graphics, but with higher FPS.

DLSS might be the game changer, though; I don't expect the consoles to have their alternative ready when the PS5 / Series X upgrade drops.
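
For reference, the naive pixel-count scaling that argument assumes, as a minimal sketch; real frame rates never scale this cleanly (CPU limits, bandwidth, fixed per-frame costs).

```python
# Naive "fps scales with pixel count" arithmetic used in the post above.
# Treat this purely as the upper bound the argument assumes.

RESOLUTIONS = {
    "4K":    (3840, 2160),
    "1440p": (2560, 1440),
    "1080p": (1920, 1080),
}

def pixels(name):
    w, h = RESOLUTIONS[name]
    return w * h

def naive_fps(base_fps, base_res, target_res):
    """Scale frame rate by the inverse of the pixel-count ratio."""
    return base_fps * pixels(base_res) / pixels(target_res)

print(pixels("4K") / pixels("1440p"))   # 2.25x the pixels of 1440p
print(pixels("4K") / pixels("1080p"))   # 4.0x  the pixels of 1080p
print(naive_fps(4, "4K", "1080p"))      # 4 fps at 4K -> ~16 fps at 1080p
```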
I think you may be expecting too much. My 2070 Super is factory overclocked, and in raw pixel fillrate it is maybe 15% slower than the PS5. Texel performance is closer, and RT performance is much better. Not only that, but AMD's RT hardware isn't bespoke: it's part of the same hardware used for regular rasterization workloads. That's part of the reason why RDNA cards take such a performance hit while using RT. Not only is the RT slower, but most games combine RT effects with a raster/compute workload, and the hardware can't do both at once.
 
No, what I said and what you quoted is:
"Without RT and DLSS, the PS5 should run the exact same settings at 1440p as the 3090 does at 4K."

This still makes no sense. Do you mean the same settings and frame rate? Because "running at the exact same settings" doesn't really mean anything unless it's also playable.

The 3090 has very weak rasterization

You're not doing your credibility any favours with absurd statements like that.

In any case, with no RT and DLSS, the PS5 will outperform the 3090 when the PS5 is at 1440p native compared to the 3090 at 4K native.

No, it wouldn't. This benchmark shows the 3090 at 4K outperforming the 5700 XT at 1440p by 29%. So unless you're expecting the PS5 to be more than 30% faster than a 5700 XT, it's at best going to be a tie. And from the Valhalla face-off, the conclusion was that the PS5 was performing roughly 15% higher than the 5700 XT. Obviously that's just one data point, but when it's all we have (and it makes sense based on the relative specs), there's no point assuming something wildly different.

And besides all that, I don't know what you're expecting to gain from such a comparison. Even if the PS5 were as fast as the 3090 when the 3090 is running at 2.25x the resolution and has its two biggest hardware advantages ignored, why is that in any way meaningful?
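
Putting those two figures together (the baseline fps is arbitrary; only the ratios from the benchmark and the Valhalla face-off matter):

```python
# Combining the 29% and 15% figures cited above. The baseline fps is
# arbitrary; only the ratios matter.

xt_1440p   = 100.0              # 5700 XT at 1440p, arbitrary baseline
rtx3090_4k = xt_1440p * 1.29    # 3090 at 4K: +29% over the 5700 XT at 1440p
ps5_1440p  = xt_1440p * 1.15    # PS5: ~+15% over the 5700 XT (Valhalla data point)

print(rtx3090_4k / ps5_1440p)   # ~1.12: the 3090 at 4K still ~12% ahead of
                                # the PS5 at 1440p under these assumptions
```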

DLSS produces quite a lot of shimmering and artifacts in this title, but that is beside the point

There's a different thread for people who want to complain about DLSS image quality because they don't have access to it on their platforms:

https://forum.beyond3d.com/threads/...g-discussion-spawn.60896/page-29#post-2182172
 
[Image: upload_2020-12-11_15-9-57.png]

[Image: upload_2020-12-11_15-13-58.png]

Like I said, without DLSS and RT, at 1080p the PS5 would outperform the 3090 at 4K, with 20% higher frame rates.

Mind you, this is an Nvidia-sponsored title, with even the 1080 Ti performing at the level of the 5700 XT, which is a total joke in itself.

Not to mention Tom's Hardware is not allowed to show RT without DLSS, but yeah.
In any case, I expect a PS5 frame-rate mode at a locked 1080p/60, maybe with some smart upscaling like Spider-Man as well.
 
[Attachment 5071]

[Attachment 5072]

Like I said, without DLSS and RT, at 1080p the PS5 would outperform the 3090 at 4K, with 20% higher frame rates.

Mind you, this is an Nvidia-sponsored title, with even the 1080 Ti performing at the level of the 5700 XT, which is a total joke in itself.

Not to mention Tom's Hardware is not allowed to show RT without DLSS, but yeah.
In any case, I expect a PS5 frame-rate mode at a locked 1080p/60, maybe with some smart upscaling like Spider-Man as well.
You're looking at the Ultra preset. Typically consoles sit between medium and high at most, with a mix of settings that gets them the most optimal performance. Only in very exceptional cases do they have a feature or two at ultra or above.

A 3090 running console settings would show a significant performance boost over the metrics posted.
 
[Attachment 5071]

[Attachment 5072]

Like I said, without DLSS and RT, at 1080p the PS5 would outperform the 3090 at 4K, with 20% higher frame rates.

You said 1440p earlier, not 1080p. I've literally quoted you in the post above yours. If you're now saying 1080p, then yes, I agree. The PS5 should be able to run a non-ray-traced version of CP2077 faster than the 3090 while rendering at 25% of the resolution. It'd be pretty terrible if it couldn't, eh?

But given that the patched console version will likely use RT, and the 3090 can and does use DLSS, this is a completely moot point.

Mind you, this is an Nvidia-sponsored title, with even the 1080 Ti performing at the level of the 5700 XT, which is a total joke in itself.

The nearly four-year-old 1080 Ti vs the 1.5-year-old 5700 XT?

In any case, I expect a PS5 frame-rate mode at a locked 1080p/60, maybe with some smart upscaling like Spider-Man as well.

Without RT, yes I agree.
 
For every other title it's 50% of the resolution; I didn't expect this game to be that Nvidia... "optimized".

First search hit for a 3090 benchmark:
https://www.techspot.com/review/2105-geforce-rtx-3090/

At any given resolution the 3090 produces only (around) double the frame rate of the 5700 XT.

It is Cyberpunk where the results are totally out of whack in favor of Nvidia. I would go as far as to say it's the worst AMD title in years.
 
For every other title it's 50% of the resolution; I didn't expect this game to be that Nvidia... "optimized".
Without knowing what has or hasn't been optimized, there's not really anything to discuss here.
By default, all AMD hardware gets some optimization as a result of console programming.

You want to talk about optimized: this game uses Oodle Kraken on the PC and PS5 platforms, and likely for Xbox as well. So everyone is being forced to do Oodle Kraken CPU decompression, while the PS5 is enjoying its hardware decompressor. The PS5 is likely going to perform very well against a comparable PC setup.
 
https://www.tomshardware.com/news/cyberpunk-2077-pc-benchmarks-settings-performance-analysis

Based on this, the game is not that bad: at most resolutions the 5700 XT is only at around half the frame rate of the 3090. It's at 1440p and above where the 5700 XT drops to around a third or less.

Edit: I think we can forget about console ray tracing...

Even for the 2070 it's 41 vs 14 FPS with RT off vs on...
Unless they plan to run the Series X and PS5 at 720p, with ray tracing at 360p or something :p
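
The rough arithmetic behind that quip, assuming frame cost scales with pixel count and the RT overhead ratio holds across resolutions (neither is strictly true):

```python
# Rough arithmetic only; real RT cost does not scale this simply.

rt_off_fps = 41.0   # RTX 2070, RT off (figures quoted in the post)
rt_on_fps  = 14.0   # RTX 2070, RT on

rt_cost = rt_off_fps / rt_on_fps
print(round(rt_cost, 2))        # ~2.93: RT roughly triples the frame cost

# To claw that back through resolution alone you would need ~1/2.93 of the
# pixels, i.e. roughly a third. Dropping 1440p (3.69 MP) to 720p (0.92 MP)
# cuts pixels to a quarter, which is the spirit of the 720p / "RT at 360p" joke.
print(round(1 / rt_cost, 2))    # ~0.34
print(round(0.92 / 3.69, 2))    # ~0.25
```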
 
For every other title it's 50% of the resolution; I didn't expect this game to be that Nvidia... "optimized".

First search hit for a 3090 benchmark:
https://www.techspot.com/review/2105-geforce-rtx-3090/

At any given resolution the 3090 produces only (around) double the frame rate of the 5700 XT.

It is Cyberpunk where the results are totally out of whack in favor of Nvidia. I would go as far as to say it's the worst AMD title in years.

Sorry, was this in response to my post? I'm not clear what you're referring to. It's fair to say that in pure rasterization performance the PS5 might be around half the performance of the 3090. Probably less than half at 4K and more than half at 1440p.

But the 3090 simply isn't designed to hit its peak performance in pure rasterization / non-DLSS workloads. It dedicates significant die space to RT and DLSS hardware, so if you want to see its real performance potential you have to test with those features enabled.

And Cyberpunk 2077 is the perfect test of this.
 