Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

That 448 GB/s the 5700XT has is for itself. Its clock speed is largely determined by its own ability to stay cool as well. It's hard to get a read, because really, if the PS5 drops 10% in clock speed (~200 MHz) it's at approximately the same clock speed as the 5700XT, with fewer CUs and less bandwidth.
The 5700XT has that 448 GB/s to itself in Valhalla and Call of Duty: Cold War as well, and the PS5 is still a little ahead in those games.
 
The 5700XT has that 448 GB/s to itself in Valhalla and Call of Duty: Cold War as well, and the PS5 is still a little ahead in those games.
Yea, and that's exactly why I wouldn't pin the PS5 to any single expectation. The PS5 exists in a range of values. You can't expect this type of output repeatedly, since the code bases will change over time. Besides, none of those games are Hitman.

As things mature and COVID goes away, companies will have more time to optimize, etc. I wouldn't look at the first few data points and project them all the way to the end.

Also, a single game will never represent the 'paper' spec of the hardware. You get an idea of a card's performance by looking over a range of games.

You'll also notice the 5700XT moving up and down, at times beating RTX-series cards here and there where it couldn't before.

There's certainly space for movement.
 
It's a 40 CU, 1900 MHz RDNA GPU that can boost above 1900.
Subtract 4 CUs, crank the max boost up 300 MHz, and you're at the PS5,
with likely the same cache setup and the same memory setup.
There's no performance metric I'm aware of that currently separates RDNA 1 from RDNA 2 CUs aside from feature set.

There's no closer GPU we can find unless they release a 6600 or something.
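For reference, here's the back-of-the-envelope math as a quick Python sketch. It assumes the usual 64 ALUs per CU and 2 FLOPs (FMA) per ALU per clock, and uses the advertised peak clocks; real sustained clocks sit below these.

Code:
# FP32 throughput estimate: CUs x 64 ALUs x 2 FLOPs (FMA) x clock
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

print(f"5700XT (40 CU @ 1.905 GHz boost): {tflops(40, 1.905):.2f} TF")  # ~9.75
print(f"PS5    (36 CU @ 2.23 GHz peak):   {tflops(36, 2.23):.2f} TF")   # ~10.28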

Thanks! That makes sense -- with my (pretty layman) knowledge here, that sounds like it should be a worse GPU than the PS5's: if you're only a little narrower but faster, that probably means less time waiting on other parts of the GPU when your code path isn't perfectly parallel, right?

Unless your theories above about the boost system really hurting peak GPU performance are valid, which I'm pretty dubious about (it seems unlikely Sony would design a machine that's really that much worse at high CPU load than the XSX, especially considering it's less equipped to push workload onto the GPU as compute shaders).
 
Yea, and that's exactly why I wouldn't pin the PS5 to any single expectation. The PS5 exists in a range of values. You can't expect this type of output repeatedly, since the code bases will change over time.

As things mature and COVID goes away, companies will have more time to optimize, etc. I wouldn't look at the first few data points and project them all the way to the end.

Also, a single game will never represent the 'paper' spec of the hardware. You get an idea of a card's performance by looking over a range of games.

You'll also notice the 5700XT moving up and down, at times beating RTX-series cards here and there where it couldn't before.

There's certainly space for movement.
Sure, some games will prefer AMD cards and others Nvidia, but it's a different case when the PS5 GPU is on a similar arch. So I've just noticed that this is the first example where the PS5 GPU apparently behaves worse than the 5700XT.
 
Yea, and that's exactly why I wouldn't pin the PS5 to any single expectation. The PS5 exists in a range of values. You can't expect this type of output repeatedly, since the code bases will change over time. Besides, none of those games are Hitman.

As things mature and COVID goes away, companies will have more time to optimize, etc. I wouldn't look at the first few data points and project them all the way to the end.

Also, a single game will never represent the 'paper' spec of the hardware. You get an idea of a card's performance by looking over a range of games.

You'll also notice the 5700XT moving up and down, at times beating RTX-series cards here and there where it couldn't before.

There's certainly space for movement.

The 6700 could be only 36 CUs, and the 6700 XT 40 CUs at a higher frequency than the 5700XT, with the RDNA 2 feature set; the only catch is they will have Infinity Cache. The console GPUs are RDNA 2 but without Infinity Cache.
 
It's a 40 CU, 1900 MHz RDNA GPU that can boost above 1900.
Subtract 4 CUs, crank the max boost up 300 MHz, and you're at the PS5,
with likely the same cache setup and the same memory setup.
There's no performance metric I'm aware of that currently separates RDNA 1 from RDNA 2 CUs aside from feature set.

There's no closer GPU we can find unless they release a 6600 or something.

Indeed, I've said it before: a 5700XT OC'd is as close to PS5 GPU performance as it gets, with the advantage to the 5700XT OC due to bandwidth and extra CUs. Obviously the 5700 series lacks ray tracing, but that feature seems generally unimportant to the PS userbase anyway.

The 5700XT has that 448 GB/s to itself in Valhalla and Call of Duty: Cold War as well, and the PS5 is still a little ahead in those games.

A 5700XT OC (most 5700XTs you buy are actually factory OC'd) is very close. I will have to borrow or get a 5700XT and bump the clocks a bit (Afterburner) and see; so far I've been using someone else's PC to compare to my PS5, with limited time.
A 5700XT will do 2100 MHz.
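Running the same napkin math as the TFLOPS sketch above at that clock: 40 CU x 64 ALUs x 2 FLOPs x 2.1 GHz ≈ 10.75 TF, which on paper would put an OC'd 5700XT slightly past the PS5's ~10.28 TF peak, before even counting the dedicated bandwidth.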

Thanks! That makes sense -- with my (pretty layman) knowledge here, that sounds like it should be a worse GPU than the PS5's: if you're only a little narrower but faster, that probably means less time waiting on other parts of the GPU when your code path isn't perfectly parallel, right?

Yes, that's why you'd want to OC the 5700XT, which is no problem, to match the clocks a bit more. It then has the advantage.

The 6700 could be only 36 CUs, and the 6700 XT 40 CUs at a higher frequency than the 5700XT, with the RDNA 2 feature set; the only catch is they will have Infinity Cache. The console GPUs are RDNA 2 but without Infinity Cache.

Yeah, I'd put the PS5 at 6700 levels if they didn't have IC. Also, I think there are some more features missing compared to the dGPUs? Anyway, the 5700XT, if OC'd, is the closest we can get from AMD. Comparing to Nvidia products is obviously going to complicate things even more.
 
Sure, some games will prefer AMD cards and others Nvidia, but it's a different case when the PS5 GPU is on a similar arch. So I've just noticed that this is the first example where the PS5 GPU apparently behaves worse than the 5700XT.
Well... I think it's normal. The consoles will perform over a range of things; this is normal for any distribution of data. Game development isn't as precise and regulated as Formula 1. There are going to be large deviations in how titles perform, and therefore in how the hardware performs against them.
Interestingly enough, Simulation Quality – Graphics didn’t affect my overall performance at all in Dubai: there were larger crowds and more people milling about, but my average frame-rate didn’t budge much between Base and Best. The story changed once I jumped into the Dartmoor benchmark. At the Base setting glass shatters and bullet impacts are rendered, but that’s really about it. At Best, however, full environmental destruction is enabled. Books fly off the manor’s library shelves as torn pages fill the air like snow, pillars crumble into stone and dust, and wooden furniture is shredded into splinters. It’s a lot for a CPU to take in, and the eight-core recommendation for the Best setting makes sense after running the benchmark a few times.

Oddly enough, my six-core i7 8700K handled the destruction with little issue, with one major exception. The benchmark stalled and froze briefly at the beginning as the first bookshelf was demolished in a hail of gunfire, as all the cores and threads in my CPU appeared to “wake up”. Once all the cores were spun up and in sync, my CPU handled the rest of the benchmark without issue, despite there being more devastation and particles to render. That said, the delta between running the benchmark with Sim Quality set to Best and running it at Base is stark: I averaged 98.01 FPS with the Sim Quality Best setting enabled, and 139.19 FPS with it set to Base.
https://attackofthefanboy.com/articles/hitman-3-pc-performance-is-to-die-for/
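Translating those quoted averages into frame time makes the cost of the destruction sim clearer (a quick sketch using only the figures above):

Code:
# Frame-time delta between Sim Quality Best and Base, from the quoted FPS
best_fps, base_fps = 98.01, 139.19
ms_best, ms_base = 1000 / best_fps, 1000 / base_fps
print(f"Best: {ms_best:.1f} ms  Base: {ms_base:.1f} ms  delta: {ms_best - ms_base:.1f} ms")
# -> Best: 10.2 ms  Base: 7.2 ms  delta: 3.0 ms
# ~3 ms is roughly 18% of a 16.7 ms (60 fps) frame budget.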

This game can be CPU-heavy, and that can affect framerate. You're going to need to factor this in. I don't know if @Dictator can determine what the CPU settings are by eyeballing the console versions, but for PS5 in particular you need to take this into consideration. If there is no DRS system, IOI needs to create headroom for the CPU for when those moments come. But this particular setting is not covered in any console report I've seen so far.
 
Hmm, to be thoughtful here on PC benchmarks: PC benchmark rigs are built to place the bottleneck on the GPU only. They are given obscene CPUs and memory to ensure they are measuring the GPU in isolation.

The PS5 is a complete system: the CPU is not obscene, and it's paired with an equivalent GPU. It shares its power between the two, with an upper limit on power draw. And it shares its memory.

So while we can get an idea of where the PS5 might land, any game that is extremely CPU-taxing should be taken into consideration for its effect on the consoles, as they have no way to benchmark without it. More CPU load means more bandwidth taken away from the GPU on both consoles. More CPU load on the PS5 means clock speed taken away as well.

That 448 GB/s the 5700XT has is for itself. Its clock speed is largely determined by its own ability to stay cool as well. It's hard to get a read, because really, if the PS5 drops 10% in clock speed (~200 MHz) it's at approximately the same clock speed as the 5700XT, with fewer CUs and less bandwidth.

So IMO I don't think it's necessarily fair to state that the PS5 needs to perform at this GPU level or that GPU level. The PS5 likely performs in a range of values given how it's being used.

I think we've tried to explain this in the past, but people seem to only want to look at the best-case scenario for the PS5, i.e. 100% clock rate for CPU and GPU at all times, with no bandwidth loss.
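To make the "range of values" point concrete, here's a toy model of that kind of coupling. To be clear, this is not Sony's actual algorithm -- the boost curves haven't been published -- so the linear shape and the load numbers below are invented purely for illustration, with only the 10% worst-case drop taken from the figure discussed above:

Code:
# Toy model only: GPU clock shedding as CPU load rises under a shared
# power budget. Invented shape and numbers; shows direction, not magnitude.
def gpu_clock_mhz(cpu_load: float, max_clock: float = 2230.0,
                  max_drop: float = 0.10) -> float:
    return max_clock * (1.0 - max_drop * cpu_load)

for load in (0.25, 0.50, 1.00):
    print(f"CPU load {load:.0%}: GPU ~{gpu_clock_mhz(load):.0f} MHz")
# -> 25%: ~2174 MHz, 50%: ~2118 MHz, 100%: ~2007 MHz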

I do agree with all of this, but I think in the case of this particular benchmark we're almost certainly not looking at a CPU limitation. On the PC side the game is hitting 267 fps at 1080p Ultra, and although that's on a Ryzen 9 5950X, the PS5's CPU shouldn't be breaking a sweat to hit a quarter of that frame rate (~67 fps).
 
With both consoles supporting VRR, hopefully we'll start seeing more uncapped modes, allowing for proper performance comparisons.

The Sony PS5 does not support VRR yet, unless I missed an updated firmware whose release notes say they now support it. They said it would come at some future time.
 
I do agree with all of this, but I think in the case of this particular benchmark we're almost certainly not looking at a CPU limitation. On the PC side the game is hitting 267 fps at 1080p Ultra, and although that's on a Ryzen 9 5950X, the PS5's CPU shouldn't be breaking a sweat to hit a quarter of that frame rate (~67 fps).
Yea, I agree, there are going to be varying levels to consider here. But the last-gen consoles have frame-rate issues with this game; they are all locked at 30 fps. So it's clearly heavy enough.
The 60 fps mode on the 4Pro, AFAIK, is interpolated for VR. While it retains the smoothness of 60 fps, it's quite blurry as I understand it from the articles. Regardless, the game looks to be heavy enough that Jaguar cores cannot run it at 60.
 
I do agree with all of this, but I think in the case of this particular benchmark we're almost certainly not looking at a CPU limitation. On the PC side the game is hitting 267 fps at 1080p Ultra, and although that's on a Ryzen 9 5950X, the PS5's CPU shouldn't be breaking a sweat to hit a quarter of that frame rate (~67 fps).

The PC doesn't have a dynamic system where pushing the GPU hard takes away from the CPU.
 
I do agree with all of this, but I think in the case of this particular benchmark we're almost certainly not looking at a CPU limitation. On the PC side the game is hitting 267 fps at 1080p Ultra, and although that's on a Ryzen 9 5950X, the PS5's CPU shouldn't be breaking a sweat to hit a quarter of that frame rate (~67 fps).
Yep. The game even runs mostly at 60 fps on the Pro (at 1080p), so on a CPU roughly 3 or 4 times less powerful than the Zen 2 CPUs. This is a game that's heavily GPU-bound (them teraflops).
 
Excuuuuuse me?

Downclocking doesn't happen on PC GPUs as far as I'm aware, not below the advertised minimum clocks. They're given a max boost, which in practice they sustain pretty much all the time. Nor does this balancing exist where CPU clocks go down if the GPU gets hammered. I think it's a PS5 exclusive, because the XSX doesn't do such a thing either.
 
Downclocking doesn't happen on PC GPUs as far as I'm aware, not below the advertised minimum clocks. They're given a max boost, which in practice they sustain pretty much all the time. Nor does this balancing exist where CPU clocks go down if the GPU gets hammered. I think it's a PS5 exclusive, because the XSX doesn't do such a thing either.

They may not downclock based on CPU load/clocks, but desktop GPUs do downclock for other reasons, mainly thermal concerns.
 

An interview with a French engineer working as a support programmer on Unreal Engine at Epic Games in Japan. The interview is in French, but at the end there are a few questions about the PS5 and Xbox Series consoles. For him, the main innovation is the SSD, with streaming no longer being a constraint*, and he thinks the gap from PS4/XB1 to PS5/Xbox Series is bigger than the gap between the PS3/360 and PS4/XB1 generations.

* And the SSDs inside the consoles are excellent.

EDIT: For the gap to materialize, he said to be patient. He has a PS5 and no Xbox Series, and he thinks Spider-Man and Demon's Souls are solid, but we will see better. His favorite console of all time is the Xbox 360.
 