Current Generation Hardware Speculation with a Technical Spin [post launch 2021] [XBSX, PS5]

For comparison, I did a quick test on my XSX: Control with RT draws around 140W during gameplay (very early in the game, first 10 minutes, just bought it).
With photo mode the power draw jumps to 170-180W (again early in the game). Gears 5 Hivebusters DLC is around 190W (180-190W), Ori 2 160W+.
 
I don't know why the PS5 would be limited to ~205W at the wall (~175W DC) if they have a 350W PSU, the cooling fan seems to have a lot of headroom without becoming loud and it looks like the 16-phase VRMs could support a lot more than that.

They do need to take away some 20-30W DC for the VR headset, but the console could probably still pull 300W at the wall without any problems.
As you said, they need to account for the VR headset, and it probably depends on the PSU efficiency. In recent consoles Sony has historically used quite low-efficiency PSUs.
Just tested Destruction AllStars, and in the arena entry scene power spikes to 225W just before actual gameplay, so it's not limited to 200W.
I don't think it's a meaningful number, as it's the first time I've heard about it. Could it be due to some sudden increase in power consumption being misread by your device?
 
As you said, they need to account for the VR headset, and it probably depends on the PSU efficiency. In recent consoles Sony has historically used quite low-efficiency PSUs.
Lower efficiency would affect how much more power the PSU needs to take at the wall for the same DC output, not how much less DC output it can provide compared to its rated values.
E.g. a (super) low 60% efficiency 300W PSU would need to pull 300/0.6 = 500W at the wall, but it should still be able to provide 300W nonetheless.
The PS5's PSU has a rated max output of 350W (or 340W on the discless version). Take away 30W for a headset (which is still a ridiculously high figure for a VR headset) and the PSU should still be able to provide 310W for the rest of the console. At the moment it's providing just ~180W max, so there's a lot of headroom left.
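To make the headroom argument concrete, here is a minimal sketch of that wall-vs-DC arithmetic in Python. The 85% efficiency figure is my own assumption for illustration; only the 350W rating, the ~205W wall reading and the 30W headset budget come from the posts above.

```python
# Toy sketch of the wall-power vs DC-output relationship discussed above.
# The 85% efficiency is an assumption for illustration; only the 350W rating,
# the ~205W wall reading and the 30W headset budget come from the posts.

def wall_power(dc_output_w: float, efficiency: float) -> float:
    """Power drawn at the wall for a given DC output and PSU efficiency."""
    return dc_output_w / efficiency

psu_rating_w = 350.0        # rated max DC output (disc version)
headset_budget_w = 30.0     # DC reserved for a VR headset
assumed_efficiency = 0.85   # assumption, not a measured value

available_dc = psu_rating_w - headset_budget_w
measured_wall_w = 205.0
estimated_dc_in_use = measured_wall_w * assumed_efficiency

print(f"DC left for the console: {available_dc:.0f} W")
print(f"Estimated DC draw today: {estimated_dc_in_use:.0f} W")
print(f"Wall draw if the full budget were used: {wall_power(available_dc, assumed_efficiency):.0f} W")
```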

However, this is still the first PS5 model so it's probable that both the PSU and cooling systems were over-engineered while Sony wasn't sure how the SoC would behave.
Same thing happened with the first-gen fat PS3.


A poster on NeoGAF made quite an interesting power consumption simulation of a PC with hardware close to the PS5:
https://www.neogaf.com/threads/power-analysis-what-happens-if-you-spec-a-pc-like-a-ps5.1597916/
It's impressive how close he got to the PS5's power consumption and performance with a PC.

Also, that Navi 22 pulling only 125-130W of power at 2.21GHz shows how far away from its ideal power-performance curve the 6700XT is. It's also a nice indicator of how good the mobile Navi 22 GPUs could be.
 
Also, that Navi 22 pulling only 125-130W of power at 2.21GHz shows how far away from its ideal power-performance curve the 6700XT is. It's also a nice indicator of how good the mobile Navi 22 GPUs could be.
50% more power to get just 200 more MHz. The power curve is cubic, and I'm pretty sure the trend here still roughly fits.
If they dropped it 400MHz to 1.8GHz, they'd lose another 50%, down to 65-70W I think.
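As a rough sanity check of that cubic rule of thumb, here is a small sketch anchored to the ~127W @ 2.21GHz figure from the NeoGAF test. The 1.8GHz number is an extrapolation, not a measurement, and in the other direction the real 6700XT climbs faster than f³ because voltage rises near the top of the curve.

```python
# Rough check of the "power scales roughly with frequency cubed" rule of thumb.
# Anchored to the ~127W @ 2.21GHz figure from the NeoGAF test; the 1.8GHz value
# is an extrapolation, not a measurement. Going up in clocks the real card is
# worse than cubic, because voltage climbs steeply near the top of the curve.

def scaled_power(p_ref_w: float, f_ref_ghz: float, f_ghz: float) -> float:
    """Estimate GPU power assuming P ~ f^3 (voltage scaling with frequency)."""
    return p_ref_w * (f_ghz / f_ref_ghz) ** 3

p_ref, f_ref = 127.0, 2.21   # reference point from the test above

for f in (2.21, 1.80):
    print(f"{f:.2f} GHz -> ~{scaled_power(p_ref, f_ref, f):.0f} W")
```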
 
It depends on how hardware support for ML, VRS, Mesh Shaders and SFS will affect the difference in performance between the consoles. And on how multiplatform developers will use it.

Also, we do not know how much the PS5's GPU frequency drops when the CPU is fully loaded. And how multiplatform developers will use SmartShift.
According to Mark Cerny, they could not get the GPU to run stably at 2.0 GHz and the CPU at 3.0 GHz at the same time.
+ There was a questionable leak, which may well be close to the truth:
[attached image: V8q8RzG.png]

This is not a good source.
 
How could those values ever be close to the truth?
1.2GHz GPU for a 3.5GHz CPU? We just watched a guy clock a 6700XT at 2.21GHz and an 8-core Zen 2 CPU at 3.5GHz (+ 128-bit DDR4 at 3600MT/s) and pull just 200W at the wall, which is what the PS5 pulls when playing games.
How does suggesting the PS5 needs to practically halve the GPU clocks compared to a larger discrete RDNA2 GPU to get the same power consumption mean anything other than pointless FUD?
 
Whatever the actual clocks of those specific debug modes, those are only used for testing / profiling during development. All retail PS5s are set in automatic mode.
 
Whatever the actual clocks of those specific debug modes, those are only used for testing / profiling during development. All retail PS5s are set in automatic mode.
Yes, it works automatically, but it's something of a see-saw.
When the GPU is maxed out, the CPU has to significantly lower its frequency. And vice versa.
How multiplatform developers will deal with this remains to be seen.

How could those values ever be close to the truth?
1.2GHz GPU for a 3.5GHz CPU? We just watched a guy clock a 6700XT at 2.21GHz and an 8-core Zen 2 CPU at 3.5GHz (+ 128-bit DDR4 at 3600MT/s) and pull just 200W at the wall, which is what the PS5 pulls when playing games.
How does suggesting the PS5 needs to practically halve the GPU clocks compared to a larger discrete RDNA2 GPU to get the same power consumption mean anything other than pointless FUD?
Did he test CPU-dependent games with low graphics settings (so that the CPU and GPU are both heavily loaded)?
+ Navi 22 =/= PS5 GPU architecture
Just as a single retail video card =/= the PS5 GPU, since you can get both very good and not-so-good samples.

And you must have missed my previous post:
«Running a GPU at 2 GHz was looking like an unreachable target with the old fixed frequency strategy.»
«Similarly running the CPU at 3 GHz was causing headaches with the old strategy.»


Because they have different architectures, designs and cooling systems.
MS spent a lot of time making the CPU run stably at 3.6/3.8 GHz. And this is the hottest point on the chip.


For most past-gen or cross-gen games, the 2.2 GHz GPU and 2.0 GHz CPU (with multithreading) mode should be enough to double the frame rate (and improve the graphics).
+ Xbox has some problems with its toolset at the moment
 
This paraphrase
According to Mark Cerny, they could not get the GPU to run stably at 2.0 GHz and the CPU at 3.0 GHz at the same time.
And this statement

«Running a GPU at 2 GHz was looking like an unreachable target with the old fixed frequency strategy.»
«Similarly running the CPU at 3 GHz was causing headaches with the old strategy.»
Are not the same. If you keep watching from your timestamp, he says:

"with this new paradigm we are able to run way over that, in fact we have to cap the GPU frequency at 2.23GHz so that we can guarantee that the on chip logic operates properly"
 
This paraphrase

And this statement

Are not the same. If you keep watching from your timestamp, he says:

"with this new paradigm we are able to run way over that, in fact we have to cap the GPU frequency at 2.23GHz so that we can guarantee that the on chip logic operates properly"
Frequency mode switching can happen right in the middle of a frame. So nominally we can say that, for each individual frame, both the CPU and GPU were working at their maximum frequencies, but in fact it happens alternately.

And in order to get the most out of this "new paradigm", a special approach to engine/game development is needed.
It will not work effectively on its own.
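Here is a toy model of that see-saw, assuming a fixed combined power budget that gets redistributed between the CPU and GPU. Every number in it is invented for illustration; none of them are actual PS5 or SmartShift values.

```python
# Toy model of a shared CPU+GPU power budget ("see-saw") like the one described
# above. All numbers are invented for illustration; they are not PS5 values.

TOTAL_BUDGET_W = 200.0               # assumed combined SoC power budget
CPU_MAX_W, GPU_MAX_W = 60.0, 180.0   # assumed per-block ceilings

def split_budget(cpu_demand_w: float):
    """Give the CPU what it asks for (up to its cap); the GPU gets the rest."""
    cpu_w = min(cpu_demand_w, CPU_MAX_W)
    gpu_w = min(TOTAL_BUDGET_W - cpu_w, GPU_MAX_W)
    return cpu_w, gpu_w

for demand in (20, 40, 60):   # light, medium, heavy CPU load within one frame
    cpu_w, gpu_w = split_budget(demand)
    print(f"CPU asks {demand:>2} W -> CPU {cpu_w:.0f} W, GPU {gpu_w:.0f} W")
```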
 
Still has an abundance of bandwidth that's not available on consoles.
It's about power consumption (a 40-CU RDNA2 GPU clocked like the PS5's) and not performance mostly, though I'm curious how you calculated the bandwidth, as it's only 384 GB/s. Yes, it has 96MB of L3 cache that helps a lot, but talking about bandwidth abundance in the 6700XT's case is an overstatement :d
 
It's about power consumption (a 40-CU RDNA2 GPU clocked like the PS5's) and not performance mostly, though I'm curious how you calculated the bandwidth, as it's only 384 GB/s. Yes, it has 96MB of L3 cache that helps a lot, but talking about bandwidth abundance in the 6700XT's case is an overstatement :d

The Infinity Cache is a 2x+ effective bandwidth multiplier according to AMD, and that holds up considering how well the 6900 XT does with only 512GB/s.
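A minimal sketch of why a big last-level cache acts as an effective bandwidth multiplier: only misses have to touch GDDR6, so effective bandwidth is roughly DRAM bandwidth divided by the miss rate. The hit rates below are assumptions for illustration; only the 384GB/s figure comes from the post above.

```python
# Minimal sketch of a last-level cache as a bandwidth multiplier.
# Assumes the cache itself is never the bottleneck; the hit rates are
# illustrative assumptions, not AMD-published numbers for any specific game.

def effective_bandwidth(dram_bw_gbs: float, hit_rate: float) -> float:
    """Serviceable request bandwidth = DRAM bandwidth / miss rate."""
    return dram_bw_gbs / (1.0 - hit_rate)

dram_bw = 384.0   # 6700XT GDDR6 bandwidth from the post above, GB/s

for hit_rate in (0.0, 0.5, 0.6):
    print(f"hit rate {hit_rate:.0%}: ~{effective_bandwidth(dram_bw, hit_rate):.0f} GB/s effective")
```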
 