PlayStation 5 [PS5] [Release: November 12, 2020]

Even at the same voltages, higher frequencies do impact temperatures negatively. At least, that's my experience with overclocking PC components. It's very complicated, so it's hard to know exactly what's going on, but higher frequencies mostly mean higher deltas.

I'm pretty sure it's not so complicated as to be beyond the engineers who designed the system.
 
PSman1700 said:
Which means they are really pushing that GPU to the max limits at those clocks.

Yeah, that's what they've been saying all along: higher clocks, hardware better utilized, but at the cost of power efficiency.
 
So if PS5 is running at constant power delivery... then is the fan going to be at "max" the entire time?
 
Seems the SSD penny has finally dropped and people are beginning to understand the potential.

Ironic that people finally got what they wanted, a super fast SSD and BC, but now they're showing so much concern over CUs, clocks, teraflops, etc. It only goes to show that raw TF power and bandwidth are of the utmost importance after all; let's not lie to ourselves any more.
PS5 is clearly a gimped design; it got its priorities wrong and tried hard to catch up to the competition. Sony is lucky it didn't dip into single digits marketing-wise, but the hardware hype is just not there; in fact it's on the depressing side. Maybe Mark Cerny should step down and let someone else take the reins of PS5 Pro? Yep, the Pro will be the only redemption if Sony goes wild with it.
Unless of course their exclusives look so good they eclipse the Series X games, in which case I'll have to readjust my judgement. But right now it's not ultra terrible, though there ain't no hype either.
Let's wait and see the exclusives (which will look amazing).

Really thinking about the BC situation with PS5. The load-time situation may very well make playing legacy games too much of a chore.

It will be like going from using a washing machine to hand-washing your clothes. I'm starting to question the value of BC, especially if the CU count was a function of it.
Based on?

So if I understood correctly, the ultra-fast data transfer rates will allow a lot more detail to be fed into the player's point of view at any given time; bye-bye texture/asset streaming issues and visible LoD transitions?
Bingo
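As a rough sense of scale, using the 5.5 GB/s raw read figure Sony quoted for the PS5 SSD (compressed throughput would be higher), the per-frame streaming budget works out like this:

```python
# Rough per-frame streaming budget based on Sony's quoted 5.5 GB/s
# raw read speed. Compressed throughput would be higher still.
RAW_THROUGHPUT_GBPS = 5.5  # GB/s, raw reads

def budget_per_frame(fps: int, throughput_gbps: float = RAW_THROUGHPUT_GBPS) -> float:
    """Return the streaming budget in MB per rendered frame."""
    return throughput_gbps * 1000 / fps

print(f"{budget_per_frame(60):.0f} MB/frame at 60 fps")  # ~92 MB
print(f"{budget_per_frame(30):.0f} MB/frame at 30 fps")  # ~183 MB
```

Being able to pull on the order of 90 MB of fresh assets every 60 fps frame is what makes streaming-in detail just outside the player's view plausible, rather than pre-loading whole areas.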

[image: AngWsd2.png]


So PS5 supports two types of legacy BC, legacy meaning it runs at the same clock speeds as the legacy device. PS5 Native mode runs the older BC code at PS5 speeds. This is where they tested the 100 titles and some of them had issues, but they are confident that most of them will be running in PS5 Native mode at launch.
I found this picture interesting; there's a gap big enough to add more... maybe?

I agree. Surely they could have shown some kind of tech demo. One like you describe would have been fantastic to get the potential across.
Release the info over several phases to keep interest.
 
Most likely not; not even my old i7 setup does that. When not gaming, voltages are down and so is frequency.

Right, but with PS5 it sounds like both power and cooling would constantly be running at 100%, and frequency only scales back when the workload exceeds a certain amount.

Edit: Unless some game engines can be more efficient at the same power level? Then maybe cooling would vary?
 
Right, but with PS5 it sounds like both power and cooling would constantly be running at 100%, and frequency only scales back when the workload exceeds a certain amount.

Edit: Unless some game engines can be more efficient at the same power level? Then maybe cooling would vary?

RL from Digital Foundry did an interview with Cerny last Monday. I'm sure we can get some details about how the power management works; it's really something very different from what we have in PCs at the moment.
 
Right, but with PS5 it sounds like both power and cooling would constantly be running at 100%, and frequency only scales back when the workload exceeds a certain amount.

Cooling can't be running at 100% all the time; that would leave no leeway for ambient temperatures and efficiency loss over time.
 
RL from Digital Foundry did an interview with Cerny last Monday. I'm sure we can get some details about how the power management works; it's really something very different from what we have in PCs at the moment.

It sounds like the system is running at a constant, set power budget and then clocking up to the cooling system's capabilities, which to me means the fan would have to be constantly engaged? (Of course, this would make the fan noise more predictable and therefore more optimally engineered for acoustics.)
 
It sounds like the system is running at a constant, set power budget and then clocking up to the cooling system's capabilities, which to me means the fan would have to be constantly engaged? (Of course, this would make the fan noise more predictable and therefore more optimally engineered for acoustics.)
From Dictator's post on Resetera, it seems developers will opt for more CPU or GPU power, and design accordingly.
 
I found this picture interesting; there's a gap big enough to add more... maybe?
The problem is that despite what the slide says about "legacy modes", Cerny didn't talk about any such modes, and he gave the impression there won't even be full backwards compatibility at launch.
I just re-watched the entire section. I believe the "Legacy modes" on that slide only mean the new RDNA 2 GPU can execute the specific code built for the different GCN versions in the PS4 and PS4 Pro, not that it would run at the same speed or performance in every scenario, and that backwards compatibility will in fact not cover the full PS4 library at launch.
 
So if PS5 is running at constant power delivery... then is the fan going to be at "max" the entire time?

Probably not. It's entirely possible for a game to not utilize the CPU and GPU fully.

Those AVX2 instructions are power hungry, and not every engine uses them a lot. It's very typical even for desktop CPUs to run at a lower clock when AVX2/AVX-512 is in use. It's a nice option to be able to trade GPU performance for more CPU and vice versa. Dynamic clocks would only be bad if ambient temperature affected them, but that is not the case with PS5; as implemented, performance will be consistent between consoles. Variable refresh rate should help even more, as having a locked 30/60/?? fps will eventually become a thing of the past and framerate will run free. Combine that with all the software-side optimizations and things are peachy.
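A toy model of that deterministic, activity-based clocking; this is purely my own sketch with made-up numbers, not Sony's actual algorithm. The key property is that frequency is derived from what the code is doing, not from measured temperature, so every console clocks identically on the same workload:

```python
# Hypothetical sketch of activity-based clocking (invented numbers,
# not Sony's implementation). Frequency depends only on the workload's
# estimated power draw, never on temperature, so it is deterministic.
POWER_BUDGET_W = 200.0   # assumed total SoC power budget
F_MAX_GHZ = 2.23         # PS5's stated GPU frequency cap

def gpu_clock(activity_watts_at_fmax: float) -> float:
    """Pick the highest clock whose estimated power fits the budget.

    `activity_watts_at_fmax` is what the current workload would draw at
    the max clock. Dynamic power scales roughly with f * V^2, and V tends
    to scale with f, hence the cube-root scaling used here.
    """
    if activity_watts_at_fmax <= POWER_BUDGET_W:
        return F_MAX_GHZ
    return F_MAX_GHZ * (POWER_BUDGET_W / activity_watts_at_fmax) ** (1 / 3)

print(round(gpu_clock(180.0), 2))  # under budget: stays at 2.23
print(round(gpu_clock(220.0), 2))  # 10% over budget: ~2.16, a ~3% downclock
```

Note how a 10% power overshoot only costs a few percent of frequency, which is consistent with the "couple of percent" drop Cerny described.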

As it's becoming more and more difficult to add flops without adding price, it only makes sense to look for smarter solutions that extract maximum performance out of the available silicon.
 
From Dictator's post on Resetera, it seems developers will opt for more CPU or GPU power, and design accordingly.

As far as I understand, AMD SmartShift is handled automatically at the hardware level, though... actually, reading about it, it seemed somewhat pointless for a console. On a PC you are running gaming and non-gaming applications, so shifting power between the CPU and GPU makes sense... but PS5 is a dedicated gaming device.
 
Probably not. It's entirely possible for game to not utilize cpu and gpu fully.
I think he's saying that as long as the CPU isn't running full tilt, the power is transferred over to the GPU.

He's wondering if it's running at 100W max all the time, because whatever the CPU isn't using, the GPU is.
If the CPU starts demanding more and more, the GPU clocks will need to decrease to account for the power heading towards the CPU.

Or

Is it running below 100W, with some additional power to give to the CPU before it hits max load, after which it needs to start throttling the GPU?
 
I think he's saying that as long as the CPU isn't running full tilt, the power is transferred over to the GPU.

He's wondering if it's running at 100W max all the time, because whatever the CPU isn't using, the GPU is.
If the CPU starts demanding more and more, the GPU clocks will need to decrease to account for the power heading towards the CPU.

Or

Is it running below 100W, with some additional power to give to the CPU before it hits max load, after which it needs to start throttling the GPU?

I understood it so that the CPU sucking "excessive" power would be more related to AVX2/AVX-512 (or whatever the AMD name for it is) instructions, which typically draw power like crazy. However, not every engine uses those instructions, and even the ones that do don't use them all the time. Cerny specifically mentioned AVX as power hungry. Typically PC CPUs downclock themselves heavily when those instructions are used.
 
I think he's saying that as long as the CPU isn't running full tilt, the power is transferred over to the GPU.

He's wondering if it's running at 100W max all the time, because whatever the CPU isn't using, the GPU is.
If the CPU starts demanding more and more, the GPU clocks will need to decrease to account for the power heading towards the CPU.

Or

Is it running below 100W, with some additional power to give to the CPU before it hits max load, after which it needs to start throttling the GPU?

AMD SmartShift is an additional feature.

More generally, I'm saying that if PS5 is running at a constant (max) power, then the cooling also has to, no?

Unless game engines have varying effects on the silicon at the same power level...
 
AMD SmartShift is an additional feature.

More generally, I'm saying that if PS5 is running at a constant (max) power, then the cooling also has to, no?

Unless game engines have varying effects on the silicon at the same power level...
The GPU and CPU have frequency caps.

If designed correctly, it shouldn't have to run at max power for both the CPU and GPU to hit their caps. There has to be some power room available to ramp up a bit before hitting max, and then it begins throttling frequency as required.

That being said, the fan should follow accordingly.
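One way to picture that "power room" is a fixed budget split between CPU and GPU, SmartShift-style. All the numbers here are invented for illustration; neither Sony nor AMD has published the real figures:

```python
# Hypothetical SmartShift-style split of a fixed SoC power budget
# between CPU and GPU. All wattages are made up for illustration.
TOTAL_BUDGET_W = 200.0
CPU_CAP_W = 60.0   # power the CPU draws at its frequency cap
GPU_CAP_W = 160.0  # power the GPU draws at its frequency cap

def split_budget(cpu_demand_w: float) -> tuple[float, float]:
    """Give the CPU what it asks for (up to its cap); the GPU gets the
    remainder, clamped to what it can actually use at max clocks."""
    cpu_w = min(cpu_demand_w, CPU_CAP_W)
    gpu_w = min(TOTAL_BUDGET_W - cpu_w, GPU_CAP_W)
    return cpu_w, gpu_w

# Light CPU load: the GPU can take its full 160 W and stay at max clock.
print(split_budget(35.0))   # (35.0, 160.0)
# Heavy AVX load: the CPU takes its 60 W, leaving 140 W, so the GPU downclocks.
print(split_budget(60.0))   # (60.0, 140.0)
```

In this picture the system never exceeds the total budget, yet it also doesn't always sit at it: with a light CPU load there is headroom below `TOTAL_BUDGET_W`, which is exactly the "power room" being described.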
 
I'm not worried about the clocks dropping at all. Cerny made perfect sense, and it would only really be a problem towards the very end of the machine's life, when developers are really pushing the silicon.

He specifically mentioned the 256-bit instructions on Zen generating a lot of heat and power draw when used, but he also said that while they might not be used much, they test with them fully used to give themselves a worst case.

Xbox One to Xbox One S was a 7.5% increase in GPU clocks, and in GPU-limited cases the difference is barely a single frame. And that 7.5% is higher than the "couple of percent" Cerny has claimed PS5 will drop by.

That's before we even know how well developers can actually keep Series X's CUs fed with data; Cerny is correct that it's easier to keep fewer but faster CUs busy than it is to keep a wider GPU busy.

And then there's the whole back end potentially being faster on PS5, which could make actually using those 10 TFLOPS easier.

AMD GPUs have never scaled very well with increased CU count, and there's plenty of data from the PC platform to back that up.

Gamers Nexus tested Vega 56 and Vega 64 at the same clock speed; Vega 64 has a 14% advantage in shader count, but the testing showed the real-world difference was less than 1%.

The same behavior can be seen in Fury X vs Fury, 290X vs 290... heck, all the way back to the 7970 vs 7950.

AMD GPUs scale better with frequency than with CU count, which is what Cerny was saying in the presentation; the real-world gap between PS5 and Series X will not be that big.
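The shader-count arithmetic behind that comparison can be checked with public spec-sheet numbers; the clock value below is just an illustrative matched test clock, and real-world scaling is exactly what the cited testing disputes:

```python
# Back-of-the-envelope check of the Vega 56 vs Vega 64 comparison,
# using public shader counts (3584 vs 4096) at a matched clock.
def tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 throughput: shaders * 2 ops/clock (FMA) * clock."""
    return shaders * 2 * clock_ghz / 1000

CLOCK_GHZ = 1.59  # illustrative matched clock for both cards
vega56 = tflops(3584, CLOCK_GHZ)
vega64 = tflops(4096, CLOCK_GHZ)
print(f"paper advantage: {vega64 / vega56 - 1:.1%}")  # 14.3% more shaders
```

On paper Vega 64 should be about 14% faster at matched clocks, yet the testing reportedly saw under 1%: peak TFLOPS only materialise if the extra CUs can actually be kept fed, which is the heart of the narrow-and-fast vs wide-and-slow argument.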

I just need them to announce PS2 backwards compatibility and I'll be a happy bunny.
 