Playstation 5 [PS5] [Release November 12 2020]

He did by implication. He implied that doing so could save 10% power. If it’s jumping that much in dynamic consumption, something bizarre is going on. They probably have a bit of hysteresis to prevent motorboating.
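A toy sketch of that hysteresis idea (class name, thresholds, and power numbers are all invented for illustration): throttle above one power threshold, but only un-throttle below a lower one, so readings bouncing around inside the band don't flip the state back and forth.

```python
class HysteresisGovernor:
    """Throttle above HIGH_W; un-throttle only once power falls below LOW_W."""

    HIGH_W = 200.0  # hypothetical throttle-on threshold (watts)
    LOW_W = 180.0   # hypothetical throttle-off threshold (watts)

    def __init__(self):
        self.throttled = False

    def step(self, power_w):
        # The gap between HIGH_W and LOW_W is the hysteresis band:
        # readings inside it leave the current state unchanged.
        if self.throttled and power_w < self.LOW_W:
            self.throttled = False
        elif not self.throttled and power_w > self.HIGH_W:
            self.throttled = True
        return self.throttled

gov = HysteresisGovernor()
# Readings hovering near 190W between the thresholds don't toggle
# the throttle on every sample, which is the "motorboating" cure.
states = [gov.step(p) for p in [150, 210, 195, 185, 205, 175, 190]]
print(states)  # [False, True, True, True, True, False, False]
```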

That's exactly talking around it.

What level of utilization starts to push it into the throttle zone? That's what we want to find out. Forgive me if I don't just take his implication as gospel.
 
He can’t give concrete limits because games will find a way to push those bounds over a 7 year cycle.
 

He absolutely could have given better information than he did without giving concrete limits.

How far can they push it? Does the thermal limit give them more to work with over years or less?
 
The level of detail you’re asking for is beyond the scope of that presentation IMO.
 
IIRC Cerny said something along the lines of: it doesn't matter where you live, the PS5 won't throttle due to heat.
I don't think that can be taken to extremes. The manual already specifies an operating range, along with instructions like avoiding direct sunlight or extreme humidity.
What about my scenario where the PS5 is running in an oven?

1. Sounds like a jet taking off as it gets hotter
2. Shows an overheat warning telling you to close the game
1) I don't think there was a condition that the fan wouldn't spin up, so that could happen. If the console is put in an inappropriate ambient temperature and/or in direct sunlight, it could be possible to raise the temperature such that no amount of airflow would sufficiently dissipate heat: "what if I replaced the cooling fan with a high-temp hair dryer?"
The Dualshock controller has a temperature range where it is considered safe to operate, why would the console be required to operate normally at temperatures that would endanger everything plugged into it?

2) If an overheat warning condition is met, and the current gen consoles have such states, I don't see why the console would be obligated to keep boosting. Hopefully the warning message's simple geometry doesn't cause the frame rate to boost to 1000 fps.


He did by implication. He implied that doing so could save 10% power. If it’s jumping that much in dynamic consumption, something bizarre is going on. They probably have a bit of hysteresis to prevent motorboating.
What is duty cycling but that on a nanosecond scale...?
More seriously, if the method is constant-power, then a spike in demand could trigger clock stretching once Vdroop was detected. Enough events of that type could feed back to the microcontroller about what clocks are sustainable.
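For illustration only, a rough model of that feedback loop (the droop threshold, stretch factor, and adjustment step are all invented): count droop events in a sampling window, compute the effective clock after stretching those cycles, and nudge the sustained target down when droops pile up.

```python
VDROOP_THRESHOLD = 0.95  # volts; hypothetical droop-detect level
STRETCH_FACTOR = 0.5     # hypothetical: halve the clock during a droop

def run_window(vdd_samples, clock_mhz):
    """Return (effective_mhz, droop_events) over one sample window."""
    droops = sum(1 for v in vdd_samples if v < VDROOP_THRESHOLD)
    ok = len(vdd_samples) - droops
    # Stretched cycles contribute only a fraction of the nominal clock.
    effective = clock_mhz * (ok + droops * STRETCH_FACTOR) / len(vdd_samples)
    return effective, droops

def adjust_target(clock_mhz, droops, step_mhz=25):
    """Microcontroller feedback: too many droop events lowers the target."""
    return clock_mhz - step_mhz if droops > 2 else clock_mhz

vdd = [1.0, 1.0, 0.93, 1.0, 0.92, 0.94, 1.0, 1.0]
eff, n = run_window(vdd, 2230)       # 3 droops -> effective 1811.875 MHz
new_target = adjust_target(2230, n)  # fed back as a lower sustained clock
```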

It’s not as if MS didn’t recognize the need for a dynamic power envelope. That’s why they specified a different clock rate for SMT mode. I’d argue Sony’s method is the more elegant, but they also should have given developers the choice on SMT. Perhaps that’s too hard of a runtime switch to make...

I commented before about whether the Series X could flip a switch per-game because it could treat things like spinning up a VM instance. If the multi-OS organization from the Xbox One persists, then the game is a newly launched guest OS, and instances can be set up with multithreading or without. I'm not sure if that means the CPUs themselves have control registers set to disable multithreading, or the hypervisor just exposes one thread per core and lets the other sleep.

If Sony's OS organization remains, it's all one instance and like on a PC it might not be possible to change without a reboot. Sony may have also evaluated the solution but decided it was too difficult for them to implement for the possible return. Microsoft's interview stated the expectation that SMT mode would take over after the first cross-gen games with poorer scaling are replaced, so Sony may have projected something similar and decided it wasn't worth it.

Also, maybe this is something they could add later if they really felt Microsoft added value with the option? It wouldn't be the first feature recent consoles have added after the fact with an update, perhaps after some delay going by Sony's history with updating the PS4's features.
 
What level of utilization starts to push it into the throttle zone? That's what we want to find out. Forgive me if I don't just take his implication as gospel.

I think assuming it's about "utilization" is a mistake. I think they're more worried about runaway thermals like you used to get with FurMark. Cerny's example calls out the Horizon map screen. The problem there isn't that the GPU is being stressed by over-utilization, as such.
 

It's clear there are conditions which will cause the GPU clock to throttle. Do you think those conditions will only occur with FurMark? Or is that just where the throttling would get extreme (more than a few %)?
 
The problem is, unlike on a PC, where boost frequencies work based on temperature and so are helped by better cooling solutions, the PS5 has its clock targets fixed based on power draw. So even if you were to liquid cool it or something it would still downclock itself the same as it would using the stock fan. The only difference would be the actual temp the chip is running at.

Cooling affects power consumption, no? The better my Vega is cooled, the less power it draws (at a constant frequency).
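That's the usual intuition: at a fixed clock, total power is dynamic switching power plus static leakage, and leakage grows roughly exponentially with die temperature. A toy model (every constant here is invented) shows better cooling lowering total draw at the same frequency:

```python
import math

def total_power_w(freq_ghz, temp_c, c_eff=0.05, vdd=1.0, leak0=10.0, k=0.02):
    """Toy power model: C*V^2*f dynamic term plus temperature-driven leakage."""
    dynamic = c_eff * vdd**2 * freq_ghz * 1000   # same at both temperatures
    leakage = leak0 * math.exp(k * (temp_c - 25))  # grows with die temp
    return dynamic + leakage

hot = total_power_w(2.0, 95)   # stock cooling, hotter die
cool = total_power_w(2.0, 55)  # better cooling, identical clock
# cool < hot: the only difference is the leakage term.
```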
 
What about my scenario where the PS5 is running in an oven?
Assuming it didn't have an issue with the 2.4GHz waves spewed by the microwave, I suspect it would simply shut itself off.

And refuse to turn on if it's already / still hot.

(That's also the behavior of the PS4 and PS4 Pro.)
 
If Sony's OS organization remains, it's all one instance and like on a PC it might not be possible to change without a reboot. Sony may have also evaluated the solution but decided it was too difficult for them to implement for the possible return. Microsoft's interview stated the expectation that SMT mode would take over after the first cross-gen games with poorer scaling are replaced, so Sony may have projected something similar and decided it wasn't worth it.

Also, maybe this is something they could add later if they really felt Microsoft added value with the option? It wouldn't be the first feature recent consoles have added after the fact with an update, perhaps after some delay going by Sony's history with updating the PS4's features.

Could it be the case, for example, that one CPU core has dedicated lanes to the memory controller/DDR4 for the OS?

And is there any difficulty in keeping one CPU core in SMT mode at a lower clockspeed, while the other seven cores run without SMT at a higher clockspeed?
 
The curious part of Sony's strategy from a developer's point of view is how the dynamic clocks will actually be managed. Perhaps there is some benchmark that looks at the worst-case utilization for your specific title, from which you're given a specific power profile that keeps clocks constant for that game. Or the clock may indeed be variable while the game is actually running, which might make certain game systems more challenging to synchronize.

Why is it so difficult for you to understand that Sony went through all this effort and made these innovations to make sure performance under load and thermal conditions is as predictable for developers as possible?

Always start from the assumption that, given the current situation, an innovation was created to solve a problem. There was a question, in other words, that needed an answer. Before you start criticizing the answer to the problem, please make sure you understood the question!

Adhere to these principles and you will become a nicer and happier person in life, and also on this forum.

In this case it seems to me that the question was: given the way current systems handle overheating (jet-plane-like fans; developers having to take a large performance margin to prevent some users from having single-digit frame rates under load combined with high temperatures, with a high and unpredictable difference between one user and another; or hardware manufacturers having to use lower clock speeds so that this level of heat can't happen), how can we design a system that uses as high a clock as possible, combined with a cooling system that never gets too loud, varies as little as possible in performance, and behaves as predictably as possible regardless of ambient temperature?
 

Yeah, I'm a bit confused. Cerny said 2.23GHz most of the time with a worst-case "couple %" drop, but we have @Rockster saying it's going to be at 2GHz most of the time.

No disrespect @Rockster, but how do you understand how the PS5 runs better than Cerny does? What qualifications, experience, or job gives you your knowledge?
 
Cerny is not neutral in this; he has to sell a product. Some people doubt 2.23GHz because it's above everything we know.

But to stand on stage and blatantly lie? Besides, this is new tech, so we don't know its limits. I guess Cerny must be lying about the SSD too, because it's faster than anything we know?
 
Cerny is not neutral in this; he has to sell a product. Some people doubt 2.23GHz because it's above everything we know.
Microsoft is running a bigger part than the 5700 XT (which has a 225W TBP) at the same clocks. It's almost like the 5700 XT may not be a great reference point anymore.
 
But to stand on stage and blatantly lie? Besides, this is new tech, so we don't know its limits. I guess Cerny must be lying about the SSD too, because it's faster than anything we know?

Get more lines/channels and you will have faster SSDs, so it's not hard to believe. A 2.23GHz GPU in a mass-produced console is harder to believe.

For the record, I believe it. I'm just saying I understand people who don't, especially without an RDNA 2 product available yet (and RDNA not being that efficient).
 
So basically, how I see it: they have a fixed TDP/cooling budget, and they capped the APU at the limit of this budget. When GPU/CPU operations strain the APU more, frequency is dropped so it doesn't pass the TDP/cooling limit. Pretty smart: you eke out more perf from the chip and get a nicer spec sheet than you would have with fixed clocks.

Although, in Cerny's own words, "we expect the GPU to spend most of its time at or close to this frequency."

What "most" means (51% fits it, for example) and what "at or close to" means (I would say both 2200 and 2100 fit the bill) remains to be seen. Performance will tell the whole story, because there is no way we will ever find out the clocks at which the consoles actually run.

Microsoft is running a bigger part than the 5700 XT (which has a 225W TBP) at the same clocks.
I agree, although if you listen closely to what Cerny said, there is quite a bit of wiggle room where frequency might end up. We might see it at 2230MHz in cut scenes, then 2200MHz in exploration, or 2100MHz in heavy fighting, perhaps even lower than 2000MHz in some corner cases ("original unreachable target"), yet it would all fit with "close to and most of the time".
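The capped-budget behavior described above could be sketched as a governor that picks the highest clock whose modeled power fits the fixed budget for the current activity level (the clock table, budget, and power model below are all invented for illustration):

```python
BUDGET_W = 200.0                          # hypothetical fixed power budget
CLOCKS_MHZ = [2230, 2200, 2100, 2000, 1900]  # hypothetical clock steps

def modeled_power(clock_mhz, activity):
    """Toy model: power scales with clock and per-cycle activity."""
    return activity * 0.1 * clock_mhz     # invented coefficient

def pick_clock(activity):
    for clk in CLOCKS_MHZ:                # try highest clock first
        if modeled_power(clk, activity) <= BUDGET_W:
            return clk
    return CLOCKS_MHZ[-1]                 # floor at the lowest step

light = pick_clock(0.85)  # light scene fits the budget at max clock: 2230
heavy = pick_clock(0.98)  # heavy scene forces a lower step: 2000
```

With numbers like these, quiet scenes sit at the cap while demanding ones drop a step or two, matching the "at or close to, most of the time" wording.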
 
Get more lines/channels and you will have faster SSDs, so it's not hard to believe. A 2.23GHz GPU in a mass-produced console is harder to believe.

For the record, I believe it. I'm just saying I understand people who don't, especially without an RDNA 2 product available yet (and RDNA not being that efficient).
The RDNA 2 mass-produced products are the PS5 and Series X: a great way for AMD to enjoy economies of scale (or scope) before it becomes available to everyone in the PC space. Sony made customizations on a technology that partly exists and partly is in the making. There is nothing strange about that. We just need the details to understand how effective these customizations were.
 
So basically, how I see it: they have a fixed TDP/cooling budget, and they capped the APU at the limit of this budget. When GPU/CPU operations strain the APU more, frequency is dropped so it doesn't pass the TDP/cooling limit. Pretty smart: you eke out more perf from the chip and get a nicer spec sheet than you would have with fixed clocks.

Although, in Cerny's own words, "we expect the GPU to spend most of its time at or close to this frequency."

What "most" means (51% fits it, for example) and what "at or close to" means (I would say both 2200 and 2100 fit the bill) remains to be seen. Performance will tell the whole story, because there is no way we will ever find out the clocks at which the consoles actually run.
I don't think Cerny would have used such PR wording without caring about the useful performance for devs. He is primarily an architect who wants a friendly and useful ecosystem for developers, and much less (if at all) a marketing guy in a suit who wants to tell us what we want to hear. I doubt Cerny would have allowed such a discrepancy between useful performance and paper numbers.
 
But there is no discrepancy in this. If the GPU runs 30% of the time at max frequency, 50% of the time within 2-3% of the limit, and 20% of the time around 2GHz, nobody will notice it, nor would he be lying. Technically, the GPU would be spending most of its time at or near that frequency.

The entire conference was a PR exercise as much as it was a technical deep dive. You don't expect Sony to just pass up the chance to show their system in a better light after the XSX showed theirs?
 