Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

The 5700XT is RDNA 1.0.

Cerny said clocks are deterministic, so if a developer wants the full 10.28 TF all the time at 2.23 GHz, it will be achieved. He specifically said he wants the same performance across all consoles sold.
That's not what I got from his statement. Wanting the same performance is based on not varying the base and boost clock speeds for individual chips. All chips will meet the same standard of performance regardless of the fact that some chips may be able to achieve higher frequencies. I could be wrong, but that's how I remember interpreting it.
 
That's not what I got from his statement. Wanting the same performance is based on not varying the base and boost clock speeds for individual chips. All chips will meet the same standard of performance regardless of the fact that some chips may be able to achieve higher frequencies. I could be wrong, but that's how I remember interpreting it.

No.

His actual quote: "It wouldn't make sense to run the console slower because it was in a hot room, so rather than looking at the actual temp, we look at the activities that the CPU and GPU are performing and set the frequencies on that basis, which makes everything deterministic and repeatable".

36:40 of the Road to PS5 video. If a developer designs a game to run at 2.23 GHz all the time, the console will hit those frequencies all the time.
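To make the "deterministic and repeatable" point concrete, here is a toy sketch of activity-based clocking as I understand the idea; every number and function name in it is invented for illustration, nothing here is from Sony:

```python
# Toy illustration of activity-based (rather than temperature-based) clocking.
# All numbers and names here are invented; the point is that the same workload
# always maps to the same frequency, regardless of room temperature or
# chip-to-chip variation.

POWER_BUDGET_W = 200.0   # hypothetical fixed SoC power budget
F_MAX_GHZ = 2.23         # advertised GPU frequency cap

def estimated_power(activity: float, freq_ghz: float) -> float:
    """Model power from workload intensity (0..1) and clock.
    Dynamic power ~ f * V^2; with voltage assumed to track frequency,
    it scales roughly with the cube of the clock."""
    return 60.0 + 160.0 * activity * (freq_ghz / F_MAX_GHZ) ** 3

def pick_frequency(activity: float) -> float:
    """Highest clock whose modelled power fits inside the fixed budget."""
    freq = F_MAX_GHZ
    while estimated_power(activity, freq) > POWER_BUDGET_W and freq > 0.5:
        freq -= 0.01   # step down until the model says we fit
    return round(freq, 2)

print(pick_frequency(0.6))   # typical load -> 2.23 (full clock)
print(pick_frequency(1.0))   # worst-case "power virus" load -> small drop
```

Contrast that with thermal boost on PC GPUs, where the clock you actually get depends on the cooler, the case and the ambient temperature.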
 
That's the problem: what is "most of the time"? I know the majority of time spent on my console is probably watching Hulu/Netflix :LOL:. There's also a ton of very popular games right now that aren't exactly taxing for consoles to run. This is a pretty safe statement and it leaves a lot of room for interpretation, in my opinion. Based on this statement, if you want to believe PS5 clocks run near max all the time, you can. Based on the same statement, you can also decide to believe "the majority of time" means they can't in the most demanding games. That's why we need more information.
Yes, this reinforces my point. Very muddy waters, yet we're claiming the PS5 will only hit these clocks "under light loads" as if the technology has already been fully disclosed to us. We don't know. We're going to have to wait.
 
The 5700XT and the entire RDNA 1 architecture are irrelevant now; you can't use them as the basis for estimating power consumption at a given clock or CU count. I have been saying that a GPU with 52 CUs @ 1825 MHz would consume somewhere around 280 W to 300 W with RDNA 1, which remains true. The problem is that the consoles use RDNA 2, which is more power efficient than RDNA 1 (anywhere from 30% to 50%). We now have no real ground to stand on in estimating the power consumption of console GPUs; RDNA 2 remains uncharted territory.
This is irrespective of architecture; other cards (Nvidia and AMD) all show similar results. Makes sense, as there is little reason why the peak wattage of RDNA 2 would suddenly be 2x/3x higher than RDNA 1 (or Nvidia cards, for that matter).
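Just to put rough numbers on that uncertainty: if you take the 280-300 W RDNA 1 estimate above and assume the whole claimed 30-50% perf/W gain is spent on cutting power at the same performance (a big simplification, not a claim about the real SoC), you get something like this:

```python
# Back-of-the-envelope only: what the quoted 30-50% perf/W improvement would
# mean for the 280-300 W RDNA 1 estimate above, if the whole gain were spent
# on lowering power at the same performance.

rdna1_estimate_w = (280.0, 300.0)   # 52 CUs @ 1825 MHz, RDNA 1 (estimate above)
perf_per_watt_gain = (0.30, 0.50)   # claimed RDNA 2 improvement range

for gain in perf_per_watt_gain:
    low = rdna1_estimate_w[0] / (1.0 + gain)
    high = rdna1_estimate_w[1] / (1.0 + gain)
    print(f"+{gain:.0%} perf/W -> roughly {low:.0f}-{high:.0f} W at iso-performance")
# +30% perf/W -> roughly 215-231 W at iso-performance
# +50% perf/W -> roughly 187-200 W at iso-performance
```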

The 5700XT is RDNA 1.0.

Cerny said clocks are deterministic, so if a developer wants the full 10.28 TF all the time at 2.23 GHz, it will be achieved. He specifically said he wants the same performance across all consoles sold.
But he didn't say that. At all.
 
No.

His actual quote: "It wouldn't make sense to run the console slower because it was in a hot room, so rather than looking at the actual temp, we look at the activities that the CPU and GPU are performing and set the frequencies on that basis, which makes everything deterministic and repeatable".

36:40 of the Road to PS5 video. If a developer designs a game to run at 2.23 GHz all the time, the console will hit those frequencies all the time.
No, this is only in reference to boost being based on TDP/heat, which it is not. It's based on a power draw limit.

For activities which hit the power draw ceiling, frequency will drop. If the GPU runs harder and consumes more energy, frequency will drop. Simple as that.

Will developers have to manage it, or will they pick a GPU/CPU profile? We don't know. My point about the 5700 XT (or take the 2070 Super) is that average power draw and peak power draw are not that far apart, around 9-10 W.
 
No, this is only in reference to boost being based on TDP/heat, which it is not. It's based on a power draw limit.

For activities which hit the power draw ceiling, frequency will drop. If the GPU runs harder and consumes more energy, frequency will drop. Simple as that.

Will developers have to manage it, or will they pick a GPU/CPU profile? We don't know. My point about the 5700 XT (or take the 2070 Super) is that average power draw and peak power draw are not that far apart, around 9-10 W.

"Deterministic and repeatable" based on "CPU and GPU activities". I think it is very clear, a developer can design a game to hit 2.23 GHz the whole time.

And again, Cerny also said that a "2% frequency drop, allows for a 10% in power consumption".

These are statements that some people want to dismiss over and over again.

There is not going to be a substantial drop in performance, doesn't matter what anyone wishes are.
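For what it's worth, that claim is at least plausible with the usual dynamic-power approximation P ≈ C·f·V², provided the small clock drop also allows a voltage drop; the ~4% voltage figure below is my assumption for illustration, not something Cerny stated:

```python
# Rough sanity check of the "couple of percent of frequency buys ~10% power"
# line, using the standard dynamic-power approximation P ~ C * f * V^2.
# The ~4% voltage reduction is an assumption chosen for illustration;
# Cerny gave no voltage figures.

f_drop = 0.02   # 2% lower clock
v_drop = 0.04   # assumed voltage reduction enabled by the lower clock

relative_power = (1 - f_drop) * (1 - v_drop) ** 2
print(f"relative dynamic power: {relative_power:.3f}")   # ~0.903, i.e. ~10% less

# Frequency alone (voltage held constant) would only buy ~2%:
print(f"frequency only: {1 - f_drop:.3f}")               # 0.980
```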
 
Cerny said clocks are deterministic, so if a developer wants the full 10.28 TF all the time at 2.23 GHz, it will be achieved. He specifically said he wants the same performance across all consoles sold.
It will be achieved if the developer limits the game to loads in which 10.28 TF can be achieved.
It's not a case where the developer just gets to decide what clocks to run at; it's a case where they need to make sure the load they're putting on the system stays within certain power limits at all times in order to have certain clocks at all times.
 
Is there a transcript available for the Road to PS5 video? Found this at the 37 minute mark. Cerny: "running a GPU at a fixed 2Ghz target was looking unreachable with old fixed frequency." He's talking about AMD SmartShift here. It seems pretty clear that in order for the GPU to maintain 2.23 GHz, that power will have to come from decreasing the clock of the CPU. So again, as a lot of people have been saying, it depends on what developers want for a balance between CPU and GPU performance. It may be able to maintain that GPU clock speed, but we don't know what that's going to do to the CPU clock, as we don't know the base frequencies. Sony only mentions "up to" for both CPU and GPU frequencies. Cerny also mentions "running the CPU at 3Ghz was causing headaches with the old strategy", meaning fixed clocks. So it sounds to me like Sony was unable to run the GPU at a fixed 2.0 GHz while also having a fixed CPU clock of 3.0 GHz. That means that for the GPU to reach its max frequency, the CPU will be running somewhere below 3 GHz.
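For anyone unfamiliar with SmartShift, here is a toy sketch of the shared-budget idea as I understand it; the wattage figures and the split policy are invented for illustration, not anything AMD or Sony have disclosed:

```python
# Toy model of a SmartShift-style shared power budget (all wattages invented).
# The idea: CPU and GPU draw from one pool, so unused CPU headroom can be
# handed to the GPU, and a maxed-out GPU can force the CPU to give power back.

TOTAL_BUDGET_W = 200.0   # hypothetical combined CPU+GPU budget
CPU_MAX_W = 60.0         # hypothetical CPU draw at 3.5 GHz, full load
GPU_MAX_W = 170.0        # hypothetical GPU draw at 2.23 GHz, full load

def split_budget(cpu_demand: float, gpu_demand: float) -> tuple:
    """Demands are 0..1 utilisation; returns (cpu_watts, gpu_watts)."""
    cpu_w = CPU_MAX_W * cpu_demand
    gpu_w = GPU_MAX_W * gpu_demand
    over = cpu_w + gpu_w - TOTAL_BUDGET_W
    if over > 0:
        # Something has to give; this policy lets the GPU keep its share and
        # claws the excess back from the CPU (a choice for illustration only).
        cpu_w = max(cpu_w - over, 0.0)
    return cpu_w, gpu_w

print(split_budget(0.5, 1.0))   # light CPU work: (30.0, 170.0) -> both fit
print(split_budget(1.0, 1.0))   # both maxed: (30.0, 170.0) -> CPU gives up 30 W
```

Whether the real policy favours the GPU or the CPU when both are maxed out is exactly the unknown being argued about here.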
 
"Deterministic and repeatable" based on "CPU and GPU activities". I think it is very clear, a developer can design a game to hit 2.23 GHz the whole time.

And again, Cerny also said that a "2% frequency drop, allows for a 10% in power consumption".

These are statements that some people want to dismiss over and over again.

There is not going to be a substantial drop in performance, doesn't matter what anyone wishes are.
No, again, you are misquoting him. If developers could design a game locked at 2.23 GHz, and by Cerny's own admission a couple of percent (2-3%, I guess) saves 10% of TDP, why did he say they couldn't hit the 2.0 GHz target with the old way of doing things (so, no variable frequency)?

And note that 2.0 GHz is more than a 10% reduction in clocks, not 2%, so the gain in TDP should be rather substantial.

He said:

"We expect GPU to spend most of its time at or close to that frequency"

He further said:

"Similarly running CPU at 3.0GHz was causing headaches with the old strategy, but now we can run it as high as 3.5GHz. In fact it spends most of its time at that frequency"

Most - how much is it?
Close to - how close is close?
Expect - ?

This is not a "PS5 has HW RT in the GPU" kind of thing. Cerny has been vague with this and made no commitment on actual numbers, bar max frequency.

It's not about what people wish, it's about people discussing and speculating on actual results once the consoles are out. As it stands, no hard data has been given, therefore we will have to wait and see.
 
Is there a transcript available for the Road to PS5 video? Found this at the 37 minute mark. Cerny: "running a GPU at a fixed 2Ghz target was looking unreachable with old fixed frequency." He's talking about AMD SmartShift here. It seems pretty clear that in order for the GPU to maintain 2.23 GHz, that power will have to come from decreasing the clock of the CPU. So again, as a lot of people have been saying, it depends on what developers want for a balance between CPU and GPU performance. It may be able to maintain that GPU clock speed, but we don't know what that's going to do to the CPU clock, as we don't know the base frequencies. Sony only mentions "up to" for both CPU and GPU frequencies.
There's no actual transcription application you can just readily use on YouTube. Most of the time you have to do it manually. Actual AI transcribers from audio to text cost money per minute of transcription. If you have your own trained model that can do it, then you're in luck. *Correction: there is transcribing on YouTube.

He did say that. Yes, it wouldn't be able to reach 2.0 GHz without SmartShift.

And he's right, MS did not pass 1.8 GHz.
 
No, again, you are misquoting him. If developers could design a game locked at 2.23 GHz, and by Cerny's own admission a couple of percent (2-3%, I guess) saves 10% of TDP, why did he say they couldn't hit the 2.0 GHz target with the old way of doing things (so, no variable frequency)?

And note that 2.0 GHz is more than a 10% reduction in clocks, not 2%, so the gain in TDP should be rather substantial.
I'm unsure what you mean here; he said (and I quote it below) that 'a couple percent reduction in frequency reduces power by 10%'.

He said:

"We expect GPU to spend most of its time at or close to that frequency"

This is a genuine question: why do you not quote the bit where he said "When that worst case game arrives, it will run at a lower clock speed. But not too much lower, to reduce power by 10 per cent it only takes a couple of percent reduction in frequency, so I'd expect any downclocking to be pretty minor"?
 
He did say that. Yes, it wouldn't be able to reach 2.0 GHz without SmartShift.

And he's right, MS did not pass 1.8 GHz.

It looks like Sony was originally targeting fixed clocks for both CPU and GPU (2 GHz GPU / 3 GHz CPU) and it was not attainable, or at least was causing some kind of issues. This leads me to believe that if developers target anywhere near max frequencies for the PS5's GPU, the XBSX is going to have a pretty massive CPU advantage. This also reinforces why MS went with a larger enclosure than normal for the XBSX. And it leads me to believe that if Sony wants to run their CPU at 3.5 GHz, their GPU clock is going to have to run at a frequency very close to the XBSX GPU clock speed.
 
Cerny explains concepts, and uses examples and real-world cases. Some of you guys seem to take the examples as if they are the claims, but they are just ways to explain why they think a deterministic variable clock is better than a fixed clock. The reasons given are ultimately all about the cost/performance of the entire system. The limitations of fixed clocks are presented as either unused or borderline engineering margins, because of a lack of real-world data (and of data about future workloads) until it's too late. The example of the launch revision of the Pro was about predicting the required engineering margins, which can only be guesswork. Guessing in engineering is really bad and has caused a lot of problems in game consoles for over a decade, ever since we've had 150 W-200 W consoles.

The cooling is going to indicate what average power we can expect, and the games' real-world benchmarks will tell us how it performs.

I expect the cooling system size will be counter-intuitive, and I hope they will show it without giving any wattage figures so we can have some fun. ;)
 
No, again, you are misquoting him. If developers could design a game locked at 2.23 GHz, and by Cerny's own admission a couple of percent (2-3%, I guess) saves 10% of TDP, why did he say they couldn't hit the 2.0 GHz target with the old way of doing things (so, no variable frequency)?

And note that 2.0 GHz is more than a 10% reduction in clocks, not 2%, so the gain in TDP should be rather substantial.

He said:

"We expect GPU to spend most of its time at or close to that frequency"

He further said:

"Similarly running CPU at 3.0GHz was causing headaches with the old strategy, but now we can run it as high as 3.5GHz. In fact it spends most of its time at that frequency"

Most - how much is it?
Close to - how close is close?
Expect - ?

This is not a "PS5 has HW RT in the GPU" kind of thing. Cerny has been vague with this and made no commitment on actual numbers, bar max frequency.

It's not about what people wish, it's about people discussing and speculating on actual results once the consoles are out. As it stands, no hard data has been given, therefore we will have to wait and see.
That sounds like, most of the time, it is hitting those clocks (or close to them) on both CPU and GPU simultaneously.
 
Is it typical for a modern console game to always maintain 100% CPU and GPU load? Given the scene complexity and the difficulty of testing all possible cases, even if they are deterministic, I would think it wise to keep some margin except for things like in-engine cut scenes. There are cases where frame drops happen, but even in such cases it might be rare that both CPU and GPU are 100% utilized.
 
If the PS5's GPU spends most of the time at 9.2 TF then it would clearly contradict what Cerny is promising and be very disingenuous relative to the marketed 10.3 TF. I can only imagine the thundering uproar from the core community and media alike; it could give Sony or PlayStation a bad name for next gen, which is undesirable for the company. But if it hovers around 9.8-10 TF under heavy load then it's going to be totally fine. Also, CPU speed is going to be a non-issue at 4K or close to 4K res, so a slight downclock would go unnoticed during gameplay.
In a multiplatform comparison with the XSX rendering at native 4K, if the PS5 stays at 1800p most of the time then it must mean the GPU clock is heavily dropped and 9.2 TF is most likely the standard number, since there's a 44% pixel difference between 2160p and 1800p. If it stays at ~2000p then it would be less than a 20% pixel difference and Cerny would be correct after all. 2000p vs 2160p would be virtually indiscernible at a normal viewing distance, or even face to the screen lol; it would require hardcore magic from DF to tell the story.
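For reference, the arithmetic behind those figures, assuming the announced 36 CUs and 16:9 resolutions:

```python
# Quick arithmetic behind the figures above (36 CUs from the announced
# PS5 spec, 16:9 aspect ratio assumed for the "p" resolutions).

cus = 36
def teraflops(ghz):
    return cus * 64 * 2 * ghz / 1000   # CUs * 64 ALUs * 2 ops (FMA) per clock

print(teraflops(2.23))   # ~10.28 TF, the marketed figure
print(teraflops(2.00))   # ~9.22 TF, i.e. "9.2 TF" is the old fixed 2.0 GHz target

def pixels(height):
    return int(height * 16 / 9) * height

print(pixels(2160) / pixels(1800))   # ~1.44 -> 2160p has ~44% more pixels than 1800p
print(pixels(2160) / pixels(2000))   # ~1.17 -> under 20% more than ~2000p
```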
 
it would require hardcore magic from DF to tell the story.

Depends entirely on what devs do; maybe they use the extra power for higher settings, or have the option to, like on PC. We are already seeing more modes current gen.

Talking about magic, they must have done something very special (SmartShift?) to go from not being able to sustain 2 GHz / 3 GHz to 2.23 / 3.5 basically all the time. If SmartShift is that good, all other manufacturers, including MS, have missed a great opportunity here; they must have totally looked the other way when they had the tech available.
 