Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Status
Not open for further replies.
4) Sony has said that when the CPU and GPU aren't being fully utilized, they can run at up to 3.5 GHz and 2.23 GHz respectively, and that at higher loads they will run at lower (as yet undisclosed) clock speeds. The only sense we have of clock speeds in a hypothetical 100% load scenario across the entire SoC comes from Cerny's comments that 3 GHz on the CPU "was causing headaches" and 2 GHz on the GPU "was looking like an unreachable target". Since 100% utilization across the entire SoC is not practically possible, due to the inherent inefficiencies of real-world code, data sets, and compute requirements, Sony's expectation is that they will run at or near the max clocks much of the time.
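As a rough illustration of what a power-capped boost scheme like this could look like, here is a toy model: the chip picks the highest clock whose estimated power fits a fixed budget, instead of picking a clock and letting power vary. All numbers are invented for illustration; this is not Sony's actual algorithm.

```python
# Toy model of a power-capped boost clock (illustrative numbers only).
# The chip picks the highest clock whose estimated power fits the budget,
# rather than fixing a clock and letting power/temperature vary.

MAX_GPU_CLOCK_GHZ = 2.23
POWER_BUDGET_W = 200.0  # hypothetical SoC power budget

def boost_clock(activity: float) -> float:
    """Return the GPU clock for a given workload activity factor (0..1).

    Dynamic power scales roughly linearly with clock and with how much
    of the silicon the workload keeps busy, so power ~ k * activity * f.
    """
    k = 100.0  # hypothetical watts per GHz at 100% activity
    if activity <= 0:
        return MAX_GPU_CLOCK_GHZ
    # Highest clock that stays under budget, capped at the design maximum.
    return min(POWER_BUDGET_W / (k * activity), MAX_GPU_CLOCK_GHZ)

# Typical game load: plenty of headroom, clock pegs at max.
print(round(boost_clock(0.6), 3))   # 2.23
# Pathological "power virus" load: clock drops to hold the budget.
print(round(boost_clock(0.95), 3))  # ~2.105
```

The point of the model is that the clock only drops when the workload's activity is extreme enough to bust the budget, which matches the "at or near max clocks much of the time" framing.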

This may get me in trouble, but...

Apologies for the lack of technical knowledge on my part. Here is the problem I have with this "expectation"; I cannot speak for others. What happens in three years' time, when developers are really starting to push the consoles and that "load" goes up? Those instances where the PS4/Pro whined like a jet engine, or any other instance where the machine is stressed? When those games start showing up for next gen, and they will, what happens to those clocks, as compared to its competitor's? Or, I don't know the technical term, when the work performed exceeds the power profile and causes throttling? Plenty of examples running around for this gen. Cerny stated the expectation of "running at or near those clocks most of the time". (Inexactly paraphrased on my part, I am sure.)

I am not going to restart the debate on what "mostly" means. As mentioned, he also said they couldn't get the clocks to run at 3 and 2 consistently. Future games are going to stress the machine. Period. What exactly happens to those clocks? (I also have no intention of starting some lame TF debate. I know it is not the end-all be-all, but I also have no sympathy for a company that began using that term in their marketing back in the PS3 days to prove they had the most powerful console. Were the situation reversed, they most certainly would be touting it front and center. If the numbers don't go in your favor now, then "hoist by your own petard" is all I have to say to that particular marketing strategy.)

Maybe there is no telling what that future holds here. I am sure people developing for the PS5 will, of course, be taking all this into account. They will make it work whether the clocks are dynamic or chosen from a profile. Thing is, every previous generation gets taxed eventually; it's just a matter of what game(s) and when. Maybe what Sony has done keeps the noise profile down. What I want to know, and I doubt it can be answered anytime soon, is: when those games hit, what does the PS5 actually run at? You want to brag about your variable "continuous boost" clocks and the advantages provided? Best back that up with real-world examples from upcoming games and not vague statements. Both companies must have a good idea of what the games currently in development are going to be doing in 18 months' time. Three years' time? Some idea?

Bottom line: I have yet to read something, here or elsewhere, that allows me to buy what Cerny stated. (Sorry if that offends anyone, but this feels deliberately confusing.) In the short term - PS4 games? Cross-platform titles? Sure. No trouble believing that you will hit those clocks "most of the time", however you wish to define "most". But you want me to buy that the variable clock/continuous boost that just so happens to place you into the double-digit TFs is something more than what it sounds like? (So very convenient, that. I am too much of a cynic, especially when marketing gets involved, to buy that this is just coincidence. Possible? Sure. I wouldn't bet anything on it unless someone gave me very long odds.)

Is the XSX going to maintain its clocks under the same load while the PS5 has to throttle the hell out of its clocks? What happens with the PS5 under the nebulous "future loads", and what happens to the XSX under the same circumstances, is the question I would want answered. (Purely from a curiosity standpoint. I haven't seen anything from either company to date that would seriously affect any purchasing decision.)
 
You say you don't have technical knowledge, yet you claim you don't believe anything Cerny says, because... reasons?

Arguing clock speeds without any context of actual real-world game examples to test theories on is silly. Especially when it turns hostile.

The PS3 and 360 went into a power war, and both Sony and MS have had to recalibrate their messaging due to changing environments. It's nothing new. It doesn't change the fact that the PS5 and Series X will both be powerful, next-gen-worthy machines.

And I would say the same even if the PS5 had been fixed at a 2 GHz GPU and 3.0 GHz CPU.
 

Something to consider - in a couple of years when those games start to show up, Sony will have data on millions of PS5s. They’ll know their failure rates, how often they power limit, etc. They can make a reliability call and push a firmware update because they know a vast majority of consumer machines can tolerate the higher activity levels, and they’ll service the ones that don’t.
 

Interesting point. One I will certainly grant. It just doesn't really answer the question. If you are saying Sony cannot answer it right now (the nebulous future), I can accept that. Just frustrating.


When I state I am not technical, I mean not to the level of most of the posters around here. It doesn't mean I have no understanding. My reasons for my skepticism are clearly stated. What you are reading as hostility is more likely frustration. As stated previously, I have no doubt the PS5 will be a powerful machine and worthy of purchase. That is not my issue. After reading everything I can get my hands on to date, that presentation makes me feel "spun". As in, "put a positive spin on it".
 

Games typically get graphically "better" as the generation goes on, but does that mean they are more demanding power-wise? I haven't seen any data showing whether that's a trend throughout a generation.

I really think it is going to come down to the particular engine in question and how good the developer is.
 
They tend to utilise the hardware more over a generation, so yeah, more power hungry. Games like Doom Eternal are in the danger zone for CPU and GPU utilisation on consoles. I would not be surprised if there were already a couple of launch games for these systems that utilise very mature GPU pipelines with dynamic res and saturate GPUs very well, with few bubbles.
 
If saturation is the number of instructions being crunched per cycle, then yes, I would agree with this sentiment. If graphics keep improving while frequency and silicon stay the same, then the hardware must be crunching more instructions per cycle.

What's interesting in this debate over the PS5's variable clock: a downclock in frequency can coincide with more instructions per cycle. So even if the downclock is happening, the chip can be doing significantly more work per cycle.
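To put rough numbers on that idea (all figures invented for illustration): total throughput is clock times work per cycle, so a small downclock paired with better pipeline saturation can still mean more total work per second.

```python
# Illustrative arithmetic only: total throughput = clock * work-per-cycle,
# so a modest downclock can still mean more total work if utilization rises.

def throughput(clock_ghz: float, instructions_per_cycle: float) -> float:
    """Giga-instructions per second for a given clock and IPC."""
    return clock_ghz * instructions_per_cycle

early_gen = throughput(2.23, 0.60)  # high clock, bubbly pipelines
late_gen = throughput(2.10, 0.85)   # slight downclock, better saturation

print(round(early_gen, 3), round(late_gen, 3))  # 1.338 vs 1.785
```

The hypothetical late-gen workload does roughly 33% more work per second despite running about 6% slower, which is the scenario being described: heavier saturation is what forces the downclock, and it more than pays for itself.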
 
If that’s the case, then those are the types of workloads Sony is currently profiling with, which should increase the confidence in Cerny’s statement about the rarity of downclocking.
 
I am wondering if the raster advantage of the PS5 will be felt, or if games will become BW-limited well before that. The PS4 Pro's raster advantage of 51% never materialized.

It seems MS made the better bet when it comes to the TF/BW/pixel-fillrate ratio (well, at least in theory). The PS5 seems a bit more lopsided, like the Pro.

Without knowing RDNA2 BW usage, I am still worried about the BW numbers for both.
 
Trying to explain this and keeping it simple:
...
Hope I made it clear.
I'm familiar with overclocking and power/heat management. My concern is with the statements "we were having trouble running the GPU at 2.0" and "now we can run it near 2.23 most of the time". The same goes for the CPU. So either Cerny was referring to corner cases consuming too much power at a fixed 2.0 for a reasonable cooling solution, and the GPU will now drop below 2.0 when necessary, or running the GPU near 2.23 will mean a substantial CPU throttle. Something has to give. That would line up with Dictator's comments about devs choosing between high-GPU or high-CPU power profiles. Based on Cerny's problem statement, being able to run both the GPU and CPU on the high power profile doesn't make sense. Happy to take this offline going forward.
 
But the PS4 Pro had optimizations like Rapid Packed Math and checkerboard rendering that were absent on the Xbox One X. We know that the Series X supports math all the way down to 4-bit, and we can assume that the PS5 supports at least FP16 because of hardware backwards compatibility, and probably checkerboard rendering as well. But based on what happened this generation, checkerboarding seems to have been either ignored in favor of other reconstruction techniques, or perhaps used on PS4 while a comparable solution was used on the One X, making CBR less of an advantage. There doesn't seem to be that secret sauce in this upcoming generation that could help Sony level the playing field. In fact, Sony has made no mention of things like VRS, which could really swing performance in MS's favor if Sony doesn't have it, given that it's easy to implement.

I wonder if MS's support for multiple math precisions (int4/int8/FP16/FP32) is related to VRS. Most of the articles I've read explaining VRS in layman's terms basically use the example of shading in coarser chunks, 2x1 or 2x2 pixel blocks for example. But what if there were a simple "Enable VRS" option that calculates an object's size and velocity on screen and lowers its shading accuracy to match?
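The "Enable VRS" idea floated above could, in principle, look something like the heuristic below. This is a guess at how such a policy might behave, not any actual API; real VRS in D3D12 is driven per-draw, per-primitive, or via a screen-space rate image, and the thresholds here are invented.

```python
# Hypothetical VRS heuristic: pick a coarser shading rate for objects
# that are small on screen or moving fast, where lost detail is least
# visible. Rates mirror the 1x1 / 2x1 / 2x2 tiles used in VRS explainers.

def shading_rate(screen_area_px: int, velocity_px_per_frame: float) -> str:
    """Return a shading rate string for an object, coarser = cheaper."""
    if velocity_px_per_frame > 40 or screen_area_px < 1_000:
        return "2x2"  # shade once per 4-pixel block
    if velocity_px_per_frame > 15 or screen_area_px < 10_000:
        return "2x1"  # shade once per 2-pixel block
    return "1x1"      # full-rate shading for large, slow objects

print(shading_rate(200_000, 2.0))  # big, slow hero character -> "1x1"
print(shading_rate(500, 60.0))     # tiny, fast debris -> "2x2"
```

Whether a driver-level toggle like this could work automatically, without per-game tuning of those thresholds, is exactly the open question.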
 
The context is in the transcript I posted.

The downclock exists to avoid having to design the entire power/heatsink solution to allow AVX256 all the time, which requires a substantial margin (we don't know what MS does about this). The stated reason for SmartShift is to take "unused" power from the CPU, which statistically allows the GPU to peak more often, "to squeeze every last drop of power available".

This makes sense if we think about what happens without SmartShift. It's invariably better with SmartShift, because when the CPU is waiting on a bunch of cache misses, or is in a less compute-intensive part of the pipeline, the GPU can be signalled to use more power than it would normally be limited to, so it will stay at its peak more often. Hence "to squeeze every last drop".
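That "squeeze every last drop" behaviour can be sketched as a simple budget transfer. This is a deliberate simplification with invented wattages, not AMD's actual SmartShift logic: power the CPU is not drawing is lent to the GPU so the SoC stays under one shared cap.

```python
# Simplified SmartShift-style budget transfer: power the CPU is not using
# is lent to the GPU, so the total SoC draw stays under one shared cap.

SOC_BUDGET_W = 200.0
CPU_ALLOCATION_W = 60.0  # hypothetical nominal split
GPU_ALLOCATION_W = SOC_BUDGET_W - CPU_ALLOCATION_W

def gpu_power_cap(cpu_draw_w: float) -> float:
    """GPU may use its own allocation plus whatever the CPU left unused."""
    unused_cpu = max(CPU_ALLOCATION_W - cpu_draw_w, 0.0)
    return GPU_ALLOCATION_W + unused_cpu

# CPU stalled on cache misses, drawing little: GPU cap rises.
print(gpu_power_cap(35.0))  # 165.0
# CPU hammering AVX-style code: GPU falls back to its base allocation.
print(gpu_power_cap(60.0))  # 140.0
```

The upshot matches the post above: without the transfer, the GPU would be pinned at its base allocation even while CPU power sat idle.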
 

That is an assumption people are making, but those are not Cerny's words. He claimed both the CPU and the GPU would be at full clock.
The "most of the time" he added next may be the cause of the confusion, but that's just a consequence of clock variations with the workload.
I believe the doubt also comes from comparing this with PC boost clocks. It has nothing to do with that.
Besides, in what cases will the CPU give power to the GPU?
If that did not happen and the GPU went over the expected power budget (those worst-case scenarios Cerny spoke of), the system would reduce clocks to keep within the power budget.
With SmartShift, if the CPU is not using its own power allocation to the full, the GPU can use it. So the APU as a whole will keep within the expected power.

People keep talking about downclocks, claiming the CPU and GPU cannot both be used to the full and that something has to give.
Cerny was not clear... I'll give you that. The doubts are logical. When I first heard him, I made the same assumptions.
But then I listened to him again and again. And what I'm saying looks clear to me.
This is how I understand it now.
Besides, I cannot imagine a presentation about the strong points of your console, with the SSD and the Tempest Engine, including a long, detailed explanation of a system that basically would... downclock.
If that were the case, you would simply save this for a later date and just mention that the system would be capable of reaching 10.28 TFLOPs.
This was what made me rewatch. And after a few viewings I'm convinced this is how the PlayStation works.
A normal console, like all the others, with an innovative power-management system that allows for higher clocks and manages APU power as a whole. Not a stopgap solution to compensate for a lower CU count, with power-management issues.
 

For the GPU extreme case, one can think of FurMark. On the CPU side, Cerny specifically mentioned AVX2 being power hungry and not being widely or heavily used in current engines. It really comes down to how well game developers manage to optimize their code, i.e. CPU/GPU utilization and heavy use of specific power-hungry instructions. Most developers likely will not manage to load the CPU and GPU 100%, so there is wiggle room in power draw. Some developers later in the cycle might be able to create insane loads with very few bubbles and hit max power draw, at which point either the CPU or the GPU is preferred.

Cerny also specifically said the GPU clock speed is not limited by power draw; he said something else imposes the hard ceiling on max clock. So hitting max clock is not the same as hitting max power draw. The same applies to the CPU: just hitting max clock is not hitting max power draw. To hit max power draw, a very specific, likely bubble-free stream of instructions is needed.
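The distinction drawn here, max clock not being max power, follows from the usual dynamic-power approximation P ≈ α·C·V²·f, where the activity factor α depends on the instruction stream. This is a textbook CMOS model with invented constants, not measured PS5 figures:

```python
# Dynamic power ~ activity * C * V^2 * f (textbook CMOS approximation).
# Two workloads at the same max clock can draw very different power,
# which is why hitting max clock is not the same as hitting max power.

def dynamic_power(activity: float, cap_nf: float, volts: float,
                  freq_ghz: float) -> float:
    """Approximate dynamic power for an activity factor in 0..1."""
    return activity * cap_nf * volts ** 2 * freq_ghz

# Same 2.23 GHz clock, same voltage, two different instruction mixes:
light = dynamic_power(0.45, 80.0, 1.0, 2.23)     # bubbly, cache-missy code
avx_like = dynamic_power(0.95, 80.0, 1.0, 2.23)  # dense, wide-vector code
print(round(light, 1), round(avx_like, 1))
```

In this toy model the dense workload draws roughly twice the power at the identical clock, which is exactly the gap a bubble-free AVX2- or FurMark-style stream would exploit.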
 
I like how several people wrote that they have some OC experience, therefore Cerny is a big fat liar and the PS5 will not reach its peak performance most of the time, because it will throttle and overheat and cause global warming.
You know, I am something of a scientist myself: I tune the engine in my old Renault Laguna in my spare time using a screwdriver and some other basic tools. With that in mind, I would like to say that AMG and BMW M are amateurs and can't make decent sports cars.

Never mind resources and budget. I know what I am saying.
 