Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Why in the hell would you want fixed clocks on the dev kits? If your game won't cause any downclocking, you basically have a fixed clock without any intervention (without flipping a fixed-clock switch). If your game does force downclocking, wouldn't you want to know so you can address it as soon as possible?
It could be useful for performance profiling.
 
Why in the hell would you want fixed clocks on the dev kits? If your game won't cause any downclocking, you basically have a fixed clock without any intervention (without flipping a fixed-clock switch). If your game does force downclocking, wouldn't you want to know so you can address it as soon as possible?

I think the kits have a few modes, amongst them will be a "retail" setup and at least one (possibly more than one) fixed clock setup.

As @anexanhume says, it could be useful for profiling. For example, getting accurate numbers on the cost of various parts of your game, independent of other systems affecting turbo and skewing results.

Another reason might be so you can create a game that is tolerant of any amount of throttling on either CPU or GPU (e.g. a totally stable, consistent frame rate that doesn't introduce frame pacing issues or stutter).
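To illustrate the profiling point with a toy example (the cycle count and the helper below are mine, purely hypothetical, not real devkit tooling): a GPU pass costs the same number of cycles either way, but the milliseconds you capture depend on whatever clock it happened to run at, so a fixed-clock mode makes captures directly comparable.

Code:
# Toy sketch with invented numbers: same work, different measured time,
# depending on the clock the pass happened to run at.
PASS_CYCLES = 3_000_000  # hypothetical cycle cost of one render pass

def measured_ms(pass_cycles: int, clock_hz: float) -> float:
    """Wall-clock time for a pass needing `pass_cycles` at `clock_hz`."""
    return pass_cycles / clock_hz * 1000.0

# Variable clocks: two captures of the *same* pass disagree.
print(measured_ms(PASS_CYCLES, 2.23e9))  # ~1.35 ms at max clock
print(measured_ms(PASS_CYCLES, 2.10e9))  # ~1.43 ms after a small downclock

# Fixed-clock devkit mode: every capture runs at the same clock, so a change
# in the number means your workload changed, not the frequency.
print(measured_ms(PASS_CYCLES, 2.00e9))  # stable baseline for A/B comparisons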
 
I mean, my whole spiel is that people should listen to Cerny. But real Cerny and not some console war spin on what he said.

I really don't have a problem with some dev saying it spends most of its time at the clocks that make it a "~10 TF machine". It's what I'd expect based on what I've seen, especially when it's early days. Of course, as the machine is pushed harder through release and in the years to come, those clocks will probably drop a little, just as the XSX will probably push its PSU and aspects of its cooling harder.

Edit: I'm also expecting fewer of the sharp, narrow troughs you see on some PC frequency graphs! I'm sure Sony will have tools to help you engineer out PC-style wild dips. No doubt the profiler and the lack of a PC-style driver will help!

I hope this isn't aimed at me - I'm essentially saying what @Globalisateur just said, but I keep finding other ramblings implying the PS5 will not perform as well as Cerny implied.

When we watch what he says, he talks (previously) about the challenge of predicting what devs might squeeze out of the PS4, and they essentially guessed wrong - that's (part of the reason) we have loud PS4s. This (in my mind) is because games generally won't be able to fully utilise the GPU/CPU, so it'll never be maxed out, with GoW as an example of a game that really pushes the system hard.

So the implication (or my interpretation) is that early games don't stress the system, so you have a console running well within its limitations until later in the generation, when devs have got to grips with it and are getting the most from the system.

Now back to what Cerny went on to say - that 'when that worst case game arrives' - implying this isn't going to happen much at all - and then that 'a 2% drop in frequency is a 10% power saving' - so a small drop in clock saves a bigger amount of power.

So (again, my interpretation) the PS5 is designed to run as fast as it can and only slow down when stressed - which Cerny seems to imply won't be by much and won't be often - and I believe that will happen in games that are really pushing the system, more likely later in the console generation (or maybe in poorly optimised code).

What I'm saying is that too much is being made of the drops - and those drops likely won't be noticeable until later in the generation. I kind of think that early in the generation this might actually help 3rd party games (outside of RT) look fairly close to the XSX; however, later in the generation, when games are pushing the systems harder and utilising more of the CUs, we will probably see the gap get a bit bigger.

I'm likely totally wrong, it's just my uneducated take...so go easy on me if I'm rambling like an idiot.
 
I mean, my whole spiel is that people should listen to Cerny. But real Cerny and not some console war spin on what he said.

I really don't have a problem with some dev saying it spends most of its time at the clocks that make it a "~10 TF machine". It's what I'd expect based on what I've seen, especially when it's early days. Of course, as the machine is pushed harder through release and in the years to come, those clocks will probably drop a little, just as the XSX will probably push its PSU and aspects of its cooling harder.

Edit: I'm also expecting fewer of the sharp, narrow troughs you see on some PC frequency graphs! I'm sure Sony will have tools to help you engineer out PC-style wild dips. No doubt the profiler and the lack of a PC-style driver will help!
I actually expect the frequencies to drop less later in the gen, because devs will learn to display more stuff with less supervised (official) load. I think this is where the locked GPU frequencies will help them, in order to determine which way of doing things gives them the max clocks (or close to it) while rendering the same stuff. I also expect some devs will find ways to trick the whole system a little.
 
I actually expect the frequencies to drop less later in the gen, because devs will learn to display more stuff with less supervised (official) load. I think this is where the locked GPU frequencies will help them, in order to determine which way of doing things gives them the max clocks (or close to it) while rendering the same stuff. I also expect some devs will find ways to trick the whole system a little.

Seeing a specific MHz number being hit does not correspond to better performance on PS5. It follows a preset table where xWORKLOAD = xMHz or xTBP = xMHz.

The debate is how much it drops, and how much SmartShift can alleviate it.

Seeing a high MHz number on PS5 means lower GPU utilization. It will only benefit those developers whose code is badly optimized.

Because it follows a table, devs could choose: 90% GPU utilization @ 2.23GHz or 100% at 2GHz. Aside from small benefits in other aspects of the GPU, framerate would be almost the same in both scenarios.
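A back-of-the-envelope way to see why (my toy model, nothing official): treat effective shader throughput as utilization × clock, and the two options land within a fraction of a percent of each other.

Code:
# Toy model using the numbers from the post above: throughput ~= utilization * clock.
def effective_gcycles_per_s(utilization: float, clock_ghz: float) -> float:
    """Billions of 'useful' GPU cycles per second under this crude model."""
    return utilization * clock_ghz

print(effective_gcycles_per_s(0.90, 2.23))  # ~2.01
print(effective_gcycles_per_s(1.00, 2.00))  # 2.00
# ~0.3% apart, hence the framerate being almost the same in both scenarios.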
 
Because it follows a table, devs could choose: 90% GPU utilization @ 2.23GHz or 100% at 2GHz. Aside from small benefits in other aspects of the GPU, framerate would be almost the same in both scenarios.

I guess Forbidden West, with its amazing graphics, is going for the full 9.2TF of power, 100% GPU.
 
I'm not understanding that at all. It feels counter-intuitive to have a lower clock at 100% utilisation, and the full fat 2.23GHz at 90%? Am I stupid?

My understanding is that the more it's utilised, the more chance/need there is to downclock - as most games don't utilise 100%, most won't downclock - or when that worst case arrives, it will be minor, as a 2% drop saves 10% power ;)
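To put rough numbers on that 2%/10% claim (assuming dynamic power scales roughly as P ∝ f·V², and assuming the 2% frequency drop also lets the voltage come down by about 4% - the voltage figure is my guess, Cerny only gave the headline percentages):

P_new / P_old = (f_new / f_old) × (V_new / V_old)² ≈ 0.98 × 0.96² ≈ 0.90

So a ~2% clock drop can plausibly buy a ~10% power saving, which is why small, occasional dips are enough to keep the chip inside its power budget.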
 
Seeing a specific MHz number being hit does not correspond to better performance on PS5. It follows a preset table where xWORKLOAD = xMHz or xTBP = xMHz.

The debate is how much it drops, and how much SmartShift can alleviate it.

Seeing a high MHz number on PS5 means lower GPU utilization. It will only benefit those developers whose code is badly optimized.

Because it follows a table, devs could choose: 90% GPU utilization @ 2.23GHz or 100% at 2GHz. Aside from small benefits in other aspects of the GPU, framerate would be almost the same in both scenarios.
You are right, I worded it poorly. I meant that they are going to compare the performance of max clocks vs dynamic clocks with different ways of rendering the same stuff, and pick the best approach in order to get the same performance as max clocks in the dynamic mode.

Those modes are only available on the devkit; the retail mode will always use the dynamic clocks.
 
I guess Forbidden West, with its amazing graphics, is going for the full 9.2TF of power, 100% GPU.

Do you not understand that the TFLOP figure is a maths calculation that doesn't mean anything for the actual performance of the machine? And yes, it means Cerny was just playing the marketing game as well, knowing very well that people around the web will pay attention to that meaningless number either way.
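For what it's worth, the TF figure really is just arithmetic: FP32 TFLOPS = CUs × 64 shader lanes × 2 ops per clock (FMA) × clock. For PS5's 36 CUs:

36 × 64 × 2 × 2.23 GHz ≈ 10.28 TF (at the max clock)
36 × 64 × 2 × 2.00 GHz ≈ 9.22 TF (the "9.2TF" figure quoted above)

It says nothing about how often the GPU actually sustains that clock or how well the work feeds those lanes.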
 
I'm not understanding that at all. It feels counter intuitive to have lower clock at 100% utilisation, and full fat 2.23GHz at 90%? Am I stupid?

Search "Furmark" or "powervirus" on how they hit the TBP hard limits of desktop GPUs and they are forced to downclock to stay bellow the hard limit. When TBP limited like the PS5 is, actual performance potential is not limited Mhz but by the power its using, and that power usage depends on workloads and types of workloads.


edit: switched Measured for Limited. It's not limited by MHz but by power.
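A crude sketch of the desktop-style behaviour being described (every constant below is invented; only the shape of the mechanism matters): pick the highest clock whose estimated board power fits under the cap, so a power-virus style workload lands on a lower clock than a typical game frame does.

Code:
# Invented numbers throughout -- this only illustrates the mechanism.
POWER_CAP_W = 200.0  # hypothetical total board power budget

def estimated_power(activity: float, clock_ghz: float) -> float:
    """Crude model: a fixed floor plus a term that grows with activity and
    roughly with clock^3 (voltage has to rise alongside frequency)."""
    return 50.0 + 160.0 * activity * (clock_ghz / 2.23) ** 3

def pick_clock(activity: float, steps=(2.23, 2.10, 2.00, 1.90)) -> float:
    for clock in steps:  # try the highest clock first
        if estimated_power(activity, clock) <= POWER_CAP_W:
            return clock
    return steps[-1]

print(pick_clock(0.7))  # typical game frame -> 2.23, stays at max
print(pick_clock(1.0))  # Furmark-style load -> 2.1, forced down to fit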
 
I'm not understanding that at all. It feels counter intuitive to have lower clock at 100% utilisation, and full fat 2.23GHz at 90%? Am I stupid?

Consider a game that uses async compute to do both graphics and compute loads concurrently, thus hitting more parts of the GPU in a given time frame and increasing the power consumption.

Or hyperthreaded CPU operation - we have a clear example even with the XSX CPU modes, where it is 3.8GHz without SMT, or it can operate at a lower clock (3.66GHz) with SMT.
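Both examples come down to the standard dynamic-power relation, where the activity factor α is the fraction of the silicon switching each cycle:

P_dyn ≈ α × C × V² × f

Async compute and SMT exist precisely to raise α (fill otherwise idle units), so at a fixed power budget something else - the clock, and with it the voltage - has to give.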
 
I'm an artist at Activision and I worked on the second revision of the Goon05B 3D model, the main gun, and I've been part of the small team that produced the PS5 theme. It's a nice group of people; they always keep pulling out jokes because I'm the only bald guy in room 27 on the second floor.
But I want to stay anonymous, so I can't add anything more.
 
Search "Furmark" or "powervirus" on how they hit the TBP hard limits of desktop GPUs and they are forced to downclock to stay bellow the hard limit. When TBP limited like the PS5 is, actual performance potential is not limited Mhz but by the power its using, and that power usage depends on workloads and types of workloads.


edit: switched Measured for Limited. It's not limited by MHz but by power.
Actually, PS5 games won't be limited (directly) by power usage at all. The system needs to be 100% deterministic.
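That's the distinction Cerny drew: the clock is derived from a model of the workload's activity, not from temperature or power sensors, so the same frame resolves to the same clock on every console. A minimal sketch of what 'deterministic' means here (the table and thresholds are made up):

Code:
# Made-up table: modeled activity level -> GPU clock in GHz.
ACTIVITY_TO_CLOCK = [
    (0.80, 2.23),  # up to 80% modeled activity: full clock
    (0.90, 2.15),
    (1.00, 2.00),
]

def clock_for(modeled_activity: float) -> float:
    """Pure function of the modeled workload -- no temperature or power
    sensors involved, so every console gives the same answer."""
    for threshold, clock in ACTIVITY_TO_CLOCK:
        if modeled_activity <= threshold:
            return clock
    return ACTIVITY_TO_CLOCK[-1][1]

print(clock_for(0.75))  # 2.23 on a hot console and a cold one alike
print(clock_for(0.95))  # 2.0 -- decided by the workload, not the silicon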
 