Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Not on the competitor's console.

Yep. That's what I'm hinting at.

Or exactly the opposite: the spikes in CPU and GPU power exactly cancel each other out in regular operation, and you're running at full speed all the time.
I'm only talking about PS5. There's no issue with XSX: it's just going to keep trying harder to cool the unit until it no longer can, and then it shuts down. This is typical behaviour for a console to protect the hardware.

How could spikes in CPU and GPU power cancel each other out? What are you referring to?
 
  • The GPU is over its power budget but the CPU is underloaded, so the GPU can borrow some of the power budget from the CPU to maintain max clocks.
  • The GPU is over its budget and the CPU isn't sufficiently underloaded to have enough spare power for the GPU to maintain max clocks. The GPU throttles its clocks to the degree required to make up the difference.
  • The reverse of the above two scenarios, but with the CPU needing more power.
  • Both exceed their power budget at max clocks and both need to throttle (a toy sketch of this arbitration follows below).
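
To make those four cases concrete, here's a minimal sketch of such an arbiter. This is emphatically not Sony's actual algorithm, and every wattage figure is invented; proportional scaling stands in for whatever the real DVFS state machine does.

```python
# Toy model of a shared CPU/GPU power budget. NOT Sony's actual
# algorithm; all wattage numbers here are invented for illustration.

TOTAL_BUDGET_W = 200.0   # hypothetical combined SoC power budget

def arbitrate(cpu_demand_w: float, gpu_demand_w: float):
    """Return (cpu_grant_w, gpu_grant_w). Either side may borrow the
    other's unused headroom, but the sum never exceeds the budget."""
    total = cpu_demand_w + gpu_demand_w
    if total <= TOTAL_BUDGET_W:
        # Scenarios 1 and 3: one side's spare headroom covers the
        # other's overshoot, so both keep their requested clocks.
        return cpu_demand_w, gpu_demand_w
    # Scenarios 2 and 4: there isn't enough spare power anywhere in
    # the budget, so clocks must come down until demand fits. (A real
    # arbiter might throttle only the overshooting side first.)
    scale = TOTAL_BUDGET_W / total
    return cpu_demand_w * scale, gpu_demand_w * scale

# CPU lightly loaded, GPU demanding a lot: the total fits, so the GPU
# borrows the CPU's headroom and nothing throttles.
print(arbitrate(cpu_demand_w=40.0, gpu_demand_w=155.0))  # (40.0, 155.0)
# Both demanding too much: both get scaled back to fit the budget.
print(arbitrate(cpu_demand_w=70.0, gpu_demand_w=160.0))  # (~60.9, ~139.1)
```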

Isn't that what DF/Alex said?
 
It's beautiful.
It's breathtaking.


I guess it makes sense that L1 would thrash more with hyperthreading? L2 is also per core, but L3 is shared amongst all of them, and then DRAM gets hit harder.
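
A quick back-of-envelope sketch of that pressure: dividing per-core capacities by two SMT threads shows how little L1 each thread effectively gets. L1/L2 sizes are desktop Zen 2; the 8 MB L3 total for the consoles is a reported figure, assumed here rather than confirmed.

```python
# Rough per-thread cache capacity on a Zen-2-style core with SMT.
# L1D/L2 sizes are desktop Zen 2; the 8 MB console L3 is an assumption.
L1D_PER_CORE_KB = 32
L2_PER_CORE_KB = 512
L3_TOTAL_MB = 8
CORES, SMT = 8, 2

threads = CORES * SMT
print(f"L1D per thread: {L1D_PER_CORE_KB // SMT} KB")          # 16 KB
print(f"L2  per thread: {L2_PER_CORE_KB // SMT} KB")           # 256 KB
print(f"L3  per thread: {L3_TOTAL_MB * 1024 // threads} KB")   # 512 KB
```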

Kind of curious to see how console Zen 2 will fare with the huge bandwidth available, along with not having to traverse Rome's I/O die. Infinity Fabric will be handling cache snooping etc. now too, instead of Garlic and Onion.

I'm sure betanumerica brought up that curiosity already.
 
For example: when the GPU is under high load, the CPU is not, and vice versa.
Assume the power shift happens at a frequency of ~1000 Hz, say.
There has to be a limit to how low you allow the CPU and GPU to run, otherwise other issues start to crop up.
There is a base clock speed for both; neither the GPU nor the CPU should drop below its base clock as a result of the power shift.

Under heavy contention both will need to reduce in frequency.

In the absolute worst-case scenario (full contention), the GPU and CPU will be running at their base clocks.
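
A minimal sketch of that clamping, assuming the ~1000 Hz cadence speculated above. The 2230/3500 MHz peaks are Sony's stated maximums; the 1750 MHz GPU base matches the techpowerup figure quoted later in the thread, and the CPU base is a pure guess.

```python
# Toy governor step: whatever frequency the power model requests gets
# clamped to a [base, max] window, so full contention bottoms out at
# base clocks rather than zero. Peaks are Sony's stated maximums; the
# base clocks are guesses, not official figures.
GPU_BASE_MHZ, GPU_MAX_MHZ = 1750, 2230
CPU_BASE_MHZ, CPU_MAX_MHZ = 3000, 3500

def clamp(mhz: float, base: float, peak: float) -> float:
    """Never below the base clock, never above the peak."""
    return max(base, min(peak, mhz))

def governor_tick(gpu_request_mhz: float, cpu_request_mhz: float):
    """One ~1 ms (~1000 Hz) step of a hypothetical clock governor."""
    return (clamp(gpu_request_mhz, GPU_BASE_MHZ, GPU_MAX_MHZ),
            clamp(cpu_request_mhz, CPU_BASE_MHZ, CPU_MAX_MHZ))

# Full contention: the power model asks for deep throttling, but the
# floor holds and the worst case is base clocks.
print(governor_tick(gpu_request_mhz=1500, cpu_request_mhz=2500))  # (1750, 3000)
```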
 
There has to be a limit to how low you allow the CPU and GPU to run, otherwise other issues start to crop up.
There is a base clock speed for both; neither the GPU nor the CPU should drop below its base clock as a result of the power shift.

Under heavy contention both will need to reduce in frequency.

In the absolute worst-case scenario (full contention), the GPU and CPU will be running at their base clocks.
I don't think you understand what high load means here. It's not about frequency. It's about some jobs that make the GPU (and CPU) work much harder than others, even if the GPU or CPU is 100% busy in a profiling chart.
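
A crude illustration of that distinction (in Python, so take it loosely; the effect is far starker in native SIMD code): two workloads that both read as "100% busy" in a profiler yet stress the silicon very differently. numpy is only used to get wide vector work; actually measuring wattage would need RAPL or similar platform tooling.

```python
# Two workloads that both show as "100% busy" in a profiler but draw
# very different power. Illustration only, not a power measurement.
import numpy as np

a = np.random.rand(1 << 22).astype(np.float32)
b = np.random.rand(1 << 22).astype(np.float32)

def light_load(iters: int = 1000) -> float:
    """Dependent scalar adds: the core is 'busy', but most execution
    units sit idle, so power draw stays comparatively low."""
    x = 0.0
    for _ in range(iters * 1000):
        x += 1.0
    return x

def heavy_load(iters: int = 100) -> float:
    """Wide vectorized multiply-adds sweep megabytes through the SIMD
    units and caches: same 100% on the chart, far more power."""
    acc = np.zeros_like(a)
    for _ in range(iters):
        acc += a * b
    return float(acc.sum())

light_load()
heavy_load()
```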
 
In the absolute worst-case scenario (full contention), the GPU and CPU will be running at their base clocks.

Yep. That's obvious.
The question is: how often does that happen?
I know from the PS2 days: at the end of the generation, VU1 was used at ~90% and VU0 at ~70%.
I think with each later generation the under-utilization got bigger and bigger.
Do you have any data?
 
likely maximum speeds unless it's doing some heavy lifting in the background.

He's right in that power != frequency.

But the type of load and the amount of load on a CPU/GPU will definitely increase power requirements.
All you need is 8 cores all going at once on the CPU doing different things, or you are pushing the limits of your GPU and making all sorts of async calls so that it's filling every gap with something.
pshaw, 14 CPU threads + 4 GPU contexts.

The math checks out.
 
Regarding BC for PS3, call me crazy, but couldn't Sony port the IP of the CELL BE to modern nodes? I understand the porting process itself may be pricey, and there may also be some licensing concerns with IBM/Toshiba, but I'd think the eventual unit cost would be incredibly cheap: it would draw a tiny amount of power and the die would be absolutely minuscule. It could simply be placed on the motherboard and, like any chip, continue to be shrunk with future nodes over time.

As it is now, and assuming we don't get legacy PS support on PS5, I feel like Sony is just kicking the can down the road, and all these things will eventually be lost in time; or they'll have to scramble later if they ever have a change of heart. The sooner they get this all done now, the less they have to think about it later. For PS2, I'd suspect they have the brute-force power and resources to emulate it by now, and as for PS1 emulation, they could just include a potato and some jumper cables at this point.

PS Now would stand to benefit from all of this too; they wouldn't require racks of old PS3s anymore. I just think it's important they make a concerted, focused effort to get PS1, PS2 and PS3 BC integrated into their ecosystem sooner rather than later.

I'm aware there's a cost-benefit calculation here, but I feel supporting your legacy, showing you care about it and carrying it forward is a very positive thing; it's good publicity and it fosters goodwill amongst your audience.

I guess it's great at least that PS4 BC is inherent to the PS5 architecture, which should make it easier to carry forward in future generations.

Anywho, back to the technical angle... is there any real reason a tiny CELL BE couldn't be recreated today, outside of the porting cost and licensing?

This, of course, assumes Sony don't pull a rabbit out of a hat and find a way to emulate it.
 
I don't think it's going to be as simple as shifting CPU and GPU load. The impression I got from Cerny, if you listen carefully, was that it's just another tool to afford the GPU more power. But in general the CPU is relatively low-powered versus the GPU, and it has to function in some capacity, so SmartShift in and of itself is probably not enough. I'd expect additional GPU downclocking to be necessary, even when shifting some CPU power over in high-load scenarios.

Not sure where techpowerup gets their info, but everything looks pretty accurate, and they list the PS5 GPU base clock at 1750 MHz and game clock at 1900 MHz. https://www.techpowerup.com/gpu-specs/playstation-5-gpu.c3480
My guess, and we'll see when the dev docs are available, is that the scheduler is constrained in some way at higher clocks, if Dictator is right with his reporting that there are power profiles the developer chooses from.
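
If that reporting is right, the developer-facing side might look something like a fixed menu of profiles rather than free-form clock control. A hypothetical sketch: the profile names and every clock pairing below are invented, not from any SDK; only the 3500/2230 MHz peaks are Sony's stated maximums.

```python
# Hypothetical developer-selectable power profiles. Names and numbers
# are invented for illustration; only the 3500 MHz CPU and 2230 MHz
# GPU peaks are Sony's stated maximums.
from dataclasses import dataclass

@dataclass(frozen=True)
class PowerProfile:
    name: str
    cpu_mhz: int
    gpu_mhz: int

PROFILES = {
    "cpu_favored": PowerProfile("cpu_favored", 3500, 2000),  # CPU at max, GPU eases off
    "balanced":    PowerProfile("balanced",    3300, 2100),
    "gpu_favored": PowerProfile("gpu_favored", 3200, 2230),  # GPU at max, CPU eases off
}

def pick_profile(name: str) -> PowerProfile:
    """A title would declare one profile up front; the runtime would
    then hold those clocks within the shared power budget."""
    return PROFILES[name]

print(pick_profile("gpu_favored"))
```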
 
I don't think you understand what high load means here. It's not about frequency. It's about some jobs that make the GPU (and CPU) work much harder than others, even if the GPU or CPU is 100% busy in a profiling chart.
I'm well aware of what high load means. And that is _exactly_ what I'm referring to.
 
Nailed it.
I just don't like people using point (1) as the general basis of comparison. It's unlikely that none of the titles will push the hardware; and if that turns out to be the case, that's unfortunate.

I just don't think it matters very much. They are all theoretical numbers anyway. Eventually we will have side-by-side comparisons of these systems running the same game and be able to see how the hardware differences manifest as actual performance differences.
 