Digital Foundry Article Technical Discussion [2024]

SmartShift is about shifting power between the CPU and GPU based on workload, which you would already expect to be happening in an APU. Dynamic clocks have been on the PC for a long time, though I assume Sony's algorithm is different.

The PS5 (and PS5 Pro) allocates CPU power to the GPU when the CPU isn't using it, increasing the headroom for GPU boost. On Series X, the CPU and GPU clocks are fixed and completely independent.

PS5's dynamic clocks are deterministic and uniform across all consoles: they use activity counters in the hardware and base speeds on a lowest-common-denominator model of the worst-performing silicon within the PS5's bin.

This will result in some PS5s leaving a little headroom unused, though massively, massively less than Series consoles do.

One of the proper hardware experts here explained a few years back that the kind of activity counters that would be needed had first been built into AMD hardware many years ago, though I'm unsure whether the console parts would naturally have included all of it. AMD's turbo clocks have used the predicted power draw of various workloads to inform boost for years, instead of just monitoring current and voltage and reacting.

Whatever Sony are doing with PS5, it will be based on some version of AMD's PC technology, and it was a very good choice to use it.
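
To make that concrete, here's a minimal sketch of how a deterministic, model-based boost governor could work. This is not Sony's actual algorithm; every name and number below is a made-up placeholder. It just illustrates activity counters feeding a worst-case power model so that every console computes the same clocks for the same workload:

```python
# Hypothetical sketch: deterministic boost from activity counters.
# All constants are invented; they stand in for a model calibrated to
# the worst-performing silicon in the bin, so every unit agrees.

WORST_CASE_COEFF_W = 15.0   # modeled W per (activity * GHz^3), made up
TOTAL_BUDGET_W = 200.0      # shared CPU+GPU power budget, made up
GPU_MAX_GHZ = 2.23

def modeled_gpu_power(activity: float, freq_ghz: float) -> float:
    """Predict power from activity counters instead of measuring
    current/voltage. Power ~ f^3 because voltage rises with frequency."""
    return WORST_CASE_COEFF_W * activity * freq_ghz ** 3

def pick_gpu_clock(gpu_activity: float, cpu_power_w: float) -> float:
    """Deterministic: identical inputs give identical clocks on every unit."""
    budget = TOTAL_BUDGET_W - cpu_power_w  # SmartShift: unused CPU power shifts to GPU
    freq = GPU_MAX_GHZ
    while freq > 0.5 and modeled_gpu_power(gpu_activity, freq) > budget:
        freq -= 0.01  # shed small clock steps until the model fits the budget
    return round(freq, 2)

print(pick_gpu_clock(gpu_activity=0.6, cpu_power_w=40.0))  # 2.23 (light load, full clock)
print(pick_gpu_clock(gpu_activity=1.0, cpu_power_w=60.0))  # ~2.1 (heavy load, slight dip)
```

The real algorithm is far more sophisticated, but the key property survives even in this toy: no thermal or silicon-quality inputs, so no two consoles ever diverge.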
 
I don't get the hostility towards DF by some. Digital Foundry said ages ago that they'd heard developers say the Xbox consoles' tools weren't as good as Sony's for the PS5; that's literally what they're repeating again. So what else has really come from this "new revelation"? Apparently that PS5 has higher clocks, which we already knew (as if DF and everyone else don't know that higher clocks improve throughput in many scenarios)... As far as I know, nobody at DF ever said Mark Cerny was wrong, and yet people are acting as if DF had been out there denying that what Cerny said was true. Nobody knew exactly what next-generation gaming workloads would look like, or whether they'd favor one architecture over the other.

Outside of the better dev tools, Cerny deserves praise for putting together a well-thought-out machine with some smart tweaks to make it as performant and cost-effective as possible; there's no question about that. But regardless, I'm stubborn and still say it comes down to better APIs/tools on Sony's side, plus developer priorities. Developers can make a game performant and optimized for PS5, then simply stop optimizing the Xbox version once it's already around what the PS5 delivers, because in the end there's not a lot of incentive to push beyond that.

Where Microsoft has undoubtedly failed is in their own teams creating software that clearly shows a generational improvement. I fully believe none of these petty arguments would be happening if MS were releasing their own graphically impressive titles that clearly took advantage of their hardware, like Sony does. If they did, you could easily handwave away some third-party titles performing better on PS5 relative to XSX as developers just focusing on the market leader. It's MS's own studios that are failing to make its hardware live up to the hype built around the teraflops number.
 
Putting out an opinion early on doesn’t have value if you can’t verify it’s correct.

Plenty of developers were coming out and saying all of this before, during, and after the consoles released. No better verification than the developers who are actually building games for these machines. Look no further than the post above. Loads more where that came from.
 
I don't get the hostility towards DF by some.

That kind of snarky jabbing at DF is common for NXGamer (and for the poster who linked to it); it's a personal thing (for one, he felt his 'idea' of longform technical analysis was basically stolen from him by DF, lol).

Also, please, enough with the "I explained all of this." The point was that we've had 3+ years of an evolving API and SDK for both, which was the entire point of the viewer question submitted to DF Direct. A four-year-old video where he compares a 750 Ti to an Xbox One S is not exactly relevant to the main 'discovery' of DF's talk with developers, specifically that the superiority of PS5's shader compiler is a big advantage. That video would not reveal anything to anyone wondering why the PS5 was outperforming the SX in some titles. Apparently he felt that just noticing 'not all teraflops are the same', especially between two significantly different architectures (as SX vs PS5 are), was explanatory enough, which is something I think DF and most of their viewers already understand. What was asked for was the exact reasons beyond "stuff's different" (and when you emphasize something like the 750 Ti's limited VRAM relative to the One S, what does that have to do with SX vs PS5?!), especially coming from actual developers working in both environments.
 
Plenty of developers were coming out and saying all of this before, during, and after the consoles released. [...]
The launch of a console is where you expect a lot of noise in the data points. To actually say something definitive you need a trend, a strong signal. Four years ago, prior to launch, I doubt there were enough data points unless every single developer was speaking to him. DF is talking about the two most common trends, ones that have held even through all the changes over the years.

I have my doubts there was enough data even after the first two years post-launch. So much of both consoles was still changing, and still is.
 
I'm surprised no one has looked at the massive gulf in power consumption between these consoles. Both consoles have the same ~230 W max wattage, and PS5 is almost always running near its maximum according to my power meter. But the X more typically runs in the 160-190 W range, which suggests it's often quite underutilized.

The latest FIFA/EA FC24 is a prime example of this: it runs in the usual 220-230 W range on PS5, but on Xbox it only draws 150-160 W. And it's far from an isolated case.
 
I'm surprised no one has looked at the massive gulf in power consumption between these consoles. [...]

I wouldn't be surprised to discover that the fixed-clock nature of the GPU on the XSX is really hurting them in terms of total power. GPU loads aren't known for being all that consistent: if your GPU is sitting at max clock but not doing much useful work, you're burning power for no reason, power that could, potentially, otherwise be used to boost the clock when the GPU does need it!

Even over just a single second, this could really hurt the max perf of the XSX if it's constantly required to stay below a certain power draw.
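
A toy illustration of the idea, assuming (simplistically) that power tracks f³ regardless of how much useful work gets done each frame; every number here is made up:

```python
# Toy model: dynamic power ~ coeff * f^3 (voltage scaling folded in).
# The 15.0 coefficient is arbitrary; only the comparisons matter.

def power_w(f_ghz: float) -> float:
    return 15.0 * f_ghz ** 3

fixed = power_w(1.825)   # fixed-clock design: ~91 W whether the frame is heavy or light
light = power_w(1.2)     # variable design on a light frame: ~26 W
boost = power_w(2.23)    # ...and on a heavy frame: ~166 W

print(f"fixed: {fixed:.0f} W every frame")
print(f"variable: {light:.0f} W light / {boost:.0f} W heavy, "
      f"avg {(light + boost) / 2:.0f} W over an even split")
```

On those made-up numbers, the variable design averages out near the fixed design's draw while hitting a much higher clock on the frames that actually need it.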
 
I'm surprised no one has looked at the massive gulf in power consumption between these consoles. [...]
I haven't kept up on the image quality between sports games on Series and PS5. Are there any graphical differences between the Series and PS5 FIFA/EAFC games? If not, wouldn't that suggest Series is simply more efficient, rather than underutilized? If there aren't any real differences, Series X would be producing essentially identical results using lower clocks and less overall wattage.
 
I'm surprised no one has looked at the massive gulf in power consumption between these consoles. [...]
In the power equation for silicon, frequency is effectively a cubic term: dynamic power goes as P ≈ C·V²·f, and because voltage has to rise roughly in step with frequency, power ends up scaling close to f³. So that clock speed differential is going to draw heavily on power. It's the main reason why we moved to more cores or more processing units: we take a scaling loss in exchange for dropping the power requirements dramatically.

Otherwise we would have stuck with a single core at maximum frequency. But with a near-cubic relationship, that little chip would exceed the power density of a nuclear reactor very quickly as you keep pushing clocks upward.

It is understandable why PS5 uses so much power from that standpoint alone. That's the risk Sony takes in running higher clock speeds: your yield has to support it.
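
Rough arithmetic behind that trade-off, using the public CU counts and clocks; the power model (units × f³ with an arbitrary constant) is an assumption for illustration, not a measurement:

```python
def tflops(cus: int, f_ghz: float) -> float:
    # 64 FP32 lanes per CU * 2 FLOPs per cycle (FMA)
    return cus * 64 * 2 * f_ghz / 1000

def rel_power(cus: int, f_ghz: float) -> float:
    # assumed dynamic power ~ units * f^3 (voltage scaling folded into f^3)
    return cus * f_ghz ** 3

ps5 = (36, 2.23)    # fewer CUs, higher clock
xsx = (52, 1.825)   # more CUs, lower clock

print(f"PS5 {tflops(*ps5):.1f} TF, XSX {tflops(*xsx):.1f} TF")
print(f"modeled GPU power ratio PS5/XSX: {rel_power(*ps5) / rel_power(*xsx):.2f}")
```

On this crude model, the narrower, faster GPU draws roughly 26% more power despite about 15% less peak compute, which is exactly the cubic penalty described above.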
 
Xbox has fixed clocks: even during the most demanding game conditions, the frequency shouldn't budge. That means leaving a lot of potential peak, or even typical, frequency on the table.

[image: clock-vs-voltage.png]


There are points where the frequency drops into the 1900-2000 MHz range during gaming. This is despite the 6700 XT having a higher draw on its own than the entire Series X at the wall.

That chart doesn't show why the clocks are dropping: CPU limits, temperature, a streaming hitch, looking at a wall/sky/floor... there are many reasons for clock fluctuations that aren't simply power related.
 
That chart doesn't show why the clocks are dropping: CPU limits, temperature, a streaming hitch, looking at a wall/sky/floor... there are many reasons for clock fluctuations that aren't simply power related.

This is true, but as some of the clocks are dropping above 1.1 V, or even at the chip's max of 1.2 V, I interpret that as meaning there's probably power-related throttling going on. At, say, 0.88 V or 0.95 V, I'd guess it was because the GPU was twiddling its thumbs.

Interestingly, they measured Furmark at 1.006 to 1.031 V, meaning with the right (wrong?) workload you can become power limited well below maximum V.
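
That's consistent with boost logic that takes the highest clock satisfying all limits at once. A sketch of the idea (the V/F table, power constant, and limit below are all invented numbers, not the 6700 XT's real curve):

```python
# Boost picks the highest V/F point whose modeled power fits the limit.
# With extreme switching activity (Furmark), the power limit binds at a
# lower voltage than the top of the V/F curve.

VF_CURVE = [(0.85, 1.9), (0.95, 2.1), (1.10, 2.4), (1.20, 2.6)]  # (V, GHz), hypothetical
POWER_LIMIT_W = 200.0

def modeled_power(v: float, f_ghz: float, activity: float) -> float:
    return 60.0 * activity * v * v * f_ghz   # P ~ activity * C * V^2 * f, constant made up

def boost_point(activity: float) -> tuple[float, float]:
    best = VF_CURVE[0]
    for v, f in VF_CURVE:   # curve is monotonic, so the last passing point wins
        if modeled_power(v, f, activity) <= POWER_LIMIT_W:
            best = (v, f)
    return best

print(boost_point(activity=0.7))  # (1.2, 2.6): game-like load tops out the V/F curve
print(boost_point(activity=1.0))  # (1.1, 2.4): power-virus load throttles below max V
```

Same limit, same curve, but the heavier workload hits the power wall first, at a lower voltage, just like the Furmark measurement.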

Bit OT, but I've always thought Nvidia were better at avoiding those kinds of transient micro power-draw spikes that seem to affect even mid-range AMD GPUs. Look at Furmark, Gaming, and Vsync on the aforementioned 6700 XT. Aggressive little fella, frequently boosting too aggressively... 😬 I'd guess console makers don't want to have to fit a 1000 W Gold five-rail PSU in their BOM-conscious units.

[image: power-consumption.png]
 
Calculated risk for Sony. You don't know if, or by how much, yields will improve over time. Fast silicon costs more than slower silicon just due to yield. There's only a 20% size differential between the two chips at launch (300 mm² vs. 360 mm²). MS may have played it overly safe from the beginning to ensure their costs were in line, while Sony had more freedom to play with given their position in the market.

Ultimately you pay per wafer. A 20% SoC area difference can be made up by a 20% yield difference. Sony gets more chips per wafer, but they're hoping that yield improves so they can claw their money back. There was a very specific reason why you didn't see many PS5 Digitals in the wild at launch: it was sold at a loss, and even by today's standards it likely still is.

Given how they were both priced fairly equally (even today), we can assume the SoCs cost about the same at launch. PS5 was likely sitting at 73% yield, XSX likely closer to 88%. With these numbers, both would produce about 172-173 good chips per wafer.
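
Sanity-checking that arithmetic with a naive model (ignore edge loss and defect clustering; just divide wafer area by die area and apply the yield rate):

```python
import math

WAFER_DIAMETER_MM = 300
wafer_area_mm2 = math.pi * (WAFER_DIAMETER_MM / 2) ** 2   # ~70,686 mm^2

def good_dies(die_area_mm2: float, yield_rate: float) -> float:
    return wafer_area_mm2 / die_area_mm2 * yield_rate

print(f"PS5-ish (300 mm^2 @ 73%): {good_dies(300, 0.73):.0f} good dies/wafer")
print(f"XSX-ish (360 mm^2 @ 88%): {good_dies(360, 0.88):.0f} good dies/wafer")
```

Both land at roughly 172-173 good dies per wafer, matching the figures above; a real calculation with edge loss would lower both counts, but the parity holds.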
Yield isn't defined only by max clocks but mainly by the max power the silicon is able to sustain (which is why overclocking is often done with undervolting). XSX usually consumes 30 W less than PS5, but in some cases it must be able to consume pretty much the same amount as PS5; we saw this in a Matrix demo analysis comparison. My point being, yields for the launch systems might not have been that different.
 