Digital Foundry Article Technical Discussion [2021]

XSX has ~17% more compute and 25% more memory bandwidth, so even if PS5 were maintaining maximum clocks at all times it would still be between ~17-25% slower (on paper), which is well within your 15-18% figure.
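For reference, here's the back-of-the-envelope maths behind those percentages (a quick sketch using the publicly quoted specs: 52 CUs at 1.825GHz vs 36 CUs at 2.23GHz, and 560GB/s vs 448GB/s):

```python
# Back-of-the-envelope check of the compute/bandwidth gap,
# using the publicly quoted specs for both consoles.

def teraflops(cus: int, clock_ghz: float) -> float:
    # FP32 TFLOPS = CUs * 64 shader lanes * 2 ops/cycle (FMA) * clock (GHz) / 1000
    return cus * 64 * 2 * clock_ghz / 1000

xsx_tf = teraflops(52, 1.825)  # ~12.15 TF
ps5_tf = teraflops(36, 2.23)   # ~10.28 TF

print(f"Compute gap:         {xsx_tf / ps5_tf - 1:.1%}")  # ~18%
print(f"Bandwidth gap:       {560 / 448 - 1:.1%}")        # 25% (fast pool only)
print(f"PS5 clock advantage: {2.23 / 1.825 - 1:.1%}")     # ~22%
```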

The fact that PS5 is competing as well as it is in multiplats goes some way to proving it's not having issues maintaining and running at its maximum allowed clocks all the time.

What also needs to be kept in mind is that there are certain aspects of the GPU pipeline that run ~17% faster on PS5 due to these higher clocks, so it's not as simple as saying XSX's GPU is faster across the board than PS5's.

So as it stands there's nothing at all to show that variable clocks have no place in a console, and you could even argue that currently PS5 is showing they can work very well.

What would be interesting to know is the lowest clock speed PS5's GPU is allowed to drop to during extreme loads.

With that I was referring to MS's statement that they offer a fixed level of performance to the devs, as opposed to Sony's variable performance levels. On the other hand, Sony claims it's all automatic and developers don't have to tinker with the GPU clocking lower when the CPU needs more juice (or vice versa). The truth is probably somewhere in the middle as far as MS's and Sony's claims go. A fixed level of performance is quite fitting for a console, but on the other hand so is Sony's solution, since saving power is crucial in a console where hardware, temps, load etc. are constrained.
Still, I think that for multiplat games designed for PC/Xbox/PS and more platforms: what if a game is designed to tax the CPU (CPU-intensive things like 120fps) while also saturating the GPU? On the XSX there wouldn't be a need to reduce CPU load to avoid eating into the GPU, but the PS5 version would either need somewhat reduced settings in a certain scene, or the resolution/framerate suffers. I'd guess that developers who don't tinker here would end up with slight performance decreases on the PS5 in CPU/GPU-intensive scenes.
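To make the trade-off concrete, here's a purely conceptual sketch of how a shared power budget (SmartShift-style) might behave. This is not Sony's actual algorithm; the budget, the demand figures and the cubic power model are all illustrative assumptions:

```python
# Conceptual sketch of a shared-power-budget scheme (SmartShift-style).
# NOT Sony's actual algorithm: the real budgets, curves and clock floors
# are not public, so every number here is an illustrative assumption.

POWER_BUDGET_W = 200.0  # hypothetical total SoC power budget

def gpu_clock_ghz(cpu_demand_w: float, gpu_demand_w: float,
                  gpu_max_ghz: float = 2.23) -> float:
    """Fit both workloads into the budget, shaving the GPU clock if needed."""
    total = cpu_demand_w + gpu_demand_w
    if total <= POWER_BUDGET_W:
        return gpu_max_ghz                      # both run at full clocks
    # Power rises roughly with clock cubed (f * V^2, with V tracking f),
    # so a small clock drop frees a disproportionate amount of power.
    overshoot = total / POWER_BUDGET_W
    return gpu_max_ghz / overshoot ** (1 / 3)

# A CPU-heavy 120fps scene that also saturates the GPU:
print(f"{gpu_clock_ghz(cpu_demand_w=70, gpu_demand_w=150):.2f} GHz")  # ~2.16
```

On XSX there is no such loop: the CPU and GPU clocks are fixed regardless of combined load.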

Though, it's still too early to determine how well downclocking from base clocks (and trading CPU for GPU load and vice versa) pans out. It's almost 100% certain to work very well for games designed primarily for the system, since studios optimise the game for it and you won't ever know what was sacrificed. But for multiplat games that totally tax the system? That remains to be seen. There are games that perform just as well on the PS5 as on the XSX, or even where the PS5 has a slight advantage. On the other hand, when the XSX has a lead, it's more substantial, sometimes in the 20% range, in rarer cases 40%.
MS's box does have a close to 20% more capable GPU as well as much more bandwidth to play with (very important for GPUs even these days). Their system doesn't trade load between CPU and GPU either; the CPU always clocks at 3.6GHz and the GPU at its 12.2TF metric. There should never be situations where either clocks down to lower performance levels.

Regarding the 'PS5 competing so well': we're too early in the generation, where basically everything is cross-generation or based on last-generation rendering technologies. To start, ray tracing seems to scale better with a higher/wider GPU TF count. Also, XSX is equipped with RDNA2 features the PS5 doesn't have, like VRS, which has quite the potential if dev tweets are to be believed. MS's system also shares the DX12 Ultimate API, which could give it an edge since it's shared with other platforms, whereas the PS5, while closer to the metal, is more unique, like Tempest not getting as much attention in multiplat games as Atmos does (CP2077 is one example).
 
Funny that the patch on XSX is about the same perf as FPS Boost. But hey, at least they didn't charge 10 bucks for updating one .ini file.
Can anybody confirm whether, now that the game is X|S Optimized, it forces you to put the game on internal storage? Being able to play from any external HDD or SSD that I might hook up to the system is much better for last-gen games, in my opinion. If you are now forced to use internal storage, I think it would have been better to leave the game as it was. I'm really not seeing the improvement here for the X. Congrats to people who have the S, though.
 
Can anybody confirm whether, now that the game is X|S Optimized, it forces you to put the game on internal storage? Being able to play from any external HDD or SSD that I might hook up to the system is much better for last-gen games, in my opinion. If you are now forced to use internal storage, I think it would have been better to leave the game as it was. I'm really not seeing the improvement here for the X. Congrats to people who have the S, though.
Only native ports need to be installed on the SSD AFAIK, and this update "enhances" FPS via BC.
Crysis 2 and 3 are also not native PS5/XBX versions, as they mention in the video.
 
But they did not add a whole new island and quests either.

Yeah, but the price for extra content is higher than 10 bucks; 10 bucks is only for the specific PS5 upgrades, which are what, slightly higher res and haptic feedback?
Anyway, I didn't want to criticise the GOT upgrade pricing, but I feel somehow disappointed that the official "patch" is no better than the FPS Boost hack.
Very little actual work was involved in this patch, and as little creative thinking; on PS5 at least they implemented CB, while on XSX and XSS they didn't do anything.
PS5 even with CB suffers from occasional minor frame dips, so who decided to go with native res on XSX? They didn't do any work on the Series side and they didn't even put any thinking into this patch.
Disappointing.
 
Yeah, but the price for extra content is higher than 10 bucks; 10 bucks is only for the specific PS5 upgrades, which are what, slightly higher res and haptic feedback?
Anyway, I didn't want to criticise the GOT upgrade pricing, but I feel somehow disappointed that the official "patch" is no better than the FPS Boost hack.
Very little actual work was involved in this patch, and as little creative thinking; on PS5 at least they implemented CB, while on XSX and XSS they didn't do anything.
PS5 even with CB suffers from occasional minor frame dips, so who decided to go with native res on XSX? They didn't do any work on the Series side and they didn't even put any thinking into this patch.
Disappointing.
It turns out they didn't implement CB for PS5; as PS4 Pro already used CB 1872p, they just jumped the res to 2160p CB.
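For scale, here's the shaded-pixel jump that implies (assuming a 16:9 3328x1872 target on Pro, and standard checkerboarding, i.e. roughly half the output pixels shaded per frame):

```python
# Rough shaded-pixel comparison for the checkerboard (CB) resolution bump.
# Assumes standard checkerboarding: ~half the output pixels shaded per frame.

def cb_megapixels(width: int, height: int) -> float:
    return width * height / 2 / 1e6

pro = cb_megapixels(3328, 1872)  # CB 1872p on PS4 Pro
ps5 = cb_megapixels(3840, 2160)  # CB 2160p on PS5

print(f"PS4 Pro: {pro:.1f} Mpix/frame, PS5: {ps5:.1f} Mpix/frame "
      f"({ps5 / pro - 1:.0%} more)")  # ~33% more
```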
 
Yeah, but the price for extra content is higher than 10 bucks; 10 bucks is only for the specific PS5 upgrades, which are what, slightly higher res and haptic feedback?

I guess it's to keep in line with standard PS5 game pricing; otherwise everybody would buy the PS4 version instead if the PS5 upgrade were free.
But I agree that next-gen title pricing is too expensive and not justified, at least for now.
 

Crysis 3 on the Switch. Hopefully the materials are revised like in Crysis 2. According to the various statements in the video, Crysis 3 will unfortunately have fewer changes; probably the materials are as in the PC version from 2013. Visually, I think Crysis 2 will go more in the direction of Ryse thanks to the PBR.

Crysis 3 looks very good on the Switch. DF stated in the past that TAA on the Switch often makes the image too soft and that anti-aliasing methods such as FXAA would suit the Switch better. In my opinion the image of Crysis 3 is still good even with TAA, and the DOOM footage shown in the video looks much blurrier. FXAA would also flicker a lot with that amount of vegetation and shiny materials. Since Alien Isolation uses TAA on the Switch, it looks better there than on the PlayStation 4, in my view.

For Crysis 3 I always used a temporal AA called TXAA. It uses MSAA as well, which makes it much more resource-intensive.



Players who judge the degree of optimisation only by the highest settings are stupid anyway and would do better to say nothing.

They should make separate graphics settings for more intelligent players, but those should still be found in the game menu.
So impressive. I am subscribed to Digital Foundry although I don't usually watch console-based videos (maybe except those which feature the Switch or similar hardware), but this one got me very interested. It looks better than on the PS3 and has a much better AA solution, imho.

I always wished this engine got more attention. At a time when 90%+ of games were Unreal-based and characters and graphics looked like Gears of War, Crytek's engine was a breath of fresh air, and I loved how it was suited for everything, from corridor games to sandbox games.

Their engine was, in my opinion, one of the best ever made, and until id's new engine for Doom came out, it was my favourite.
 
Yeah, but the price for extra content is higher than 10 bucks; 10 bucks is only for the specific PS5 upgrades, which are what, slightly higher res and haptic feedback?
Anyway, I didn't want to criticise the GOT upgrade pricing, but I feel somehow disappointed that the official "patch" is no better than the FPS Boost hack.
Very little actual work was involved in this patch, and as little creative thinking; on PS5 at least they implemented CB, while on XSX and XSS they didn't do anything.
PS5 even with CB suffers from occasional minor frame dips, so who decided to go with native res on XSX? They didn't do any work on the Series side and they didn't even put any thinking into this patch.
Disappointing.
For better or worse at least they gave you a way to “upgrade” your existing copy, unlike Control where you have to buy a new copy (the Ultimate Edition). It’s still a fuck-you to early adopters, but a minor one compared to Control.
 
Only native ports need to be installed on the SSD AFAIK, and this update "enhances" FPS via BC.
Crysis 2 and 3 are also not native PS5/XBX versions, as they mention in the video.
Sorry, I should have mentioned that I am talking about Shadow of the Tomb Raider, not Crysis. According to the Microsoft Store, Shadow of the Tomb Raider is now an X|S Optimized title.
 
With that I was referring to MS's statement that they offer a fixed level of performance to the devs, as opposed to Sony's variable performance levels.

But we don't know how variable PS5's clocks are.

On the other hand, Sony claims it's all automatic and developers don't have to tinker with the GPU clocking lower when the CPU needs more juice (or vice versa). The truth is probably somewhere in the middle as far as MS's and Sony's claims go. A fixed level of performance is quite fitting for a console, but on the other hand so is Sony's solution, since saving power is crucial in a console where hardware, temps, load etc. are constrained.

See above reply

What if a game is designed to tax the CPU (CPU-intensive things like 120fps) while also saturating the GPU? On the XSX there wouldn't be a need to reduce CPU load to avoid eating into the GPU, but the PS5 version would either need somewhat reduced settings in a certain scene, or the resolution/framerate suffers. I'd guess that developers who don't tinker here would end up with slight performance decreases on the PS5 in CPU/GPU-intensive scenes.

You game on PC, yes? Load up MSI Afterburner and tell me what game and settings max out your PC's CPU and GPU at the same time.

Though, it's still too early to determine how well downclocking from base clocks (and trading CPU for GPU load and vice versa) pans out.

Where is this even coming from? Do you have some secret source of information about PS5's clocks, how they function and what the lower bounds are?

But for multiplat games that totally tax the system? That remains to be seen.

Games never max out a system; this is easy to see if you load MSI Afterburner on your PC with v-sync turned on.
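If anyone wants to check this themselves without staring at an overlay, a quick logger along these lines does the job (assumes an Nvidia GPU with nvidia-smi on the PATH and the psutil package installed):

```python
# Quick-and-dirty CPU/GPU utilization logger to test the claim above.
# Assumes a single Nvidia GPU, nvidia-smi on the PATH, and `pip install psutil`.
import subprocess
import time

import psutil

for _ in range(60):  # sample once a second for a minute while a game runs
    cpu = psutil.cpu_percent(interval=None)  # % since the previous call
    gpu = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    ).strip()
    print(f"CPU {cpu:5.1f}% | GPU {gpu}%")
    time.sleep(1)
```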

There are games that perform just as well on the PS5 as on the XSX, or even where the PS5 has a slight advantage. On the other hand, when the XSX has a lead, it's more substantial, sometimes in the 20% range, in rarer cases 40%.

The games that show a larger than normal gap on XSX are either BC games, games that use One X and PS4 Pro as the base and thus have the same limits, or games where PS5 for some reason is capped at a resolution lower than it should be.

MS's box does have a close to 20% more capable GPU as well as much more bandwidth to play with (very important for GPUs even these days)

On paper... I also wouldn't call it 'much more bandwidth', as it's a complex set-up compared to PS5's.

It's also likely going to limit XSX to 10GB of 'VRAM' for the duration of its life if you want full-speed memory for the GPU, an issue PS5 won't have.
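For context, the published figures fall straight out of bus width times the 14Gbps GDDR6 pin speed, which is also why the XSX's remaining 6GB is slower:

```python
# Deriving the published bandwidth figures from bus width and pin speed.
# Both consoles use 14 Gbps GDDR6; XSX splits its 16 GB across two pools.

def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float = 14.0) -> float:
    return bus_bits * gbps_per_pin / 8  # bits/s -> bytes/s

print(f"PS5, all 16GB (256-bit):      {bandwidth_gb_s(256):.0f} GB/s")  # 448
print(f"XSX fast 10GB pool (320-bit): {bandwidth_gb_s(320):.0f} GB/s")  # 560
print(f"XSX slow 6GB pool (192-bit):  {bandwidth_gb_s(192):.0f} GB/s")  # 336
```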

Their system doesn't trade load between CPU and GPU either; the CPU always clocks at 3.6GHz and the GPU at its 12.2TF metric. There should never be situations where either clocks down to lower performance levels.

Nearly 12 months since they released and we've not seen a single claim from anyone credible that PS5 is downclocking.

Regarding the 'PS5 competing so well': we're too early in the generation, where basically everything is cross-generation or based on last-generation rendering technologies.

Or it could just be that PS5's 17% clock speed advantage is affording it wins in certain situations that benefit from the rendering pipeline being faster on PS5's GPU.

To start, ray tracing seems to scale better with a higher/wider GPU TF count.

Heavy RT scenes such as the corridor of doom in The Medium show virtually no difference between the two machines.

Also, XSX is equipped with RDNA2 features the PS5 doesn't have, like VRS, which has quite the potential if dev tweets are to be believed.

Won't really be used all that much; hardware features that have limited support and only work on certain platforms rarely are.

I expect developers to adopt a single software solution that works on all platforms.

MS's system also shares the DX12 Ultimate API, which could give it an edge since it's shared with other platforms, whereas the PS5

DX9, 10, 11 and 12 being shared between platforms didn't help Xbox One S and X, and DX12U will be no different.

I know from running custom BIOSes on Nvidia GPUs that disabling GPU Boost and running straight clocks would produce slower performance in benchmarks at the same voltage and power level than using boost would.

If Microsoft had adopted a boost scheme similar to Sony's, they may have been able to ship XSX with a 2GHz GPU clock.
 
But we don't know how variable PS5's clocks are.

That's exactly what I meant: you don't know. With my 2080 Ti, I know exactly what I'm getting at the lowest, as well as its max clocks if temps allow, but the base performance of 13.5TF stands.

You game on PC, yes? Load up MSI Afterburner and tell me what game and settings max out your PC's CPU and GPU at the same time.

For the first point: both don't necessarily have to be at 100% usage at the same time on the PS5 for things to be downclocked. If it basically never happens, then why even bother implementing an advanced power-delivery system? Hell, they could have just clocked the GPU some MHz lower.

Where is this even coming from? Do you have some secret source of information about PS5's clocks, how they function and what the lower bounds are?

Why would I even need a source for that quoted line of text? It sure can downclock either the CPU or GPU; Sony themselves told the world. Further, like I said in that quoted text, we have to see how well variable down-boosting pans out, as we have never had that kind of system in a home console. It's too early to tell with everything being cross-gen and no true next-gen games to compare to the XSX, which doesn't have downclocking systems.

Games never max out a system; this is easy to see if you load MSI Afterburner on your PC with v-sync turned on.

Not really true. Someone else might want to explain it to you; I don't care to, at least.

The games that show a larger than normal gap on XSX are either BC games, games that use One X and PS4 Pro as the base and thus have the same limits, or games where PS5 for some reason is capped at a resolution lower than it should be.

And then I can ask for a source as well, explaining that those are the reasons for the performance differences in those games.

On paper... I also wouldn't call it 'much more bandwidth', as it's a complex set-up compared to PS5's.

It does have more bandwidth though, and it can be argued whether the XSX truly has that much more complex a memory setup.

It's also likely going to limit XSX to 10GB of 'VRAM' for the duration of its life if you want full-speed memory for the GPU, an issue PS5 won't have.

Again, something we will have to wait and see if there's any truth to. I seriously doubt the PS5 is going to use more than 10GB for video RAM for most next-generation games (if even that).

Nearly 12 months since they released and we've not seen a single claim from anyone credible that PS5 is downclocking.

Except for Sony and Cerny themselves, then. We just don't know how much and how often. One thing is certain: as the system gets pushed more and more, with ray tracing and high framerates once true next-generation multiplat titles hit, it's probably going to downclock lower and more often as time goes on.

Or it could just be that PS5's 17% clock speed advantage is affording it wins in certain situations that benefit from the rendering pipeline being faster on PS5's GPU.

That would be true if both the PS5 and XSX were sporting 10TF GPUs with the same memory bandwidth (and BW-saving features). Usually the higher-clocked GPU would edge out then, but that's not the case. The XSX does sport a more powerful GPU, higher GDDR bandwidth and more RDNA2 features packed in. It's true the PS5's GPU pipelines are filled faster, but once the XSX's GPU is saturated across its 52 CUs, it would theoretically have the edge. Ray-tracing-intensive titles like Control are one example.
Anyway, it's the PS5 competing in some games, not boasting 15 to 40% higher framerates like the XSX has achieved.

Heavy RT scenes such as the corridor of doom in The Medium show virtually no difference between the two machines.

But then again, in other RT scenes the reverse is true; watch the DF video again. The corridor of doom is suspected to have something else going on on all platforms.

Won't really be used all that much; hardware features that have limited support and only work on certain platforms rarely are.

VRS is already being put to use, and since the XSX shares DX12U and its hardware features with the PC/other platforms, it's bound to be used more than the exotic solutions Sony decided to go with.

DX9, 10, 11 and 12 being shared between platforms didn't help Xbox One S and X, and DX12U will be no different.

DX9 and DX12 Ultimate are so different there's not even a comparison there, nor between the eras. Xbox and PC have never been as aligned as they are now.

I know from running custom BIOSes on Nvidia GPUs

The comparison between console and PC hardware is faulty to begin with in terms of BIOS, software and hardware integration. Also, PC GPUs have a base clock and can clock up from there. That's different to what the XSX does, and VERY different to what the PS5 does. The XSX is like a PC GPU with a base clock but no boost clock. The PS5 has a base clock it can clock down from. Different beasts.

If Microsoft had adopted a boost scheme similar to Sony's, they may have been able to ship XSX with a 2GHz GPU clock.

I'm glad they didn't.
 
That's exactly what I meant: you don't know. With my 2080 Ti, I know exactly what I'm getting at the lowest, as well as its max clocks if temps allow, but the base performance of 13.5TF stands.

I'm not the one trying to argue PS5's boost clocks being an issue, you are.

For the first point: both don't necessarily have to be at 100% usage at the same time on the PS5 for things to be downclocked.

I would very much like to see your source of this claim.

Why would I even need a source for that quoted line of text? It sure can downclock either the CPU or GPU; Sony themselves told the world

But you're speaking like its clocks are already an issue.

Not really true. Someone else might want to explain it to you; I don't care to, at least.

It is true, I've gamed on PC for years and never, ever seen a game or even a scene in a game that's pegged both my CPU and GPU to breaking point at the same time.

And then I can ask for a source as well, explaining that those are the reasons for the performance differences in those games.

It's called common sense. Hitman has a 40% higher resolution on XSX? PS5 in the garden scene can have a 42% higher frame rate.

There's clearly headroom in PS5 to render at a higher resolution, but for some reason it's been limited, which isn't a bad thing as the frame rate benefits.

If you bought a PC GPU that was 20% slower than the fastest GPU available and when you tested it, it was 40% slower, you would question it.
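For what it's worth, the pixel maths behind that Hitman figure (assuming the reported 2160p on XSX vs 1800p on PS5) lands in the same ballpark:

```python
# Pixel-count ratio behind the "~40% higher resolution" claim.
# Assumes the reported Hitman 3 outputs: 2160p (XSX) vs 1800p (PS5).

xsx_pixels = 3840 * 2160
ps5_pixels = 3200 * 1800

print(f"XSX pushes {xsx_pixels / ps5_pixels - 1:.0%} more pixels per frame")  # ~44%
```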

It does have more bandwidth though, and it can be argued whether the XSX truly has that much more complex a memory setup.

To 10GB of memory.

Again, something we will have to wait and see if there's any truth to. I seriously doubt the PS5 is going to use more than 10GB for video RAM for most next-generation games (if even that).

Why not? If the option's there and it works for the game, why wouldn't developers use more than 10GB?

It's not as if they incur a drop in bandwidth like they would on XSX.

This might actually result in XSX having to drop settings later in its life and not PS5, as we saw games where PS4 Pro was limited by a lack of physical RAM and not a lack of GPU power.

Except for Sony and Cerny themselves, then.

Sony said it could, not that it is.

He also said it would only downclock a few percent, as power scaling isn't linear, so by Mark Cerny's own words any talk about boost clocks being an issue is pointless, as you'd never notice a drop of 'a few percent'.
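That claim checks out on the usual approximation that dynamic power scales with frequency times voltage squared, with voltage tracking frequency. A rough illustration (the cubic model is a textbook approximation, not Sony's published curve):

```python
# Why a small clock drop frees a lot of power: dynamic power ~ f * V^2,
# and V roughly tracks f, so P ~ f^3. A textbook approximation, not
# Sony's actual power curve.

for drop in (0.02, 0.05, 0.10):
    power_saved = 1 - (1 - drop) ** 3
    print(f"{drop:.0%} lower clock -> ~{power_saved:.0%} less power")
# 2% -> ~6%, 5% -> ~14%, 10% -> ~27%
```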

That would be true if both the PS5 and XSX were sporting 10TF GPUs with the same memory bandwidth (and BW-saving features).

No it's not. If you're, for example, geometry-limited, then no amount of extra bandwidth or compute will change that.

The XSX does sport a more powerful GPU

No, it's the faster GPU in certain areas; PS5's GPU is faster in others.

It's true the PS5's GPU pipelines are filled faster, but once the XSX's GPU is saturated across its 52 CUs, it would theoretically have the edge.

As long as it's not limited by a stage that's faster on PS5.

Anyway, it's the PS5 competing in some games, not boasting 15 to 40% higher framerates like the XSX has achieved.

As above, it's 42% higher in Hitman and that high or higher in 120fps modes like DMC5 and COD.

Crash also performs a good deal better on PS5, as do a few other games.

But then again, in other RT scenes the reverse is true; watch the DF video again. The corridor of doom is suspected to have something else going on on all platforms.

In real-time gameplay situations the difference will be next to non-existent, as already shown during real gameplay load tests.

VRS is already being put to use, and since the XSX shares DX12U and its hardware features with the PC/other platforms, it's bound to be used more than the exotic solutions Sony decided to go with.

What exotic solutions did Sony go with?

DX9 and DX12 Ultimate are so different there's not even a comparison there, nor between the eras. Xbox and PC have never been as aligned as they are now.

No, they're not different.

The comparison between console and PC hardware is faulty to begin with

And yet I'm sure I've seen you make such comparisons when comparing the performance of a PC GPU to a console one in game benchmarks.
 
I'm not the one trying to argue PS5's boost clocks being an issue, you are.

I'm saying we have yet to see how they pan out; that's different from being an issue.

I would very much like to see your source of this claim.

Where's yours?

But you're speaking like its clocks are already an issue.

No, I'm saying the PS5 has the functionality to downclock when loads need to be balanced.

It is true, I've gamed on PC for years and never, ever seen a game or even a scene in a game that's pegged both my CPU and GPU to breaking point at the same time.

GPU load being at max or not won't always tell the whole story. It's something you as a PlayStation gamer never had to worry about and never will.

There's clearly headroom in PS5

And so there is for the XSX.

If you bought a PC GPU that was 20% slower than the fastest GPU available and when you tested it, it was 40% slower, you would question it.

What the hell has that to do with our discussion? I bought my 2080 Ti, it said 13.5TF 'on the box', and that's what I'm getting, more if thermals, loads etc. align.

To 10GB of memory.

XSX does indeed have a much higher bandwidth allocation for its 10GB of GDDR6. I doubt such high B/W is needed for OS functions and other game logic. Anyway, wasn't the SSD more important according to you last year? Memory B/W is still a worthy discussion almost a year later.

Why not? If the option's there and it works for the game, why wouldn't developers use more than 10GB?

It's not as if they incur a drop in bandwidth like they would on XSX.

This might actually result in XSX having to drop settings later in its life and not PS5, as we saw games where PS4 Pro was limited by a lack of physical RAM and not a lack of GPU power.

Because, well, last generation almost half of the total RAM went to something other than framebuffer allocation; 3.5GB for VRAM was about the norm for high-end AAA titles (source: GG).
If 10GB is being used for VRAM alone this gen for a majority of games, I'd think that's quite a lot.

Sony said it could, not that it is.

So, they implemented a feature that will never see any action? That's called over-engineering, if that were the case. Imagine: all the resources, hardware, software/firmware, telling the world, even marketing it, and then the darn thing never sees any variable clocks. Can't imagine it.
The high clocks (for a console), the power draw and the huge size of the console itself are a good indication, though. As well as Oberon test kits running at 9.2TF, and Sony sources stating that 3GHz and 2GHz respectively wouldn't even have been achievable without the downclocking system.

No it's not. If you're, for example, geometry-limited, then no amount of extra bandwidth or compute will change that.

I'd hazard a wild guess that going forward, geometry-limited games won't be an issue, seeing what's going on.

No, it's the faster GPU in certain areas; PS5's GPU is faster in others.

Wrong. The XSX has the more capable GPU; it was within MS's rights to make that claim, as it's true in every sense. It's 12+ TF as opposed to 10+ TF. You know, there are low/mid-range PC GPUs clocked higher than the high-end ones; they're not exactly more capable for it.

As long as it's not limited by a stage that's faster on PS5.

It all depends on whether games are going to stay stuck in the last generation. I don't think that's going to happen, though.

No idea what happened to the rest of your post, but anyway: the Tempest audio CU is quite exotic. The Xbox went with Atmos and it was utilised in CP2077, while the PS5 version went without Tempest support. The Xbox has the more commonly available HW features, for example.
DX9 and DX12U are no different? lol.

You were tired of PC/console comparisons already last year, yet now you're back all over the map doing it yourself.
 
Overall, the only sensible thing with the PS5 clocks is to wait and see; we'll see evidence of whether it was good or bad when high-end, next-gen-only multiplat games start coming out. Until then, why worry? We have every reason to believe both consoles are well designed.

But,

It's called common sense. Hitman has a 40% higher resolution on XSX? PS5 in the garden scene can have a 42% higher frame rate.

There's clearly headroom in PS5 to render at a higher resolution, but for some reason it's been limited, which isn't a bad thing as the frame rate benefits.

If you bought a PC GPU that was 20% slower than the fastest GPU available and when you tested it, it was 40% slower, you would question it.

You guys have got to put the Hitman case to rest.

"For some reason it's been limited"
The devs are the only people with truly concrete performance info, and they decided it needed a much lower res, on their game, on their engine. It was limited because that was the resolution at which it could hit their performance target.

Regarding the overdraw hell scene, it's not unusual for one piece of hardware (any hardware) to significantly underperform in a single exotic edge case.
 
To start, ray tracing seems to scale better with a higher/wider GPU TF count.
That's not what practice shows. It scales the same as the general compute/TFLOPS advantage of the XSX. Metro Exodus EE shows a very similar difference to any other compute-heavy native next-gen game, and there are even examples like Doom Eternal where there is a smaller difference between the consoles in RT mode than without RT.
 
I don't think people choosing the XSS are that interested in 120fps modes; devs certainly tried and were not satisfied with the performance.
 