Digital Foundry Article Technical Discussion [2021]

How would *you* define the relative performance of the two consoles based on this video?
The relative performance for me is inconclusive, mainly because both are capped at vsync 60. I trust that IOI tuned the releases for the best possible settings on both consoles. There's not really much to debate; DF offers additional data points for those of us interested in the technical underpinnings of the consoles. But overall, without a rendering pipeline graph, nothing will ever be truly 'conclusive'.

The 44% differential far surpasses the ~18% difference in compute and ~25% in bandwidth. To many that may seem incredibly unfair or suboptimal, but this has been the case for consoles for a long time. We saw this exact thing play out mid-gen, when at times X1X rendered at 100% more resolution than 4Pro, while many titles shipped at 1080p on both PS4 and 4Pro. Games are not so regulated and perfect that every ounce of power is extracted perfectly by every team; not every game is designed the same, and different games will hit different bottlenecks on different setups. Without unlocked framerates and equal settings, there's just no way to tell, unfortunately. That is to be expected, but when I wrote about this phenomenon, I was attacked pretty badly. This could be one of those cases where, with no DRS and 4K being too unstable for IOI's standards, 1800p was going to be the best experience for players, considering how close it is to 4K for most people. Perhaps they would have chosen differently if PS5 already had a VRR solution in place; I don't know.
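As a quick sanity check on that 44% figure (assuming the outputs are native 3840×2160 and 3200×1800, which is my reading of the '4K' and '1800p' numbers above):

```python
# Pixel-count ratio between the two reported output resolutions.
xsx_pixels = 3840 * 2160  # 4K:    8,294,400 pixels
ps5_pixels = 3200 * 1800  # 1800p: 5,760,000 pixels
print(xsx_pixels / ps5_pixels)  # ~1.44, i.e. the 44% pixel differential
```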

The most important takeaway for me is that if there was a devkit issue, that problem is clearly gone. Honestly, I was going to give XSX 6 months of being out of place. But after Hitman 3, I suspect there might be 2 months remaining where we see a rogue title or two where XSX underperforms the 5700. Afterwards it should be clear. If it's still performing below a 5700 past 6 months, I would start to dive deeper and look for architectural issues.

PS5 is performing very well; it's performing where I thought it would, at least for these cross-generational titles.
 

Completely agree with you here. I'm looking forward to seeing further comparisons and how the software and hardware differences manifest over the course of the generation.
 
It could be true if, as in other games, PS5 were a little above the 5700 XT rather than slower. So it's hard to say devkit maturity on XSX has anything to do with it here; XSX is just a little faster than the 5700 XT, as it should be, and PS5 is slower, which is the main difference here.
 
Well, that's what we expect, right? I've written about how the lack of a variable clock could be hurting its fillrate, or the split memory pools. But if it's suddenly performing consistently better than a 5700 from this point onwards, it's doing a lot better than in those initial launch titles, where it was doing substantially worse. That could be for a variety of reasons, of course, but as long as it doesn't fall back into that below-5700 range, I think XSX is probably performing more or less where it should be relative to the equivalent PC GPUs. MS indicated that it's around a 2080 for Gears, and Alex has said it's around a 2070S. So when I see it dip below a 2070S, that's generally where I start to look for why.
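On the fillrate point, a minimal sketch of the paper numbers, assuming the commonly cited 64 ROPs on each GPU, a 1.825 GHz fixed clock on XSX, and up to 2.23 GHz variable on PS5:

```python
# Peak pixel fillrate = ROPs * clock. Despite the compute deficit,
# PS5's higher clock puts it ahead of XSX on paper fillrate.
rops = 64
xsx_fillrate = rops * 1.825  # ~116.8 Gpixels/s at XSX's fixed clock
ps5_fillrate = rops * 2.23   # ~142.7 Gpixels/s at PS5's peak clock
print(ps5_fillrate / xsx_fillrate)  # ~1.22: PS5 ~22% ahead on paper
```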

I still don't know what PS5 is doing and what impact its customizations have, and we won't know until we see Unreal Engine 5 games, imo. Next-gen titles are still far off.
 
I would say XSX was underperforming in many games and didn't behave like 12 TF RDNA 2, while PS5 did behave like 10 TF RDNA 2.
 
That's taking 'Stop the Count' to a new level...

There is a lot of "politics" going on in regards to consoles.

Take the "infamous" demo of UE5 done on a laptop.
It was pure PR (politics) that it got pulled off the internet (and encountered the Streisand effect), and in my view it was because it gave a solid data point that did not scream "next gen" compared to the PC.
(It did not show the DirectStorage tech as some special "wonder-sauce", and e.g. NVIDIA was quick to counter with RTX IO on the PC side.)

Funnily enough, that data point was just around where we see e.g. Digital Foundry place the performance in their latest videos.

Now some people are arguing where on the last-gen PC SKU scale these consoles sit, so I see some progress in a positive direction: there is no more denial about it being last-gen PC performance, now people are just arguing WHICH last-gen (midrange) PC SKU the consoles match up with... but with even more tenacity, and now attacking reviewers because their pet hardware "loses" in reviews.

I, for one, am for MORE data; the more data we have, the less anomalies and outliers matter.
 
There is a lot of "politics" going on in regards to consoles.

Take the "infamous" demo of UE5 done on a laptop.
It was pure PR (politics) that it got pulled of the internet (and encounted the Streisand-effect) and in my view it was because it gave a solid data point that did not scream "next gen" compared to the PC.
(It did not show the DirectStorage tech as some "special wonder-sauce" and eg. NVIDIA was quick to counter with RTX IO PC side)

Funnily the datapoint was just around where we see e.g. Digital Foundry place the performance in their latest video's.

Now some people are arguing where on the last-gen PC SKU scale these consoles are, so I see some progression to the positive as there is no more denial about it being least-gen PC performance, now people are just arguing WHAT last-gen (midrange) PC SKU the consoles match up with....but with even more tenacity and now attacking reviewers because their pet hardware "looses" in reviews.

I for one am for MORE data, the more data, the less anomalies and outliers matter.
...............what?
 

Yep, that's the Sega Saturn timescale if there ever was one xD. Although early optimizations with that system were...odd. IIRC VF Remix was already nearly done by that May but they still released the terribly buggy VF1 port anyway.

I don't think the 44% resolution difference is too odd considering XSX has 44% more CUs, but OTOH maybe it is kind of odd, considering it would suggest PS5's version is running well below 2.23 GHz... yet from what Cerny's mentioned, the only reason that would happen is if the power budget were being taxed and extra power couldn't be shifted from the CPU's budget to the GPU.
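To spell out that arithmetic, assuming the published figures of 52 CUs at a fixed 1.825 GHz for XSX and 36 CUs at up to 2.23 GHz for PS5:

```python
# CU count alone gives the 44% figure, but clocks narrow the paper gap.
print(52 / 36)                     # ~1.44: XSX has 44% more CUs
print((52 * 1.825) / (36 * 2.23))  # ~1.18: only ~18% more peak compute
# So a full 44% resolution gap would imply PS5 running well below 2.23 GHz.
```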

Thing is, I don't think Hitman 3 is all that taxing, but I also remember reading that a program doesn't need to actually be pushing technical limits (not to say Hitman 3 isn't visually impressive; I think it is, outside of maybe some of the character models) to use up most of a processor's resources. Again, I'm thinking of what Mark Cerny said about the menus of select PS4 games oddly kicking the fans into high gear despite being just simple menu screens.

So I don't know if this hints at an unoptimized PS5 port or not; maybe Hitman 3's code structure is just more suited to Series X's design (fixed clocks, wider GPU, etc.) than to PS5's variable frequency. So there's a chance the Series X version is more optimized, though again I don't want to give the impression that a game being optimized means it's "pushing" the hardware to its limits, because I don't think we'll see those games on either system until probably early 2023 or a bit later; it just depends on when devs (3P and 1P) decide to ditch 8th-gen for good.

Even then I think "pushing" these new consoles to their limit is more an issue of lack of budgets/time/dev workforces than it is needing to "learn" the hardware; these systems aren't complex exotic designs like consoles from the '90s or say 7th-gen units like PS3. They're mostly straightforward, but we can clearly tie the most ambitious and technically impressive games of 8th-gen to those with the largest budgets, largest teams, and/or most amount of time in dev (RDR2, TLOUII, HZD etc.).

WRT the devkit stuff, I think that's a good way to approach it. People like BRiT have posted GDK updates clearly showing that parts of it are, or at least until fairly recently were, still works in progress. I think even if Series X starts to take the lead in the majority of 3P game performance going forward, there will be outliers where PS5 has the advantage. But if that longer-term lead in 3P game performance for Series X doesn't start to regularly manifest within a few months (or certainly, heading into fall), then yeah, there will need to be some serious questions asked about architectural limitations having a negative impact; I think the two things people would gravitate to most are the fast/slow RAM pools and the lack of GPU cache scrubbers.

But that is a worst-case scenario, and from everything we know about both systems so far I don't expect it to ever actually manifest. I just hope that by this fall, the discussions on hardware specifications, trying to peg who's bottlenecked where, disagreeing on where the systems fall in relation to one another, etc., cease, especially if Sony continues to be coy about some of their own hardware specifics (basically allowing some rather ridiculous rumors to fly on their behalf, the majority of which probably aren't even true).
 
I would like to see the same difference in a game where PS5 behaves as it should, slightly above or similar to RX 5700 XT level, and not one where it's below that level ;)
 

I mean, yeah, that's essentially what we would expect when you look at the GPU TF numbers; it comes to about 18%, and a game like Hitman 3 shouldn't really be taxing PS5 to the point of dragging the clock down and widening that gap. And like I was saying before, it's also possible that the game is somewhat more optimized for Series X than for PS5, but that's just a guess.

That said, I think this is just another example of why relying on paper specs in and of themselves means nothing; we're pretty well aware of Sony's customizations and the aspects of their design that are wholly their own, but those aspects are, if possible, even more vague on Microsoft's side. Though honestly, I don't think we need to speculate to that degree to rationalize the performance we're seeing; either the game is less optimized on PS5, or its design structure benefits more from Series X's setup than from PS5's. Both are good avenues to speculate on.

And while it is an advantage for Series X in this game, that's one out of a decent number of prior 3P cross-gen native ports where PS5 was performing notably better by some given range. So is Hitman 3 an outlier, or the start of the shift? The simplest take is to go by the paper specs and assume it's the start of the shift, but it's also like iroboto was saying: if it actually is an outlier, and PS5 is still retaining notable leads going into, say, the fall, then we can ask whether it's a design problem with Microsoft's system, or a competency problem in getting the dev tools and surrounding environment up to par for all partners.
 
People just have to get used to it. In most games PS5 is performing similarly to or better than XSX, and Hitman 3 was just an outlier.
 

This special case doesn't fit into "most games", since this game was coded long before PS5/Series X existed. Absolutely zero conclusions can be drawn about the relative performance capabilities of the 9th-gen systems based on how they each run a remaster of a 7th-gen game using two very different backwards-compatibility systems.
Maybe BC performance on current cross-gen titles can tell us something, but not a game like this.
 