Digital Foundry Article Technical Discussion [2023]

It may become a larger trend in the future if we continue to move towards accelerators. But in this context, it’s an appropriate time to call it out. There are now three years of data points; what could have been a bunch of anomalies is now just the baseline. Every reason it could have been an outlier has had sufficient time to be ironed out as tools matured.

The deficiencies within the XSX architecture (whatever they are - split memory pool, ROPs, fixed clock speed, an imbalanced CU-to-SE ratio, or some combination of these) have put it in a position where it is unable to separate itself from the PS5. It’s a bit of a shame; they didn’t need much more to fix it. Remove the split pool, have a full complement of ROPs, and allow for variable clocks. I’m not sure what that would have done to the price point; the split mobo probably didn’t help either, imo.

I’m sure in most circumstances it helps to be the lead console, but they can’t rely on that considering their market position.

I think MS bet heavily on ray tracing being used more extensively and intensively than it has been. When talking about the Series X and its memory arrangement at Hot Chips, the MS engineer presenting specifically mentioned RT as being a big consumer of bandwidth. So clearly the Series X is able to make use of its high compute and bandwidth, but only under some circumstances.

If you look at Control's ray tracing photo mode, DF noted the Series X running 3-36% faster than the PS5, with an average difference of 16% in favour of the X. So roughly in line with the TF difference and approaching the bandwidth difference.
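For reference, a quick back-of-envelope against the paper specs (launch figures only, so treat this as a rough sanity check rather than measured performance):

```python
# Back-of-envelope using the commonly quoted launch specs (paper figures,
# not measured results): Series X 12.15 TF / 560 GB/s (10 GB fast pool),
# PS5 10.28 TF (at its 2.23 GHz cap) / 448 GB/s.
xsx_tf, ps5_tf = 12.15, 10.28
xsx_bw, ps5_bw = 560, 448

print(f"compute advantage:   {xsx_tf / ps5_tf - 1:.1%}")   # ~18.2%
print(f"bandwidth advantage: {xsx_bw / ps5_bw - 1:.1%}")   # 25.0%
```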


The thing is that RT is usually so pared back on consoles I doubt the Series X is able to really lean into this advantage. Take RT poster child Cyberpunk for example - even its eventual RT mode doesn't use reflections on consoles. And even on new gen systems Capcom prefers to use its awful SSRs rather than use RT reflections.
 
The problem with that theory is that RT performance on Series X has been completely lackluster... to the point where I wonder what went wrong. You get games like Dead Space where the PS5 is clearly superior with regards to the RT implementation. You have games that were supposed to get RT, such as Minecraft, that have been completely forgotten about.

The best console implementations of RT have essentially all been on PlayStation, which is something I never would have predicted going into this generation. If it's an API issue... Microsoft's DirectX team is failing their hardware.
 
The problem with that theory is that RT performance on Series X has been completely lackluster... to the point where I wonder what went wrong.

There's a big difference between RT performance and the performance of games that have RT.

Ratchet and Clank - a game with RT - runs better on PS5 using RT than it does on some RTX 2xxx cards. This does not mean that the PS5 is better at RT than these RTX cards.
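To make that distinction concrete, here's a toy frame-time split - the numbers are invented purely for illustration, not taken from any benchmark:

```python
# Hypothetical frame-time breakdown (made-up numbers): a GPU can be slower
# at the RT portion of the frame yet still deliver the faster frame overall
# if the rest of its frame (raster, shading, etc.) finishes quicker.
console  = {"raster_ms": 12.0, "rt_ms": 4.0}   # slower at RT
rtx_card = {"raster_ms": 14.5, "rt_ms": 3.0}   # faster at RT

for name, frame in (("console", console), ("rtx_card", rtx_card)):
    total = sum(frame.values())
    print(f"{name}: {total:.1f} ms -> {1000 / total:.0f} fps")
# console: 16.0 ms -> 62 fps  (faster overall despite the weaker RT portion)
# rtx_card: 17.5 ms -> 57 fps
```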

You get games like Dead Space where the PS5 is clearly superior with regards to the RT implementation.

In what way? I've just gone back over the DF Dead Space update video, and it seems that RT modes on both systems are capped the same, perform the same and are running at the same res.

The RT mode only features RTAO iirc.

 
It may become a larger trend in the future if we continue to move towards accelerators. But in this context, it’s an appropriate time to call it out. There are now three years of data points; what could have been a bunch of anomalies is now just the baseline. Every reason it could have been an outlier has had sufficient time to be ironed out as tools matured.

The deficiencies within the XSX architecture (whatever they are - split memory pool, ROPs, fixed clock speed, an imbalanced CU-to-SE ratio, or some combination of these) have put it in a position where it is unable to separate itself from the PS5. It’s a bit of a shame; they didn’t need much more to fix it. Remove the split pool, have a full complement of ROPs, and allow for variable clocks. I’m not sure what that would have done to the price point; the split mobo probably didn’t help either, imo.

I’m sure in most circumstances it helps to be the lead console, but they can’t rely on that considering their market position.
I am not sure why not having a variable clock speed is somehow a negative or a deficiency. For the most part, the deficiencies you list here are unconfirmed as causing issues for the XSX. The PS5 is the dominant and lead development platform for most titles so its eccentricities are catered for first and foremost and then ported to the XSX in a good enough manner. The dev environment of the GDK and having to develop for both the XSX and XSS also factor into all this.
 
You mean fixed clock speed as being a deficiency?

I guess it just comes down to not getting the most from the silicon.

I have no doubt that PS5 being the lead platform plays to its advantage, but by the second and third wave of games, developers should have had significant time to mature their optimizations on Xbox as well.
 
Absolutely, there's a reason that all silicon vendors have introduced boost clocks.
 
There's a big difference between RT performance and the performance of games that have RT.

Ratchet and Clank - a game with RT - runs better on PS5 using RT than it does on some RTX 2xxx cards. This does not mean that the PS5 is better at RT than these RTX cards.
No, there's just the performance of the game, period.

In your example, if the game on PS5 runs better than on those RTX 2000 cards, then the RT implementation had better LOOK better on those RTX cards to prove that it's actually better. The problem is... on Series X and PS5... it never looks better on Series X. They look the same and run very similarly.

In what way? I've just gone back over the DF Dead Space update video, and it seems that RT modes on both systems are capped the same, perform the same and are running at the same res.

The RT mode only features RTAO iirc.

Oh my bad I didn't mean Dead Space, I meant The Callisto Protocol.

[Screenshot: The Callisto Protocol ray tracing comparison after the recent update that fixed ray tracing]


But even then your response somewhat proves my point... In Dead Space it's not like Series X does RT better. Series X doesn't have a better visual implementation of RT than the PS5...

What good is being better at RT if your FPS is capped the same as the competition, RT resolution is the same, RT object distance is the same, detail levels are the same?

My point is that there's nothing that tangibly demonstrates that the Series X is better at RT than PS5... and that's disappointing, considering the expectations were that it would be better.
 
I’m sure in most circumstances it helps to be the lead console, but they can’t rely on that considering their market position.

Couldn't you argue then that the XSX's hardware advantage was needed simply to maintain parity?

I'm not sure exactly what the expectations were for the XSX even just going with the paper specs.

The gap is much smaller than that of the PS4 and Xbox One, so I don't think people should've been expecting a reversal of last gen's situation this gen.

In terms of actual practical differences, the gap is smaller than that of, say, the 3060 Ti and 3070, and I would bet that if you manually tuned PC game settings and did A/B blind testing, people would be hard pressed to spot the difference. And that is with fairly high-level "optimizing" and zero advantages going to the lower hardware spec.

As an aside, personally I feel that if MS and Sony rely on a common supplier for the main processing hardware, MS is going to have a hard time gaining any kind of hardware differentiation advantage, just due to market dynamics.
 
Absolutely, there's a reason that all silicon vendors have introduced boost clocks.
Because PC games have to deal with varying GPU clocks anyway; they can't depend on fixed clocks/performance across all PC GPUs. Consoles can provide certain guarantees - for example, the Xbox Series can guarantee 2.0 GB/s over a 250 ms window from the SSD, while PC titles have to deal with anything from 100 MB/s to 10+ GB/s depending on the drive's performance and type.
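A minimal sketch of why that kind of floor matters in practice - the figures just reuse the 2 GB/s number above plus an arbitrary PC example, and the helper functions are hypothetical, not any real engine API:

```python
# Minimal sketch (hypothetical, not a real engine API): with a guaranteed
# SSD floor, a per-frame streaming budget can be a constant; on PC it has
# to be measured per machine and padded with headroom.
GUARANTEED_BPS = 2.0e9      # console floor from the post above, bytes/sec
FRAME_TIME_S = 1 / 60       # targeting 60 fps

def console_stream_budget() -> float:
    # Worst case, this much data can be scheduled for streaming every frame.
    return GUARANTEED_BPS * FRAME_TIME_S            # ~33 MB/frame

def pc_stream_budget(measured_bps: float) -> float:
    # On PC the engine has to probe the drive at runtime and leave headroom,
    # because 100 MB/s HDDs and 10+ GB/s NVMe drives both exist.
    return 0.5 * measured_bps * FRAME_TIME_S

print(round(console_stream_budget() / 1e6, 1), "MB/frame, guaranteed")
print(round(pc_stream_budget(100e6) / 1e6, 2), "MB/frame on a 100 MB/s HDD")
```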
 
I am not sure why not having a variable clock speed is somehow a negative or a deficiency. For the most part, the deficiencies you list here are unconfirmed as causing issues for the XSX. The PS5 is the dominant and lead development platform for most titles so its eccentricities are catered for first and foremost and then ported to the XSX in a good enough manner.
Variable clock rates are how almost all hardware has worked for a decade. PCs. Macs. Phones. Tablets. PS5. The Xbox Series with a fixed clock is very much the technical outlier.
 
No, there's just the performance of the game, period.

In your example, if the game on PS5 runs better than on those RTX 2000 cards, then the RT implementation had better LOOK better on those RTX cards to prove that it's actually better. The problem is... on Series X and PS5... it never looks better on Series X. They look the same and run very similarly.


Oh my bad I didn't mean Dead Space, I meant The Callisto Protocol.

[Screenshot: The Callisto Protocol ray tracing comparison after the recent update that fixed ray tracing]


But even then your response somewhat proves my point... In Dead Space it's not like Series X does RT better. Series X doesn't have a better visual implementation of RT than the PS5...

What good is being better at RT if your FPS is capped the same as the competition, RT resolution is the same, RT object distance is the same, detail levels are the same?

My point is that there's nothing that tangibly demonstrates that the Series X is better at RT than PS5... and that's disappointing, considering the expectations were that it would be better.
I guess that example is just an implementation problem. Look closer at your screenshot: all the details of the reflection are there, they just get blurred. Every minor detail is still in that blurred reflection. It looks like on Xbox the material was simply set up to show a blurrier reflection.

But RT in general was marketed far too aggressively this generation. Even on PS5 the deficits are easily visible (e.g. low view distance, low-poly models, reflections at even lower resolution than is available on PC, reflections where reconstruction techniques were used, ...).
 
Definitely agree in that particular game's case. There was a strong focus on PS5, etc. I also mentioned that if it's an API issue, MS' teams are failing their hardware's theoretical capabilities. If developers are having a harder time implementing RT on Xbox, that's a problem, and IMO that factors into whether RT is better on Xbox or not.

We're just not seeing it bear fruit, for whatever reason. At least that's my opinion currently.
 
What's hurting XSX is we're at the point of diminishing returns.

17% more compute in a world of upscalers isn't really going to do a lot.

It's the same for RT, 17% more RT performance isn't going to make a noticeable difference in resolution or enable XSX to run additional RT effects over PS5.

Best case, XSX's extra grunt will just allow it to suffer fewer frame rate drops than PS5 in compute- and RT-heavy loads.
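To put that in pixel terms - rough math only, assuming cost scales linearly with pixel count, which it only approximately does:

```python
# If a ~17% compute advantage were spent purely on resolution, the per-axis
# gain is only sqrt(1.17) ~ 1.08, i.e. roughly 8% more pixels per axis.
advantage = 1.17
axis_scale = advantage ** 0.5                 # ~1.082

base = (2560, 1440)                           # e.g. a 1440p internal res
bumped = tuple(round(d * axis_scale) for d in base)
print(bumped)                                 # (2769, 1558) - a bump most
                                              # upscalers will wash out anyway
```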
 
AFAIK it's not a pirate PS5. The game is still a bought copy; they aren't running pirated software.

I think you're confusing 'hackers' with 'pirates'; this is the work of a hacker, someone who works around the tech, not a pirate, who is someone who works on getting games without paying for them. The latter is dependent on the former.
 
What's hurting XSX is we're at the point of diminishing returns.

17% more compute in a world of upscalers isn't really going to do a lot.

It's the same for RT, 17% more RT performance isn't going to make a noticeable difference in resolution or enable XSX to run additional RT effects over PS5.

Best case, XSX's extra grunt will just allow it to suffer fewer frame rate drops than PS5 in compute- and RT-heavy loads.
It’s also 17% more RT performance on top of the weakest RT hardware available on the market. Not a huge win.
 
Aren't pirates more talented than actual developers? Either that, or Sony aren't allowing some games to run at 60fps.


Sony should be ashamed that the pirate PS5 is better than the official PS5.
Some hack that a modder uses to make something work is not necessarily sufficient for a professional software release. There are usually much higher standards for that.

That said, it should at least demonstrate that solutions should be possible if developers are given the time/resources to get it done properly. And if there's any developer on Earth that should be able to command the allocation of such time/resources, it's Rockstar. It's beyond bizarre to see these quite significant releases in their history being given such clearly pitiful priority.
 