This comes across as purposefully ignorant of data interpretability, and it is entirely unnecessary to attack sites that are putting significant effort into reducing variability in their measurements for the sake of providing data to their viewers.

Your argument, as I read it, is this: the Series X launched 3 months ago, and when Digital Foundry couldn't observe its predicted 15-20% performance advantage over the PS5 in multiplatform titles, we started hearing about a future devkit that will unlock the full power of the console; and aside from Hitman 3, which for all we know is the exception to the rule (and where the two consoles can't really be compared, since they're running at different resolutions above 60FPS), we have yet to see anything running better on the Series X in any measurably substantial way.
There is no magic to either console; these are deterministic machines whose hardware is fixed and will never change. Title performance will form a distribution over time, and as that distribution fills in, each console can be placed on a regression line alongside every GPU that has come before it. No one, including Digital Foundry or any other journalist, is making a controversial claim if their claims simply follow the trend of that regression line.
It has never been controversial to claim that, within a given family of GPU architecture, more compute units and more bandwidth lead to greater performance. If that is not controversial, one should ask why you are making such a big deal of this claim. It is _expected_ that the XSX outperforms the PS5 by roughly this margin, because that is what the regression line predicts.
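For what it's worth, the margin itself isn't mysterious. Here is a back-of-the-envelope sketch in Python using only the publicly stated specs (not any measured data), which lands right in that 15-20% window on compute:

```python
# Rough source of the oft-quoted 15-20% figure, derived from public specs only.
# FP32 TFLOPS = CUs * 64 lanes * 2 ops per clock (FMA) * clock speed
xsx_tflops = 52 * 64 * 2 * 1.825e9 / 1e12   # ~12.15 TF
ps5_tflops = 36 * 64 * 2 * 2.23e9 / 1e12    # ~10.28 TF at the PS5's clock cap

xsx_bw_gbs = 560.0   # Series X fast 10 GB memory pool
ps5_bw_gbs = 448.0   # PS5 unified memory pool

print(f"compute advantage:   {xsx_tflops / ps5_tflops - 1:.1%}")   # ~18%
print(f"bandwidth advantage: {xsx_bw_gbs / ps5_bw_gbs - 1:.1%}")   # ~25%
```

That naive ratio is only the prior; the regression line over real titles is what tells you whether the consoles actually realize it.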
When we try to determine why the XSX is not performing better than the PS5, we do not look at the PS5; we look at the regression line. Is the XSX performing where it should be relative to the data points around it? Anyone can look at the surrounding data points and see that the answer is no: in a great many of the initial launch titles it performs worse than an RX 5700, and that has nothing to do with any supposed magic in the PS5. Only recently, with Hitman 3, do we see it performing better than the PC GPUs we expect it to beat.
Is the PS5 performing better than where it lies on the regression line? With settings properly aligned to the PC, it would appear the answer is no. With Hitman 3 it is performing in line with expectations, right where it belongs among the data points around it. Does that mean the PS5 can't do better? No. We expect data points to fall above the line, below it, and directly on it.
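To make "where it lies on the line" concrete, here is a minimal sketch of the check being described. The TFLOPS figures are approximate public specs; the FPS numbers are placeholders I've made up purely to show the mechanics, so treat the printed residuals as illustrative, not as real results.

```python
# Fit a performance-vs-compute trend over same-family PC GPUs, then ask
# whether each console lands above, below, or on that line.
# FPS values are PLACEHOLDERS, not real benchmark data.
import numpy as np

pc_gpus = {                 # name: (approx FP32 TFLOPS, placeholder avg FPS)
    "RX 5500 XT": (5.2, 41.0),
    "RX 5600 XT": (7.2, 55.0),
    "RX 5700":    (8.0, 60.0),
    "RX 5700 XT": (9.8, 72.0),
}

tflops = np.array([t for t, _ in pc_gpus.values()])
fps    = np.array([f for _, f in pc_gpus.values()])
slope, intercept = np.polyfit(tflops, fps, 1)   # ordinary least-squares line

def check(name, t, measured):
    expected = slope * t + intercept
    print(f"{name}: expected ~{expected:.0f} FPS, measured {measured:.0f} "
          f"({measured - expected:+.1f} relative to the line)")

# Hypothetical console captures -- substitute real like-for-like numbers
# to run the check properly.
check("PS5 (hypothetical)", 10.3, 74.0)
check("XSX (hypothetical)", 12.1, 76.0)
```

A point sitting a few FPS above or below the fitted line is exactly the scatter described above; the question is only whether the XSX's residual is consistently negative relative to its neighbors.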
As you can see, the reason there is more faith in DF than in the other sites is that DF aims to illustrate a variety of data points around where the consoles sit. They do not simply pit one console against the other; they bring in additional data points from the PC to cross-check and validate results (cross-validated evidence), and you are largely ignoring how painstaking a process that is. Digital Foundry is not just providing evidence through repetition; they are providing evidence through redundancy. That matters because to pressure-test any theory, it has to survive attack from a great many angles. And even if the measurements are not exact, DF can still provide enough context to ballpark where the systems should land if exactly matched settings could be found.
There are 72 months in a generation of consoles, and approximately 70 months remain before the next generation begins. Your implication, without explicitly saying it, that Hitman 3 could be an exception to the rule is, as far as I'm concerned, a misinformation campaign. Only 2 of those 72 months of data, less than 3%, has arrived, yet you are putting the majority of the weight of your statements on that first sliver of data. If you want to get technical, everything will look like an exception to the rule until the first 30% of titles are released; singling out Hitman 3 is, quite frankly, willful ignorance. You are removing a data point while simultaneously attacking the people who do the work, all to defend an argument that, as far as I can see, has no relevance to the discussion of the geometry engine's performance.