Digital Foundry Article Technical Discussion [2020]

My feeling is that people just looked at the paper specs and declared the XBSX the faster console without proof, and now everyone is in shock and awe when they see actual software running on both systems. Now you have the classic looking-for-scapegoats crap as to why that is, when they believed the opposite for so long. "TFs" is the new "Bits" after all :rolleyes:

I don't think people are looking for a "scapegoat" on here. They're trying to understand the results from the first cross-platform game comparison we have. Paper specs do matter, and if the PS5 really does outperform where most expected it to land, or the XBSX underperforms, then there's a reason for that, and it's fun to speculate. The PS5 and XBSX are the most similar consoles we've ever had as far as hardware goes. It was always going to be a situation where most "normal" people probably wouldn't be able to tell the difference between a PS5 and XBSX version of a game anyway.
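For reference, the paper spec everyone compares boils down to one formula. A quick sketch using the publicly stated CU counts and clocks (64 ALUs per CU and the 2-ops-per-clock FMA factor are standard RDNA figures):

Code:
# Peak FP32 throughput = shader ALUs x 2 ops/clock (FMA) x clock speed.
def peak_tflops(cus, clock_ghz, alus_per_cu=64):
    return cus * alus_per_cu * 2 * clock_ghz / 1000.0

ps5 = peak_tflops(36, 2.23)    # ~10.28 TF (variable clock, up to 2.23 GHz)
xsx = peak_tflops(52, 1.825)   # ~12.15 TF (fixed clock)
print(f"PS5 ~{ps5:.2f} TF, XSX ~{xsx:.2f} TF, paper gap ~{xsx / ps5 - 1:.0%}")

That ~18% gap is peak ALU throughput only; fill rate, rasterization and cache bandwidth scale with clock rather than CU count, which favours the PS5's 2.23 GHz.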
 
Impressive perf from the PS5. Does the RX 5700 (not XT) run this game on PC with similar performance? ;) On TechPowerUp the RX 5700 XT averages 53 fps at 1440p, so the plain RX 5700 is surely worse (though we don't know what the console settings are).
For all we know the game on console could average 70fps, as they average >59fps when capped at 60fps. Anyway, it's impressive performance for the supposedly RX 5700-class GPU inside the PS5. The XSX is obviously much less impressive relative to its theoretical specs.
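To illustrate the point about the cap with made-up numbers (purely hypothetical frame rates, just to show the mechanism):

Code:
# Toy illustration: a 60fps cap hides headroom. Nine of these ten
# hypothetical seconds run well above 60, so the capped average reads
# ~59.5 even though the uncapped average is 70.
uncapped = [80, 75, 72, 68, 55, 78, 74, 70, 62, 66]   # hypothetical fps
capped = [min(f, 60) for f in uncapped]

avg = lambda xs: sum(xs) / len(xs)
print(f"uncapped avg: {avg(uncapped):.1f} fps")   # 70.0
print(f"capped avg:   {avg(capped):.1f} fps")     # 59.5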

Superb video/article. Been waiting for this one with bated breath! I love the fact that @Dictator was able to match settings so exactly using the config files, and I'm really surprised that the PC is using higher-res textures than the next-gen consoles. I don't think anyone saw that coming!

I must admit I'm pretty shocked at the performance level. With one quality analysis we've just gone from the XSX looking like it might be faster than an RTX 3090 (given the 3090 falls below 30fps at max settings 4K) to the XSX being a little less performant than an RTX 2060S. Settings matter, people!

I hope this isn't indicative of the RT performance we'll see in the RX6xxx series though.
Maybe some said this, but many others were saying the XSX settings were quite low / medium compared to the PC footage shown.
 
So RT is worse on the new consoles than on an RTX 2060 Super. Not a big surprise; we know AMD currently has a weaker RT implementation than Nvidia.

This is just on Series X though, and it's very early, especially for Xbox, where it wasn't even certain the software would be ready for launch; and Nvidia has a bit of a head start here anyway. But I do think the quality of RT will be the biggest difference newer PC cards start showing versus consoles as time goes on in multi-platform games ...
 
You can't expect a huge performance gulf with the same architecture when the power consumption is basically the same. The question is why the XSX isn't drawing more when its GPU is bigger?
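A back-of-envelope sketch of that trade-off, with a completely made-up voltage curve just to show the shape of the argument: dynamic power scales roughly with CU count x V^2 x f, and voltage has to climb with clock, so a narrow-and-fast GPU can draw about the same as a wide-and-slow one.

Code:
# Illustrative only: rel_power ~ CUs x V^2 x f, with a hypothetical
# linear V(f) fit. None of these constants are real silicon data.
def rel_power(cus, clock_ghz):
    v = 0.7 + 0.25 * clock_ghz           # made-up voltage/clock curve
    return cus * v**2 * clock_ghz        # arbitrary units

narrow_fast = rel_power(36, 2.23)        # PS5-like shape
wide_slow = rel_power(52, 1.825)         # XSX-like shape
print(f"wide/narrow power ratio: {wide_slow / narrow_fast:.2f}")  # ~1.00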
 
This is just on Series X though, and it's very early, especially for Xbox, where it wasn't even certain the software would be ready for launch; and Nvidia has a bit of a head start here anyway. But I do think the quality of RT will be the biggest difference newer PC cards start showing versus consoles as time goes on in multi-platform games ...

PS5 is going to run the same settings as Series X, according to the video. I've only been half paying attention to it as I work, but it seems like the new consoles are going to be making more compromises than PC. Not unexpected given the price and power (watts) difference.
 
For all we know the game on console could average 70fps, as they average >59fps when capped at 60fps. Anyway, it's impressive performance for the supposedly RX 5700-class GPU inside the PS5.

Not sure who says this, but it wouldn't be correct. The XT would be a closer match, but even that is slower on paper, especially the front end. And that's before you consider the RDNA2 IPC uplift.

Maybe some said this, but many others were saying the XSX settings were quite low / medium compared to the PC footage shown.

Yes, to be fair, once we started getting details of the dynamic resolution and RT quality differences this did start to become apparent. I was really going back a couple of weeks, when we still assumed the XSX was running native 4K at PC max settings.

RT might really change the performance dynamic this generation. Then again, consoles will make a lot of that up through reconstruction techniques and dynamic resolution. Unless, of course, DLSS or an AMD equivalent becomes more widely supported.
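For anyone curious what dynamic resolution means mechanically, here's a minimal sketch of the feedback loop, assuming a simple proportional controller on GPU frame time (real engines predict load rather than react like this, but the core idea is the same):

Code:
# Nudge the render scale each frame so GPU time tracks the 60fps budget.
BUDGET_MS = 1000.0 / 60.0

def update_render_scale(scale, gpu_ms, gain=0.05, lo=0.60, hi=1.00):
    error = (BUDGET_MS - gpu_ms) / BUDGET_MS    # positive = headroom
    return max(lo, min(hi, scale + gain * error))

scale = 1.0
for gpu_ms in [19.0, 18.2, 17.5, 16.9, 16.4]:   # hypothetical GPU times
    scale = update_render_scale(scale, gpu_ms)
    print(f"{gpu_ms:.1f} ms -> render scale {scale:.3f}")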
 
I think it's going to come down to who is implementing the RT. Spider-Man: Miles Morales' RT seems to be pretty good for a console.

Alex did explain in his video the differences between the two implementations. Spider-Man makes quite a few compromises compared with WDL, for example reflections within reflections and RT reflections on water. I'm not sure this is a case of Insomniac getting more raw performance out of the RT hardware so much as simply making better (arguably?) artistic use of the resources available.
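To put toy numbers on why reflections-within-reflections are an easy cut (every figure below is illustrative, not measured from either game): each extra bounce spawns additional rays, and every one of them pays full BVH traversal and shading cost.

Code:
# Assume one reflection ray per pixel at quarter resolution, and that a
# hypothetical 30% of hits land on another reflective surface.
pixels = (3840 // 2) * (2160 // 2)       # quarter-res reflection pass
spawn_rate = 0.3                          # made-up secondary-hit rate

for max_bounces in (1, 2, 3):
    rays = sum(pixels * spawn_rate**b for b in range(max_bounces))
    print(f"{max_bounces} bounce(s): {rays / 1e6:.1f}M rays/frame")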
 
My feeling is that people just looked at the paper specs and declared the XBSX the faster console without proof, and now everyone is in shock and awe when they see actual software running on both systems. Now you have the classic looking-for-scapegoats crap as to why that is, when they believed the opposite for so long. "TFs" is the new "Bits" after all :rolleyes:

I think the problem is that some are simply looking at raw TF numbers as the end-all indicator of real-world gaming performance, when they're not. If we look at the PS5 hardware as a whole, many aspects of it (i.e., cache scrubbers, coherence engines, the Kraken decompressor, etc.) were built for maximum efficiency in reducing all manner of latency throughout the system's design (hell, even the PS5 motherboard's electrical traces are carefully routed for maximum efficiency), which could be paying off at the moment. There is no denying XBSX GDK toolsets are still early and impacting overall gaming performance, but I don't expect 'secret-sauce' levels of hidden performance... just slightly more headroom for maintaining higher framerates over PS5. In the end, it’s up to the game developers on how they maximize performance across these systems, and more than likely it will simply be parity that wins out.
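On the Kraken point specifically, the arithmetic is straightforward. The 5.5 GB/s raw figure and the "typically 8-9 GB/s" compressed figure are Sony's own stated numbers; the implied compression ratios just fall out of them:

Code:
# Effective streaming bandwidth = raw SSD speed x compression ratio.
raw_gbps = 5.5                            # Sony's stated raw figure
for ratio in (1.45, 1.64):                # implied Kraken ratios
    print(f"{ratio:.2f}:1 -> {raw_gbps * ratio:.1f} GB/s effective")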
 
VG Tech just made a video comparing Valhalla on both machines. The framerate is actually very similar between the two versions, but it still seems a tiny bit better on PS5 judging by the stats.


Anyone know if PS5 was the lead platform for the next-gen versions? Hearing that its devkits have been more mature than the Series systems', that would at least suggest it was the lead platform, which, on top of the devkit maturity itself, would have an impact on the differences we're seeing, imho.

My feeling is that people just looked at the paper specs and declared the XBSX the faster console without proof, and now everyone is in shock and awe when they see actual software running on both systems. Now you have the classic looking-for-scapegoats crap as to why that is, when they believed the opposite for so long. "TFs" is the new "Bits" after all :rolleyes:

It's not that simple though. Now, I was never really on the "hype train" (maybe better to call it the fanboy train) that Series X was going to decimate PS5, nor did I buy the stuff trying to push PS5 as a "9.2, sometimes 10.3" RDNA 1 system. All the same, it's probably a bit premature to use these early launch games as indicators of where the systems will fall in performance relative to one another once the heavy-hitting next-gen games start to arrive.

And in fairness, if people want to do that, that's perfectly fine. But that should probably also apply to the SSD I/O results we're seeing, because MS's systems are performing a lot closer to Sony's in that regard than the spec details given earlier would've implied. And in that example, there's the recent firmware patch for some games on PS4 that kind of gives weight to the idea that, IMO, Sony's and MS's approaches are just tackling the same problem in different ways, which is why going by the basic specs listed on paper was never really applicable there.

For things outside of the data I/O scope? Again, you can't really just go with paper specs, but I'd say the systems are much closer architecturally in that regard, since they're more reliant on AMD's own technologies there. I won't be surprised if there are a few more launch games with somewhat better performance on PS5, or better-than-expected load times and streaming on Series X (or better load times in general). In fact, there will probably be one-off examples of these kinds of "surprises" popping up throughout the generation.

But I do think we still need to wait a while, until at least more 3P devs get the updated GameCore SDKs from MS. There are still areas where the PS5 will be easier to work with, the memory setup for example, but I think performance will even out a bit more in terms of 3P offerings across the board starting shortly after the launch-window titles.
 
In the end, it’s up to the game developers on how they maximize performance across these systems, and more than likely it will simply be parity that wins out.
In the end, PS5 first-party devs will also only have the PS5 to develop for, while on Xbox you have both the Series S and X to develop for, along with whatever limitations the lower-end console brings.
 
Anyone know if PS5 was the lead platform for the next-gen versions? Hearing that its devkits have been more mature than the Series systems', that would at least suggest it was the lead platform, which, on top of the devkit maturity itself, would have an impact on the differences we're seeing, imho.
The lead platform was probably PC, with mostly Nvidia cards. :rolleyes:
 