Digital Foundry Article Technical Discussion [2020]

I thought I was smart and put my PS5 in rest mode before the boss in Demon's Souls. Then we had a blackout =( 7000 souls gone. Sigh, I really do wish Quick Resume existed; this would be a non-issue.
Anyway, catching up here; I'll offer some thoughts on XSX performance.
The type of graph we want to see is both PS5 and XSX generally following each other, both up and down. That tells us they're hitting similar bottlenecks; whether there is a compute gap or not, as long as they follow the same trajectory it's an ideal case. The mega drop for XSX while PS5 holds steady is really the first problem that needs ironing out.

If your only objective is to ship a product and you don't give two hoots about performance, you could ignore the split pool on XSX given how generous it is. However, it could explain why, in certain situations, you see a dip in performance, in particular a fall from 120 fps. The higher the frame rate, the more bandwidth the CPU eats up. If some of your rendering lives in that slower pool and you're pushing higher frame rates, there may not be enough bandwidth left to both feed the CPU and render out, so the frame rate comes down naturally.

This may explain why, with more NPCs, more city scenes, more complex scenes, XSX could be on shakier ground: at higher frame rates the CPU eats significantly more bandwidth, and that slow pool may not have enough left to support both it and the higher frame rate.

As for the dips in DMC5, the drop from 120 to 60 and then back up could also be an indication that they are rendering out of the slow pool again. So while it's holding that 560 GB/s pool and doing well, suddenly it drops to 60 fps for no apparent reason. 336 GB/s may be sufficient to sustain something like 60 fps at 4K (which is higher than X1X had), but lower than what may be required to sustain that resolution at 120 fps, for instance.
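
To put per-frame numbers on that, here is a minimal sketch; it assumes the full 336 GB/s is available to rendering and ignores CPU contention entirely, so it's the optimistic case:

#include <cstdio>

int main() {
    const double slow_pool_gbps = 336.0;   // XSX 6 GB slow pool peak bandwidth
    const int rates[] = {30, 60, 120};
    for (int fps : rates) {
        // Bandwidth budget each frame gets from the slow pool at this frame rate
        std::printf("%3d fps -> ~%.1f GB of slow-pool traffic per frame\n", fps, slow_pool_gbps / fps);
    }
    return 0;
}

Doubling the frame rate halves the per-frame budget from that pool, and that's before the CPU has taken its share.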

So keep your eyes peeled for these types of oddities. I do agree that the SDK has something to do with the performance problems, but not the big drops we are seeing; there may not have been an effort to optimize for the split pool during covid. Hopefully the 'mega-drop' is resolved soon, but that may require a lot more optimization by the developer to fit all rendering into the 10 GB pool and rely on the SSD to stream in only what is required.

The compute gap, or lack thereof, may just come down to cross-gen titles for the time being. @function wrote a great post somewhere about increasing quality per pixel each generation that I think signals things to come; current titles aren't reflective of that yet.

And I'm off to enjoy more gaming; see you guys when I get another moment.
Demon's Souls is like Tony Horton, you love him, but you hate him. I can't believe they would just drop the Tower Knight on you like that. Like, give me a fucking break, give me some warning it's boss time or a new map.

@Dictator great job with all the recent videos, they've been awesome. Let the team know they're doing a good job trying to provide data and analysis while also trying to curb the typical fanboy fare that comes from the viewers.
 
@iroboto Nice theory but 2 things:

- Those are cross-gen games that need only 5.5 GB on Pro. I doubt they need more than 10 GB on XSX as they seem to use the same assets as the Pro version. We'll know more about this when NXGamer releases his full analysis (he already hinted that there are some rather big graphical differences between versions).

- Even if they needed the slow memory to work, the added memory contention would not be enough to limit GPU bandwidth drastically enough to cause such big drops. On resetera I have seen an analysis predicting a maximum of ~40 GB/s of bandwidth (in the worst-case scenario where the CPU is busy 100% of the time) eaten by the 'split' memory, on top of the usual memory contention that would also happen on PS5. That would still leave the XSX GPU with more bandwidth than PS5 (520 GB/s vs 448 GB/s).
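
For what it's worth, the arithmetic behind that conclusion is just this (the ~40 GB/s worst-case figure is the resetera estimate, not something measured):

#include <cstdio>

int main() {
    const double xsx_fast_pool_gbps  = 560.0;  // XSX 10 GB pool peak bandwidth
    const double ps5_gbps            = 448.0;  // PS5 unified pool peak bandwidth
    const double worst_case_cpu_gbps = 40.0;   // resetera estimate, CPU busy 100% of the time
    std::printf("XSX GPU left with ~%.0f GB/s vs PS5's %.0f GB/s\n",
                xsx_fast_pool_gbps - worst_case_cpu_gbps, ps5_gbps);
    return 0;
}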
 
While assets typically do take up the majority of space in memory, that does not mean the entire allocation is textures. The GPU must reserve a large amount of memory for render targets too; asset sizes compete with render targets in that respect. If you have 50-80 4K render targets, that's a lot of space being eaten up, shared with the textures needed in the rendered scene as well as textures not in the scene. And 4K render targets are 4x the size of 1080p render targets, but the amount of memory has only doubled from last gen, not quadrupled.
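
A rough sketch of the render-target footprint; the 8 bytes per pixel and the 60-target count are illustrative assumptions (real targets mix many formats and not all are full resolution):

#include <cstdio>

int main() {
    const double bytes_per_pixel = 8.0;       // e.g. an RGBA16F target; real games mix formats
    const int    num_targets     = 60;        // somewhere in the 50-80 range mentioned above
    const double px_1080p = 1920.0 * 1080.0;
    const double px_4k    = 3840.0 * 2160.0;  // exactly 4x the pixels of 1080p
    const double mib      = 1024.0 * 1024.0;

    std::printf("%d targets at 1080p: ~%.0f MB\n", num_targets, px_1080p * bytes_per_pixel * num_targets / mib);
    std::printf("%d targets at 4K:    ~%.0f MB\n", num_targets, px_4k    * bytes_per_pixel * num_targets / mib);
    return 0;
}

Under those assumptions that's roughly 0.9 GB of render targets at 1080p versus roughly 3.7 GB at 4K, before a single texture is resident.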

Typically the bandwidth targets from last gen, as I understand it, were about 20 GB/s for the Jaguar CPUs, and most of those titles ran at 30 fps. Scaling linearly, 60 fps would make that 40 GB/s and 120 fps would make that 80 GB/s.
If from your 336 GB/s you subtract 80 GB/s, and you additionally lose asymmetrical bandwidth due to contention as per that older PS4 slide, you might have very little remaining from that 336 GB/s. If you still have things to render out of there, the total bandwidth of the system will drop dramatically as it interchanges between the two pools with very little frame time available.
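
As a minimal sketch of that subtraction (the 20 GB/s-at-30 fps CPU figure and the linear scaling with frame rate are assumptions based on those last-gen targets, and the extra contention penalty from that slide isn't modelled):

#include <cstdio>

int main() {
    const double slow_pool_gbps = 336.0;  // XSX 6 GB slow pool peak bandwidth
    const double cpu_at_30fps   = 20.0;   // rough last-gen Jaguar bandwidth target (assumption)
    const int rates[] = {30, 60, 120};
    for (int fps : rates) {
        double cpu_gbps  = cpu_at_30fps * fps / 30.0;  // assume CPU traffic scales with frame rate
        double remainder = slow_pool_gbps - cpu_gbps;  // what's left for rendering out of that pool
        std::printf("%3d fps: CPU ~%3.0f GB/s, slow-pool remainder ~%3.0f GB/s\n", fps, cpu_gbps, remainder);
    }
    return 0;
}

Contention then eats into that remainder further, which is why rendering out of the slow pool at 120 fps looks so much tighter than at 60.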

Just my thoughts here.

I mean, there are hundreds of possibilities one could look at, but there is probably a very short list of things that would cause a sudden 50% frame-rate drop. And this isn't uncommon in unoptimized scenarios; Call of Duty launched at 720p on XBO but 1080p on PS4. But it would be very unlikely that PS5 would have something on top of its 20% compute deficit and bandwidth deficit that could bring it a +100% frame-rate advantage. Those are the type of things that should be seen a mile away during silicon simulations.

I can't say this explains everything, but a drop that large should be something we can narrow down to a handful of causes. Extracting (or why devs cannot extract) more performance from Xbox, while related to this, is a separate question from explaining the massive drops in frame rate.
 
I was assuming that whatever the BC team is doing in the software layer uses the GDK as well, including compilers and libraries, to utilize the GPU better than it already does?

I know nothing about this, but I was under the impression that GDK = SDK, just with the G standing for Games?
And that Xbox BC runs in virtual machines: all those libraries and the compiled source code run inside the VM, and the virtualisation of the hardware for the VMs has nothing to do with the GDK/SDK, which runs inside the VMs.
 
Follow-up: game suspend on PS5 will last through a power outage in rest mode. But I had to deal with a database rebuild and some really wonky performance in the game for a little bit before it smoothed out again.
 
When we can - production is not easy. John lost a lot of his Demon's Souls work... and the way Assassin's Creed tears means our tools are a bit useless. I am currently hand-painting in tear locations in videos... thousands of frames by hand.
Seriously, hand painting tears? I mean, there's an opportunity here to be solved.
 
What's worse, John even had to fend off trolls who of course used that to say that his PS5 was broken, or that this was a PS5 issue (or something along those lines).
 
When we can - production is not easy. John lost a lot of his Demon's Souls work... and the way Assassin's Creed tears means our tools are a bit useless. I am currently hand-painting in tear locations in videos... thousands of frames by hand.

Wow you guys go down to a seriously impressive level of detail.
 
I think they will have to go back to the PS360 style of comparisons (like weekly: 4-5 games, with each getting one page on perf).

Big in-depth articles would be released for big releases such as CP2077, because doing every game in detail the way they did last gen would be impossible.
 
I don't see the rush; the consoles' launch stock is sold out. Folks who have a console have already picked a platform.

There are game reviews out there if you want to know whether a game is your cup of tea; I see these launch-window games more as a potential lens on the systems themselves.

The videos will be just as informative and fresh a week later.

Further into the generation, having content close to game launch may be more critical, but "in the here and now" just detailed game and system dissection is perfect whenever it lands.
 
I know nothing about this, but I was under the impression that GDK = SDK, just with the G standing for Games?
And that Xbox BC runs in virtual machines: all those libraries and the compiled source code run inside the VM, and the virtualisation of the hardware for the VMs has nothing to do with the GDK/SDK, which runs inside the VMs.

It's not as discrete as that; this was described in the Series X Hot Chips presentation. The game communicates with the 'smash' driver, which interacts with the hardware. The game's virtualization layer is 'smashed' together with the drivers and compiled game code into one software layer, eliminating the overhead of virtualization.
So it's not game -> drivers -> VM host -> drivers -> console hardware,
it's game -> smash driver -> console hardware.

The GDK is distinct from the environment used for the Xbox One generation, which was the XDK. The GDK is where gamecore comes in, so in theory it's a write-once, run-anywhere solution (well, anywhere supporting DX12U). I suspect this is more of the reason Microsoft waited for 'Full RDNA2': waiting for it, and all the baseline features DX12U requires, allows them to make their games for one platform, gamecore, and have them run on console / streaming / PC. They have even made some APIs that can determine what device is interfacing with the game, so it can display the appropriate interface and controls too. So on the console disc copy of a game there will be all the code for the touchscreen streaming version and the PC version; it just exposes the different UI for each device.
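
For the device-detection part, here is a minimal sketch of the kind of check involved; I'm recalling XSystemGetDeviceType / XSystemDeviceType from the public GDK docs, so treat the exact header and identifier names as assumptions rather than gospel:

#include <XSystem.h>  // GDK system header (name from memory; treat as an assumption)

// Pick a UI/control layout per device so one gamecore build can serve
// console, PC and streaming clients. Identifier names are from memory of
// the public GDK documentation and may not match the shipping headers.
const char* PickUiLayout()
{
    switch (XSystemGetDeviceType())
    {
    case XSystemDeviceType::DesktopPc:
        return "mouse+keyboard UI";
    case XSystemDeviceType::XboxOne:
    case XSystemDeviceType::XboxOneS:
    case XSystemDeviceType::XboxOneX:
        return "controller UI, last-gen presets";
    case XSystemDeviceType::XboxScarlettLockhart:   // Series S
    case XSystemDeviceType::XboxScarlettAnaconda:   // Series X
        return "controller UI, next-gen presets";
    default:
        return "fallback UI (touch/streaming handled by its own path)";
    }
}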
 