Digital Foundry Article Technical Discussion [2020]

Yup, well, both Series X|S are afflicted with the same issues in terms of what functionality is available in the GDK. It's unfortunate that MS got blindsided by COVID here (perhaps their deadlines were too ambitious and they didn't foresee the possibility of a massive slowdown) and didn't build their tools out enough to have a good launch. I think a proper fix would have been to eat the cost of doubling the remaining memory to be symmetric rather than deal with the tooling issues around controlling the split pools. There are other issues, but this isn't an ideal launch considering their marketing.

Yeah, that's where this REALLY stings hardest. They've spent almost a full year talking specs, touting their perceived advantages etc., but now we've got multiple 3P games just running better on PS5 outright. I've already seen some shots of DiRT 5 on both platforms, and there are spots where the two versions literally look a generation apart between Series X and PS5, in favor of the latter. Like, entire levels of geometry detail, texture detail etc. are just missing on MS's platform in some of those spots (at least from screengrabs I've seen posted elsewhere).

Now, I don't even necessarily know if the entire Gamecore story is 100% accurate, going from the interview with the DiRT 5 technical director. Apparently for them the GDK is "just fine" (or something like that), but they did specify that's in relation to what they were trying to do. Some of these results for DiRT 5 on Series X say otherwise, though; I don't see how the GDK can be "just fine" if it's producing almost generations-apart levels of detail and texture quality between two next-gen peers.

Really hoping this, at the very least, convinces MS not to rely so much on 3P games to show off their system's capabilities; it's pretty clear they can't do that if the tools aren't there yet. Cyberpunk might be their best shot at turning some of these unfortunate optics around, but even that has a good chance of looking & running better on PS5 going by these trends. It's time they start relying on their 1P to show what their design is actually capable of. They gotta take a page from Sega here (where it was literally 1P titles like VF2 that helped curb some of the bad optics the Saturn had at that point from a technical POV among some gamers and large parts of the gaming press of the day).

Truly I hope you're right about Gamecore just being functional enough to get the games out the door for launch, with a large amount of improvement/optimization still to be had. MS have so many other things going well for them regarding Xbox (studio acquisitions, a few surprise 1P games this year of very high quality, Game Pass, xCloud etc.); I'd hate to see bad optics forming around Series X being an underperformer hurt their long-term plans. But early adopters, who traditionally drive console sales at the start of a gen, DO care about these performance metrics, and where they go, the casual/mainstream usually tend to follow. That's why, more than anything, MS needs to start reversing course on this.

Doubling the memory was a straightforward way to ease development and gain the upper hand in one department, but by doing that they would have had to do the same with the XSS, and that would, I guess, have resulted in a considerably higher loss.

What they should have done, if they wanted the absolute power crown beyond the spec sheet, is gone with 48 CUs (44 active) and a 256-bit bus with 16 Gbps chips. Clock it similarly to PS5, retain rasterization parity and gain a shader advantage, all while not having to deal with virtual memory pools (and with a ~15mm² bigger chip instead of ~50mm²).
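Back-of-the-envelope numbers on that (the Series X figures are public; the 2.23 GHz clock on the hypothetical part is just my assumption, pegged to PS5's peak):

```python
# Rough comparison: actual Series X vs. the hypothetical
# 44-active-CU / 256-bit config above. The 2.23 GHz clock is an
# assumption (pegged to PS5's peak), not a real part.

def tflops(cus, clock_ghz):
    # RDNA 2: 64 FP32 ALUs per CU, 2 ops (FMA) per ALU per clock
    return cus * 64 * 2 * clock_ghz / 1000.0

def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(f"Series X:     {tflops(52, 1.825):.2f} TF")   # ~12.15 TF
print(f"Hypothetical: {tflops(44, 2.23):.2f} TF")    # ~12.56 TF

print(f"Series X fast pool: {bandwidth_gbs(320, 14):.0f} GB/s")  # 560
print(f"Series X slow pool: {bandwidth_gbs(192, 14):.0f} GB/s")  # 336
print(f"Hypothetical:       {bandwidth_gbs(256, 16):.0f} GB/s")  # 512, uniform
```

You'd trade a bit of peak bandwidth (560 -> 512 GB/s) for a single uniform pool and a meaningfully smaller die.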

I really like the PS5 design; it's very nice and straightforward.

Yeah, but if they'd taken that route, it wouldn't fit as well with their dual-purpose use of the Series X APU for Azure. Thinking on it now, it seems pretty true that for as much as BC may have limited Sony's design choices, it also limited MS's. They wanted four XBO S instances on a single APU, so they needed to go relatively wide on the CU count. They knew they'd be using the APU in Azure servers, so it has to maintain constant levels of guaranteed performance; that means lower GPU clocks relative to PS5's (and it's likely they would've stuck with 14 Gbps chips anyway, since server systems tend to use slower main memory than consumer PCs, for example). They knew they wanted to make a smaller, cheaper variant, so storage had to be developed to accommodate both models while hitting a nice, simple TDP, hence the (on paper) more conservative raw SSD I/O specs relative to Sony and the proprietary expansion card design.

I'm curious whether it will ever be possible in the future for GPUs (not just AMD; Nvidia and Intel, too) to allow a sort of variable frequency approach specific to portions or "blocks" of the GPU. It's something I've been thinking about the past few days: if you go for a wider, slower design, but certain workloads benefit from a faster clock and you can't clock the entire GPU higher without killing your power budget, why not significantly reduce the clock rate on some GPU blocks and significantly raise it on others, while gating compute off to just the parts with the higher clocks?

This should be theoretically possible in some way, right? Maybe in the future? And in a way that's dynamically adjustable in real time, ideally per cycle. I guess power delivery to the GPU would have to be redesigned, but maybe there's a future in this sort of design, hopefully. Because I think if it were possible on Series X, it would help a lot with the way some of these early games seem to be designed.
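For fun, here's a toy model of what that allocation problem might look like, assuming power scales roughly with f³ and each block's throughput scales with its clock (the block names and all the numbers are made up):

```python
# Toy model of per-block clock allocation under a shared power budget.
# Assumes power ~ f^3 (voltage tracks frequency) and throughput ~ f.
# Block names and figures are illustrative only.

BLOCKS = {           # block: relative weight of its workload this frame
    "geometry": 0.5,
    "shading":  1.0,
    "rops":     0.3,
}
STEPS = [1.2, 1.5, 1.8, 2.1, 2.4]   # available clock steps, GHz
POWER_BUDGET = 20.0                  # arbitrary units

def power(f):
    return f ** 3   # crude cubic model

def allocate(blocks, steps, budget):
    clocks = {b: steps[0] for b in blocks}            # start everything low
    used = sum(power(f) for f in clocks.values())
    while True:
        # find the upgrade with the best throughput gain per extra watt
        best = None
        for b, f in clocks.items():
            i = steps.index(f)
            if i + 1 == len(steps):
                continue
            gain = blocks[b] * (steps[i + 1] - f)     # weighted speedup
            cost = power(steps[i + 1]) - power(f)
            if used + cost <= budget and (best is None or gain / cost > best[0]):
                best = (gain / cost, b, steps[i + 1], cost)
        if best is None:
            return clocks
        _, b, f, cost = best
        clocks[b], used = f, used + cost

print(allocate(BLOCKS, STEPS, POWER_BUDGET))
# shading ends up clocked high while ROPs stay low: the budget flows
# to wherever the frame's bottleneck is
```

Greedy per-step allocation like this is obviously a simplification; real silicon has shared voltage rails and thermal coupling between blocks, which is exactly why power delivery would need a redesign.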

120fps mode: [screenshots]

Image quality 60fps mode: [screenshot]
The latter is basically 900p vs 1080p in relative, perceptible resolution terms (the actual res is higher on both machines, ranging from 1440p up to 4K).

Yep, these are the shots I saw earlier, and it hurts pretty hard. They almost look a generation apart.

We might have a good understanding of why this is happening, in terms of the circumstances producing these results, but most of the people watching this stuff on YT, and most gamers for that matter, don't know and don't care.

They're just going to see stuff like this and assume it's down to one of them (Series X) having a poorer architecture and being weaker outright, and it stings even harder if they're aware of some of MS's messaging over the year. So I'm really hoping 3P performance on Series X starts to iron out the kinks.

But more so than that, I'm ready to see some of that next-gen 1P content, MS. You gotta make something happen at TGA, and that Halo Infinite update? It better be amazing. They can't afford any "meh, it's alright" responses from the public at large, not anymore.
 
I have to say that moving my comments on the irony of the Xbox marketing does actually remove some of the context, considering they relate directly to this Digital Foundry comparison. A bit disappointing.
 
Yeah, agreed. Was it DF that described it as being like a Toyota? They build it simple, but still advanced and reliable. A Lexus, perhaps.
MS really has to improve things now; the masses see the PS5 as the more powerful console atm, as every single title runs and looks better on it, on top of the faster loading and the controller. The XSX should have been the obvious choice for multiplats.
Sony really learned from the PS3 launch, looking at how much simpler the PS4 was, and now the PS5.

Yeah, that's where this REALLY stings hardest. They've spent almost a full year talking specs, touting their perceived advantages etc., but now we've got multiple 3P games just running better on PS5 outright. I've already seen some shots of DiRT 5 on both platforms, and there are spots where the two versions literally look a generation apart between Series X and PS5, in favor of the latter. Like, entire levels of geometry detail, texture detail etc. are just missing on MS's platform in some of those spots (at least from screengrabs I've seen posted elsewhere).
It's as if Sony and MS exchanged places from the PS3 / Xbox 360 launch. Ironically, it seems Sony could even have launched a year earlier than MS this time (the way the Xbox 360 launched earlier back then).

Really hoping this, at the very least, convinces MS not to rely so much on 3P games to show off their system's capabilities; it's pretty clear they can't do that if the tools aren't there yet. Cyberpunk might be their best shot at turning some of these unfortunate optics around, but even that has a good chance of looking & running better on PS5 going by these trends. It's time they start relying on their 1P to show what their design is actually capable of. They gotta take a page from Sega here (where it was literally 1P titles like VF2 that helped curb some of the bad optics the Saturn had at that point from a technical POV among some gamers and large parts of the gaming press of the day).
Well, if Cyberpunk runs worse, then MS is in huge trouble for 1-2 years.

They're like 1-2 gens behind Sony in all their issues. With Xbox One they relied too much on 3rd parties for game exclusivity; now they've relied on 3rd parties for launch... instead of nurturing their first-party studios.

But more so than that, I'm ready to see some of that next-gen 1P content, MS. You gotta make something happen at TGA, and that Halo Infinite update? It better be amazing. They can't afford any "meh, it's alright" responses from the public at large, not anymore.
Well, Halo Infinite didn't look like a next-gen game either...
 
I only know the memory commands seem incomplete or suboptimal. I have no idea what effect that has on the games. There seem to be some challenges for developers if they need to shuffle their memory around. Typically, if all you're doing is working on Series X/S, it's a non-issue because you can lay out your memory from the get-go, but hardly anyone outside 1P studios has that luxury.
But if you're juggling builds across multiple platforms and you find you need to shuffle memory, you're SOL, it seems, and if you do shuffle, it's going to be really impactful. They'll be working on methods to get around this issue in future releases.

This is just one aspect, but an important one to note.
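To illustrate the placement decision being described (this is not the GDK API, just a sketch): on Series X, 10 GB runs at 560 GB/s and 6 GB at 336 GB/s, so you want bandwidth-hungry GPU resources in the fast pool from the start.

```python
# Not the GDK API -- just a sketch of the layout decision devs describe.
# Series X exposes a 10 GB pool at 560 GB/s and a 6 GB pool at 336 GB/s,
# so bandwidth-hungry GPU resources want the fast pool.

GB = 1024 ** 3

class SplitHeap:
    def __init__(self):
        self.free = {"fast": 10 * GB, "slow": 6 * GB}
        self.placements = []

    def alloc(self, name, size, gpu_hot):
        # Prefer the fast pool for GPU-hot data, the slow pool otherwise;
        # fall back to the other pool when the preferred one is full.
        order = ["fast", "slow"] if gpu_hot else ["slow", "fast"]
        for pool in order:
            if self.free[pool] >= size:
                self.free[pool] -= size
                self.placements.append((name, pool))
                return pool
        raise MemoryError(f"out of memory placing {name}")

heap = SplitHeap()
heap.alloc("render targets",   2 * GB, gpu_hot=True)   # -> fast
heap.alloc("texture pool",     7 * GB, gpu_hot=True)   # -> fast
heap.alloc("CPU game state",   3 * GB, gpu_hot=False)  # -> slow
heap.alloc("streaming buffer", 2 * GB, gpu_hot=True)   # -> slow (fast is full)
print(heap.placements)
```

The pain point in the post above is the last line of that scenario: if you only discover late that something GPU-hot ended up in the slow pool, reshuffling live allocations between pools is apparently where the current tooling falls down.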

That's interesting. I'd assumed the XSX devkits came with tools to investigate the best arrangement of data, and then let you put it there as and when needed.

The situation for XSX hopefully won't be as complicated, as all chips are the same size and presumably only the system reserve lives on the clamshell chips. That shouldn't need more than a few GB/s, you'd have thought...
 
Is Cyberpunk 2077 going to be running in BC mode on the XSX/PS5?

Maybe? Or they may have a quick-n-dirty version running native but without using the new goodies, with extensive use of the goodies coming months later.
 
Hmm, I'm starting to wonder about a certain developer who was attacked over his comments about PS5 and XBSX. Other developers agreed with his comments, but of course console warriors and corporate shills ruined any honest discussion to be had.
Yeah, and there were those that said he needed to apologise for his remarks. Maybe they should subscribe to his channel and apologise to him personally.
 
Maybe? Or they may have a quick-n-dirty version running native but without using the new goodies, with extensive use of the goodies coming months later.
I have it marked as XDK, going by the download file on my Series X today. I'll check again next month, closer to release, to see if it gets moved to Gen9Aware.
 
Yeah, and there were those that said he needed to apologise for his remarks. Maybe they should subscribe to his channel and apologise to him personally.

The developer(s) I'm talking about aren't Cherno, nor do they have YouTube channels. A quick search for developer comments from around the time of the "Road to PS5" presentation turns up some of the ones I mean. There's one comment in particular I remember, where a developer asked why Sony and Cerny didn't go into further detail about certain parts of the PS5 architecture, which they knew about but didn't want to disclose to the public. There are some interesting developer comments out there if you look for them... I believe a lot of their opinions got ignored in favor of console warring, corporate hyperbole and armchair-quarterback spectating. Simply put, these systems aren't going to be shitting on each other the way some would hope.
 
Dynamic 4K, 60fps, image quality. I'd say this shows, again, a CPU bottleneck on XSX compared to PS5.

"high fps dips means cpu" has never been good reasoning, but worked ok as a barometer last gen because the cpus were so incredibly weak compared to what games expected. Sure there could be something wrong with the cpu code, but its hard to imagine that the console is fundamentally cpu bottlenecked on a game that also shipped on xbox one and ps4 (and with a 120fps mode!).
 
Dynamic 4K, 60fps, image quality. I'd say this shows, again, a CPU bottleneck on XSX compared to PS5.

[benchmark chart]

It's not going to be a CPU bottleneck. Here we see the 3950X on PC pushing a 1% minimum of 104fps, and even then it's almost certainly GPU-bottlenecked. Sure, it's faster than the XSX CPU, but not by that much, and that's while pushing through PC overheads.

https://www.techspot.com/review/2144-amd-radeon-6800-xt/
 
It's not going to be a CPU bottleneck. Here we see the 3950X on PC pushing a 1% minimum of 104fps, and even then it's almost certainly GPU-bottlenecked. Sure, it's faster than the XSX CPU, but not by that much, and that's while pushing through PC overheads.

https://www.techspot.com/review/2144-amd-radeon-6800-xt/

There are videos of DIRT 5 on 3700X CPUs and there isn't any hint of the CPU being even slightly stressed. Not likely the issue here.

edit: screenshot of a benchmark (3700X / 3070 at 1440p). The GPU is close to full load, so at least on PC it looks GPU-bound.

[benchmark screenshot]

I've also found this Dirt 5 / AMD video where they explain they use advanced DX12U techniques such as VRS Tier 2 (timestamped), so it looks like Codemasters did have time to adapt the game to the new GPU features.
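For anyone unfamiliar, Tier 2 VRS is image-based: the game writes a small per-tile "rate image" and the GPU shades low-detail tiles coarsely. A conceptual sketch (using luminance variance as the detail metric is my own assumption; engines use fancier heuristics):

```python
# Conceptual sketch of Tier 2 (image-based) VRS: the game writes a small
# "rate image" -- one shading rate per screen tile -- and the GPU shades
# low-detail tiles at 2x2 or coarser. Luminance variance as the detail
# metric is an assumption, not how any particular engine does it.

import numpy as np

TILE = 16  # rate-image tile size in pixels (8 or 16 on real hardware)

def build_rate_image(luma):
    h, w = luma.shape
    rates = np.empty((h // TILE, w // TILE), dtype="U3")
    for ty in range(h // TILE):
        for tx in range(w // TILE):
            tile = luma[ty * TILE:(ty + 1) * TILE, tx * TILE:(tx + 1) * TILE]
            v = tile.var()
            # flat tiles (sky, road) shade coarse; detailed tiles full rate
            rates[ty, tx] = "1x1" if v > 40 else ("2x2" if v > 5 else "4x4")
    return rates

luma = np.zeros((64, 64))
luma[:, 32:] = np.random.rand(64, 32) * 255  # right half busy, left half flat
print(build_rate_image(luma))  # left columns 4x4, right columns 1x1
```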

 
Frame rate can be an issue with anything, from the CPU to the GPU.
Without understanding what the cause could be, you're basically throwing darts and hoping something lands.
Yeah, just watching the video, it doesn't seem particularly attributable to what's on screen. There are scenes where a high number of vehicles renders at 60fps in one instance, but then it drops as shown here.

If you just watch the performance over the course of the race, it's somewhat unpredictable. Towards 15m35s, the framerate smooths out completely while still being in the same general area and background environment, with the same number of vehicles. Then it just drops for some reason.

Around 16m10s, there are a high number of transparencies on screen with a much wilder framerate, so I wonder if there's a ROP & bandwidth thing going on. Both platforms get hit harder during this segment. But then at 16m30s, there's a similar scene with heavy transparencies and the framerate is perfect. It's weird.
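Quick math on why stacked transparencies can be a bandwidth thing: every alpha-blended layer reads the destination pixel and writes it back. At 4K/60 with a made-up six-layer stack:

```python
# Why stacked transparencies hurt: every alpha-blended layer has to read
# the destination pixel and write it back. The layer count is made up.

width, height, fps = 3840, 2160, 60
bytes_per_pixel = 8   # read + write a 32-bit render target
layers = 6            # near-full-screen dust/smoke layers

gb_per_s = width * height * bytes_per_pixel * layers * fps / 1e9
print(f"~{gb_per_s:.0f} GB/s just on blending")  # ~24 GB/s, before textures,
# geometry and everything else also fighting for the bus
```

And particle hotspots can easily run well past six layers of overdraw, so the worst-case spikes are much bigger than the average.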
 
I'm leaning toward the bug theory for the 120Hz issue on XBSX, but this seems like the kind of thing you'd notice right away during QA. One wonders if they pushed too hard to have this game ready so close to launch; a bit sad, as I really feel this is MotorStorm's spiritual successor and it had a rough launch.

edit: apparently IQ is better on PS5 in the fidelity mode as well.

[screenshot]
Could be VRS.
 
To continue throwing darts: maybe it's related to Xbox still running suboptimal traditional shaders and not using the NGG primitive shader functionality yet?

https://forum.beyond3d.com/posts/2176901/
https://forum.beyond3d.com/posts/2176709/
This was a theory I had a few days ago, partly because of how Cerny referred to the usage of the GE in Road to PS5. He said you can use it as a standard GPU, but if you want to dig into the performance benefits, there are many ways to cull excess geometry before you waste any cycles outputting and shading it.

Really interested in what MS's current stance on this is.
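For reference, this is the kind of early rejection Cerny was describing, just shown CPU-side as a toy (real primitive shaders do this per-wave on the GPU, after projection):

```python
# Toy version of the culling the Geometry Engine / primitive shaders do:
# reject triangles before any cycles go to rasterizing or shading them.
# Coordinates are assumed already projected to screen space.

def signed_area(tri):
    (x0, y0), (x1, y1), (x2, y2) = tri
    return (x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0)

def cull(tris, w, h):
    kept = []
    for t in tris:
        if signed_area(t) <= 0:                       # back-facing or degenerate
            continue
        xs, ys = zip(*t)
        if max(xs) < 0 or min(xs) > w or max(ys) < 0 or min(ys) > h:
            continue                                   # entirely off-screen
        kept.append(t)
    return kept

tris = [
    [(10, 10), (50, 10), (30, 40)],       # visible, front-facing -> kept
    [(10, 10), (30, 40), (50, 10)],       # same tri, reversed winding -> culled
    [(-90, -90), (-50, -90), (-70, -60)], # off-screen -> culled
]
print(len(cull(tris, 1920, 1080)))  # 1
```

The point of doing it that early is that culled triangles never consume vertex output bandwidth or pixel shader cycles, which is where Cerny's "before you waste any cycles" framing comes from.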
 