Digital Foundry Article Technical Discussion [2020]

I do believe something is GPU related on Xbox that is causing a form of bottleneck/hitching, but the power scaling argument you bring forward doesn't seem applicable outside of this particular scenario.
I don't think we should talk about bottlenecks at this point, because right now all of those games are basically ports in a sense that don't leverage console power. But it will be rough for XSX if the issues are not resolved somehow.

But I do wonder why MS did not go with Sony's solution: a less performant GPU, but without a split memory pool?
 
Whatever the reason, it seems to be a CPU bottleneck on XSX, like in AC Valhalla. Hypothetically, if PS5 had a unified L3 cache, then it would be easier to explain the PS5's edge here without invoking the usual bug or bad-tools excuses.
I was under the assumption that since patch 1.04 they showed that dropping resolution resolved the XSX hitching problems in AC Valhalla. That doesn't necessarily indicate a CPU issue here.

I'm not sure why you are targeting the CPU again; with memory management, even a small mistake can hang the entire system as well.
With poor memory management you can run into hitching problems all the time, given how long the system can sit idle waiting for data to arrive.
 
I don't think we should talk about bottlenecks at this point, because right now all of those games are basically ports in a sense that don't leverage console power. But it will be rough for XSX if the issues are not resolved somehow.

But I do wonder why MS did not go with Sony's solution: a less performant GPU, but without a split memory pool?
I'm just using bottleneck as a generic term; poor programming causes bottlenecks, for instance. I could generate objects during run time and that would just make any system crawl. So sometimes things coded one way may not have an issue, but on another memory setup could have an issue. The idea here is that it's very unlikely that game code is optimal, and there are more things to worry about on the XSX side from an architectural perspective than on Sony's side. In particular, the split memory pool is something that needs to be managed on XSX that doesn't need to be on PS5. And it really can't be overstated how PS5 is likely the lead platform for development.
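To illustrate the earlier point about generating objects at run time, here's a minimal sketch (hypothetical names, not from any actual engine) of the difference between heap-allocating in the per-frame hot loop and drawing from a preallocated pool:

```cpp
#include <cstddef>
#include <memory>
#include <vector>

struct Particle { float x, y, z, life; };

// Naive approach: heap allocation inside the per-frame hot loop. Every call
// can fragment memory and stall the frame while the allocator does its work.
void spawnParticlesNaive(std::vector<std::unique_ptr<Particle>>& out, int count) {
    for (int i = 0; i < count; ++i)
        out.push_back(std::make_unique<Particle>());   // allocation in the hot path
}

// Pooled approach: one allocation up front, so per-frame spawning is just
// index bookkeeping and never touches the allocator.
class ParticlePool {
public:
    explicit ParticlePool(std::size_t capacity) : storage(capacity), next(0) {}
    Particle* spawn() { return next < storage.size() ? &storage[next++] : nullptr; }
    void reset() { next = 0; }
private:
    std::vector<Particle> storage;  // reserved once at startup
    std::size_t next;
};
```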

The larger GPU does require more bandwidth. The split memory pool problem could have been resolved by eating the cost of making the remaining four chips 2GB instead of 1GB, for a final tally of 20GB. That would be an easier resolution than having to go back and modify the SoC.
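For reference, a quick back-of-the-envelope on the chip configurations being discussed, using the publicly stated XSX figures (ten GDDR6 chips at 14 Gbps on a 320-bit bus); the 20 GB option is the hypothetical one:

```cpp
#include <cstdio>

int main() {
    // Publicly stated XSX figures: ten GDDR6 chips at 14 Gbps on a 320-bit bus.
    const double gbpsPerPin = 14.0;
    const double fullBusGBs = 320 * gbpsPerPin / 8.0;  // 560 GB/s when all ten chips are hit
    const double narrowGBs  = 192 * gbpsPerPin / 8.0;  // 336 GB/s when only the six 2 GB chips are hit

    // Shipping configuration: six 2 GB chips + four 1 GB chips = 16 GB, split into two pools.
    std::printf("Shipping: %d GB (10 GB @ %.0f GB/s, 6 GB @ %.0f GB/s)\n",
                6 * 2 + 4 * 1, fullBusGBs, narrowGBs);

    // Hypothetical option from the post: all ten chips at 2 GB.
    std::printf("Uniform:  %d GB, all of it @ %.0f GB/s\n", 10 * 2, fullBusGBs);
    return 0;
}
```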

At the end of the day, everything comes down to costs. One person may be laser-focused on performance, another on making it as cheap as possible, and another on making things as reliable as possible. Together they work out what they think is the most optimal setup they can support. Maybe they should have spent more money on the console and less on marketing, which tends to be the greater cost. Unfortunately it's too late to turn back on those decisions; both consoles are going forward with mainly the same components.
 
Whatever the reason, it seems to be a CPU bottleneck on XSX, like in AC Valhalla. Hypothetically, if PS5 had a unified L3 cache, then it would be easier to explain the PS5's edge here without invoking the usual bug or bad-tools excuses.

I don't know for sure, but I don't think the PS5 has a unified L3 cache. The Sony / Cerny patent is massively different to what AMD have spent years architecting for Zen 3.

If this NBA drop is on the CPU side, it's software. And not necessarily on the developer end.
 
Didn't MS at one point boast that they were running 3 OSes at the same time?
If the PS5 only has to run a single OS, then this could explain the performance discrepancy in almost every released title up until this point.
 
Didn't MS at one point boast that they were running 3 OSes at the same time?
If the PS5 only has to run a single OS, then this could explain the performance discrepancy in almost every released title up until this point.

Not really. Virtualisation of OSes with the necessary hardware has a very low overhead.
 
I don't think we should talk about bottlenecks at this point, because right now all of those games are basically ports in a sense that don't leverage console power. But it will be rough for XSX if the issues are not resolved somehow.

But I do wonder why MS did not go with Sony's solution: a less performant GPU, but without a split memory pool?

Well, MS wanted the Series X APU to be dual-use: for both a marquee console and in their Azure server blades for xCloud. They wanted to be able to run four virtualized instances of Xbox One S games on a Series X, and the best way to do that was to ensure four SAs (shader arrays) were in the design with enough horsepower to do it. And they needed to make sure the clocks were at a point where stability could be maintained in a server environment, which would translate to the console implementation, hence 1.825 GHz.

Other things like the 320-bit bus would allow them to double up memory capacity on the server side (maybe through clamshell? Or with doubled-density modules, but 4 GB GDDR6 modules don't exist at the moment AFAIK). The SSD I/O baseline may've been set the way it is to ensure something that was performant enough yet low enough on power consumption for Series S, while ensuring parity between it and Series X. Stuff like that, basically.

I agree that trying to make definitive statements on where each system lands from these early launch titles is way too premature, but MS created this problem for themselves by A) messaging and marketing around being the "most powerful" console for months, and B) not having even a single 1P title ready for launch. They honestly should've just had Ninja Theory polish Bleeding Edge even further and held it back for a cross-gen launch between XBO/Series S/Series X this November. That would've benefited the new consoles AND Bleeding Edge in terms of polish and exposure, especially if they kept it free-to-play.

Seeing, though, that they let 343i go unchecked as long as they have on Halo Infinite kind of signals to me that MS's QA and oversight over at least some of their studios is too loose. They really should've clamped down on 343i years ago to get them up to par. The heads at that studio should've been removed well before the July showing if they thought what they had in July was satisfactory. I also agree that MS don't have a lot of time to turn around the current issues here; power in and of itself isn't the deciding factor, but it's a major factor in the early phase. MS did so much messaging on the power narrative that, with how things are playing out right now, it's kind of brutal. Exacerbated because there are no (new) 1P games available at the moment which show what the system can really do. Gears 5 is arguably the closest, but it also released last year and the new DLC isn't coming until a bit later this month IIRC.

Something will have to start changing for them on that front around early next year. Either that, or some of these H1 exclusives (The Medium, Scorn, Bright Memory Infinite, etc.) need to hit pretty hard. If they live up to what's been shown so far (especially BMI from the May event), I think they'll be good. Same for Exo-Mecha and The Ascent. MS needs to give these games all the engineering support they can get; they need to be showcases for Series X and Series S if their 1P stuff like the FS2020 port and Halo Infinite isn't hitting until H2 2021.

I think they can do it, but we'll have to see.

Not really. Virtualisation of OSes with the necessary hardware has a very low overhead.

Yeah; there was a host OS plus two guest partitions running under a hypervisor. Virtual machines, basically. Nowhere near the same thing as three actual OSes (otherwise you'd have to reboot every time you wanted to switch between playing a game and watching a movie).
 
I don't know for sure, but I don't think the PS5 has a unified L3 cache. The Sony / Cerny patent is massively different to what AMD have spent years architecting for Zen 3.

If this NBA drop is on the CPU side, it's software. And not necessarily on the developer end.
Could be something as simple as draw calls being more expensive on Xbox. Depending on how the crowd is drawn, it could be stalling because of that.
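As a rough illustration of how that could play out, here's a minimal sketch (hypothetical renderer calls, not how the game actually draws its crowd) of per-spectator draw calls versus instancing:

```cpp
#include <vector>

struct Mat4 { float m[16]; };
struct Mesh { /* vertex/index buffers, material, etc. */ };

// Hypothetical renderer calls; a real engine would go through D3D12/GNM here.
void drawMesh(const Mesh&, const Mat4&) { /* stand-in for one draw call */ }
void drawMeshInstanced(const Mesh&, const std::vector<Mat4>&) { /* stand-in for one instanced call */ }

// One draw call per spectator: if each call carries extra CPU-side cost on one
// platform, thousands of these per frame is exactly where it would show up.
void drawCrowdNaive(const Mesh& spectator, const std::vector<Mat4>& seats) {
    for (const Mat4& seat : seats)
        drawMesh(spectator, seat);            // N draw calls
}

// Instanced: upload every seat transform and issue a single call; the GPU
// repeats the mesh per instance, so CPU submission cost stays roughly constant.
void drawCrowdInstanced(const Mesh& spectator, const std::vector<Mat4>& seats) {
    drawMeshInstanced(spectator, seats);      // 1 draw call for the whole crowd
}
```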
 
Whatever the reason, it seems to be a CPU bottleneck on XSX, like in AC Valhalla. Hypothetically, if PS5 had a unified L3 cache, then it would be easier to explain the PS5's edge here without invoking the usual bug or bad-tools excuses.
CPU bottleneck in Valhalla? How would one get to that conclusion? That game dropped frames on the retail patch at moments when resolution would drop, implying a GPU limitation of some sort. Patch 1.04 comes out and lowers the lowest resolution XSX can drop to, and now it tears and runs better than PS5 in the scenes I have tested. That shows quite conclusively a GPU-related situation to me, not a CPU one.
 
I don't think we should talk about bottlenecks at this point, because right now all of those games are basically ports in a sense that don't leverage console power. But it will be rough for XSX if the issues are not resolved somehow.

The only games that are directly comparable are third party games which by their nature are going to be ports. Even when developers talk about a lead platform, it's not like they're tailoring game design and tech around that particular hardware setup. Minor differences and weird hitches that nobody will care about are likely all that will ever separate the performance on the new consoles.

The most interesting stuff I'm seeing this gen is how devs are approaching the Series S.

Could be something as simple as draw calls being more expensive on Xbox. Depending on how the crowd is drawn, it could be stalling because of that.

I haven't played a sports game for decades, but I was hugely impressed by the crowd detail in this game.
 
CPU bottleneck in Valhalla? How would one get to that conclusion? That game dropped frames on the retail patch at moments when resolution would drop, implying a GPU limitation of some sort. Patch 1.04 comes out and lowers the lowest resolution XSX can drop to, and now it tears and runs better than PS5 in the scenes I have tested. That shows quite conclusively a GPU-related situation to me, not a CPU one.
This game also runs on PS4 with those weak, underclocked Jaguar cores... it should not be CPU intensive.
 
They wanted to be able to run four virtualized instances of Xbox One S games on a Series X,
But I thought they wanted to use Series X for streaming instead of Xbox One S because the One S limits streaming to 720p?

I also doubt MS did not know about the issues in third-party games. Looking at how Gears 5 runs, I am sure XSX can do better in 3rd party games.
 
But I thought they wanted to use Series X for streaming instead of Xbox One S because the One S limits streaming to 720p?

I also doubt MS did not know about the issues in third-party games. Looking at how Gears 5 runs, I am sure XSX can do better in 3rd party games.


The 720p streaming limit with xCloud is due to the external media encoder that they use with the current xCloud blades (based on the One S console). They did talk about it; it was something to do with the latency being too high when streaming at a higher resolution than 720p. The Series X SoC has a hardware encoder built in that can do 4K/60 encoding at negligible additional latency, much reduced from the current external encoder. Bit of an assumption, but I would assume it can do four 1080p streams.
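For what it's worth, the pixel-throughput arithmetic lines up with that assumption: four 1080p60 streams add up to exactly one 4K60 stream. A quick check:

```cpp
#include <cstdio>

int main() {
    // Pixels per second for one 4K60 session versus four 1080p60 sessions.
    const long long px4k60    = 3840LL * 2160 * 60;   // 497,664,000
    const long long px1080p60 = 1920LL * 1080 * 60;   // 124,416,000
    std::printf("4K60: %lld px/s, 4 x 1080p60: %lld px/s\n", px4k60, 4 * px1080p60);
    return 0;   // the two totals match, which is what makes the assumption plausible
}
```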

I wonder what the RAM setups are like on the new xCloud blades? They must have more RAM if they are running four Xbox One games simultaneously; the One S has 5 GB of RAM available for games, so maybe 24 GB per blade?
 
I wonder what the RAM setups are like on the new xCloud blades? They must have more RAM if they are running four Xbox One games simultaneously; the One S has 5 GB of RAM available for games, so maybe 24 GB per blade?

The bus on the SoC is 320-bit, so it could potentially go to 20 GB single-sided or 40 GB in clamshell mode. They could maybe save a bit on cost by just doing what the XSX does with different sizes of memory chips (a mix of 8 and 16 Gbit chips), which would be 32 GB.

They could probably use any mix of chips they needed to get to where they want to be, but to maximise capability for non-gaming tasks they might want to max out the systems at 40 GB.
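A quick sketch of the capacities being discussed, all of it speculation based on the 320-bit bus (ten 32-bit channels, one or two chips per channel):

```cpp
#include <cstdio>

int main() {
    // Ten 32-bit channels on the 320-bit bus; one chip per channel single-sided,
    // two per channel in clamshell. 16 Gbit = 2 GB, 8 Gbit = 1 GB.
    std::printf("Single-sided, all 16 Gbit chips: %d GB\n", 10 * 2);          // 20 GB
    std::printf("Clamshell, all 16 Gbit chips:    %d GB\n", 20 * 2);          // 40 GB
    std::printf("Clamshell, XSX-style mix:        %d GB\n", 12 * 2 + 8 * 1);  // 32 GB
    return 0;
}
```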
 
The only games that are directly comparable are third party games which by their nature are going to be ports. Even when developers talk about a lead platform, it's not like they're tailoring game design and tech around that particular hardware setup.

I think that poster means last-gen ports, which are definitely going to tend to have a different performance profile from (DX12, Vulkan) PC ports or PS5-to-Xbox ports. One thing I'm really curious about is when we're going to start seeing comparisons of games we know have really modern engines -- FIFA 21 (any Frostbite game, really), Doom Eternal (next-gen port, if it's ever coming), and so on.

That's going to answer the question of whether current games are just underutilising the Xbox's wide GPU, I think. If Xbox is still dropping frames in FIFA, there's something wrong other than just the different approaches, IMO.
 
PS5 putting in that foliage work... :runaway:

Edit: As expected, as reported/rumored for more than a year now, PS5 rasterization is around RTX 2080. Not bad at all...

Great work Alex!!!
Thanks :)
It took a hilarious amount of time to get the performance data here... watching the game's opening soooo many times. Also, this is a pretty not-too-great PC version. So barebones!
 