Digital Foundry Article Technical Discussion [2021]

How can it run at a higher average frame rate when unlocked if it already drops frames from the locked 60 into the 50s on both machines during gameplay? Photo mode can't be used as a benchmark because it doesn't represent gameplay, and as you said it takes the CPU out of the picture, so what was the point? You can't draw conclusions from photo mode when the CPU is out of the picture, because you need both a CPU and a GPU to play.

If a game is locked at 30 FPS it's generally running above that, but tied down to a consistent 30 because a varying frame rate would not be nice.

In an ideal world games would allow the user to decide whether they want to play locked or unlocked; if they let us switch off the lock, we'd be able to get a measure of the true performance gap rather than this muted one.

DF said it themselves that both consoles drop frames during gameplay, particularly during combat, and the Xbox drops even more because of the irregular stutters that happen. So saying that the Series X could do 1080p60 with ray tracing better than a PS5 just because of photo mode is ridiculous, since both consoles already perform the same during gameplay. What difference would 1080p make? He should have instead said the Series X will hold above 60fps during photo mode at 1080p.

I tend to agree but the thought is that maybe the XSX has some untapped potential in RT mode...the problem is that (as I've said) the photo mode is not game mode, so I believe the assumption is incorrect until whatever is holding back the XSX in game mode is resolved...I believe he also mentioned that the stuttering would need to be fixed.

Lowering to 1080p just gives you a little more headroom to hit 60 more frequently, and the 30s might move up to the 40s, so it could potentially average in the mid-to-high 50s, giving the user the option to accept those fluctuations for an overall better experience.
 
One way to improve this is to use separate, proper motion vectors for reflections (a rough sketch of the idea is below).
Pretty sure Frostbite had something on this in their presentations.
Hmm, I hope that gets adopted quickly in games using RT reflections, as the ghosting can be really immersion-breaking. It was so bad in Control on console that the performance mode actually produces a better image.
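
To illustrate the idea (not how Frostbite or Remedy actually implement it, just the general principle): for temporal accumulation of ray-traced reflections you want to reproject where the reflected image was last frame, not where the reflecting surface was. A minimal sketch in Python, assuming a static, roughly planar reflector and a pinhole camera; every name here is made up for illustration:

```python
# Sketch of "separate motion vectors for reflections": reproject the virtual
# reflected point rather than the reflecting surface, so the temporal filter
# fetches history from where the reflection actually was last frame.
# Assumes a static, (near-)planar reflector and a pinhole camera.
import numpy as np

def project(view_proj, p_world):
    """World-space point -> screen UV in [0, 1] via a 4x4 view-projection."""
    clip = view_proj @ np.append(p_world, 1.0)
    ndc = clip[:3] / clip[3]
    return 0.5 * ndc[:2] + 0.5

def surface_reprojection_uv(surface_pos, prev_view_proj):
    """Naive history fetch: reuse the surface's own motion -> ghosting."""
    return project(prev_view_proj, surface_pos)

def reflection_reprojection_uv(cam_pos, surface_pos, hit_dist, prev_view_proj):
    """Better history fetch for the reflection itself.

    For a planar mirror, the reflected image behaves as if it sat at a
    'virtual' point behind the mirror: continue the eye ray through the
    surface by the ray-traced hit distance, then reproject THAT point
    with last frame's camera.
    """
    view_dir = surface_pos - cam_pos
    view_dir = view_dir / np.linalg.norm(view_dir)
    virtual_pos = surface_pos + view_dir * hit_dist
    return project(prev_view_proj, virtual_pos)
```

The difference between those two UVs is essentially the ghosting you see when a renderer reuses the surface motion vectors for the reflection layer.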
 
DF Written Article @ https://www.eurogamer.net/articles/...upgrade-hits-60fps-but-somethings-up-with-ps5

The Division 2's next-gen upgrade is impressive - but something's up with PS5
60fps is transformative but visual effects are missing on the Sony console.

On the face of it, The Division 2's upgrade for next generation consoles should be fairly simple to describe, with wholly predictable results. Similar to titles like God of War and Days Gone, the last-gen codebase is updated with the game aware it's running on new hardware, unlocking the frame-rate in the process. The end result should be a capped 30fps experience now running at 60 frames per second - or close to it - with little or nothing else changed in the process. That's effectively what you're getting on Xbox Series X, but something is definitely amiss with the PS5 build, which is missing important graphics effects - visual features that aren't just present on Xbox consoles, but on PS4 Pro too.

But still, the headline is that all versions now run at 60fps, lifting the 30fps cap from the last-gen experience. It feels vastly smoother of course, transformative for a third person shooter, and there are improvements to loading times too - plus improved texture filtering on Xbox Series consoles. In terms of resolutions, The Division 2 retains the game's impressive temporal reconstruction technique, meaning we had to jump through hoops somewhat to discern actual native pixel counts. Dynamic resolution is in play on all systems, meaning that the 60fps action is delivered with a 900p to 1080p resolution range on Xbox Series S, rising to a 1800p-2160p range on Series X. Meanwhile, PlayStation 5 operates with a much wider range - 1080p is seemingly the lowest recorded resolution, rising to a maximum of 1890p.

Graphics settings on the Xbox consoles look to be a close match to the last-gen Xbox One X, but the move to solid-state storage and improved CPUs boosts the efficiency of the background streaming systems, with texture and geometry pop-in minimised to a certain extent - good stuff! In effect, Xbox Series consoles essentially get the game-changing boost in frame-rate married with additional visual refinements, mostly delivered by the system level back-compat feature set and the raw horsepower of the new hardware.

...
 
Yes, but it is still a mess. :yep2:

I have to agree with DF, this shouldn't have been released on PS5 in its current state.

Are there any user-facing system level settings that can be toyed with, like the 4Pro has?
 
If a game is locked at 30 FPS it's generally running above that, but tied down to a consistent 30 because a varying frame rate would not be nice.

In an ideal world games would allow the user to decide whether they want to play locked or unlocked; if they let us switch off the lock, we'd be able to get a measure of the true performance gap rather than this muted one.
Exactly, a true measurement would be the 30fps mode running unlocked during gameplay; that's the only benchmark I'd understand.

I tend to agree but the thought is that maybe the XSX has some untapped potential in RT mode...the problem is that (as I've said) the photo mode is not game mode, so I believe the assumption is incorrect until whatever is holding back the XSX in game mode is resolved...I believe he also mentioned that the stuttering would need to be fixed.

Lowering to 1080p just gives you a little more headroom to hit 60 more frequently, and the 30s might move up to the 40s, so it could potentially average in the mid-to-high 50s, giving the user the option to accept those fluctuations for an overall better experience.
Both consoles can hit 60 at 1080p with ray tracing; I think Remedy either didn't bother or didn't have time to do it. It's also a cross-gen game using legacy code, so I wouldn't use such a game for benchmarking purposes. As for the Series X's ray tracing potential, I don't think any dev is holding it back! Such an assumption is ridiculous, because games have to be capped at 30, 60 or 120 anyway, so the idea that a 1080p60 mode would somehow favour the Series X is wrong; the PS5 would also hold 1080p60 with no issues in gameplay.

We also saw in Call of Duty: Cold War that both consoles hit 60fps with dynamic resolution, with the PS5 having slightly better quality ray tracing. I don't believe rumours like the "tools" excuse or developers holding back, because it's been a dozen games already and most of them perform similarly on both consoles, most a bit better on PS5 and some on Series X. I think people are still stuck in the "teraflops are everything" cult and fail to understand why the PS5 performs as well as it does, and therefore they create paradoxes when the Series X can't outperform the PS5. Until next-gen games start being compared we won't get the final picture of how these consoles match up, because you can't draw solid conclusions from cross-gen and BC titles.
 
Exactly, a true measurement would be the 30fps mode running unlocked during gameplay; that's the only benchmark I'd understand.


Both consoles can hit 60 at 1080p with ray tracing; I think Remedy either didn't bother or didn't have time to do it. It's also a cross-gen game using legacy code, so I wouldn't use such a game for benchmarking purposes. As for the Series X's ray tracing potential, I don't think any dev is holding it back! Such an assumption is ridiculous, because games have to be capped at 30, 60 or 120 anyway, so the idea that a 1080p60 mode would somehow favour the Series X is wrong; the PS5 would also hold 1080p60 with no issues in gameplay.

We also saw in Call of Duty: Cold War that both consoles hit 60fps with dynamic resolution, with the PS5 having slightly better quality ray tracing. I don't believe rumours like the "tools" excuse or developers holding back, because it's been a dozen games already and most of them perform similarly on both consoles, most a bit better on PS5 and some on Series X. I think people are still stuck in the "teraflops are everything" cult and fail to understand why the PS5 performs as well as it does, and therefore they create paradoxes when the Series X can't outperform the PS5. Until next-gen games start being compared we won't get the final picture of how these consoles match up, because you can't draw solid conclusions from cross-gen and BC titles.
I’m not suggesting devs are holding it back, but clearly something is. I appreciate TF isn’t everything but the XSX has other advantages so it wouldn’t be unexpected to see most games running at least a little better on XSX...don’t forget these machines are using very similar components.

I personally don’t think either console would be able to perform a good enough 1080p60 with RT and that’s why we ended up with 1440p30...but I might be wrong, guess we’ll never know.
 
Jesus, they butchered the PS5 version of The Division. Missing effects, lower resolution, and why the hell does it load in almost 17 seconds while the XSX loads in approximately 7... what a great patch.
 
Xbox Series S, rising to a 1800p-2160p range on Series X. Meanwhile, PlayStation 5 operates with a much wider range - 1080p is seemingly the lowest recorded resolution, rising to a maximum of 1890p.
The most logical explanation is that the GPU clocks must be much lower than 2.23GHz (i.e. Pro clocks), and the game can only reach 1890p in CPU-limited areas. As the game was already 1080p on base PS4, such a resolution wouldn't be strange using Pro clocks and 36 CUs. We have already seen that with AC Unity: the unpatched game (another Ubisoft title, BTW) usually runs at 60fps but with odd frame-rate drops into the 30s in GPU-limited scenes. Other Ubisoft games run with bugs on PS5 too, like AC Syndicate, so I am not surprised to see the volumetric fog bug in The Division 2 running via BC on PS5.

Interestingly, it was DF who first explained the odd frame-rate drops in AC Unity on PS5 as being down to the GPU running at default (or in-between) Pro clocks.
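
For what it's worth, a quick back-of-the-envelope check of that theory, using the commonly quoted GPU figures (base PS4: 18 CUs at 800 MHz; PS4 Pro: 36 CUs at 911 MHz; PS5: 36 CUs at up to 2.23 GHz). This is a paper compute-scaling sketch only; it ignores bandwidth, GCN-vs-RDNA 2 differences and CPU limits:

```python
# Rough compute-scaling sanity check for the "BC runs at Pro-ish clocks" idea.
# Figures are the commonly quoted CU counts and clocks; treat the result as an
# order-of-magnitude sketch, not a measurement.
base_ps4 = 18 * 0.800   # CUs * GHz
ps4_pro  = 36 * 0.911
ps5_full = 36 * 2.230

pixels_1080p = 1920 * 1080
pixels_1890p = 3360 * 1890   # the ~1890p ceiling DF measured on PS5

print(ps4_pro  / base_ps4)            # ≈2.3x the base PS4 compute at Pro clocks
print(ps5_full / base_ps4)            # ≈5.6x at full PS5 clocks
print(pixels_1890p / pixels_1080p)    # 1890p is ≈3.06x the pixels of 1080p
```

If the base PS4 build was 1080p30, doubling the frame rate alone eats most of the roughly 2.3x you get from Pro clocks, which would fit a 1080p floor in heavy scenes and 1890p only appearing where GPU load is light (i.e. CPU-limited areas), whereas full PS5 clocks would leave far more headroom than that.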
 
Isn't it just the 4Pro version?
It should be, with double the fps and a slight boost in resolution.
But it's not, somehow they've managed to break it.
The fact that effects are totally missing in the game almost seems like the ini file had them set to zero or something.
No SSR and volumetric effects.
Rock solid 60fps though.
 
I think people are still stuck in the "teraflops are everything" cult and fail to understand why the PS5 performs as well as it does, and therefore they create paradoxes when the Series X can't outperform the PS5.
You think users on this forum are stuck on TF? Really? Maybe users are stuck on the CPU, GPU and RAM bandwidth advantages instead.
Jesus, they butchered the PS5 version of The Division. Missing effects, lower resolution, and why the hell does it load in almost 17 seconds while the XSX loads in approximately 7... what a great patch.
Let's hope the bugs get fixed, but don't expect loading to improve much. This game is running in BC mode and most games in BC have slower loading on PS5 compared to Xbox.
 
I sometimes like to imagine there being a formula for every game and every engine, where
F(x, y, z) = frames per second,
with x being resolution, and y and z being other factors.

When I look at any polynomial, I'm reminded that the highest-power term largely determines the shape of the graph. So x^5y + x^3y + x^2y + x is still largely going to look like an x^5 graph despite all the terms trailing behind it. So in my mind, if RT is going to have this daunting impact on the GPUs, everything else is relatively diminished in its impact on frame rate; i.e., the rest of the render pipeline is stalled at RT, and after the stall it can only go as fast as it can. It doesn't really matter what comes before or even after the stall, because the game can only render as fast as that stall point. The idea that you can somehow 'catch up' after the stall doesn't make a lot of sense when you think about it: the 'catch up' portion would have to be very long to make up for the RT stall, and we just don't see that, because in all the benchmarks the XSX beats the PS5 in the camera modes. If there were such a discrepancy in alpha/ROP performance, the PS5 would blow by the XSX in scenes that aren't RT-limited.

Both consoles are being rocked by RT, and we see the same happening with the 6800 XT.
The reason the XSX is barely better than the PS5 in the corridor of death is that ultimately it only has marginally better RT. And we see that with the 6800 XT: even with 80 CUs and tons of bandwidth, 2x the ray calculations dropped its performance back to 34fps at console settings. So it's RT that is the limiter in all these benchmarks.
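
To put that intuition in toy form: if you model frame time as a sum of stage costs (a crude serial model that ignores async overlap), then once one stage dwarfs the rest, large differences in the other stages barely move the result. The millisecond numbers below are invented for illustration, not measured from Control:

```python
# Toy frame-time model for the "RT dominates the polynomial" argument.
# Stage costs in milliseconds are invented; this is a crude serial-sum model
# that ignores async compute overlap, purely to show the shape of the effect.
def fps(stage_ms):
    return 1000.0 / sum(stage_ms.values())

console_a = {"geometry": 3.0, "ray_tracing": 22.0, "lighting": 4.0, "post": 2.0}
console_b = dict(console_a, ray_tracing=20.0)   # ~10% faster at RT
console_c = dict(console_a, post=1.0)           # 2x faster at post-processing

print(round(fps(console_a), 1))   # ~32.3 fps
print(round(fps(console_b), 1))   # ~34.5 fps, a modest RT gain moves the needle
print(round(fps(console_c), 1))   # ~33.3 fps, halving a small stage barely does
```

Which is the point: when RT is the tall term, being much better at alpha/ROP-style work elsewhere gets muted in the final frame rate.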

There's a lot of truth here, but at the same time this game is a bit of a quick port by a small team to churn out some extra revenue on the title. I'm not certain Control is the best standard to measure RT performance against for either Series X or PS5. After all, there are a few other games out there at least around its level of technical prowess, such as Miles Morales, which have arguably as good (if not better) RT and can even offer a 1080p60 mode option.

So with a little more time and more targeted optimization, I think RT performance on RDNA 2-based architectures should improve fairly notably over the next year or so, especially once things like Super Resolution are readily available. But all the same, I just like being an optimist in these kinds of situations; I could easily be blissfully hopeful to a fault and RT performance will stay lukewarm throughout the generation xD.
 
Well yes (and no) - I'm actually agreeing that Control plays better (or at least as good when stutter is fixed) on PS5 in all modes - it's a wash.

I believe Alex has drawn the conclusion that because of the photo mode showing such a good performance gain (well, to the expected level of gap) that 'if dropped to 1080p XSX could run RT capped at 60'.

I can see why he has concluded this from the photo mode, but in the same breath we have the 2 consoles in gameplay neck and neck. So my conclusion (and I'm no expert - Alex knows a lot more than me!) is that something in the gameplay element of the game is dragging the XSX back...as such if both were dropped to 1080p 60 with RT I'd personally expect a similar performance.

From memory (in this thread) I believe Alex said the texture streaming is off in photo mode, and others have said the CPU is not really utilised at all in photo mode... I would ass-u-me that if those were being utilised the graphs would level off to a similar figure (from my ass) of 30-50? We've seen examples where the photo mode varies wildly (from 32 to 60+), so you can imagine such a varying FPS would be jarring, hence the 30FPS cap.

I guess the main question is, has the PS5 got something that in game boosts performance (eg IO/cache scrubbers) or is the XSX being held back due to immature dev kits (or whatever the technical term is)...I'd guess the latter, this photo mode (for me) cements the XSX potential and the paper specs show it shouldn't be any more bottle-necked than PS5 (unless I missed something).

Quite interesting, isn't it ;)?

Personally, given this game is a relatively quick port by a small porting team, I think calling some of the performance discrepancy on the devs here is valid, as crummy as that talking point can sometimes get (because it might insinuate "lazy devs", which isn't the case here at all).

However, if it IS something more to the lack of tools maturation and/or familiarity by devs, then the question has to be asked: how long will it take before the tools or familiarity with them is actually at a good point? Because the longer it drags out, the more that hurts optics.

If, though, it's down to something in the hardware design, well from this benchmark it clearly wouldn't be the GPU causing issue, in fact this would also disprove the talking point that the GPU is harder to saturate on the frontend (that's been a particularly popular one in the grapevine, ignoring the fact there are literal 80 CU GPUs on the market and rumored 160-CU chiplets with RDNA 3 coming so CLEARLY CU saturation is not a problem with AMD's architecture xD).

So, I had a brief thought that if anything, it could be the CPU. I don't know how CPU-intensive this game is though with all the environmental deformation, those seem like they'd be CPU-intensive operations, unless the game is utilizing GPGPU asynchronous compute to handle the physics calcs there (with how crappy the XBO/PS4 CPUs were, this might've been the case).

Anyway, I figure since the difference in CPU speeds between the two systems in SMT mode is only 100 MHz, and MS's CPU is handling a bit of the I/O stack management (a very small amount, but still more than the PS5's CPU in any case), there may be a case that MS's CPU isn't leaving enough room to issue draw calls to the GPU at a sufficient rate during certain events to keep the frame rate locked as consistently as Sony's.

Considering the difference in GPU clocks and the game running at the same settings on both, if we assume it's saturating the PS5's GPU at peak clocks you'd need about 44 of the Series X's CUs saturated to match the PS5's workload. I'm guessing that leaves, what, roughly 15% of the GPU still to saturate on Xbox's side? Though again, back with the photo mode benchmarks we're seeing the 16% - 18% difference in favor of Series X on average (with peaks well above that), so I'm just curious what CPU you'd need to hit that in actual gameplay.
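
Back-of-the-envelope on that CU figure, using the public clocks and CU counts (PS5: 36 CUs at up to 2.23 GHz; Series X: 52 CUs at 1.825 GHz). Paper compute only, ignoring bandwidth, caches and how the PS5 clock actually behaves under load:

```python
# Paper-compute comparison only; ignores bandwidth, caches, variable clocks etc.
ps5_cus, ps5_ghz = 36, 2.230
xsx_cus, xsx_ghz = 52, 1.825

cus_to_match_ps5 = ps5_cus * ps5_ghz / xsx_ghz
print(round(cus_to_match_ps5, 1))                              # ~44.0 CUs
print(round(1 - cus_to_match_ps5 / xsx_cus, 3))                # ~15% of the XSX GPU left over
print(round(xsx_cus * xsx_ghz / (ps5_cus * ps5_ghz) - 1, 3))   # ~18% more raw compute on XSX
```

So the 16% - 18% photo mode gap lines up almost exactly with the raw compute difference, which is what makes the near-parity in actual gameplay look like it comes from somewhere other than the GPU itself.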

So, just me spitballing here but, maybe it's simply a case the CPU needs to be clocked faster still in SMT Mode to ensure the GPU can be saturated with work and issued its drawcalls at a rate it can be fully maximized given the spec headroom over PS5's GPU, and also taking into account the CPU on MS's side still needs to handle some of the I/O stack plus management of the segmented memory at least on some level by the OS running on the reserved core. That'd be my first guess if we get to a point where we have to start asking if it's something ingrained in the hardware design causing the results we're seeing regularly between both platforms.

Also, I almost forgot to say, but none of this prevents pondering what Sony may've done to optimize for their performance results at the same time. It doesn't really have to be an either/or. For example there's the cache scrubbers as you've mentioned, but there's also the somewhat pestering speculation of a unified L3$ on PS5's CPU. If that happens to be true (and that's a very BIG if), for operations within SMT mode I feel that'd give the PS5 the advantage even with a 100 MHz clock speed deficit. That'd mean an easier time issuing draw calls to the GPU, and that ultimately would mean higher frame rates if settings are the same or similar on it and the Series X versions of games.

Again though, this is all just speculation to keep in the back pocket until we see more results to truly ascertain what may or may not be happening; for now I predominantly peg performance differences in Control between the systems on time/resource crunch from the porting team, and maybe some tools immaturity/unfamiliarity behind that.
 