Digital Foundry Article Technical Discussion [2018]

Status
Not open for further replies.
DF article on the Overwatch Xbox One X patch, which was only a 650MB update: http://www.eurogamer.net/articles/digitalfoundry-2018-overwatch-xbox-one-x-patch-analysis

Now this is more like it. Blizzard's Xbox One X update adds a dynamic 4K solution which adjusts the pixel count to sustain 60fps performance. The reality is a range between 2112x2160 and 3840x2160, though typically at numbers in between once the action kicks off. It's a big advantage over PS4 Pro's fixed 1080p.
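The scaler described above can be sketched as a simple feedback loop: widen the framebuffer when the GPU beats the 16.7ms budget, narrow it when it misses. This is a hypothetical illustration, not Blizzard's actual implementation; only the clamp range comes from the resolutions DF measured.

```python
# Hypothetical dynamic-resolution controller (not Blizzard's actual code):
# height stays at 2160, width scales with GPU load within DF's measured range.
FRAME_BUDGET_MS = 1000.0 / 60.0      # 16.67 ms per frame at 60fps
MIN_WIDTH, MAX_WIDTH = 2112, 3840
HEIGHT = 2160

def next_width(current_width: int, last_gpu_ms: float) -> int:
    """Pick the next frame's render width from the last frame's GPU time."""
    # With a fixed height, pixel count (and, roughly, GPU time) is
    # proportional to width alone, so scale width linearly.
    target = current_width * FRAME_BUDGET_MS / last_gpu_ms
    # Move only halfway toward the target to avoid visible oscillation.
    damped = current_width + 0.5 * (target - current_width)
    return max(MIN_WIDTH, min(MAX_WIDTH, int(damped)))
```

For example, a frame that took 20ms at 3840 wide would pull the next frame down toward 3520x2160, while light scenes drift back up to full 4K.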


Beyond this you're broadly getting the same package as PS4 Pro. Shadow resolution is a match between the two, falling short of PC's top-quality shadow setting, and while the difference is subtle at best, ambient occlusion also sees an improvement on Xbox One X. Beyond that, the visual feature set of the game remains firmly in console territory, so PC's higher-end reflection settings aren't implemented.

In terms of performance, 60fps is crucial to the Overwatch experience and on PS4 Pro, there is a sense that fixed 1920x1080 resolution gives it a comfortable overhead in keeping 60fps performance locked down. By comparison, Xbox One X shows small and occasional signs of strain in sustaining its target frame-rate. In a nutshell, the game uses an adaptive v-sync to render out incomplete frames, if they exceed the render time budget. What that means in practical terms is screen-tear on Xbox One X. At times, you'll see the top third of the screen briefly showing tear artefacts as the renderer tries to keep up with the action.

As such, actual frame drops are very rare. Barring one or two hitches, 60fps is practically locked down on Xbox One X. The only small downside next to PS4 Pro is the outbursts of tearing at the top of the screen, manifesting as a slight wobble, but that's a relatively small price to pay for the resolution increase. Again, taking areas which show the lowest pixel count of 2112x2160 - taken as the worst-case scenario for the engine from our test range - the frame-rate only drops to 58fps.
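The adaptive v-sync behaviour described above can be illustrated with a toy model: a frame that misses the 16.7ms budget is flipped mid-scan (a visible tear near the top of the screen) rather than held for the next vblank (a dropped frame). This is a simplified sketch of the trade-off, not the engine's actual presentation logic.

```python
# Toy model of adaptive v-sync as described in the article: over-budget
# frames tear instead of dropping. All names here are illustrative.
BUDGET_MS = 1000.0 / 60.0   # 16.67 ms at 60fps

def classify_frames(frame_times_ms, adaptive=True):
    """Return (clean, torn, dropped) counts for a list of GPU frame times."""
    clean = torn = dropped = 0
    for t in frame_times_ms:
        if t <= BUDGET_MS:
            clean += 1
        elif adaptive:
            torn += 1      # presented late mid-scan: screen-tear
        else:
            dropped += 1   # held for the next vblank: frame-rate dip
    return clean, torn, dropped
```

With the same frame times, adaptive v-sync trades dropped frames for torn ones, which is why the X holds 60fps but shows tearing at the top of the screen.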

It's an excellent showing overall and the implementation here is so impressive that it only deepens the mystery surrounding the lacklustre Pro upgrade. Xbox One X does have big compute and memory bandwidth advantages over its PlayStation rival, but at the minimum there's a 2.2x increase in pixel-count when the X's dynamic scaler is pushed hardest, rising to a more typical 2.9x in busy gameplay, propelled upwards again to a full-on 4x in less busy scenes. Despite the spec advantage, we still feel sure that PS4 Pro could deliver much more.
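The pixel-count multiples quoted here follow directly from the resolutions involved:

```python
# Pixel-count ratios between PS4 Pro's fixed 1080p and the X's dynamic
# range, matching the 2.2x-4x figures quoted in the article.
pro = 1920 * 1080       # 2,073,600 pixels
x_low = 2112 * 2160     # worst case measured on Xbox One X
x_high = 3840 * 2160    # full native 4K

print(round(x_low / pro, 1))    # 2.2
print(round(x_high / pro, 1))   # 4.0
```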
Well, maybe Overwatch on consoles is very memory-hungry. That's really a big flaw of the PS4 Pro: it has much more power than the PS4, but the "added" memory and bandwidth are really disappointing.
Blizzard was never a master at optimizing their games for hardware specs; they always tried to optimize them for the fun factor.
Right now they are trying to upgrade the World of Warcraft engine to DX12, so I really doubt they could squeeze everything out of the consoles. More or less they "just" doubled the res on the Xbox One X (I know it is a bit more) and quadrupled it in the best case. Nothing really astonishing if you look at the specs of the X versus the base hardware.
I don't want to downplay the patch or the work of the Blizzard team, but you can see that they haven't reinvented the game engine to get it on the X; they "just" upped the dynamic resolution. And we need more patches that do exactly that (e.g. BF1, Mass Effect Andromeda, ...).
If they bring out patches, they don't even need to use the newer SDK or anything like that; they just need to increase the dynamic res for base hardware. As the Xbox One X has a 6TF GPU, more bandwidth and more CPU power available for base games, it could automatically increase resolution beyond the base consoles. Only memory could be a problem with this "old" mode.
 
I mean, the standard PS4 is already 1080p/locked 60fps... the Pro simply adds a 4K HUD and higher AF. Do you honestly think that's all that could be done?
 
Not only that, but there is almost complete parity between the PS4/XB1. If bandwidth were a problem, the game should have much worse performance on XB1... also, this means that the Pro can't really exceed XB1 performance in this game...
 
Not only that, but there is almost complete parity between the PS4/XB1. If bandwidth were a problem, the game should have much worse performance on XB1... also, this means that the Pro can't really exceed XB1 performance in this game...
You forget that the XB1 and PS4 already have dynamic res in place. Also, the bandwidth is not really a big problem if you can work with the 32MB ESRAM. But that doesn't matter right now, because this is about the PS4 Pro and XB1X. Yes, the PS4 Pro could do better, but if the limit is memory (we don't know what the engine does), then the PS4 Pro has a problem.
As I wrote, Blizzard normally doesn't optimize their games that much (technically).
It is much easier to get a higher res with much more RAM and bandwidth than without ;)
 
You forget that the XB1 and PS4 already have dynamic res in place.

Overwatch virtually runs at 1080p: "Both of the console versions feature a dynamic framebuffer, which hits native 1080p most of the time. Pixel counts are lowered slightly when the engine is under stress, but for the most part the use of resolution scaling is fairly restrained, with drops below 1080p rarely lasting for more than a few seconds before image quality is restored to its fullest. This tends to happen more often on Xbox One, but it's not something that distracts on either platform, and resolution never falls far away from the intended 1080p target once gameplay begins."

http://www.eurogamer.net/articles/digitalfoundry-2016-overwatch-face-off

Also, the bandwidth is not really a big problem if you can work with the 32MB ESRAM.

Yet, many developers said that 32mb wasn't enough and that ESRAM was a problem in some situations.

Also, in most games the PS4/XB1 differences exceed what should be expected from the GPU specs alone, especially in the most recent games: much lower resolution, worse framerate, worse assets.

Yes, the PS4 Pro could do better, but if the limit is memory (we don't know what the engine does), then the PS4 Pro has a problem.

So, once again, you're basically saying that the Pro can't do better than the XB1 at the same resolution. It doesn't make any sense. The answer is really simple: they didn't do anything to support the PS4 Pro.

 
You forget that the XB1 and PS4 already have dynamic res in place. Also, the bandwidth is not really a big problem if you can work with the 32MB ESRAM. But that doesn't matter right now, because this is about the PS4 Pro and XB1X. Yes, the PS4 Pro could do better, but if the limit is memory (we don't know what the engine does), then the PS4 Pro has a problem.
As I wrote, Blizzard normally doesn't optimize their games that much (technically).
It is much easier to get a higher res with much more RAM and bandwidth than without ;)
Well, the Pro has 500MB more RAM for games and more bandwidth. That would be perfectly enough for a 1440p dynamic resolution. There is no technical reason that could explain the current Pro version.
 
Well, the Pro has 500MB more RAM for games and more bandwidth. That would be perfectly enough for a 1440p dynamic resolution. There is no technical reason that could explain the current Pro version.
Unless you know the specifics of what the game is keeping in memory, it's somewhat pointless to definitively say that they have the memory to allocate for all their rendering buffers (1.77x) after loading in 4K-UI assets.

It'd have been nice to see a comparison of load times, but it's possible they prioritize faster loading between game matches so they dedicate a much larger portion of memory to that.
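For a rough sense of the memory argument either way: render targets alone don't obviously exhaust the Pro's extra 500MB. The bytes-per-pixel figure below is a made-up ballpark for a handful of colour/depth targets, not Overwatch's actual layout; real engines vary widely.

```python
# Rough render-target footprint at an assumed 16 bytes per pixel
# (a few colour/normal/depth targets); purely illustrative numbers.
BYTES_PER_PIXEL = 16

def rt_megabytes(width, height):
    return width * height * BYTES_PER_PIXEL / (1024 ** 2)

base_1080p = rt_megabytes(1920, 1080)        # ~31.6 MB
native_4k = rt_megabytes(3840, 2160)         # ~126.6 MB
print(round(native_4k - base_1080p))         # extra ~95 MB at native 4K
```

Even at this generous per-pixel cost, the jump from 1080p to native 4K is on the order of 95MB, so if memory really is the limit it would more likely be textures, streaming or loading buffers than the framebuffers themselves.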
 
Well, maybe Overwatch on consoles is very memory-hungry. That's really a big flaw of the PS4 Pro: it has much more power than the PS4, but the "added" memory and bandwidth are really disappointing.
As a PS4 Pro owner I've been consistently let down by the console. Some games look great, with huge improvements in image quality or performance, but most don't. I do think it's memory constrained, but I think the real problem is fillrate. The CPU got a 30% boost over the vanilla PS4, memory bandwidth got a 24% boost, and shaders/compute got a 130% boost, but fillrate is only 14% higher. I'll admit I didn't take a deep look at the specs before I upgraded, but I think it's unrealistic to expect hardware that can only draw 14% more pixels to draw 400% more without a performance penalty. The hardware is great if the games are texture, shader or CPU limited on standard PS4s, but not if you are fill limited.
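These percentages line up with the commonly cited public specs (1.6 vs 2.13GHz CPU clocks, 176 vs 218GB/s bandwidth, 1.84 vs 4.2TF, and 32 ROPs at 800 vs 911MHz, i.e. 25.6 vs 29.2Gpix/s fillrate):

```python
# Deriving the uplift percentages from commonly cited PS4 / PS4 Pro specs.
ps4 = dict(cpu_ghz=1.6, bw_gbs=176, tflops=1.84, fill_gpix=25.6)
pro = dict(cpu_ghz=2.13, bw_gbs=218, tflops=4.20, fill_gpix=29.2)

for k in ps4:
    gain = 100 * (pro[k] / ps4[k] - 1)
    print(k, round(gain))  # cpu ~33, bandwidth ~24, compute ~128, fill ~14
```

The computed 33% CPU and 128% compute gains are close enough to the 30% and 130% quoted above; the key point, the 24% bandwidth and 14% fillrate figures, checks out exactly.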
 
As a PS4 Pro owner I've been consistently let down by the console. Some games look great, with huge improvements in image quality or performance, but most don't. I do think it's memory constrained, but I think the real problem is fillrate. The CPU got a 30% boost over the vanilla PS4, memory bandwidth got a 24% boost, and shaders/compute got a 130% boost, but fillrate is only 14% higher. I'll admit I didn't take a deep look at the specs before I upgraded, but I think it's unrealistic to expect hardware that can only draw 14% more pixels to draw 400% more without a performance penalty. The hardware is great if the games are texture, shader or CPU limited on standard PS4s, but not if you are fill limited.
Well, fill-rate should never be a problem, even on the base XB1. There is more fill-rate available than the memory bandwidth can handle. But who knows what Blizzard is doing in its game code :)

But I know what you mean. My XB1X is also... well, there are not that many titles that use the full GPU. It is nice to have, and everything works better, but I wish they would implement an optional AA technique that could be forced for titles that haven't been patched to use the whole GPU.
E.g. Watch Dogs 2 looks horrible on a 4K screen. A little AA would make it much better (even if it were post-AA).
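A back-of-envelope check of the "fill-rate outruns bandwidth" claim, using PS4 Pro's public ROP count, GPU clock and bandwidth, and assuming an uncompressed 32-bit target with blending (4-byte read plus 4-byte write per pixel); real workloads with colour compression will differ.

```python
# Rough fill-rate vs bandwidth comparison for PS4 Pro. ROP count, clock
# and bandwidth are public specs; bytes-per-pixel is an assumption for
# an uncompressed 32-bit target with alpha blending (read + write).
ROPS = 32
GPU_CLOCK_GHZ = 0.911
BANDWIDTH_GBS = 218.0
BYTES_PER_PIXEL = 8

peak_fill_gpix = ROPS * GPU_CLOCK_GHZ              # ~29.2 Gpixels/s
bw_limited_gpix = BANDWIDTH_GBS / BYTES_PER_PIXEL  # ~27.3 Gpixels/s

print(peak_fill_gpix > bw_limited_gpix)  # ROPs can outrun raw bandwidth
```

In this blend scenario the ROPs' peak (~29.2Gpix/s) slightly exceeds what 218GB/s can feed (~27.3Gpix/s), which is the sense in which raw fill-rate can outrun memory bandwidth.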
 
Well, fill-rate should never be a problem, even on the base XB1. There is more fill-rate available than the memory bandwidth can handle. But who knows what Blizzard is doing in its game code :)
I don't know. A GTX 1070 has the same memory bandwidth as the PS4 Pro but much higher fillrate (over 3x more) and performs much better. I know there is some contention because of the UMA, but by most estimates that's less than 20% of the total bandwidth at most, and usually less than 15%.
 
As a PS4 Pro owner I've been consistently let down by the console. Some games look great, with huge improvements in image quality or performance, but most don't. I do think it's memory constrained, but I think the real problem is fillrate. The CPU got a 30% boost over the vanilla PS4, memory bandwidth got a 24% boost, and shaders/compute got a 130% boost, but fillrate is only 14% higher. I'll admit I didn't take a deep look at the specs before I upgraded, but I think it's unrealistic to expect hardware that can only draw 14% more pixels to draw 400% more without a performance penalty. The hardware is great if the games are texture, shader or CPU limited on standard PS4s, but not if you are fill limited.
Actually fill-rate is around 65% higher in a typical read & write scenario (real bench, not theoretical).

For 1440p you only need to draw 78% more, so around 65% should be enough. (If you talk of % more then 4K is 300% more to draw than 1080p, not 400%).

In the case of Overwatch, well the game simply didn't get decently enhanced by the devs. They added 16xAF and called it a day.
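The 78% and 300% figures above check out directly:

```python
# Percentage increases in pixel count relative to 1080p.
p1080 = 1920 * 1080
p1440 = 2560 * 1440
p4k = 3840 * 2160

print(round(100 * (p1440 / p1080 - 1)))  # 78  -> 1440p is 78% more pixels
print(round(100 * (p4k / p1080 - 1)))    # 300 -> 4K is 300% more, i.e. 4x
```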
 
I don't know. A GTX 1070 has the same memory bandwidth as the PS4 Pro but much higher fillrate (over 3x more) and performs much better. I know there is some contention because of the UMA, but by most estimates that's less than 20% of the total bandwidth at most, and usually less than 15%.
They're not even the same architecture, and the GTX 1070 has... more of everything... so you can't compare them. And yes, there is the memory contention, and the GTX has its own memory pool (bigger than the pool for games on the PS4/Pro), and the system has even more memory available.
 
Actually fill-rate is around 65% higher in a typical read & write scenario (real bench, not theoretical).
I haven't seen this anywhere. Do you have a source?

For 1440p you only need to draw 78% more, so around 65% should be enough. (If you talk of % more then 4K is 300% more to draw than 1080p, not 400%).
Yep, you are right there. I was using half precision math ;p

They're not even the same architecture, and the GTX 1070 has... more of everything... so you can't compare them. And yes, there is the memory contention, and the GTX has its own memory pool (bigger than the pool for games on the PS4/Pro), and the system has even more memory available.
It doesn't have more memory bandwidth. Which is sort of the point I was making: if a part with similar memory bandwidth and more of everything isn't constrained by bandwidth, how can a part with less of everything at nearly the same bandwidth be? Something else is holding it back.
 
It doesn't have more memory bandwidth. Which is sort of the point I was making: if a part with similar memory bandwidth and more of everything isn't constrained by bandwidth, how can a part with less of everything at nearly the same bandwidth be? Something else is holding it back.
Again, different architecture. It just might use the bandwidth better (e.g. compression techniques or something like that). Also, don't forget the big bite memory contention takes out of the bandwidth; Shifty just stated in another thread that this can have a big hit on memory bandwidth. So even if it had the same bandwidth as the GTX 1070, it couldn't use all of it. And the bandwidth is not much higher than the PS4's; they just upped it a bit.
Yes, they might have done a better Pro patch, but we still don't know why there's such a difference. Like I stated, memory size could be the thing (just a guess), because 512MB more memory is not that much. Also, 16x AF might already eat up a chunk of the memory bandwidth.
The Xbox One X, on the other hand, has much more RAM and much higher bandwidth. Maybe it was just easier to optimize for that system and nothing needed a big rework. We just won't know until Blizzard says why they didn't increase the res.
Btw, they also didn't make it full native 4K on the Xbox One X, which should be possible if the XB1 can hit 1080p most of the time.
But well... we will never know how the engine is built.
 
It's 8X AF on Pro...

Also : "It's an excellent showing overall and the implementation here is so impressive that it only deepens the mystery surrounding the lacklustre Pro upgrade. Xbox One X does have big compute and memory bandwidth advantages over its PlayStation rival, but at the minimum there's a 2.2x increase in pixel-count when the X's dynamic scaler is pushed hardest, rising to a more typical 2.9x in busy gameplay, propelled upwards again to a full-on 4x in less busy scenes. Despite the spec advantage, we still feel sure that PS4 Pro could deliver much more."

http://www.eurogamer.net/articles/digitalfoundry-2018-overwatch-xbox-one-x-patch-analysis

I mean, the lack of Pro support is obvious for everyone...
 
It's sort of funny to see how, because of a single decision by Sony in their 4Pro SDK example on how to detect and handle the extra resources of the 4Pro, they're now having to implement a new firmware mode to override what the games are doing.

Anyways, here's the DF article on the new SuperSampling override for 4Pro: http://www.eurogamer.net/articles/d...er-sampling-tested-big-boosts-for-1080p-users

With the arrival of the upcoming firmware 5.5, Sony has introduced a new option - system-level super-sampling. It addresses a key frustration for PS4 Pro users hooked up to 1080p screens: the lack of access to high resolution support on a range of games.

...

Comparisons with Xbox One X's implementation are inevitable, and while the new super-sampling mode for Pro users is useful, it's still something of a fudge. After all, the whole concept of having Pro enhancements linked to specific display outputs in the first place is a really bad idea. Microsoft's alternative approach is simple and elegant: the system doesn't tell developers which display is attached to the console. This forces game-makers to offer in-game options for all modes, regardless of whether a 1080p or 4K TV is attached and by extension, 1080p super-sampling is automatically taken care of at the system level. It's the ideal set-up and the one we've advocated since PlayStation 4 Pro launched.

It's worth stressing that the majority of games, from Rise of the Tomb Raider to Horizon Zero Dawn to Monster Hunter World do get it right, ensuring that all game modes are available to all users, regardless of what display they might happen to own. With Xbox One X effectively enforcing this standard for all games, hopefully future Pro titles will have SSAA built-in too. However, Metal Gear Survive's beta once again saw exclusive modes that kick in depending on which display output is selected. If this remains the case in final code, it demonstrates that firmware 5.5's SSAA functionality may well be a useful fallback option to have not just for existing games, but potentially for future titles too.
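System-level super-sampling boils down to rendering at the higher resolution and filtering down to 1080p. Since 3840x2160 is exactly twice 1920x1080 on each axis, a 2x2 box filter is the simplest possible model of it (actual console scalers use more sophisticated filters; this is just an illustration):

```python
# Minimal 2x2 box-filter downsample: the core idea behind 4K -> 1080p
# super-sampling, sketched on a row-major grid of grey values.
def downsample_2x(pixels):
    """Average each 2x2 block; halves both dimensions."""
    h, w = len(pixels), len(pixels[0])
    return [
        [
            (pixels[y][x] + pixels[y][x + 1]
             + pixels[y + 1][x] + pixels[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]
```

Each output pixel averages four rendered pixels, which is where the anti-aliasing benefit for 1080p users comes from.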
 
And now they look at the PC version of FFXII -- http://www.eurogamer.net/articles/digitalfoundry-2018-final-fantasy-12-pc-analysis

Final Fantasy 12 on PC delivers 60fps - but system requirements are high
A PS2 remaster as demanding as many modern games.

Ultimately, Final Fantasy 12 seems to have remarkably high system requirements for what is a PlayStation 2 remaster, and while 1080p60 should be well within reach for today's mainstream gaming PCs, reaching beyond this to higher resolutions requires a lot of optimisation effort - a task not made easy by the menu system. Meanwhile, AMD performance (we tested on two distinct systems with the latest drivers) seems strangely low. Hopefully, some of these issues will be addressed in a future patch, but right now it's something to bear in mind when considering the PC release.

Overall, while the graphics menu set-up and overall PC performance are less than ideal, in other respects the conversion work is excellent. There are no glaring issues when playing at 60fps, and the smoother frame-rate makes the experience a palpable upgrade over the standard PS4 and Pro versions. Reduced detail texture oddities aside, the PC version of Final Fantasy 12 is the best way to play - just make sure you come equipped with the GPU power to get the job done.

---
The Zodiac Age gets a new port and it can run at 60fps! Indeed, you can't actually run it any higher. But what kind of hardware do you need to get the job done? We test a range of modern GPUs at 1080p, 1440p and 4K resolutions and offer up a full graphics comparison.

 
It's sort of funny to see how, because of a single decision by Sony in their 4Pro SDK example on how to detect and handle the extra resources of the 4Pro, they're now having to implement a new firmware mode to override what the games are doing.

Anyways, here's the DF article on the new SuperSampling override for 4Pro: http://www.eurogamer.net/articles/d...er-sampling-tested-big-boosts-for-1080p-users

With the arrival of the upcoming firmware 5.5, Sony has introduced a new option - system-level super-sampling. It addresses a key frustration for PS4 Pro users hooked up to 1080p screens: the lack of access to high resolution support on a range of games.

...

Comparisons with Xbox One X's implementation are inevitable, and while the new super-sampling mode for Pro users is useful, it's still something of a fudge. After all, the whole concept of having Pro enhancements linked to specific display outputs in the first place is a really bad idea. Microsoft's alternative approach is simple and elegant: the system doesn't tell developers which display is attached to the console. This forces game-makers to offer in-game options for all modes, regardless of whether a 1080p or 4K TV is attached and by extension, 1080p super-sampling is automatically taken care of at the system level. It's the ideal set-up and the one we've advocated since PlayStation 4 Pro launched.

It's worth stressing that the majority of games, from Rise of the Tomb Raider to Horizon Zero Dawn to Monster Hunter World do get it right, ensuring that all game modes are available to all users, regardless of what display they might happen to own. With Xbox One X effectively enforcing this standard for all games, hopefully future Pro titles will have SSAA built-in too. However, Metal Gear Survive's beta once again saw exclusive modes that kick in depending on which display output is selected. If this remains the case in final code, it demonstrates that firmware 5.5's SSAA functionality may well be a useful fallback option to have not just for existing games, but potentially for future titles too.
Interestingly, with the 5.5 patch the best setup (having more options) will be on Pro.

For instance, with Metal Gear Survive, Pro owners will have two ways of playing the game: at 1080p without downsampling for a higher framerate, or at 1080p with downsampling for less aliasing. XBX owners can only play this game at high resolution (and a less stable framerate). Actually, Pro will be more 'PC' than XBX here.
 
Interestingly, with the 5.5 patch the best setup (having more options) will be on Pro.

For instance, with Metal Gear Survive, Pro owners will have two ways of playing the game: at 1080p without downsampling for a higher framerate, or at 1080p with downsampling for less aliasing. XBX owners can only play this game at high resolution (and a less stable framerate). Actually, Pro will be more 'PC' than XBX here.

Except, XBX has 1440p output :p
 