Digital Foundry Article Technical Discussion [2017]

They likely wanted to remove the complete randomness of the network from their controlled GPU stress sections. DF indicated that was the same section that causes resolution drops on the 4Pro.
I'm just trying to debunk any forced parity theories, or answer why it's only 30fps on X. It's not a GPU issue, it's a CPU issue. And in order to test the CPU, you need to do PVE activities outside of the campaign.
 
DF's stress test isn't even close to being the most stressful area of the game. Public events are mini missions in large, open public spaces where you can have 6+ people with super abilities going off and dozens of enemy AI on screen.

Bungie straight up said it's a CPU limitation. DF seems to only test the campaign, which is not where the CPU is stressed.

Sure, that would explain why they don't try for 60 FPS, but it doesn't explain why more GPU-stressful options aren't enabled: better shadows, AO, AA, AF, etc., all of which are lower than their PC counterparts and have minimal CPU requirements.

Shadows could potentially be affected by the number of players on the GPU side, but AO, AA, and AF are examples of GPU features that wouldn't be impacted in any meaningful way by the addition of extra players.

If they are already at the max of what they can accomplish, then that's cool. But we'll never know unless Bungie talks about it. The fact that there aren't ever any drops in graphically (not CPU) limited scenes leaves one to wonder if they were just lazy, wanted to maintain graphical parity (not forced, but a voluntary decision), or if it is the best they can do.

Either way, I'm glad that they didn't go completely overboard as some developers have done.

Regards,
SB
 
Well, they basically designed for parity with Destiny 1 on Xbox One and PS4. PS4 obviously had room left on the GPU to handle more, but they ended up with a pixel-for-pixel match with Xbox One.
 
They also spent significantly more time with their version of the title. Looking back at some other developers, it does look like they just saw the power and pushed a version out without really taking the time to go through and optimize everything; they wanted to capitalize on making sales to early adopters.
 
Destiny 2: Pro 4K CBR vs XBX native pic comparison. I think it's the first time we have a direct and quite fair comparison.

I am quite surprised by how well the 4K CBR fares against native 4K. Overall I'd say the sharpness difference is even slightly smaller than your typical 900p versus 1080p comparison.

The reason: CBR 4K is not further blurred by bilinear upscaling. It seems they use a very good CBR solution in Destiny 2.

http://images.eurogamer.net/2017/articles/2017-12-08-17-23/Pro_000.jpg
http://images.eurogamer.net/2017/articles/2017-12-08-17-30/XOX_000.jpg
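To illustrate why a good checkerboard solution holds up so well: a checkerboarded frame shades only about half of the 2160p samples, but it shades them at their true native grid positions, so the merged result needs no extra resampling pass, whereas a genuinely lower-resolution render has to be stretched to the output grid with bilinear filtering, which softens fine detail. The toy 1-D sketch below is my own illustration, not Bungie's actual reconstruction (which also relies on motion data for moving scenes), and the sub-native width is just an example.

Code:
import numpy as np

W = 3840                   # native UHD width (output height would be 2160)
SUB_W = 3200               # an illustrative lower-resolution render width

def bilinear_upscale_1d(row, out_len):
    """Resample one scanline to out_len samples with linear interpolation."""
    x_src = np.linspace(0.0, 1.0, num=len(row))
    x_dst = np.linspace(0.0, 1.0, num=out_len)
    return np.interp(x_dst, x_src, row)

# Toy 1-D "scene": high-frequency detail along a scanline (thin wires, grass, etc.)
scanline_native = np.sin(np.linspace(0, 400 * np.pi, W))

# (a) Sub-native render + bilinear upscale: fine detail is filtered by the resample.
scanline_sub = np.sin(np.linspace(0, 400 * np.pi, SUB_W))
upscaled = bilinear_upscale_1d(scanline_sub, W)

# (b) Checkerboard-style: each frame shades half the native samples, but at their
#     true native positions; two frames merge to the full grid (static scene assumed).
frame_even = scanline_native[0::2]     # even columns this frame
frame_odd  = scanline_native[1::2]     # odd columns from the previous frame
cbr = np.empty(W)
cbr[0::2], cbr[1::2] = frame_even, frame_odd

print("bilinear upscale error:", np.abs(upscaled - scanline_native).mean())
print("checkerboard error    :", np.abs(cbr - scanline_native).mean())  # ~0 when nothing moves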
 
When did overhead (negative) and headroom (positive) become interchangeable? The 4K overhead on PS4 Pro results in checkerboarding plus dynamic resolution to reach it. The 4K headroom on Xbox One X results in a native resolution throughout.
 

If you look at it at native 4K, the differences are jarring at times and not so jarring at others.

The biggest difference is in the finer details. Take the wires in the provided shot: they are extremely jagged in the PS4-P version. The grass is also noticeably worse. The vegetation on the roof of that house has obviously less detail, and the roof has obvious stair-stepping. The device on the left side has an entire element bizarrely missing on the PS4-P. The items on the table by the NPC are less detailed. Most of this won't be noticeable in action, but the jaggies will crawl and be very noticeable to some while not bothering others.

On the flip side, it's the same as with the 900p/1080p comparisons between base PS4 and XBO in a lot of games: if you don't look at them side by side, you'd never know the PS4-P was slightly worse. I doubt anyone playing the PS4-P version will be unhappy with the presentation, and overall it looks great. Just like those 900p/1080p comparisons, it's unlikely anyone would notice the difference without a direct side-by-side, especially at typical living room distances.

Regards,
SB
 
The written article to go with their video; here are some snippets.

http://www.eurogamer.net/articles/digitalfoundry-2017-destiny-2-xbox-one-x-analysis

And that full 4K pixel-count does appear to be completely locked on the X. Base Xbox One and PlayStation 4 Pro both require the use of a dynamic resolution scaler operating across the horizontal axis in order to deliver Destiny 2's signature solid frame-rate, a situation we had half-expected the Xbox One X to replicate bearing in mind the impressive uptick in resolution. However, based on our suite of Destiny 2 stress tests, we couldn't get the dynamic scaler to kick in at all, Xbox One X sticking hard and fast to its target resolution.

Fine detail like grass is much clearer at Xbox One X's true 4K output, and trumps PS4 Pro's checkerboarded approach.

Overall though, it's a clearer image compared to the checkerboarded 4K on PS4 Pro, and a huge leap over the native 1920x1080 on a standard Xbox One.

As things stand, Destiny 2 on Xbox One X is well worth checking out. It delivers an excellent presentation that simply looks sensational, especially when the game's pyrotechnics flow in full effect. Within the bounds set by the console versions' feature set, Bungie delivers in scaling up the game beautifully for 4K display owners. What the team has handed in here essentially makes good on the promise Microsoft made for the hardware - a 4x increase to resolution with some overhead left over. In the case of this title, it seems that this little extra is deployed in ensuring consistency in the resolution - full 4K with no compromises - and in the process, players get the most pristine console edition of Destiny 2 available.
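A quick back-of-the-envelope check on that "4x increase to resolution with some overhead left over" claim, using the commonly quoted peak GPU figures; real frame cost doesn't scale purely with pixel count or flops, so treat it as a sketch, not a measurement.

Code:
# Back-of-envelope check: quoted peak figures only, not measured throughput.
xb1_tflops, x1x_tflops = 1.31, 6.0
base_pixels = 1920 * 1080            # Destiny 2 on base Xbox One
x1x_pixels  = 3840 * 2160            # native UHD on Xbox One X

print("pixel increase  :", x1x_pixels / base_pixels)            # 4.0x
print("compute increase:", round(x1x_tflops / xb1_tflops, 2))   # ~4.58x
# If frame cost scaled roughly with resolution, that gap above 4x is the
# "overhead left over" the article refers to.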
 
I wish Bungie had provided a checkerboard 4K option with highest(-ish) PC settings on Xbox One X (16x AF, highest shadow quality, etc.).
They likely considered it and felt it looked better this way around.
 
Player count does affect the GPU because players can use super or class abilities that can produce a lot of alpha and particle effects on screen, far more so than in the campaign where things are far more controlled. Public areas are way more unpredictable.

You can't just play the campaign and think that there's extra headroom there, that's not where the engine is stressed the most.

Bungie is probably just going for a more stable experience on the premium consoles, like they usually do. That's better than other devs pushing extra effects and/or resolution at the cost of performance, as we've been seeing in quite a few cases recently IMO, even if it only affects ~20% of the game. I'd take stable performance over slightly better effects any day, especially for things like AO, AF, shadow quality, alpha or particle effects, or even LOD. I'm not saying lower these settings significantly, but the difference between high and max, for example, is negligible to me.
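A rough illustration of why stacked supers and explosions are such a GPU worst case compared to the controlled campaign sections: blended effects can't rely on depth testing to reject hidden work, so cost scales with how many transparent layers overlap. All numbers below are invented for the example, not measured from Destiny 2.

Code:
# Illustrative fill-rate arithmetic only; every figure here is made up.
pixels_per_frame = 3840 * 2160
effect_coverage  = 0.30          # fraction of the screen one super/explosion covers
overdraw_layers  = 8             # overlapping transparent quads in those pixels

shaded_samples = pixels_per_frame * effect_coverage * overdraw_layers
print(f"extra transparent samples shaded: {shaded_samples / 1e6:.1f} M per frame")
# Opaque geometry shades each pixel roughly once thanks to depth testing;
# blended particles shade every layer, so cost scales with overdraw, not screen size alone.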
 
The alpha resolution caused visual problems on the 4Pro, from what I remember of their 4Pro analysis; I think the checkerboarding made it show up even more. How is it on the 1X?

Has the resolution been upped, and how does it compare non-checkerboarded? Considering this was something I felt they highlighted, I'm surprised they never analyzed it on the 1X.

For better or for worse, if it's not scaling down resolution and has a locked fps, it probably means that under general circumstances it's leaving a lot of GPU power on the table. Hence why I like dynamic resolution.
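For reference, DF describes the scaler on the base consoles and Pro as operating across the horizontal axis. Bungie hasn't detailed its heuristic, but a frame-time-driven controller along these lines is the usual shape of such systems; the thresholds, bounds, and step size below are invented purely for illustration.

Code:
# Minimal sketch of a horizontal-axis dynamic resolution controller.
# All constants are invented for illustration; Bungie's real heuristic is not public.

TARGET_MS = 33.3            # 30 fps frame budget
MIN_W, MAX_W = 2880, 3840   # horizontal render width bounds (output height stays 2160)
STEP = 64                   # resize granularity in pixels

def next_width(current_w: int, gpu_ms: float) -> int:
    """Pick the render width for the next frame from the last GPU frame time."""
    if gpu_ms > TARGET_MS * 0.95:        # close to blowing the budget: back off
        return max(MIN_W, current_w - STEP)
    if gpu_ms < TARGET_MS * 0.80:        # comfortable headroom: creep back up
        return min(MAX_W, current_w + STEP)
    return current_w                     # otherwise hold steady

# Toy run: a GPU-heavy spike followed by a quiet stretch.
w = MAX_W
for ms in [30.0, 34.0, 35.0, 33.0, 28.0, 25.0, 24.0, 24.0]:
    w = next_width(w, ms)
    print(f"gpu {ms:4.1f} ms -> render {w}x2160")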
 
DigitalFoundry has a video on "How Does Xbox 360 Backwards Compatibility on Xbox One Actually Work?". No article yet.

Rich spoke to Microsoft about how Xbox 360 back-compat and XBox One X enhanced 360 support actually work. We'll have more on this soon, but in the meantime, here are the top-line facts. Emulation, you say? It's actually a bit more complex than that...

Article is now up -- http://www.eurogamer.net/articles/d...x-one-x-back-compat-how-does-it-actually-work
 

Again, those are predominantly GPU-related effects, i.e. the CPU is irrelevant to increasing the graphics effects I mentioned: AA, AO, and AF. Only shadows would see a slight impact on performance with more players. BTW, multiplayer doesn't raise the player limit much beyond single player: you get ~2x the players, but greatly reduced scene complexity (a limited arena versus the open world).

You also don't have AI in competitive multiplayer (the only mode with more players), which reduces both the CPU and GPU load.

So, yes, I still fail to see why AA/AO and especially AF are unchanged on the XBO-X. Increased player counts would have absolutely no effect on AF or AO and only a minimal impact on AA, yet they remain unchanged.

I'd be interested to see them get multiple players and test out a public event in the open world. But that, of course, introduces variation in testing, unfortunately.

Regards,
SB
 
Video for Ghost Recon Wildlands, One X.

Summary of enhancements:
  • Higher resolution (1800p)
  • Higher quality assets
  • Higher quality texture filtering
  • Higher quality anti-aliasing
  • Higher geometry detail
  • Further LOD, fewer pop-in transitions
 
Destiny 2: they have tested all the big multiplayer games in MP mode before, so the randomness of situations is not the reason.

VG Tech does this regularly (and fairly) on MP games. With a bit of work, it's possible.
 
Destiny is pretty stable, to be honest. If you're looking to split hairs over console wars, I guess it's worth doing, but the service DF provides to consumers is telling you whether the game is going to be a great experience for you and helping with your buying decision if you need it.

That being said, Destiny is pretty stable, they don't over-budget, and so the game runs very well on all their platforms. Only a handful of times have I seen some dipping on the OG console, and since moving to the 1X I don't see any issues at all.
 
Compared to the Pro, the X1X is delivering well beyond the flop difference on the GPU.

Bandwidth bottleneck on the Pro... the same thing happened between the PS4/XB1, but to a lesser extent (obviously). In that specific game you had 1080p vs 900p plus much better performance on PS4, which is greater than the GPU difference.
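The rough numbers behind "well beyond the flop difference" (peak figures, not benchmarks): native 2160p means shading at least twice the samples per frame that the Pro's checkerboard does, on only ~1.43x the compute, which is why a non-ALU limit such as bandwidth is the usual suspect.

Code:
# Peak figures only; checkerboarding shades ~half the 2160p samples per frame,
# before the Pro's horizontal dynamic scaling drops it further.
pro_tflops, x1x_tflops = 4.2, 6.0
native_4k = 3840 * 2160
cbr_per_frame = native_4k / 2

print("flop ratio  :", round(x1x_tflops / pro_tflops, 2))   # ~1.43x
print("pixel ratio :", native_4k / cbr_per_frame)           # >= 2.0x shaded samples per frame
# Shading >= 2x the samples on ~1.43x the compute suggests the Pro is limited by
# something other than raw ALU throughput -- memory bandwidth being the usual suspect.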
 