Digital Foundry Article Technical Discussion [2018]

Status
Not open for further replies.
I haven't found a DF article or video on this yet, but Panic Button are like what Rare was in the SNES and N64 days, wringing out the console like no one else has.

I'm almost certain that they downgraded some graphical features to improve the resolution. So no magic here.

One obvious downgrade to me is the volumetric lighting. But overall, the game is now much more balanced. They made the right compromises.
 
That might be a necessity, though. The Switch has higher fillrate but is bandwidth-limited; the 360 has lower fillrate but higher bandwidth. So the Switch might have to drop bandwidth-sapping effects like smoke and other volumetrics, while still having the fillrate available to hit those higher resolutions.
 
Article to go along with the video: https://www.eurogamer.net/articles/digitalfoundry-2018-crysis-trilogy-backwards-compatibility-report

The Crysis Trilogy on Xbox One back-compat offers big performance boosts
But is it worth replaying?

Microsoft surprised us last week with the backwards-compatible re-release of the Crysis trilogy for Xbox One and Xbox One X. Once again - alas - there's no sign of X-enhanced support for these titles, but what we do get is one of the most dramatic performance upgrades yet, and a chance to revisit a fascinating period of Xbox 360 history.

We all know of the original Crysis' legendary status amongst PC users, and the fact that even today, the game can still bring the most powerful CPU hardware to its knees. But the wider point here is that developer Crytek were pushing the envelope in rendering and simulation in ways that no other game would dare attempt at the time. Suffice to say, when Crytek announced a multi-platform future for the franchise with Crysis 2, there was doubt from all sides. Would PC users be let down with a console port? And did Xbox 360 and PlayStation 3 have the hardware chops to run CryEngine 3 - even on lower than low settings?

When it did eventually launch in 2011, Crysis 2 - the first of the trilogy to hit Xbox 360 - was definitely a mixed bag. On the one hand, the CryEngine experience was there and the game looked quite unlike anything else on the platform. Aside from various bugs and some obvious, distracting pop-in, it was a stunning visual achievement. On the other, performance was poor, with frame-rates tanking into the teens. And this is where back-compat truly makes a difference. At worst, the standard Xbox One dips to the mid-20s, while Xbox One X mostly hits the 31fps performance target - with only minor drops.

 
Wonder why they chose 31fps or is it akin to the frame-pacing issue in the Halo games?
I think that's been a thing on CryEngine games on console for a while. The 30fps cap always produces 31. I'm curious if these games would feel better with FreeSync.
 
I don't know why for sure, but if I had to guess, they limit the frame-rate in a way that's just coarse enough to lock to 31 instead of 30. If you have vsync on at 60Hz and only present every second refresh, you get a solid 30fps, assuming you aren't otherwise limited. But if they're limiting to 33ms per frame instead of two refreshes, you get a different result, because 30fps is actually 33.33... (repeating) ms per frame. So it's probably the math that's the problem.

Also, 31fps is only a problem because 60Hz displays cannot evenly display it. With 31 frames spread over 60 refreshes, 29 frames are held for two refreshes (~33.3ms) but 2 frames are held for only one (~16.7ms), so roughly 6% of the frames rendered each second have half the persistence of the rest. If we had 31Hz (or 62Hz, 93Hz, or 124Hz) displays it wouldn't be a problem. Actually, it might still be a problem, because the Xbox One probably doesn't support a locked 31Hz refresh rate.
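That guess can be sketched as a toy simulation (Python, purely illustrative — this is not CryEngine's actual limiter): pace frames at a truncated 33ms and count how many completions land in each one-second window, versus pacing at exactly two 60Hz refreshes.

```python
# Toy frame-limiter simulation (an assumption, not CryEngine's real code):
# pacing at a truncated 33 ms drifts against the 60 Hz refresh grid, so some
# one-second windows contain 31 frame completions instead of 30.
REFRESH_MS = 1000.0 / 60.0  # one 60 Hz refresh, ~16.67 ms

def frames_per_second(frame_time_ms, seconds=10):
    """Count frame completions inside each 1-second window."""
    total = int(seconds * 1000 / frame_time_ms) + 1
    times = [k * frame_time_ms for k in range(total)]
    return [sum(1 for t in times if s * 1000 <= t < (s + 1) * 1000)
            for s in range(seconds)]

print(frames_per_second(2 * REFRESH_MS))  # exact 2-refresh pacing: 30 each second
print(frames_per_second(33.0))            # 33 ms pacing: a mix of 30s and 31s
```

Over ten seconds the 33ms limiter averages about 30.3fps, so a per-second counter reads 31 in some windows and 30 in others — one plausible mechanism for the "always 31" reading, not a confirmed one.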
 
By the way, how is 31fps possible if the Xbox One is forcing vsync? Wouldn't that automatically cap the game at 30fps if it isn't a dynamic approach?
 
Because the console outputs 60Hz regardless of how fast the game renders frames. The game is attempting to lock to 30 but renders 31, and the console pushes 60 updates to the display: 29 of those frames are shown for two refreshes each, and the remaining 2 are shown for one refresh each, which adds up to 60.

Actually, thinking about this a bit further, I think the Xbox One can be set to 30Hz; if not at 1080p, it should be possible in 4K. I'm pretty sure there are options for it, and even if there aren't, you could use a lower-bandwidth cable or an older 4K display that only supports 4K30. So I wonder if it's possible to even out the frame-rate by changing the output options on the Xbox.
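The vsync bookkeeping above can be written out explicitly. A small sketch (hypothetical helper, assuming each frame is held for a whole number of refreshes):

```python
# Distribute n rendered frames over r vsync'd refreshes using whole-refresh
# holds (a sketch of the bookkeeping, not anything the console actually runs).
def hold_pattern(frames, refreshes):
    base = refreshes // frames            # shortest hold, in refreshes
    longer = refreshes - base * frames    # frames that get one extra refresh
    shorter = frames - longer             # frames held for just 'base' refreshes
    return (shorter, base), (longer, base + 1)

# 31 fps on a 60 Hz output: 2 frames held 1 refresh, 29 frames held 2.
print(hold_pattern(31, 60))  # -> ((2, 1), (29, 2))
# 31 fps on a 30 Hz output: one frame per second is never shown at all.
print(hold_pattern(31, 30))  # -> ((1, 0), (30, 1))
```

So a 30Hz output mode wouldn't even out a 31fps cap — it would just drop the extra frame, trading uneven persistence for a once-a-second hitch.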
 
https://www.eurogamer.net/articles/digitalfoundry-2018-red-dead-redemption-2-face-off

Red Dead Redemption 2 looks and plays best on Xbox One X
Every console version tested.

PlayStation 4 - and latterly, PS4 Pro - have taken centre-stage in Red Dead Redemption 2's pre-release marketing campaign, meaning we have a pretty decent idea of how Rockstar's latest epic presents on Sony hardware. Today, we can discuss the Xbox One versions of the game, and the key takeaway is this: if you're looking for the very best RDR2 experience, Xbox One X is the go-to platform for this game. Rockstar's stunning technological achievement runs at native 4K on the X, and also delivers the smoothest performance. Bearing in mind just how far Rockstar is pushing current-gen hardware, that's a stunning achievement.


There's clear variation between the consoles then, but it is worth stressing the positive points - there's platform parity in terms of the vast majority of the rendering features, with no noticeable omissions on the base machines, which are divided only by resolution. While performance can be variable, I'd still say that the PS4 and Xbox One S still hand in more stable frame-rates than the original Red Dead Redemption, with fewer variations in the visual make-up than the last game too. But really, to see this phenomenal game at its absolute best, Xbox One X is the platform of choice - and by quite a considerable margin.
 
I don't recall which thread it was where this was being debated, about what counts as impressive performance for X1X. But here we go: more than 6x the performance, 864p vs 4K, on one of the most graphically intense games there is.

Now that the X1X has been out over a year, we're starting to see developers push out serious performance on the machine. It has never been just more TF. The profiling MS used to beef up the right areas of the GPU is paying off now. I do not believe a 1060, which most people compared the X1X to, will stand a chance. This is clearly operating very close to a 1070 performance level now.

Unfortunately we’ll never know since it’s not coming to PC.

MS has succeeded in reaching their goal of 4K where it counts. Looking forward to seeing how next gen plays out for them, using enhanced games to profile 4K for the next console.
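For what it's worth, the "more than 6x" figure checks out on raw pixel counts, assuming 864p means 1536x864 and the X runs native 3840x2160:

```python
# Pixel-count arithmetic behind the "more than 6x" claim (resolutions assumed
# from the thread: 1536x864 on base Xbox One, native 3840x2160 on One X).
base_pixels = 1536 * 864    # 1,327,104
one_x_pixels = 3840 * 2160  # 8,294,400
print(one_x_pixels / base_pixels)  # 6.25
```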
 
My biggest concern right now is that people are preferring the PS4 1080p over the CB 4K of the Pro. John indicates that the checkerboarding causes so much artifacting that it ends up resolving detail worse than 1080p.

Not good. I was really hoping for there to be a standard reconstruction technique that would be super good and carry over to next gen. Seems like each CB implementation will be highly dependent on each title.

Edit: also why I am all in on neural network upscaling vs algorithmic.
 

They could have made a better choice: 1440p plus temporal reconstruction, like Spider-Man or Ratchet and Clank, is a better solution, and we would maybe have a better frame-rate... It looks like AC Odyssey before the patch on PS4 Pro. I will wait for a patch, or I will wait for PS5 and a PS5 patch or a PS5 remaster.
 
I don't think it's ever that simple. Not every engine renders the image in the same order or along the same path. I'm sure if their engine allowed for it, they would do a better reconstruction.
 

I'm pretty sure R* could have gone with a native resolution of 1440p, rather than that bad reconstruction job trying to achieve 4K. Because right now, the Pro version looks quite bad stacked up against the standard PS4.
 
Agreed. I would rather have them do 1440p and drop checkerboarding.
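The 1440p-versus-checkerboard trade-off being argued here is close on raw shading work, assuming a typical checkerboard setup that shades half the 4K grid each frame (implementations vary):

```python
# Shaded-sample budgets (rough sketch; real CBR implementations differ).
cb_4k = 3840 * 2160 // 2     # checkerboard 4K: half the grid shaded per frame
native_1440p = 2560 * 1440   # every pixel fully shaded
print(cb_4k, native_1440p, cb_4k / native_1440p)  # 4147200 3686400 1.125
```

Checkerboard 4K shades only about 12% more samples than native 1440p, so once reconstruction artifacts enter the picture it's plausible for native 1440p (or even 1080p, as John suggests) to resolve detail better.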
 
All I want is a 60fps performance mode on One X, but I'm happy the 4k performance is pretty much locked. Considering how physics driven their animation system is, I realize this is probably not a realistic want.
 