Digital Foundry Article Technical Discussion [2017]

Status
Not open for further replies.
If you know nothing about the context of the numbers, I’d hardly call that the whole picture.

Not saying it’s not a relevant data point. But over-attributing without knowing how many frames dip that low, for how long, or whether that number can be reproduced reliably is not really knowing the whole story.

Above 1440p for all of 30 seconds to 2 minutes in a game that plays for 200hrs is not a relevant data point. I don’t even know if it amounts to 0.000025% of game time. That puts it probably near 6 deviations out from the average resolution.
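For what it’s worth, the time fraction being argued about is easy to back-of-the-envelope (using the 30 s to 2 min and 200 hr figures from the post above; the exact percentage is a guess either way, but it is tiny regardless):

```python
# Rough check: what fraction of a 200-hour playthrough is 30 s to 2 min?
total_s = 200 * 3600          # 200 hours in seconds -> 720,000 s

low = 30 / total_s            # a 30-second window
high = 120 / total_s          # a 2-minute window

print(f"{low:.4%} to {high:.4%} of game time")
# somewhere around 0.004% to 0.017% -- negligible either way
```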

Indeed. If the game manages to render between 4K and 1800p most of the time, I simply don't see the point of being a stickler for the statistical oddity of spending maybe a few frames at a much lower pixel count in the grand scheme of things. I'd wager the game actually spends more time rendering in between 4K and 1800p than it does at 1800p. I'd say that's a massive leap compared to the base X1 version at 900p. As it is, the 1X version doesn't drop frames in the 4K mode in intensive areas.

The presence of the 60fps mode is great for future hardware, as right off the bat you can get 4K/60fps TW3.
 
Above 1440p for all of 30 seconds to 2 minutes in a game that plays for 200hrs is not a relevant data point. I don’t even know if it amounts to 0.000025% of game time. That puts it probably near 6 deviations out from the average resolution.

Above 25fps for all of 30 seconds to 2 minutes in a game that plays for 200hrs is not a relevant data point. I don’t even know if it amounts to 0.000025% of game time. That puts it probably near 6 deviations out from the average fps.

While we all may have our preference, a drop in dynamic resolution is effectively the same technical problem as a drop in fps.
 
Above 25fps for all of 30 seconds to 2 minutes in a game that plays for 200hrs is not a relevant data point. I don’t even know if it amounts to 0.000025% of game time. That puts it probably near 6 deviations out from the average fps.

While we all may have our preference, a drop in dynamic resolution is effectively the same technical problem as a drop in fps.
Symptom of the same problem, not effectively the same problem; dynamic resolution is an algorithm of best fit. If load continues to increase, resolution will need to drop to compensate. It’s still running at maximum load. Locked 4K CBR is not a function of best fit, meaning there are likely times it’s underutilized and times when it’s overutilized.

Consider the scene: it’s still running at 30fps, and the quality of the image will still be good at 30fps. Motion image quality improves at 60fps; the higher the refresh rate, the clearer the image.

So 4K CBR running a higher resolution in a still frame, where we are counting pixels, will look better. In motion is another story: playing a game with super heavy loads, chugging at 20fps-30fps locked 4K, is not going to look or feel better than 30fps 4K with some blips to 1440p, or even sustained 1440p.

You’d be lucky to even notice the drop in resolution during a heavily loaded scene when chaos is going on.

Imo they should have dynamic 4K CBR on 4Pro. Not sure why they shipped it locked, or why there isn’t dynamic resolution on every title. It just makes sense imo.
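The "algorithm of best fit" framing above can be sketched as a per-frame feedback loop. This is a hypothetical controller, not any engine's actual implementation; the frame-time budget, scale bounds, and square-root correction are all assumptions for illustration:

```python
# Hypothetical dynamic-resolution controller: each frame, nudge the
# render scale so GPU frame time tracks the 30fps budget.

TARGET_MS = 33.3                  # 30fps frame budget
MIN_SCALE, MAX_SCALE = 0.66, 1.0  # e.g. ~1440p floor up to the 4K target

def next_scale(scale, gpu_ms):
    """Return the render scale for the next frame.

    Shading cost grows roughly with scale**2, so the correction uses
    the square root of the load ratio; the result is clamped so the
    resolution never falls below the floor or exceeds the target.
    """
    load = gpu_ms / TARGET_MS
    scale /= load ** 0.5          # heavier load -> lower resolution
    return max(MIN_SCALE, min(MAX_SCALE, scale))
```

A locked output (whether native or CBR) skips this loop entirely, which is exactly the over/under-utilization trade-off described above: headroom is wasted in light scenes and exceeded in heavy ones.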
 
If load continues to increase, resolution will need to drop to compensate.

If load continues to increase, fps will need to drop to compensate.

I think you’re having a subjective discussion on which choice is preferred.
 
If load continues to increase, fps will need to drop to compensate.

I think you’re having a subjective discussion on which choice is preferred.
It’s not a perfect algorithm, granted. Dynamic res is a load management tool. CBR is a method of increasing rendering resolution for fewer resources.

I wouldn’t say they are the same. And it’s only preference if you were forced to select one or the other; I’m pretty sure they can both be enabled at the same time if designed to.
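The point that the two techniques can stack rather than compete is easy to see with a toy cost model: checkerboarding halves the shaded pixels at whatever target resolution dynamic scaling picks. All numbers below are illustrative, not measured from any shipped game:

```python
def shaded_pixels(width, height, scale=1.0, checkerboard=False):
    """Pixels actually shaded per frame when dynamic resolution
    scaling and checkerboard rendering are combined."""
    w, h = int(width * scale), int(height * scale)
    px = w * h
    return px // 2 if checkerboard else px

# Illustrative 4K numbers:
native  = shaded_pixels(3840, 2160)                      # 8,294,400
cbr     = shaded_pixels(3840, 2160, checkerboard=True)   # 4,147,200
cbr_dyn = shaded_pixels(3840, 2160, 0.8, True)           # 2,654,208
```

Under heavy load, dropping the scale on top of CBR (the `cbr_dyn` case) sheds another ~36% of the shading work, which is the combined approach being argued for.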
 
Above 25fps for all of 30 seconds to 2 minutes in a game that plays for 200hrs is not a relevant data point. I don’t even know if it amounts to 0.000025% of game time. That puts it probably near 6 deviations out from the average fps.

While we all may have our preference, a drop in dynamic resolution is effectively the same technical problem as a drop in fps.

Dropping resolution is a little different from dropping frames. Dropping resolution is designed, usually and hopefully, to go unnoticed. The reason it exists is that dropping frames rarely goes unnoticed.
 
Have we seen any cases of both being utilised simultaneously yet?
edit: list time
Battlefield 1
Call of Duty Black Ops III
Call of Duty Infinite Warfare
Call of Duty Modern Warfare Remastered
Call of Duty WWII ** (temporal reconstruction, not the same technique as CBR, but dynamic reconstruction nonetheless)
Destiny 2
Deus Ex: Mankind Divided
Final Fantasy XV
 
Dropping resolution is a little different from dropping frames. Dropping resolution is designed, usually and hopefully, to go unnoticed. The reason it exists is that dropping frames rarely goes unnoticed.

My point is that they’re both a system to drop pixels when load gets greater than compute power. We may subjectively prefer one over the other but I thought we were having a technical discussion, not subjective.

I have my own opinion about which I prefer but I didn’t think it was relevant to the technical discussion.
 
My point is that they’re both a system to drop pixels when load gets greater than compute power. We may subjectively prefer one over the other but I thought we were having a technical discussion, not subjective.

I have my own opinion about which I prefer but I didn’t think it was relevant to the technical discussion.

But dynamic resolution is a technical solution to a problem. It also allows devs to push games harder because they don't need to leave as much headroom to maintain their target. Dropping frames is not a design feature. Dynamic resolution is.
 
My point is that they’re both a system to drop pixels when load gets greater than compute power. We may subjectively prefer one over the other but I thought we were having a technical discussion, not subjective.

I have my own opinion about which I prefer but I didn’t think it was relevant to the technical discussion.
once again, it doesn't need to be one over the other.

Looking at this case in isolation, and why I brought up my previous posts (namely because the narrative was console wars), the reality is 4Pro does a large percentage of this game in 4K CBR. But there are a few dips here and there (even if only for 1 whole section) that would benefit from dynamic scaling, just to get the player through that area without dropped frames.

It is to the benefit of PS4 Pro players that the frame rate stays at 30; the experience is still great. The pixel counting is purely console wars, and trying to claim that 4Pro players are getting a better experience at 20fps 4K CBR in the heaviest loads vs a dynamic-resolution 4K CBR where it will dip is purely politically driven.

Likely the only reason we don't see dynamic scaling with 4K CBR in this title is that there's only 1 zone where 4Pro has consistent issues, so it's likely not worth the effort, I think.

There seems to be a large misunderstanding here.
4K CBR is a method of producing 4K graphics with less power. If 4Pro could reach it natively in a majority of the game, they would have gone with a dynamic resolution. Since it can't, they went with CBR, a great substitute. If the game consistently fell below its frame rate target, we would have seen dynamic scaling included, as it would be beneficial throughout the game; but that wasn't the case except for 1 area (and 1 hr of a 100-200 hr playthrough doesn't account for more than 1% of game time).

But to say the game is better because of it (keeping locked 4K at reduced frames)... I mean, come on, it's clear Global is just reaching for something to hold over X1X.

Any player annoyed by the jarring frame rate would switch from 4K mode to performance mode, clear what they need to do, and revert back once it was smooth and fun to play. If this is what players do, you might as well have programmed it in for all the points in the game where the frame rate dipped.
 
My point is that they’re both a system to drop pixels when load gets greater than compute power. We may subjectively prefer one over the other but I thought we were having a technical discussion, not subjective.

I have my own opinion about which I prefer but I didn’t think it was relevant to the technical discussion.
Reduced framerate from target is a bug that has existed since the beginning of video games. Dynamic resolution is a recent implementation to address this bug. No developer purposefully implements dynamic framerate, unlike dynamic resolution. I don't think the two are really comparable.
 
Reduced framerate from target is a bug that has existed since the beginning of video games.

If it were truly a bug, developers would reduce the average load of a game to prevent it. Is the dynamic frame rate of Shadow of the Colossus on the PS2 a bug? No. It was a design choice.

Dropping frames is not a design feature. Dynamic resolution is.

I believe this is largely incorrect. Of course there's the odd point when 50 explosions happen, and yeah, unexpected things happen in games that devs can’t predict, so the frame rate drops. But the devs of The Witcher 3 made a conscious choice to keep a certain amount of fog at the expense of reducing frame rate, rather than reducing the fog (or resolution).

Since it’s a technical choice, the question then becomes why choose one path for the PS4 Pro and a different path for the X1X. Was it just time? Was it an artistic change of direction? We know both systems are capable of either solution.

I’m wondering if dynamic checkerboard looks like ass at lower resolutions making a drop in fps a subjectively better choice on the PS4 Pro.
 
But to say the game is better because of it (keeping locked 4K at reduced frames)... I mean, come on, it's clear Global is just reaching for something to hold over X1X.

I feel like you’re dipping into subjective console war territory. I’m trying to avoid that.

I will add that I love that the developers have (at least currently) implemented two solutions for the two platforms. It’s nice to see a developer not go for complete parity which gets old. I wish we had seen more of this last generation with the ESRAM/Cell division being so...weird. It would have been really cool to see cross platform games play to the strength of the system they were on rather than the homogenized experience we got.
 
I will add that I love that the developers have (at least currently) implemented two solutions for the two platforms. It’s nice to see a developer not go for complete parity which gets old. I wish we had seen more of this last generation with the ESRAM/Cell division being so...weird. It would have been really cool to see cross platform games play to the strength of the system they were on rather than the homogenized experience we got.

There are at least 3 different solutions; some developers have gone with dynamic CBR, and some have used it on both platforms.
 
There are at least 3 different solutions; some developers have gone with dynamic CBR, and some have used it on both platforms.

I was referring specifically to The Witcher 3 but great if there are other examples as well.

EDIT: Looking back I said “developers” but I meant the team at CDPR.
 
Reduced framerate from target is a bug that has existed since the beginning of video games. Dynamic resolution is a recent implementation to address this bug. No developer purposefully implements dynamic framerate, unlike dynamic resolution. I don't think the two are really comparable.
No.
 