Digital Foundry Article Technical Discussion [2021]

Because the medium preset usually looks and performs worse than "optimized" or "console" settings. I can't recall a scenario where the medium preset was ever a smart settings choice. When we look at the DF, HUB, GN, etc. settings breakdowns to determine smart visual and performance compromises, the result is a far cry from the developer's included medium preset and usually very close to console settings.

Maybe now, but a bit into the generation it's usually medium/low when compared to PC settings. Even now, some settings already go down to medium or lower. Watch Dogs, for example, is quite comparable to medium.
 
The resolution is locked, that is a fact.
As per the article and video, aside from the back of the Mendoza level where the flowers are, the Xbox Series consoles are locked, and they had difficulty finding anything notable outside of that particular area. Think about the time played in a game, say 20 hours: how long will you spend in the flowers section? 5% of 20 hours is 1 hour. You won't spend an hour in the flowers, so you'd be lucky to spend more than 10-15 minutes there. You're looking at something closer to 98% of the time with frame rates at 60fps. That's how I generated the numbers (the arithmetic is sketched below).

As for the paper difference between the machines, no, at least not in the sense that you ask. Ever since PS5 went with a fixed power draw shared between CPU and GPU, with dynamic clocks, you'd have to look over the range of values the PS5 can perform at versus the static level of the XSX. So my answer is, I don't know. Hitman is a CPU-intensive title, so there should be some considerations there. I don't think there will ever be an 'ideal paper spec' title for PS5; it's a balancing act between CPU and GPU performance. It would be inappropriate to assume that the CPU could never be a bottleneck and that the system was designed to ensure the only bottleneck would be on the GPU. We only have those types of setups when we are benchmarking GPUs.
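As a quick sanity check of the play-time arithmetic in the quote above (a rough sketch; the 20-hour playthrough and the 10-15 minutes in the flower field are the poster's assumptions, not measured figures):

```python
# Rough check of the play-time estimate above; all inputs are assumptions from the post.
total_play_time_min = 20 * 60       # assumed 20-hour playthrough
time_in_flowers_min = 15            # assumed upper bound for time spent in the drop-prone area

share_below_60fps = time_in_flowers_min / total_play_time_min
print(f"Time potentially below 60fps: {share_below_60fps:.1%}")      # ~1.2%
print(f"Time at a locked 60fps:       {1 - share_below_60fps:.1%}")  # ~98.8%, roughly the quoted 98%
```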
So the figures were made up. I only just watched the video; the implication is that the PS5 holds 100%. Not only that, they noted it was not just when in the flowers; I think alpha effects were mentioned too. There are some really big drops as well, and you have to wonder what would happen with the resolutions reversed.

Regarding expectations, it was a simple question of TF vs. TF; I appreciate it's not that simple.

IMHO it seems a shame they couldn't implement dynamic resolution across the board; it seems both consoles would have benefited.
 
And Valhalla is mixed settings, with foliage higher than ultra on PS5, so I think adding console settings to the PC version would be beneficial, and the argument that in 2025 or 2026 it will be medium or lower doesn't make much sense to me ;d
 
On the Jaguar CPU, maybe in some scenes, but not on the Zen 2 CPUs. The game runs mostly at 60fps on the Pro at 1080p, and this is a PSVR-ready game. As for the severe drops seen on the XSX when there are plenty of alphas, I find them very surprising, and I am surprised DF didn't have any problem with them. This is an XB1 game and the XSX should have no trouble running this part at a locked 60fps. Those drops clearly show some kind of hardware bottleneck on the XSX when there are plenty of alpha effects (and not even that many, it's still an XB1 game). And we already saw similar problems with alpha effects in AC Valhalla and COD (on XSX the smoke coming from the gun is significantly reduced, almost nonexistent, compared to the PS5 version).
 
Alpha effects were always a problem on consoles; the PS5 just has the performance edge here because of the lower resolution. But maybe they are also more problematic with DirectX in the background. In AC Valhalla it seems more likely to be a bug: AC Valhalla has dynamic resolution scaling and should have scaled down in those situations, which also doesn't happen, so it looks more bug-related.
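For context, a frame-time-driven dynamic resolution scaler typically works along these lines (a minimal sketch of the general technique, not AC Valhalla's actual logic; the target frame time, thresholds and step sizes are invented for illustration):

```python
# Minimal sketch of a dynamic resolution controller driven by GPU frame time.
# Not Ubisoft's implementation; all numbers are illustrative only.

TARGET_FRAME_MS = 16.6              # 60fps budget
MIN_SCALE, MAX_SCALE = 0.6, 1.0     # allowed render-resolution scale range

def update_resolution_scale(scale: float, last_gpu_frame_ms: float) -> float:
    """Nudge the render-resolution scale based on how the last frame compared to budget."""
    if last_gpu_frame_ms > TARGET_FRAME_MS * 1.05:     # over budget, e.g. heavy alpha effects
        scale -= 0.05                                   # drop resolution to recover frame rate
    elif last_gpu_frame_ms < TARGET_FRAME_MS * 0.90:    # comfortably under budget
        scale += 0.02                                   # claw resolution back slowly
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# The bug described above would amount to this step never kicking in (or its
# output being ignored) during the alpha-heavy scenes, so resolution stays high
# and the frame rate drops instead.
```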

For the PS5 I would rule out a power limitation, because the game already runs at >40fps on the PS4 Pro when unlocked, so the Zen 2 CPU should easily handle 60fps. But maybe the next-gen versions also have other effects that hit the CPU.
If the CPU were power-hungry it would take away power budget from the GPU on the PS5, but as I already wrote, I doubt that.
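To illustrate the shared-budget idea (a toy model only: the wattage figures and the linear clock/power relationship are invented for illustration and are not PS5 specifications):

```python
# Toy model of a fixed, shared CPU+GPU power budget with a variable GPU clock.
# All wattage numbers are invented for illustration.

TOTAL_BUDGET_W = 200.0    # hypothetical combined CPU+GPU budget
GPU_MAX_CLOCK_GHZ = 2.23  # PS5's advertised peak GPU clock
GPU_MAX_POWER_W = 160.0   # hypothetical GPU draw at that peak clock

def gpu_clock_for_cpu_draw(cpu_power_w: float) -> float:
    """Whatever the CPU draws comes out of the GPU's share of the fixed budget."""
    gpu_power = min(GPU_MAX_POWER_W, TOTAL_BUDGET_W - cpu_power_w)
    # crude assumption: GPU clock scales linearly with the power left for it
    return GPU_MAX_CLOCK_GHZ * (gpu_power / GPU_MAX_POWER_W)

for cpu_w in (30, 50, 70):
    print(f"CPU at {cpu_w} W -> GPU clock ~{gpu_clock_for_cpu_draw(cpu_w):.2f} GHz")
```

In reality the clock/power curve is strongly non-linear, so a small clock drop frees up disproportionately more power than this linear sketch suggests, but it captures the trade-off being described.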

Lower shadow quality plus lower resolution more or less points to less available memory, which might point to the use of the BC mode. Since the "old" games (Episodes 1 & 2) are also playable through Hitman 3, it might be the only way to get it running on PS5 without a total conversion to a native PS5 game.
 
The game is a native PS5 app, not BC mode, hence the VR mode only being playable on PS4 and Pro. Maybe the game just targets a lower resolution on PS5 because at 4K the PS5 would be dropping the framerate too often? That sounds reasonable. It is not exactly an arbitrary decision to edit the config file dictating the game's resolution.
 
An 1800p cap on PS5 doesn't make sense if it's almost 60fps at 1080p on a Pro.

Then dropping from 4K in Hitman 2 to 1440p on Xbox One X is also odd.
 
And I think the PS4 Pro has the same settings as the One X, with even higher aniso? Quite strange.
 
Not sure why they didn’t just set both to dynamic res, to iron out the frame rate issues on X while also allowing the PS5 to nudge up in parts where it probably could run at 4K easily. Then again I’ll probably never actually see the difference between 4K and 1800p on my TV so who cares. Twitter is certainly having a field day with this though.
 

The PS4 Pro has a 1080p@60fps mode that makes it essentially a match for the Xbox Series S, with the Pro having higher shadow quality.

I can't understand why the Xbox One X is the same as the Pro in resolution mode. Surely there should be room for a higher resolution?

Mathematically the XSX should run at a higher resolution than the PS5; it's just that every single other game tells a different story.

It's definitely an odd situation for most of the machines.
 
Correct me if I'm wrong, but the 4 Pro uses interpolation for its 60fps mode, which as I understand it is the normal sort of thing to support PSVR. It's not the same as the true 60fps update that the XSS is running.

I can't see any of the usual frame interpolation artifacts in DF's video... I need to take another look, and at the original comments from IOI.

Edit: apparently it is frame interpolation (on the 4 Pro). Really nice implementation of it; not sure I could tell based on the Digital Foundry video.

I can't make sense of it: the PS4 Pro has single-frame drops. If it were interpolated, you'd think it would have to drop by 2 frames per second; 59 frames (or 53) cannot be divided by 2.

[Attached image: Screenshot_20210122-135707_YouTube.jpg]
 
IIRC frame interpolation is additive.
i.e. it's a regular 30fps and frame interpolation adds frames in between. It doesn't need the next frame to interpolate; the interpolation data is generated largely from the last frame's data.

It's probably better to look at frame rate as a function of the time it takes to produce a new frame, i.e. 16.6ms is 60fps and 16.9ms is 59fps, for instance. So it just missed its update window by about 0.3ms after the frame interpolation. To lose 2 frames you'd need to be at 17.2ms.

So basically there was a standard 33.33ms update with a frame injected in between to make it appear like a 16.66ms update. If the PS4 Pro misses by a bit, say 33.6ms from the previous update to the next, it will appear as 59fps. To lose 2 frames, the next update would have had to come at 33.9ms, 0.6ms slower than the standard 33.33ms update.
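For reference, the frame-time figures above map onto frame rates by simple reciprocal:

```python
# Displayed frame rate as a function of the effective per-frame time
# (the time between displayed frames, after interpolation).
def fps_from_frame_time(frame_ms: float) -> float:
    return 1000.0 / frame_ms

for ms in (16.6, 16.9, 17.2):
    print(f"{ms} ms/frame -> ~{fps_from_frame_time(ms):.0f} fps")
# 16.6 ms -> ~60 fps, 16.9 ms -> ~59 fps, 17.2 ms -> ~58 fps
```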
 
IIRC frame interpolation is additive.
i.e. it's a regular 30fps and frame interpolation adds frames in between. It doesn't need the next frame to interpolate; the interpolation data is generated largely from the last frame's data. So you only lose the next update: the typical 30fps cycle is 33.3ms, and interpolation comes in to add a frame in between to make the frame update look like a 16.6ms one.
As long as the time from the last actual update to the next actual update doesn't exceed 50ms (33.3ms + 16.6ms), it shouldn't be counted as 2 skipped frames, just 1, if I understand correctly.

From what I can tell, the Digital Foundry framerate analysis shows updates in one-second intervals. In the video we can see a single dropped frame over one second (59fps) and an instance of seven dropped frames (53fps).

In the first instance, we can't assume the 4 Pro is rendering 30 frames with 29 interpolated ones, as it's illogical for an interpolated frame to be the one dropped (its cost must be very low). It must be rendered frame(s) that dropped, so it'd have to be 29 rendered frames with 30 interpolated frames.

It just seems odd to me that we'd have more interpolated frames than rendered ones, as you'd expect one interpolated frame for every rendered frame.

I guess it could be "borrowing" data from another of the second-long measurement intervals. I can't understand why else it'd happen.
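That "borrowing" across interval boundaries is plausible; a toy count shows how a one-second window can contain more interpolated frames than rendered ones without anything being dropped (purely illustrative: it assumes a steady 33.4ms real update, just under 30fps, with one interpolated frame exactly midway between real updates):

```python
# Count how many rendered vs. interpolated frames land inside a one-second
# measurement window when interpolation doubles a ~30fps render cadence.
# Illustrative assumption: 33.4 ms per real update, interpolated frame midway.

REAL_UPDATE_MS = 33.4
WINDOW_MS = 1000.0

rendered = [k * REAL_UPDATE_MS for k in range(1, 60)]
interpolated = [t - REAL_UPDATE_MS / 2 for t in rendered]

rendered_in_window = sum(1 for t in rendered if t <= WINDOW_MS)
interpolated_in_window = sum(1 for t in interpolated if t <= WINDOW_MS)

print(rendered_in_window, interpolated_in_window)  # 29 rendered, 30 interpolated -> 59 displayed
```

Where the one-second window happens to cut the cadence determines whether the "extra" frame counted is a rendered or an interpolated one.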
 
I updated my post because it was very wrong; I think it's better to reason in terms of frame update times rather than frame counts.
 