Digital Foundry Article Technical Discussion [2021]

Is this fact or just an example? OOI, does that include how far below 60fps it drops, as well as the proportion of time spent below 60fps? Overall, is this the 'paper' difference between the machines?
The resolution is locked; that is fact.
As per the article and video, aside from the Mendoza level at the back where the flowers are, the Xbox Series consoles are locked, and they had difficulty finding anything notable outside of that particular area. If you think about the time played in a game, say 20 hours of play time, how long will you spend in the flowers section? 5% of 20 hrs is 1 hr. You won't spend an hour in the flowers, so you'd be lucky to spend more than 10-15 minutes there. You're looking at something closer to 98% of the time with frame rates at 60fps. That's how I generated the numbers.
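As a rough back-of-envelope sketch of where that ~98% comes from (the 20-hour playthrough and the 15 minutes spent in the flower field are my own assumptions, not measured data):

```python
# Back-of-envelope estimate of time spent at a locked 60fps.
# All inputs are assumptions, not measurements from the DF analysis.
total_play_time_min = 20 * 60          # assumed ~20-hour playthrough
time_in_flower_area_min = 15           # assumed worst case in the Mendoza flower field

share_below_60fps = time_in_flower_area_min / total_play_time_min
print(f"Time potentially below 60fps: {share_below_60fps:.1%}")    # ~1.2%
print(f"Time locked at 60fps:         {1 - share_below_60fps:.1%}")  # ~98.8%
```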

As for the paper difference between the machines, no, at least not in the sense that you ask. Since PS5 went with a fixed power draw shared between CPU and GPU and dynamic clocks, you'd have to look at the range of values PS5 can perform at versus the static clocks of XSX. So my answer is: I don't know. Hitman is a CPU-intensive title, so there should be some considerations there. I don't think there will ever be an 'ideal paper spec' title for PS5; it's a balancing act between CPU and GPU performance. It would be inappropriate to assume that the CPU could never be a bottleneck, or that the system was designed to ensure the only bottleneck would be on the GPU. We only have those types of setups when we are benchmarking GPUs.
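Purely as a toy illustration of why the "paper spec" becomes a range rather than a single number when CPU and GPU share one power budget (this is not how Sony's power management actually works; the numbers and the linear model are invented):

```python
# Toy model of a fixed power budget shared between CPU and GPU with variable clocks.
# NOT the PS5's actual behaviour; all figures below are made up for illustration.

TOTAL_POWER_W = 200          # invented fixed budget shared by CPU and GPU

def gpu_clock_ghz(cpu_load: float) -> float:
    """Assume (for illustration only) CPU power rises with load and the GPU clock
    scales linearly with whatever power is left over."""
    cpu_power = 50 + 50 * cpu_load          # invented: 50-100 W depending on CPU load
    gpu_power = TOTAL_POWER_W - cpu_power
    return 1.8 + 0.45 * (gpu_power / 150)   # invented: maps 0-150 W to 1.8-2.25 GHz

for load in (0.2, 0.6, 1.0):
    print(f"CPU load {load:.0%} -> GPU ~{gpu_clock_ghz(load):.2f} GHz")
```

In a model like this, a CPU-heavy title such as Hitman pulls the GPU clock down, which is the intuition behind saying there's no single "paper spec" scenario for PS5.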
 
It would be nice if PC games had a default preset of console settings.
I agree it would be fun, but the challenge there is the amount of work. As we've seen from DF videos, sometimes console settings are customized per level as well and not just global settings.
 
It's also not useful in any practical sense.
Why do you think it wouldn’t be useful? At the very least it provides a balanced mix of visuals/performance for users. Current presets following the typical low/med/high etc are terrible in that sense.
 
Why do you think it wouldn’t be useful? At the very least it provides a balanced mix of visuals/performance for users. Current presets following the typical low/med/high etc are terrible in that sense.
They may leverage customizations that are specific to the hardware available, tuned for those particular configurations.

But I get the point of it being useful at the same time for the reasons you cited.
 
They may leverage customizations that are specific to the hardware available, tuned for those particular configurations.

But I get the point of it being useful at the same time for the reasons you cited.
That doesn’t have to preclude a default preset. There is no reason the default preset can’t apply different settings depending on the level as in consoles. And WRT console specific optimizations, the developer can apply the equivalent or closest setting using whatever fallback/code path has been written for the PC version.
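As a purely hypothetical sketch of what that could look like (the setting names, values, and level names are invented, not any real engine's configuration): a "console" baseline that per-level overrides are layered on top of, mirroring how consoles reportedly tweak settings per map.

```python
# Hypothetical "console" default preset with per-level overrides.
# Names and values are invented for illustration only.

CONSOLE_PRESET = {
    "texture_quality": "high",
    "shadow_quality": "medium",
    "alpha_resolution": "half",
    "ssr_quality": "medium",
}

PER_LEVEL_OVERRIDES = {
    "dubai":   {"ssr_quality": "high"},      # invented example values
    "mendoza": {"shadow_quality": "low"},
}

def settings_for_level(level: str) -> dict:
    """Start from the console baseline, then apply any level-specific tweaks."""
    return {**CONSOLE_PRESET, **PER_LEVEL_OVERRIDES.get(level, {})}

print(settings_for_level("mendoza"))
```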
 
I don't see all that much reason to exactly match every single console setting 100%, including resolution etc. I think, as in Hitman 3, the settings DF arrives at are close enough; it's only interesting to match and gauge console performance at the exact level, which most PC gamers don't care about, I think. Talking about modern hardware going forward (GPU/console availability aside), people getting a 3060 Ti/6700 XT or higher are most likely going to play at higher settings anyway, not turning down visual fidelity to match a PS5.
 
It would be nice if PC games had a default preset of console settings.

Yes I agree here. The console settings represent the best bang for buck in the eyes of the developers so it'd be nice for PC gamers to have the same option, especially if you're sporting hardware at or below the console levels or enjoy very high frame rates.

HZD obviously did this which was great, but very few if any other games seem to. The Call of Duty example is a particularly strange one. There's clearly performance to be gained from the lower-resolution alpha effects or they wouldn't have been implemented on the consoles in the first place. So why not offer that on PC as well as a lower option than currently available? Obviously lots of PCs out there are less powerful than the PS5, so they could presumably benefit even more from such a setting.

My guess would be it was simply a trade-off in development time. On the PC it's easier to not spend the time on it and leave it to more powerful hardware to sort out, I guess.
 
Yes I agree here. The console settings represent the best bang for buck in the eyes of the developers so it'd be nice for PC gamers to have the same option, especially if you're sporting hardware at or below the console levels or enjoy very high frame rates.

HZD obviously did this which was great, but very few if any other games seem to. The Call of Duty example is a particularly strange one. There's clearly performance to be gained from the lower-resolution alpha effects or they wouldn't have been implemented on the consoles in the first place. So why not offer that on PC as well as a lower option than currently available? Obviously lots of PCs out there are less powerful than the PS5, so they could presumably benefit even more from such a setting.

My guess would be it was simply a trade-off in development time. On the PC it's easier to not spend the time on it and leave it to more powerful hardware to sort out, I guess.
At the very least there could be a default preset consisting of the settings Alex determined to be the most equivalent. Surely every developer can find the manpower/budget to do that.
 
At the very least there could be a default preset consisting of the settings Alex determined to be the most equivalent. Surely every developer can find the manpower/budget to do that.

I don’t see why “console settings” would be any more useful than “medium” is today. Console games are tuned for console hardware. There’s no guarantee that those settings are optimal for the average gaming PC.

The only benefit would be to make Alex’s job easier.
 
I don’t see why “console settings” would be any more useful than “medium” is today. Console games are tuned for console hardware. There’s no guarantee that those settings are optimal for the average gaming PC.

The only benefit would be to make Alex’s job easier.
Because the medium preset usually looks and performs worse than "optimized" or "console" settings. I don't think I can recall a scenario where the medium preset was ever a smart settings choice. When we look at DF, HUB, GN etc settings breakdowns to determine smart visual and performance compromises, it's a far cry from the developer's included medium preset and usually very close to console settings.
 
I don’t see why “console settings” would be any more useful than “medium” is today. Console games are tuned for console hardware. There’s no guarantee that those settings are optimal for the average gaming PC.

The only benefit would be to make Alex’s job easier.
I see the benefit: console settings aren't just suited to the hardware, they usually offer a better graphics/performance trade-off than, for example, setting everything to medium.
 
DF Written article to go with the video @ https://www.eurogamer.net/articles/digitalfoundry-2021-hitman-3-tech-review

Hitman 3 tech review: the Glacier Engine shines on next-gen consoles
But last-gen hardware isn't left behind.

Traditionally, the cross-gen period - the awkward transition from one set of consoles to the next - hasn't really worked out. There's the sense that the next-gen versions aren't necessarily everything they could be, while the software served up to owners of the older consoles often verged on the unplayable. But as we move from PS4 to PS5, from Xbox One to Xbox Series, it's clear that we're seeing something very different this time around. IO Interactive's Hitman 3 shows that cross-gen can actually work out just fine, and there's the feeling that prior generation consoles are being pushed closer to their ultimate limits, while simultaneously, massive gains are achieved when playing on PS5 and Xbox Series X.

Of course, Hitman 3 is a game with additional interest for us, because IO's excellent Glacier Engine has evolved, bringing impressive new features into the technology - while at the same time back-porting those innovations to the entirety of the Hitman trilogy. It's to IO's credit that with each Hitman release, and with each iteration of Glacier, existing content has been ported to the latest game, benefitting from the engine upgrades. It's doubly useful for Hitman 3 played on a next-gen console because the baseline improvements are so profound, it's effectively both a new game and a next-gen remaster of the older releases.

So how has Glacier moved on? The new maps are truly effective showcases for some of the new technology and the initial Dubai mission is a beautiful debut for a range of effects - foremost amongst them the arrival of a remarkable implementation of screen-space reflections. The environment is packed with reflective surfaces including metal and glass, presented in a brightly lit manner that would make imperfections obvious, while Hitman continues to look pristine. Of course, there are limits to SSR - off-screen detail can never be reflected - but Glacier has some interesting tricks up its sleeve for enhancing the effect. So, for example, the game's signature crowds are reflected too, but it seems that these are 2D-sprite like 'imposters' to save on processing resources, while still looking very effective.


...
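On the article's point about SSR never being able to reflect off-screen detail, here is a minimal toy sketch of why that limitation exists (plain Python over an invented 1D depth buffer, nothing to do with Glacier's actual renderer): the reflected ray is marched through the screen-space buffer, and the moment it steps outside the buffer there is simply no colour data left to sample, which is why fallbacks like the crowd imposters are needed.

```python
# Toy 1D screen-space reflection march. Illustrative only; values are invented.

def ssr_march(depth_buffer, start_x, ray_dx, start_depth, ray_dz, steps=64):
    """March a reflected ray in screen space; a hit requires the ray to pass
    behind the stored depth while still inside the buffer."""
    x, z = float(start_x), start_depth
    for _ in range(steps):
        x += ray_dx
        z += ray_dz
        ix = int(x)
        if ix < 0 or ix >= len(depth_buffer):
            return None                  # ray left the screen: nothing to reflect
        if z >= depth_buffer[ix]:
            return ix                    # intersection found: reuse that pixel's colour
    return None

depth = [5.0] * 100
depth[70] = 2.0                          # an on-screen object the ray can hit
print(ssr_march(depth, 40, +1.0, 1.0, 0.05))   # finds the hit at x == 70
print(ssr_march(depth, 40, -1.0, 1.0, 0.05))   # marches off the left edge -> None
```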
 
Because the medium preset usually looks and performs worse than "optimized" or "console" settings. I don't think I can recall a scenario where the medium preset was ever a smart settings choice. When we look at DF, HUB, GN etc settings breakdowns to determine smart visual and performance compromises, it's a far cry from the developer's included medium preset and usually very close to console settings.

If that holds true for midrange PC cards then I agree with you. But those sites are usually running beefy PC hardware.

I can believe medium looks worse but looks and performs worse? That doesn’t fly. The main problem with PC medium settings is they usually drop texture resolution for no good reason and that makes a big diff in IQ.
 
If that holds true for midrange PC cards then I agree with you. But those sites are usually running beefy PC hardware.

I can believe medium looks worse but looks and performs worse? That doesn’t fly. The main problem with PC medium settings is they usually drop texture resolution for no good reason and that makes a big diff in IQ.
HUB tests on a range of GPUs from both vendors when determining bang for buck settings. I'm not sure if DF and GN do the same but they usually end up with very similar settings in the end either way.
 
Because the medium preset usually looks and performs worse than "optimized" or "console" settings. I don't think I can recall a scenario where the medium preset was ever a smart settings choice. When we look at DF, HUB, GN etc settings breakdowns to determine smart visual and performance compromises, it's a far cry from the developer's included medium preset and usually very close to console settings.

Especially with regard to texture quality. There are many cards out there that have no problem running high or even ultra textures but struggle with other graphical settings on high. Presets, as most are designed now, are non-optimal for this reason alone.

Edit: ah point already made
 