Digital Foundry Article Technical Discussion [2022]

Status
Not open for further replies.
DF Article @ https://www.eurogamer.net/articles/digitalfoundry-2022-ghostwire-tokyo-df-tech-review

Ghostwire: Tokyo has up to 10 graphics modes - so which is best?
The Digital Foundry verdict on the PS5 version.

Ghostwire: Tokyo sees acclaimed Japanese developer Tango Gameworks striking out in new directions. For one, it's an actual next-gen/current-gen exclusive - it's only available on PlayStation 5 and PC, tapping into cutting-edge features like hardware-accelerated ray tracing. Secondly, the developer has shifted away from its internal STEM engine, based on idTech, instead favouring Unreal Engine 4. On balance, it's an excellent move - Epic's technology merged with this developer's unique vision delivers a beautiful game. In today's coverage, we'll be looking exclusively at PlayStation 5 before moving onto PC in a separate piece - and there's certainly much to cover. While there are six 'official' graphics modes to choose from, there are actually an unofficial ten in total.

First up, I wanted to share some impressions about the game overall. The shift to Unreal Engine 4 surprised me, but not as much as the core design itself which sees the surprisingly evocative and detailed world Tango Gameworks has created focused on a concept that is very much Far Cry-like in nature. Yes, there's a main mission path to follow, but it's also a game rich in side missions and other explorable elements, backed by an Ubi-style icon-packed map that overwhelms. It's not to my tastes but I'm aware that many love this style of experience.

...
 
These modes might show more promise if PS5 had VRR and they used dynamic res. Hopefully they'll be able to polish up the modes with a patch.

The thing is, people without VRR displays still have issues. I hope devs don't lean on VRR when Sony brings it out, as opposed to optimizing their games. Also, it may just be me, but the amount of modes frankly sucks. I wish, as Alex said, they'd focused on two modes and optimized those.
 
Rather than 10 modes, it would have been preferable to see something more PC like. Maybe 3 modes with an additional "custom" mode and then expose rendering features that the user can adjust to create a custom mode that they like.

10 modes attempts to do something like that, but without any choice in what gets enabled or disabled, or what level is chosen for any rendering effect, a user may still not find the mode they actually want.

IMO, 10 modes is worse than 3 or 4 modes with rendering features available to be adjusted under an advanced menu. People wanting a console experience can just stick to the 3 predefined modes while those seeking greater control can dive into advanced options.
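To illustrate the idea, here's a rough sketch of the "presets plus custom overrides" scheme (all names, values, and settings fields are invented for the example):

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class RenderSettings:
    resolution: str = "4K"      # internal render resolution
    fps_cap: int = 30           # target frame-rate cap
    ray_tracing: bool = False   # RT reflections on/off

# A handful of curated modes, as on console today.
PRESETS = {
    "Quality":     RenderSettings("4K", 30, ray_tracing=True),
    "Performance": RenderSettings("1440p", 60, ray_tracing=False),
    "Balanced":    RenderSettings("1800p", 40, ray_tracing=True),
}

def custom_mode(base: str, **overrides) -> RenderSettings:
    """Start from a named preset, then apply user overrides from an advanced menu."""
    return replace(PRESETS[base], **overrides)

# e.g. a 60fps mode that keeps RT - a combination no fixed preset offers:
mine = custom_mode("Performance", ray_tracing=True)
```

Players who never open the advanced menu just pick a preset; everyone else composes the mode 10 fixed presets can't guarantee to include.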

Regards,
SB
 
Having 10 modes is great, but maybe all the ones that are clearly future-proofing could have been tucked behind an advanced tab. Still, when we're playing this backwards compatible on our PS8s, we'll be mad not every game had options like this.
 

I disagree. All console plebes really want is dynamic res and an option to uncap fps up to 60. That's all the future-proofing required for more powerful hardware overhead.

Options like this just take the entire plug-and-play aspect away under the illusion of consumer choice, and on the developer side it's harder to optimize software for a fixed platform, which is supposed to be a strength of console development and the resulting output.

I think that's really important
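For what it's worth, dynamic res of the kind described above is basically just a feedback loop on GPU frame time; a minimal sketch (target and step values invented):

```python
# Shrink the render scale when the GPU misses its frame-time budget,
# grow it back when there's comfortable headroom.
TARGET_MS = 16.6          # 60fps budget
STEP = 0.05               # how aggressively to adjust per frame
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def update_render_scale(scale: float, gpu_ms: float) -> float:
    if gpu_ms > TARGET_MS:           # over budget: drop resolution
        scale -= STEP
    elif gpu_ms < TARGET_MS * 0.9:   # headroom: raise resolution
        scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_time in [18.0, 17.5, 16.0, 14.0, 13.5]:
    scale = update_render_scale(scale, frame_time)
```

On stronger future hardware the same loop simply settles at a higher scale, which is why dynamic res plus an uncapped framerate covers most of the future-proofing case.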
 

The problem with PC is there's no way to set an appropriate default because there's a wide array of hardware. Even if consoles let you change hundreds of options, they would still be plug and play, because developers can just set a good default for players who don't want to tweak.
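That "good default" is trivial on a fixed platform; a sketch of the idea (tier names and the mapping are invented for illustration):

```python
# Map detected hardware to a sensible preset so players who never
# open the settings menu still get a reasonable experience.
DEFAULT_FOR_TIER = {
    "low":  {"resolution": "1080p", "rt": False, "fps_cap": 30},
    "mid":  {"resolution": "1440p", "rt": False, "fps_cap": 60},
    "high": {"resolution": "4K",    "rt": True,  "fps_cap": 60},
}

def default_settings(gpu_tier: str) -> dict:
    # Fall back to the most conservative preset for unknown hardware.
    return DEFAULT_FOR_TIER.get(gpu_tier, DEFAULT_FOR_TIER["low"])
```

On console there is exactly one tier per machine, so the default can be hand-tuned; on PC this table has to cover thousands of configurations, which is where auto-detection falls down.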
 

Isn't this exactly what GeForce Experience already does? Or for that matter most games that provide default settings based on your GPU?
 

GeForce Malware Experience might do it OK, not sure. I find the defaults provided by games that auto-detect to be quite bad. I'm not sure what they do to account for differences in resolution or refresh rate.
 
Isn't this exactly what GeForce Experience already does? Or for that matter most games that provide default settings based on your GPU?

Which is hit or miss. Granted I don't know anyone who has used it in the past 3-4 years so it might have gotten better. But back in the day, enabling GeForce Experience would prevent some people from being able to play Final Fantasy XIV because it would cause their GPU to randomly black screen while playing the game. And the only way to fix it was for them to hard reboot their machine even though they could hear the game still running, voice chat still working, etc.

This was a problem for some of our raiders until we managed to track down the problem and then required all raiders to run without GeForce Experience enabled. As soon as we did that all black screening by our raiders stopped on NV hardware.

Regards,
SB
 
The problem with PC is there's no way to set an appropriate default because there's a wide array of hardware. Even if consoles let you change hundreds of options, they would still be plug and play, because developers can just set a good default for players who don't want to tweak.

The only reason PCs usually have options is because hardware variance can go up or down, and the baseline for most core gaming is the console. Hence my point: consoles set the base, so they can't change significantly to account for major variances via big settings tweaks.

The only reason we have options on PS4 Pro and PS5 is because we're in a cross-gen period and there's weaker console hardware in the mix, leaving overhead for those settings. Once those are cut loose, we're back to the same issue: developers needing to get every ounce of power out of these machines, and not being able to properly optimize a baseline if they also have to account for hypothetical settings like some suggest.

Maybe the scale of games will stay as it is now and only resolution and graphics will need to change. But if so, much of the benefit of next gen vanishes, along with truly pushing the hardware.
 

Are console devs not pushing the hardware already? I'd assume a PS5 or Series X game is going to push the gpu pretty hard on the defaults the game installs with. I don't see much of a difference between how it works now and how it worked in the past, except that you have some options to tailor your experience in some games. Doesn't really impact the plug and play nature of consoles.
 
They are pushing them, but perhaps sub-optimally; there may be more optimal paths that are unsupported or extremely disadvantageous on last-gen hardware.
 
Isn't this exactly what GeForce Experience already does? Or for that matter most games that provide default settings based on your GPU?
Geforce Experience recommended settings are completely worthless in my experience. They don't make any sense at all and rarely come close to offering a good selection to maximize visual return for performance.
 
Yeah, they don't make any sense. I could see it if they were trying to showcase features that are nVidia exclusive, but, for example, GFE says I should run Injustice at 1080p even though I have a 1440p monitor (and it knows this, because it recommends 1440p for most other games), wants me to turn down FSAA quality in Battlefield 3 to the lowest setting, and limits my FPS in Doom Eternal to 60. AMD used to have a companion app (not sure if it's integrated into their drivers now), and the optimized settings it suggested were mainly there to circumvent their hardware shortcomings - so it would suggest turning down tessellation and the like.

Also, nVidia's driver control panel is ugly and old. AMD is so far ahead of them in this regard, and they have built in overclocking and performance statistics.
 
Are console devs not pushing the hardware already? I'd assume a PS5 or Series X game is going to push the gpu pretty hard on the defaults the game installs with. I don't see much of a difference between how it works now and how it worked in the past, except that you have some options to tailor your experience in some games. Doesn't really impact the plug and play nature of consoles.

Just maximizing res or fps, or slightly improving some settings, isn't really pushing the current gen in the ways console generations normally do. And that's down to accounting for cross-gen.
 
DF Article @ https://www.eurogamer.net/articles/digitalfoundry-2022-ghostwire-tokyo-pc-tech-review

Ghostwire: Tokyo on PC debuts impressive new DLSS competitor
But it's another port with stutter problems.

Ghostwire: Tokyo is a game with many surprises in terms of its technical make-up. Developer Tango Gameworks has delivered a gameplay concept I wasn't expecting, wrapped up in a very different engine from prior titles, offering up an exceptional level of graphical finesse. The move away from its own idTech-based engine to Unreal Engine 4 has clearly been a great enabler for the team, but I approached the PC version with some trepidation. Many recent PC releases have arrived with intrusive levels of stutter that impact the experience - no matter how powerful your hardware. It's especially common in Unreal Engine 4 titles - and unfortunately, it impacts Ghostwire: Tokyo too.
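The stutter described here shows up clearly in a frame-time trace; one rough way to quantify it (the spike threshold is an invented heuristic, not DF's methodology) is to flag frames that take far longer than the running average - the hitches typical of UE4 shader-compilation stutter:

```python
def count_stutters(frame_times_ms, spike_factor=2.0):
    """Count frames whose time exceeds the running average by spike_factor."""
    spikes = 0
    for i, ft in enumerate(frame_times_ms):
        if i == 0:
            continue
        avg = sum(frame_times_ms[:i]) / i   # average of frames so far
        if ft > avg * spike_factor:
            spikes += 1
    return spikes

# A smooth 16.7ms cadence interrupted by two ~100ms hitches:
trace = [16.7, 16.7, 16.7, 100.0, 16.7, 16.7, 95.0, 16.7]
```

A run at a locked 60fps scores zero here; the point is that even a handful of such spikes is immediately perceptible no matter how high the average framerate is.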

And that's frustrating for me, because there's so much to like here from a visual perspective - especially in terms of ray tracing features. On PC and PlayStation 5, ray traced reflections steal the show. RT reflections are applied liberally in Ghostwire: Tokyo, most striking on highly reflective surfaces where we get a perfect mirror-like effect. That said, they also apply to duller materials too, with a soft distorted look - computationally expensive but adding greatly to lighting realism.
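The roughness-dependent behaviour described above - mirror-sharp RT on smooth surfaces, soft distorted RT on duller ones - amounts to a per-material branch on roughness. A hedged sketch of the idea (the thresholds and technique names are invented, not the game's actual values):

```python
def pick_reflection(roughness: float) -> str:
    """Choose a reflection technique for a material based on its roughness."""
    if roughness < 0.1:
        return "rt_mirror"        # perfect mirror-like RT reflection
    if roughness < 0.6:
        return "rt_rough"         # RT with blur scaled to roughness - costly
    return "cubemap_fallback"     # too rough for RT to pay off visually
```

Extending RT into that middle band is what's computationally expensive: rough reflections need the hit results filtered or blurred, but it's also what sells the lighting on ordinary, non-mirror materials.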

 
First look at UE's TSR, Temporal Super Resolution, in the Ghostwire: Tokyo analysis.
 


I love RT, but the RT reflections in this game still look cheap, or overly exaggerated in many areas.
 