Digital Foundry Article Technical Discussion [2022]

Status
Not open for further replies.
Why do people keep bringing up this argument for this game? Every aspect of this game's technical makeup other than RT is mediocre by last-gen standards.
Because this game fits exactly the criteria that people were using to make the argument for dropping last gen and high framerates. Mostly I was making a joke, though. Not so much at the game's expense, but at the people who thought that dropping last gen and 60+fps meant that graphics were going to magically advance to some unseen levels.

Jokes aside, the work on Gotham Knights appears to me to be the work of a less experienced (probably smaller too?) yet still talented team.
This was the Arkham Origins team though, wasn't it? That's a game that launched pretty buggy on PC, but I actually found it to be underrated by most people. Real talk, I'm pretty disappointed in the technical showing of Gotham Knights, but I'll probably give it a go when it gets some patches and a discount.
 
the people who thought that dropping last gen and 60+fps meant that graphics were going to magically advance to some unseen levels.

The hardware just isn't really there for it. If we want '4K60', then that means reduced fidelity, either through 'performance modes' or a game designed around a 60fps target from the start. UE5, for one, won't manage much 60fps unless concessions are made. Honestly, even at 30fps we shouldn't expect too much fidelity-wise.
The 'humph, lazy devs' complaint isn't new anyway; it will be used throughout the generation whenever expectations aren't met.
 
A rather barebones port and a poor effort, but what did we expect from the team that ported Arkham Knight to PC?

A PS4 game shouldn't need optimized settings to run at 1440p/60 on an RTX 2060S, which has roughly three times the performance of the 7850-class GPU inside the PS4.

The PS5 trounces equivalent PC parts in terms of performance to a never before seen degree.
 
There's a reason Nixxes stepped in, unfortunately too late for this one. A game performing worse than Spider-Man does... lol.
Not much to sorrow over; UC4 isn't that class of game gameplay-wise, and sales will reflect that.
 
There's a reason Nixxes stepped in, unfortunately too late for this one. A game performing worse than Spider-Man does... lol.
Not much to sorrow over; UC4 isn't that class of game gameplay-wise, and sales will reflect that.
It might rise over the weekend, but the peak concurrent player count so far is a paltry 9,740. That's not even half of what Days Gone peaked at, and less than a fifth of what GoW and Spider-Man did. If it rises, I'd be surprised to see it breach 20,000.
 
There's a reason Nixxes stepped in, unfortunately too late for this one. A game performing worse than Spider-Man does... lol.
Not much to sorrow over; UC4 isn't that class of game gameplay-wise, and sales will reflect that.
Nixxes stepped in on Uncharted?


I'm still trying to find out whether or not the PS5 version uses video transitions like the game does on PC. I was hoping the DF video would have some answers for me, but I don't think Alex was too enthusiastic about this one after all the craziness of the past couple of weeks, lol. Maybe I'm wrong, lol... Oh well.
 
There's a reason Nixxes stepped in, unfortunately too late for this one. A game performing worse than Spider-Man does... lol.
Not much to sorrow over; UC4 isn't that class of game gameplay-wise, and sales will reflect that.
And there I was so happy for you that you finally have some games worth playing
 



Interesting observation on the slow loading times being limited to two threads. That definitely points to a suboptimal port. Great to see the properly matched PS5 settings, too. Looks like about a 7% performance difference between Ultra and PS5 settings for almost no visual gain. Certainly not enough to account for the performance deficit at lower resolutions. I'd like to see more analysis on that, as it looks like a major weakness in this title.
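To illustrate the two-thread observation (a toy model, not this game's actual loader), here's a rough sketch of why capping asset work at two worker threads throttles load times on modern 8+ core CPUs; the asset counts and per-asset cost are hypothetical numbers:

```python
import math

def estimated_load_seconds(num_assets: int, seconds_per_asset: float,
                           worker_threads: int) -> float:
    # Idealized model: assets are independent and parallelize cleanly,
    # so total time is the number of sequential batches each thread runs.
    return math.ceil(num_assets / worker_threads) * seconds_per_asset

# Hypothetical: 1,200 assets at 20 ms of decompression/processing each.
print(estimated_load_seconds(1200, 0.02, 2))  # two threads  -> 12.0 s
print(estimated_load_seconds(1200, 0.02, 8))  # eight threads -> 3.0 s
```

Real loaders hit I/O and memory-bandwidth limits before perfect scaling, but the gap between two and eight workers is the shape of the problem being described.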
 
Why do people keep bringing up this argument for this game? Every aspect of this game's technical makeup other than RT is mediocre by last-gen standards.

Probably because other people keep bringing up the false claim that a game is only going to really look great if it's current-gen only, and that is just patently not true. Absolutely nothing about being cross-gen prevents a game from potentially being the best-looking game of the current console generation. It all comes down to what the hardware target is.

So, sure, if a developer targets the previous generation of consoles and then scales the graphics up, it's going to hold back the game's looks.

OTOH, if a developer targets the current generation of consoles and then scales the graphics down to the previous generation, there is absolutely ZERO about it that would prevent it from looking just as good as or better than another game that is only released on the current generation of consoles.

It's just unfortunate that during the 2000s most developers switched to console development, and the practice of targeting the best hardware available (or even hardware that wasn't available yet) and then scaling down to lesser hardware gradually died. Epic, Crytek, DICE, and many other developers used to do this. Epic still does WRT engine development (UE5, for example, still supports last-gen consoles, and I don't see people saying UE5 looks like a cross-gen engine), but they no longer make games, so unfortunately we don't get to see what they could do with a cross-gen UE5 game.

Now, developers that really wish to push the graphics envelope and still benefit from last gen's console install base would do well to relearn what PC developers in the 90s and early 2000s were doing. Targeting the best hardware and then scaling down is always going to result in better-looking games than targeting the worst hardware and then scaling up.

The end result is a game that would look exactly the same as if that developer had targeted the current gen and hadn't bothered with scaling it down to the previous gen. Actually, I take that back: it'd look better if they were targeting the best PC hardware rather than the best console hardware and then scaling the game down to console hardware (current and past gen).

Of course, the other caveat to all of this is how many developers can afford the budget (time, money and manpower) to truly push current gen hardware or even the best PC hardware?

Regards,
SB
 
Nixxes stepped in on Uncharted?


I'm still trying to find out whether or not the PS5 version uses video transitions like the game does on PC. I was hoping the DF video would have some answers for me, but I don't think Alex was too enthusiastic about this one after all the craziness of the past couple of weeks, lol. Maybe I'm wrong, lol... Oh well.
Yeah, the PS5 version uses 30fps cutscene transition videos too! Both John and Oliver mentioned it, I believe, in their videos on the PS5 version.
 



Overall a very mediocre port, but at least disastrous elements like the shader stuttering were headed off at the last moment with the patch (albeit sending it to reviewers in that state was pretty incompetent).

It wasn't clear from the video, though: if you do have a very capable CPU like a 12900K and choose not to wait for the initial shader compile, do you still get bad stutter in gameplay on such an outsized CPU? Some uncapped PS5 comparisons might also have been appreciated to show precisely how GPU- and CPU-heavy this game is; it seems impossible for it to go over ~140fps on any GPU, and 120fps is a real struggle without the best rigs. So discovering where the bottlenecks lie outside of loading would have been interesting.
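The shader-compile stutter being discussed can be sketched as a toy simulation (all names and millisecond costs here are made-up illustration, not measurements from this game): with compile-on-first-use, each new shader permutation adds a one-time hitch to the frame that first needs it, while a prebuilt cache pays that cost up front during the loading/compile step instead.

```python
def frame_times_ms(shaders_used, compile_cost_ms, base_frame_ms, precompiled):
    """Per-frame times: an uncached shader pays a one-time compile hitch."""
    # With precompilation, every shader is already in the cache at startup.
    cache = set(shaders_used) if precompiled else set()
    times = []
    for shader in shaders_used:
        hitch = 0 if shader in cache else compile_cost_ms
        cache.add(shader)  # compiled once, cached thereafter
        times.append(base_frame_ms + hitch)
    return times

frames = ["rock", "water", "rock", "smoke", "water"]
print(frame_times_ms(frames, 120, 8, precompiled=False))  # [128, 128, 8, 128, 8]
print(frame_times_ms(frames, 120, 8, precompiled=True))   # [8, 8, 8, 8, 8]
```

The spiky first list is exactly the "first traversal stutter" pattern frame-time graphs show; a faster CPU shrinks the 120 ms hitch but can't eliminate it, which is why the question about the 12900K matters.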

Speaking of loading bottlenecks, I'm a little wary of the comment that the game "properly uncaps the framerate when loading"; this isn't necessarily ideal. Arkane's games and Control are like this. The problem comes when you cap the framerate outside of the game, as I'd like to do in Deathloop: Reflex low-latency is almost a requirement considering the awful input lag by default, but enabling Reflex also induces horrible microstutter, at least on my 60Hz TV. So, a RivaTuner cap of 60 to the rescue, and it largely eliminates that stutter. But... it then also affects loading speed. Deathloop's loading is crippled if you cap the framerate externally, and this should never happen: you don't tie loading speed to the main renderer framerate. That may not be the case here, but it's something I noticed.
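Why an external cap cripples that kind of loader can be shown with a tiny model (hypothetical chunk counts, not measured from Deathloop): if the loader processes a fixed amount of work per rendered frame, total load time becomes inversely proportional to the framerate, so a 60fps cap makes loading several times slower than an uncapped loading screen.

```python
import math

def load_seconds(total_chunks: int, chunks_per_frame: int, fps: float) -> float:
    # Loader ticks once per rendered frame and processes a fixed chunk
    # budget each tick, so load time scales with 1/fps.
    frames_needed = math.ceil(total_chunks / chunks_per_frame)
    return frames_needed / fps

print(load_seconds(3000, 2, 300))  # uncapped loading screen at 300 fps: 5.0 s
print(load_seconds(3000, 2, 60))   # external 60 fps cap: 25.0 s
```

Decoupling the streaming thread from the render loop (or uncapping only the loader's tick rate) avoids this, which is the complaint being made.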

They forgot to criticise the ringing caused by DLSS.

That as well. If you're going to say it's basically the default to enable FSR/DLSS on PC, especially at the lower quality levels, I'd like to see that substantiated with some comparisons against native and the PS5. Especially when they struggled with their DLSS implementation before the patch, with obvious glitches that should never have shipped to reviewers; that really doesn't give much confidence that they've covered the edge cases where DLSS can break down, as it does in several games. I mean, Spider-Man is Nixxes, the 'gods' of PC porting, and they still have some pretty bad DLSS artifacts in spots.
 
Yeah, the PS5 version uses 30fps cutscene transition videos too! Both John and Oliver mentioned it, I believe, in their videos on the PS5 version.
Who would you have to kill to get an interview with Naughty Dog about their engine?
 
I've worked on low-memory systems, and pre-allocating pools of space in RAM is normal. When you are dealing with hundreds, probably thousands, of assets, allocating memory dynamically in what is already a small window of memory and tracking it all through linked lists of pointers and sizes can often consume vastly more memory than it would theoretically save, and it has a performance hit.
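A minimal sketch of the pooling idea being described (illustrative only; real engine pools are done in C/C++ with raw memory, not Python objects): reserve a fixed number of equal-size slots up front and hand out slot indices from a free list, so there's no per-allocation bookkeeping node and allocation/release is O(1).

```python
class FixedPool:
    """Pre-allocated pool of equal-size slots managed by a free list.

    Trades flexibility for predictability: no heap fragmentation, no
    per-asset pointer/size tracking, but the pool size is fixed up front.
    """

    def __init__(self, slot_count: int):
        self.free = list(range(slot_count))  # every slot starts free
        self.in_use = set()

    def alloc(self) -> int:
        if not self.free:
            raise MemoryError("pool exhausted")  # budget overrun, not OOM
        slot = self.free.pop()
        self.in_use.add(slot)
        return slot

    def release(self, slot: int) -> None:
        self.in_use.remove(slot)
        self.free.append(slot)
```

Exhausting the pool fails loudly at a known budget instead of fragmenting a small heap, which is exactly why this pattern shows up on memory-constrained consoles.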
 