Digital Foundry Article Technical Discussion [2023]

Status
Not open for further replies.
With what CPU?
Whichever you deem fit. You're the one claiming the PS5 runs this game at a level comparable to a 3080. Data has been presented showing this isn't the case (a 3080 average-framerate benchmark, and the GamingTech video comparing a light-load area against a low-framerate area from that 3080 benchmark). You need to provide a data-driven response to prove yourself right. To date, all you've said is that other people are wrong, and you've referenced various numbers without clear context, such as an arbitrary "71 fps" PS5 screenshot.

Simply provide your data that shows 3080 performance and PS5 performance as comparable. ;)

Edit: I'm impartial. I have no idea what this game runs at on any platform and have no expectations. I'll side with whatever data clearly shows the difference between a PS5 and a 3080.
 
Agreed, very good to see the occasional update video. Frankly I would have waited another month considering the state of it, but hey, 7 weeks is already a hell of a generous buffer before you 'officially' review a title. Like many have said, if this title is going to be fixed to the degree it needs, it's a months-long process.
I've not seen the video yet, and it might be included, but I'd be interested in a video covering the release of games in what is objectively a 'broken' state and the time it takes for them to be significantly fixed. Not fully or highly optimised, just no longer objectively broken.
 
Before floppy drives, you may recall that loading from tape on computers like the ZX Spectrum and Commodore 64 went through an epic "turbo loader" phase. What these did was take the CPU time that sat idle while loading uncompressed data and use it instead to decompress compressed data in realtime during the load.

Why am I telling you this? I just like the stripey screen colours during 8-bit computer tape loads in the 1980s. :yes:
You're lucky, my VIC-20 just had a boring blue screen loading from the datasette!
 
You're lucky, my VIC-20 just had a boring blue screen loading from the datasette!
That was the default on the Commodore 64 as well, but software gradually migrated to using a turbo loader, which first loaded a small pre-loader containing the decompressor (and sometimes a title/loading screen), then proceeded to load and unpack the compressed game in about a tenth of the time it would have taken without a turbo loader.
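The idea can be sketched with a toy run-length scheme (purely illustrative; real C64 crunchers used cleverer LZ-style formats, and the loader was hand-written 6502 assembly, not Python):

```python
# Toy "cruncher" in the spirit of a turbo loader: the loader streams
# compressed bytes in and unpacks them on the fly, trading otherwise
# idle CPU time during the load for far fewer bytes read from tape.
# Hypothetical format: a stream of (run_length, value) byte pairs.

def crunch(data: bytes) -> bytes:
    """Compress runs of repeated bytes into (count, value) pairs."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

def decrunch(packed: bytes) -> bytes:
    """Expand (count, value) pairs back into the original bytes."""
    out = bytearray()
    for i in range(0, len(packed), 2):
        count, value = packed[i], packed[i + 1]
        out += bytes([value]) * count
    return bytes(out)
```

A mostly empty memory dump (long runs of zeros, like a sparse loading screen) shrinks dramatically, which is where the "tenth of the time" loads came from.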
 
I've not seen the video yet, and it might be included, but I'd be interested in a video covering the release of games in what is objectively a 'broken' state and the time it takes for them to be significantly fixed. Not fully or highly optimised, just no longer objectively broken.

It has been suggested previously that this is done in the DF Directs.

Not as an in-depth analysis, but a quick 60-second clip per game showing frame rate before and after.
 
I'm curious about one thing though: what is it with the Naughty Dog engine that makes the PS5 perform so remarkably well relative to PCs? We don't see that with other Sony titles such as Horizon, God of War, or Returnal. Generally the PS5 tends to perform better, which is a given, but in TLOU Part I and Uncharted 4 it performs like a 2080 Ti, which is kind of insane.

Given these are games built specifically for the PS5 by a studio famed for its low-level optimisation, I'd say it's simply a matter of the engine utilising lots of PS5 fast paths and low-level API tricks that don't directly translate over to PC APIs and architectures.

And rather than rework all these paths and tricks into something that works more efficiently on the PC, which might require costly and time-consuming re-engineering at the core of the engine, they probably just looked at the PC landscape and decided it was easier to chuck in FSR/DLSS to 'equalise' performance on console-level GPUs and let higher-end GPUs brute force their way to higher settings.

While annoying for tech heads like us that like to compare GPU performance, the reality is that with upscaling the GPU performance is actually 'good enough' in this game for the vast majority of PC users to have a pretty good experience.

The CPU performance on the other hand is a genuine issue and probably deserves more of their attention.
 
Glad they're revisiting it.

Hopefully there really will be another Dead Space Remake video.

After I switched to a Ryzen 7 7800X3D it does stutter less. However, it still stutters when loading new areas. Therefore we can't let EA and Motive Studio off the hook so easily for this unfinished software product.

There has been no patch for months and, most likely, no statement either.

I would have to say the ray-tracing implementations in Hitman 3, Metro Exodus, Chernobylite, and Call of Duty: Black Ops Cold War are quite excellent and surpass Control's RT in many areas.

I disagree. In Hitman, the shadows in the reflections also seem to be missing. Control has a far more polished presentation.

Still, I have purchased Control twice and am currently comparing the unmodified version with the modified one, and the differences are larger than I had suspected. The image is much sharper and there are even fewer errors. To me, the modded version is the better one.

The modder also said he is working on a version that reduces ray-tracing noise with DLSS. Currently, even with DLSS at native resolution, there is shimmering noise in some places that does not occur at native resolution without DLSS.

When Resizable BAR is enabled this game also has a kind of permanent tearing in the prerendered cutscenes.
 
Given these are games built specifically for the PS5 by a studio famed for its low-level optimisation, I'd say it's simply a matter of the engine utilising lots of PS5 fast paths and low-level API tricks that don't directly translate over to PC APIs and architectures.
And you can go a little further with this line of thinking: consoles can also have specific instructions for both the CPU and GPU that may not exist on PC. If you are leveraging those a lot, conversion may be very challenging. These kinds of "savings" are fairly powerful optimisations because they reduce the number of instructions required to complete a function.
 
Great piece of kit. The ones that are really going to shine are the Gen 2 Steam Deck and Ally.

They made it work in Gen 1; Gen 2 is going to be about fixing all the wrongs and, ideally, adding more battery life. Gen 3 onward will largely be evolutions.
 
And you can go a little further with this line of thinking: consoles can also have specific instructions for both the CPU and GPU that may not exist on PC. If you are leveraging those a lot, conversion may be very challenging. These kinds of "savings" are fairly powerful optimisations because they reduce the number of instructions required to complete a function.

One other thing with a fixed platform is that you can use potentially unsafe code, because its behaviour is predictable enough that, if it brings a speed benefit, you can still get away with it.

Back in the day, on older consoles, it was standard practice for the better developers (in terms of optimization, speed of execution, etc.) to regularly do things they shouldn't do. Like, for example, purposely adding items to an array so that it exceeded its allocated memory, then doing clever tricks on the data outside the array's bounds.

Lots of weird memory hacks that were used on consoles that would immediately cause an issue with a modern OS capable of running multiple programs simultaneously. Or exploiting bugs (ahem, "undocumented features") in the hardware to enable interesting effects.

There's likely still some of that hackery (even if not intentionally violating memory bounds) going on with developers who are trying to eke out as much performance as possible from the fixed hardware in a console.
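A toy model of why this worked (the memory map and addresses here are made up for illustration; real console code did this in C or assembly against the actual hardware layout):

```python
# Toy model of a console's fixed, flat memory map. On fixed hardware,
# "overflowing" a buffer lands at a known address, so developers could
# (ab)use the layout. In C on a modern OS the same trick is undefined
# behaviour and may be reordered, corrupted, or crash outright.

RAM = bytearray(64 * 1024)   # hypothetical 64 KB flat address space

SPRITE_X = 0x1000            # hypothetical 8-byte array of sprite X coords
SPRITE_Y = 0x1008            # ...immediately followed in memory by the Y coords

def write_array(base: int, values: list[int]) -> None:
    """Write values starting at a base address, with no bounds checking."""
    RAM[base:base + len(values)] = bytes(values)

# Writing 16 values through the 8-slot X array deliberately "overflows"
# into the Y array. Predictable here only because the layout is fixed.
write_array(SPRITE_X, list(range(16)))
assert RAM[SPRITE_Y] == 8    # the 9th X value landed in Y[0], as planned
```

The point is that the "bug" is only exploitable because every retail unit has byte-identical memory layout; the moment an OS, ASLR, or a different hardware revision enters the picture, the trick breaks.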

Regards,
SB
 
While annoying for tech heads like us that like to compare GPU performance, the reality is that with upscaling the GPU performance is actually 'good enough' in this game for the vast majority of PC users to have a pretty good experience.

Kind of. The problem is that at output resolutions of 1440p and lower, combined with having to drop DLSS/FSR below Quality mode to maintain 60fps (at least in terms of GPU load), plus TLOU's subpar implementation of reconstruction tech in general, you're getting worse image quality than the PS5 at its native 1440p even when using that reconstruction. Slightly worse shadows, lots of moire patterns, significant breakup/shimmering with the flashlight, etc. At a 4K output it fares much better. But unlike ports where you start from only a slight rasterized performance deficit versus the console version and DLSS gets you better performance, potentially with improved quality on lower to midrange cards, here DLSS is required just to get close to the same performance, and with noticeably worse image quality.
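The internal-resolution math makes the 1440p problem concrete. These are the published per-axis scale factors for the standard DLSS/FSR 2 quality modes; the game's exact behaviour with dynamic scaling may differ:

```python
# Per-axis render-scale factors for the common DLSS/FSR 2 modes
# (vendor-published values; a given game may deviate).
MODES = {
    "Quality":     1 / 1.50,   # ~0.667 per axis
    "Balanced":    1 / 1.72,   # ~0.581 per axis
    "Performance": 1 / 2.00,   # 0.500 per axis
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and mode."""
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

# At a 1440p output, Performance mode renders only 720p internally,
# which is why it struggles against the PS5's native 1440p image.
print(internal_res(2560, 1440, "Performance"))  # (1280, 720)
print(internal_res(3840, 2160, "Quality"))      # (2560, 1440)
```

Note the asymmetry: Quality mode at a 4K output still reconstructs from a full 1440p image, while anything below Quality at a 1440p output is working from 720p-class data.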

Still, it's not a complete disaster (well, at the $70 asking price...?). Provided they can revert those new shader stutters (😡), at least the damn engine seemingly scales. Still an exorbitant amount of CPU/GPU required, but it at least soaks up all the cores and CUs you throw at it. Well, aside from the efficiency cores. :)

On the other hand:

Hopefully there really will be another Dead Space Remake video.

Alex kind of hinted in the last DF Direct that one would be coming. I imagine it will be very short considering that nothing has really been improved; it was basically done and dusted at launch. I played the trial and it was, uh, not great on a 12400F with the traversal stutters, as you can imagine, but not even implementing the DLSS mipmap fix is incredibly negligent.
 

I had it on in the background and may have missed it, but Richard talks about noticing increased stuttering in some games compared to the Steam Deck. One of the main concerns I've had with Windows-based handhelds is that they don't get to take advantage of Steam's distributed precompiled shader caches, which are especially critical for a low-power device. Could the lack of a delivered shader cache be contributing to this stutter?
 
I had it on in the background and may have missed it, but Richard talks about noticing increased stuttering in some games compared to the Steam Deck. One of the main concerns I've had with Windows-based handhelds is that they don't get to take advantage of Steam's distributed precompiled shader caches, which are especially critical for a low-power device. Could the lack of a delivered shader cache be contributing to this stutter?
Almost certainly.

I'm super interested in this new handheld PC market, but being honest, seeing these new, more powerful devices come out only makes me more excited for the future of the Steam Deck lineup than anything else. Valve is hitting the right note with their blend of price, power, usability, and customizability. The ace in the hole is that Valve can go in and make a bunch of specialized profiles for games, and fix outstanding issues with them in ways that Asus or any other handheld PC platform holder won't be able to.

The Deck 2 is going to be a VERY compelling product.
 
Almost certainly.

I'm super interested in this new handheld PC market, but being honest, seeing these new, more powerful devices come out only makes me more excited for the future of the Steam Deck lineup than anything else. Valve is hitting the right note with their blend of price, power, usability, and customizability. The ace in the hole is that Valve can go in and make a bunch of specialized profiles for games, and fix outstanding issues with them in ways that Asus or any other handheld PC platform holder won't be able to.

Exactly. All these do is make me think "Man, the Deck 2 is gonna be great". The reason the Steam Deck works so well is that Valve thought through all the downsides of delivering a handheld running Windows; it's far more than just a thumbstick-friendly frontend. SteamOS, and Valve's surrounding infrastructure, is just as crucial, if not more so, than the APU in it.
 
One other thing with a fixed platform is that you can use potentially unsafe code, because its behaviour is predictable enough that, if it brings a speed benefit, you can still get away with it.
Yes. Back when I was working on PlayStation Home, a buddy of mine was pushing hard on the PS3, and he was basically taking advantage of memory overflows, which you cannot do on PC. I forget exactly what he did, but he was writing or reading past GPU memory bounds in a very specific way that can't be done on PC. And apparently everyone coding on the PS3 was doing this to extract significantly more performance out of it.
 
The actual performance of these competing devices is very disappointing to me, but also understandable.
But was it inevitable? Could they have pushed AMD to make another truly custom SoC for devices like this, as AMD did for Valve? If Valve comes out with a new Deck built on another custom SoC, it'll demolish all this competition.
(Talking about a hypothetical Deck 2; I think Zen 4 cores, with their big FPUs, are a waste of space.)
 
If these devices only reach PlayStation 4 performance now, it will take another 7 years until they reach PlayStation 5 performance. So path tracing on them is perhaps 15 years away, while I can already experience it at home on the PC. Technically I don't find them that exciting either. But it's nice to see that you can now create beautiful 3D worlds on them. The last time I played on a handheld, truck wheels were hexagonal. Yes, I just checked (Metal Gear Solid: Portable Ops). I found 2D games on the Game Boy etc. aesthetically more pleasing than PSP games.

Like movies I want to enjoy video games on the largest possible screen with high end sound. A game with a great presentation doesn't work on a mini screen. That's why I've been playing on the TV for a long time and not on a monitor. I don't watch movies on my smartphone or tablet either. Interstellar, Blade Runner 2049, Dune etc. don't work on it.


Alex kind of hinted in the last DF Direct that one would be coming. I imagine it will be very short considering that nothing has really been improved; it was basically done and dusted at launch. I played the trial and it was, uh, not great on a 12400F with the traversal stutters, as you can imagine, but not even implementing the DLSS mipmap fix is incredibly negligent.
It got a little better. The VRAM demand was significantly higher at the beginning.
 
That was the default on the Commodore 64 as well, but software gradually migrated to using a turbo loader, which first loaded a small pre-loader containing the decompressor (and sometimes a title/loading screen), then proceeded to load and unpack the compressed game in about a tenth of the time it would have taken without a turbo loader.

Didn't that come with some risk of reduced / removed error correction?

Asking as someone who was / is a C64 owner. Still got my original. But sold all my CRTs. Dick move.
 