Digital Foundry Article Technical Discussion [2024]

If the game's (ND's) fidelity matches the trailer... then I suspect DXR will need an overhaul, or developers will demand one, because I suspect Sony is giving ND full control over how the fixed-function hardware works in order to hit this fidelity and performance on the hardware profile of a PS5.
I think the trailer was pre-rendered. They said it was running on a base PS5? Pretty curious how it hits native 4K at 60 fps when their last game was only 1440p. I suspect a UC4 situation where they are showing their visual target, but not necessarily with the IQ and/or framerate of the final product.

Graphically though, independent of res, I actually think it shows a massive amount of continuity with TLOU Part 1 and 2's cutscene quality; not too much different, actually. It looks realistic for a PS5 game's directed cutscenes.

TW4, on the other hand, looked massively beyond current gen IMO, and I laughed audibly at the "pre-rendered on an unreleased GPU" statement.
 
I've always wondered why devs don't get the players on their side to help them push for change on stuff like this. Start mentioning how restrictive DXR is compared to console RT APIs, and cite it as a reason why things aren't as good as they could be. Players see it, outlets like DF catch it and amplify it... then maybe things will change a bit quicker?
Because most customers, besides nerds like us who are a minority, wouldn't understand what they're talking about.
 
We are the vocal ones though. We're the ones they need to reach and could use to amplify their voices. We might not all understand the issues on a low level, but they don't need us to. They simply need to hear large numbers of people complaining that nothing is being done to address developers' concerns. If people start constantly bringing up DXR and how bad it is with every RT game release... it will gain a reputation that MS probably wouldn't want.
 
If people play a game and don't feel there's a problem, then you're going to have a hard time convincing them to speak against it. Just compare DF's Silent Hill 2 PC video with the user score on Steam, for example.
 
Entirely different groups of people. The people I'm talking about are the ones who post on gaming forums, complain about performance, and watch people like Digital Foundry. That's more than enough people who would gladly speak out on issues for the betterment of the platform. All we need is a little bit of direction.

I see more and more discussion of issues with Windows: people calling out the many regressions that have been happening, people calling out longstanding issues, people talking about how Windows is terrible for handhelds, and so on. The more people speak out about it, the more it catches on, and eventually the right people start feeling some pressure. People can deny it all they want, but we have SteamOS putting some pressure on Microsoft to make Windows better for handhelds.

Perhaps developers who want to push for some real change in the industry could nudge people in the right direction and focus some of that pressure to further those goals sooner. I realize developers cannot come right out and speak about things that are behind NDAs, or make direct comparisons between architectures... but they can certainly speak out about things in PC-land not being anywhere near as optimized as they could be, and keep planting those seeds until consumers start raising their voices and pushing for change.
 
Yeah 2 bounces - with a cache that does another 2?
I believe Cyberpunk uses 2 rays per pixel and 2 bounces for its "Path Tracing" mode.
Actually, Cyberpunk's path tracing went through several stages of evolution (from the 1.6 Overdrive patch through the 2.0 and 2.1 patches): it used to rely on ReSTIR DI + an in-house GI solution, but has since been upgraded to a solution that combines ReSTIR DI + ReSTIR GI.

Currently it runs at 1 sample per pixel + 4 main bounces + near-infinite bounces from a radiance cache. It's a vendor-agnostic world-space cache called "SHaRC", as opposed to the neural cache "NRC", which runs exclusively on tensor cores. For comparison, Lumen also uses a vendor-agnostic radiance cache, but it works in screen space, not world space.

SHaRC is also used in Portal RTX and Portal RTX: Prelude, which have the highest bounce counts of any path-traced titles out there (8 main bounces + infinite from the cache).

You can learn more about this from the CDPR presentation, at minute 39:30.
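
To make the cache idea concrete, here's a minimal toy sketch of how terminating a path into a world-space radiance cache stands in for the remaining bounces. This is Python under my own assumptions (voxel size, bounce count, blend factors are all illustrative); the real SHaRC is a GPU hash grid, not this dictionary:

    import random

    VOXEL_SIZE = 0.5      # cache resolution -- an illustrative assumption
    MAIN_BOUNCES = 4      # Cyberpunk reportedly traces 4 main bounces
    radiance_cache = {}   # voxel key -> running radiance estimate

    def cache_key(pos):
        # Quantize a world-space position to a voxel, hash-grid style.
        return tuple(int(c // VOXEL_SIZE) for c in pos)

    def trace_bounce(pos):
        # Stand-in for a real ray cast: next hit position plus the
        # radiance picked up along the way (toy random values).
        nxt = tuple(c + random.uniform(-1.0, 1.0) for c in pos)
        return nxt, random.random() * 0.1

    def shade_path(origin, spp=1):
        total = 0.0
        for _ in range(spp):               # 1 sample per pixel in the PT mode
            pos, radiance, throughput = origin, 0.0, 1.0
            for _ in range(MAIN_BOUNCES):  # the "main" bounces are traced for real
                pos, emitted = trace_bounce(pos)
                radiance += throughput * emitted
                throughput *= 0.7          # toy surface albedo
            # Instead of bouncing further, terminate into the cache: this
            # is what makes the remaining bounces effectively "free".
            key = cache_key(pos)
            cached = radiance_cache.setdefault(key, 0.05)
            radiance += throughput * cached
            # Feed the estimate back so the cache converges across frames.
            radiance_cache[key] = 0.9 * cached + 0.1 * radiance
            total += radiance
        return total / spp

    print(shade_path((0.0, 0.0, 0.0)))

The interesting part is the end of the loop: the cache both answers the lookup and gets refined by it, which is how the bounce count becomes "near infinite" over time.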
 
Not sure if things have changed in the industry and companies just aren't sharing like they used to. When FXAA came out, it seemed to spread quickly, and implementations appeared all around. Now Guerrilla has a secret and seemingly leading non-AI upscaler. We really need someone to come out and invest real time into making something open source. FSR kind of sucks. I use DLSS because it's the best option on PC, but I'd 100% prefer a solution that could run on every GPU. Not sure if it's just an oversight in the industry and most companies don't want to invest the resources. Not sure who the good samaritan would be. Was expecting Microsoft to come out with something, but... yah.
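
For what it's worth, the core accumulation loop of a non-AI temporal upscaler (the kind of thing FSR 2, and presumably Guerrilla's solution, is built around) is conceptually simple; it's the edge cases (disocclusion, ghosting, history rectification) that eat the engineering budget. A deliberately toy 1-D Python sketch, with everything here assumed rather than taken from any real implementation:

    def temporal_upscale(history, new_samples, alpha=0.1):
        # history: the accumulated high-res signal (already reprojected
        # with motion vectors in a real upscaler).
        # new_samples: this frame's jittered low-res samples mapped onto
        # high-res pixel positions; None where no sample landed.
        out = []
        for h, s in zip(history, new_samples):
            if s is None:
                out.append(h)  # keep history where we got no new data
            else:
                # Exponential blend; real upscalers also clamp/rectify
                # history against nearby new samples to kill ghosting.
                out.append((1 - alpha) * h + alpha * s)
        return out

    history = [0.5] * 8
    frame = [0.9, None, 0.9, None, 0.9, None, 0.9, None]  # jittered samples
    print(temporal_upscale(history, frame))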
 
There's definitely been a history of sharing; hopefully they showcase it at the next conference.
But if developers stop sharing because of "stuff", that will be problematic. I feel like there will be a lot of stagnation in graphics.
 
I don't see this stagnation in graphics, quite the opposite.
AI upscaling came and others had to play catch-up, and are still doing so.
RT came and others had to play catch-up, and are still doing so.

It is not like NVIDIA is resting on its laurels, quite the opposite.
They are just using their software on top of their hardware to separate themselves from the competition.
And the market numbers show they are getting rewarded for it.
 
That's a very pro-IHV attitude that doesn't really address the concern. Throughout all of gaming's history, developers have freely shared ideas and algorithms, and this has driven iterative evolution across the entire industry. E.g., Intel shared MLAA as a concept, from which an entire new branch of AA algorithms was spawned. Everything from line drawing to pathfinding to flocking to GI solutions has been free to use and develop.
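
(For the curious, the kernel of the MLAA idea is tiny, which is part of why it spread so fast once Intel published it. Here is a 1-D Python toy of just the detect-and-blend step; real MLAA classifies edge shapes (L/Z/U patterns) and computes coverage-based blend weights, and the threshold below is an arbitrary assumption.)

    def mlaa_1d(row, threshold=0.25):
        # Detect aliased "edges" as large jumps between neighbors and
        # blend across them; a crude stand-in for MLAA's shape analysis.
        out = list(row)
        for i in range(len(row) - 1):
            if abs(row[i] - row[i + 1]) > threshold:
                avg = 0.5 * (row[i] + row[i + 1])
                out[i], out[i + 1] = avg, avg
        return out

    print(mlaa_1d([0.0, 0.0, 1.0, 1.0, 1.0]))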

Irrespective of whether a dominant market player uses its considerable financial advantages to progress its product lines, a shift toward keeping one's ideas to oneself for competitive advantage would be seismic, and it would leave the industry far worse overall.
 
Yup, NVIDIA is pushing things forward, and yet current RT is a joke; they have a lot of work to do to make it as compelling as it is in Quake 2 RTX and the VERY FEW other games like it.

There are games where I enable RT to the fullest and barely notice anything versus non-RT, except for the huge drop in fps. And it's not just me; there are a lot of screenshot comparisons where there isn't much of a difference. It exists, but if someone doesn't point out which is the RT image and which is the non-RT one, many people would fail to discern the difference.

It's not that RT is useless, but it's still in very early development. Taking into account that SIGGRAPH demos using RT back in the day took many days to render a single frame, I can understand that, but RT isn't some kind of panacea right now; it's still a joke and a resource hog in more than 95% of the games implementing it.
 
MachineGames has updated Indiana Jones on Series X/S to fix issues with how the GI looked or worked before:


Here are the before and after pics showing the issue with how the shadowed areas were "glowing", and how they now work more akin to how it looks on PC:


I am posting this here because of the DF tech review and how the Series consoles' GI setting was lower than PC low.
The update also contains several fixes for the game on PC, including fixes to DLSS.
 
Awesome!
 
That's a very strange oversight or bug; it looked very different.
 