Digital Foundry Article Technical Discussion [2024]

Call me paranoid, but no FSR or XeSS, crippling VRAM demands, and the lack of scaling almost seem by design to gatekeep low-end GPUs.
They could be waiting for the FSR4 launch so they can come out of the gate as a game supporting it from day one. It's just a month away at this point.

It's most certainly worth it for them to wait and test FSR4 carefully during this time.
 
Indiana Jones looks much, much better and more cinematic with full ray tracing even if it's without the important ray reconstruction.

In general, raster shadows are ugly and gamey-looking. I'm getting sick of seeing stuff like shadow maps, floating-looking objects, etc.
 
Indiana Jones looks much, much better and more cinematic with full ray tracing even if it's without the important ray reconstruction.

In general, raster shadows are ugly and gamey-looking. I'm getting sick of seeing stuff like shadow maps, floating-looking objects, etc.
I hope the RT isn’t noisy…
 
Still lots of baked shadows it seems, makes sense for this game.
I don't think they are baked, just seems like the local light shadows are not replaced with RT for some reason. Also looks like some of the foliage is missing from the RT shadows, but only particularly noticeable if you A/B it. Curious to see the outdoor shots from the first review with the new path; hopefully adds more depth to those scenes too!
 
I don't think they are baked, just seems like the local light shadows are not replaced with RT for some reason. Also looks like some of the foliage is missing from the RT shadows, but only particularly noticeable if you A/B it. Curious to see the outdoor shots from the first review with the new path; hopefully adds more depth to those scenes too!
Not my shots but here's a few:

[Screenshot comparisons attached: TheGreatCircle_2024_12_08 captures]
 
I wonder if there’s a practical business calculation at work here too. Is adding 4GB to a high volume part worth the added cost if it dampens criticism but the vast majority of users don’t need the extra memory anyway?

i.e. are the YouTube talking heads and vocal forum dwellers representative of the average 4060 user?

My guess is it’s not worth it if there are no other compelling options on the market. But if there’s competition and the criticisms cause enough people to consider competing solutions then the economics shift in favor of more VRAM even if the reality is most people still won’t benefit.

There are a few layers to this.

In terms of adding VRAM, the method with the lowest technical and direct business impact would be to simply offer SKUs with 2x the VRAM (double-sided), as is done with the desktop 4060 Ti 8GB/16GB (this wouldn't work for mobile), priced at BoM plus margin. Instead we only have one GPU (the 4060 Ti) with this option, and it's offered well above BoM plus margin, at a $100 premium. Users who want the VRAM can choose to pay for it; those who don't, don't have to. The suspected reason they don't do this is likely product segmentation and potential lost future sales, i.e. the indirect business considerations and opportunity cost at play. We know this is likely the reason; it doesn't sit well with the people complaining about this, and they would never directly admit to it if asked (due to the negative optics involved).

Now, in terms of what you're specifically mentioning, I would agree that the entry-level 8GB cards are likely fine for most users, who wouldn't care, and that actually redesigning these cards to go from a 128-bit bus to 192-bit has trade-offs of its own. From this perspective it's actually interesting that Nvidia's lack of VRAM is in large part due to how efficient they are at extracting performance from a narrower memory bus versus competitors, mostly ever since Kepler. Most GPU buyers don't follow any of this at all and aren't aware of this discourse.
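To put rough numbers on the bus-width trade-off, here's a quick back-of-envelope calculation. The 18 Gbps GDDR6 per-pin data rate is just an illustrative assumption, not any specific SKU's spec:

```cpp
// Back-of-envelope peak memory bandwidth: bus width in bytes * per-pin data rate.
// The 18 Gbps figure below is an illustrative assumption, not a specific SKU spec.
#include <cstdio>

double peak_bandwidth_gb_s(int bus_width_bits, double data_rate_gbps_per_pin) {
    // Each pin moves data_rate_gbps_per_pin gigabits per second.
    return (bus_width_bits / 8.0) * data_rate_gbps_per_pin;
}

int main() {
    std::printf("128-bit bus @ 18 Gbps: %.0f GB/s\n", peak_bandwidth_gb_s(128, 18.0)); // 288 GB/s
    std::printf("192-bit bus @ 18 Gbps: %.0f GB/s\n", peak_bandwidth_gb_s(192, 18.0)); // 432 GB/s
    return 0;
}
```

Same memory chips, 50% more bandwidth just from the wider bus, which is exactly the kind of board and die redesign cost Nvidia avoids by leaning on cache and compression instead.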

In terms of the competition angle, an issue is that GPUs aren't commodities. I know the benchmarking crowd wants to distill it down to just FPS numbers, but in reality the IHVs are not substitutable with each other based solely on aggregate FPS results. There's also an interesting conundrum in that, when most people want more VRAM, I feel they specifically mean on Nvidia cards, because those are better suited to leverage said VRAM. There are a lot of factors in deciding on a GPU, so it's hard for any one factor to push people from one alternative to another. We'd likely see the true demand for VRAM if they offered every GPU with a 2x VRAM alternative, as mentioned earlier, but they won't for larger business reasons.
 
Indiana Jones performs very poorly on my Intel A770 16GB. The game defaults to High Settings, but at 1440p, the performance is subpar, and it remains poor even at 1080p. The game averages around 25fps, with drops to 17-18 fps, using these default settings.

Reducing the resolution from 1440p to 1080p does not improve the frame rate. I hope Intel releases a patch soon, as I won't be playing until then. Even using Lossless Scaling x4 did not make the game feel smooth on my A770.

Interestingly, Doom Eternal ran flawlessly, so I'm not sure why this game struggles. The GPU isn't fully utilized, the fans stay quiet, and the GPU doesn't overheat.

Despite these performance issues, the game looks visually impressive.
 
Indiana Jones performs very poorly on my Intel A770 16GB. The game defaults to High Settings, but at 1440p, the performance is subpar, and it remains poor even at 1080p. The game averages around 25fps, with drops to 17-18 fps, using these default settings.

Reducing the resolution from 1440p to 1080p does not improve the frame rate. I hope Intel releases a patch soon, as I won't be playing until then. Even using Lossless Scaling x4 did not make the game feel smooth on my A770.

Interestingly, Doom Eternal ran flawlessly, so I'm not sure why this game struggles. The GPU isn't fully utilized, the fans stay quiet, and the GPU doesn't overheat.

Despite these performance issues, the game looks visually impressive.
Did you check your GPU usage? I also get bad and inconsistent performance on my 4090 at max settings at 3440x1440 with DLSS Quality. GPU usage is extremely low and I get random fps drops all the time.
 
Did you check your GPU usage? I also get bad and inconsistent performance on my 4090 at max settings at 3440x1440 with DLSS Quality. GPU usage is extremely low and I get random fps drops all the time.
You are a lucky guy anyways. Are you using path tracing?

Switching to Ultra textures reduces performance by about 7 to 10fps. With the default settings on High there's a noticeable improvement in framerate, but it's still low (30-31fps). I observed GPU utilization hitting 100%, yet the overall performance remains underwhelming, especially considering how the game runs on an RTX 3050 4GB laptop variant (39fps average on Low, while the A770 16GB averages 45fps, meh). Additionally, the GPU power draw is around 145W, well below the card's maximum of 190W.

@DavidGraham commented on a different post that the Intel A770 was performing below expectations in this game. Odd, considering the game runs on Vulkan, which is a strong point of Intel Arc, and again, Doom Eternal performed very well even with RT on.
 
You are a lucky guy anyways. Are you using path tracing?
Yeah, I am, but it makes no difference whether I use it or not. I can turn on DLSS and frame gen, but the problems remain. Random drops into the low 30s that last for a few seconds.
Switching to Ultra textures reduces performance by about 7 to 10fps. With the default settings on High there's a noticeable improvement in framerate, but it's still low (30-31fps). I observed GPU utilization hitting 100%, yet the overall performance remains underwhelming, especially considering how the game runs on an RTX 3050 4GB laptop variant (39fps average on Low, while the A770 16GB averages 45fps, meh). Additionally, the GPU power draw is around 145W, well below the card's maximum of 190W.

@DavidGraham commented on a different post that the Intel A770 was performing below expectations in this game. Odd, considering the game runs on Vulkan, which is a strong point of Intel Arc, and again, Doom Eternal performed very well even with RT on.
It seems that this game has performance issues for a lot of people. Alex mentioned that frame generation does not work with DLSS, only with TAA. There's also no XeSS or FSR.
 
Yeah, I am, but it makes no difference whether I use it or not. I can turn on DLSS and frame gen, but the problems remain. Random drops into the low 30s that last for a few seconds.

It seems that this game has performance issues for a lot of people. Alex mentioned that frame generation does not work with DLSS, only with TAA. There's also no XeSS or FSR.
Yes, XeSS would be ideal to have too; dunno why a game by MS doesn't have it when they developed Auto-SR. :(

Maybe I will try the DF optimised settings now that you mention it, but I don't think it's going to make a huge difference. The game seems to be made for Nvidia GPUs, and you have the best gaming GPU made to date; I don't get it.

Have you tried lowering your texture settings? That seems to make the biggest difference for me without touching anything else.
 
If life were normal, the A770 should be performing more akin to the 3060 or 3060 Ti, not too far from this (minus DLSS, I guess) (RTX 3070 footage).

 
A770 performing close to a 3070? I think you mean 3060.
My bad, I meant the 3070, but yes, the 3060 or 3060 Ti would be more accurate (with RT on it isn't usually that far off, 'cos of VRAM I think). The 3070 isn't usually that far away either, not 90fps away from the A770. I am going to edit the previous post.
 
@DavidGraham commented on a different post that the Intel A770 was performing below the expectations in this game. Odd taking into account it runs under Vulkan which is a strong point of Intel ARC and again, Doom Eternal performed very well even with RT on.
The Alchemist architecture is slow in games that make heavy use of ExecuteIndirect-type instructions. Tom Petersen (Intel's GPU guy) talked about this. It was an oversight on Intel's part: they didn't realize how important this instruction is to performance and so emulated it in software without native hardware support, which is why Alchemist falls behind in any game that uses it a lot.

Battlemage fixed this and added native hardware support, making it between 7x and 12x faster than Alchemist at executing this instruction.
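For context, "ExecuteIndirect-style" work is the GPU-driven rendering path: draw parameters live in a GPU buffer (often written by a culling compute shader) and the hardware consumes them directly, instead of the CPU recording every draw call. Here's a rough sketch of what the Vulkan equivalent looks like on the recording side; the helper function name is hypothetical, vkCmdDrawIndexedIndirect is the actual API call, and the pipeline/buffers are assumed to be set up already:

```cpp
// Rough sketch of GPU-driven (indirect) drawing in Vulkan.
// Assumes `cmd` is a command buffer in the recording state with a graphics
// pipeline, vertex/index buffers and descriptors already bound.
// `indirectBuffer` holds an array of VkDrawIndexedIndirectCommand structs
// (indexCount, instanceCount, firstIndex, vertexOffset, firstInstance),
// typically written by a culling compute shader earlier in the frame.
#include <vulkan/vulkan.h>

void record_indirect_draws(VkCommandBuffer cmd,   // hypothetical helper, for illustration
                           VkBuffer indirectBuffer,
                           uint32_t drawCount) {
    vkCmdDrawIndexedIndirect(cmd,
                             indirectBuffer,
                             0,                                     // byte offset into the buffer
                             drawCount,                             // number of draw commands to consume
                             sizeof(VkDrawIndexedIndirectCommand)); // stride between commands
}
```

If the hardware can't walk that buffer natively, the driver has to patch things up on the CPU or with extra GPU passes, which is presumably where Alchemist loses so much time in games built around this pattern.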
 