Digital Foundry Article Technical Discussion [2025]

Any idea what they handle differently?
Yes, Snowdrop was a from-scratch engine built for The Division, and it's their main engine for all future games going forward.
Anvil will only be used for the AC franchise because it's so custom-developed for it. Dunia is their other engine, for Far Cry; from what I've heard, if a new one is coming, they may move that to Snowdrop as well.

Effectively, Snowdrop is their most modern engine, where all their active development is taking place. Snowdrop powers Avatar and SW: Outlaws, and it's also The Division and that toy game on Nintendo that unfortunately flopped really badly. But yes, it has all the features to support a super-high-quality GAAS and everything you need for a solid SP title. It is probably among the best streaming engines out there today.
 
Snowdrop is great but I can’t think of any games on it that aren’t open world. The recent Prince of Persia game was built in Unity. It makes sense to sunset Dunia as Snowdrop does all of the same things.
 
What are Snowdrop and Anvil and who uses them, and why two engines?
Snowdrop and Anvil are only some of the engines Ubisoft uses.

They have the Dunia engine for the Far Cry series. It's among the oldest in Ubisoft's arsenal, dating back to Far Cry 2; it was upgraded with modern features and ray tracing in Far Cry 6, though it still lacks modern geometry representations.

They have the Anvil Next engine, used for the Assassin's Creed, Ghost Recon and Rainbow Six games. It's also among the oldest, only recently upgraded with modern features (ray tracing, modern geometry, etc.) in Assassin's Creed Shadows.

They also have the Snowdrop engine, which is technically the most advanced in their arsenal. First used in The Division games, then in Avatar: Frontiers of Pandora and Star Wars Outlaws, it sports the latest features (ray tracing, modern geometry) and even path tracing.

They have the Disrupt engine too, used for the Watch Dogs games; it also sports ray tracing but lacks advanced geometry.

The RealBend engine is also an old one, used for The Crew racing series; it lacks modern features, though.

Ubisoft follows a custom-engine philosophy: each studio is allowed to develop an engine suited to its task and the game's genre, and studios can use other studios' custom engines too. It's safe to say that Ubisoft has the largest collection of custom engines under its belt in the world right now.
 
Snowdrop is great but I can’t think of any games on it that aren’t open world. The recent Prince of Persia game was built in Unity. It makes sense to sunset Dunia as Snowdrop does all of the same things.
Starlink: Battle for Atlas. I guess that could be considered not open world?
 

DF Direct Weekly #206: Xbox/Steam Leak, The Last of Us Part 2 PC Specs, DXR 1.2 Confirmed!


0:00:00 Introduction
0:00:58 News 1: Xbox App Steam mock-up leaks
0:14:03 News 2: Assassin’s Creed Shadows disappoints on Mac + Supporter Shadows questions!
0:37:56 News 3: DXR 1.2 announced
0:49:02 News 4: The Last of Us Part 2 PC specs drop
0:58:56 News 5: Rise of the Ronin PC tested!
1:08:56 Supporter Q1: Should developers include a console settings preset in their PC versions?
1:19:38 Supporter Q2: Will Switch 2 have a 120Hz, VRR, HDR capable screen?
1:23:30 Supporter Q3: How long will it take for handhelds to have PS5-level power?
1:28:56 Supporter Q4: Should reviewers test PC gaming in multitasking-heavy scenarios?
1:35:24 Supporter Q5: If consoles were to disappear, would developers be able to more closely optimize their games for PC?
1:40:27 Supporter Q6: What’s Alex’s recovery regimen after reviewing a bad PC port?
 

1:28:56 Supporter Q4: Should reviewers test PC gaming in multitasking-heavy scenarios?

Is there any good testing of multitasking while gaming? I'd like to think my 13600K would be well equipped for this, but that's an assumption with no evidence. Some background tasks, and especially some webpages, can use a surprising amount of memory and CPU.
 
Tech outlets are more concerned with having a convenient and consistent test environment than they are with generating test results that map to real world usage. IMO, basically all benchmark results these days have to be treated as a synthetic test with an ample amount of YMMV. Presumably it's the same reasoning that keeps outlets from doing CPU tests on MMOs and other CPU heavy, high variance multiplayer games.

The last time I used a single monitor setup was around 2003. After that it was game + some mixture of browser, winamp, mirc, ventrilo. These days it's game + browser, youtube, twitch, discord. Honestly the biggest performance impact I see isn't the game, but rather the occasional choppy video playback on the second display. I often just resort to disabling the hardware acceleration in the browser, or moving the second monitor to the iGPU or another GPU.

I remember being annoyed by some of the first reviews of AMD's heterogeneous CPUs, as I felt their methodology for testing the Game Bar core-parking behavior didn't adequately replicate messy real-world usage where you're alt-tabbing in and out of a game, launching and closing applications, etc. I'm still not entirely sure how well the heterogeneous X3D CPUs cope with scheduling in worst-case scenarios.

LTT did a video on performance impact of multi-monitor setups last year, but he's using a 4090 on a system with 1-4 monitors and just running youtube video playback, so it's hard to extrapolate from that to lesser CPU and GPU setups, and more varied background application loads.
 
I remember being annoyed by some of the first reviews of AMD's heterogeneous CPUs, as I felt their methodology for testing the Game Bar core-parking behavior didn't adequately replicate messy real-world usage where you're alt-tabbing in and out of a game, launching and closing applications, etc. I'm still not entirely sure how well the heterogeneous X3D CPUs cope with scheduling in worst-case scenarios.

If we only consider gaming and non-gaming workloads, right now it seems that Game Bar always restricts known games to the 3D V-Cache cores. I'm not sure whether Windows is smart enough to allocate non-game workloads only to the non-V-Cache cores when a heavy background task runs alongside a game, but that's not necessarily better, depending on what you want.
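For what it's worth, the manual version of that allocation is easy to do at the OS level. A minimal Linux sketch using only the standard library; the function name and the CCD core numbering in the comment are assumptions for illustration, since real numbering varies by CPU, board, and firmware:

```python
import os

def pin_current_process(cores):
    # Restrict the calling process to the given CPU set
    # (os.sched_setaffinity is a Linux-only API; pid 0 = current process).
    os.sched_setaffinity(0, set(cores))
    return os.sched_getaffinity(0)

# Hypothetical layout: on a dual-CCD X3D part, cores 0-7 might be the
# V-Cache CCD and 8-15 the frequency CCD (an assumption, not a spec):
# pin_current_process(range(8, 16))  # push background work off CCD0
```

Windows has equivalent knobs (`start /affinity`, Task Manager's affinity dialog), but the point is the same: today you have to do this by hand per process.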

Another issue is that when a game is doing some sort of non-interactive heavy workload, such as shader compilation, it won't use all cores if Game Bar classifies it as a game. Ideally a game should be able to use all cores for these tasks (not just shader compilation but loading screens, etc.), but right now that's probably very difficult without explicit support from the games themselves.
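The "use every core for non-interactive batches" idea is simple to sketch. A toy illustration, where `compile_shader` is a stand-in for a real compiler invocation, not any actual engine API:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def compile_shader(source: str) -> str:
    # Stand-in for a real shader-compiler call; a native compiler
    # would release the GIL, so worker threads scale across cores.
    return source.upper()

def compile_all(sources):
    # Fan the batch out over every available core: non-interactive
    # work shouldn't be confined to whatever subset of cores the
    # scheduler reserves for "game" threads.
    with ThreadPoolExecutor(max_workers=os.cpu_count() or 1) as pool:
        return list(pool.map(compile_shader, sources))
```

The hard part isn't the fan-out, it's that the process-level "this is a game" classification would need to be lifted for the duration of the batch, which is exactly the explicit support the post is talking about.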

LTT did a video on performance impact of multi-monitor setups last year, but he's using a 4090 on a system with 1-4 monitors and just running youtube video playback, so it's hard to extrapolate from that to lesser CPU and GPU setups, and more varied background application loads.

I also have some problems with a similar setup. Sometimes when I'm playing a game I keep YouTube playing a video on another monitor, but in a few games (notably Final Fantasy 14) the video won't play smoothly. Turning off "Hardware-accelerated GPU scheduling" helps, but that disables frame generation. Turning off hardware acceleration in Chrome also helps, but that's not ideal either. I used to connect my second monitor to the iGPU, which solved the problem to an extent, but not entirely. I found out this is probably because the game eats up all GPU resources: if I restrict the framerate to 60 fps, playback is fine. Unfortunately, Final Fantasy 14 does not have a 120 fps option.
 
I also have some problems with a similar setup. Sometimes when I'm playing a game I keep YouTube playing a video on another monitor, but in a few games (notably Final Fantasy 14) the video won't play smoothly. Turning off "Hardware-accelerated GPU scheduling" helps, but that disables frame generation. Turning off hardware acceleration in Chrome also helps, but that's not ideal either. I used to connect my second monitor to the iGPU, which solved the problem to an extent, but not entirely. I found out this is probably because the game eats up all GPU resources: if I restrict the framerate to 60 fps, playback is fine. Unfortunately, Final Fantasy 14 does not have a 120 fps option.
I use Edge as my secondary-monitor YouTube browser, with hardware acceleration disabled. Edge is not my preferred browser, so this lets me avoid toggling hardware acceleration in my main browser when I'm watching YouTube while gaming. Beyond disabling HA, I've found Edge to be slightly smoother than most browsers for YouTube. The only disadvantage is that it doesn't let me watch HDR content with hardware acceleration disabled.
 
I use Edge as my secondary-monitor YouTube browser, with hardware acceleration disabled. Edge is not my preferred browser, so this lets me avoid toggling hardware acceleration in my main browser when I'm watching YouTube while gaming. Beyond disabling HA, I've found Edge to be slightly smoother than most browsers for YouTube. The only disadvantage is that it doesn't let me watch HDR content with hardware acceleration disabled.

Yeah, this is also a nice solution. I think if NVIDIA or AMD provided a limiter in the driver to keep the GPU from running near 100%, it would solve most of these problems, but I'm not sure that's easy or even possible.
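Mechanically, a cap like that is just frame pacing: spend the frame, then sleep out the leftover budget so the GPU/CPU are never driven flat-out. A minimal sketch of the idea, where `render` is a placeholder for real per-frame work (a driver- or game-level limiter would do this with far better timing precision):

```python
import time

def run_capped(render, fps_cap=60.0, frames=60):
    # Sleep away the unused portion of each frame budget so the
    # hardware idles between frames instead of running at 100%.
    budget = 1.0 / fps_cap
    for _ in range(frames):
        start = time.perf_counter()
        render()  # placeholder for the real frame work
        leftover = budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)
```

In practice the driver control panels and tools like RTSS already expose framerate caps, which is why capping to 60 fps in-game works as a workaround; a "never exceed N% utilization" limiter is the part that doesn't exist.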
 

00:00 Introduction/tech features
02:34 Res/Comparisons
04:10 Anti-aliasing (PC)
06:13 Shadows (PC)
07:48 Reflections (PC)
08:50 Performance
10:09 Verdict

Series S: 1080p fixed, 30fps
PS5/SX: 1440p->1800p DRS, solid 60
PS5 Pro: 1440p->2160p DRS (rarely reaches 4K), solid 60 (no higher-framerate option)

  • Very solid performance on all platforms. Very good 60, with just a weird one-off on the PS5 Pro that was resolved after a reload; no stutter on PC.
  • Consoles use the High texture setting, one notch below PC's Ultra. Textures in general are not great at points.
  • The screen-space reflection setting is Low on consoles, which produces a large amount of shimmer; the higher settings on PC largely resolve this.
  • The biggest issue, though, is image quality due to the lack of any temporal AA solution (!), which also means no FSR/DLSS/PSSR. Horrible flickering, but hey, no blur, so rejoice, r/FuckTAA crowd, I guess? :)
 
AC: Shadows tech interview



And accompanying video


Great video. I still think Unity looks quite a bit better outside of the issues with distance rendering. The material response and lighting still have a much more artificial look in AC Shadows.
 
Great video. I still think Unity looks quite a bit better outside of the issues with distance rendering. The material response and lighting still have a much more artificial look in AC Shadows.

I'm currently playing Unity and I wouldn't say it looks better than Shadows, but it is very close. Material quality in Shadows isn't much of an upgrade over the baked stuff; stone and wood look about the same. Clothing on random NPCs in Shadows is much better. That said, I'm comparing Unity at native 4K to compressed YouTube videos of Shadows, so it's not a fair comparison.
 
I'm currently playing Unity and I wouldn't say it looks better than Shadows, but it is very close. Material quality in Shadows isn't much of an upgrade over the baked stuff; stone and wood look about the same. Clothing on random NPCs in Shadows is much better. That said, I'm comparing Unity at native 4K to compressed YouTube videos of Shadows, so it's not a fair comparison.
The seams in Unity show on rooftops and in the seemingly unfinished outskirts of the map, where the LOD issues can be distracting, but otherwise it's hard for me to understand how anyone could think Shadows doesn't look noticeably more gamey.
 

Shadows has a lot more foliage and is a more colorful game for sure. Unity has a lot of white, brown and gray. Doesn’t make it less gamey.

When comparing like for like, a tree in Unity doesn't look more realistic than a tree in Shadows, to my eyes. Materials really do need an upgrade, though. Maybe that's the next frontier.
 