Digital Foundry Article Technical Discussion [2024]

CBR has plenty of interlacing artifacts. In motion it reminds me a bit of FSR. Ghosting varies depending on the DLSS version, and with DLSS you can choose which one to use. I would never prefer CBR to current DLSS as long as the DLSS implementation doesn't oversharpen.

CBR vs. native at 6:26:


DLSS 2.0 Quality is sharper and flickers much less. Even DLSS Performance (1080p) looks better than CBR.
 
TAA has horrendous ghosting as well, and oftentimes the two are tied (when TAA ghosts hard in a game, DLSS ghosts as well; it's an issue with temporal solutions in general), but ghosting with DLSS is often significantly lower. However, we are debating the reconstruction of fine details and image components here, and DLSS2 is without question the superior solution there.

I've rarely, if ever, noticed ghosting with just pure TAA on, and have seen it way more with DLSS.
 
Against DLSS1, sure, CB wins. But against DLSS2? Of course not. BF5 has the very first DLSS1 implementation.
I'm sorry, but no: some games have horrendous ghosting with DLSS2, to a level that I've not seen in any CB game I've played.
Mod: Can people please link to samples? I doubt many members here have a clear catalogue of upscaling techniques and rendering artefacts for many games across platforms. Let's see what CB and DLSS1 and DLSS2 look like. ;)
 
Mod: Can people please link to samples? I doubt many members here have a clear catalogue of upscaling techniques and rendering artefacts for many games across platforms. Let's see what CB and DLSS1 and DLSS2 look like. ;)

Not only this, but let's make sure we're comparing across equal internal resolutions.

CBR is usually half resolution. This is 12.5% MORE internal resolution than DLSS Quality mode.

So to be even comparable to DLSS, let alone better, CBR should be consistently beating DLSS Quality.
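To put rough numbers on that claim (my own back-of-envelope arithmetic, assuming a 4K output target and DLSS Quality's standard 2/3 per-axis scale; actual implementations can vary):

```
# Internal pixel counts at a 4K output target (illustrative arithmetic only)
output = 3840 * 2160                              # 8,294,400 pixels

dlss_quality = (3840 * 2 // 3) * (2160 * 2 // 3)  # 2560 x 1440 = 3,686,400 pixels
cbr_half = output // 2                            # checkerboard shades half the pixels per frame: 4,147,200

print(cbr_half / dlss_quality)                    # ~1.125 -> CBR starts from ~12.5% more pixels
```

So in raw sample count, half-res CBR actually starts with a small advantage over DLSS Quality; the interesting part is what each technique does with those samples.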
 
Not only this, but let's make sure we're comparing across equal internal resolutions.

CBR is usually half resolution. This is 12.5% MORE internal resolution than DLSS Quality mode.

So to be even comparable to DLSS, let alone better, CBR should be consistently beating DLSS Quality.
But DLSS Quality is much more costly than CBR running on Pro! Imagine running DLSS2 Quality mode on Pro (or even on a PS5)! Those comparisons are useless for the vast majority of gamers. Comparing very cheap CBR running on Pro vs DLSS2 running exclusively on Nvidia is pointless. It's just for $1,500 PC owners happy to see their rig giving objectively superior results to a $400 Pro.

What they should do is compare FSR2 Quality vs CBR, both running on PS5. Now that would be enlightening. But as I posted previously, we can already make some assumptions based on different games, and proper CBR should logically be much superior to FSR2. There is something very wrong with the high cost of FSR2 running at 60fps on consoles vs CBR.
 
Not only this, but let's make sure we're comparing across equal internal resolutions.

CBR is usually half resolution. This is 12.5% MORE internal resolution than DLSS Quality mode.

So to be even comparable to DLSS, let alone better, CBR should be consistently beating DLSS Quality.
At this point people are talking in such extremes that nothing really matters other than comparing a game rendering with CBR versus one rendering with DLSS1/2/FSR2 etc.; it should be obvious that CBR is subjectively 'better' without needing analysis. If the difference isn't obvious, then the calls to replace FSR2 with something 'obviously better' don't work.

In short, A/B comparisons of CBR, DLSS and FSR in the same games, regardless of costs, need to be presented that show CBR is 'better'. Then, if so, we can consider whether it's technically feasible, based on costs, to use CBR versus other techniques.
 
It's just for $1,500 PC owners happy to see their rig giving objectively superior results to a $400 Pro.
AI upscaling is much more beneficial to mid and low-tier GPUs than it is to high-end ones so not sure what you mean here.

Remember how everyone was cheering, saying their old 1080s had just gained a couple of years of useful life, when FSR2 and 3 were said to be hardware agnostic?

It makes perfect sense to compare DLSS to CBR. This train of thought that PCs should always drop everything to get down to console levels for "fair" comparisons is honestly pretty annoying. For instance, TLOU Part I runs like ass on PC, but you can claw back some major gains with DLSS Quality at 4K, effectively letting much weaker GPUs deliver image quality on par with the PS5 version at similar or at times better performance. Yet no one makes the comparison, because then console gamers whine that it's not fair since the PS5 is running the game at native resolution. So what? DLSS is there and most people use it, so you should 100% do that comparison.

It's just like when people insist DF pairs their cards with pieces of crap CPUs from 6 years ago for fair comparisons against the PS5 as if budget CPUs that are much faster aren't available.

The point of PCs isn't to behave like consoles so it's crazy to me that many always insist that we don't use a slew of PC-specific features only so they can be on even grounds with consoles.
 
They can only be sure for the PC version, not the console with its underclocked laptop CPU. From 60-80fps, I'd say as a general rule we can assume most games will be CPU limited on consoles, notably since many games use DRS, which reduces the risk of being GPU limited.

This is why you need to use a modest CPU on the PC side as well, to see which games start to be CPU limited with such a weaker CPU! This is not some unreasonable request! This has been done for decades in PC vs PC benchmarks, for good reasons.

I'm not sure why this is still being argued.

Flappy has already demonstrated that the two biggest wins on the PC side (Hitman 3 and Monster Hunter Rise) are clearly not CPU limited in the tested areas, unless you accept the PS5 CPU is 2-3x slower than a budget 6-core i5-12400.

It also ignores the basic principle that devs will try to max out console hardware, with resolution being one of the easiest ways of achieving this on the GPU. You even raise DRS yourself above - the whole point of which is to maximise GPU usage at the target framerate.

On top of that, your argument basically assumes that Digital Foundry, effectively the number one cross console/PC performance analysis site on the web, doesn't understand what they're doing, and that you personally know better.

In light of all of the above, I think you really need to present one hell of a compelling case, backed up by solid evidence, that the PS5 is indeed CPU limited in those games if you want to continue making that claim. So far, all that's been presented are assertions, without any evidence or even solid reasoning to back them up.
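On the DRS point above, the reason it pushes games toward being GPU limited is easy to picture as a feedback loop on GPU frame time. A minimal sketch (the step sizes, bounds and function name here are made up for illustration; real engines predict ahead and usually scale per axis):

```
# Toy dynamic-resolution-scaling loop: keep GPU frame time near the budget
# by trading render resolution, which is why DRS titles tend to sit at the GPU limit.
TARGET_MS = 16.7                 # 60fps frame budget (illustrative)
MIN_SCALE, MAX_SCALE = 0.5, 1.0  # e.g. 1080p..2160p internal at a 4K output

def update_scale(scale: float, gpu_ms: float) -> float:
    if gpu_ms > TARGET_MS:            # over budget: drop resolution
        scale -= 0.05
    elif gpu_ms < TARGET_MS * 0.9:    # comfortably under budget: raise it
        scale += 0.05
    return min(MAX_SCALE, max(MIN_SCALE, scale))
```

The controller only watches GPU time, i.e. its whole job is to keep the GPU saturated at the target framerate.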

But DLSS Quality is much more costly than CBR running on Pro! Imagine running DLSS2 Quality mode on Pro (or even on a PS5)! Those comparisons are useless for the vast majority of gamers. Comparing very cheap CBR running on Pro vs DLSS2 running exclusively on Nvidia is pointless. It's just for $1,500 PC owners happy to see their rig giving objectively superior results to a $400 Pro.

But the question wasn't whether DLSS is a better option for PS5; obviously that's not the case, since it's not available on the PS5 and never will be. The ongoing debate was simply around whether CBR can produce a better output than DLSS. So in performance terms the question there is whether, on the same hardware (i.e. Nvidia hardware), CBR is cheaper than DLSS at the same internal resolution. Maybe it is, DLSS certainly isn't free, but I've yet to see any evidence one way or the other on that. I do agree though that the actual measure should be based on final image quality given a fixed performance saving from upscaling, so not necessarily the same internal resolution if one solution is more expensive than the other. The main point of my post though was to stop people comparing the likes of 1/2 resolution CBR to 1/4 resolution DLSS or similar.

What they should do is compare FSR2 Quality vs CBR, both running on PS5. Now that would be enlightening. But as I posted previously, we can already make some assumptions based on different games, and proper CBR should logically be much superior to FSR2. There is something very wrong with the high cost of FSR2 running at 60fps on consoles vs CBR.

I agree this would be an interesting comparison. I don't agree that there is any logical reason to assume CBR should look better than FSR2, though, given the same performance savings (which, in the absence of compelling evidence to the contrary, I will assume means the same or at least a very similar internal resolution). Essentially the comparison point here, assuming half internal res CBR, should be the best CBR implementations vs the best FSR2 Quality (internal 1440p for 4K output) implementations.
 
It makes perfect sense to compare DLSS to CBR. This train of thought that PCs should always drop everything to get down to console levels for "fair" comparisons is honestly pretty annoying.
I think the issue is different folk are looking for different things from comparisons, where a comparison can only look at one thing at a time. If a comparison isn't comparing what you wish it was, there's no point arguing over where it's deviated from your preference. Just let it slide and let those interested in the comparison as it is, what it's trying to do, talk about it. ;)
 
Just like any other upscaling techniques... some developers do a better job of integrating them than others... and some techniques play nicer with certain games/visual styles than others. Sometimes devs make incorrect choices for their games.. and sometimes they nail it. In one game they can be comparable, and in the next one has a clear advantage over the other.

While I think that in general we can agree that DLSS does better in most situations, there are times when others come extremely close. CBR can be very convincing in the hands of the right developers and with the right material. In the end you can only hope the developers make the right calls for their games.. and deal with the results.

It will be interesting when Sony's rumored PS5 Pro AI upscaling enters the mix. Microsoft has DirectSR coming for Windows, and it's not unreasonable to think a future Xbox could support it as well. So it's about to get a whole lot more complicated.. but it would be nice if some of these techniques completely replaced and killed off old ones so we never have to hear about them again.
 
Of course DLSS2 is better than anything running on cheap $400 consoles. But we'll never get that; on consoles we'll only get some cheap variant, maybe in the near future on PS5 Pro (though I actually think we won't).

For now the problem on console is not DLSS2, as it would never run there anyway; consoles don't have the required hardware (and software). The problem is that damn FSR2 that developers use in their games when, IMHO, other, better solutions are available (but quite a bit harder to implement properly). The problem is developers have too many versions to optimize, so they choose the easiest way: ideally one implementation easily available on most platforms, like FSR2.
 
Hard disagree here. I think Guerrilla, with Decima, basically solved the reconstruction problem on consoles for sharp, clean 60fps games with Horizon Forbidden West on PS5 (the native res being only half of 1800p via CBR, so about native 1270p, which is pretty decent for alphas). Every other developer should look at what they have done here. With FSR2 they are reconstructing from too low a resolution, and the cost is super high because they try to target too high a final output. Here for Skull & Bones:

Ah crap, you're exactly right, completely forgot that HZD:FW was 1800p. Yeah, it took a bit to get it settled, but checkerboarding in there is miles ahead of FSR2 scaling from comparative resolutions - at least with what I've seen so far. I'd say Remnant 2, at least, had a decent FSR2 implementation considering how low the internal res was.

Now CBR in FW isn't perfect; it can mangle some aspects such as the neon signs which almost look unrecognizable compared to native, but those are relatively rare.

Why don't developers use CBR more? Well, because it's hard to get right, obviously! Plenty of things must be looked at, some effects must be done at a higher resolution, and so on. Even Guerrilla had trouble getting it right at first, and they are one of the most talented studios out there. But once the work has been done, the 30fps mode has been rendered obsolete for many.

That's what I've heard, it's a bitch to wrangle. But it probably could have been squeezed into an AAAA game with an 11-year development cycle, I think.
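For anyone unfamiliar with why it's hard to wrangle: each frame only shades half the pixels in a 2x2-quad checkerboard pattern, and the missing half has to be filled from the previous frame. A toy sketch of just the merge step (this deliberately omits the motion-vector reprojection, ID buffers and sample rejection that make real implementations hard, and the full-size half_render buffer is my own simplification):

```
import numpy as np

def checkerboard_resolve(prev_full: np.ndarray, half_render: np.ndarray, frame_idx: int) -> np.ndarray:
    """Toy CBR resolve: copy this frame's freshly shaded checkerboard cells over the
    previous full-resolution frame. Real implementations reproject prev_full with
    motion vectors and reject stale samples; that is the hard part omitted here."""
    h, w = prev_full.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # 2x2 pixel quads; alternate which set of quads is freshly shaded each frame
    fresh = ((xs // 2 + ys // 2) % 2) == (frame_idx % 2)
    out = prev_full.copy()
    out[fresh] = half_render[fresh]   # everything else is reused from last frame
    return out
```

The artifacts people associate with CBR (the interlacing-like patterns and the mangled neon signs mentioned earlier) come from that reuse step going wrong, which is why it takes per-game tuning.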

They can only be sure for the PC version, not the console with its underclocked laptop CPU. From 60-80fps, I'd say as a general rule we can assume most games will be CPU limited on consoles, notably since many games use DRS, which reduces the risk of being GPU limited.

This is why you need to use a modest CPU on the PC side as well, to see which games start to be CPU limited with such a weaker CPU! This is not some unreasonable request! This has been done for decades in PC vs PC benchmarks, for good reasons.

It's a rather pointless request though when the goal of the video was to compare GPUs, and you do that by purposefully stressing GPU limited scenarios, which is exactly what Rich did. If the bulk of his video had been testing 120fps uncapped modes on the PS5 vs the PC, and the video was more about comparing equivalent budget gaming system performance, the complaint about the CPU would have merit, but that's not what he did, by design.

As has been detailed in this thread, there are only two benchmarks in that entire suite where there's even the faintest possibility of them being CPU limited, and we have data on how those games perform with weak CPUs - they're not remotely bottlenecked by CPU performance. Absolutely ancient CPUs, weaker than the PS5's (which is not a weak CPU!), can blow past those framerates when put into CPU limited scenarios in those games. This, on a system with the 'overhead' of a PC API vs a console.

As I said with MHR - perhaps the strongest (cough) case for being CPU limited - you don't even need to look at PC benchmarks for that! The supersampled ~6k mode had the exact same performance discrepancy between the PS5 and 6700 as the 4k, 120fps mode. It's GPU limited in both. There is simply no rational reason to believe the CPU used had any bearing on the results shown in that video with the tests performed. It's fine to want a different video testing different things, but there was nothing wrong with the methodology used in that one, it properly tested exactly what it was testing.

Here's MHR in a CPU-limited scenario on a 5600X btw: ~300fps. It's time to give up this argument folks.

 
Ah crap, you're exactly right, completely forgot that HZD:FW was 1800p. Yeah, it took a bit to get it settled, but checkerboarding in there is miles ahead of FSR2 scaling from comparative resolutions - at least with what I've seen so far. I'd say Remnant 2, at least, had a decent FSR2 implementation considering how low the internal res was.

Crunching the math there, Forbidden West's solution on PS5 (1800p output with 50% CBR internal res) is more or less equivalent to 4K output with FSR2 Balanced. So that will make an interesting comparison in a few weeks when it launches on PC, since I assume it will include all the upscaling modes, hopefully including the PS5 CBR solution.
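Checking that math (my own arithmetic, assuming 16:9 frames and FSR2 Balanced's documented 1.7x per-axis ratio; exact figures vary slightly by title):

```
hfw_internal = (3200 * 1800) // 2                    # 1800p output, 50% CBR -> 2,880,000 pixels
fsr2_balanced_4k = int(3840 / 1.7) * int(2160 / 1.7) # ~2258 x 1270 -> ~2.87M pixels

# height of a full 16:9 frame with the same pixel count as the CBR internal res
print((hfw_internal * 9 / 16) ** 0.5)                # ~1273 -> the "about native 1270p" figure above
```

The two land within roughly a percent of each other in internal pixel count, which is what makes the PC port comparison interesting.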
 
Crunching the math there, Forbidden West's solution on PS5 (1800p output with 50% CBR internal res) is more or less equivalent to 4K output with FSR2 Balanced. So that will make an interesting comparison in a few weeks when it launches on PC, since I assume it will include all the upscaling modes, hopefully including the PS5 CBR solution.
What we need to compare is FSR2 at settings that have the same cost as 1800p CBR (assuming the same great implementation used on PS5, obviously). Native res alone won't tell you the overall cost.
 

00:00:00 Introduction
00:00:43 The Good - Frame Rates and PC UX
00:04:05 The Good - Sky and Water Simulation
00:05:22 The Bad - Water Reflections
00:08:13 The Bad - Animations, Textures, and LOD
00:11:12 The Bad - Lighting, Particles and Post-Processing
00:13:00 The Ugly - Xbox Series X/S and PS5 Performance Mode Image Quality
00:16:25 The Ugly - Quality Mode Controls
00:17:50 The Incomprehensible - AAAA Game Design and Bugs
00:21:34 Video avast!
Very low quality particles are one of those aspects where resolution affects the IQ more, along with things like transparencies, 'cos if you look at the overall image, the difference is not that big.

This is a picture I took the other day with Elden Ring running at 800x450p and the same picture of the game running at 2560x1440p:

 
It would be worth taking a closer look at the Starfield FSR2 implementation. With the latest update, the already very good quality was improved even further; the image is almost artifact-free. With a 1440p render resolution on Series X, it gives a better picture than many other games at native 4K. I emphasized this before, but it is even more true now: FSR2 is not just a switch that you turn on and that's it. If it is implemented well, the result is excellent.
 
It would be worth taking a closer look at the Starfield FSR2 implementation. With the latest update, the already very good quality was improved even further; the image is almost artifact-free. With a 1440p render resolution on Series X, it gives a better picture than many other games at native 4K. I emphasized this before, but it is even more true now: FSR2 is not just a switch that you turn on and that's it. If it is implemented well, the result is excellent.
Reviews seem to indicate not much has changed with the latest update.
 
Obviously with YT compression I'm not sure how much value this brings, especially since performance mode is at 1800p, so it's not really a 'native vs CBR' comparison. For what it's worth though, I made a brief video; it zooms in and slows down about 30% of the way in:

(motion blur was disabled)

 