Digital Foundry Article Technical Discussion [2025]

That is something which I don't understand: can a frame only include information from the same "timeline"? What happens when a frame contains information from past frame(s)? Isn't that just a "hallucinated frame", too?
We will need a bit more information on DLSS 4 FG to know that. But if we're reading the slides right, all FG is based off the current rendered frame, so it just produces the next three interpolated frames in between the past and present ones.
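To make the timing concrete, here's a minimal sketch of how 4x interpolation pacing would look, assuming the three generated frames are spaced evenly between two rendered frames (Nvidia hasn't published the actual pacing logic, so treat this as illustrative only):

```python
# Sketch of 4x frame-gen pacing, assuming the three generated frames are
# spaced evenly between two rendered frames (the real pacing logic is
# not public).

def presented_times(t0: float, t1: float, factor: int = 4) -> list[float]:
    """Presentation timestamps covering [t0, t1)."""
    step = (t1 - t0) / factor
    return [t0 + i * step for i in range(factor)]

# Rendered frames 1/30 s apart -> presented frames 1/120 s apart.
print(presented_times(0.0, 1 / 30))   # [0.0, 0.00833, 0.01667, 0.025]
```

Note what this makes visible: none of the in-between frames can go out until the frame at t1 has actually been rendered, which is where interpolation's latency cost comes from.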
 
Nice debate here.

I understand everyone's points and they are all valid. I'm currently playing DCS World (a flight sim) and notice that FG doesn't help with the latency issues when rendering the cockpit mask. So even though the FPS reads 120fps, it's not helping that particular rendering window. I am glad that Nvidia has at least developed techniques for both FPS and latency (with their Reflex tech). To me they sort of go hand in hand. They are both important, and I think reviews should cover both (i.e. FG FPS and non-FG FPS, and/or Reflex).
 
I mean, look at all of the people claiming that the one-plus frame of latency incurred by frame gen is a deal breaker, when it often adds less than 20% to end-to-end latency.
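For scale, a back-of-envelope calculation (both numbers below are assumptions for illustration, not measurements):

```python
# Illustrative arithmetic only; both values below are assumptions, not
# measurements. Interpolation-based FG has to hold back one rendered
# frame, so it adds roughly one base frame time to the chain.

base_fps = 60
frame_time_ms = 1000 / base_fps        # ~16.7 ms per rendered frame
baseline_e2e_ms = 90                   # assumed input-to-photon latency

added_pct = 100 * frame_time_ms / baseline_e2e_ms
print(f"FG adds ~{frame_time_ms:.1f} ms: ~{added_pct:.0f}% "
      f"on top of {baseline_e2e_ms} ms end to end")   # -> ~19%
```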

Ha, great observation. I also love how this community can simultaneously not care about SSR or shadow map artifacts while being completely devastated by upscaling and framegen artifacts.

I don’t think we can dictate the minimum raw frame rate that someone would find acceptable for frame gen. There’s too much variation by game and personal taste. I was messing around the other day with AC Unity capped at 45fps and it wasn’t bad at all on a controller from the couch.

The only solution is to let people play the way they want to play.
 
I wonder what would happen if developers took full advantage of nVidia hardware like they do with consoles; the incredible things we would see.

It's a different story when we look at AI computational performance. We know the RTX 50-series will have FP4 number format support, but just as important, it seems to have twice the compute per tensor core of the RTX 40-series. That's not enough compute for the 5070 to surpass the 4090, but it's 'only' about 25% slower in theoretical performance. And if something can leverage FP4 on the 5070 where the 4090 needs to use FP8, then it might run better on the 5070. But even the INT8 TOPS favors the 4090.
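A quick sanity check of that ~25% figure using Nvidia's headline "AI TOPS" numbers (marketing figures, not measured throughput):

```python
# Back-of-envelope check of the ~25% figure using Nvidia's headline
# "AI TOPS" numbers (marketing figures, not measured throughput):
# RTX 4090 ~1321 TOPS (FP8, sparse), RTX 5070 ~988 TOPS (FP4, sparse).

tops_4090_fp8 = 1321
tops_5070_fp4 = 988

deficit = 1 - tops_5070_fp4 / tops_4090_fp8
print(f"5070 theoretical deficit vs 4090: {deficit:.0%}")      # -> 25%

# If a workload can only use FP8 on both cards, the 5070 loses its FP4
# advantage (assuming FP4 runs at 2x the FP8 rate) and the gap widens:
deficit_fp8 = 1 - (tops_5070_fp4 / 2) / tops_4090_fp8
print(f"deficit with FP8 on both: {deficit_fp8:.0%}")          # -> 63%
```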

As Huang mentioned, nVidia microchips have indeed broken Moore's Law for the first time.

 
That's the marketed use case but not how it plays out in practice. The input lag penalty when using a low base frame rate is so high you'd never want to use it. It also introduces more motion artefacts. Under no circumstances would I recommend anyone take a 30fps output and run FG on it.
True that. The only exception being native 30fps games like those running via emulation, where 30fps + FGx3 = 90fps or FGx4 = 120fps looks absolutely incredible. Especially if you knew and had played the original game, the difference is staggering.

Dunno about 36fps (for those with a 144Hz display), but a 41fps base starts to be OK-ish if you have a 165Hz display (36fps x 4 = 144, 41fps x 4 ≈ 165), and from that number up, all cats are grey.
 
Let's go the other way on this: https://www.nvidia.com/en-gb/geforce/news/reflex-2-even-lower-latency-gameplay-with-frame-warp/

With Reflex 2, Nvidia is lowering latency by generating pieces of the image on the fly instead of waiting for the native rendering to complete. How should reviewers look at that? You're getting lower latency, which is a big win. So if the IQ isn't impacted, is this a win-win that doesn't need further dissection?
This is basically what VR does, and at this point it is well understood in that context. It can indeed make some of these cases feel better (especially the mouse-wiggle FPS one), but the limitations are fairly fundamental. Camera rotation looks pretty good; camera translation, or anything moving significantly on screen, will look blurry or jittery or both. Inpainting artifacts are fairly obvious with any significant motion. Someone mentioned DCS, so it's worth noting that lots of people can enjoy it at 45Hz time/spacewarped up to 90 or similar, since head motion holds up pretty well, but near-field moving objects and other aircraft are generally pretty blurry; everyone can tell the difference, even if you do have to find a set of compromises that works for you.
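For anyone unfamiliar with how the warp works, here's a toy rotation-only reprojection sketch in Python/NumPy. It assumes a pinhole camera, yaw-only motion, and no depth or inpainting, so it's a deliberately simplified version of what VR timewarp (and, as far as the public material suggests, Reflex 2's Frame Warp) does:

```python
import numpy as np

# Toy rotation-only reprojection ("timewarp"). Assumptions: pinhole
# camera, yaw-only motion, no depth/parallax, no inpainting. Real
# implementations (VR runtimes, Reflex 2 Frame Warp) are far more involved.

def timewarp_yaw(img: np.ndarray, yaw_rad: float, fov_x_rad: float) -> np.ndarray:
    w = img.shape[1]
    f = (w / 2) / np.tan(fov_x_rad / 2)            # focal length in pixels
    xs = np.arange(w) - w / 2                      # pixel offsets from centre
    # For each output column, find the source column that held this
    # content before the camera yawed, and sample it.
    src_x = f * np.tan(np.arctan(xs / f) + yaw_rad) + w / 2
    src_x = np.clip(src_x.round().astype(int), 0, w - 1)   # clamp: smeared edge
    return img[:, src_x]

# Warp a frame by a 2-degree yaw at 90-degree horizontal FOV:
frame = np.random.rand(720, 1280, 3)
warped = timewarp_yaw(frame, np.deg2rad(2.0), np.deg2rad(90.0))
```

The clamp on the sampled column is exactly where a real implementation has nothing to show and must inpaint, which is where the artifacts described above come from.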

Again, the severity is related to how much motion there is and what your base frame time/latency is. If you're doing some 1-2 pixel inpainting at 240->480Hz or something it's fine, but it's not a panacea that will make Valorant running at 30fps feel like 60 with no image artifacts. That is ultimately one of the main issues with the advertising around these kinds of features: they bring some benefits to situations that are already running decently, but they can't really take cases that are running poorly and make them good.

On that note though I'd love to see some more attention on VR again. I realize it's a relatively small crowd that does PCVR, but we also tend to buy very high end hardware :D If Reflex 2 has some improvements possible in that context it would be great to see it better integrated. Even stuff like DLSS is generally pretty poorly supported and has various issues with VR.
 
I'm going to try one more time to steer this back on topic. My intention was very much not to start a general discussion about the pros and cons of frame gen. As I've said in every single post I've made, frame gen is great tech to have available and obviously in a lot of cases it's great to get additional smoothness. I'll state - yet again - that it's normally a better way to fill out additional vblanks on high refresh monitors than just repeating previous frames.

Everyone instead seems to have gotten fixated on the notion that the problem with it is that it adds some latency, and then arguing about whether that's "worth it" vs. the motion clarity improvements. This is a strawman of the original discussion. We can split it off into a separate thread if anyone actually wants to have this argument, but it's not an argument I'm making, and it's somewhat off-topic to the point I made here, specifically because that point relates to the press. Also, no one is saying that motion clarity doesn't matter; that is another strawman. I'll try one last time:

Generally the expectation if you report "card A at 120fps and card B at 60fps" (or 60 vs 30, or whatever) is that card A is notably more responsive than card B. Frame generation - even if it were completely free, had identical latency to the base frame rate and zero artifacts - breaks these assumptions in a major way. Since the feel of a game is pretty indisputably an important part of why we measure these things (of course there are diminishing returns, just like there are with motion smoothness!), that sort of reporting seems like a problem to me. If we are going to repurpose "FPS" to speak only to motion smoothness then we need to more consistently report a different metric for responsiveness.

As this is the DF thread, I'd propose we try and keep the discussion related to media reporting and benchmarking of FG cases rather than all the separate discussions, although we can start threads for those as desired. I just see people increasingly fighting strawmen and arguing past each other here in a way that isn't making any forward progress.
Agree, and I think it'd be great to just provide both numbers in reviews.

With 'optimized' settings, and with Nvidia Reflex, Radeon Anti-Lag, Intel XeLL, or the equivalent enabled, it'd be great to see the average, 1% low, and 0.1% low for both FPS and input-to-photon latency.
If the 3 GPU companies can start an input-to-photon war and sell the improvements to customers, I'm definitely here for it. :)
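For what it's worth, a minimal sketch of how those figures could be computed, assuming per-frame frame times and input-to-photon latencies have already been captured by external tooling, and following the common convention that "1% low" means averaging the worst 1% of frames:

```python
import numpy as np

# Sketch of the reporting: average / 1% low / 0.1% low for both FPS and
# input-to-photon latency. Assumes per-frame data was already captured
# by external tooling; "1% low" follows the common convention of
# averaging the worst 1% of frame times.

def low_fps(frame_times_ms: np.ndarray, pct: float) -> float:
    k = max(1, int(len(frame_times_ms) * pct / 100))
    worst = np.sort(frame_times_ms)[-k:]           # slowest pct% of frames
    return 1000 / worst.mean()

rng = np.random.default_rng(0)
frame_times_ms = rng.gamma(9.0, 2.0, 10_000)       # fake data for the demo
latency_ms = 3 * frame_times_ms + 15               # fake latency data

print(f"avg FPS: {1000 / frame_times_ms.mean():.1f}, "
      f"1% low: {low_fps(frame_times_ms, 1):.1f}, "
      f"0.1% low: {low_fps(frame_times_ms, 0.1):.1f}")
print(f"avg latency: {latency_ms.mean():.1f} ms, "
      f"99th percentile: {np.percentile(latency_ms, 99):.1f} ms")
```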
 
@Andrew Lauritzen VR is the main reason I got back into high-end PC gaming: sim racing in Assetto Corsa. It’s an area where Nvidia did well to support for a while with VRSS and SPS.

I’m not a fan of asynchronous warp personally and prefer 90Hz, adjusting supersampling to keep it there.

Speaking of DLSS, not long ago I came across a modder who managed to get DLSS working in VR for AC: https://snowymoon.io/ I run DLAA with supersampling at 140% via OpenXR, with maxed-out settings and mods (CSP and Pure). This is on a Reverb G2, as I couldn’t deal with the Quest 3 compression artefacts.

The image stability and quality using DLAA in VR have been mind-blowing. If you use AC at all, give it a shot.
 
Are you really trying to argue based on this that we should intentionally conflate the two cases even for the enthusiasts who care enough to bother looking up benchmarks? This feels like a pretty extreme level of motivated reasoning to me; I don't think that direction is a useful discussion to have.

Ultimately if you don't care, fine. But some of us really do care and buy products based on them feeling better to play games with, not just looking smoother. A 5070 absolutely will not be an equivalent experience to me as my 4090 (not even considering VR!) and I think it's completely reasonable to call out NVIDIA for making such a ridiculous claim, let alone folks who perpetuate it. 🤷‍♂️

From an academic standpoint? Sure, you can look at them separately, in silos. But in terms of the actual holistic end-user experience? I would think it's more complicated. What if GPU A has FG and latency reduction but GPU B has neither? Does it make sense to tell a user that GPU A's FG adds nothing to the experience compared to GPU B because it has more latency?

I feel this perspective issue goes back to when AI upscaling with DLSS was introduced, from a product-comparison standpoint.

The problem here is that the comparisons some seem to want, sans things like frame generation, are premised on an alternative product being able to do the same without frame generation. But is that actually realistic in the real world?

Nvidia's keynote might have had an attention-grabbing line with the 5070 vs 4090, but to me that cuts both ways, as negative or positive attention fodder with very little real implication. People in the market for a 4090's price range are not the same as those shopping for a 5070. Realistically, you might at most be debating a 4070 Ti Super (or 4080) as the outgoing last gen, depending on price reductions, versus the 5070. Offerings from other competitors also are not going to deliver anything close to a 2x difference in frame rates.

This is why comparing 60->120fps FG vs 120fps "native", or AI-upscaled 4K vs "native" 4K, is basically purely academic, fodder for online debate. In what real user scenario are you actually picking between those options from a product standpoint?

The real-world scenario is more along the lines of: you get either 60->120fps with FG or 60->70fps (if even that) conventionally, in terms of what you can actually go out and buy. It's not as though buying 120fps output sans FG is the alternative, or anything remotely close to it.

Addendum:
I typed most of the above a while ago, so I'll add this as the conversation has moved on somewhat, and I thought this was actually the HUB/reviewer-oriented thread originally. I will agree with your point from a later post that the FPS comparisons shouldn't be done, though not entirely for the same reasoning. FPS numbers (or any similar metric) should only ever have been a data point. They never (well, okay, that's extreme: maybe outside of the early days) should have been the be-all and end-all for reviews or for judging the utility of a GPU to the end user. Yes, it's comforting and convenient to have a supposedly all-encompassing, single "objective" number (with caveats) to definitively say GPU A is better than GPU B, but that was always flawed, and it's only going to get more flawed going forward.
 
I'm not sure what you'd want to call it: Rendered Frames per Second, AIFPS, HFPS, etc., so that we can denote things going forward. I suppose the community can come to a decision on the easiest naming convention.
The non-generated frames could be called base fps, while the generated ones plus the base could be called total fps.
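Something like this, as a sketch of that convention (all names here are just suggestions):

```python
# Hypothetical naming sketch: "base fps" counts only rendered frames,
# "total fps" counts rendered plus generated. fg_factor is the
# frame-gen multiplier (2x/3x/4x). All names here are just suggestions.

def fps_breakdown(rendered_frames: int, seconds: float, fg_factor: int) -> dict:
    base = rendered_frames / seconds
    return {"base_fps": base, "total_fps": base * fg_factor}

print(fps_breakdown(rendered_frames=1800, seconds=60, fg_factor=4))
# -> {'base_fps': 30.0, 'total_fps': 120.0}
```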
 
I didn't quite get to put this into words in my previous posts, but FPS was the be-all and end-all of reviews in the early days of 3D acceleration. And then features started being tested in reviews. 32-bit vs 16-bit color depth was featured when ATi and nVidia added the feature and 3dfx fell behind. Filtering quality started getting front billing when it was revealed that nVidia and ATi were lowering the quality of filtering in common benchmarks. FSAA comparisons were common. Then filtering quality came up again with the rise of AF, because graphics cards started to ship with AF optimizations that could cause image quality differences along with performance gains.

The GeForce FX line had lots of image quality comparisons because it supported both higher and lower quality shaders than competing cards, and nVidia was swapping out shaders from shipping games at the driver level to gain performance, often with noticeable IQ reductions. This was a wild time: I remember updating drivers on my wife's computer and launching a game, and it looked like almost every surface had VRS on.

After that, reviewers struggled to describe frame pacing, often referring to micro-stutter without having numbers to back up what they were feeling. Once frame times and frame pacing entered the standard vocabulary, reviews once again reverted to FPS, with mentions of frame times and pacing. But image quality has largely fallen out of fashion, except maybe for IQ differences between upscalers. And latency... well, only the "crazy" esports guys who were building custom hardware to test it even cared about it.

Reviews had a lot more nuance back then. Anyone who read them I'm sure will remember the AF test images of a checkerboarded tunnel, or the pages full of Half-Life 2 (I think?) utility poles and lines used to test AA. Or those zoom-ins to see where mipmaps changed between cards. Will this new future, with its reliance on upscaling and frame generation, result in harder work for reviewers? Probably. But I reject the idea that we, the consumers, are too dim to understand the nuance of these reviews. Some of us went through this already, and we figured it out just fine. Plus, we have Digital Foundry to teach the youngins now.
 