Digital Foundry Article Technical Discussion [2025]

Rare outliers don't diminish statements that tend to hold true the vast majority of the time. FG is a great value add for Nvidia GPUs but I wouldn't classify it as performance either. Features like texture compression and materials seem like much more effective uses of AI but require too much dev effort in a market where it isn't supported by consoles.
Performance wise, it's been demonstrated that it's better to dedicate more silicon for AI tasks than for rasterisation nowadays. It'll give you more visual performance in games.

nVidia has done it again. You're going to get more than twice the performance of the previous generation in modern games. This hadn't happened between generations of GPUs on PC for more than 10-15 years. All of that while maintaining the raster power that is almost the same as the previous generation.
 
Rant done... for now :)
Huge disagree. That's highly highly subjective.

Latency varies from game to game at the same frame rate. Red Dead Redemption 2 will have 60ms of latency at 120fps, while Call of Duty will have 30ms of latency at that very same 120fps. So, if latency is not universally tied and fixed to the frame rate, then you can't state that it is the one true measure of performance.

Additionally, that very same Call of Duty will have 20ms of latency at 120fps with Reflex on; Vertical Sync on will increase latency at that very same 120fps, while turning it off will drastically reduce it. Variable Refresh Rate will mildly increase latency too. So you have additional layers of tech further decoupling latency from frame rate.
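To make that concrete, here is a rough back-of-envelope sketch of why two games at the same frame rate can land at very different latencies. All component names and millisecond figures here are illustrative assumptions, not measurements from any real game:

```python
# Back-of-envelope model: end-to-end latency is not just 1000 / fps.
# All component names and numbers below are illustrative assumptions, not measurements.

def end_to_end_latency_ms(fps, input_sampling_ms, engine_sim_ms,
                          queued_frames, display_ms):
    frame_ms = 1000.0 / fps
    # Assume each queued frame between CPU and GPU adds roughly one frame time.
    return (input_sampling_ms + engine_sim_ms
            + queued_frames * frame_ms + frame_ms + display_ms)

# Two hypothetical games, both at 120fps, can land at very different latencies
# purely because of engine and pipeline choices.
print(end_to_end_latency_ms(120, input_sampling_ms=4, engine_sim_ms=8,
                            queued_frames=0, display_ms=5))   # ~25 ms
print(end_to_end_latency_ms(120, input_sampling_ms=8, engine_sim_ms=20,
                            queued_frames=2, display_ms=8))   # ~61 ms
```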

Comparing the game to itself is also meaningless after all of these variables. The gamer will adjust to whatever he finds comfortable: he might play Call of Duty with frame generation at 180fps, Reflex on, with a latency of 30ms, and find it acceptable. We can't say that frame generation increased his latency any more than Variable Refresh Rate increased his latency. Is testing with VRR now bad because it increases latency?

You simply can't draw the line at frame generation after all of these variables.

There are clear benefits to frame generation, such as improving the frame pacing of the game, increasing the fluidity of the presentation, and unlocking frame rates beyond what CPU-limited code allows (a widespread issue in gaming nowadays). It's simply a very good tool in the box for achieving good performance, just like all the tools listed above.
 
Performance wise, it's been demonstrated that it's better to dedicate more silicon for AI tasks than for rasterisation nowadays. It'll give you more visual performance in games.

nVidia has done it again. You're going to get more than twice the performance of the previous generation in modern games. This hadn't happened between generations of GPUs on PC for more than 10-15 years. All of that while maintaining the raster power that is almost the same as the previous generation.
It's good tech, but it's also not quite that miraculous. Lossless Scaling has offered 4x frame generation for a while now as well, so the idea was already out there and fairly obvious. I've heard decent things about it too (though I have never tried it myself).

Plus, if we're still agreeing that we should be around a 60fps baseline for optimal frame generation quality, then the practical benefits of being able to go well beyond the roughly 120fps of 2x FG are fairly limited. You're going to need a very high refresh rate monitor to even take advantage of the benefits, since any extra frames beyond your display's refresh rate are completely wasted in this situation. It's not like pushing super high framerates in games 'properly', where you gain improved input response as you push beyond your display's refresh rate.

All this is why I maintain that the ability to get us from 30->60fps with good quality generation and a minimal input latency penalty would be a far bigger deal. This would be useful for a much larger number of people, not just because the vast majority of people have 60 and 120Hz monitors at most, but also because it would grant lower end GPUs the ability to make heavier games perform well at higher resolutions and settings/RT and whatnot. That would feel more like a true 'doubling of performance' in more tangible terms.
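As a rough sketch of why the extra generated frames are wasted (the fps, multiplier and refresh figures below are just illustrative assumptions):

```python
# Sketch: generated frames above the display's refresh rate never reach the screen,
# so the visible benefit of high FG multipliers is capped by the monitor.
# The fps, multiplier and refresh figures are illustrative assumptions.

def displayed_frames(base_fps, fg_multiplier, refresh_hz):
    generated = base_fps * fg_multiplier
    shown = min(generated, refresh_hz)
    wasted = max(0, generated - refresh_hz)
    return shown, wasted

# 60fps base with 4x FG on a 144Hz monitor: most of the extra frames are wasted.
print(displayed_frames(60, 4, 144))   # (144, 96)
# 30fps base with 2x FG on a 120Hz monitor: everything produced is displayed.
print(displayed_frames(30, 2, 120))   # (60, 0)
```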
 
Obviously there are some discussions of latency in the context of frame generation, but I think we need to basically invert how we talk about this tech. Reporting average FPS obviously covers all sorts of sins to start with, but gamers use it as a proxy for two things: visual smoothness and responsiveness/latency/"feel". I would argue the latter is as important if not more important than the former, especially since it typically implies a lower bound on the visual smoothness as well; it is the real hard constraint. Game genres and personal preference affect it, but certainly on PC in games with mouse controlling camera, I care a whole lot more about latency than whatever the FPS number says.
I agree from a technical perspective. But from a use case perspective I think Nvidia has the right of it.

Ultra low latency afforded by a super high frame rate is something you can heavily reduce graphical settings to achieve. No competitive gamer plays with ultra high RT settings. In fact, for as long as graphics cards have been around, settings have always been set to the lowest because it's the easiest way to distinguish players from the environment.

So provided there is enough brute force in your graphics card to hit your monitor's maximum refresh rate at the lowest possible settings, this is a win for most graphics cards, and I think this is largely achievable.

Scaling up to 4K maximum settings, however, is the boundary wall we are trying to break. How do we get that level of smoothness at 4K resolution with maximum RT and texture fidelity? That's the challenge, and no card is up to the task. The frame rate is false in that the update rate and latency are not representative, but neither are triple buffered games. One thing is for sure though: it certainly runs smooth and looks amazing!
 
I'd like to know the performance numbers of the transformer vs the CNN. The transformer model is a lot bigger and more complicated, so what's the cost?

dunno tbh, might watch the video a second time.

Anyways, nVidia hasn't only destroyed the competition this gen (AMD and Intel), but also current consoles. Who is going to justify the ridiculous price of current gen consoles that are like 16 times slower than a 5090?
 
Performance wise, it's been demonstrated that it's better to dedicate more silicon for AI tasks than for rasterisation nowadays. It'll give you more visual performance in games.

nVidia has done it again. You're going to get more than twice the performance of the previous generation in modern games. This hadn't happened between generations of GPUs on PC for more than 10-15 years. All of that while maintaining the raster power that is almost the same as the previous generation.
You aren't getting more than twice the performance. FG is a visual benefit, not a performance benefit.
 
Plus, if we're still agreeing that we should be around a 60fps baseline for optimal frame generation quality, then the practical benefits of being able to go well beyond the roughly 120fps of 2x FG are fairly limited. You're going to need a very high refresh rate monitor to even take advantage of the benefits, since any extra frames beyond your display's refresh rate are completely wasted in this situation. It's not like pushing super high framerates in games 'properly', where you gain improved input response as you push beyond your display's refresh rate.

All this is why I maintain that the ability to get us from 30->60fps with good quality generation and a minimal input latency penalty would be a far bigger deal. This would be useful for a much larger number of people, not just because the vast majority of people have 60 and 120Hz monitors at most, but also because it would grant lower end GPUs the ability to make heavier games perform well at higher resolutions and settings/RT and whatnot. That would feel more like a true 'doubling of performance' in more tangible terms.

60 fps (or Hz) and 120 is really thinking about it from a TV and console perspective, not a PC and monitor one.

Very few people would have 120Hz monitors. Outside of the first couple of releases, high refresh monitors almost immediately moved to 144Hz as the de facto standard for high refresh, while for the last couple of years the low end for high refresh has really been 165Hz, with 170-180Hz now common before the jump up to 240Hz.

The vast majority of people buying these GPUs have high refresh monitors, as the cost (and other compromises) of adopting them has been basically negligible for years now. Really, if you're still using a 60Hz monitor, that should be your upgrade point before anything else, just to get actual VRR support with a proper low end from LFC.

Lastly, the 60fps baseline, as far as I know, is not as settled as you frame it. We've had debate on figures as low as ~40fps, and there's also debate on how much it varies depending on the implementation, so there is no universal baseline.
 
Before we rathole too far into the definition of "performance" let me just steer this slightly differently and something we can hopefully all agree on:

1) Responsiveness/latency in games matters. Obviously it gets to a point of diminishing returns but so does animation fluidity.
2) We have been using FPS and frame times measured at present time as a rough proxy of both, but with various limitations.
3) Frame generation is great, but it drastically decouples the FPS numbers from responsiveness/latency, even in the same game. This was already an issue with 2x frame gen but will be even more with 4x.
4) On top of this, it at least slightly reduces responsiveness.
5) Due to these factors, FPS-at-present-time should not be the measure we use to compare responsiveness with frame generation.
6) I and - I would wager - a lot of PC gamers care a lot about the latency numbers. If they are now going to be significantly conflated with other things I want them measured and reported separately in reviews.

I don't think you have to be a competitive gamer to notice the responsiveness differences. Literally everyone I've asked can notice the difference between flicking a blank window around on the desktop at 120Hz vs. 60Hz, and that effect is not present with frame generation from 60->120, because the thing you are noticing is that the movement feels immediately connected to your flicks. Obviously if I'm playing some slower game, or most games on a controller, the tolerance is higher, but the FPS-with-mouse-and-keyboard case is definitely the flicking-windows one: I want it to feel as 1:1 as possible, even if I'm not trying to headshot opponents in CS2.
 
Better visuals aren't the only reason people buy GPUs. Higher framerates and more responsive gameplay are also important.

I didn't say better visuals, I used your term, a better visual experience, which would include all facets, including the temporal component.

As for more responsive gameplay and other aspects, I agree that they are considerations for buying a GPU. However, I'm going to set this aside for the moment and do another, more encompassing response on it for simplicity.

But for the moment, let's just stick with the visual component. To me it seems like, if FG is delivering a better visual experience, it is higher performing in terms of the visual aspect of gaming?
 
Before we rathole too far into the definition of "performance" let me just steer this slightly differently and something we can hopefully all agree on:

1) Responsiveness/latency in games matters. Obviously it gets to a point of diminishing returns but so does animation fluidity.
2) We have been using FPS and frame times measured at present time as a rough proxy of both, but with various limitations.
3) Frame generation is great, but it drastically decouples the FPS numbers from responsiveness/latency, even in the same game. This was already an issue with 2x frame gen but will be even more with 4x.
4) On top of this, it at least slightly reduces responsiveness.
5) Due to these factors, FPS-at-present-time should not be the measure we use to compare responsiveness with frame generation.
6) I and - I would wager - a lot of PC gamers care a lot about the latency numbers. If they are now going to be significantly conflated with other things I want them measured and reported separately in reviews.

Do gamers care about latency? I don't think it's as clear cut as you state, especially once trade offs are involved. Some do, especially on the competitive PC side, but plenty of people who play competitive esports titles still run max settings at the expense of latency. Some people prefer v-sync with its higher latency to going without. So it seems that latency hasn't been the be-all and end-all, and gamers have always been content to compromise here.

We also had people playing games on consoles with TVs that added massive latency via post processing for years. Awareness of game modes is kind of a given nowadays, but that wasn't the case just a few years ago. So I don't think we can assume that gamers are going to find FG latency unplayable (whatever that means).

I feel the latency issue is really a perspective issue as it pertains to frame generation. Latency reduction (such as Reflex) and latency minimization for SP games seem to have only really come to the fore with the introduction of FG to SP games. So really we have the following 3 scenarios, from lowest to highest latency -

1) Latency reduction

2) Latency reduction + frame generation

3) Neither being used

So why is it a perspective issue? Well it seems like we have the following -

1) Frame generation + latency reduction is lower latency than having neither, so therefore frame generation has better latency. Things are better than before.

2) You can enable latency reduction without frame generation, therefore frame generation has worse latency. Things are worse than it can be.
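A small illustration of those two perspectives, with latency figures that are purely assumed for the sake of showing the relative ordering, not measured from any game or GPU:

```python
# Illustrative ordering of the three scenarios described above.
# The millisecond figures are assumptions picked only to show the relative order,
# not measurements of any particular game or GPU.

scenarios = {
    "reflex only":             35,  # lowest latency (assumed)
    "reflex + frame gen":      45,  # higher than reflex alone, but...
    "no reflex, no frame gen": 60,  # ...still lower than using neither
}

baseline = scenarios["no reflex, no frame gen"]
for name, ms in scenarios.items():
    print(f"{name:24s} {ms} ms ({ms - baseline:+d} ms vs. no reflex/FG)")
```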
 
So why is it a perspective issue? Well it seems like we have the following -

1) Frame generation + latency reduction is lower latency than having neither, so therefore frame generation has better latency. Things are better than before.

2) You can enable latency reduction without frame generation, therefore frame generation has worse latency. Things are worse than it can be.
Are you really trying to argue based on this that we should intentionally conflate the two cases even for the enthusiasts who care enough to bother looking up benchmarks? This feels like a pretty extreme level of motivated reasoning to me; I don't think that direction is a useful discussion to have.

Ultimately if you don't care, fine. But some of us really do care and buy products based on them feeling better to play games with, not just looking smoother. A 5070 absolutely will not be an equivalent experience to me as my 4090 (not even considering VR!) and I think it's completely reasonable to call out NVIDIA for making such a ridiculous claim, let alone folks who perpetuate it. 🤷‍♂️
 
Frame generation is great, but it drastically decouples the FPS numbers from responsiveness/latency, even in the same game. This was already an issue with 2x frame gen but will be even more with 4x.
Again it is not an issue because:

1- 4x frame gen added very little latency over 2x
2- Latency is not a universal metric at all, your latency in Call of Duty is vastly different than your latency in Hellblade 2, how can we claim latency is king when it's variable like that?

Even in the same game, latency is hugely different based on rendering quality, 120fps at Ultra settings in Red Dead Redemption 2 has way higher latency than 120fps at Low settings. If you factor in Render Queue/Pre Rendered Frames/Future Frame Rendering (Battlefield parlance), things quickly go out of whack.
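A rough sketch of the render queue point, under the simplifying assumption that each queued frame adds roughly one frame time of latency (real pipelines overlap work in more complicated ways):

```python
# Rough effect of the render queue (pre-rendered / future frames) on latency:
# each queued frame adds roughly one frame time before the player's input is shown.
# Simplified assumption; real pipelines overlap work in more complicated ways.

def queue_latency_ms(fps, queued_frames):
    return queued_frames * (1000.0 / fps)

for q in (0, 1, 2, 3):
    print(f"{q} queued frames at 120fps: +{queue_latency_ms(120, q):.1f} ms")
```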

The best you can do is say that frame generation increases latency, just like plenty of other tech (Ultra rendering/Vertical Sync/Variable Refresh Rate/Render Queue/Reflex off ...etc). But if we come out and say Frame Generation is absolutely not performance (because it increases latency), then this is not conducive to a constructive conversation. Is the performance from an increased render queue not performance? Is the performance at Ultra rendering not performance? Is performance from Vertical Sync on or Reflex off not performance? Because all of these things increase latency.
 
Again it is not an issue because:

1- 4x frame gen added very little latency over 2x
2- Latency is not a universal metric at all, your latency in Call of Duty is vastly different than your latency in Hellblade 2, how can we claim latency is king when it's variable like that?
This is just a series of strawmans at this point. Because latency varies between games, I shouldn't care about what the relative latency between two GPUs is in a single game? Because FPS varies when I change settings does that mean I don't care about the relative smoothness of different GPUs at the same settings? Ridiculous; we fix the settings so that we can compare GPUs.

Even in the same game, latency is hugely different based on rendering quality, 120fps at Ultra settings in Red Dead Redemption 2 has way higher latency than 120fps at Low settings. If you factor in Render Queue/Pre Rendered Frames/Future Frame Rendering (Battlefield parlance), things quickly go out of whack.
Yeah which is why... get this... we fix those other parameters when we measure.

But if we come out and say Frame Generation is absolutely not performance (because it increases latency), then this is not conducive to a constructive conversation.
Right which is why we are not going to discuss the definition of the word "performance", because it clearly means a bunch of vague things to different people.

I'm saying specifically that enabling frame generation does not reduce latency, which should be non-controversial.
Therefore saying the 5070 gives the same result as a 4090 would be incorrect in terms of responsiveness/latency, right?
Therefore I think we need to explicitly report the responsiveness differences and/or never compare the "FPS" of frame generation to non-frame generation, because it's apples and oranges.

And yes, there are smaller examples of things that fall into a similar category in the past, albeit not the ones you mention. But they all have much smaller effects on both latency and throughput, to the point that they have been largely ignorable (although occasionally called out when they vary by large amounts), and often they are fixed for a given workload anyway. The effect of frame generation on confounding the variables is so large that it cannot be ignored. It is equivalent to saying that all GPUs can run VR at 90Hz locked or whatever because spacewarp exists. These techniques are all great, and in the case of VR very necessary, but they are not equivalent.
 
Again it is not an issue because:

1- 4x frame gen added very little latency over 2x

Yes, but the point I think Andrew was making with DLSS4 is that frame generation is now even more removed from the latency characteristics one would get from a game with that same native framerate. That would still be the case even if DLSS4 added no latency at all over DLSS3.

So showing something running at "240fps" with DLSS3 would mean you're getting the latency of that game running at 120fps without framegen. But if you have a game you're advertising as '240fps with DLSS4', that could mean you'd be getting the latency of that game running at ~60fps.
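The arithmetic behind that example, assuming responsiveness tracks the base (rendered) frame rate rather than the displayed one:

```python
# The arithmetic behind the example above: responsiveness tracks the base
# (rendered) frame rate, not the displayed one. Numbers are the ones in the post.

def base_framerate(displayed_fps, fg_multiplier):
    return displayed_fps / fg_multiplier

for label, mult in (("2x FG (DLSS3-style)", 2), ("4x FG (DLSS4-style)", 4)):
    base = base_framerate(240, mult)
    print(f'"240fps" with {label}: base {base:.0f}fps, '
          f'roughly {1000 / base:.1f} ms per real frame')
```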
 
Therefore I think we need to explicitly report the responsiveness differences and/or never compare the "FPS" of frame generation to non-frame generation, because it's apples and oranges.

I agree with this in principle but have no idea how you properly communicate the nuance to the general public. There’s clearly a tangible and desirable benefit from framegen. How do you put a number on it?

Maybe take a page out of the AA book and report FPS as 1x, 2x, 3x, 4x framegen etc. where 1x becomes “native”.

Even then how do you compare 4x framegen across IHVs when there can be massive quality, consistency and latency differences? There’s no perfect way to do it that works for marketing to the masses.
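One hypothetical way such labelling could look, purely as a sketch of the "1x becomes native" idea above and not anyone's actual proposal:

```python
# Hypothetical reporting format: always anchor on the native (1x) figure and
# annotate the frame-gen multiplier. Purely an illustrative sketch.

def fg_label(native_fps, fg_multiplier):
    if fg_multiplier == 1:
        return f"{native_fps} fps (native)"
    return f"{native_fps * fg_multiplier} fps ({fg_multiplier}x FG, {native_fps} fps native)"

print(fg_label(60, 1))   # 60 fps (native)
print(fg_label(60, 2))   # 120 fps (2x FG, 60 fps native)
print(fg_label(60, 4))   # 240 fps (4x FG, 60 fps native)
```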
 