Digital Foundry Article Technical Discussion [2025]

Shifty Geezer (Moderator)
New Year, New Thread.

Previous Thread @ https://forum.beyond3d.com/threads/digital-foundry-article-technical-discussion-2024.63461/


Rules of Engagement: Read before posting, or run the risk of losing posting rights in the Tech Forum!

This is principally a technical discussion thread. It is allied to the other tech analysis threads and shares the same rules, which you should familiarise yourself with. The purpose is to discuss the findings of the Digital Foundry articles on a technical level, including the techniques employed by game developers in their games and the comparative design decisions of cross-platform titles. Digital Foundry is more closely allied with Beyond3D than other gaming sites, which is why they get special mention here! :D

What this thread is not is a place to complain about a port's quality and make accusations against developers, to offer feedback on the quality of the Digital Foundry writing or the writers' biases, to trumpet your preferred console over the other, to talk business and sales, or to otherwise sidetrack the discussion from the gaming technology covered in the Digital Foundry articles. If you do not post to the required standard, your posts will be removed, and persistent unwanted contributions will see you locked out of the Technology Forum.

If you want to leave editorial feedback for Digital Foundry, the best place is a comment on the relevant article(s).
 
Lots of artifacts from 0:24-0:27, 2:19-2:22, and 2:26-2:30 (glove in the background); I'm just going to stop now. There are way too many to count. I'm not even that far into the video and the artifacts are already too much. You can also clearly see the effects of MFG even though they pan the camera slowly. Maybe it's because the camera is panned so slowly that it's so visible. Regardless, I need to finish the video and will comment more on this shortly.
 
 
Alright... lots of good stuff here (especially good to see some improvement to the multisecond ghosting/smear), but permit me another on frame generation.

tldr: I straight up don't think we should report "FPS" numbers with frame generation at all. If we do, it should be buried, not highlighted.

This has nothing to do with the silly arguments about the "purity" of "native" pixels vs "generated" pixels or any of that nonsense. NVIDIA has very effectively muddied this argument by lumping upsampling and frame generation under the same marketing umbrella and continually implying to people that generated frames are really just the same as generated/upsampled pixels, right? From a "what we are doing to generate them" perspective there are similarities, but from an end-user point of view they are fundamentally different things.

A couple of decades ago the tech press decided to start highlighting frame times for several reasons, but one of the major ones is that they are a better proxy for how a game *feels* to interact with than frame rates. People noted that SLI/AFR would often give higher FPS numbers and sometimes smoother animation, but it would not feel any better to play despite the higher FPS. To explain this, the discussion shifted more towards frame times and how they relate to what actually matters here: latency.

Obviously there are some discussions of latency in the context of frame generation, but I think we need to basically invert how we talk about this tech. Reporting average FPS obviously covers all sorts of sins to start with, but gamers use it as a proxy for two things: visual smoothness and responsiveness/latency/"feel". I would argue the latter is as important as, if not more important than, the former, especially since it typically implies a lower bound on the visual smoothness as well; it is the real hard constraint. Game genre and personal preference affect it, but certainly on PC, in games with the mouse controlling the camera, I care a whole lot more about latency than whatever the FPS number says.
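To make the "covers all sorts of sins" point concrete, here's a minimal sketch with made-up frame-time traces (not measurements from any real game): two runs with the same average FPS but very different worst-case frame times:

# Two hypothetical frame-time traces in milliseconds with the same average FPS
# but very different consistency. Numbers are illustrative only.
steady = [16.7] * 60                 # every frame takes ~16.7 ms
spiky = [10.0] * 50 + [50.0] * 10    # mostly fast, with periodic 50 ms hitches

def avg_fps(frame_times_ms):
    # Average FPS = frames shown / total time taken
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def p99_ms(frame_times_ms):
    # 99th-percentile frame time: a rough proxy for the hitches you actually feel
    ordered = sorted(frame_times_ms)
    return ordered[int(0.99 * (len(ordered) - 1))]

for name, trace in (("steady", steady), ("spiky", spiky)):
    print(f"{name}: {avg_fps(trace):.0f} fps average, {p99_ms(trace):.1f} ms 99th-percentile frame time")

Same headline number, very different feel, which is exactly why frame-time (and ideally latency) reporting matters.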

And that's the rub: by reporting the FPS of Cyberpunk as >200 fps or some basically made-up number when in reality it feels like 30-60, IMO you are doing the community a disservice. Normalizing these sorts of comparisons is what allows NVIDIA to say with a straight face that a 5070 is as fast as a 4090, which is obvious nonsense.
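As a back-of-the-envelope illustration of why the multiplied number is misleading, here's a simplified model (the base frame rate and the "one extra rendered-frame interval" assumption are mine for illustration, not anyone's measurements): frame generation multiplies displayed frames, but input is only sampled on rendered frames, and interpolation has to hold the newest rendered frame back.

# Simplified model of interpolation-based frame generation; all numbers are
# assumptions for illustration. Real pipelines add driver/display latency on top.
def framegen_model(rendered_fps, multiplier):
    render_interval_ms = 1000.0 / rendered_fps
    displayed_fps = rendered_fps * multiplier
    # Interpolation has to wait for the *next* rendered frame before presenting
    # the generated ones, so roughly one rendered-frame interval of latency is
    # added on top of whatever the base input-to-photon latency already was.
    added_latency_ms = render_interval_ms if multiplier > 1 else 0.0
    return displayed_fps, added_latency_ms

for mult in (1, 2, 4):
    fps, extra = framegen_model(rendered_fps=55, multiplier=mult)
    print(f"x{mult}: ~{fps:.0f} fps displayed, ~{extra:.0f} ms extra latency vs. no FG")

In this toy model the x4 case shows ~220 "fps" on the counter, but the game still responds like (slightly worse than) a 55 fps game.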

If I told you that all my games run at 240fps on integrated graphics because I play them on my TV with TruMotion blah blah enabled you'd tell me I was an idiot, and rightfully so. The same logic applies to frame generation even if there were zero visual artifacts. Marketing has successfully drawn the discussion away from what actually matters - i.e. what's the responsiveness of these games - into pixel hunting quality comparisons. Those comparisons are fine to do when comparing different frame generation techniques and generations, but they really are irrelevant to performance comparisons.

Personally I think we should maybe consider another industry shift towards making latency a/the main thing we report rather than frame times. If it becomes normalized to report FPS numbers with frame generation enabled then IMO the press has failed to cut through the bullshit.

And just to reiterate yet again, to hopefully be absolutely clear: there is nothing wrong with frame generation tech existing, and it can indeed be useful in a variety of situations. It's totally fine to evaluate it and advertise it as a feature of your product. Indeed, it's great as a "let's fill up these other 240Hz OLED frames with something smarter than just the same frame again" feature, and arguably a better long-term solution than VRR. I could at some point see the input frame rate being entirely decoupled as well. But it's important to remember that it's a visual smoothing feature, not a performance feature. We should not allow anyone to discuss it in the context of game performance, as it fundamentally makes responsiveness worse, which is one of the - if not the - most important things we are using FPS and frame times as a proxy to measure in the first place.

Rant done... for now :)
 
This is my take as well; however, you've worded it a lot better than I've managed to in the other threads talking about Blackwell.
 
And that's the rub: by reporting the FPS of Cyberpunk as >200 fps or some basically made-up number when in reality it feels like 30-60, IMO you are doing the community a disservice. Normalizing these sorts of comparisons is what allows NVIDIA to say with a straight face that a 5070 is as fast as a 4090, which is obvious nonsense.

It's not that simple. Many more factors would have to be included, since the same FPS does not mean a game plays equally well. I also know games that feel worse at 60 fps than others do at 30 fps. Killzone 2 on the PlayStation 3 had an input lag of several hundred milliseconds.
 
Rare outliers don't diminish statements that hold true the vast majority of the time. FG is a great value-add for Nvidia GPUs, but I wouldn't classify it as performance either. Features like texture compression and materials seem like much more effective uses of AI, but they require too much dev effort in a market where they aren't supported by consoles.
 
Of course it depends on the game as well, as the game is part of the latency path. And certainly more games should be compared on a latency front as well but that is orthogonal to the frame generation discussion.

In this context we're comparing the same game to itself... and frame generation only ever makes latency worse. It still may be fine and a good smoothness feature in a given game/situation, but for the reasons I outlined I don't think we should ever really discuss or put emphasis on the "FPS" that you get with frame generation, as that broadly misses the point.
 
nVidia has won the generation once again! The price of the 5070 is okay, even if not everything is super exciting. AMD has shown that they aren't competing; their planned roadmap is nowhere to be seen.

They were already posting data on their website before the cancellation, such as the comparisons with the 4070, which is a full-fledged setback. The most logical explanation: nVidia is the best as usual and has the superior numbers...

The people in Las Vegas had data from the Nvidia keynote a few hours in advance (which is logical) and ran for the exit before the comparison that was coming could beat them to a pulp.
 
Of course it depends on the game as well, as the game is part of the latency path. And certainly more games should be compared on a latency front as well but that is orthogonal to the frame generation discussion.

In this context we're comparing the same game to itself... and frame generation only ever makes latency worse. It still may be fine and a good smoothness feature in a given game/situation, but for the reasons I outlined I don't think we should ever really discuss or put emphasis on the "FPS" that you get with frame generation, as that broadly misses the point.
Yes, it makes the latency worse, but it has to be said that a game with frame gen also comes with Reflex, and thanks to Reflex it often plays as directly as if you turned off both Reflex and frame gen. Reflex makes quite a difference.

What happens if you don't have a comparable feature like Reflex in a game on a competitor's GPU? Then the competitor's product would have to achieve much higher frame rates in order to be comparably good.

Personally, I don't care whether an image is rendered in real time or not, as long as the game plays well and I don't notice it. The most important thing for me is how widely the feature is supported. If it only appears in a few big titles, I wouldn't use it to measure GPU performance; if it is used widely in big games, I wouldn't mind.
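A rough way to picture that trade-off, with purely hypothetical millisecond budgets (not measurements of any particular game): Reflex mostly removes render-queue waiting, while frame gen holds back the newest rendered frame, so the two can roughly cancel out against a no-Reflex, no-FG baseline.

# Purely hypothetical input-to-photon budgets in milliseconds, for illustration only.
def total_latency_ms(game_and_render, render_queue, framegen_hold, display):
    # Sum the main contributors along the input-to-photon path
    return game_and_render + render_queue + framegen_hold + display

no_reflex_no_fg = total_latency_ms(game_and_render=30, render_queue=15, framegen_hold=0, display=10)
reflex_only     = total_latency_ms(game_and_render=30, render_queue=2, framegen_hold=0, display=10)
reflex_plus_fg  = total_latency_ms(game_and_render=30, render_queue=2, framegen_hold=17, display=10)

print(f"no Reflex, no FG: ~{no_reflex_no_fg} ms")
print(f"Reflex only:      ~{reflex_only} ms")
print(f"Reflex + FG:      ~{reflex_plus_fg} ms")

In this made-up budget, Reflex plus frame gen lands close to the old baseline, while Reflex alone is still the most responsive option.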
 
True that. Even if FG x4 is really nice (especially on emulated games locked at 30 fps natively, where the difference is night and day), nVidia is destroying the competition once again.

They managed to improve DLSS once again, which is a hydrogen bomb below the competition's waterline.

Welcome, nVidia, to another era of effortless domination and destruction of the competition. And I have an Intel GPU that I'm happy with, if that matters...
 
I'd like to know the performance numbers of the transformer model vs the CNN. The transformer model is a lot bigger and more complicated, so what's the cost?
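In the absence of published numbers, one way to frame the question (a sketch with made-up per-frame costs, not actual measurements of either DLSS model): a fixed upscaler cost per frame eats proportionally more of the budget the higher the frame rate is.

# Hypothetical per-frame upscaler costs in milliseconds; the real cost of the CNN
# and transformer models depends on GPU and output resolution and isn't known here.
def fps_with_upscaler(base_frame_ms, upscaler_ms):
    # The upscaling pass runs every frame, so its cost simply adds to the frame time
    return 1000.0 / (base_frame_ms + upscaler_ms)

for base_ms in (16.7, 8.3):  # roughly 60 fps and 120 fps render budgets
    for model, cost_ms in (("CNN-like", 0.5), ("transformer-like", 1.5)):
        print(f"{1000 / base_ms:.0f} fps base + {model} at {cost_ms} ms -> {fps_with_upscaler(base_ms, cost_ms):.1f} fps")

An extra millisecond barely registers at 60 fps but takes a visible bite above 120 fps, which is why the per-frame cost of the bigger model is the interesting number.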

 