Value of Hardware Unboxed benchmarking

You aren’t going to believe what people usually call BFI: motion smoothing.

(really FG is more like the frame interpolation algorithms TVs have, idk why you think it’s anything like BFI, there are no black frames inserted?)
I hope nobody calls BFI "motion smoothing". It is the opposite of it...

No, this is like saying 60Hz on a plasma is equivalent to some higher frame rate on an LCD. Nobody denies that FG and plasmas and CRTs improve fluidity, but they don’t improve latency like doubling the frame rate does.

I would not call a 60Hz plasma equivalent to a 120Hz LCD just because it has better motion clarity precisely because it still feels like 60Hz input lag.

Unlike most people that talk about plasma gaming, I actually have a plasma TV and despite the better motion handling it still feels like 60Hz (worse actually since it’s ancient and input lag is off the charts).
Plasmas and CRTs have better motion clarity with 60FPS at 60Hz than OLEDs and LCDs. In both cases there is a disconnect between input latency and motion clarity.
 
We've already discussed this. Marketing doesn't compare the cards this way. It was a one-off "attention grab" line for the CEO during the keynote, nothing more.
So Nvidia can lie in their marketing once and it’s fine, but if they put it in print that’s too far? This seems like a really strange line to draw lol, how about “Nvidia should be honest about performance”.
And there is no disconnect because you can easily find games where latency would be the same at the same fps without framegen.
genuinely, where? I’ve never seen a game where latency is the same at 120 as another game at 60. Latency is generally from TV lag (so consistent across all games played on that TV) plus however long it takes to deliver the next frame.
 
I hope nobody calls BFI "motion smoothing". It is the opposite of it...
What do you think the end result of BFI is? It’s smoothing motion by inserting black frames, it’s reducing motion blur.


Plasmas and CRTs have better motion clarity with 60FPS at 60Hz than OLEDs and LCDs. In both cases there is a disconnect between input latency and motion clarity.
I generally agree (way overblown btw, I’d invite you to actually try gaming on an old plasma), and that’s fine and I’d agree framegen has a similar effect where motion is smoother (although it has nothing to do with BFI, ironically framegen is the same technology as the other TV motion setting that interpolates frames, just a lot better), but I think it’s dishonest to claim 60 with FG is 120.

To complete the plasma CRT analogy that’s like claiming 60Hz on a plasma is like 120Hz on an LCD.
 
With traditional rendering, if a game runs at 60 it has a certain latency, and at 120 it’s roughly halved (at least excluding the TV’s latency). Framegen disconnects this: the frame rate will double while latency will not halve.

Agreed.

This feels strange to those used to playing at high frame rates.

This is the part I’m not getting. Why would it feel strange? It will just feel like 60. Do you mean your brain would say hey this “looks” way too smooth to be updating at what “feels” like 60 fps and get all confused?

That doesn’t make sense because there’s a bunch of different systems that update at different intervals in every game. So why isn’t the brain confused by that?
 
BFI and strobing are reducing the sample and hold time of the displayed frame. Both result in flickering. This is the opposite of "smoothing".

Using CRT and plasma is just an example of the disconnect between input latency and motion clarity that results from a different display technology. With LCD and OLED, rendering more frames has been the way to improve motion clarity in games.
The problem is that to get more frames, the whole host system has to deliver enough performance to keep the GPU fed. A lot of games even reduce this load to make it easier.

/edit: An example from reddit: With backlight strobing the LCD provides much better motion clarity than an OLED. Should we call these frames displayed on a sample and hold monitor fake, too?
 
genuinely, where? I’ve never seen a game where latency is the same at 120 as another game at 60.
Oh that's definitely a thing. Inherent rendering latencies for games can vary pretty hugely. At the faster end, you might get games with like 30-40ms at 60fps, but many games are running more like 60-80ms, and not uncommonly in the 80-120ms range (and you'd be surprised which ones).

So it's definitely possible that you could have one game at 120fps with higher input latency than another game at 60fps.

That said, when we're talking about the difference between 60fps and 240fps, I can see your argument where the mismatch could well be noticeable and jarring for a number of people. That's pretty extreme.

Also, it's not true that doubling framerate cuts input lag in half. Improving framerate reduces input lag, but not necessarily quite so linearly like that. And this will also vary a fair bit from game to game.
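
If it helps, here's a rough back-of-the-envelope sketch of why (the split between a fixed portion and a per-frame portion is an illustrative assumption, not a measurement): only the part of the chain tied to frame time shrinks when fps goes up, so the total never quite halves.

```python
# Rough input-latency model: only the frame-dependent part scales with fps.
# The fixed_ms and pipeline_frames values are illustrative assumptions.

def total_latency_ms(fps, pipeline_frames=2.0, fixed_ms=15.0):
    """fixed_ms covers peripheral/OS/display lag; pipeline_frames is how many
    frame intervals the engine and driver queue add on average."""
    frame_time = 1000.0 / fps
    return fixed_ms + pipeline_frames * frame_time

for fps in (60, 120, 240):
    print(f"{fps:>3} fps -> ~{total_latency_ms(fps):.0f} ms end to end")

# 60 fps -> ~48 ms, 120 fps -> ~32 ms (not 24), 240 fps -> ~23 ms
```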
 
Do you mean your brain would say hey this “looks” way too smooth to be updating at what “feels” like 60 fps and get all confused?
Yes. This is exactly it. This is why I don’t like using framegen, or at the very least I don’t think we can call these equivalent to real frames.


That doesn’t make sense because there’s a bunch of different systems that update at different intervals in every game. So why isn’t the brain confused by that?
Because character animation or networking or whatever isn’t updated by user input. Latency is like 99% the feel of the camera movement. That updates almost always in sync with whatever your frame rate is.

With backlight strobing the LCD provides much better motion clarity than an OLED. Should we call these frames displayed on a sample and hold monitor fake, too?
Uh yes, obviously yes. In that they don’t reduce input latency. They provide better motion clarity.

BFI and strobing are reducing the sample and hold time of the displayed frame. Both result in flickering. This is the opposite of "smoothing".
By reducing sample and hold they provide better motion clarity. Most call this a form of motion smoothing. Sample and hold motion either looks smeared or stuttery.
 
This is copied from the DF thread, but is relevant here as well.

Latency varies from game to game at the same frame rate. Red Dead Redemption 2 will have 60ms of latency at 120fps, while Call of Duty will have 30ms of latency at that very same 120fps. So, if your latency is not universally tied and fixed to the frame rate, then you can't state that this is the one true measure of performance.

Additionally, that very same Call of Duty will have 20ms of latency at 120fps with Reflex on; Vertical Sync on will increase latency at that very same 120fps, while off will drastically reduce it. Variable Refresh Rate will mildly increase latency too. So you have additional layers of tech further decoupling latency from frame rate.

Comparing the game to itself is also meaningless after all of these variables; the gamer will adjust to whatever he finds comfortable. He might play Call of Duty with frame generation at 180fps, Reflex on, with a latency of 30ms and find it acceptable. We can't say that frame generation increased his latency any more than Variable Refresh Rate increased his latency. Is testing with VRR now bad because it increases latency?

You simply can't draw the line at frame generation after all of these variables.

There are clear benefits to frame generation, such as improving the frame pacing of the game, increasing the fluidity of the presentation, and unlocking frame rates beyond what CPU-limited code allows (which is a widespread issue in gaming nowadays). It's simply a very good tool in the box to achieve good performance, just like all the tools listed above.
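
To put rough numbers on that decoupling (reusing the illustrative figures above, not measurements), here's a quick sketch: at the very same 120fps, different games are carrying very different amounts of latency when you count it in frame intervals.

```python
# Express the illustrative latency figures above in "frame intervals" at the
# same 120 fps, to show that fps alone doesn't determine responsiveness.
FPS = 120
frame_time_ms = 1000 / FPS  # ~8.3 ms per displayed frame

examples = {
    "Red Dead Redemption 2": 60,   # ms of latency at 120 fps
    "Call of Duty": 30,
    "Call of Duty + Reflex": 20,
}

for game, latency_ms in examples.items():
    frames = latency_ms / frame_time_ms
    print(f"{game:24s} {latency_ms} ms = about {frames:.1f} frame intervals")
```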
 
Having personally experienced framegen on a 1070MQ, 3080Ti, and a 4090, and also having enjoyed gaming almost every day of my life since my first experience on an Apple ][e back in the early/mid 80's, I'll say that I'm very happy with the tech and expect to continue using it for the foreseeable future.

I also personally see framegen as a sort of temporal antialiasing, with the "jaggies" in this effect being the longer frame times which require some smoothing. I personally equate those who are blasting it as "fake frames" with people from 25 years ago railing against AA, telling everyone to skip that blurry shit and just run at higher resolution -- just this time, it's a higher temporal resolution rather than viewport resolution.

The frames are real if I can see them. Less jaggies is better, temporal or otherwise.
 
I haven't encountered a game that supports DLSS framegen where I didn't think it was great. Maybe I'm old but I can't notice the increased input lag. But I can definitely notice the increased framerate.

Of course I can tell the latency difference between 120fps native and 60->120fps with FG. But it makes no difference to me in terms of latency if it's 60fps native or 60->120fps with generation, so I don't see the problem. I'd only turn off framegen if my framerate was already near my monitor's refresh rate with the settings I want.
 
BTW can you imagine what a game changer this would be if it were available on the next Switch? With the new DLSS upscaling and MFG I think a handheld could exceed Series S level quality. In my experience DLSS upscaling and FG really is that good. If Nintendo added anything to the Ampere chip in Switch2, they would've been very wise to add whatever is necessary for FG.
 
Now that we have Reflex and whatnot one manufacturer has an explicit advantage at all frame rates so it’s been decoupled a bit.
Nvidia isn't the only one with a latency-reducing feature; in fact, now that Intel has introduced XeLL, all three have them.
 
But it makes no difference to me in terms of latency if it's 60fps native or 60->120fps with generation
I actually played the Elden Ring DLC with Lossless Scaling; it's a locked 60Hz game so I was doing the 60-120 thing. Early on I had a particularly good fight where I landed more parries than I normally ever do, so I turned FG off to see how I went and I didn't do as well. For some reason I parry better with FG than native. I don't really get it, maybe I notice the tells I'm looking for when to parry more easily when it's smoother? Because latency hasn't improved, obviously. Maybe I need to find a soulslike that has an uncapped framerate, lock it at 60 and 120, and see how I go.

I have to go back and try Sekiro at 120 as well and see if my parrying in that improves at all. I use mouse and keyboard as well, so I do enjoy the smoother feel in general. I don't use it in competitive multiplayer games; the latency probably won't make a difference for me now I'm old, but every little thing helps trying to keep up with the young savants these days lol.
 
What do you think the end result of BFI is? It’s smoothing motion by inserting black frames, it’s reducing motion blur.
BFI and strobing are reducing the sample and hold time of the displayed frame. Both result in flickering. This is the opposite of "smoothing".
You are both correct. BFI improves the perception of motion. Unlike CRTs, LCDs are sample-and-hold, so within a 16.67ms window the LCD pixel will continue to hold a color static for the entire duration, and then instantly jump to a different frame at refresh. This is horrible. BFI reduces the sample duration. By inserting black frames, it tricks our brain into interpolating the space between the two visible frames, thereby creating the illusion of smoother motion. It's exactly why scanlines have an antialiasing effect on low-resolution content, except that is over space instead of time. Frame generation is actually inserting interpolated frames instead of having to trick our brain into doing it. With sufficient interpolated frames, the sample-and-hold artifacting will become almost imperceptible.
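
A quick way to see how both approaches attack the same artifact: per-image persistence is roughly the refresh interval times the fraction of it the image stays lit, so you can lower it either by blanking part of the cycle (BFI/strobing) or by showing more distinct images per second (frame generation). A minimal sketch, with an illustrative 50% BFI duty cycle:

```python
# Per-image persistence (a rough MPRT-style proxy): how long one image sits
# on screen. Lower persistence -> less perceived smearing while eye-tracking.
# The 0.5 lit fraction for BFI is an illustrative assumption.

def persistence_ms(displayed_fps, lit_fraction=1.0):
    return (1000.0 / displayed_fps) * lit_fraction

print(persistence_ms(60))        # 60 Hz sample-and-hold: ~16.7 ms per image
print(persistence_ms(60, 0.5))   # 60 Hz + BFI (half of each cycle black): ~8.3 ms
print(persistence_ms(120))       # 60 fps doubled to 120 by FG: ~8.3 ms
print(persistence_ms(240))       # 60 fps quadrupled to 240 by MFG: ~4.2 ms
```

The trade-off is just different: BFI buys its lower persistence with brightness (part of every cycle is black), while frame generation keeps full brightness and instead pays in latency and interpolation artifacts.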

I also personally see framegen as a sort of temporal antialiasing, with the "jaggies" in this effect being the longer frame times which require some smoothing. I personally equate those who are blasting it as "fake frames" with people from 25 years ago railing against AA, telling everyone to skip that blurry shit and just run at higher resolution -- just this time, it's a higher temporal resolution rather than viewport resolution.

The frames are real if I can see them. Less jaggies is better, temporal or otherwise.
100%. I would give you more % if I could but sadly in this context I am limited to 100.

Yes. This is exactly it. This is why I don’t like using framegen, or at the very least I don’t think we can call these equivalent to real frames.
I don't understand this. Let's take FG to its extreme and say that hypothetically it's interpolating 10000 frames every 16.67ms. Motion is now perfect, but the system still responds to inputs every 16.67ms. Are you saying that would feel weird? A higher sampling rate would be obviously and perceptibly better, but you're saying that 10000 fps motion with 60fps sampling feels worse than 60fps motion at 60fps input sampling? I won't challenge you, but have you actually done A/B comparisons? Are you sure it isn't input lag (which is horrible) that's bothering you rather than the input sampling rate?
 
Are you saying that would feel weird? A higher sampling rate would be obviously and perceptibly better, but you're saying that 10000 fps motion with 60fps sampling feels worse than 60fps motion at 60fps input sampling? I won't challenge you, but have you actually done A/B comparisons? Are you sure it isn't input lag (which is horrible) that's bothering you rather than the input sampling rate?
I can't generalize for others but for me, yes. It would feel weird to me to have essentially perfect motion that takes 0ms to flash a new image on the screen but my motions be delayed for 16.7 ms. I've done my own A/B comparisons, in that when I enabled framegen it felt weird and floaty and it didn't feel like a genuine FPS increase beyond just looking fluid. Not exactly scientific but here we are.

Input sampling rate would be intrinsically tied to input lag here, no? Input lag in this case (so excluding the things unaffected by FG like the lag from your peripheral and the lag to the TV) is pretty much just how fast the screen updates to reflect your input.

To draw away from the hypothetical, what FG offers is increased fluidity at the same input lag. I'm saying Nvidia (or anyone) should not say a 50 series card is faster than a 40 series card simply because it can quadruple frames instead of double frames. This is how Nvidia came up with the 5070 matching a 4090. I'm not a Luddite, and I think framegen is cool, but you aren't getting half the benefits of a higher framerate (i.e. lowered input latency and responsiveness).
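
To make that concrete, here's a toy comparison of what FG does and doesn't change (illustrative numbers, and it ignores the small extra delay real FG adds for interpolation): the image updates twice as often, but a new input still can't show up any sooner than the next rendered frame.

```python
# Toy comparison: how often the image updates vs. the soonest an input can
# be reflected on screen. Ignores FG's own small queuing cost.

def summarize(label, rendered_fps, fg_factor=1):
    displayed_fps = rendered_fps * fg_factor
    frame_interval = 1000 / displayed_fps   # how often a new image appears
    input_to_frame = 1000 / rendered_fps    # soonest an input reaches the screen
    print(f"{label:16s} new image every {frame_interval:.1f} ms, "
          f"input reflected after ~{input_to_frame:.1f} ms")

summarize("60 fps native", 60)
summarize("60 -> 120 FG", 60, fg_factor=2)
summarize("120 fps native", 120)
```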
 
Frame gen

Pros:
Improved motion smoothness or fluidity
Reduced motion blur (MPRT) without reducing brightness

Cons:
Increased latency
Reduced image quality/increased artifacts

It’s really pick your poison and reviewers will handle it differently. We know where HUB will land. 60 and 70 series will be super interesting if the trend continues.

They’ll probably like the DLSS and ray reconstruction improvements but not like frame generation, but the primary focus will be raster performance and $/frame, so they won’t be highly enthused.
 
Input sampling rate would be intrinsically tied to input lag here, no? Input lag in this case (so excluding the things unaffected by FG like the lag from your peripheral and the lag to the TV) is pretty much just how fast the screen updates to reflect your input.

I can’t help but think you’ve missed the info on Reflex 2. It makes the input polling rate the same as the display frame rate - including polling and updating a generated frame just before the frame buffer swap.
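
Roughly, as I understand it (a conceptual sketch only, not Nvidia's actual implementation or API):

```python
# Conceptual sketch of late-input reprojection ("frame warp"-style), not a
# real Reflex 2 interface. The idea: every presented frame, including the
# generated ones, gets shifted by the freshest input sample just before the
# buffer swap, so input is effectively sampled at the display rate.

def present_loop(frames, poll_input, warp, present):
    for frame in frames:                    # e.g. rendered + generated frames
        latest_input = poll_input()         # sample right before presenting
        present(warp(frame, latest_input))  # nudge the image toward the newest camera pose
```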
 
I can't generalize for others but for me, yes. It would feel weird to me to have essentially perfect motion that takes 0ms to flash a new image on the screen but my motions be delayed for 16.7 ms. I've done my own A/B comparisons, in that when I enabled framegen it felt weird and floaty and it didn't feel like a genuine FPS increase beyond just looking fluid. Not exactly scientific but here we are.

Input sampling rate would be intrinsically tied to input lag here, no? Input lag in this case (so excluding the things unaffected by FG like the lag from your peripheral and the lag to the TV) is pretty much just how fast the screen updates to reflect your input.
No, lag and rate are different. For example, if I have a really crappy peripheral it could take 10 seconds for the first button press to trigger an event in the game engine, but every subsequent button press could get queued up every 16ms after your first press and trigger events at a steady 16ms rate at the game engine, except that each event is delayed 10 seconds from when you pressed the button. That would feel awful, but it's really an input lag issue and not a sampling rate issue. In computer architecture terms it's a latency problem not a throughput/rate problem. For most people a "small" amount of *consistent* lag is something that the brain can adapt to and learn to ignore. A/B'ing will remind you immediately, but play with a small lag for a while and your brain will acclimatize to it. But of course, lower lag is strictly better.

Since your A/B comparison was with FG on-vs-off I suspect (but of course can't confirm) that what you're perceiving is just vanilla input lag, and it's not a sample rate issue. If true, then the "floatiness" you feel should be no worse with 3x or 4x FG compared to 2x FG because the lag doesn't change much. But all FG will feel more disconnected than no-FG (all other factors being equal) and especially so if you're playing a mouse-based game.
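
If it helps, the same latency-vs-rate point as a toy timeline (the 10-second delay is deliberately absurd, as in the example above):

```python
# A big constant delay shifts every event late (latency) without changing
# how often events arrive at the engine (rate/throughput).
press_times_ms = [0, 16, 32, 48, 64]   # you press a button every 16 ms
LAG_MS = 10_000                        # absurd constant peripheral delay

event_times_ms = [t + LAG_MS for t in press_times_ms]

gaps = [b - a for a, b in zip(event_times_ms, event_times_ms[1:])]
print(gaps)                # [16, 16, 16, 16] -> the rate is unchanged
print(event_times_ms[0])   # 10000 -> but every event lands 10 s late
```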
 