Value of Hardware Unboxed benchmarking

Also, you need to define what "feel" means. Is "feel" based on the number of individual frames your eyes see? Animation frame rate? Mouse latency?
The last one.


It's not a performance improvement in the classic sense, but it seems like a net win to me, assuming those generated frames are any good of course.
This is exactly what I believe.
that the idea that MFG would somehow produce a clear "latency disconnect" is just wrong.
The game looking like 120 FPS while feeling like it’s not that produces a disconnect. I definitely can feel it and I suspect most people who usually run at high frame rates can feel it.


No, it's not. You will be seeing more frames on a 50 series card. Yes, you won't get better latency, but since you're comparing FG to MFG that's expected anyway.
When Nvidia said “5070 is about equivalent to a 4090” I thought it actually performed on par with a 4090 without framegen. This is what most people think when Nvidia says this.

I don’t think it’s wrong to advertise the new frames, but if someone advertises a TV as “240Hz” when it’s actually 120 with interpolation, most people would call that deceptive (indeed, some manufacturers do this lol).
 
...


If you take a single game and run it at 40 FPS interpolated to 80 FPS, it will feel like 40 FPS and not 80 FPS, due to latency. To get a real 80 FPS, yes, you’d have to lower settings, but I think we’re making this hypothetical more complicated than it needs to be: with FG you get more fluidity at the same latency, while with more raw frames you get more fluidity at lower latency. This is the point I am making, and that’s why they aren’t directly comparable.
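To put rough numbers on that, here's a back-of-envelope sketch with made-up figures, assuming fluidity tracks the displayed frame interval while latency tracks the rendered one, and ignoring frame gen's own overhead:

```python
# Hypothetical numbers; assumes input latency follows the rendered frame
# interval and perceived fluidity follows the displayed frame interval.

def frame_interval_ms(fps: float) -> float:
    """Time between frames in milliseconds."""
    return 1000.0 / fps

scenarios = {
    # name: (rendered fps, displayed fps)
    "40 FPS native": (40, 40),
    "40 FPS + 2x frame gen": (40, 80),
    "80 FPS native": (80, 80),
}

for name, (rendered, displayed) in scenarios.items():
    print(f"{name:22s}  fluidity interval {frame_interval_ms(displayed):5.1f} ms"
          f" | latency-relevant interval {frame_interval_ms(rendered):5.1f} ms")
```

Under that simplification FG only moves the first column; more raw frames move both. That's exactly the distinction above.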

Say I have game X. I test it on Nvidia with Reflex enabled. I get 100 fps and about 20 ms of latency. Then I run it on another brand of GPU and I get 115 fps, but since it doesn't have Reflex I get 40 ms of latency. Which GPU is performing better?
 
Say I have game X. I test it on Nvidia with Reflex enabled. I get 100 fps and about 20 ms of latency. Then I run it on another brand of GPU and I get 115 fps, but since it doesn't have Reflex I get 40 ms of latency. Which GPU is performing better?
The latter for fluidity (although not by much) and the former for latency. Is this supposed to be a gotcha?

For what it’s worth I’d much rather be running the former. Actually I’d say the former is probably the better performing one because a 15 fps difference is going to be dwarfed by halving the latency.
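For a sense of scale with those hypothetical numbers (nothing here beyond the figures quoted above):

```python
# Hypothetical figures from the example: 100 fps / 20 ms vs 115 fps / 40 ms.

def frame_interval_ms(fps: float) -> float:
    return 1000.0 / fps

reflex_fps, reflex_latency = 100, 20   # card with Reflex
other_fps, other_latency = 115, 40     # card without Reflex

interval_gap = frame_interval_ms(reflex_fps) - frame_interval_ms(other_fps)
print(f"Per-frame interval gap: {interval_gap:.1f} ms")     # ~1.3 ms per frame
print(f"Latency gap: {other_latency - reflex_latency} ms")  # 20 ms
```

The 15 fps gap works out to roughly 1.3 ms per frame, which is small next to the 20 ms latency difference; that's the sense in which it gets dwarfed.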
 
This is exactly what I believe.

Ok.

The game looking like 120 FPS while feeling like it’s not that produces a disconnect. I definitely can feel it and I suspect most people who usually run at high frame rates can feel it.

Sorry, but I don’t see how this is a problem. Input, animation, rendering, physics, etc. rates aren’t guaranteed to be in sync.

Frame gen isn’t significantly extending the duration of how long your eyes are seeing a particular slice of game time. It’s just carving up the same slice and showing you interpolated steps within that time frame. So your brain should only see smoother motion. It shouldn’t “feel” any different.
 
Sorry, but I don’t see how this is a problem. Input, animation, rendering, physics, etc. rates aren’t guaranteed to be in sync.
They are in most games or they do a very good job of hiding it.


So your brain should only see smoother motion. It shouldn’t “feel” any different.
You’re just restating what I said: it looks smoother but feels like lower FPS content.

I’m not saying framegen is bad; I’m saying it’s not going to feel like an authentic high frame rate experience. Most people don’t care about this and are fine with 60Hz latency. But it should still be noted in benchmarks.
 
The latter for fluidity (although not by much) and the former for latency. Is this supposed to be a gotcha?

For what it’s worth I’d much rather be running the former. Actually I’d say the former is probably the better performing one because a 15 fps difference is going to be dwarfed by halving the latency.

Just a genuine question, because it's not actually super uncommon. I agree that input latency is a performance metric, and it's usually entirely ignored in GPU reviews. I think frame times are performance, frame pacing is performance, GPU power is performance. Then I circle back to wondering if frame generation is fairly called performance or not.

The issue for sites like HUB is they want to compare apples to apples, so they want upscaling off and frame gen off, to make the displayed image as similar as possible, and then infer performance from benchmarks with that starting point. But is that actually too disconnected from the user experience, and from what a typical user would experience as performance? Maybe it depends on how much of a pixel peeper you are, or how sensitive to latency you are. Truth is, there's no singular answer. I think it's good that HUB is opinionated and clearly defines what they care about, and then we can choose to value their content or not based on those preferences.

It gets weird because in the past reviews really fought it out over $/frame value, or just absolute number of frames, and a 15% margin in frames could be declared a "winner," but it was never the entire story. Performance was framerate and value was $/frame and that was it. Latency still never really enters the picture except as a footnote about features.
 
Just a genuine question, because it's not actually super uncommon. I agree that input latency is a performance metric, and it's usually entirely ignored in GPU reviews. I think frame times are performance, frame pacing is performance, GPU power is performance. Then I circle back to wondering if frame generation is fairly called performance or not.

The issue for sites like HUB is they want to compare apples to apples, so they want upscaling off and frame gen off, to make the displayed image as similar as possible, and then infer performance from benchmarks with that starting point. But is that actually too disconnected from the user experience, and from what a typical user would experience as performance? Maybe it depends on how much of a pixel peeper you are, or how sensitive to latency you are. Truth is, there's no singular answer. I think it's good that HUB is opinionated and clearly defines what they care about, and then we can choose to value their content or not based on those preferences.

It gets weird because in the past reviews really fought it out over $/frame value, or just absolute number of frames, and a 15% margin in frames could be declared a "winner," but it was never the entire story. Performance was framerate and value was $/frame and that was it. Latency still never really enters the picture except as a footnote about features.
Well, we didn’t have to benchmark latency before because, everything else being equal, frame rate and latency were essentially one and the same. Now that we have Reflex and whatnot, one manufacturer has an explicit advantage at all frame rates, so it’s been decoupled a bit.

That said, Nvidia is comparing this to older cards from the 40 series, which can all enable Reflex, so it’s apples to apples. My problem remains that treating interpolated frames as equal to ‘real’ frames is inaccurate because they don’t reduce latency.

Steve from Hardware Unboxed plays a lot of competitive shooters, it seems, so it makes sense he’d have the same concerns. If latency never troubles you and you only increase frame rates to get a more fluid image, then FG might as well be real frames to you.
 
They are in most games or they do a very good job of hiding it.

There’s no hiding. That’s just how game engines work. Systems aren’t coupled to rendering.

You’re just restating what I said: it looks smoother but feels like lower FPS content.

I was addressing your concern about there being a disconnect caused by framegen. There’s no disconnect. 40 fps animation will feel like 40 fps animation no matter the frame rate. More displayed frames don’t make that any different/worse, which you seem to be implying.
 
More displayed frames don’t make that any different/worse, which you seem to be implying.
It also doesn’t make it better, which is my entire point…

A genuine 120 fps will feel better than 60 interpolated to 120 because the latter has the latency of a 60Hz frame rate. I don’t understand why this is a point of contention. Anyone that has used framegen can feel this lol.
40 fps animation will feel like 40 fps animation no matter the frame rate.
What do you mean animation? You don’t feel character animation, you feel responses to input. For most games this is like 99% camera movement.
 
Well, we didn’t have to benchmark latency before because, everything else being equal, frame rate and latency were essentially one and the same. Now that we have Reflex and whatnot, one manufacturer has an explicit advantage at all frame rates, so it’s been decoupled a bit.

That said, Nvidia is comparing this to older cards from the 40 series, which can all enable Reflex, so it’s apples to apples. My problem remains that treating interpolated frames as equal to ‘real’ frames is inaccurate because they don’t reduce latency.

Steve from Hardware Unboxed plays a lot of competitive shooters, it seems, so it makes sense he’d have the same concerns. If latency never troubles you and you only increase frame rates to get a more fluid image, then FG might as well be real frames to you.

I guess you could boil frames down to improving the smoothness of a moving image, reducing motion blur, and reducing latency (unless you hit ~98% GPU utilization). Frame gen improves two of the three. I have no idea if reviewers should call that performance or not. It's either performance, performance with a caveat, or not performance. The last option seems a little hard to argue, to be honest. The middle one feels the most appropriate.
 
A genuine 120 fps will feel better than 60 interpolated to 120 because the latter has the latency of a 60Hz frame rate. I don’t understand why this is a point of contention. Anyone that has used framegen can feel this lol.
This is a point of contention because you're creating a situation which won't appear in practice. Some path traced game running on a 5090 at 60 fps and boosted to 240 via MFG will not be able to run at 240 fps without MFG on the same GPU without severe cuts to rendering quality. Which means you will not be able to compare these frame rates directly, and the idea that someone could "notice" that the MFG fps has some input lag which "should" be a lot lower simply won't happen in reality, as you won't ever see that lower lag. More than that, there is no inherent "framegen lag" which would immediately tell you that a game is using framegen. Loads of games have latency HIGHER than what you get with framegen while not using any framegen.

Hence why we're trying to tell you that there won't be any "latency disconnect" - if said latency is low enough for you to play the game. Otherwise it just feels like a game with high latency - nothing framegen specific about that.
 
I guess you could boil frames down to improving the smoothness of a moving image, reducing motion blur, and reducing latency (unless you hit ~98% GPU utilization). Frame gen improves two of the three. I have no idea if reviewers should call that performance or not. It's either performance, performance with a caveat, or not performance. The last option seems a little hard to argue, to be honest. The middle one feels the most appropriate.
I agree.
 
This is a point of contention because you're creating a situation which won't appear in practice. Some path traced game running on a 5090 at 60 fps and boosted to 240 via MFG will not be able to run at 240 fps without MFG on the same GPU without severe cuts to rendering quality.
I agree, that’s why we have frame gen. But that has nothing to do with my point; I’m saying you shouldn’t use interpolated frames to argue a 5070 is as good as a 4090, because the latter is going to have much lower latency, since only half of its frames are interpolated instead of three quarters.
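Illustrative arithmetic only, not actual 4090 or 5070 benchmark numbers; the assumption is that both cards land at the same displayed frame rate and that base latency scales with the rendered frame interval:

```python
# Both cards assumed to display ~120 fps: one via 2x FG, one via 4x MFG.
# Latency is crudely modeled as two rendered-frame intervals; frame gen's
# own queuing overhead is ignored for simplicity.

displayed_fps = 120

for name, factor in [("2x FG (half the frames interpolated)", 2),
                     ("4x MFG (three quarters interpolated)", 4)]:
    rendered_fps = displayed_fps / factor
    approx_latency_ms = 2 * 1000.0 / rendered_fps
    print(f"{name}: renders {rendered_fps:.0f} fps, ~{approx_latency_ms:.0f} ms")
```

Same number on screen, roughly double the base latency for the 4x case.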
 
that there won't be any "latency disconnect"
That’s interesting, because we already have framegen and I can easily tell when it’s enabled: if I compare it to a better card that can run those frame rates without interpolating, or if I drop quality settings, it feels better. Multi frame gen is the same thing.
 
because if I compare it to a better card
Which no regular user will be able to do, ever. So again you're constructing a situation which has a very low possibility of happening, especially with the new MFG feature, which could drive your fps up by some 3-4x - not many games are able to scale that well on the same h/w even with settings. And such a comparison would essentially be like comparing a very low preset with ultra - you do get worse latency from the latter too, and I haven't seen anyone saying that there's a "latency disconnect" between low and high settings in some game.
 
Which no regular user will be able to do, ever.
Dude, I’m not talking about end users, I’m talking about marketing comparing two cards. Nvidia says 5070 = 4090, but it isn’t and never will be, because they’re using frame quadrupling to make this comparison. The former will have twice the latency of the latter.

You are claiming the end user won’t notice and yeah sure maybe not, but it’s still deceptive marketing.
I haven't seen anyone saying that there's a "latency disconnect" between low and high settings in some game.
I don’t think you are following what I’m saying at all. If you increase your settings and lower your frame rate, of course latency increases; it moves in step with your frame rate. They are coupled, connected. When you start doubling or quadrupling via interpolation, your latency is basically decoupled from the frame rate you see on the screen; it looks like twice the FPS but doesn’t feel like it. That’s the disconnect.
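A small sketch of that decoupling, under the simplistic assumption that latency is roughly two rendered-frame intervals (real games vary a lot):

```python
# Hypothetical model: latency ~ two rendered-frame intervals.

def approx_latency_ms(rendered_fps: float) -> float:
    return 2 * 1000.0 / rendered_fps

# Traditional rendering: displayed fps == rendered fps, so latency drops as fps climbs.
for fps in (60, 120, 240):
    print(f"native {fps:3d} fps displayed -> ~{approx_latency_ms(fps):.0f} ms")

# Interpolation: displayed fps climbs, but rendered fps (and thus latency) stays put.
base_fps = 60
for factor in (2, 4):
    print(f"{base_fps} fps x{factor} = {base_fps * factor:3d} fps displayed"
          f" -> ~{approx_latency_ms(base_fps):.0f} ms")
```

The on-screen number climbs in the second loop while the latency stays put; that’s the "looks like twice the FPS but doesn’t feel like it" part.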
 
It also doesn’t make it better, which is my entire point…

Agreed.

A genuine 120 fps will feel better than 60 interpolated to 120 because the latter has the latency of a 60Hz frame rate. I don’t understand why this is a point of contention. Anyone that has used framegen can feel this lol.

Agreed.

What do you mean animation? You don’t feel character animation, you feel responses to input. For most games this is like 99% camera movement.

I’m really unclear on what you’re arguing. You said earlier that framegen causes a disconnect because the rendering speed is out of sync with other game systems (animation, input, etc.). My point is that framegen won’t make that any worse, as those systems are already disconnected anyway. I agree with you that framegen doesn’t make it any better if you’re below the acceptable “tick rate”.
 
This is a point of contention because you're creating a situation which won't appear in practice. Some path traced game running on a 5090 at 60 fps and boosted to 240 via MFG will not be able to run at 240 fps without MFG on the same GPU without severe cuts to rendering quality. Which means you will not be able to compare these frame rates directly, and the idea that someone could "notice" that the MFG fps has some input lag which "should" be a lot lower simply won't happen in reality, as you won't ever see that lower lag. More than that, there is no inherent "framegen lag" which would immediately tell you that a game is using framegen. Loads of games have latency HIGHER than what you get with framegen while not using any framegen.

Hence why we're trying to tell you that there won't be any "latency disconnect" - if said latency is low enough for you to play the game. Otherwise it just feels like a game with high latency - nothing framegen specific about that.
It doesn't have to be about a specific game. If somebody is very accustomed to the fluidity and feel of higher frame rates, the fact that they can't hit 240 fps in CP2077 with path tracing properly won't change that they still know what 240 fps should feel like, roughly.

I can get that perspective, and how that might well be a flaw for those truly accustomed to such high frame rates. And that does seem like it's going to be one of the main target audiences here.
 
Dude, I’m not talking about end users, I’m talking about marketing comparing two cards. Nvidia says 5070 = 4090, but it isn’t and never will be, because they’re using frame quadrupling to make this comparison. The former will have twice the latency of the latter.
We've already discussed this. Marketing doesn't compare the cards this way. It was a one-off "attention grab" line from the CEO during the keynote, nothing more. You're still insisting on this while nowhere on their website or in their posts on Blackwell do they even compare the 5070 to anything but the 4070. Feel free to check.

If you increase your settings and lower your frame rate, of course latency increases; it moves in step with your frame rate. They are coupled, connected.
To a degree. Latency is only partially a function of the on-screen frame rate. It's usually not a linear connection.

When you start doubling or quadrupling via interpolation, your latency is basically decoupled from the frame rate you see on the screen; it looks like twice the FPS but doesn’t feel like it. That’s the disconnect.
And there is no disconnect because you can easily find games where latency would be the same at the same fps without framegen.

We're going in circles.
Your point is that gamers are checking how every game with FG works without FG while hitting the same output frame rate as with it. I've basically NEVER done that, and I'm an enthusiast. With MFG it is likely to be impossible even.
The only thing which matters to me is the answer to a simple question: "is the current input latency with framegen active good enough for me to play this game?"
If the answer is "yes" then all is fine, no "disconnect" of anything anywhere.
If the answer is "no" then I'm lowering graphical settings until it becomes "yes". Never ever during the last 2 years on a 4090 have I decided to disable framegen for that, because framegen gives me much higher motion fluidity at the expense of pretty much nothing. There are no reasons not to use framegen if you're hitting good enough latency - and if you're not, then you're better off reducing settings with framegen still active.
So the comparison of FG vs MFG makes total sense, as this is exactly how I would be playing games on a 4070 vs 5070 - FG on vs MFG on, similar game settings.
 
You said earlier that framegen causes a disconnect because the rendering speed is out of sync with other game systems (animation, input, etc.).
I don’t remember saying this at all. I’m saying framegen disconnects latency from frame rate.

With traditional rendering, if a game runs at 60 it has a certain latency, and at 120 it’s roughly halved (at least excluding the TV’s latency). Framegen disconnects this: the frame rate will double while the latency will not halve. This feels strange to those used to playing at high frame rates.

So someone who’s experienced 120Hz gaming will be able to tell pretty easily when framegen is in use, because while it may look like 120, it doesn’t feel as such. Someone who’s never played at a consistent 120/144 before won’t understand this and probably won’t notice.
 