Nvidia Geforce RTX 5090 reviews

No, because the thumbnail says it all. FG works better at low frames, too. Play a game at 30 FPS and then play it with FG. Nobody would take the 30 FPS option. And what does "works well in some content at 60 FPS baseline" even mean? It always works "well" at every baseline, because FG does not send the same frame to the display again...
 
So your take is that framegen on is always better than framegen off and therefore any criticism should be dismissed? We know how the tech works and that’s obviously not true. He shared footage of Alan Wake that’s been highlighted by others as well where framegen clearly makes the image worse.
 
Did you guys actually watch the video? He also says it works well in some content at 60fps baseline.
The problem is that I disagree with the claim that "some content" works well at a 60 FPS baseline.
My own experience with FG suggests that "some content" works well at a 40 FPS baseline.
At 60, meanwhile, it's not "some content", it's pretty much "all content".
 
No, because the thumbnail says it all. FG works better at low frames, too. Play a game at 30 FPS and then play it with FG. Nobody would take the 30 FPS option. And what does "works well in some content at 60 FPS baseline" even mean? It always works "well" at every baseline, because FG does not send the same frame to the display again.
That's incorrect. Your latency would then be noticeably worse than at 30 FPS without Frame Gen, because the rendering cost of frame gen would push the game below 30 FPS. I've tried FSR Frame Gen in Ratchet and Clank at a locked 60 FPS and it was unplayable.
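To put rough numbers on that claim: frame generation has a per-frame cost before any frames come out, and it eats into the base framerate. A quick sketch (the ~3 ms cost is an assumed figure for illustration, not a measured number for any particular GPU or game):

```python
# Assumed fixed per-frame generation cost added to every rendered frame.
# With that overhead, the effective base framerate drops before FG
# multiplies anything.

def base_fps_with_fg(base_fps: float, fg_cost_ms: float) -> float:
    """Effective base framerate once frame-gen overhead is added."""
    frametime_ms = 1000.0 / base_fps
    return 1000.0 / (frametime_ms + fg_cost_ms)

for fps in (30, 60, 120):
    print(f"{fps} FPS base -> {base_fps_with_fg(fps, 3.0):.1f} FPS with ~3 ms overhead")
# 30 FPS base -> 27.5 FPS with ~3 ms overhead
# 60 FPS base -> 50.8 FPS with ~3 ms overhead
# 120 FPS base -> 88.2 FPS with ~3 ms overhead
```

So a "locked 60 FPS" with FG on is really interpolating from a ~50 FPS base under this assumption, which fits the Ratchet and Clank experience above.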

Tim's review is excellent and demonstrates why so many people are skeptical towards this technology. It's great in some scenarios, like taking a base framerate of over 60 FPS and pushing that to 120 FPS on a 120 Hz monitor, but it doesn't make a path-traced game running at ~35 fps playable on a lower-end GPU. And MFG is great for people with 240 Hz monitors, but for others it's not of much use.
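The pairing of multiplier and monitor is just arithmetic: the base framerate has to be the refresh rate divided by the FG factor. A minimal sketch of that:

```python
# Base framerate a game must sustain so that base * FG multiplier
# lands on the display's refresh rate (plain arithmetic, no vendor API).

def required_base_fps(refresh_hz: int, multiplier: int) -> float:
    return refresh_hz / multiplier

for hz in (120, 144, 240):
    for mult in (2, 3, 4):  # 2x = FG, 3x/4x = MFG
        print(f"{hz} Hz at {mult}x -> needs ~{required_base_fps(hz, mult):.0f} FPS base")
```

Which is why 2x makes sense on a 120 Hz monitor from a 60 FPS base, while 4x only really pays off at 240 Hz.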
 
The problem is that I disagree with the claim that "some content" works well at a 60 FPS baseline.
My own experience with FG suggests that "some content" works well at a 40 FPS baseline.
At 60, meanwhile, it's not "some content", it's pretty much "all content".

So what should he say instead? He was super clear that your experience will vary based on a bunch of factors, and he pointed out where it worked well for him with no noticeable artifacts or latency impact. The one thing he was adamant about is that framegen sucks at a 30 fps base, though, and maybe that's not true for every title, but it's likely true in general.

I don’t think you can claim framegen works well in pretty much all content at 60fps. The tech isn’t magic and does not interpolate all motion perfectly even at 60.
 
So your take is that framegen on is always better than framegen off and therefore any criticism should be dismissed? We know how the tech works and that’s obviously not true. He shared footage of Alan Wake that’s been highlighted by others as well where framegen clearly makes the image worse.
No, criticism is alright. It just has to happen within context.

Taking FG out of the "system" and analysing it in isolation is the problem. For me, FG is a solution to a lot of problems that exist while playing a game, such as overdrive at lower framerates, staying outside of the VRR range, and sample-and-hold. All these problems result in higher latency (with V-Sync), tearing, flickering (VRR flickering, for example), inverse overdrive artefacts, LFC, etc.

FG is another tool for displaying better frames. My LG monitor supports LFC, and its real VRR range is around 60 to 160 FPS. So at 30 FPS my GPU is already sending two additional fake frames to my monitor. And even though my monitor shows "90 Hz", only 1/3 of the displayed FPS is real. Why wouldn't I exchange these two identical images for two properly generated ones and get a much better transition from the base 30 FPS on my LG monitor? Faster displays can have a real VRR range of only 120 to 240 FPS. MFG can provide proper frames to get to 240 FPS, so gamers don't have to rely on an additional worthless identical frame just to stay within the real VRR range at a baseline of 60 FPS...
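For anyone unfamiliar with the LFC maths being described: below the VRR floor the driver repeats each frame until the effective rate is back inside the window. A minimal sketch (the exact multiplier choice is driver-dependent; the monitor above evidently picks 3x):

```python
# Integer repeat factors that put base_fps * m inside the VRR window.
# LFC picks one of these; which one is up to the driver/monitor.

def lfc_multipliers(base_fps: float, vrr_min: float, vrr_max: float) -> list[int]:
    return [m for m in range(1, 10) if vrr_min <= base_fps * m <= vrr_max]

base = 30
options = lfc_multipliers(base, 60, 160)
print(options)  # [2, 3, 4, 5]; at 3x the panel runs at 90 Hz, 1/3 of frames real
```

The argument above is that if the display is repeating frames anyway, those repeats might as well be generated in-between frames instead of identical copies.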

That's incorrect. Your latency would then be noticeably worse than at 30 FPS without Frame Gen, because the rendering cost of frame gen would push the game below 30 FPS. I've tried FSR Frame Gen in Ratchet and Clank at a locked 60 FPS and it was unplayable.
FG doesn't reduce the baseline performance. It increases latency in order to properly generate and pace the frame between two normal ones. But this is a good example: normally you would play at 30 Hz with V-Sync on. Now compare this to 60 FPS (with FG) on a 240 Hz display. It will be so much better, despite rendering a new frame only every 33.3 ms.
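Latency-wise, interpolation-style 2x FG can't show real frame N until frame N+1 exists, so presentation slips by roughly half a base frametime plus the generation cost. A very rough sketch (illustrative numbers, not measurements):

```python
# Crude latency model for 2x interpolation FG: the real frame is held
# back ~half a base frametime (pacing) plus the generation cost.

def fg_added_latency_ms(base_fps: float, gen_cost_ms: float = 3.0) -> float:
    frametime = 1000.0 / base_fps
    return frametime / 2 + gen_cost_ms

for fps in (30, 60):
    print(f"{fps} FPS base: ~{fg_added_latency_ms(fps):.1f} ms extra latency, "
          f"new real frame every {1000 / fps:.1f} ms")
# 30 FPS base: ~19.7 ms extra latency, new real frame every 33.3 ms
# 60 FPS base: ~11.3 ms extra latency, new real frame every 16.7 ms
```

That's the trade both sides here are weighing: smoother presentation on a high-refresh panel versus a real frame cadence that never gets faster.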
 
I don’t think you can claim framegen works well in pretty much all content at 60fps. The tech isn’t magic and does not interpolate all motion perfectly even at 60.
I can because this is what I see with my own eyes. No amount of FUD from HUB will change that.
60 FPS is enough for 99% of content. Only high-speed, MP-focused shooters would need more - but they need more without FG regardless, so it's a nothingburger issue.
Claiming that MFG would be great for turning 250 FPS into 1000 is a bit like saying "you know, it's useless now, but maaaaaaybe" - which is just plain not true.
 
Edit: While we're on this topic - FG 4.0 does seem to have some sort of compatibility with Reflex 2 frame warp tech. I'm still unsure how it would work with (M)FG though, considering how the frames are generated and presented. Maybe it's a "turning warp off when FG is active" sort of compatibility.
Without warping Reflex 2 would be just Reflex
 
We don't need to, because NVIDIA has told us how it works. Without warping it's just JIT submission for rendering, just like Reflex.
Again, we don't know how FG will work with Reflex 2 since there are no games implementing both at the moment.
My statement was just a theory since I can't imagine how frame warp would work with FG - but hey maybe they'll use warped frames as interpolation source, who knows.
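To make "warping" concrete: the idea is to take the finished frame and shift it by whatever camera rotation happened after it was submitted, just before scanout. A toy sketch of that reprojection (an illustration of the concept only, not NVIDIA's actual Reflex 2 implementation, which also inpaints the revealed edges):

```python
import math

# Horizontal pixel shift approximating a late yaw rotation under a
# standard perspective projection: x = (w/2) * tan(yaw) / tan(hfov/2).

def warp_shift_px(yaw_delta_deg: float, hfov_deg: float, width_px: int) -> float:
    return (width_px / 2) * math.tan(math.radians(yaw_delta_deg)) / math.tan(math.radians(hfov_deg / 2))

# e.g. 0.5 degrees of mouse movement landing after the frame rendered,
# 90 degree horizontal FOV, 2560 px wide output
print(f"shift: {warp_shift_px(0.5, 90.0, 2560):.1f} px")  # ~11.2 px
```

How (or whether) a warped frame could then feed the interpolator is exactly the open question.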
 
I can because this is what I see with my own eyes. No amount of FUD from HUB will change that.

You can certainly speak to your own experience but knowing how the tech works it’s clearly far from perfect.

Claiming that MFG would be great for turning 250 FPS into 1000 is a bit like saying "you know, it's useless now, but maaaaaaybe" - which is just plain not true.

It’s more like acknowledging there are still issues which would largely be eliminated at higher render rates. Everything doesn’t have to be one extreme or the other.
 
Again, we don't know how FG will work with Reflex 2 since there are no games implementing both at the moment.
My statement was just a theory since I can't imagine how frame warp would work with FG - but hey maybe they'll use warped frames as interpolation source, who knows.
I wasn't commenting on how FG and Reflex 2 interact, just that your suggested "turning warp off when FG is active" sort of compatibility isn't a maybe, since without warping there's no Reflex 2.
 
This is the take from the YouTube channel Optimum in relation to MFG:

Playing games is different from watching videos of games. To make a proper statement about FG/MFG you have to actually play a game. And with high-speed displays coming to the market, the perception of FG will rapidly change.
 
Imo a good review, and a good summary that you need at least 70-80 fps already to use MFG and have a good experience. Also, the more frames are generated, the more noticeable the temporal artifacts become. It's far from being any killer feature.
 
Not 100% sure this is the right spot, but does anyone know if Best Buy is going to give paying members a perk when trying to get the 50-series FE cards next week?
 