Nvidia GeForce RTX 5090 reviews

This is the take from the YouTuber Optimum in relation to MFG:

Playing games is different from watching videos of games. To make a proper statement about FG/MFG you have to actually play a game. And with high-speed displays coming to market, the perception of FG will change rapidly.
I came to the same conclusion as this guy when using FG on my 4070. Any drawbacks were easily overshadowed by the doubling of the framerate.

After watching Tim's Hardware Unboxed video, those artifacts do seem more distracting at 4x FG, since instead of 1 in 2 frames artifacting it's 3 out of every 4. Still, it's hard to overstate the impact of quadrupling your framerate.
 
I think HUB's conclusions about the use cases for frame gen are about 95% correct. The only thing I'd add is that, since enabling frame gen does lower your base frame rate, I'd work around that.

If I had a 360Hz VRR monitor, I'd enable Reflex and Nvidia Control Panel/App vsync. That would automatically set a frame cap. Then I'd set 3x frame gen and start adjusting in-game settings and DLSS upscaling until I was consistently at my cap and below 98% GPU usage. That should pretty much guarantee the highest possible base framerate for that fps limit. That's the scenario where, with optimal fps scaling, you could really see how much latency frame gen adds. You can already see this in CPU-limited games like Microsoft Flight Simulator.

Same process for 240Hz with 2x and 480Hz with 4x (rough numbers sketched below).
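
For anyone who wants to sanity-check those targets, here's a rough Python sketch. The cap formula (refresh - refresh²/3600) is the community-derived approximation of what Reflex + driver vsync sets; Nvidia doesn't document the exact behavior, so treat it as an assumption:

```python
# Community-derived approximation of the Reflex + driver-vsync frame cap.
# Not an official Nvidia formula -- treat the numbers as ballpark only.

def reflex_cap(refresh_hz: float) -> float:
    """Approximate fps cap applied by Reflex + NVCP/App vsync on a VRR display."""
    return refresh_hz - refresh_hz ** 2 / 3600.0

def base_fps_target(refresh_hz: float, fg_factor: int) -> float:
    """Base (pre-FG) framerate needed so the FG output lands right at the cap."""
    return reflex_cap(refresh_hz) / fg_factor

for hz, factor in [(240, 2), (360, 3), (480, 4)]:
    print(f"{hz}Hz, {factor}x FG: cap ~{reflex_cap(hz):.0f} fps, "
          f"base target ~{base_fps_target(hz, factor):.0f} fps")
```

For 240/360/480Hz that gives caps of roughly 224/324/416 fps and base targets of roughly 112/108/104 fps, which matches the ~225 cap I see on my own 240Hz monitor.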

This is essentially how I already use my 240Hz monitor. I bought it to play at 240Hz in as many games as possible, and I know Reflex + Nvidia vsync will cap my frames around 225, so I adjust DLSS and in-game settings until I hit 225. Frame gen basically adds one more step: selecting a scaling factor. It would be good for some reviewers/creators to start making videos about how to optimize it, and to test it in scenarios like 180->360 or 240->480, since some games have engine caps of 300 and so on. I want to see whether FG bottlenecks when the base frame rate is really, really high.

If I had the equipment, like an LDAT, a 480Hz display and a 40- or 50-series card, I'd make the video myself. This seems like the kind of thing Battle(non)sense would have made, or maybe Optimum would make it. DF is more focused on "ultra settings" and ray tracing, so I'm not sure they'd make a video about optimizing latency with FG. @Dictator ?
 
I think HUB's conclusions about the use cases for frame gen are about 95% correct. The only thing I'd add is that, since enabling frame gen does lower your base frame rate, I'd work around that.

If I had a 360Hz VRR monitor, I'd enable Reflex and Nvidia Control Panel/App vsync. That would automatically set a frame cap. Then I'd set 3x frame gen and start adjusting in-game settings and DLSS upscaling until I was consistently at my cap and below 98% GPU usage. That should pretty much guarantee the highest possible base framerate for that fps limit.

Same process for 240Hz with 2x and 480Hz with 4x.
I wish they would implement a mode where it always outputs your monitor refresh rate (as long as your base FPS is >25% of your refresh rate) and uses FG to make up the difference between your base FPS and your refresh rate. That would be killer. I think I heard Wendell at Level1Techs talk about that possibility.
 
I wish they would implement a mode where it always outputs your monitor refresh rate (as long as your base FPS is >25% of your refresh rate) and uses FG to make up the difference between your base FPS and your refresh rate. That would be killer. I think I heard Wendell at Level1Techs talk about that possibility.

You mean by adjusting the 2x, 3x, 4x factor dynamically? That would be cool.
 
You mean by adjusting the 2x, 3x, 4x factor dynamically? That would be cool.
Yeah, pretty much. It could even drop down to no FG when your base FPS is nearing your refresh rate. So it would have to make some possibly subjective decisions about which level of FG to use depending on your refresh rate and base FPS.

You could also implement some interesting battery/power saving modes using a system like this. Could be useful for laptops and handhelds.
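
To make the idea concrete, here's a speculative sketch of how such a dynamic selector could work. Nothing like this exists in current drivers; the function name and thresholds are invented for illustration:

```python
import math

# Hypothetical dynamic frame-gen selector. No driver exposes anything like
# this today; names and thresholds are illustrative assumptions only.

def pick_fg_factor(base_fps: float, refresh_hz: float, max_factor: int = 4) -> int:
    """Pick the smallest FG factor whose output fills the display's refresh rate."""
    if base_fps < refresh_hz * 0.25:
        # Below 25% of refresh even 4x can't fill the display,
        # and artifacts/latency would dominate -- leave FG off.
        return 1
    if base_fps >= refresh_hz * 0.95:
        return 1  # already near refresh: no FG needed
    return min(math.ceil(refresh_hz / base_fps), max_factor)

for fps in (50, 70, 130, 230):
    print(f"base {fps} fps on a 240Hz display -> {pick_fg_factor(fps, 240)}x")
```

The 0.95 cutoff is exactly the kind of subjective tuning decision mentioned above.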
 
I think HUB's conclusions about the use cases for frame gen are about 95% correct.
I don't think he is being honest. Comparing native 120 FPS to FG up to 120 FPS is absolute nonsense. I don't even get native 120 FPS (in reality it's ~60 FPS in worst-case scenarios) in Cyberpunk and Alan Wake with path tracing at 1080p on my 4090. So it doesn't make any sense to draw such a comparison. Reducing image quality is basically faking frames, too.
 
I don't think he is being honest. Comparing native 120 FPS to FG up to 120 FPS is absolute nonsense. I don't even get native 120 FPS (in reality it's ~60 FPS in worst-case scenarios) in Cyberpunk and Alan Wake with path tracing at 1080p on my 4090. So it doesn't make any sense to draw such a comparison. Reducing image quality is basically faking frames, too.

I think the visual presentation was kind of dumb. I understand the limits of capturing, but showing the whole video that way was kind of weird. Maybe he should have captured 1080p native with FG or something. There must be some capture cards that can record 1080p240 footage, which could then be slowed down and presented as something more reasonable, with 60, 60->120, 60->180 and 60->240.

But I think the actual conclusions are probably very close to correct:
- Works better with slower movements
- Works better with a higher base framerate, ideally closer to 120
- Enabling FG will lower your base framerate, as it uses computing power
- Not (as) suitable for competitive games
- More useful for high-refresh monitors in the 240+ range

I can't disagree with any of this, though I'd like to see 180->360 and 240->480 for a competitive game. Maybe not for tactical shooters, but you can play so-called competitive games in a more casual way. Not everyone playing Fortnite is actually trying to be the best gamer ever. Not sure which competitive games actually support frame generation, though.
 
I don't think he is being honest. Comparing native 120 FPS to FG up to 120 FPS is absolute nonsense. I don't even get native 120 FPS (in reality it's ~60 FPS in worst-case scenarios) in Cyberpunk and Alan Wake with path tracing at 1080p on my 4090. So it doesn't make any sense to draw such a comparison. Reducing image quality is basically faking frames, too.
I took that as him refuting the claim that the 5070 has the same performance as the 4090. In that context it is useful to point out that 120FPS native is not the same thing as 120FPS achieved with FG.
 
I took that as him refuting the claim that the 5070 has the same performance as the 4090. In that context it is useful to point out that 120FPS native is not the same thing as 120FPS achieved with FG.

Yeah that was the idea behind showing 120fps native. Don’t think anyone believes an end user would willingly choose 120fps framegen over 120fps native. The whole point of framegen is that there’s no other way to hit those frame rates on your hardware at the same IQ.
 
Agreed entirely. Given the option, native will always end up providing the very best visual result. At the same time, if I want to use all 120Hz of my monitor's ability while playing Cyberpunk, then I'll need to enable some FG on top of DLSS4 to get there. And with a minimum frame rate well into the 70s, my experience has been excellent. The consistency of a high framerate is really what I'm getting out of FG, and absolutely not a change from slideshow to playable.
 
I don't think he is being honest. Comparing native 120 FPS to FG up to 120 FPS is absolute nonsense. I don't even get native 120 FPS (in reality it's ~60 FPS in worst-case scenarios) in Cyberpunk and Alan Wake with path tracing at 1080p on my 4090. So it doesn't make any sense to draw such a comparison. Reducing image quality is basically faking frames, too.
There are a lot of games that have FG. Cyberpunk and Alan Wake 2 are the most demanding ones of the bunch. 120 fps is achievable without FG in many games that have FG, even at 4K resolution.

There is nothing dishonest about including the non-FG 120fps latency numbers. It is crucial because it shows the baseline latency without FG.

For example, I've been playing Ghost of Tsushima DLDSR'd to 5120x2880 + DLSS performance mode on a 4K120 display. No FG, because I like responsiveness.

3x and 4x are a nice addition for people with high-refresh-rate monitors, but lots of people will find that mouse movement in 30 -> 120 and 40 -> 120 scenarios feels limp. Nvidia is not emphasizing this limitation for obvious reasons, so it's good that reviewers are highlighting it. 5070 = 4090 is just shady marketing.
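
To put rough numbers on why 30 -> 120 feels limp, here's a back-of-the-envelope model. It assumes interpolation-based FG has to buffer one real frame, so it adds roughly one base frame time of latency; actual LDAT measurements will differ per game, driver and Reflex state:

```python
# Back-of-the-envelope latency model for interpolation-based frame generation.
# Assumption: FG buffers one real frame, so it adds about one base frame time
# of latency. Real measurements vary per game, driver and Reflex state.

def added_latency_ms(base_fps: float) -> float:
    return 1000.0 / base_fps  # one base frame time

for base_fps, factor in [(30, 4), (40, 3), (60, 2), (120, 2)]:
    print(f"{base_fps} -> {base_fps * factor} fps: ~{added_latency_ms(base_fps):.0f} ms "
          "extra latency, whatever the fps counter says")
```

So 30 -> 120 eats roughly 33 ms on top of your normal input lag, while 120 -> 240 only costs about 8 ms. Same output smoothness story, very different feel.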
 
I took that as him refuting the claim that the 5070 has the same performance as the 4090. In that context it is useful to point out that 120FPS native is not the same thing as 120FPS achieved with FG.
The 4090 is less than 50% faster, and even less at 1080p because the GPU doesn't scale. So he should have compared, for example, native 100 FPS vs. 240 FPS with MFG (a base of 66 FPS or so)...
 
No, because the thumbnail says it all

Man, come on now. We all hate how the YT algorithm almost requires clickbaity thumbnails, sure, but it's a pretty detailed video. At least make an attempt to view it so you can argue your points with some merit.

This is the take from the YouTuber Optimum in relation to MFG:

Did you not watch that video either? Even he says some of the artifacts are "clearly visible". While not the ultimate setup (a 480Hz display?), he is playing on a 240Hz OLED and, as such, still has one of the ideal environments for FG to really strut its stuff.

"On a bike especially, it's really bad"

But yes, overall in the conditions he used it in and not focusing on particular segments of the game where the artifacts rear their head, the positives of the smoothness for a high framerate far outweigh the negatives. On his setup, with that game.

Ultimately, his focus was narrower than HUB's, though. HUB was speaking more of 3X/4X FG as a 'tech' for the entire 50 line and the multiple conditions that need to be met for it to offer significant value. The problem is that the caveats were already there for FG, and now they've multiplied along with the generated frames. This is why discussing this is such a minefield. Base framerate? Input method? Type of environments that could show off artifacts more visibly? All of these become more critical with more generated frames. And the worst part is that it's incredibly difficult, with today's limitations on YouTube, to actually convey the effect of these. It's just the DLSS/TAA arguments again, but now with even more qualifiers (e.g. my experience with DLSS on a 55" 4K TV is going to be quite different from someone with a 1080p display, and conversely someone who automatically turns off bloom/DOF/motion blur for every game will wonder 'what's all this DLSS artifact nonsense I hear about, it's perfect!' :))

Ultimately, reviewers are going to review the product in the context of that company's marketing. If Nvidia wants to advertise the value proposition of the 50 series by using FG as 'performance', then it's perfectly reasonable to highlight the long list of caveats that comes with that claim.

Is 4X DLSS4 FG "great"? Sure, it can be! But...it depends. And depends. And depends.
 
So is that how you get rid of artefacts?

Post-process effects are usually the one area where a hastily or improperly implemented reconstruction method will show its faults. These are generally already rendered at a lower resolution than the basic geometry, so best practice is to exclude these kinds of effects from the reconstruction pipeline entirely, but sometimes the performance hit from doing so ends up negating much of the performance boost you get from reconstruction. See Alan Wake 2 on the consoles, where you sometimes get massive specular aliasing indoors with DOF because there's no post-process quality setting like there is on PC; so much of the game relies on these effects that they just couldn't exclude them from the reconstruction pipeline on consoles, and that setting has a decent impact on performance on PC.

Generally these days it's much better; plenty of games can run these effects through reconstruction with no downside, but there will be times where it's still missed - Forbidden West's motion blur, for example, is just wrecked at anything below 4K DLSS Quality (also one of the reasons I really hate when games don't separate out the camera/object motion blur settings, as character motion blur artifacts are much harder to discern).
 
Man, come on now. We all hate how the YT algorithm almost requires clickbaity thumbnails, sure, but it's a pretty detailed video. At least make an attempt to view it so you can argue your points with some merit.



Did you not watch that video either? Even he says some of the artifacts are "clearly visible". While not the ultimate setup (a 480Hz display?), he is playing on a 240Hz OLED and, as such, still has one of the ideal environments for FG to really strut its stuff.

"On a bike especially, it's really bad"

But yes, overall in the conditions he used it in and not focusing on particular segments of the game where the artifacts rear their head, the positives of the smoothness for a high framerate far outweigh the negatives. On his setup, with that game.
I didn't want to make statements. His take was just better because it was neutral.

Ultimately, his focus was narrower than HUB's, though. HUB was speaking more of 3X/4X FG as a 'tech' for the entire 50 line and the multiple conditions that need to be met for it to offer significant value.
"Multiple conditions"? Which one? That the reviewer isnt called "Hardwareunboxed"? MFG has the same limitation as FG. In fact MFG has even less problems because latency is slighty higher instead of up to one whole frame. Like optimum said it doesnt make sense not to use MFG 4x when you play with FG...

There are a lot of games that have FG. Cyberpunk and Alan Wake 2 are the most demanding ones of the bunch. 120 fps is achievable without FG in many games that have FG, even at 4K resolution.

Which ones? UE5 games with Software Lumen are limited to around 100 FPS on a 4090 at 1080p. So how am I supposed to use the 500Hz of my new 1080p display with the fastest GPU available?
 
"Multiple conditions"? Which one? That the reviewer isnt called "Hardwareunboxed"?

Base framerate? Input method? Type of environments that could show off artifacts more visibly? All of these become more critical with more generated frames.

MFG has the same limitations as FG. In fact MFG has fewer problems, because latency is only slightly higher instead of up to one whole frame.

The latency penalty you pay for engaging higher levels of frame generation is small, yes. But Tim's reasoning for needing a higher baseline FPS before engaging higher levels of FG is also due to the higher percentage of generated frames vs. native frames, which increases the chance of noticeable FG artifacts. Additionally, FG has a rendering cost that increases with higher levels of FG.
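
As a toy illustration of that rendering cost (the per-frame overhead figures here are invented for the example, not measurements):

```python
# Toy model: frame generation consumes GPU time itself, lowering the base
# framerate before multiplication. Cost figures are invented placeholders.

def output_fps(base_fps_no_fg: float, fg_cost_ms: float, factor: int) -> float:
    base_frame_ms = 1000.0 / base_fps_no_fg
    effective_base = 1000.0 / (base_frame_ms + fg_cost_ms)
    return effective_base * factor

# Suppose 80 fps without FG, with overhead growing alongside the factor:
for factor, cost_ms in [(2, 1.0), (3, 1.5), (4, 2.0)]:
    print(f"{factor}x FG: ~{output_fps(80, cost_ms, factor):.0f} fps, "
          f"not the naive {80 * factor}")
```

The exact overhead varies by game and GPU, but the shape of the result is the same: the higher the FG level, the further the base framerate drops below its no-FG value.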

Regardless, I'm not going to transcribe the video for you. If you think his arguments are bunk, fine, but at least know them.
 
Not 100% sure this is the right spot, but does anyone know if Best Buy is going to give paying members a perk when trying to get the 50-series FE cards next week?
Unless they say so explicitly I would expect not, and they probably would have done so by now. Best Buy is largely a crapshoot even if you do everything right at launch. There's no point lining up outside a brick-and-mortar store, since there's no guarantee they'll have any (none of the Best Buys near me carried the 4090 on launch day).
 