Nvidia GeForce RTX 5090 reviews

Scott_Arm

Video:

Written:

Gamer's Nexus did upscaling benchmarks with FSR Quality on all cards? What exactly is that testing? I know they're trying to equalize settings, but the results are going to be totally unrepresentative because no one will ever run an RTX 5090 with FSR. It equalizes image quality and the performance characteristics of the upscaler to produce a chart that doesn't actually show anything the user will experience. I don't get it.

I got to the "marketing BS" section and he actually says the performance footnote on their slide set "blends into the background of the page." That is an ENORMOUS stretch. Do I like the slide set? Not particularly, no. But all of the detail is on it to tell you what they're comparing. Yeah, lazy readers or people who don't understand the tech can miss the footnote and be deceived, but the information is there. Saying it "blends into the background" makes me question someone's vision more than anything. He calls frame gen "artificial frames" vs "real frames," with no investigation of latency or frame gen. Pretty much what I expected.
 
My 4090 saw neither FSR usage nor DLSS Performance usage.
Ever.
Neither will my incoming 5090.
Those metrics are 100% irrelevant to me.

DLAA, FG, DLSS Quality, RT.
Those matter to me.

DF's video also had some game running with FSR... that was when I closed the video 🤷‍♂️
 

Hogwarts and Spiderman are not suitable for benchmarking the highest-end cards. They're CPU-limited at 4K on a 9800X3D with the RTX 5090, and even with the 4090. Just not good data for benchmarks. At 1440p the list of games is even bigger: Baldur's Gate 3, Space Marine 2, Starfield, and Rainbow Six Siege show up as CPU-limited at 1440p. Reviewers need to be careful and take note of GPU usage.
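A quick way to sanity-check that is to log GPU utilization while a benchmark pass runs. A rough sketch using the pynvml bindings (the device index, pass length, and 95% threshold are just illustrative assumptions):

```python
import time
import pynvml  # NVIDIA Management Library bindings (nvidia-ml-py)

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the GPU under test is device 0

samples = []
end = time.time() + 60          # hypothetical: sample for one ~60 s benchmark pass
while time.time() < end:
    util = pynvml.nvmlDeviceGetUtilizationRates(handle)
    samples.append(util.gpu)    # GPU core utilization in percent
    time.sleep(0.5)

pynvml.nvmlShutdown()

avg = sum(samples) / len(samples)
low_share = sum(1 for s in samples if s < 95) / len(samples)
print(f"avg GPU util: {avg:.1f}%, samples under 95%: {low_share:.0%}")
# Sustained utilization well below ~95-99% during the pass is a strong hint
# the result is CPU/system-limited rather than GPU-limited.
```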

Seems like a pretty level-headed review.
 
I'm mixed about this.

On one hand, it's important to have some of that data to inform people about the real-world implications for the experience they're going to get.

On the other hand, lumping it into those aggregate results without context can lead to the wrong interpretation.
 
I'm actually VERY curious now about the RTX 4080 and 4080 Super. What kind of CPU do you need to keep them from being CPU-limited at 1440p, especially if you use DLSS Quality? I feel like there's a good chance there are a lot of games where that could be the case. Something I hadn't really thought about.

GPU-driven rendering really can't come soon enough, whether through work graphs or other engine rewrites. What's the point of releasing super-powered GPUs if you can be CPU-limited with the top-of-the-line CPU?
 

This seems like a very even-handed review. The way the charts are presented makes it fairly easy to spot games where scaling looks CPU-limited. A good amount of time is spent on DLSS and frame gen, outlining where they might be useful, and it ends by saying it's up to the consumer to decide whether they're something they'd like to use.
 
I'm mixed about this.

On one hand, it's important to have some of that data to inform people about the real-world implications for the experience they're going to get.

On the other hand, lumping it into those aggregate results without context can lead to the wrong interpretation.

Yep, exactly. All of the “average” metrics out there include titles where performance is limited by system and CPU bottlenecks, not just the GPU. I think HUB addressed this well by hypothesizing that the delta will likely grow in the future with more demanding games. Probably still looking at 40-50% in the best case, though.
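To make the dilution concrete, here's a toy geometric-mean calculation; the per-game ratios are made-up, purely illustrative numbers, not measured results:

```python
from math import prod

# Hypothetical 5090/4090 FPS ratios -- illustrative only, not real benchmark data.
gpu_limited = [1.45, 1.40, 1.50, 1.35]  # games that actually scale with the GPU
cpu_limited = [1.05, 1.08, 1.03]        # games capped by the CPU at the test settings

def geomean(xs):
    return prod(xs) ** (1 / len(xs))

print(f"GPU-limited games only:    {geomean(gpu_limited):.2f}x")
print(f"All games lumped together: {geomean(gpu_limited + cpu_limited):.2f}x")
# Folding CPU-limited titles into the aggregate pulls the headline "average
# uplift" well below what the card delivers when it is actually the bottleneck.
```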
 
Gamer's Nexus did upscaling benchmarks with FSR Quality on all cards? What exactly is that testing?
They did it for literally two games, and only in those two games' RT modes. I also think they did a good job explaining why: A, they called it an experimental chart because the data leans so hard towards NVIDIA, and B, at 4K there are basically only two cards in the entire lineup that can get to >60 FPS even with FSR. For those who weren't listening, the timestamp is linked here:

What's the reason behind your angst here?

Aside from that, the results all seem to indicate the 5090 is only going to perform at peak with 4K resolutions if not higher. So, if you're into excessive framerates at 4K rez, then the 5090 is still going to do a solid job outperforming the 4090. Otherwise, it's probably worth waiting for the 5080 options instead.

Also worth noting: they're testing factory OC cards (EVGA 3090ti FTW, 4090 Cybertank) against the 5090 FE. Just thought I'd mention that, because FE versions of those cards wouldn't clock the same as the aftermarket ones tested.
 
They did it for literally two games, and only in those two games' RT modes. I also think they did a good job explaining why: A, they called it an experimental chart because the data leans so hard towards NVIDIA, and B, at 4K there are basically only two cards in the entire lineup that can get to >60 FPS even with FSR. For those who weren't listening, the timestamp is linked here:

What's the reason behind your angst here?

Aside from that, the results all seem to indicate the 5090 is only going to perform at peak with 4K resolutions if not higher. So, if you're into excessive framerates at 4K rez, then the 5090 is still going to do a solid job outperforming the 4090. Otherwise, it's probably worth waiting for the 5080 options instead.

Also worth noting: they're testing factory OC cards (EVGA 3090ti FTW, 4090 Cybertank) against the 5090 FE. Just thought I'd mention that, because FE versions of those cards wouldn't clock the same as the aftermarket ones tested.

There's no angst. I just don't really like the test. It's kind of like something the customer buying the card would do, but it's not really. They're likely going to use whatever DLSS is in the game, especially if it's DLSS4 with the transformer model, which might actually run slower. Feels like a case of trying to compare apples-to-apples to the point where you're actually not comparing an apple someone wants to eat. It would be more complicated to compare DLSS to FSR, but that's just the reality of how people are going to use them, so it'd be more insightful to explore that. I don't mind the upscaling comparisons being done in a dedicated video. It's a tough choice, but I think it's better to think from the consumer's point of view vs what's easier to test.
 
It's kind of like something the customer buying the card would do, but it's not really. They're likely going to use whatever DLSS is in the game, especially if it's DLSS4 with the transformer model, which might actually run slower.
Except now we're talking about an apples-and-oranges comparison, which leads us to:
Feels like a case of trying to compare apples-to-apples to the point where you're actually not comparing an apple someone wants to eat.
Except that's literally the right comparison. Yeah, we took a bite of this bitter thing, and here's how it turned out. It was fair to all cards, because it's the same workload for all cards. Sure, you and I would've preferred the lovely orange flavored DLSS over the gross apple FSR, but that's not a fair comparison when dealing with performance. Funny, now we're back to trying to decide how to define performance again, aren't we?

It's a tough choice, but I think it's better to think from the consumer's point of view vs what's easier to test.
I doubt it was easier; those apps all default to DLSS when they find themselves on an NVIDIA platform. GN actually had to make the change on the NVIDIA platforms to tell them to use FSR as their upscaling method.
 
Because at some point in the future you won't be. It'd be nice to have a graphics card last several CPU upgrades.

There’s also DSR which I use all the time and reviewers never cover. If I can snag a 5090 the plan is to run 4xDSR + DLSS perf with the new TNN hotness or a similar combination. That’s at least 20% more pixels than 4K so hopefully a bit less CPU limited. In less demanding games it can probably do straight 8K DSR + DLAA.
 
@Albuquerque Native is apples-to-apples. Right now upscaling is not. Each vendor has its own solution. I would rather see them do head-to-head testing at native for apples-to-apples, and then just test with DLSS and compare the 5090 against itself (native, quality, balanced, performance) to see how it scales. I really don't need to see how cards compare running FSR because no one is going to do that. I'd rather see benchmarks of CNN vs TNN or something that's more aligned with how people will actually use it.
 
There’s also DSR which I use all the time and reviewers never cover. If I can snag a 5090 the plan is to run 4xDSR + DLSS perf with the new TNN hotness or a similar combination. That’s at least 20% more pixels than 4K so hopefully a bit less CPU limited. In less demanding games it can probably do straight 8K DSR + DLAA.
This is the way. :)
 
If you change upscaling between vendors, you're right. If you're using the same upscaling between all the offerings, it's still apples to apples. Tell me why it isn't.

It's apples-to-apples; I just don't care about the data because I would never use FSR3 on an Nvidia GPU unless there was no other option. I'd rather see how games scale with DLSS across resolutions or something, especially now that there's a new model. Nvidia has tensor cores to do DLSS, so work them out. Tell me if they ever bottleneck. If you use DLSS on a 4090 and switch from Quality to Balanced to Performance, is there ever a point where it bottlenecks? I don't know, but it's something I'd actually do. FSR is essentially just another shader-core test.
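For that kind of self-scaling test, the useful frame of reference is how many pixels each DLSS preset actually renders before upscaling. A small sketch using the commonly cited per-axis scale factors (games can override these, so treat them as assumptions):

```python
# Internal render resolutions each DLSS preset implies at a 4K output,
# using the commonly cited per-axis scale factors.
OUTPUT = (3840, 2160)

PRESETS = {
    "DLAA":        1.0,    # native-resolution AA, no upscaling
    "Quality":     2 / 3,
    "Balanced":    0.58,
    "Performance": 0.50,
}

for name, scale in PRESETS.items():
    w, h = round(OUTPUT[0] * scale), round(OUTPUT[1] * scale)
    print(f"{name:12s} {w}x{h}  (~{w * h / 1e6:.1f} MP rendered, upscaled to 4K)")
```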
 