Nvidia GeForce RTX 5090 reviews

I agree there is value in what you describe; however, I disagree when comparing performance between unlike platforms. We are back to trying to compare the NVIDIA FX series forcing 24-bit computation on shaders as somehow comparable to the same-era ATI Radeon series pushing legit 32-bit precision and claiming "performance is equal."

They weren't then, and they aren't now. If we are going to compare performance then we need equal computational loads, and I feel upscaling is relevant because there's literally no way for nearly all of the pack to even achieve 60fps peak, let alone average. The only way to do this is to keep it all the same, and not hide behind software differences that muddy the outcome.

Everyone who is buying a 5090 already knows about DLSS, even if they don't yet know how good DLSS4 TNN is. I see lots of benefit in doing some very in-depth and lengthy DLSS comparisons and benchmarks to illustrate these points. But I don't see why it makes sense to change upscalers when comparing cards, even though I do see and understand why an upscaler absolutely makes sense in seriously performance-limited scenarios.
 
A problem here is that if we are testing upscaling performance, all the upscaling solutions are essentially "biased" and optimized for specific hardware platforms.

As mentioned, if we just want to test raw compute or tensor performance we can do that apples to apples, but that wouldn't then mean injecting one vendor's software solution on top of all hardware.

I think the disconnect might be that people view FSR as "unbiased" just because it's runnable on more platforms, but that doesn't mean it approaches the task of upscaling optimally for every respective hardware platform.
 
Problem is, DLSS is often faster than FSR on NVIDIA GPUs, sometimes up to 15% faster. Testing with FSR will hamper performance on NVIDIA GPUs.
It's fair in performance tests because both cards are processing the same workload.

If we are testing upscaling efficacy then it becomes a different and more reasonable statement. But that's not what is being tested today...
 
@Albuquerque The way I'd do it is close to Digital Foundry. I'd limit the number of cards I was comparing to the RTX 5090, RTX 4090, RTX 3090, RTX 2080 Ti and Radeon 7900 XTX. Basically the flagships that someone would realistically upgrade from or consider as an alternative. I'd compare them all at native resolution. For upscaling, I'd drop the AMD card and just compare apples-to-apples across Nvidia so everything is DLSS. I'd test scaling from Quality to Performance to see how it scales with resolution and when CPU limits are hit (the sketch below shows roughly what each mode renders). You could handle frame gen in the same way. If anyone was upgrading from a Radeon 7600 to a flagship part, or something like that, I'd expect they could figure that out on their own. Like I have a 3080, and I think if I wanted a 5090 it would be reasonable for me to take the extra step to compare my 3080 to a 3090 and extrapolate from there.

I'd do direct DLSS to FSR comparisons as their own video where you could look at quality differences and scaling differences. I just think testing with FSR on Nvidia is kind of useless. Like, you could test with XeSS on both Nvidia and AMD, but no one would ever use that. That's assuming FSR4 or some other upscaler doesn't come along and basically challenge the status quo in a vendor-agnostic way. If you really think about it, now that you have DLSS CNN vs TNN, how do you compare apples-to-apples across Nvidia? Do you pick the TNN because it looks the best, or do you pick the CNN because it's faster and runs better on the older cards? It's all tricky now.
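
To put some numbers on the Quality-to-Performance scaling point, here's a minimal sketch using the commonly cited per-axis DLSS scale factors (individual games can override these, so treat the figures as approximate):

```python
# Approximate internal render resolution per DLSS mode for a 4K output.
# Per-axis scale factors below are the commonly cited defaults; games can override them.
DLSS_SCALE = {
    "DLAA":        1.000,
    "Quality":     0.667,
    "Balanced":    0.580,
    "Performance": 0.500,
}

def internal_resolution(out_w, out_h, mode):
    """Return the approximate internal render resolution for a given DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K {mode:<11} -> renders {w}x{h} ({w * h / 1e6:.1f} MP)")
```

Going from Quality to Performance roughly halves the shaded pixel count, which is exactly where the CPU limits mentioned above start to show up.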
 
There’s also DSR which I use all the time and reviewers never cover. If I can snag a 5090 the plan is to run 4xDSR + DLSS perf with the new TNN hotness or a similar combination. That’s at least 20% more pixels than 4K so hopefully a bit less CPU limited. In less demanding games it can probably do straight 8K DSR + DLAA.
Is 4xDSR + DLSS Perf better than using DLAA at native res? I feel like you are dabbling in the dark arts and it's making me nervous 😰
 
Where does one purchase the Nvidia OEM cards? I’ve never seen them for purchase at the local computer shop.
 
Where does one purchase the Nvidia OEM cards? I’ve never seen them for purchase at the local computer shop.

I'm guessing you mean the FE version from Nvidia? Nvidia only sells them in specific regions and has specific retail partners for them in those regions.

For RTX 4xxx I believe in the US it's from Nvidia directly, Best Buy USA and Microcenter. In Canada it's Best Buy Canada only. In the UK they're from Scan, I believe? Not sure about others offhand.

Geforce.com and Best Buy are the only places I have ever seen them. I don't think many of them are made though. Possibly just the initial batch at launch.

At least for 3xxx and 4xxx (and also 2xxx, though those weren't popular, being more expensive) they were periodically restocked.
 
Hogwarts and Spiderman are not suitable for benchmarking the highest-end cards. They're CPU-limited at 4K on a 9800X3D with the RTX 5090 and even the 4090. Just not good data for benchmarks. At 1440p the list of games is even bigger. Reviewers need to be careful and take note of GPU usage. Baldur's Gate 3, Space Marine 2, Starfield and Rainbow Six Siege show up as CPU-limited at 1440p.
I don't understand your view here. Reviews are supposed to give consumers a realistic impression of what they're buying beforehand, not to make reviewed cards look as good as possible. Are you suggesting people buying a 5090 wouldn't play Hogwarts or Spiderman? Or that there wouldn't be any 1440p buyers?
 
At least for 3xxx and 4xxx (and also 2xxx, though those weren't popular, being more expensive) they were periodically restocked.
No way to be certain due to variance, but I can't recall ever seeing one in stock after the initial batch. Partner cards were often available but never the Nvidia OEM board.
 
No way to be certain due to variance, but I can't recall ever seeing one in stock after the initial batch. Partner cards were often available but never the Nvidia OEM board.

Especially for the 4090 the demand was high enough that it would just get bought instantly, which is understandable given it held MSRP against the AiB cards. So if you were just randomly browsing the sites, you would never have seen them in stock.

Going by nowinstock, for example, it shows the 4090 FEs last being seen in June/July 2024 - https://www.nowinstock.net/computers/videocards/nvidia/rtx4090/

If you look at communities which track this stuff, like r/bapcsales, you can see multiple posts of notices that FEs are in stock - https://www.reddit.com/r/buildapcsa...7ce6&iId=e057ba0f-3614-4735-9fb9-8516c91c99a5

I'm not in the US, but in Canada at least the lower-SKU FEs were actually readily available, in that you could go onto Best Buy Canada and just see them for purchase. The 4090s though were hard to get, as again their price was just way lower than the AiB versions, along with them being inherently more desirable for many even if the price were the same.

Going by leaked AiB prices I'm guessing it's likely going to be the same situation. The RTX 5090 FE is going to be very hard to get; you're going to have to be set up to catch the launch batch or basically actively monitor restocks to have any chance. The RTX 5080 may see high demand as well.
 
Bless you man


I will try. lol thanks

If you're really going to try for one, I would make sure your Best Buy account is already logged in, with all delivery and payment information filled in. Have the product page pre-bookmarked and be on there refreshing before sales go live.

The RTX 5090 is almost surely going to sell out instantly. I'd also suspect the RTX 5080 won't last long either. From what I remember with 4xxx, the SKUs other than the 4090 weren't really any issue (as in you could even buy them later in the week), but the 4080 wasn't priced all that well compared to how the 5080 sits relative to the rest of the stack.

The other issue is that if the rumours about trying to get stock in before US tariffs are true, Canadian launch supply might be poor.
 
I don't understand your view here. Reviews are supposed to give consumers a realistic impression of what they're buying beforehand, not to make reviewed cards look as good as possible. Are you suggesting people buying a 5090 wouldn't play Hogwarts or Spiderman? Or that there wouldn't be any 1440p buyers?

I think it's definitely valuable to know that you can run into games that are CPU-limited on the 5090 at 4K. I'm actually curious to see the 4080 Super and 5080 with more mid-range CPUs at 1440p to see if they have the same issue. If you're going to average out those results and say "on average you gain x% at 4K", then you're just polluting the data. Basically those games will not reflect the capability of the card, or the performance advantages you'd have in future titles, so it really depends on how you present the data.

I would say that I'd highlight them as games that will not see big performance increases, if any increase, depending on the resolution, but I'd leave them out of any kind of averages that are meant to reflect the general capabilities of the GPU.
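
A minimal sketch of that bookkeeping, with invented games, gain ratios, utilization figures and an assumed ~95% GPU-utilization cutoff (all purely illustrative): flag the CPU-limited runs by GPU usage, footnote them, and keep them out of the headline geomean.

```python
# Sketch: flag CPU-limited runs and keep them out of the headline average.
# Games, gain ratios, utilization figures and the 95% cutoff are all made up
# for illustration; the point is the bookkeeping, not the numbers.
from math import prod

# (game, hypothetical 5090-vs-4090 fps ratio, avg GPU utilization % on the 5090)
results = [
    ("Hypothetical GPU-bound game A", 1.38, 99.0),
    ("Hypothetical GPU-bound game B", 1.33, 98.0),
    ("Hypothetical GPU-bound game C", 1.41, 99.0),
    ("Hypothetical CPU-capped game D", 1.04, 72.0),
    ("Hypothetical CPU-capped game E", 1.06, 80.0),
]

def geomean(xs):
    return prod(xs) ** (1 / len(xs))

gpu_bound   = [ratio for _, ratio, util in results if util >= 95.0]
cpu_limited = [game for game, _, util in results if util < 95.0]

print(f"Headline uplift (GPU-bound only): +{(geomean(gpu_bound) - 1) * 100:.0f}%")
print(f"Uplift if everything is averaged: +{(geomean([r for _, r, _ in results]) - 1) * 100:.0f}%")
print("Footnoted as CPU-limited:", ", ".join(cpu_limited))
```

Both numbers are worth reporting, but only the first one actually describes the GPU.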
 
I agree there is value in what you describe; however, I disagree when comparing performance between unlike platforms. We are back to trying to compare the NVIDIA FX series forcing 24-bit computation on shaders as somehow comparable to the same-era ATI Radeon series pushing legit 32-bit precision and claiming "performance is equal."
Umm... am I wrong in thinking you got this backwards? The FX series was able to do 32-bit but used 16-bit most of the time because of performance issues. The Radeon cards used 24-bit max but looked much better than the 16-bit that Nvidia pushed for performance reasons.
 

Doing it right. Set 3x frame gen to get close to 240Hz. Pretty good subjective impression of the experience. I wish there was a way I could actually see it on my 240Hz display raw.


Just skimming, but this has comparisons for the RTX 5090 vs RTX 4090 with DLSS-Q, DLSS-P, DLSS FG 2x apples-to-apples and DLSS FG 4x vs 2x. A bunch of stuff I haven't seen looked at this extensively yet.
 
Is 4xDSR + DLSS Perf better than using DLAA at native res? I feel like you are dabbling in the dark arts and it's making me nervous 😰

Nah you’re right. I forgot DLSS perf is 1/4 resolution.

You can get creative with the DSR and DLSS combos. It’s easier now that you can force DLAA too. DLSSQ + DLDSR 1.78x is a nice combo. Not as heavy as native/DLAA but a good step up from DLSSQ.
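
For anyone following along with the combo math, here's a minimal sketch (assuming the commonly cited DSR/DLDSR and DLSS scale factors, so the numbers are approximate) of what each setup actually renders on a 4K panel:

```python
# Rough math for DSR/DLDSR + DLSS combos on a 4K panel.
# DSR/DLDSR factors are total-pixel multipliers; DLSS factors are per-axis.
# Scale factors are the commonly cited defaults, so treat results as approximate.
DISPLAY = (3840, 2160)

def combo(dsr_pixel_factor, dlss_axis_factor, display=DISPLAY):
    """Return (virtual output resolution, internal render resolution) for a combo."""
    axis = dsr_pixel_factor ** 0.5
    out = (round(display[0] * axis), round(display[1] * axis))
    internal = (round(out[0] * dlss_axis_factor), round(out[1] * dlss_axis_factor))
    return out, internal

for name, dsr, dlss in [
    ("Native + DLAA",        1.00, 1.000),
    ("Native + DLSS Q",      1.00, 0.667),
    ("DLDSR 1.78x + DLSS Q", 1.78, 0.667),
    ("4x DSR + DLSS Perf",   4.00, 0.500),
]:
    out, internal = combo(dsr, dlss)
    mp = internal[0] * internal[1] / 1e6
    print(f"{name:<21} outputs {out[0]}x{out[1]}, renders {internal[0]}x{internal[1]} ({mp:.1f} MP)")
```

It also shows why the earlier "20% more pixels than 4K" estimate didn't hold up: 4x DSR + DLSS Performance lands right back at the native pixel count, while DLDSR 1.78x + DLSS Q sits between DLSSQ and native, matching the "not as heavy as DLAA but a step up from DLSSQ" description.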

I don't understand your view here. Reviews are supposed to give consumers a realistic impression of what they're buying beforehand, not to make reviewed cards look as good as possible. Are you suggesting people buying a 5090 wouldn't play Hogwarts or Spiderman? Or that there wouldn't be any 1440p buyers?

Isn’t this exactly why everyone is testing with a 9800X3D? The explicit goal is to make the GPU look as good as possible by minimizing other bottlenecks. The issue being highlighted here is that in many cases those bottlenecks persist.
 
Even tuned, CPU/memory were already a bottleneck in some scenarios for me with the 4090 at 1440p UW, so the 5090 making it more prominent isn't a surprise. It's a great fit for 4K 240Hz screens.

When I switch to a 6090, it'll be with a higher-PPI screen, as I don't expect CPUs to have some revolutionary jump between now and then.
 
It's fair in performance tests because both cards are processing the same workload.
Are they? Do we know that FSR is optimized to run on Nvidia h/w as well as on AMD's or Intel's? Are there no h/w specific paths in it?
IMO this is a questionable claim. The press used to avoid running Nvidia s/w when it was present in a game because it would "gimp" other h/w, and yet somehow the same press is fine with doing that with AMD's s/w.

I don't understand your view here. Reviews are supposed to give consumers a realistic impression of what they're buying beforehand, not to make reviewed cards look as good as possible. Are you suggesting people buying a 5090 wouldn't play Hogwarts or Spiderman? Or that there wouldn't be any 1440p buyers?
Reviews are supposed to show the capabilities of the product the review is about. If you're running that product in an environment where its capabilities are limited by some other product, then you're not doing that, and you're providing false data on the capabilities of the product you're reviewing. Which is why testing GPUs in CPU-limited games, or testing CPUs in GPU-limited games, misses the whole point of doing a review in the first place. Should you say or show something about the fact that many games will in fact be CPU-limited on a 5090? Sure. Should that affect your final average percentages of what the GPU is capable of? No.
 