Nvidia GeForce RTX 4090 Reviews


Valhalla seems to have improved a little even on Ampere. I managed to find a screenshot I made back in 2020 when the game launched; there has been a metric ton of patches as well, so who knows which improvement is from the driver and which is from two years of updates. The 4K result also shot up from 55 fps in 2020 to 68-69 now.
 

A driver update a while ago. It was notoriously inefficient on Nvidia vs. AMD at launch; an Nvidia driver update significantly improved its performance.
 
When you remove frame generation, it's a whopping 12% faster than the two-year-old 3080.

And this is why the argument "So what if you don't like DLSS3, just turn it off - it's good to have the option" really only works on the 4090 and perhaps 4080/16 GB. They have enough brute horsepower that DLSS3 is just an added bonus, it's not necessary to justify their expense.

But on the 4080 12GB, it's far more closely tied to any potential value proposition. Of course you can turn it off on any card, but it's clear Nvidia is banking on it to make an argument for the generational uplift in some SKUs; otherwise their gen-on-gen improvements look decidedly weaker.

This is also why I've been wincing a bit when people wax on about how this is the 'biggest gen-on-gen leap we've seen', as it also has some of the most drastic performance fall-offs below the absolute top-tier card, which is now $1,500+. When I think 'generational leap', to my mind that means a new architecture brings a significant improvement to at least several tiers of product that employ that architecture, not just one or two models.
 
Congrats. In my region the Asus Tuf model went online for 2450 euros :ROFLMAO:

It's also quite a bit more expensive in Taiwan, as a Gigabyte 4090 Gaming costs NT$58,990 (~US$1,850). Still they are selling out quite fast, though I guess there weren't too many available from the start.
 
Not a review but *interesting*:
[attached benchmark chart]


The RTX 4090 doing 2.1x over the 3090 Ti in this game in raster alone, with no DLSS? SER and OMM implemented?
 

Hmm... that is interesting, but it can't be SER or OMM because there's no RT. And let's look at the other interesting data on this chart.

Also interesting (and very disappointing) is the difference between the RTX 3080 ($699.99 MSRP) and the RTX 4080 12GB ($899.99 MSRP).

With DLSS off (which is our only fair comparison, since they include DLSS 3 for the RTX 40 series), it looks like Nvidia saved all the gen-over-gen performance gains for the absolute highest end of the stack.

Nvidia are absolute masters at marketing and brand image. The narrative is already set: raise the price of the flagship (which is now the RTX 4090, as opposed to an 80-series card like in previous gens), then release only this top-of-stack card, which is just $100 over the last-gen equivalent's MSRP. It looks good with its impressive gains and sets the narrative that RTX 40 has massive gen-over-gen improvements, but when you look down the stack, the numbers get a bit more eyebrow-raising.

But that doesn't matter to Nvidia, because the narrative has already been set.

Well played.
 
So where is that impressive path tracing demo called Justice: Court of something? It's supposed to release today.
 
The WCCFtech reviewer posted the Unity "Enemies" demo showing DLSS3 in action.



I asked him if they were actually releasing this demo to the public but he hasn't answered me.

I'm guessing they must if reviewers have it to play around with. I'd love to try it out.

As a side note, 3DMark released their new Speed Way test today. It's very short, but it's pretty when run at high resolutions.
 
No RT, which means no SER and no OMM. Likely just lots of compute, and it's ~40 vs. ~90 TFLOPS of FP32.
Thinking about it... probably the game just scales very well with SMs, which is damn weird vs. other games. Maybe a lot of newer games want to do things by SMs?

Wonder how UE5 games will perform on Ada; better than the average of what is being benchmarked?
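A quick back-of-envelope on the FP32 point above (a rough sketch; the ~40 and ~90 TFLOPS figures are the round numbers quoted in the post, not official spec-sheet values):

```python
# Rough check: does the quoted FP32 throughput gap line up with the ~2.1x
# raster result shown in the chart? TFLOPS figures are the post's estimates.
tflops_3090ti = 40.0  # approx FP32 TFLOPS quoted for the 3090 Ti
tflops_4090 = 90.0    # approx FP32 TFLOPS quoted for the 4090

fp32_ratio = tflops_4090 / tflops_3090ti
print(f"Theoretical FP32 ratio: {fp32_ratio:.2f}x")  # → 2.25x
```

If the game really is compute-bound, a ~2.1x observed result sitting just under the ~2.25x theoretical ratio would be consistent with near-perfect SM scaling.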
 
Epic Operation Flashpoint meme included - so sweet.
It's not just Operation Flashpoint, but also Arma 3, Arma 2, Watch Dogs 2, Crysis Remastered, and all Total War games with the ground camera... basically any game that pushes draw distance to very high settings. Here even the strongest CPUs today fall flat on their faces. We really need DLSS3 in these games.
 
Some tests that show a 2X uplift from the 3090 to the 4090.

Time Spy:
3090: 9.7K
4090: 19.2K

Port Royal:
3090: 12.8K
4090: 25.5K

LuxMark:
3090: 8.1K
4090: 15.8K

Superposition (DirectX):
3090: 16,375
4090: 33,362

Total War: Three Kingdoms 4K:
3090: 52 fps
4090: 103 fps

Chernobylite Ultra Ray Tracing 4K:
3090: 32 fps
4090: 62 fps

Cyberpunk Ultra Ray Tracing 4K:
3090: 21 fps
4090: 43 fps
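The scores above can be sanity-checked in a few lines (a minimal sketch; the numbers are simply the ones listed in this post):

```python
# 3090 vs 4090 scores as listed above; "uplift" is just the 4090/3090 ratio.
scores = {
    "Time Spy": (9700, 19200),
    "Port Royal": (12800, 25500),
    "LuxMark": (8100, 15800),
    "Superposition (DirectX)": (16375, 33362),
    "Total War: Three Kingdoms 4K": (52, 103),
    "Chernobylite Ultra RT 4K": (32, 62),
    "Cyberpunk Ultra RT 4K": (21, 43),
}

for name, (rtx3090, rtx4090) in scores.items():
    print(f"{name}: {rtx4090 / rtx3090:.2f}x")
```

Every entry lands between roughly 1.9x and 2.05x, which is why the thread keeps calling it a 2X uplift.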

 