Nvidia GeForce RTX 4090 Reviews

For gaming, we won't see a monster like this again for years. It's not without its flaws, though. The power consumption is not very gaming-like: gaming has always been built on tricks to perform better while looking as good as possible, with every possible hack enabled. A device consuming 450W of power, or even more in some cases, kinda goes against that trend. Nvidia could make a 300W monster; they have the skills.

The performance hit going from 450W down to 300W is barely perceptible, as shown by this interesting take on computerbase.de: https://www.computerbase.de/2022-10/nvidia-geforce-rtx-4090-review-test/

Nvidia raised the TDP because they fear RDNA 3; they don't know its performance. I feel AMD will surprise, like they did with RDNA 2.
 

    • Nvidia raised the TDP because they fear RDNA 3; they don't know its performance. I feel AMD will surprise, like they did with RDNA 2.

      From 350W to 450W the difference is 29%. Not only is the performance difference in the single digits, but here's what the reviewer himself has to say:

      The result is clear: the GeForce RTX 4090 does not get much faster in the test suite on the Core i9-12900K from the jump from 350 watts to 450 watts (which isn't always maxed out). On average, Nvidia's new flagship gains only 3 percent from it. The exceptions are games in which the RTX 4090 does max out the 450 watts, where the cut to 350 watts is therefore significant. But even then the gain stays in single digits.

      In short: the author measures a 3% difference on average from 350W to 450W. For a 29% difference in consumption, it's hardly worth operating the GPU above 350W, but that's just me.

      One step further, from 300W to 450W, i.e. a 50% difference in power consumption, the average performance difference is just over 9% (see the quick calculation after this list).

    • We'll see what AMD releases in the immediate future and the characteristics of their competing product family, but so far I can't see anything extraordinary in the aforementioned results.
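A quick back-of-the-envelope check of those numbers. The 3% and 9% average gains are from the computerbase.de test linked above; the split between 300W and 350W is inferred from them, so treat it as an approximation:

```python
# Rough perf/W comparison for the RTX 4090 at different power limits,
# using the average gains reported by computerbase.de.
# Performance is normalized to the 300W limit.

limits = {  # power limit (W) -> relative performance vs. 300W
    300: 1.00,
    350: 1.09 / 1.03,  # ~9% total gain at 450W, ~3% of it above 350W
    450: 1.09,
}

for watts, perf in sorted(limits.items()):
    power_delta = (watts / 300 - 1) * 100
    perf_delta = (perf - 1) * 100
    perf_per_watt = perf / (watts / 300)
    print(f"{watts}W: +{power_delta:.0f}% power, +{perf_delta:.1f}% perf, "
          f"{perf_per_watt:.2f}x perf/W vs. 300W")
```

In other words, the last 150W buys about 9% performance while costing roughly a quarter of the card's efficiency.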
 
This is why I worry about the rest of the stack. Where can they possibly go with a 4070 that isn't poor value for money? The absolute best-case scenario as I see it is something that is as fast as the 3080 for the same price, more than 2 years later. And crazily, that would be a great value product compared to the current 4080 12GB!

You hit the nail on the head. For people like me with a 3060 Ti, what upgrade path do we realistically have?

The 3070/3070 Ti have the same amount of VRAM, so they aren't really a wise upgrade.

A 10GB 3080 is nearly twice the price but nowhere near twice the performance (39% faster according to TechPowerUp's reviews).

The 3080 Ti is nearly 3x the cost and isn't even 50% faster.

The AMD side is slightly better, but it would mean sacrificing RT performance, which I don't really want to do.

There's literally no cost-effective or sensible upgrade option for people with the RTX 2000 series or the mid-range RTX 3000 series.
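To put rough numbers on it, here is a minimal value check using only the multiples quoted above; the price figures are relative multiples from this post, not current UK quotes:

```python
# Rough price/performance check for 3060 Ti upgrade options.
# Performance and price multiples are the ones quoted above
# (TechPowerUp-based), normalized to the 3060 Ti.

cards = {
    # name: (price multiple, performance multiple)
    "3060 Ti": (1.0, 1.00),
    "3080 10GB": (2.0, 1.39),  # "nearly twice the price ... 39% faster"
    "3080 Ti": (3.0, 1.50),    # "nearly 3x the cost ... isn't even 50% faster"
}

for name, (price, perf) in cards.items():
    value = perf / price  # normalized performance per unit of money
    print(f"{name}: {perf:.2f}x perf at {price:.1f}x price -> {value:.2f}x value")
```

Every step up the Ampere stack gets noticeably worse value, which is the whole problem.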
 
You hit the nail on the head. For people like me with a 3060 Ti, what upgrade path do we realistically have?
When you bought this card, did you expect to upgrade within 2 years of your purchase?

Didn't you buy your card in August?

Just because Ada currently offers no upgrade path doesn't mean it never will. Or that Blackwell won't.
 
When you bought this card, did you expect to upgrade within 2 years of your purchase?

Didn't you buy your card in August?

Just because Ada currently offers no upgrade path doesn't mean it never will. Or that Blackwell won't.

I didn't buy it with the intention of having it last for years, so 'expecting' to upgrade isn't really the point.

I purchased it because, out of all the options in the UK, the 3060 Ti was (and still is) the best GPU in terms of price/performance.

My intention was always to replace it with an RTX 4060 Ti, but looking at what Nvidia have done, it now looks likely that plan won't ever happen.
 
Nvidia raised the TDP
Nvidia didn't raise the TDP; it's the same as on the 3090 Ti.

As for why single-GPU TDPs in general are getting higher (the PCIe 5.0 power spec goes up to 600W): silicon scaling is nearing its end in both size and power, which means there is no way to increase performance other than raising power ceilings. This is happening in CPUs too, so Nvidia is doing nothing extraordinary here. AMD will most likely follow the same road with RDNA3 as well.
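A minimal sketch of why those ceilings cost so much efficiency: dynamic power scales roughly as C·V²·f, and the top of the voltage/frequency curve needs disproportionately more voltage per extra MHz. The linear V-with-f assumption below is a simplification for illustration, not measured 4090 behaviour:

```python
# Illustrative dynamic power scaling: P ~ C * V^2 * f.
# Simplifying assumption: near the top of the V/f curve, voltage rises
# roughly linearly with frequency, so power grows ~ with frequency cubed.

def relative_power(freq_ratio: float) -> float:
    """Power relative to baseline for a given frequency ratio."""
    voltage_ratio = freq_ratio  # assumption: V scales with f
    return voltage_ratio ** 2 * freq_ratio

for f in (1.00, 1.05, 1.10):
    print(f"+{(f - 1) * 100:.0f}% clocks -> +{(relative_power(f) - 1) * 100:.0f}% power")
```

That roughly cubic relationship is why the 450W limit buys only single-digit gains over 350W, and why power ceilings keep climbing for ever-smaller returns.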

And even at 600W per card we're still nowhere close to how much power 3/4-way AFR systems used to consume, so I still don't understand why this is suddenly a big issue now. Just don't buy a card which consumes more than you'd like.
 
I wonder how the 4090 and 4090 Ti will stack up in UE5 games with Lumen hardware RT, or in non-UE5 games that implement SER and OMM with RT. And I wonder whether CP2077, with the new Overdrive update, could get SER & OMM that carries over into less intensive RT modes (Psycho and below)? Unless it's not possible to augment existing games with SER & OMM. Because I could see a CP2077 updated with SER & OMM hitting 120 FPS at Ultra/Psycho RT with DLSS 2 Performance mode on a 4090 Ti.
 
These are incredible gains. I still run games great with my 2080 Ti at 3440x1440, but this card is like 2.5x faster in raster and 3x faster in RT-heavy workloads. This is ridiculous.
Why ridiculous? It's expected. Your card was first-generation RT, is two generations old, and is on a much older process node (12nm vs 4nm). The 4090 also uses up to 80% more power in the process. It seems that reducing the power consumption might bring it down to 2x faster instead, which is in line with expectations. Anything else would be disappointing.
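A quick sanity check on those multipliers (the 2.5x/3x speedups and ~1.8x power figures are the ones quoted above; the arithmetic is mine):

```python
# Gen-over-gen efficiency check: 2080 Ti -> 4090, using the multipliers
# quoted in this thread (2.5x raster, 3x heavy RT, ~1.8x the power draw).

power_ratio = 1.8
for workload, perf_ratio in {"raster": 2.5, "RT-heavy": 3.0}.items():
    print(f"{workload}: {perf_ratio / power_ratio:.2f}x perf/W across two generations")
```

So even at the full 450W the efficiency gain is real; cap the card at 2080 Ti-like power and the perf/W gap only widens.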
 

The card can run this at 8K with RT and Performance DLSS at ~70 fps, and at native 8K at 40 fps. RivaTuner shows VRAM usage at 19GB.
I would gladly play at 40 fps instead of using DLSS Performance on a desktop. Performance mode is great for laptops, though.
 
The card can run this at 8K with RT and Performance DLSS at ~70 fps, and at native 8K at 40 fps. RivaTuner shows VRAM usage at 19GB.
Power consumption never exceeded 380W! Often lower than that!

This reminds me: in many games the 4090 operates way below the 450W max, often around 350W to 380W. There is clearly some headroom for the card to push more fps when more powerful CPUs are released.
 
Power consumption never exceeded 380W! Often lower than that!

This reminds me: in many games the 4090 operates way below the 450W max, often around 350W to 380W. There is clearly some headroom for the card to push more fps when more powerful CPUs are released.

I think it'll be like the old GTX 480: a monster under a water block when given more voltage!

Fermi could nearly hit 1GHz core speed under those conditions, and seeing a single GPU pull over 500W was cool and scary at the same time.
 
We may have to wait until Meteor Lake for the 4090/4090 Ti to stretch their legs. Hopefully when CPU sockets change we'll see renewed tests with top-end RTX 40 & RX 7000 cards.
 
Nvidia didn't raise the TDP; it's the same as on the 3090 Ti.

As for why single-GPU TDPs in general are getting higher (the PCIe 5.0 power spec goes up to 600W): silicon scaling is nearing its end in both size and power, which means there is no way to increase performance other than raising power ceilings. This is happening in CPUs too, so Nvidia is doing nothing extraordinary here. AMD will most likely follow the same road with RDNA3 as well.

And even at 600W per card we're still nowhere close to how much power 3/4-way AFR systems used to consume, so I still don't understand why this is suddenly a big issue now. Just don't buy a card which consumes more than you'd like.
There is still a problem: which PSU does your computer have? 'Cos I can tell you that if you have a 550W PSU like me, the 4090 is a no-go, as much as I like it for what it offers. Some retailers are recommending a 1200W PSU for their 4090 SKUs.

The best thing for those with a 500W to 700W PSU is that the 4090 paints a great future for the 4060-4070, power consumption-wise. I already purchased an Intel A770 (still waiting for it to be delivered), but this GPU generation represents a big leap in many ways.
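For anyone sizing a PSU for a 4090-class card, a rough headroom check looks like this; every wattage below is an illustrative assumption, not a measured figure, so check your own components and their transient spikes:

```python
# Rough PSU headroom estimate for a 4090-class build.
# All component wattages are illustrative assumptions.

gpu_w = 450      # RTX 4090 board power limit
cpu_w = 250      # high-end CPU under load (assumption)
rest_w = 100     # motherboard, RAM, drives, fans (assumption)
margin = 1.3     # ~30% headroom for transient spikes

recommended = (gpu_w + cpu_w + rest_w) * margin
print(f"Recommended PSU: ~{recommended:.0f}W")  # ~1040W -> a 1000-1200W unit
```

Which is roughly where the retailers' 1200W recommendation comes from, and why a 550W unit rules the card out regardless of how efficient it is at lower power limits.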
 
There is still a problem: which PSU does your computer have? 'Cos I can tell you that if you have a 550W PSU like me, the 4090 is a no-go, as much as I like it for what it offers. Some retailers are recommending a 1200W PSU for their 4090 SKUs.
1200W, and it's, ehm, 8 years old I think? This "problem" didn't appear today; anyone who was seriously into PC GPUs has been used to these power needs for quite some time now.

The best thing for those with a 500W to 700W PSU is that the 4090 paints a great future for the 4060-4070, power consumption-wise.
Exactly. Nobody is forcing anyone to get the topmost possible configuration. Lower models may not provide much in terms of perf/dollar compared to previous generations, but they still provide better perf/watt.
 