Nvidia GeForce RTX 4090 Reviews

Linus-induced DP 1.4 bashing is getting tiresome. All panels aiming at >120 Hz over DP 1.4 support DSC these days, and DP 1.4 with DSC can go up to 4K@240 or 8K@75 without any meaningful IQ loss. For those who aren't content with that, there's HDMI 2.1, which can do 8K@120 through the same DSC.

I mean would it be nice for Nv to include DP 2.0 support in Ada? Sure. Does its lack limit your output options in any meaningful way? Nah.

Also of note: DP 2.0 "support" by itself doesn't tell you much, similar to HDMI 2.1 "support" in this regard. As an example, RDNA2 supports HDMI 2.1 at up to FRL5, which isn't the maximum HDMI 2.1 spec (Ampere/Ada do FRL6, which is). DP 2.0 can likewise be supported at UHBR10 only, which is about half the maximum DP 2.0 spec, and while faster than DP 1.4, it isn't better than HDMI 2.1 even at FRL5. Without knowing what speed level a device supports, it's impossible to say whether said support is even meaningfully better than DP 1.4.
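
For anyone who wants to sanity-check that, the link-budget math is easy to play with. A minimal sketch in Python: the payload rates follow from the published raw link rates and encodings, but the ~1.18x blanking factor and flat 3:1 DSC ratio are my own simplifying assumptions, not exact CVT timings:

```python
# Approximate usable payload per link, derived from raw rate x encoding efficiency.
PAYLOAD_GBPS = {
    "DP 1.4 (HBR3)":   32.4 * 8 / 10,     # 8b/10b encoding  -> ~25.9 Gbps
    "DP 2.0 (UHBR10)": 40.0 * 128 / 132,  # 128b/132b        -> ~38.8 Gbps
    "DP 2.0 (UHBR20)": 80.0 * 128 / 132,  #                  -> ~77.6 Gbps
    "HDMI 2.1 (FRL5)": 40.0 * 16 / 18,    # 16b/18b          -> ~35.6 Gbps
    "HDMI 2.1 (FRL6)": 48.0 * 16 / 18,    #                  -> ~42.7 Gbps
}

def required_gbps(w, h, hz, bpp=30, blanking=1.18, dsc_ratio=1.0):
    """Very rough link bandwidth needed for a mode, in Gbps."""
    return w * h * hz * bpp * blanking / dsc_ratio / 1e9

need = required_gbps(3840, 2160, 240, dsc_ratio=3.0)  # 4K@240, 10-bit, 3:1 DSC
for name, payload in PAYLOAD_GBPS.items():
    verdict = "fits" if payload >= need else "doesn't fit"
    print(f"4K@240 w/ DSC needs ~{need:.1f} Gbps; {name} has ~{payload:.1f} Gbps ({verdict})")
```

Under those assumptions, 4K@240 at 10 bpc with 3:1 DSC needs roughly 23-24 Gbps, which is why it fits even inside HBR3's ~25.9 Gbps payload.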

And Linus should really know all this.

Thinking about it... probably the game just scales very well with SM count, which is damn weird vs other games. Maybe a lot of newer games want to do things by SMs?
Not weird at all. The newer a game is, the more likely it is to be limited by math processing first and everything else later. The older a game is, the more likely it is to be limited by memory bandwidth, CPU, old geometry engines, etc.


Wonder how UE5 games will perform on Ada. Better than the average of what's being benchmarked now?
I'd expect UE5 to scale very well on Ada vs Ampere. I do wonder though if UE5 will be able to support Ada's RT features. They don't look like a straight fit at a glance.
 
These preliminary compute results for the new NVIDIA RTX 4090 look very good and offer a significant performance boost over the last-generation RTX 3090 (which was already very good)! I expect results to be better with code compiled against CUDA 12, which will have full support for Ada Lovelace and Hopper arch GPUs. There will be more RTX 4000 series and Pro Axxx-ada series GPUs over the next few months. I will revisit testing.

One surprising result was how respectable the double-precision floating point (fp64) performance was on the RTX 4090. fp64 performance is not a highlighted feature of RTX GPUs; single-precision fp32 is typically 20 times fp64 for these GPUs. However, the fp64 performance of the RTX 4090 is competitive with 16-34 core CPUs. I feel this could be used for testing and development of code that is targeted to run on high-end compute GPUs like the A100 and H100.
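
For context, the paper peaks on AD102 work out as below. This is just a sketch using the published 16384 FP32 lanes, ~2.52 GHz boost, and Ada's 1:64 fp64 rate; measured ratios like the ~20x above can beat the paper ratio because real workloads aren't pure FMA:

```python
# Back-of-the-envelope peak FMA throughput for RTX 4090 (AD102).
fp32_lanes = 16384       # published CUDA core count
clock_ghz = 2.52         # published boost clock
fp32_tflops = 2 * fp32_lanes * clock_ghz / 1000   # FMA counts as 2 flops
fp64_tflops = fp32_tflops / 64                    # Ada's 1:64 fp64 rate

print(f"peak fp32: {fp32_tflops:.1f} TFLOPS")   # ~82.6
print(f"peak fp64: {fp64_tflops:.2f} TFLOPS")   # ~1.29
```

Roughly 1.3 TFLOPS of peak fp64 is indeed in the same ballpark as a high-core-count CPU doing wide SIMD FMA, which fits the "good enough for compute development" argument.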

Overall it looks like NVIDIA has kept the performance increase near doubling on successive GPU generations for another release. They have been doing this for over 10 years now. Impressive indeed!
 
So, back at Radeon HD 7970 GE levels of FP64 perf? Not sure why you picked this particular benchmark, as it shows *very low* figures, but in line with expectations.
 


When you remove frame generation, it's a whopping 12% faster than the two-year-old 3080.

This is why I worry about the rest of the stack. Where can they possibly go with a 4070 that isn't poor value for money? The absolute best case scenario as I see it is something that is as fast as the 3080 for the same price - more than 2 years later. And crazily that would be a great value product compared to the current 4080 12GB!

There is of course the enormous gulf between the 4080 16GB and the 4090, which will likely be filled by several products. But the problem there is that the price can only go up from the already eye-watering £1200.

Even the 4090 is so cut down that in any previous generation it would have been an x080 Ti at best, perhaps even an x080-class GPU. So while we think it's great value compared to previous "Titan level" cards, in fact the Ada Titan card is yet to come. And it's going to be expensive.

Granted, the Ada architecture is so crazy fast that it can arguably justify such a premium at the top end. But not when it's only offering 12% more performance than a standard 3080.
 
This is why I worry about the rest of the stack. Where can they possibly go with a 4070 that isn't poor value for money? The absolute best case scenario as I see it is something that is as fast as the 3080 for the same price - more than 2 years later. And crazily that would be a great value product compared to the current 4080 12GB!
Yeah, been thinking about this since the reveal; they've painted themselves into a corner with these crazy prices, spec differences, and the 4080 "encore". Going 3080 -> 3070 (and 2080 -> 2070) was about 0.8x the performance, which would make the 4070 roughly 3080-level given the 4080 12GB's expected raster perf (between the 3090 and 3090 Ti). The choices look to be: a very high price (~$650-700) for an x70 card, so similar value to the 4080s; an "expected" $500 price, which makes the 4080s terrible value; or a 4070 Ti in between with a bigger cut in performance, but then it's really close to the 3070, because the 3080 is only about 1.2x faster than the 3070. Even if it's 3080 perf for $600, that's 1.2x the speed for 1.2x the price two years later, and it'd mean the 4080 12GB is about 1.25x faster for 1.5x the price.
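
Putting rough numbers on those scenarios (the perf ratios are just the guesses above, the prices are rounded launch MSRPs plus the hypothetical $600 4070; none of this is measured data):

```python
# Relative perf-per-dollar for the scenarios discussed above.
scenarios = {
    # name: (raster perf relative to a 3070, price in USD)
    "3070 (launch)":        (1.00, 500),
    "3080 (launch)":        (1.20, 700),
    "4070 at $600 (guess)": (1.20, 600),  # hypothetical: 3080-level perf
    "4080 12GB":            (1.50, 900),  # ~1.25x the hypothetical 4070
}
base_perf, base_price = scenarios["3070 (launch)"]
for name, (perf, price) in scenarios.items():
    value = (perf / base_perf) / (price / base_price)
    print(f"{name:22} perf-per-dollar vs 3070: {value:.2f}x")
```

Even the friendliest case, a $600 4070 at 3080 perf, works out to ~1.00x: essentially zero perf-per-dollar progress over the 3070 two years on.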

RT performance is much improved and will get better if SER is widely adopted, and DLSS 3 looks good, but it's one hell of a situation when the $1600 GPU looks like it'll be better value than cards almost half its price. Not good for consumers at all.
 
Spider-Man RT 4K achieves a 2X uplift going from 3090 to 4090, Cyberpunk achieves 2.1X, RE Village achieves a 90% uplift, and Metro Exodus achieves 90%.

Rasterization-wise, Spider-Man does a 97% uplift, Forza Horizon 5 does 92%, RE Village 92%, Valhalla 80%, Cyberpunk 80%, Red Dead 2 78%, Dying Light 2 75%, and God of War 75%, with an overall rasterization uplift of 80% across a 12-game average @4K. These are some solid gen-over-gen gains.
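
For what it's worth, overall figures like that 80% are usually the geometric mean of per-game ratios. A quick sketch using just the eight raster numbers quoted here (a subset of the reviewer's 12 games, so it won't match exactly):

```python
# Geometric mean of the eight raster uplifts quoted above.
from math import prod

uplifts = [1.97, 1.92, 1.92, 1.80, 1.80, 1.78, 1.75, 1.75]
geomean = prod(uplifts) ** (1 / len(uplifts))
print(f"geometric mean: {geomean:.2f}x")   # ~1.83x on this subset
```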

 
Spider-Man RT 4K achieves a 2X uplift going from 3090 to 4090, Cyberpunk achieves 2.1X, RE Village achieves a 90% uplift, and Metro Exodus achieves 90%.

Rasterization-wise, Spider-Man does a 97% uplift, Forza Horizon 5 does 92%, RE Village 92%, Valhalla 80%, Cyberpunk 80%, Red Dead 2 78%, Dying Light 2 75%, and God of War 75%, with an overall rasterization uplift of 80% across a 12-game average @4K. These are some solid gen-over-gen gains.

These are incredible gains. I still run games great with my 2080 Ti at 3440x1440, but this card is like 2.5x faster in raster and 3x faster in RT-heavy workloads. This is ridiculous.
 
The 4090 is a compute titan; we need more modern AAA games to show its true power for gaming. We won't see a monster like this for years. It's not without its flaws, though. The power consumption is not very gaming-like: gaming has always been based on tricks to perform better while trying to look as good as possible, with every possible hack enabled. A device consuming 450W of power, or even more in some cases, kinda goes against that trend. nVidia could make a 300W monster; they have the skills.

The performance hit is barely perceptible going from 450W to 300W, as shown by this interesting take on computerbase.de: https://www.computerbase.de/2022-10/nvidia-geforce-rtx-4090-review-test/
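
To put the efficiency argument in perf-per-watt terms (the 0.92 below is a placeholder for the single-digit percentage loss at reduced power limits; see the linked review for the real curves):

```python
# Relative perf-per-watt at a reduced power limit, with an assumed ~8% perf loss.
configs = {
    "450W limit": (450, 1.00),
    "300W limit": (300, 0.92),  # placeholder, not computerbase's exact number
}
stock_watts, stock_perf = configs["450W limit"]
for name, (watts, rel_perf) in configs.items():
    eff = (rel_perf / watts) / (stock_perf / stock_watts)
    print(f"{name}: {eff:.2f}x perf-per-watt vs stock")  # 300W -> ~1.38x
```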

 
Spider-Man RT 4K achieves a 2X uplift going from 3090 to 4090, Cyberpunk achieves 2.1X, RE Village achieves a 90% uplift, and Metro Exodus achieves 90%.

Rasterization-wise, Spider-Man does a 97% uplift, Forza Horizon 5 does 92%, RE Village 92%, Valhalla 80%, Cyberpunk 80%, Red Dead 2 78%, Dying Light 2 75%, and God of War 75%, with an overall rasterization uplift of 80% across a 12-game average @4K. These are some solid gen-over-gen gains.

God damn, what a beast! I can't wait to get mine. Will be such a massive jump up from this still-awesome 2080 Ti.
 
Will be such a massive jump up from this still-awesome 2080 Ti.

That sounds so weird lol. My 2080 Ti, which was a true beast, now all of a sudden seems very... 'weaksauce'. I know I shouldn't ever complain; it's still a powerful GPU, more than capable enough for the coming years and then some. It's easy to forget 'last year's' hardware when looking at today's stuff.
 
That sounds so weird lol. My 2080 Ti, which was a true beast, now all of a sudden seems very... 'weaksauce'. I know I shouldn't ever complain; it's still a powerful GPU, more than capable enough for the coming years and then some. It's easy to forget 'last year's' hardware when looking at today's stuff.
Yeah, I know that feeling. It was weird seeing the 2080 Ti appear lower and lower on all the charts, especially after it dominated for ~2 years with no competition.

The 2080 Ti is special though. I honestly feel like I got my money's worth from getting that card on day 1. Cool new tech, and awesome performance. It delivered. It's also the first "RT card"... and the first card with the RTX nomenclature. I'm going to keep mine, I think.
 
Just got my 4090, and while my case has plenty of space, the triple-slot heatsink forced me to remove my PCIe NVMe expansion card (it's PCIe x4 and can't fit in an x1 slot). Since many motherboards have their two PCIe x16 slots separated by two slots, this is something you may want to check before getting one (or maybe get a riser card?).
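
If you're on Linux, one quick way to see what link width each device actually negotiated after shuffling cards around (this reads the standard sysfs attributes; it shows electrical link width, not physical slot length):

```python
# List negotiated vs. maximum PCIe link width for every device (Linux only).
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    try:
        cur = (dev / "current_link_width").read_text().strip()
        mx = (dev / "max_link_width").read_text().strip()
    except OSError:
        continue  # attribute not exposed for this device
    print(f"{dev.name}: x{cur} (max x{mx})")
```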
 