Nvidia GeForce RTX 4090 Reviews

Performance is great for the price on the 4090; I think it is a much better value than the 3090 was. Still waiting to see how the 4080 turns out.
We were mostly looking at around 3090 pricing for 24GB halo cards in recent years, and it's good to see the 4090 not increasing that premium by a huge amount while delivering a very significant performance increase.
 
The 4090 is certainly a very good GPU. It was never going to match the fantasy numbers being rumored pre-release. AMD will not be reaching this RT performance level.
 
This is a magnitude of performance leap that we haven't seen since the GTX 1080 Ti. Going from the 980 Ti to the 1080 Ti was a 60-70% performance jump. Going from the 2080 Ti to the 3090 was a 50-60% jump. The RTX 4090 is a cut-down die, and offering a 70% increase already means that NVIDIA has really outdone itself, delivering one of the biggest gen-over-gen performance increases. This is an absolute beast of a card that simply has no competition at the moment.
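For anyone wanting to sanity-check those percentages against review numbers: the uplift figures quoted above are just relative increases, new over old minus one. A minimal sketch (the fps figures below are hypothetical placeholders, not benchmark data; only the formula is the point):

```python
def uplift(old_fps: float, new_fps: float) -> float:
    """Relative gen-over-gen performance increase, e.g. 0.70 == +70%."""
    return new_fps / old_fps - 1

# Hypothetical example: a new card averaging 170 fps vs a predecessor at 100 fps
print(f"{uplift(100, 170):.0%}")  # prints "70%"
```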
 
Yeah, the gap to bridge is huge for RDNA3. AMD needs to bring 5 times faster RT engines to compete with Ada

Somewhat realistically, I would like to see AMD slightly surpass the 4080 12GB at a lower price to claw back some market share. It would be great for everyone if we could get back closer to a 50/50 split.
 
Yeah, the gap to bridge is huge for RDNA3. AMD needs to bring 5 times faster RT engines to compete with Ada

Very impressive stuff!
Let's not ask AMD for the impossible, haha. I think they will be able to compete well in real-world workloads with "just" a 3x increase. RDNA2 is so abysmal in RT that surely they should be able to improve a lot. Intel managed it out of the gate; AMD must be able to as well.
 
Just watched the DF video.

As others theorized early on, we are indeed running into CPU bottlenecks in many games, even at native 4K - I suspect Ryzen 7000X3D may eventually replace DF's 12900K as their GPU stress-tester if it lives up to its promise. The fact that the card is so fast it's actually inducing stuttering, because the CPU can't keep up, was not something I necessarily expected - kudos to DF for highlighting this. I would never expect a 100% increase over the 3090 Ti, but the fact that it actually reaches 80-100% over the regular 3090 in a few games is still impressive, especially considering VRAM bandwidth has barely changed. With Samsung's upcoming GDDR7, it doesn't look like we'll hit a performance wall due to bandwidth anytime soon, so that's good.

DLSS3 though? Well, we need to see Alex's in-depth video on this for sure, as it was barely touched upon. But damn, those are some pretty huge caveats at this early stage. The inability to cap the frame rate, and the early 'solution' of forcing vsync through the NVIDIA Control Panel potentially adding lag too, is... not great. Combine that inability to cap the framerate with the aforementioned stuttering when the CPU is overloaded - which it definitely will be with the framerate uncapped - and the fact that DLSS can end up magnifying poor frametimes to boot, and I'm actually somewhat surprised it shipped in this state.

DLSS3's garbage-in-garbage-out also puts a crimp in the argument that DLSS3 will be the only way to run 'true' next-gen games at 60+ fps that wouldn't be possible otherwise: it really doesn't look like anything less than a solid 60fps before DLSS3 kicks in will provide anything approaching a good experience, and I'm not even talking about latency. Again, it's early, and we'll see with Alex's video, but those are concerning issues.

I could also pick up some rather prominent artifacts in some games without pausing the video; in Cyberpunk in particular, the power lines had very prominent breakup with DLSS Performance and additional artifacts with DLSS3. This isn't necessarily a huge knock against DLSS3 itself, as I get that a 60fps video is not presenting it in its more likely real-world use case of 120+ fps, but like I mentioned before, DLSS2 in Performance mode just isn't close enough to native 4K for me to consider it a viable replacement like DF seems to believe. It's not so much that DLSS3's frames may have negligible additional artifacting over DLSS2 Performance-mode frames; my concern is that DLSS Performance frames are the starting point.
 
DLSS3 though?
DLSS3 is an optional feature you get on top of these +80-100% performance boosts, and it's rather unlikely that any competitor will offer something similar over the market lifetime of Ada. So it's not exactly needed for Ada to be successful and you're hardly forced into using it. From this point of view the fact that it comes with a number of issues and limitations doesn't seem that important to me.
 