Nvidia GeForce RTX 5090 reviews

I find it funny that people were saying NVIDIA wasn't innovating enough "because no fast raster", yet when NVIDIA introduced new things like G-Sync, DLSS, or ray tracing it was "useless because no one will use them", "that's trivial, anyone can do that", or "fake frames!" Even now with the 5090 it's the same, with the new transformer DLSS models and neural textures. I guess it's best to just have super-fast 2D sprite engines like in the old NES days; that'd be the most innovative thing ever!

The 5090 is definitely an iterative product, making improvements on what already exists. The biggest advancement is probably the scheduling for the "neural shader" stuff, so you can have AI workloads in flight alongside graphics workloads. But that's usually how GPUs go; the really big revolutionary changes are rare. The 50 series is probably to "neural shaders" what the 20 series was to ray tracing. It'll take a while to see how it pans out.
 
This is probably the funniest take I’ve seen. The 5090 is disappointing because Nvidia employees are rich and lazy. 😅

Funny isn't the word I'd use. More like brick-wall facepalm stupid with a 100 metre run-up. Someone here has said the same thing recently IIRC, quite possibly the same poster.

In tech terms this generation seems to me to be what we should expect. The old, golden days of 2x per generation are just something that the "grandads" remember, and some of us here are the grandads.
 
The difference is the 2000 series introduced a new workload.... RT.
Which no one used to compare the cards, since it couldn't (generally) run on the previous generation. That's kinda similar to how things are with neural rendering and Mega Geometry now. And just like with RT, we'll see eventually whether these get used and how they run on the new gen versus the previous ones.

It's such a horrible gen-on-gen increase that it shouldn't even qualify as a new generation.
A horrible increase would actually be a decrease with zero new features. This isn't it.
 
I find it funny that people were saying NVIDIA wasn't innovating enough "because no fast raster", yet when NVIDIA introduced new things like G-Sync, DLSS, or ray tracing it was "useless because no one will use them", "that's trivial, anyone can do that", or "fake frames!" Even now with the 5090 it's the same, with the new transformer DLSS models and neural textures. I guess it's best to just have super-fast 2D sprite engines like in the old NES days; that'd be the most innovative thing ever!

I’m actually surprised there aren’t more chants of “Nvidia doesn’t care about gaming”. I’m always amused when people complain about Nvidia’s initiatives as if doing nothing is the better option. Ideally we would have competing visions and investments but hating on the only company that’s trying to do something is bizarre to me.
 
This is probably the funniest take I’ve seen. The 5090 is disappointing because Nvidia employees are rich and lazy. 😅
That's one way to interpret that post.... Obviously the incorrect way but it's certainly a way..... The correct way to interpret that post is that Nvidia will lose talented employees because they can afford to go seek other pursuits in life. It's hard to find and replace really talented employees because the base required knowledge to even compete for that type of position severely limits the available talent pool....

Anyway, it's often interesting how people can read similar things and some can derive extremely ridiculous interpretations of said thing.....
 
That should enable higher clocks, which would increase the performance per transistor.
Kinda, but the gains there aren't linear, so you might get -25% on power but only +10% on clocks, for example. Would that be a big improvement? People here are saying that +30% isn't.
And then you may need to reduce the complexity of the dies to keep costs at least similar to the previous generation, which would fully negate that clock improvement.
Nvidia isn't stupid, the choice of 4N for Blackwell isn't stupid either.
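The non-linear clocks-versus-power trade-off above can be sketched with first-order CMOS scaling. This is a rough back-of-the-envelope illustration, not a model of any real GPU or process node: dynamic power scales roughly as P ∝ C·V²·f, and since supply voltage typically has to rise roughly in step with frequency, power goes roughly as f³.

```python
def clock_headroom(power_reduction: float) -> float:
    """Extra frequency available at iso-power when a node cuts dynamic
    power by `power_reduction`, assuming P scales as f**3
    (P = C * V**2 * f, with V roughly proportional to f)."""
    return (1.0 - power_reduction) ** (-1.0 / 3.0) - 1.0

# A node marketed as "-25% power" buys only about +10% clocks at the
# same power budget -- the numbers quoted in the post above.
print(f"{clock_headroom(0.25):.1%}")  # → 10.1%
```

Under this crude assumption, a headline power-efficiency figure for a new node translates into a much smaller clock gain at the same power budget, which is the point being made about the choice of 4N for Blackwell.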
 
That's one way to interpret that post.... Obviously the incorrect way but it's certainly a way..... The correct way to interpret that post is that Nvidia will lose talented employees because they can afford to go seek other pursuits in life. It's hard to find and replace really talented employees because the base required knowledge to even compete for that type of position severely limits the available talent pool....

Anyway, it's often interesting how people can read similar things and some can derive extremely ridiculous interpretations of said thing.....

How is the interpretation incorrect? You’re saying yet again that Nvidia employees hit the jackpot and that’s why the 5090 is disappointing.
 
Which no one used to compare the cards, since it couldn't (generally) run on the previous generation. That's kinda similar to how things are with neural rendering and Mega Geometry now. And just like with RT, we'll see eventually whether these get used and how they run on the new gen versus the previous ones.
Yea, Neural rendering and mega geometry are a long way in the distance. Even RT's uptake after 6 years is sub 1% in newly released games...
A horrible increase would actually be a decrease with zero new features. This isn't it.
By Nvidia's standards, this is very poor. There are no real architectural gains to speak of, and most of the increase comes from the higher SM count. An Nvidia release similar to Intel's Core Ultra series would almost certainly see an immediate downgrade in the stock.
 
Yea, Neural rendering and mega geometry are a long way in the distance.
MG is being added to AW2 right now and the patch is supposedly days away.
NR is likely years away from any implementation though.

There are no real architectural gains to speak of
If you look at shading h/w only, for whatever reason, then sure. But that's completely beside the point of all the architectural improvements in GPUs over the last 5 years.
Even then they've updated their SIMDs with uniform INT support. People are saying that this should help with running work graphs for example.
 
Mega Geometry will see a very big uptake, it's supported by all RTX GPUs.

RT uptake is now roughly 50% of newly released AAA games. We have games releasing with mandatory RT, such as Indiana Jones and Doom: The Dark Ages.
I very much doubt it. The devs most likely to adopt Mega Geometry first are the ones that actually need it for their projects, so it's safe to assume AAA devs will adopt it first. However, it'll likely land in projects that are early in development or that Nvidia sponsors. AAA games have long lead times, which means by the time adoption is actually meaningful it'll be 2-3 GPU generations from now, around when the new consoles launch. That's provided the consoles support a similar feature; if they don't, I expect an even slower uptake for multi-platform projects.

With regards to the second point, I didn't specify AAA releases, just all releases. The two examples you provided with mandatory RT are two out of how many AAA releases this year? While nice to see, it's insignificant at this point in time, especially when we see that RDNA 2 qualifies as RT-enabled...

Finally, the reason I specified all releases is that if you look at the top 50 most-played games on Steam, that list is not remotely dominated by AAA games. Most people spend their time playing games that aren't considered AAA, which makes filtering by AAA entirely pointless.
 