Speculation and Rumors: Nvidia Blackwell ...

But why? That's the bit I'm not getting. They can showcase games on the 5090 that no one can afford. What's the downside of that for Nvidia? Will they lose sales?
It's not that Nvidia would lose anything from that, it's more that game developers won't support a card which would sell in the thousands (probably) over its lifetime - which means that Nvidia won't really have anything to show aside from 8K (pointless) and 200+ fps (also a bit pointless, as you won't see these through YouTube).

Producing such an expensive tier makes little sense if the next one below it isn't capable of running the same software at acceptable performance.
 
You can always reduce resolution or other options to gain performance. Path tracing in Cyberpunk 2077 is insanely demanding, but it's still usable on lower tier cards if you are willing to sacrifice resolution fairly heavily. Things don't need to run really well on lower tier parts to still be implemented. There's always the occasional developer who likes pushing super taxing graphics features in games even if there's not going to be some big audience for it. I mean, I think we've all talked about the merits of games including 'Future' options that might only become usable for people playing the game years later on more powerful GPUs. There's not always some great immediate financial benefit to doing so, but sometimes devs just like to showcase their tech. There's still good publicity from it.
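For a rough sense of how much headroom a resolution drop buys, here's a back-of-the-envelope sketch; it assumes per-frame cost scales with pixel count, which is only an approximation for path-traced workloads:

```python
# Rough illustration of how much GPU work a resolution drop saves.
# Assumes per-frame cost scales roughly with pixel count, which is only an
# approximation (path-traced workloads also have per-frame fixed costs).
resolutions = {"4K": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}
base = 3840 * 2160
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / base:.0%} of the 4K pixel count")
# 1440p is ~44% and 1080p is ~25% of the 4K pixel count, which is why heavy
# resolution cuts can make path tracing usable on lower tier cards.
```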
 
Tim weighs in.


Interestingly he's framing it as "weakest 80 class" instead of "strongest 90 class". Funny how people can look at the same situation and interpret it very differently.
 
Interestingly he's framing it as "weakest 80 class"
The 680, 980, 1080 and 2080 were all roughly on par with the previous generation's higher tier parts. If we account for things like the 590 and 690 (when they actually worked), then this seems like a far cry from "weakest 80 class" really.

But anyway, no one can say anything about how weak something is without knowing the retail price of that product. The 5090 looks like a huge jump going by the specs, but I can easily see a situation in which it's actually the 5080 that provides the much better price/performance improvement.
 
The improvement over the prior generation is also a factor, though Tim didn't cover it. If a 5080 is 40% faster than a 4080, would folks still think the 5080 is 5070 class?
 
The improvement over the prior generation is also a factor, though Tim didn't cover it. If a 5080 is 40% faster than a 4080, would folks still think the 5080 is 5070 class?
Yes. The important part is how it stacks up to the rest of the cards in its generation.

I don't know why people think a $2500+ 5090 is unreasonable. NVIDIA said the 4090 was a $1600 card. Consumers (probably mostly gamers) decided it was a $1900-$2000 card. This doesn't indicate to me that NVIDIA has any reason to keep flagship pricing in check.
 
Tim weighs in.


Interestingly he's framing it as "weakest 80 class" instead of "strongest 90 class". Funny how people can look at the same situation and interpret it very differently.
He's honestly far more generous than the situation merits, already considering the 4080 an actual '80' class GPU and not the 70 class it really is (a cut-down, sub-400mm² upper mid-range die with a 256-bit bus). It's seriously tragic how bad things have gotten in the GPU space. And it's worse that people defend and apologize for it, even though we can see that in other areas, like CPUs, things are not nearly as bad, all because we have proper competition there.
 
The improvement over the prior generation is also a factor, though Tim didn't cover it. If a 5080 is 40% faster than a 4080, would folks still think the 5080 is 5070 class?
Pascal was a massive performance improvement over the prior generation, but Nvidia didn't call a 1060 a 1070Ti and double the price for it. That's exactly what happened with Lovelace, though.

We shouldn't be paying based on performance. We should be paying based on the actual hardware and costs. Obviously performance sometimes comes from bigger dies or more costly process nodes, so some level of price increase is to be expected in those cases, but there needs to be a reasonable limit: if a chip costs 50% more, that doesn't justify the price of the entire graphics card rising by more than 50%, when the chip itself is only about half the overall cost of the card.
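To put rough numbers on that point, here's a minimal sketch with purely illustrative figures (not actual BOM data):

```python
# Purely illustrative numbers, not actual BOM figures; the point is the ratio,
# not the dollar amounts.
old_chip_cost = 150.0                    # assumed GPU die cost, previous gen
other_costs = 150.0                      # assumed memory, PCB, cooler, assembly
old_card_cost = old_chip_cost + other_costs

new_chip_cost = old_chip_cost * 1.5      # die cost rises 50%
new_card_cost = new_chip_cost + other_costs

increase = (new_card_cost / old_card_cost - 1) * 100
print(f"Card cost increase: {increase:.0f}%")   # 25%, not 50%+
```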

Arguing that we should be paying based on the level of performance improvement is basically the kind of thing you'd expect a senior exec at Nvidia to argue. To see consumers arguing this is insanely depressing.
 
I don't know why people think a $2500+ 5090 is unreasonable. NVIDIA said the 4090 was a $1600 card. Consumers (probably mostly gamers) decided it was a $1900-$2000 card. This doesn't indicate to me that NVIDIA has any reason to keep flagship pricing in check.
Speaking of which, this could apply to the mid-range too: the 4060 is now the most used GPU on the Steam Hardware Survey. I think this is the first time in a while that a 60 class card has climbed up the charts within the same generation; it only happened back with the 970 generation, as far as I can remember. Combined desktop + laptop shares (quick check of the sums after the list):
  1. RTX 4060 Desktop – 4.58% + RTX 4060 Laptop – 4.37% = 8.95%
  2. RTX 3060 Desktop – 5.86% + RTX 3060 Laptop – 3.00% = 8.86%
  3. RTX 4060 Ti Desktop – 3.66%
  4. GTX 1650 Desktop + Laptop – 3.64%
  5. RTX 3060 Ti Desktop – 3.57%
  6. RTX 3070 Desktop – 3.31%
  7. RTX 2060 Desktop + Laptop – 3.30%
  8. RTX 4070 Desktop – 2.91%
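And a quick sanity check of those combined totals, a minimal sketch with the percentages copied straight from the list above:

```python
# Quick check of the combined shares listed above; percentages copied from the
# Steam Hardware Survey figures quoted in the post.
shares = {
    "RTX 4060": [4.58, 4.37],      # desktop, laptop
    "RTX 3060": [5.86, 3.00],      # desktop, laptop
    "RTX 4060 Ti": [3.66],
    "GTX 1650": [3.64],            # desktop + laptop already combined
    "RTX 3060 Ti": [3.57],
    "RTX 3070": [3.31],
    "RTX 2060": [3.30],            # desktop + laptop already combined
    "RTX 4070": [2.91],
}
for gpu, parts in sorted(shares.items(), key=lambda kv: -sum(kv[1])):
    print(f"{gpu}: {sum(parts):.2f}%")
# The 4060 edges out the 3060 once laptop share is added: 8.95% vs 8.86%.
```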
 
We shouldn't be paying based on performance. We should be paying based on the actual hardware and costs.
That's not how the technology world works, by any stretch of anyone's imagination. GPUs have never sold that way, CPUs have never sold that way, nor have motherboards, storage devices, host bus adapters, monitors, power supplies... This is not a rational take.
 
We shouldn't be paying based on performance. We should be paying based on the actual hardware and costs.
Costs include features and R&D, which are higher for NVIDIA than for any competitor. You also pay for support and longevity: NVIDIA offers the longest driver support in the industry, with current support covering 5 generations (vs 3 generations for AMD).
 
Prices depend on the market. If your $1 production cost product can compete with a $500 retail price product from another company then you will sell it at $500 and bank the profit. No amount of whining will change the way the market economy works, and honestly all suggestions on how to change that have been dangerously close to communism thus far.
 
Costs include features and R&D, which are higher for NVIDIA than for any competitor. You also pay for support and longevity: NVIDIA offers the longest driver support in the industry, with current support covering 5 generations (vs 3 generations for AMD).
Nvidia are genuinely famous for dropping any kind of driver support for architectures more than a single generation old. lol What on earth kind of bizarro world are you living in? And by driver support, I don't mean 'still functions', I mean 'is actively being optimized for'. You don't need driver support for things to still function. I'm still using a Pascal GPU and there are very few things I literally can't run at all due to lack of driver support (it's mostly just lack of horsepower), even though they dropped driver support ages ago.
 
That's not how the technology world works, by any stretch of anyone's imagination. GPUs have never sold that way, CPUs have never sold that way, nor have motherboards, storage devices, host bus adapters, monitors, power supplies... This is not a rational take.
That's literally how things have ALWAYS worked in the processor world up until very recently. And it's still how things actually work with CPUs.
 
This is just completely false.
No, it's really not. Most PC gamers paying attention know this, but of course you're going to deny it to your deathbed because you've literally never once accepted any criticism of Nvidia ever, in any situation, and for any reason, no matter how legitimate.
 
No, it's really not.
Yes, it really is. Check the "Supported products" tab.

Most PC gamers paying attention know this, but of course you're going to deny it to your deathbed because you've literally never once accepted any criticism of Nvidia ever, in any situation, and for any reason, no matter how legitimate.
Most PC gamers don't know anything and are just spreading the same misinformation which has been debunked dozens of times over.
 
Yes, it really is. Check the "Supported products" tab.


Most PC gamers don't know anything and are just spreading the same misinformation which has been debunked dozens of times over.
Again, I'm talking active optimizations. We all know and have seen how Nvidia architectures more than a single generation old tend to fall behind heavily in performance in newer games. You're like the Ministry of Truth telling us to deny the things we're seeing with our eyes and ears.
 
Again, I'm talking active optimizations.
Most optimizations Nvidia implements are relevant to all h/w which the driver supports.
More than that, due to how their h/w and drivers work these days, the "active" optimizations don't bring much change in performance.
Someone who's actually paying attention would know this.

We all know and have seen how Nvidia architectures more than a single generation old tend to fall behind heavily in performance in newer games.
Which has zero to do with the drivers and is usually a result of game developers dropping optimization for these particular h/w families.
That being said, recently this hasn't been the case either. The last generation which noticeably suffered from this, due to most games using console GCN-specific features, was Kepler.

You're like the Ministry of Truth telling us to deny the things we're seeing with our eyes and ears.
I'm actually the one who knows what he's talking about, while you're describing things that exist only inside your head.
 