Nvidia GeForce RTX 50-series Blackwell reviews

Profit isn’t a four-letter word like so many seem to think it is.


In what way is it worse than what came before?


I have no idea how you think product segmentation causes FOMO.


Outside of the false statements above, it sounds like how you stay in business and pay your employees and owners.

I don't know why you broke the text into pieces, since the reasoning only works as a whole. The problem is not profit; the problem is using methods that are bad for the consumer in order to achieve it. For example, I can sell pure whole milk, or I can water it down to stretch it into more product and make more money.

Nvidia makes a worse product today than it did in the past. The chips are cut down very aggressively relative to the flagship, and the generational performance improvement is small. A 60-series chip used to be half the flagship chip; today, that is where an 80-series chip sits.

[Attached chart: CUDA cores as a percentage of the full flagship chip]


The segmentation creates FOMO in the following way. You look at the 5060 and see that it has 8 GB of memory. Many games are already limited by that small amount of memory, so you look at the segment above. The 5060 Ti 16 GB seems better, but not that much better, while the 5070 is a big step up and its price is close to the 5060 Ti 16 GB. However, the 5070 only has 12 GB, and that can be limiting in future games, since you want a card that will last a good few years. Finally, you end up getting the 5070 Ti, which is 3x the price of the 5060.

It's the same strategy Apple uses with its products: you always feel the need for something better, and the options above always have something that makes you go higher and higher.
 
A 60-series chip used to be half the flagship chip.
Also, a flagship chip used to be less than 500 mm^2.
Also, launches for these families used to be staggered, with the top chip released a year after all the smaller ones.
Also, the production situation back then was completely different.

This image gives a skewed look at the data, omitting a lot of what is important to consider when talking about this.
 
Also, this kind of mindset seems to encourage IHVs to sandbag, because if you make something extremely good, such as the 1080 Ti or the 4090, you'll almost certainly get ridiculed for the next "not so good" product (such as the 2080 Ti, which was considered poor value when it was released).
 
The question is what happens when the workload doesn't scale with wider GPUs? The 5090 should be 2x as fast as the 5080, but it is only around 50% faster. So either you only grow the die by as much as performance actually improves, or you go all in for the fun of it.
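To put a rough number on that, here is a minimal Amdahl-style sketch. The 70% figure below is an assumption chosen for illustration, not a measured frame-time breakdown, and "width_ratio" is just shorthand for how much wider one GPU is than another.

def speedup(width_ratio, scalable_fraction):
    # Only `scalable_fraction` of the frame time shrinks when the GPU gets
    # `width_ratio` times wider; the rest (CPU submission, sync points,
    # bandwidth-bound passes) is treated as a fixed cost.
    return 1.0 / ((1.0 - scalable_fraction) + scalable_fraction / width_ratio)

print(speedup(2.0, 0.70))  # ~1.54x: twice the GPU, only ~50% more performance
print(speedup(2.0, 1.00))  # 2.0x only if literally everything scales with width

The real 5090-vs-5080 gap obviously has more going on (clocks, bandwidth, power limits), but the shape of the problem is the same.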
 
99% of these discussions end up boiling down to arguing over which metrics we are allowed to use to compare value gen over gen. However, the underlying phenomenon is pretty hard to dispute: PC gaming has become more expensive, and the value proposition of this generation is pretty poor. The reasons for this are numerous: part of it is just plain old semiconductor physics, but part of it is that PC gaming has exploded in popularity in the past 10 years and Nvidia can charge more due to the higher demand.

In the end I don't think consumers will care: those who can't afford the high-end cards will buy the low-end junk, and those with excess cash will buy the nice high-end stuff. Most consumers aren't very value oriented.
 
Most consumers don’t care because they’re not doing historical analysis to make a purchase decision. It simply boils down to what can I get for my money “today”. The enthusiast/review scene assumes everyone is upgrading from a 2-3 year old card and completely misses a significant chunk of the buying market.

I’ve already accepted that upgrading a GPU every 2 years is a waste and that it won’t get any better. I’m already on a 6-year cycle for cell phones because progress has slowed to a crawl there.
 
Most consumers don’t care because they’re not doing historical analysis to make a purchase decision. It simply boils down to what can I get for my money “today”. The enthusiast/review scene assumes everyone is upgrading from a 2-3 year old card and completely misses a significant chunk of the buying market.

I’ve already accepted that upgrading a GPU every 2 years is a waste and that it won’t get any better. I’m already on a 6-year cycle for cell phones because progress has slowed to a crawl there.
Sure, I agree. That doesn't mean the analysis is worthless. Consumers making poor decisions on value does not mean we throw out the concept of value entirely. Nvidia gives us a worse value proposition every generation; reviewers will call it out, but ultimately someone buying into PC gaming now can't buy at prices from 5 years ago.
 
Sure, I agree. That doesn't mean the analysis is worthless. Consumers making poor decisions on value does not mean we throw out the concept of value entirely. Nvidia gives us a worse value proposition every generation; reviewers will call it out, but ultimately someone buying into PC gaming now can't buy at prices from 5 years ago.

The analysis isn’t worthless. It’s the conclusions that are completely useless to someone who’s just trying to buy a card. According to some reviewers if you have $400 to spend today you should find a new hobby instead because $400 got you a better card 5 years ago. As if you’re a time traveler. I don’t know of any other hobby where this happens even though graphics cards are still very cheap as far as hobbies go.

It’s ok to look at data and acknowledge things are slowing down. It’s the rage baiting and completely unhelpful advice that I can’t stand from influencers who know better.
 
I said the entire prebuilt market is a borderline scam, not that it’s all low end PCs.
Nice, so now I got scammed because I wasn’t smart enough to do research before I bought?

Quit moving the goalposts again, and try arguing in good faith. It’s okay to not be right 100% of the time and learn from others.
 
The segmentation creates FOMO in the following way. You look at the 5060 and see that it has 8 GB of memory. Many games are already limited by that small amount of memory, so you look at the segment above. The 5060 Ti 16 GB seems better, but not that much better, while the 5070 is a big step up and its price is close to the 5060 Ti 16 GB. However, the 5070 only has 12 GB, and that can be limiting in future games, since you want a card that will last a good few years. Finally, you end up getting the 5070 Ti, which is 3x the price of the 5060.

It's the same strategy Apple uses with its products: you always feel the need for something better, and the options above always have something that makes you go higher and higher.
Ok, yeah, I can understand that causing anxiety amongst buyers. But it’s been that way forever and it’s marketing 101.

If you have two products, one at a value price point and one at a premium price point, the way to increase sales of the premium product is to place a medium-priced product between them. That “pushes” people to the more premium product.

As an example, look at the car market, with 3-5 trim levels for each model, usually priced pretty close to each other.

I don’t see Nvidia being any more anti-consumer here than any other business.

Nvidia makes a worse product today than it did in the past. The chips are cut down very aggressively relative to the flagship, and the generational performance improvement is small. A 60-series chip used to be half the flagship chip; today, that is where an 80-series chip sits.

I don’t buy a chip size. I buy product features and performance. How the creator of the product delivers features and performance is irrelevant to me as a consumer.

Back to a car analogy: do you know how much it costs the manufacturer to make your engine pistons, or do you just care that it has a certain performance level compared to comparable products?
 
The question is what happens when the workload doesn't scale with wider GPUs? The 5090 should be 2x as fast as the 5080, but it is only around 50% faster. So either you only grow the die by as much as performance actually improves, or you go all in for the fun of it.

This is a forever problem in computer engineering, of course. Look at the history of CPUs: they used to get faster simply by increasing the clock rate, because that was relatively easy to do (use a better process, add pipelining, etc.). The benefit is that you don't need to modify your software to make it faster. However, they then hit the memory problem: memory latency did not (and still does not) improve anywhere near as fast as CPU clock rates. So they added caches to hide the latency. Even so, as predicted, clock rates couldn't keep climbing. Intel tried desperately with the insanely pipelined Pentium 4, which did achieve a high clock rate but not high performance, because a longer pipeline, in a sense, demands more parallelism from the software, and it also raises the branch misprediction penalty. Adding more execution units helps too, but that again requires more parallelism and is hugely expensive. This was probably the first time a consumer CPU actually hit a "power wall."
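To put the pipeline point into numbers, here is a back-of-the-envelope sketch. The branch frequency, misprediction rate, and flush penalties below are made-up but plausible assumptions, not actual Pentium 4 figures; the point is only that a mispredicted branch costs roughly the pipeline depth in wasted cycles.

def effective_cpi(base_cpi, branch_freq, mispredict_rate, penalty_cycles):
    # Cycles per instruction once pipeline flushes from mispredicted
    # branches are included; penalty_cycles grows with pipeline depth.
    return base_cpi + branch_freq * mispredict_rate * penalty_cycles

shallow = effective_cpi(1.0, 0.20, 0.05, 10)  # shorter pipeline
deep    = effective_cpi(1.0, 0.20, 0.05, 25)  # much deeper pipeline
print(shallow, deep)  # 1.10 vs 1.25: the deeper design needs ~14% more clock just to tie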

So people at the time knew that CPUs had to go multicore, but that requires new software: a chicken-and-egg problem. The stalemate persisted for some time, but there was simply no other solution, and software had to change. Of course this is a slow process, and even today most everyday applications do not really benefit much from a many-core CPU (e.g. more than 8 cores).
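The diminishing returns from piling on cores fall straight out of Amdahl's law; the 80% parallel fraction below is just an assumed, fairly generous number for an everyday application.

def amdahl(parallel_fraction, cores):
    # Overall speedup when only `parallel_fraction` of the work can be
    # spread across `cores` cores and the rest stays single-threaded.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (2, 4, 8, 16, 32):
    print(cores, round(amdahl(0.80, cores), 2))
# 2 -> 1.67x, 8 -> 3.33x, 32 -> only 4.44x: past ~8 cores the gains get thin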

Now, imagine that a journalist back then wrote an article saying "a dual-core CPU won't improve the performance of your old applications." It's obviously true. But if they also wrote "so a higher-clocked CPU is what you should go for, forget about dual-core CPUs," that would be stupid. What they should tell their readers is that single-core performance is hitting a wall, so people should start investing in multicore CPUs and the corresponding software updates. That is what a responsible journalist should write.

What we are seeing right now is many streamers calling themselves "journalists" who don't even have this basic understanding, or worse, who do but choose ragebait just to boost their viewership.
 
It’s ok to look at data and acknowledge things are slowing down. It’s the rage baiting and completely unhelpful advice that I can’t stand from influencers who know better.
Yeah I can probably agree with you there.

Nice, so now I got scammed because I wasn’t smart enough to do research before I bought?
I don't know what your specs are exactly but... yeah, probably? That's why we research before we buy. Rule #1 of PC gaming is that pre-builts are low quality and way overpriced for the component quality, particularly the low-end models that end up using cheaper, shittier versions of budget cards (e.g., a 3060 8GB instead of 12GB).

Quit moving the goalposts again, and try arguing in good faith. It’s okay to not be right 100% of the time and learn from others.
The problem is you are reading what I am saying and imagining some other extension or conclusion. Read what I wrote and it will make sense: these lower-VRAM versions of budget cards end up being put into substandard prebuilts bought by people who don't know any better. It's basically just taking advantage of uninformed parents buying gifts for their children, or other non-technical people who don't even know what VRAM is (I fell into this camp many years ago!). The market would be better off if Nvidia didn't even release this product; that's how bad it is imo. It is horribly lopsided: the card with more VRAM (even 12GB would be fine tbh, but I get they can't do that with the bus width they've chosen) would age infinitely better considering it's actually decently powerful.
 