Nvidia GeForce RTX 50-series Blackwell reviews

I sorely miss the days when GPU coverage skewed more mature and more technical.
I haven't watched many "reviews" for this GPU generation because the reviewers (this goes for both AMD and Nvidia reviews) have been annoying me with rubbish I don't need to know while spending more time sniping at things. Your post actually just made me realise why, and you're right. I used to watch or read reviews from many places, even reviews of 3rd-party cards that really were just factory overclocks and the like.

Influencers aren’t trying to help people make informed decisions though.
They have turned into this without me realizing it; this is indeed where my problem comes from.
 
and think that other people who buy PC h/w can't read/think/research and thus must be limited in their options.
This isn’t an opinion, this is a fact. See: the entire prebuilt market.

IIRC one of the most popular prebuilts on Amazon was running some sort of relabeled Fermi card. In like 2023.
 
This isn’t an opinion, this is a fact. See: the entire prebuilt market.
So you're saying people cannot opt to buy what they want, and instead their money is yanked from their hands and directly replaced with an 8GB video card? No choice at all, it's simply a fact?

You keep on using that word; I do not think it means what you think it means. /PrincessBride
 
This is a communist type opinion, not a fact.
Like I said, Fermi cards in prebuilts in 2023.

(Also, ‘you shouldn’t sell junk to consumers’ is not a ‘communist type opinion’).

So you're saying people cannot opt to buy what they want, and instead their money is yanked from their hands and directly replaced with an 8GB video card? No choice at all, it's simply a fact?
No, I didn’t say this. Feel free to read what I wrote and if you want me to clarify a section let me know. Hope this helps!
 
You said it was a fact; you said people would just buy prebuilts without researching them at all and then will have to "drop settings on day one" -- drop them to what, exactly? Modern games will autodetect VRAM; hell, even GTA V could autodetect VRAM quantity when it was released a dozen years ago and make configuration changes on your behalf to avoid performance issues.
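Just to illustrate how trivial that detection is (this is a made-up sketch, not any engine's actual code, and the preset thresholds are invented for the example), a game can query the adapter's dedicated VRAM through DXGI and map it straight to a texture preset:

```cpp
// Illustrative sketch only: read the primary adapter's dedicated VRAM via DXGI
// and pick a texture-quality preset from it. Thresholds are hypothetical.
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")  // MSVC: link against dxgi

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;  // adapter 0 = primary GPU

    DXGI_ADAPTER_DESC1 desc{};
    adapter->GetDesc1(&desc);
    const double vramGiB =
        static_cast<double>(desc.DedicatedVideoMemory) / (1024.0 * 1024.0 * 1024.0);

    // Hypothetical preset thresholds, purely for illustration.
    const char* texturePreset = (vramGiB >= 16.0) ? "Ultra"
                              : (vramGiB >= 12.0) ? "High"
                              : (vramGiB >= 8.0)  ? "Medium"
                              :                     "Low";

    std::printf("Detected %.1f GiB of dedicated VRAM -> texture preset: %s\n",
                vramGiB, texturePreset);
    return 0;
}
```

That's the whole mechanism a GTA V-style auto-configuration is built on: read the memory budget, pick defaults that fit it.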

So, it's not "a fact" -- it's your opinion, and not a well-informed one IMO. Sure, anyone can force more settings than their hardware can reasonably support, and now we're back to the failed argument of why games support settings far in excess of what most gamers have in their arsenal. Apparently there's no way at all to make everyone happy; even a 16GB version of the 5060 is still going to run into severe performance issues running at high levels intended for 24+ GB VRAM cards. Why is the line drawn only where you want it to be?

Last I looked at the Steam hardware survey (today), 65% of Steam gamers are running 8GB of VRAM or less. The "most typical" VRAM quantity is still 8GB, with an installed base 2.5x that of the next highest VRAM quantity (12GB).

It's not a fact, not in the slightest.
 
You said it was a fact; you said people would just buy prebuilts without researching them at all
Correct, it is a fact that this happens. If you don't think this happens then you don't have any non-technical friends. I've literally seen this happen: they see it has an 'Nvidia XX60' and think it's a fine budget Nvidia card. This isn't the first time Nvidia has done this with two tiers of the 60 card (in fact this is the third time in a row, and each time is more egregious).

Let me rephrase my point in a simpler way: this product is purely there to trap pre-built buyers or uninformed DIYers.

Modern games will autodetect VRAM; hell, even GTA V could autodetect VRAM quantity when it was released a dozen years ago and make configuration changes on your behalf to avoid performance issues.
Whether the user or the game drops settings is completely irrelevant; they are dropping settings. This is such an absurd technicality even for this forum.


Sure, anyone can force more settings than their hardware can reasonably support, and now we're back to the failed argument of why games support settings far in excess of what most gamers have in their arsenal
The point is this card is lopsided: it has plenty of ‘juice’ but will be hampered from day one by low VRAM capacity.

even a 16GB version of the 5060 is still going to run into severe performance issues running at high levels intended for 24+ GB VRAM cards.
I have yet to see a (not broken) game that needs 24GB for any level of settings.


Last I looked at the Steam hardware survey (today), 65% of Steam gamers are running 8GB of VRAM or less. The "most typical" VRAM quantity is still 8GB, with an installed base 2.5x that of the next highest VRAM quantity (12GB).
So? Most people on Steam aren't running brand-new systems; we are talking about a brand-new card. Once upon a time 8GB was more than enough (Pascal, coincidentally one of the most popular generations on the Steam charts), and this will be reflected in the Steam survey. That doesn't mean it's appropriate on a new release.

I swear this is like those Japanese soldiers still fighting WW2 into the 1970s: everyone else with technical knowledge agrees 8GB cards are pretty much junk outside of super-low-end contexts (I'd be fine with something like a 5050 8GB for a very low price, because that's honestly the only tier where 8GB is proportional), yet a few people on here still think it's acceptable.
 
8 GB isn't enough to run console equivalent settings in plenty of games.
It is enough because XSS has 10GB with only 8 of them being fast.
People have a weird understanding of what "console equivalent settings" are, though.
Running at a mix of medium and high in 4K is not "console equivalent" when the console in question runs them in 900p.

Like I said, Fermi cards in prebuilts in 2023.
So? Have you ever stopped to think that prebuilts aren't bought only for gaming? Or that people have their own ideas on how to use these after they've bought them?

(Also, ‘you shouldn’t sell junk to consumers’ is not a ‘communist type opinion’).
8GB GPUs are not "junk"; that's HUB talking out of your mouth.
 
It is enough because XSS has 10GB with only 8 of them being fast.
People have a weird understanding of what "console equivalent settings" are, though.
Running at a mix of medium and high in 4K is not "console equivalent" when the console in question runs them in 900p.


So? Have you ever stopped to think that prebuilts aren't bought only for gaming? Or that people have their own ideas on how to use these after they've bought them?


8GB GPUs are not "junk"; that's HUB talking out of your mouth.
Is comparing the 5060Ti to XSS fair? XSS was the absolute min spec that Microsoft was willing to go with in 2020. I'd like to think we can do better than that on a $400 GPU in 2025.

The comparison to Xbox does show how GDDR capacity has been stalled for a long time :(
 
Is comparing the 5060Ti to XSS fair? XSS was the absolute min spec that Microsoft was willing to go with in 2020. I'd like to think we can do better than that on a $400 GPU in 2025.
Well, the 5060 and the Ti above it are the low end of the lineup, so why isn't it fair to compare that to the low end of the console market?
 
Well, the 5060 and the Ti above it are the low end of the lineup, so why isn't it fair to compare that to the low end of the console market?
I'm just saying the XSS is a $300 console from ~5 years ago. It shouldn't be that difficult for a $400 GPU released 4-5 years later to beat it.
 
Is comparing the 5060Ti to XSS fair? XSS was the absolute min spec that Microsoft was willing to go with in 2020. I'd like to think we can do better than that on a $400 GPU in 2025.

I'm just saying the XSS is a $300 console from ~5 years ago. It shouldn't be that difficult for a $400 GPU released 4-5 years later to beat it.

I'm not sure it's so much about comparing; it's more about the installed base of gaming machines. First, you've never been able to reasonably compare console hardware to PC hardware. Console hardware prices are fully subsidized by the games and services sold on the platform; the purchase price of the console is wholly unlinked from the cost of the physical hardware itself. Second, the console is using a shared memory model on an APU, so it's not like we're comparing apples to apples anyway.

The discussion at the moment is pivoting around VRAM, and there's an undeniable supermajority of 8GB (and less) VRAM gaming equipment in the world being targeted by game developers. Earlier I pointed out the Steam HW survey ( https://store.steampowered.com/hwsu...ware-Survey-Welcome-to-Steam?pubDate=20250423 ) showing 65% of all Steam-enabled PCs are running 8GB of VRAM or less. If you keep digging, you'll also find nearly 60% of Steam devices are running 1080p as their display resolution, which further limits VRAM consumption. Finally, we have the deployed console population on top of that, no small portion of which is the XSS (10GB total) and XSX (16GB total), both of which share that pool between the OS and game logic.

Anyone trying to tell you 1080p is dead, or that 8GB of VRAM is dead, hasn't looked at the overwhelming majority of devices on this planet playing video games. 8GB is far from dead, just like 1080p; both will continue to live in their majority positions for years to come because 2nd place for both of those categories is so far away.
 
I'm just saying the XSS is a $300 console from ~5 years ago. It shouldn't be that difficult for a $400 GPU released 4-5 years later to beat it.
It does beat it. It's what, some 3x faster? It doesn't beat it on memory size specifically, but you could argue that you don't need more memory to run console settings. So you get more performance at similar settings, which is where the premium in price is justified.
 
I'm not sure it's so much about comparing; it's more about the installed base of gaming machines. First, you've never been able to reasonably compare console hardware to PC hardware. Console hardware prices are fully subsidized by the games and services sold on the platform; the purchase price of the console is wholly unlinked from the cost of the physical hardware itself. Second, the console is using a shared memory model on an APU, so it's not like we're comparing apples to apples anyway.

The discussion at the moment is pivoting around VRAM, and there's an undeniable supermajority of 8GB (and less) VRAM gaming equipment in the world being targeted by game developers. Earlier I pointed out the Steam HW survey ( https://store.steampowered.com/hwsu...ware-Survey-Welcome-to-Steam?pubDate=20250423 ) showing 65% of all Steam-enabled PCs are running 8GB of VRAM or less. If you keep digging, you'll also find nearly 60% of Steam devices are running 1080p as their display resolution, which further limits VRAM consumption. Finally, we have the deployed console population on top of that, no small portion of which is the XSS (10GB total) and XSX (16GB total), both of which share that pool between the OS and game logic.

Anyone trying to tell you 1080p is dead, or that 8GB of VRAM is dead, hasn't looked at the overwhelming majority of devices on this planet playing video games. 8GB is far from dead, just like 1080p; both will continue to live in their majority positions for years to come because 2nd place for both of those categories is so far away.
I don't think 1080p is dead or 8GB GPUs are dead or anything like that. There was a time when 4GB was the most common configuration on Steam. That doesn't mean it's okay to keep putting 4GB on GPUs indefinitely.

BTW I don't lay the 8GB situation entirely on NVIDIA. When they were designing GB206, they could have reasonably expected higher-capacity GDDR to be widely available.
 