Nvidia GeForce RTX 50-series Blackwell reviews

Have your tantrum, but you're still complaining about the content of a "bullshit" video...which you refuse to watch.
I don't have to watch a video which is very obviously specifically constructed to showcase an agenda the author has been peddling for years.
"Hey, here's a video from me which proves that I was right!" This isn't how it works.

As I've said, the whole issue with 8GB cards (or 10, or 12, take your pick based on what's being discussed over there at the moment) on HUB has been misrepresented to the degree where it's borderline harming the industry now.
I've already said that the gamers - those who actually buy cards rather than just circle-jerking over YT infotainment videos again and again - do not in fact play the games which Steve is using to "prove" his point.
The majority of the most-played PC games will run completely fine on an 8GB GPU, possibly even at 4K/Ultra.

Try a thought experiment and consider a person who only plays CS2 on their PC (I'm not even going into the whole "gaming cafe" market here, although that one should be obvious). What is the point for them in paying $50-100 more for a GPU which will net them exactly zero performance benefit? Should that person be prevented from making such a choice and forced to pay more just because Steve can't set up games for testing properly?

I understood and agreed with the issues Nvidia used to have when they had 8/10GB up to almost the top end back in 2020; that wasn't a good proposition for the market. Now though? We've had an 8 vs 16GB choice since 2023 in that market. Those who play the games Steve suggests we all should be playing can go and get themselves a 16GB card - problem solved! Now we're at a point where Steve suggests that the option of buying an 8GB GPU shouldn't even exist - which is just bonkers for anyone who has ever mapped what PC gamers play en masse to the hardware the market buys.

Even if we look at this from a baseline perspective - that PC hardware should evolve to allow for more complex graphics and 8GB parts are holding that back - we'd run into things like the XSS having 10GB (8+2) or the Steam Deck shipping with 16GB of total RAM, both of which limit what's possible for games far more than 8GB GPUs do at present. And if you look at this from Nvidia's perspective as a GPU seller, you'd quickly realize that they gain nothing from selling 8GB cards, as their main selling points (RT, MFG) all require more VRAM. This is a purely market-driven decision for them: they know the market wants a cheaper alternative with less VRAM, which is the only reason it even exists.

And yet here's Steve telling us how it's DOA. Just like all those other 8GB GPUs in the Steam hardware survey sitting at 35.5% of overall market share right now.
 
I don't have to watch a video which is very obviously specifically constructed to showcase an agenda the author has been peddling for years.

I watched the video. Yes, it's clearly driven by an agenda, and Steve likely fancies himself a hero fighting the evil Nvidia marketing machine. He's mostly showcasing scenarios where 8GB fails, but there are enough of those games now that it's hard to ignore the reality that 8GB is a waste of money for a lot of people. He acknowledges that lots of games still run fine with 8GB, but his intent was to discourage any interest in the 8GB card due to the increasing number of cases where it does fail even at lowered settings.

His most convincing point was that the 16GB is just a few dollars more than the 8GB so it’s not worth dealing with the potential headaches of the 8GB card. Hard to argue with that.
 
His most convincing point was that the 16GB is just a few dollars more than the 8GB so it’s not worth dealing with the potential headaches of the 8GB card. Hard to argue with that.
$380 vs $430 is +13%, and those are MSRPs; retail prices and model ranges will likely widen that gap a bit. If you're a cyber cafe network owner buying these in bulk, even a 13% difference can be very substantial, especially if you're not really getting much performance out of that added price in your typical DOTA2 or what have you.
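Just to sketch that bulk math (the fleet size below is a made-up number purely for illustration):

```python
# Per-card premium at MSRP, and what it scales to for a bulk buyer.
msrp_8gb = 380
msrp_16gb = 430

delta = msrp_16gb - msrp_8gb
delta_pct = delta / msrp_8gb * 100
print(f"per-card premium: ${delta} (+{delta_pct:.1f}%)")  # $50 (+13.2%)

units = 50  # assumed cyber cafe fleet size, purely illustrative
print(f"extra cost for {units} cards: ${delta * units}")  # $2500
```

For a buyer whose workload never touches the extra VRAM, that's pure overhead per seat.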
Steve's argument - that people can't see what they're buying and are thus buying 8GB cards without knowing it's not enough for some games - is completely dishonest.
I'd wager that at this point everyone knows that 8GB is the minimum amount of VRAM you should be buying, and if you're looking at playing recent AAA SP titles, then you should be looking at 12 at minimum. There is no news in 8GB not being enough today. This doesn't, however, make it "DOA" or a product which has no reason to exist. And a good reviewer would know that.
 
In the video you can see significant performance differences even at medium settings, so it's not the case that they are just picking unrealistic settings to "force" the 8GB card to fail.
"Medium" settings means nothing; it's a moving target which also differs between games.
You could approach this problem from several angles.
One would be to set the highest settings and compare - Steve doesn't do that with RT for some reason, yet VRAM is apparently free, so there's no problem using the most VRAM-consuming settings.
Another would be to look at playable framerates and compare from there. Does a card with 8GB of VRAM suffer in performance or quality at a playable framerate in comparison to a 12/16GB one? Or are all the cards tested below that level at settings where more VRAM starts to matter? This was the most likely scenario even at 1440p until recently.

From a current-day review POV, however, one should assess which games people mostly play on PC right now and then see how those run on an 8GB GPU in comparison to a 16GB one in two scenarios:
1. Games running at maximum settings.
2. Games equalized to an 8GB baseline and then improved on the card with more VRAM, with or without performance loss, to showcase what the extra VRAM nets you.
Running games into VRAM limits on purpose isn't interesting, as that's not how a gamer would use the card. It's also been very easy to do on an 8GB GPU for years now - just play in 4K and you're VRAM limited. But just as with 4K, not everyone uses even "medium" settings, and for them the interesting question would be "how much better would games look at similar performance on a GPU with more VRAM?"
 
Another would be to look at playable framerates and compare from there. Does a card with 8GB of VRAM suffer in performance or quality at a playable framerate in comparison to a 12/16GB one? Or are all the cards tested below that level at settings where more VRAM starts to matter? This was the most likely scenario even at 1440p until recently.
That's exactly what they do in the video. They take a range of resolutions and settings that are playable on the 16GB card and compare the performance with the 8GB version. In a lot of the tests the average frame rate is perfectly fine on the 8GB model and it's only the 1% lows that are a problem. In other tests even the 1% lows are OK, but there is still a performance difference.
 
That's exactly what they do in the video. They take a range of resolutions and settings that are playable on the 16GB card and compare the performance with the 8GB version. In a lot of the tests the average frame rate is perfectly fine on the 8GB model and it's only the 1% lows that are a problem. In other tests even the 1% lows are OK, but there is still a performance difference.
That's exactly what they do in the video, in games which have no relation to what the majority of people play on PC - especially if we're talking about the people who buy these 8GB cards.
 
That's exactly what they do in the video, in games which have no relation to what the majority of people play on PC - especially if we're talking about the people who buy these 8GB cards.
HUB is an enthusiast channel, so they focus on an enthusiast audience and the games that audience likes to play. I doubt people watch their content to figure out how many frames they'll get in CS:GO or Rocket League.
 
So now it's unfair to compare a ~$400 card using games made in the last year
Here are "games made in the last year":

The selection of games for this video has 33% from earlier than 2024 - one from 2020 (CP2077), one from 2022 (Requiem), and three from 2023.

If you look at critically acclaimed games from "the last year" you get:
  • Elden Ring: Shadow of the Erdtree - missing
  • Metaphor: ReFantazio - missing
  • Final Fantasy VII Rebirth - missing
  • Slay the Princess, UFO 50, Animal Well, Satisfactory, Balatro, Thank Goodness, Tekken 8, Blue Prince, Split Fiction, The Talos Principle: Reawakened - all missing
Present? The Last of Us Part II Remastered. One title from the last year.

And I honestly don't even want to look at the most-played titles. Something like MHW would've at least suited the narrative, but alas, it's also missing.

It's relevant for the audience that watches their videos and arguably for people here as well, if we assume that Beyond3D forum users still play new games. (With RT?)
Do people here, or even elsewhere, not know that 8GB is on the low side (which is why you see it only on lower-end cards these days)? What's the "revelation"?
Do these people buy 8GB cards? Are they made for these people?
 
Do people here, or even elsewhere, not know that 8GB is on the low side (which is why you see it only on lower-end cards these days)? What's the "revelation"?
Do these people buy 8GB cards? Are they made for these people?
You yourself posed the following questions:

"Does a card with 8GB of VRAM suffer in performance or quality at a playable framerate in comparison to a 12/16GB one? Or are all the cards tested below that level at settings where more VRAM starts to matter? This was the most likely scenario even at 1440p until recently."

The video provides data to help answer those questions.
 
If you're a cyber cafe network owner and you buy these in bulk the difference of even 13% can be very substantial especially if you're not really getting much perf out of that added price in your typical DOTA2 or what have you.
If Nvidia made the 8GB model an Asia-specific SKU for cyber cafes, I think nobody would really care. The problem is that's not what western-focused YouTubers are talking about; they're talking about what this is gonna go into in the American/Western market: a sea of sorta-cheap prebuilts. People buying these prebuilts probably aren't read in on VRAM debates, so they're going to buy this machine and have to drop settings on new games literally on day one. That's what makes this a bad product.

Part of the blame goes to system integrators, but I'd honestly prefer if Nvidia just stopped making these bad products outside of the specific distribution channels where they make sense.
 
Except it doesn't, because of the game selection used to find the "answers". Instead it shoehorns the answers into what the author wants to tell the audience.
This is like whack-a-mole. Every time I reply, the argument changes. I entered this thread to point out that HUB were not using unplayable settings to "break" 8GB cards. Now we're into a dispute about the exact proportion of "recent" titles you need to be representative, when almost all review outlets (especially DF) use a range of both recent and older titles.

But if you like you can ignore the non-recent games HUB was testing. I still think people are going to be interested in the performance of Star Wars Outlaws, Indiana Jones, Assassin's Creed Shadows, TLOU2, etc.
 
This is like whack-a-mole. Every time I reply, the argument changes. I entered this thread to point out that HUB were not using unplayable settings to "break" 8GB cards. Now we're into a dispute about the exact proportion of "recent" titles you need to be representative, when almost all review outlets (especially DF) use a range of both recent and older titles.
Well, to be fair, he was directly responding to someone else who tried to make the point about HUB using games "made in the last year." He wasn't responding to you. Insofar as whack-a-mole is concerned, the same could be said of the four or five or six different angles from which HUB is being represented in this thread, which Degustator is replying to.


But if you like you can ignore the non-recent games HUB was testing. I still think people are going to be interested in the performance of Star Wars Outlaws, Indiana Jones, Assassin's Creed Shadows, TLOU2, etc.
At this point, an 8GB VRAM card is a low end offering. Folks wanting to play AAA titles on a low end card should (and likely, will) expect performance vs quality tradeoffs to be made.

Is an 8GB VRAM 5000-series card really worth buying these days? If you already have a prior-gen card that's working, probably not. If you have no gaming PC at all, the answer is "it depends" based on what you want to pay and what you want to play.
 
Well, to be fair, he was directly responding to someone else who tried to make the point about HUB using games "made in the last year." He wasn't responding to you. Insofar as whack-a-mole is concerned, the same could be said of the four or five or six different angles from which HUB is being represented in this thread, which Degustator is replying to.
And to be fair to me, I was the one being quoted in this thread after I made my point about the settings used.

At this point, an 8GB VRAM card is a low end offering. Folks wanting to play AAA titles on a low end card should (and likely, will) expect performance vs quality tradeoffs to be made.

Is an 8GB VRAM 5000-series card really worth buying these days? If you already have a prior-gen card that's working, probably not. If you have no gaming PC at all, the answer is "it depends" based on what you want to pay and what you want to play.
Yes, and the "it depends" question is answered with benchmark data.
 
Insofar as whack-a-mole is concerned, the same could potentially be said to describe the four or five or six different angles Hub is trying to be represented in this thread which Degustator is replying to.

Not really sure what "four or five or six different angles HUB is trying to be represented in this thread" really means, but the main problem is the constant moving of goalposts because someone is vociferously arguing against the content of a video they refuse to view. It's a constant cycle of "Actually, they did test that", then another argument is presented as if the content of the video can be more reliably discerned from first principles instead of just watching it.

I certainly don't think every tech review channel has worth; most don't. But we already have a thread for this kind of stuff - the Value of Hardware Unboxed thread - if someone wants to argue that a particular channel is worthless and document their grievances, which I have done with HUB in that thread too. Like the majority of youtubers they're sensationalist and have made some dumb arguments in the past, but I don't think they've sinned to the point of being verboten here.

While there's obvious merit to this, there's a dedicated thread for that channel because their videos always end up in drama. Dropping it there would maybe have got more discussion, because any further replies here will end up with a mod having to move it over there.

Yeah, that's what I'm wondering. If a thread can be derailed to this extent by a post of one of their videos, maybe for the sake of site harmony we invoke a policy against HUB videos being posted at all, or outside of specific threads?
 