Value of Hardware Unboxed benchmarking

You only mentioned what buyers are choosing. That's completely separate from what these media outlets are saying. What a specific demographic of media outlets is saying would not have a direct correlation with what buyers decide.

You removed part of the post that you quoted which referenced HUB. This is 100% related to what media outlets are claiming and evaluating how that holds up against the reality on the ground. Those outlets are claiming that large swaths of gamers are suffering due to 8GB and have provided no proof to that effect. The proof that we have points to lots of 8GB cards still being sold. Don't get me wrong, I'm not a fan of 8GB cards and I'm not advocating for their continued presence in the market. But the arguments being put forward by the talking heads on YouTube are simply inane and clickbaity.

Also, again I have to state that absolute terms like that are more applicable the more actual consumer choice there is. Even if we assume, incorrectly, that a 4060 and a 7600 XT are otherwise interchangeable aside from VRAM, the vast majority of buyers who actually go to purchase something like this do not walk into a store (or e-store) and see just those two options on the shelf next to each other at official MSRP pricing.

If 8GB provided a horrendous experience we would see significantly more uptake of competing solutions with more VRAM. To believe otherwise requires one to believe that consumers willingly pay money to suffer.

System memory (RAM) in the PC DIY market is priced like a commodity. That doesn't mean you aren't paying more for more RAM, but you are effectively paying only a slight margin on top of the input costs; there's minimal additional pricing strategy involved beyond that. System memory for AIO-type devices, especially those with soldered memory (think laptops, Apple products, etc.), is priced like a luxury: the price difference can be well beyond input costs due to significant other considerations.

That difference isn't due to commodity vs luxury. It's due to the fact that soldered memory solutions aren't upgradeable and have no competition, and therefore have more pricing power.
 
8GB also means you can't use NVIDIA's very own G-Assist.
For now, at least. G-Assist is alpha tech right now; the models are still in the development phase. Later they will be distilled and made lighter with further optimizations (possibly with FP8/FP4 precision), reducing its VRAM footprint.
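To make the quantization point concrete, here is a minimal back-of-the-envelope sketch of how precision alone changes the memory needed just to hold a model's weights. The 8-billion parameter count is a placeholder assumption for illustration, not a published figure for G-Assist, and a real footprint also includes activations and KV cache on top of the weights.

    # Rough estimate of model weight memory at different precisions.
    # PARAMS is a hypothetical parameter count, not G-Assist's actual size.
    PARAMS = 8e9

    BYTES_PER_PARAM = {
        "FP16": 2.0,  # baseline half precision
        "FP8": 1.0,   # halves the weight footprint
        "FP4": 0.5,   # quarters it
    }

    for precision, nbytes in BYTES_PER_PARAM.items():
        gib = PARAMS * nbytes / 2**30
        print(f"{precision}: ~{gib:.1f} GiB for weights alone")

Under those assumptions the weights drop from roughly 15 GiB at FP16 to about 4 GiB at FP4, which is why distillation plus lower precision can plausibly bring a feature like this within reach of smaller VRAM pools.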
 
The point is that RAM (including GDDR) is a commodity.

Would it be okay if IHVs stopped adding VRAM to their mainstream (most popular) GPUs once they hit 4GB? If not, why?
Those two things are not linked as you suggest.

Main system RAM is a commodity in most x86 CPU platforms because the physical format of a modern memory DIMM (and even earlier SIMM, and even earlier than that DIP) permits any number of manufacturers to develop, market and sell interchangeable, compatible parts. A DDR5 module is defined by a standard which can be adopted by any manufacturer -- Mushkin, Kingston, Asus, Toshiba, IBM, Lenovo, Dell, HP, Micron, Super Micro, MITAC, Hyundai, Samsung -- and any of those modules can be dropped into just about any DDR5-compatible socket regardless of the system board hosting that socket. They can even be mixed and matched, within the compatibility specification of DDR5 signaling. This open availability, and the end consumer's choice of vendors on the basis of price, performance, capacity, and brand loyalty, is what permits DDR5 to function as a commodity item.

VRAM on a dGPU board is not a commodity in any consumer sense. You cannot pick or choose whose memory is strapped to your dGPU, no matter the price, performance, capacity or brand. The only option any customer has is to purchase a different dGPU which has whatever price, performance, capacity, or brand they would like to see in their VRAM parts. At the actual BGA level, you could argue the underlying chips are a commodity to the dGPU board supplier, but this too makes a lot of assumptions about the total number of memory channels provided by the GPU itself, along with firmware support for module size and clamshell mode.
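As a rough illustration of that last point, here is a minimal sketch of how bus width, device density, and clamshell mode bound the VRAM sizes a given GPU can even offer. The 128-bit bus and the density options are generic assumptions for illustration, not any specific product's spec sheet.

    # Sketch: how a GPU's memory bus constrains possible VRAM configurations.
    # Assumes one 32-bit GDDR device per channel; clamshell pairs two devices
    # per channel. Values are illustrative, not tied to a specific card.
    BUS_WIDTH_BITS = 128
    CHANNELS = BUS_WIDTH_BITS // 32

    for density_gb in (1, 2, 3):      # GB per memory device
        normal = CHANNELS * density_gb
        clamshell = normal * 2        # two devices share each 32-bit channel
        print(f"{density_gb}GB devices: {normal}GB normal, {clamshell}GB clamshell")

On a 128-bit part this yields 8GB with 2GB devices, or 12GB (24GB clamshell) once 3GB devices are available, which is why capacity jumps tend to track memory density rather than consumer demand alone.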

Your strawman about 4GB cards is irrelevant, and also a non sequitur. The market forces which would dissuade an IHV from creating a 4GB VRAM dGPU are the same forces which persuade them to make 8GB models.

To reiterate for about the 9th time in this thread and others: IHVs only make items which are profitable for them. A 4GB VRAM card is unlikely to sell well in today's market, although I suppose it really depends on who they market it to and how cheaply they could construct it. Regardless, 8GB VRAM cards sell incredibly well today, and so long as they continue to sell, IHVs will continue to make them. Same goes for game creators too: if your game's absolute bare minimum spec is 12GB of VRAM, you've isolated your potential audience to less than 25% of the total Steam installed base.

@trinibwoy was correct in his earlier assertion: If you want 8GB VRAM cards to go away, you have to convince people to stop buying them. The onus is not on manufacturers to stop selling something that continues making money, just as ATI/AMD continued making 4GB HBM cards for quite a while after NVIDIA had mostly left that space. There was no need to leave when the cards continued making profit!
 
"Can't be used" and "adequate" are two different things. People buy tons of things that aren't really adequate for their needs because what they need is outside their price point.
So they can be used? Which means that they are in fact "adequate for people's needs"? Or is your point that the only thing "adequate" for all people is the most expensive one on the market? I honestly have no idea what you're trying to say.

8GB also means you can't use NVIDIA's very own G-Assist.
So? There are a lot of things which can only be used on a more expensive product which has higher capabilities in everything, VRAM included.
Or are you saying that G-Assist, of all things, is now essential to play games on PC?

Both Nvidia and AMD are still selling 4GB GPUs, by the way. Believe it or not, these are also "adequate" for some people's needs.
 
@trinibwoy was correct in his earlier assertion: If you want 8GB VRAM cards to go away, you have to convince people to stop buying them. The onus is not on manufacturers to stop selling something that continues making money, just as ATI/AMD continued making 4GB HBM cards for quite a while after NVIDIA had mostly left that space. There was no need to leave when the cards continued making profit!
Of course people would have continued to purchase 4GB cards if that was all that was reasonably available, or if the >4GB competition to the 4GB cards had other major drawbacks or limited availability. That doesn't mean IHVs should have continued putting 4GB on their mainstream cards indefinitely, which would force devs to continue supporting 4GB GPUs. Devs can't require more VRAM until most of their playerbase has more VRAM. This has to start with the hardware and always has.
 
So they can be used? Which means that they are in fact "adequate for people's needs"? Or is your point that the only thing "adequate" for all people is the most expensive one on the market? I honestly have no idea what you're trying to say.
Something can be both usable and inadequate at the same time. For instance I could use a bicycle to get to work every day, but I wouldn't consider that adequate. It's too far and I'm too fat. If I couldn't drive for whatever reason I would be forced to use the bike, but I would still consider this inadequate.
 
Something can be both usable and inadequate at the same time. For instance I could use a bicycle to get to work every day, but I wouldn't consider that adequate. It's too far and I'm too fat. If I couldn't drive for whatever reason I would be forced to use the bike, but I would still consider this inadequate.
Okay. So why would you buy that bike then?
This whole discussion is constantly denying GPU customers basic cognitive functions.
One would not pay money for a card which they wouldn't be able to use.
 
You removed part of the post that you quoted which referenced HUB. This is 100% related to what media outlets are claiming and evaluating how that holds up against the reality on the ground. Those outlets are claiming that large swaths of gamers are suffering due to 8GB and have provided no proof to that effect. The proof that we have points to lots of 8GB cards still being sold. Don't get me wrong, I'm not a fan of 8GB cards and I'm not advocating for their continued presence in the market. But the arguments being put forward by the talking heads on YouTube are simply inane and clickbaity.

I'll be frank: I don't care about HUB or any other media heads in terms of editorial content in that sense. They're a business as well, and I honestly view their editorial content through the same lens: they do what draws money, which effectively means pandering to their audience. I don't know why people think all these outlets are some higher level of objective journalism in everything they produce.

What I am saying is that consumers don't really have that much choice, and as such market share is not the best gauge of actual consumer preference. For a business it's not actually about giving consumers what they want, that's a fairy tale; it's really about giving consumers what they'll accept.

Maybe some people think the GPU market is perfectly competitive, rife with competition and/or low barriers to entry for disruption, and that consumers therefore have a lot of choice, but I would think that is hard to argue for.

If 8GB provided a horrendous experience we would see significantly more uptake of competing solutions with more VRAM. To believe otherwise requires one to believe that consumers willingly pay money to suffer.

Because, again, VRAM is not the only difference. And no, despite some segments wanting to simplify everything else down to FPS numbers as well, that is far from the only other difference.

I'll just be a bit direct here with an example: say Nvidia itself (so we can take other-IHV and arch differences out of it) were to offer a 4060 with 8GB and with 16GB, except they chose to sell the 16GB model only at retail, and the 8GB model only to volume customers. Guess which GPU would have more market share? What the end consumer prefers between those two configurations would be inconsequential.

Also, setting the over-polarization and the language aside, I think all sides should rationally recognize this issue is something on the margins and open to debate. Obviously 8GB is not completely unusable, and just as obviously 8GB is not perfectly fine either. But I think we're again debating a perspective issue here: should consumers be satisfied with what they'll accept, or also voice what they want? I mean, very few games are actually unplayable; micro-stuttering doesn't render a game unplayable, 30 fps does not make a game unplayable, zero graphics options don't make a game unplayable. Does that mean it doesn't make sense to bring those things up?

That difference isn't due to commodity vs luxury. It's due to the fact that soldered memory solutions aren't upgradeable and have no competition, and therefore have more pricing power.

I feel like you're just arguing semantics here. Let's just sidestep that then?

Some people feel that VRAM configurations and pricing are set as if it's a competitive market, or that it's fine if they aren't. To them, if companies make decisions such as not offering higher-VRAM configurations, or offering them only at huge premiums because of the opportunity cost of not selling that memory to professionals for much more money, that's either fine or not actually happening.

Other people feel VRAM should be priced and offered as if it's a competitive market, but that it isn't. To them, there shouldn't be considerations such as withholding a higher-VRAM SKU, or marking it up heavily just because there is more profit in selling it to professionals; yet the companies are doing exactly this, and those people aren't happy.
 
I'll be frank: I don't care about HUB or any other media heads in terms of editorial content in that sense. They're a business as well, and I honestly view their editorial content through the same lens: they do what draws money, which effectively means pandering to their audience. I don't know why people think all these outlets are some higher level of objective journalism in everything they produce.

You're right that it's clicks above all these days. This is the thread for discussing HUB content though...

What I am saying is that consumers don't really have that much choice, and as such market share is not the best gauge of actual consumer preference. For a business it's not actually about giving consumers what they want, that's a fairy tale; it's really about giving consumers what they'll accept.

Maybe some people think the GPU market is perfectly competitive, rife with competition and/or low barriers to entry for disruption, and that consumers therefore have a lot of choice, but I would think that is hard to argue for.

If your heart is set on PC gaming on a budget and you must have more than 8GB of VRAM, your options are pretty limited. I quickly scanned a few companies selling pre-builts and the 4060 is basically the only entry-level option at all of them. However, I don't buy the argument that people are crying into their 8GB 4060s and sucking it up because there's no other choice. I would bet the vast majority of people gaming on those cards are doing absolutely fine.

Obviously 8GB is not completely unusable, and just as obviously 8GB is not perfectly fine either. But I think we're again debating a perspective issue here: should consumers be satisfied with what they'll accept, or also voice what they want? I mean, very few games are actually unplayable; micro-stuttering doesn't render a game unplayable, 30 fps does not make a game unplayable, zero graphics options don't make a game unplayable. Does that mean it doesn't make sense to bring those things up?

It doesn't make sense to bring up things that enthusiasts fuss about and then extrapolate those feelings to the mass market. This is exactly what's happening with the 8GB debate. Where is the evidence that the unwashed masses buying budget 8GB cards are quietly unhappy?

Some people feel that VRAM configurations and pricing are set as if it's a competitive market, or that it's fine if they aren't. To them, if companies make decisions such as not offering higher-VRAM configurations, or offering them only at huge premiums because of the opportunity cost of not selling that memory to professionals for much more money, that's either fine or not actually happening.

Other people feel VRAM should be priced and offered as if it's a competitive market, but that it isn't. To them, there shouldn't be considerations such as withholding a higher-VRAM SKU, or marking it up heavily just because there is more profit in selling it to professionals; yet the companies are doing exactly this, and those people aren't happy.

Not sure I follow this point. I don't think anyone thinks lack of competition is fine. It's certainly not a competitive market, because we have one player with near-monopoly market share. The consumer still has leverage though. Nobody is forcing you to game on an 8GB graphics card. You can decide to do something else with your money, and fewer 8GB cards will be sold. I just don't buy the argument that people are swallowing their discontent and paying for products that make them unhappy.
 
Okay. So why would you buy that bike then?
This whole discussion is constantly denying GPU customers basic cognitive functions.
One would not pay money for a card which they wouldn't be able to use.
So I can get to work.

I never meant to suggest 8GB cards are useless. Just that 8GB is a disappointing amount of VRAM for a $300+ GPU in 2025. I expect this to be resolved soon enough once the higher density modules proliferate.
 
So I can get to work.
But that's not a luxury item -- you NEED to get to work, it's non-optional for most folks who are employed and wish to stay that way. Video cards and playing games is a purely optional part of life and aren't equivalent to the example you've contrived.
I never meant to suggest 8GB cards are useless. Just that 8GB is a disappointing amount of VRAM for a $300+ GPU in 2025. I expect this to be resolved soon enough once the higher density modules proliferate.
But what is this based on, again?

I very much wish a $300 video card would play path traced games at 120FPS using maximum quality settings on my 3440x1440 screen. Sadly this doesn't mean my expectations are based on any observable link to reality or quantifiable logic.

Even the very newest games today can be played at >60fps on an 8GB VRAM card. It doesn't mean you can crank all the sliders to maximum, and that's always been true of every single dGPU generation: those who don't buy the top video card often don't get to play with the maximum settings. I'm still completely perplexed as to why this argument ("but they should just give us more!") makes any sense to anyone claiming it.

The cards work for games today, and they'll continue to work for games in the foreseeable future. In fact, the majority of games will absolutely target this quantity of VRAM as a reasonable PC hardware expectation for literally years to come, thanks to the absolutely massive installed base of <=8GB VRAM PCs in the wild, along with just about every console ever made outside of the PS5 Pro.
 
Customers don't have options with more than 8GB at affordable price points. A product being sold doesn't signify that customers feel it's adequate when it's the only product they can buy.
 
The fact that 8GB GPUs are still being bought in droves clearly proves that people see them as adequate for their use cases.
I’d wager the majority of people buying 8GB cards (or frankly any card at all) don’t even know what VRAM is or what amount they should get. Most PC gamers are on prebuilts and just trust that the SI is designing them a half-decent system for the price.

They’ll just drop settings until the game runs well. Which is fine, but that doesn’t mean we need to use the average person as a barometer of whether a product is good: the average person doesn’t really care that much about quality, honestly.
 
Okay. So why would you buy that bike then?
This whole discussion is constantly denying GPU customers basic cognitive functions.
One would not pay money for a card which they wouldn't be able to use.
Can you really not envision a world where consumers don’t really understand the products they are buying?

Speaking generally for a bit: probably billions of dollars per year are spent on useless placebo nonsense globally. Think nitrogen-filled tires, those emitter things that claim to keep away bugs and pests, etc. Most people aren’t even researching the stuff they’re buying; they buy it because it sounds good, and that’s enough to convince them.

I got my first gaming PC when I was a teenager; my parents got it for me for Christmas, and they didn’t want me DIYing. I don’t even know what GPU was inside it and neither did my parents lol, they almost certainly got sold junk for a high price. It actually shipped broken and I got to DIY it later, but that’s a separate story. The moral of the tale is that it’s probably more correct to assume consumers know next to nothing about the products they’re buying (at least in the consumer space; if you’re selling contractor-grade power tools you would expect the end buyer to know a thing or two, probably same with Nvidia’s commercial products).

That doesn’t mean you can’t sell 8GB products at all, but let’s just be real here: people aren’t usually buying these cards because they’ve calculated they’ll never need more than 8GB; they’re buying them because they’re part of a cheaper prebuilt that’s ripping them off.
 
I'm still completely perplexed as to why this argument ("but they should just give us more!") makes any sense to anyone claiming it.

It’s a winning argument. After all, who doesn’t want more stuff for less money?

I’d wager the majority of people buying 8GB cards (or frankly any card at all) don’t even know what VRAM is or what amount they should get. Most PC gamers are on prebuilts and just trust that the SI is designing them a half-decent system for the price.

They’ll just drop settings until the game runs well. Which is fine, but that doesn’t mean we need to use the average person as a barometer of whether a product is good: the average person doesn’t really care that much about quality, honestly.

Shouldn’t we then use average quality settings to evaluate if average products are good for an average person?
 
Shouldn’t we then use average quality settings to evaluate if average products are good for an average person?
Pray tell what ‘average’ quality settings are; I don’t think I’ve seen those in a menu before.

The average person coming from a console probably won’t notice 2016-era textures, because they had to go down to low or medium. However, I’d maintain that doesn’t really matter: it’s still a bad, lopsided product and I would never recommend it to anyone.
 
However, I’d maintain that doesn’t really matter: it’s still a bad, lopsided product and I would never recommend it to anyone.
And again, you and a few others make this claim based on what exactly? It's a lopsided product how? It plays literally every game available today, albeit not at maximum quality. Tell me why it's lopsided in a way that can stand up to logical discussion and quantifiable outcomes.
 
Should IHVs stop making lower-end products because some YouTubers think they are bad? I mean, across the entire history of PC gaming we used to have different GPUs with variable VRAM sizes all the time, and people never complained like this.

I remember in the GTX 600 generation we had cards ranging from 1GB to 4GB (a 4x difference) and even lower than that: some GTX 680s came with 2GB and some came with 4GB, some GTX 660 Tis came with 1.5GB and some with 3GB (same for the HD 7970, by the way), and people bought what they wanted.

I chose the GTX 600/HD 7000 generation because it sat right at the beginning of a new console cycle (PS4/Xbox One) and faced the same explosion of VRAM requirements as we see today.

Right now we have low-end GPUs with 8GB and high-end GPUs with 24GB, a 3x difference. Even if we counted the 5090 with its 32GB of VRAM (we shouldn't, but whatever), it's a 4x difference, hardly anything different from past trends, yet the backlash is severe and needless.

Should NVIDIA make the 5060 with 12GB of VRAM? Sure thing. But if all low-end cards had 12GB, developers would just set the bar higher, and in two years 12GB is not gonna be enough; then we will be chasing an ever-increasing minimum VRAM size forever.

People buy whatever suits their budget and adjust from there. If people want an affordable option (in the current turbulent climate, with its constant price hikes), then providing an 8GB SKU for them is viable, especially if that 8GB option can play every game nearly maxed out, with maybe a dozen exceptions that also run well but at medium textures.
 