Value of Hardware Unboxed benchmarking

Nvidia's own list states that 173 games in total natively support DLSS3 or have been upgraded to support it.
Not all DLSS2 games can be upgraded to DLSS3, as noted by Nvidia's list of supported titles.
It sounds like you are conflating specific definitions and features that Nvidia created; see below.

"DLSS 3 is a revolutionary breakthrough in AI-powered graphics that massively boosts performance, while maintaining great image quality and responsiveness. Building upon DLSS Super Resolution, DLSS 3 adds Optical Multi Frame Generation to generate entirely new frames, and integrates NVIDIA Reflex low latency technology for optimal responsiveness. DLSS 3 is powered by the new fourth-generation Tensor Cores and Optical Flow Accelerator of the NVIDIA Ada Lovelace architecture, which powers GeForce RTX 40 Series graphics cards."
Maybe there are some exceptions, but since DLSS2 you should be able to put whatever DLL you want in there. You can even choose which preset you want to use. IDK why the NV App uses a whitelist for this functionality. There are other programs you can use for this, or you can do it yourself manually. I've never encountered a problem doing this, but it's been a while since I bothered.
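
If you want to do the swap by hand, here's a minimal sketch of the idea (the game folder and DLL location below are hypothetical placeholders, not any specific game; adjust them for your own install and keep the backup so you can roll back):

Code:
import shutil
from pathlib import Path

# Hypothetical paths - point these at your own game install and the newer DLL you downloaded.
GAME_DIR = Path(r"C:\Games\SomeGame")
NEW_DLL = Path(r"C:\Downloads\nvngx_dlss.dll")

def swap_dlss_dll(game_dir: Path, new_dll: Path) -> None:
    # Find the DLSS DLL wherever the game keeps it (root folder, a bin subfolder, etc.).
    for old_dll in game_dir.rglob("nvngx_dlss.dll"):
        backup = old_dll.with_name(old_dll.name + ".bak")
        if not backup.exists():
            shutil.copy2(old_dll, backup)   # keep the original so you can revert
        shutil.copy2(new_dll, old_dll)      # drop in the newer version
        print(f"Replaced {old_dll} (backup at {backup})")

swap_dlss_dll(GAME_DIR, NEW_DLL)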

Maybe this could get you banned in multiplayer games and that's why NVIDIA uses a whitelist. I've never heard of this happening but it does seem like a possibility.
 
Cleaned up the thread to keep the high-signal information and remove the back-and-forth about how best to count the number of titles that support FSR3.1 and DLSS2 or higher (including cases where the user or a tool does some work to upgrade the DLLs to the latest versions, assuming hardware support and the game allowing it).
 
This makes sense, as the majority of PC gamers are still stuck on 8 GB GPUs, largely because AMD and Nvidia keep shipping low-VRAM models and appear to be actively trying to kill PC gaming, but I digress.

Yes, because providing cheaper alternative parts is killing PC gaming.
PC gaming would just thrive if the only GPU available were the $9000 96GB RTX 6000.
 
Yes, because providing cheaper alternative parts is killing PC gaming.
PC gaming would just thrive if the only GPU available were the $9000 96GB RTX 6000.

They haven't been alternative parts until this generation at the earliest. Even then, the laptop and pre-built market will still make 8GB end up as the standard for this gen too. For the mainstream, 8GB has been the de facto standard for a long time now, except, of course, the 3060 (although higher-end parts like the 3060 Ti and 3070/Ti made that part an anomaly).

In my mind there was a clear alternate path they could have taken: keep bus widths wider until GDDR7 could come along to increase bandwidth AND capacity. Instead they cheaped out, and textures in games are held back because they don't want to increase install size for everyone.
 
They haven't been alternative parts until this generation at the earliest.
2060 6GB and 12GB (plus the 2060 Super with 8GB).
3060 8GB and 12GB.
4060 Ti 8GB and 16GB.
These low-VRAM parts have always been a cheaper alternative for the lower end of the market.

In my mind there was a clear alternate path they could have taken: keep bus widths wider
Wider buses mean bigger chips, and more memory chips mean higher prices.
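
To put rough numbers on that trade-off, here's a quick napkin-math sketch (illustrative values only: 32-bit interface per GDDR chip, 2GB modules at 18 Gbps, ignoring clamshell configs, which double capacity but not bandwidth):

Code:
# Bus width fixes the minimum chip count (32-bit interface per GDDR chip),
# which in turn fixes capacity (chips x density) and bandwidth (bus width x data rate).
def memory_config(bus_width_bits: int, chip_density_gb: int, data_rate_gbps: float):
    chips = bus_width_bits // 32
    capacity_gb = chips * chip_density_gb
    bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8
    return chips, capacity_gb, bandwidth_gb_s

for bus in (128, 192, 256):
    chips, cap, bw = memory_config(bus, chip_density_gb=2, data_rate_gbps=18.0)
    print(f"{bus}-bit bus: {chips} chips, {cap} GB, {bw:.0f} GB/s")

# 128-bit: 4 chips,  8 GB, 288 GB/s
# 192-bit: 6 chips, 12 GB, 432 GB/s
# 256-bit: 8 chips, 16 GB, 576 GB/s

Going from 128-bit to 192-bit buys both capacity and bandwidth, but it means two extra memory chips plus a larger memory interface on the die, which is where the cost argument comes from.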

I do wonder sometimes if people actually think that GPU makers don't work out the optimal cost strategy when they design chips, and so a guy from Australia can just suggest a better one on YouTube.
 
I'm saying that the alternative parts are the higher-VRAM options, and that as an alternative they are usually priced artificially higher than they should be.
 
I'm saying that the alternative parts are the higher-VRAM options, and that as an alternative they are usually priced artificially higher than they should be.
Options here means that the market has a choice instead of having just one part with more VRAM.
Parts with more VRAM will always be more expensive - unless the cost is saved elsewhere or eaten by margins.
VRAM is not free. Doubling the number of memory chips can easily cost more than the RT h/w in Turing did, which the same guy from Australia claimed was useless and too expensive.
 
I do wonder sometimes if people actually think that GPU makers don't work out the optimal cost strategy when they design chips, and so a guy from Australia can just suggest a better one on YouTube.

HUB has been arguing for some time now that GPU makers should just eat the cost, without regard for how many people would actually benefit from the increased VRAM pool. I wonder what kind of usage stats Nvidia and AMD have on this stuff. It certainly wouldn't make business sense to willingly eat margins on a high-volume part with already tight margins if your metrics tell you that a large majority of users wouldn't benefit.

Even when offered a choice, people seem to be voting for 8GB with their wallets when looking at 4060 vs 7600 XT sales. So if you want to see the death of 8GB cards, stop buying them.
 
Even when offered a choice, people seem to be voting for 8GB with their wallets when looking at 4060 vs 7600 XT sales. So if you want to see the death of 8GB cards, stop buying them.

Looking at it like that is a bit flawed, because the difference in product choice for the majority of buyers isn't just VRAM and price (there's a caveat here I'll go into more later). Picking a 4060 vs a 7600 XT is a much more involved choice than just a VRAM difference.

I think an issue here is how DIY enthusiasts might view the situation with respect to the market as a whole. For the DIY crowd, system memory is basically priced and configured purely as a commodity in the most common size ranges, and that is likely their anchor point for how they view things. If they want more system memory, they essentially just pay the input costs with a marginal markup (because RAM is priced as a commodity rather than a luxury). But they don't have that choice with graphics cards, so this is a bigger issue for this specific demographic.

As for the price issue, I think there's going to have to be some agreement to disagree here, as there are differing viewpoints.

We have one camp that likely either doesn't believe GPU VRAM is product- and market-segmented and priced as such, and/or thinks that practice is fine. We have another camp that likely believes GPU VRAM configurations should be priced and available like commodities (akin to system RAM) and thinks the luxury pricing and product/market segmentation that occurs is not fine.

Personally speaking, I believe GPU VRAM is market-segmented. I think there's ample evidence to show that is the case, and it's also pragmatic from a business sense. I can, however, see why vocal customers might not like that practice, as with many other similar scenarios.
 
So if you want to see the death of 8GB cards, stop buying them.
People would stop buying something if it were obviously unusable with modern-day s/w.
The fact that 8GB GPUs are still being bought in droves clearly proves that people see them as adequate for their use cases.
Arguing that 8GB is "instantly obsolete" (and whatever b.s. HUB has said about the upcoming 9060 8GB; I didn't watch) is essentially the same as calling all the people who bought these GPUs idiots, since they are supposedly buying something which is totally unusable.
The truth is the opposite though. Even the new Doom runs fine on 8GB GPUs - and even HUB had to admit it, despite trying their best to create a settings set where 8GB would choke. When you look at whatever is releasing en masse on PC, the vast majority would work fine on 8GB at 1080p or even 1440p, and often without the need to lower the holy texture quality (which, btw, isn't any different to RT - sometimes lowering that from Ultra to Medium does pretty much nothing to IQ, especially at 1080p).
One could argue that 8GB lowers the baseline of PC h/w, which means we see fewer games with better textures and/or advanced RT, but the truth is that this baseline has never been dictated by PC h/w and has always been dictated by console h/w. If the PC market suddenly dropped all options below 16GB, we would still get games with art aimed at the 8GB capacity of the Series S, which would make such a limitation artificial and actually damaging to the lower end of the PC h/w market, as it would just get more expensive without any apparent benefits.

What I will agree with, though, is that the next gen of PC GPUs should probably account for the coming generation change in consoles (probably at the end of '27), meaning that the GPUs which Nv/AMD/Intel will launch at the end of '26 should have enough VRAM to run games from the PS6/Xb+ without problems. Hopefully we'll have enough cheap 3GB G7 chips in production by then for these to be used across all lineups from low to high end.
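
For reference, the same napkin math with 3GB modules instead of 2GB (illustrative only; single-sided configs, 32-bit interface per chip, clamshell would double these):

Code:
# Capacity at common bus widths with 2 GB vs 3 GB modules.
for bus_bits in (128, 192, 256):
    chips = bus_bits // 32
    print(f"{bus_bits}-bit: {chips} chips -> {chips * 2} GB (2 GB modules) or {chips * 3} GB (3 GB modules)")

# 128-bit: 4 chips ->  8 GB or 12 GB
# 192-bit: 6 chips -> 12 GB or 18 GB
# 256-bit: 8 chips -> 16 GB or 24 GB

Which is why cheap 3GB G7 chips would lift every tier's capacity without needing wider buses.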
 
The fact that 8GB GPUs are still being bought in droves clearly proves that people see them as adequate for their use cases.
This right here is a fallacy. It doesn't prove they're adequate for people's needs; the only thing it proves is that it's a price point people are comfortable with, adequate for their needs or not.

And in the majority of cases it's just what the OEM decided to put in their machines rather than people actually choosing to buy said cards themselves; the retail video card market is a small minority.
 
Looking at it like that is a bit flawed, because the difference in product choice for the majority of buyers isn't just VRAM and price (there's a caveat here I'll go into more later). Picking a 4060 vs a 7600 XT is a much more involved choice than just a VRAM difference.

There are several outlets claiming VRAM alone is a sufficient criterion to disqualify a purchase.

We have one camp that likely either doesn't believe GPU VRAM is product- and market-segmented and priced as such, and/or thinks that practice is fine. We have another camp that likely believes GPU VRAM configurations should be priced and available like commodities (akin to system RAM) and thinks the luxury pricing and product/market segmentation that occurs is not fine.

I don't think any camp believes VRAM is a commodity. They acknowledge VRAM costs money and are demanding that graphics cards ship with a minimum of 12GB.

"Can be used" and "adequate" are two different things. People buy tons of things that aren't really adequate for their needs because what they need is out of their price point.

I think you’re confusing needs and wants.

The argument is that OEMs are maliciously shipping systems with unusable amounts of VRAM to unsuspecting customers. How does this make sense? It is much more likely that the vast majority of customers do not feel taken advantage of and are happily gaming on their 8GB cards. This will change one day of course but there's no evidence that there are tons of unhappy buyers of pre-builts out there today.
 
There are several outlets claiming VRAM alone is a sufficient criterion to disqualify a purchase.

You only mentioned what buyers are choosing. That's completely separate from what these media outlets are saying. What a specific demographic of media outlets is saying does not have a direct correlation with what buyers decide.

I don't know if we want to go into another separate discussion here, but these media outlets are not representative of buyers, nor do those buyers treat these media outlets as gospel.

Also, again I have to state that absolute terms like that are more applicable the more actual consumer choice there is. Even if we incorrectly assume that a 4060 and a 7600 XT are otherwise interchangeable aside from VRAM, the vast majority of buyers, when they actually go to purchase something like this, do not go to a store (or e-store) and just see those two options right next to each other on the shelf at official MSRP pricing.

The above is why I find a lot of these market share and similar discussions in enthusiast circles are done with very poor context.

I don't think any camp believes VRAM is a commodity. They acknowledge VRAM costs money and are demanding that graphics cards ship with a minimum of 12GB.

I'm not sure you're understanding here. Commodities still cost money; the pricing strategy is just different compared to a good that is a luxury.

As an example -

System memory (RAM) in the PC DIY market is priced like a commodity. That doesn't mean you aren't paying more for more RAM, but you are effectively paying only a slight margin on top of the input costs. There's minimal additional pricing strategy or other considerations involved beyond that.

System memory for AIO-type devices, especially with soldered memory (think laptops, Apple products, etc.), is priced like a luxury; the price difference can be well beyond input costs due to other significant considerations.
 
I think we need to level-set on this broken discussion of commodity vs luxury. Playing video games is a luxury, full stop. There are literally billions of people on this planet who do not play video games, and who will not play video games for the rest of their lives, because their socio-economic position in this life will forbid it. For so many of those billions, simple permanent housing is a luxury. Those of us here on the internet, furiously typing away on our keyboards and mice and nice LCD monitors on our personal non-shared PC built sometime in the last decade, using power from our own homes, and accessing the internet over anything more powerful than a 300-baud modem completely forget what luxury really means.

Let's move the goal posts one HUGE step up from the socio-economic status which encompasses the whole globe, and limit our conversation only to people with sufficient disposable income to play even the simplest of video games. Has it yet dawned on anyone here that the overwhelming supermajority of literally ALL video games are fully playable on integrated graphics? GPUs are still an absurdist luxury item when compared to a decade-old used phone and to the people who can afford them. This is ARM-based tablets and phones pulling games from the iTunes or Google Play stores, or any number of "deprecated" consoles like the PS3 or X360, or old laptops perhaps with some level of dGPU but from four generations ago like a 1060MQ or a Radeon 5600 or something, or even just an Intel i630. I guess you could include whatever hardware came on the Wii U...

Let's move the goal posts again, now to the people who have sufficient disposable income to play games made within the last half decade on a "marketed as gaming" device made and sold within the same timeframe. Even with the goal posts here, the overwhelming majority of video games are still fully playable on IGP-level hardware. The only difference here is that the IGP hardware is going to be quite a bit more performant than in the aforementioned group. It's worth pointing out that every console (except one IIRC) made in the last half decade is running a Radeon iGPU and shared memory on an x86 proc.

Even as we sit here today opining about HUB, the overwhelming majority of dGPU cards in the Steam hardware survey today and right now are still equipped with 8GB VRAM or less.

The fact that we're even trying to argue about a $500+ console with $80+ games comparing against >$700 video cards while still trying to describe literally any of this as "commodity" is insanity. There's nothing commodity about an 8GB VRAM dGPU even today. Buying a dGPU at all is an extreme luxury, even if most of the folks here are too privileged to recognize it.

Eight gigabytes of VRAM is absolutely a luxury.
 
Everything you said is true but:

Eight gigabytes of VRAM is absolutely a luxury.
It still won't be enough for gaming past 1080p soon. It's a luxury, sure, but we're looking at the current value proposition of the newest hardware. How fortunate or not we are to be able to access/afford it is kind of irrelevant to that discussion and feels a bit strawmanny, no offense intended.
 
It still won't be enough for gaming past 1080p soon.
You make this statement based on what, exactly? About one dozen games, available today, which actually still do run on 8GB of VRAM if you turn a few of the options down to medium? What about the other 100 "really big" games released in the last three years which will run on it without any issues? And the next 100 which probably will have the same success rate?

What you're saying is that every current console will completely die, outside of the PS5 Pro, because all upcoming games will completely stop functioning with "only" 8GB of memory addressable by the GPU (let's not even dwell on the performance issues of a shared memory subsystem, which exists on every console built today). You know that isn't true, and for as long as it isn't true, 8GB of video memory is a more-than-just-viable product.
 
I think we need to level-set on this broken discussion of commodity vs luxury. Playing video games is a luxury, full stop. There are literally billions of people on this planet who do not play video games, and who will not play video games for the rest of their lives, because their socio-economic position in this life will forbid it. For so many of those billions, simple permanent housing is a luxury. Those of us here on the internet, furiously typing away on our keyboards and mice and nice LCD monitors on our personal non-shared PC built sometime in the last decade, using power from our own homes, and accessing the internet over anything more powerful than a 300-baud modem completely forget what luxury really means.

Let's move the goal posts one HUGE step up from the socio-economic status which encompasses the whole globe, and limit our conversation only to people with sufficient disposable income to play even the simplest of video games. Has it yet dawned on anyone here that the overwhelming supermajority of literally ALL video games are fully playable on integrated graphics? GPUs are still an absurdist luxury item when compared to a decade-old used phone and to the people who can afford them. This is ARM-based tablets and phones pulling games from the iTunes or Google Play stores, or any number of "deprecated" consoles like the PS3 or X360, or old laptops perhaps with some level of dGPU but from four generations ago like a 1060MQ or a Radeon 5600 or something, or even just an Intel i630. I guess you could include whatever hardware came on the Wii U...

Let's move the goal posts again, now to the people who have sufficient disposable income to play games made within the last half decade on a "marketed as gaming" device made and sold within the same timeframe. Even with the goal posts here, the overwhelming majority of video games are still fully playable on IGP-level hardware. The only difference here is that the IGP hardware is going to be quite a bit more performant than in the aforementioned group. It's worth pointing out that every console (except one IIRC) made in the last half decade is running a Radeon iGPU and shared memory on an x86 proc.

Even as we sit here today opining about HUB, the overwhelming majority of dGPU cards in the Steam hardware survey today and right now are still equipped with 8GB VRAM or less.

The fact that we're even trying to argue about a $500+ console with $80+ games comparing against >$700 video cards while still trying to describe literally any of this as "commodity" is insanity. There's nothing commodity about an 8GB VRAM dGPU even today. Buying a dGPU at all is an extreme luxury, even if most of the folks here are too privileged to recognize it.

Eight gigabytes of VRAM is absolutely a luxury.
The point is that RAM (including GDDR) is a commodity.

Would it be okay if IHVs stopped adding VRAM to their mainstream (most popular) GPUs once they hit 4GB? If not, why?
 