Not all DLSS2 games can be upgraded to DLSS3
What’s preventing you from doing a DLL swap on any DLSS2 game?
Maybe there are some exceptions but since DLSS2 you should be able to put whatever DLL you want in there. You can even choose what preset you want to use. IDK why NV App uses a whitelist for this functionality. There are other programs you can use for this, or you can do it yourself manually. I've never encountered a problem doing this, but it's been a while since I bothered. Nvidia's own list states 173 games total are natively supported or upgraded to support DLSS3.
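For anyone wondering what the "do it yourself manually" route mentioned above actually amounts to, here is a minimal sketch. The paths are placeholders, and the only real assumption is that the game ships DLSS Super Resolution as nvngx_dlss.dll somewhere in its install folder; it backs up the existing DLL and copies a newer one over it:

```python
# Minimal sketch of a manual DLSS DLL swap: back up the game's existing
# nvngx_dlss.dll and overwrite it with a newer copy you provide.
# Paths are placeholders -- point them at your own install and download.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")            # hypothetical game install
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLSS DLL you supply

# Search recursively, since some games keep the DLL in a subfolder.
for old_dll in game_dir.rglob("nvngx_dlss.dll"):
    backup = old_dll.with_name(old_dll.name + ".bak")
    if not backup.exists():
        shutil.copy2(old_dll, backup)            # keep the original for rollback
    shutil.copy2(new_dll, old_dll)               # drop in the newer version
    print(f"replaced {old_dll}")
```

If the game misbehaves afterwards, restoring the .bak file puts things back the way they were.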
Not all DLSS2 games can be upgraded to DLSS3, as noted by Nvidia's list of supported titles.
It sounds like you are conflating specific definitions and features that Nvidia created; see below.
"DLSS 3 is a revolutionary breakthrough in AI-powered graphics that massively boosts performance, while maintaining great image quality and responsiveness. Building upon DLSS Super Resolution, DLSS 3 adds Optical Multi Frame Generation to generate entirely new frames, and integrates NVIDIA Reflex low latency technology for optimal responsiveness. DLSS 3 is powered by the new fourth-generation Tensor Cores and Optical Flow Accelerator of the NVIDIA Ada Lovelace architecture, which powers GeForce RTX 40 Series graphics cards."
This makes sense, as the majority of PC gamers are still stuck on 8 GB GPUs, largely due to AMD and Nvidia continuing to ship low-VRAM models; the two of them appear to be actively trying to kill PC gaming, but I digress.
Yes because providing cheaper alternative parts is killing PC gaming.
PC gaming would just thrive if the only GPU available were the $9,000 96GB RTX 6000.
2060 6GB and 8GB and even 12GB. They haven't been alternative parts until this generation at the earliest.
Wider buses mean bigger chips, and more memory chips mean higher prices. In my mind there was a clear alternate path they could have taken: keep bus widths wider
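To put rough numbers behind the bus-width point, here is a back-of-the-envelope sketch (my own figures, not from anyone in the thread): each GDDR6 package has a 32-bit interface, so the bus width fixes how many memory chips a card carries, and chip count times per-chip density fixes the capacity unless the vendor pays extra for a clamshell layout with two chips per channel.

```python
# Back-of-the-envelope: how bus width constrains VRAM capacity with GDDR6.
# Assumptions: 32-bit interface per GDDR6 package, 2 GB (16 Gbit) per chip,
# clamshell = two chips per 32-bit channel (doubles capacity, costs more).

CHIP_BUS_BITS = 32   # interface width of one GDDR6 package
GB_PER_CHIP = 2      # common 16 Gbit density

for bus_bits in (128, 192, 256, 384):
    chips = bus_bits // CHIP_BUS_BITS
    normal = chips * GB_PER_CHIP
    print(f"{bus_bits:>3}-bit bus: {chips:>2} chips -> {normal} GB (clamshell: {normal * 2} GB)")
```

Which is why a 128-bit card lands on 8 GB and a 192-bit card on 12 GB: getting more VRAM at the same bus width means either denser chips or a pricier clamshell board.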
Options here means that the market has a choice instead of having just one part with more VRAM. I'm saying that the alternative parts are the higher-VRAM options, and that as an alternative they are usually priced artificially higher than they should be.
I do wonder sometimes if people actually think that GPU makers don't work out the optimal cost strategy when they design chips, and that a guy from Australia can just suggest a better one on YouTube.
Even when offered a choice, people seem to be voting for 8GB with their wallets, going by 4060 vs 7600 XT sales. So if you want to see the death of 8GB cards, stop buying them.
People would only stop buying something if it were obviously unusable in modern-day software.
The fact that 8GB GPUs are still being bought in droves clearly proves that people see them as adequate for their use cases.
This right here is a fallacy. It doesn't prove they're adequate for people's needs; the only thing it proves is that it's a price point people are comfortable with, adequate for their needs or not.
How often do you buy things which can't be used just because they are cheap?
"Can't be used" and "adequate" are two different things. People buy tons of things that aren't really adequate for their needs because what they actually need is out of their price point.
Looking at it like that is a bit flawed, because the difference in product choice for the majority of buyers isn't just VRAM and price (there's a caveat here I'll go into more later). Picking a 4060 vs a 7600 XT is a much more involved choice than just a VRAM difference.
We have one camp that likely either doesn't believe GPU VRAM is product- and market-segmented and priced as such, and/or thinks that practice is fine. We have another camp that likely believes GPU VRAM configurations should be priced and available like commodities (akin to system RAM) and thinks the luxury pricing and product/market segmentation that occurs is not fine.
There are several outlets claiming that VRAM alone is sufficient grounds to disqualify a purchase.
I don't think any camp believes VRAM is a commodity. They acknowledge VRAM costs money and are demanding that graphics cards ship with a minimum of 12GB.
Eight gigabytes of VRAM is absolutely a luxury.
Still won't be enough for gaming past 1080p soon. It's a luxury, sure, but we're looking at the current value prospects of the newest hardware. How fortunate or not we are in being able to access/afford it is kind of irrelevant to that discussion and feels a bit strawmanny, no offense intended.
You make this statement based on what, exactly? About one dozen games, available today, which actually still do run on 8GB of VRAM if you turn a few of the options down to medium? What about the other 100 "really big" games released in the last three years which run on it without any issues? And the next 100, which will probably have the same success rate?
The point is that RAM (including GDDR) is a commodity.
I think we need to level-set on this broken discussion of commodity vs luxury. Playing video games is a luxury, full stop. There are literally billions of people on this planet who do not play video games, and who will not play video games for the rest of their lives, because their socio-economic position in this life will forbid it. For so many of those billions, simple permanent housing is a luxury. Those of us here on the internet, furiously typing away on our keyboards and mice and nice LCD monitors on our personal non-shared PC built sometime in the last decade, using power from our own homes, and accessing the internet over anything more powerful than a 300-baud modem completely forget what luxury really means.
Let's move the goal posts one HUGE step up from the socio-economic status which encompasses the whole globe, and limit our conversation only to people with sufficient disposable income to play even the simplest of video games. Has it yet dawned on anyone here that the overwhelming supermajority of literally ALL video games are fully playable on integrated graphics? GPUs are still an absurdist luxury item when compared to a decade-old used phone, even to the people who can afford them. This is ARM-based tablets and phones pulling games from the iTunes or Google Play stores, or any number of "deprecated" consoles like the PS3 or X360, or old laptops perhaps with some level of dGPU but from four generations ago like a 1060MQ or a Radeon 5600 or something, or even just an Intel i630. I guess you could include whatever hardware came on the Wii U...
Let's move the goal posts again, now to the people who have sufficient disposable income to play games made within the last half decade on a "marketed as gaming" device made and sold within the same timeframe. Even with the goal posts here, the overwhelming majority of video games are still fully playable on IGP-level hardware. The only difference here is that the IGP hardware is going to be quite a bit more performant than in the aforementioned group. It's worth pointing out that every console (except one, IIRC) made in the last half decade is running a Radeon iGPU and shared memory on an x86 proc.
Even as we sit here today opining about HUB, the overwhelming majority of dGPUs in the Steam hardware survey are still equipped with 8GB of VRAM or less.
The fact that we're even trying to argue about a $500+ console with $80+ games, compared against $700+ video cards, while still trying to describe literally any of this as "commodity", is insanity. There's nothing commodity about an 8GB VRAM dGPU even today. Buying a dGPU at all is an extreme luxury, even if most of the folks here are too privileged to recognize it.