NVIDIA discussion [2025]

A Titan cost $1000 in 2013. Nobody is against the existence of expensive GPUs. But what Nvidia GPUs actually offer today, at almost any given price point, is quite a bit worse than what we used to get in relative terms.

It's completely undeniable. At least to anybody outside this bizarre forum. It's more than four years since the new consoles came out, and people are still having to spend $350-400+ just to get a new GPU where you don't have to lower the settings to below console quality. That's just insane. Nvidia is so obviously upselling us on these GPUs. A 4060 is a tiny low-end GPU. The 4060 Ti is similarly pretty dang small, and in both cases, their 8GB of VRAM wouldn't be questioned much at all if these were named 4050 and 4050 Ti and priced appropriately. And they've done this upselling across basically the whole range.

That's how Lovelace in particular jacked up prices on us. So yes, people are 100% correct when they say that GPUs have gotten more expensive and become worse value. Because they simply have. Whether you think there's some justifiable financial need for them to do this is a different argument, but the fact that we're getting less for more these days is undeniable.
How have they gotten more expensive? Do you have to pay more to get the same performance in a newer generation? No. At any dollar amount, every new generation gives you more performance.

What has changed is the amount of “more”. We all understand that. We are all disappointed by it. Can we move the fuck on?
 
How have they gotten more expensive?

Well, he did specify his criteria quite clearly in the very post you quoted:

It's more than four years since the new consoles came out, and people are still having to spend $350-400+ just to get a new GPU where you don't have to lower the settings to below console quality.

Now, it's not a perfect metric - comparing like-for-like performance profiles across different platforms is a little harder when we have things like DLSS, and far more widespread use of dynamic resolution on console.

However, I can't see anyone not at least recognizing that it's a new thing to still be spending ~70% of the price of an entire console, halfway through the console's lifespan, just for a competitive GPU - one that in some cases will result in an even worse experience due to VRAM limitations on these ~$300 cards.
 
First: in what scenario does a console have more VRAM than literally any of the new cards of this gen? Or how about the graphics cards of last gen? Then, how are we treating console graphics, which live within a far more restricted memory footprint in terms of both absolute capacity and bandwidth, as somehow "equal to" the graphics output from the same engine burning through notably more graphics memory on a dGPU on the PC platform?

Second, I fail to understand how anyone thinks the consoles are further ahead than the dGPUs at equal graphics settings. Let's ACTUALLY talk about the frametime consistency of a console versus a PC, let's talk about the actual raster resolution (yeah, upscaling has been a thing on consoles for a long time now), let's talk about object density, let's talk about polygon budget, let's talk about particle density, let's talk about dynamic lighting.

And finally, a new dGPU is NOT required (in the slightest) to "match" console performance. A used 1080 Ti can match console performance today, and it's seven years old and costs less than a Switch Lite.

P.S. Are we going to just handwave off how incredibly subsidized consoles are, and the prices you're required to pay for a locked ecosystem of games? Nah? Backwards compatibility? No? Let's not start a PC vs Console war in here, because it's "undeniable" (to borrow seanspeed's word) the consoles are not on equal footing to a PC for a litany of obvious reasons.
 
Well, he did specify his criteria quite clearly in the very post you quoted:

Now, it's not a perfect metric - comparing like-for-like performance profiles across different platforms is a little harder when we have things like DLSS, and far more widespread use of dynamic resolution on console.

However, I can't see anyone not at least recognizing that it's a new thing to still be spending ~70% of the price of an entire console, halfway through the console's lifespan, just for a competitive GPU - one that in some cases will result in an even worse experience due to VRAM limitations on these ~$300 cards.
These are all arbitrary metrics. You and I can cook up 10 others to express our disappointment, but that doesn't change the root issue - that perf/$ for GPUs has stopped increasing at the rate that has been programmed into our heads as a seemingly inalienable birthright.

Console vs. PC comparisons seem interesting because the PS5 and XBSX rode out the last major slope of Moore's Law's final gasp. Look at the PS5 Pro's price.
 
First: in what scenario does a console have more VRAM than literally any of the new cards of this gen?

In any scenario where its shared 16GB allows texture settings that have to be lowered on the PC version running on 8GB cards at otherwise comparable settings. You'll have to lower texture settings on an 8GB card for titles like Spider-Man (when using RT), Hogwarts, Diablo IV, Ratchet and Clank, and Hellblade 2, just off the top of my head.

Let's ACTUALLY talk about the frametime consistency of a console versus a PC

<cough>#stutterstruggle<cough> We uh, really want to go down this road?

let's talk about the actual raster resolution (yeah, upscaling has been a thing on consoles for a long time now), let's talk about object density, let's talk about polygon budget, let's talk about particle density, let's talk about dynamic lighting.

And finally, a new dGPU is NOT required (in the slightest) to "match" console performance. A used 1080 Ti can match console performance today, and it's seven years old and costs less than a Switch Lite.

Well yeah - considering it has 11GB of VRAM, which is the main sticking point with cards in the $300 range. The very fact that your go-to card for this comparison is a 7-year-old flagship just reinforces seanspeed's point! Yes, that card would be decently competitive, precisely because it addresses the main problem that we have with ~$300 GPUs from the past 2 generations!

Regardless, not sure why used products with no warranty and extremely limited availability are being compared to actual products shipping new from the manufacturer.

P.S. Are we going to just handwave off how incredibly subsidized consoles are,

No one is 'handwaving' it off, which is why the discussion is about the price vs. 4-year-old consoles, not consoles when they first come out. The PC is always at a big disadvantage in price/performance against a brand-new console, but this long into the gen it is indeed a new development that $300 GPUs aren't just wiping the floor with them, and in some cases require quality compromises to boot.

and the prices you're required to pay for a locked ecosystem of games? Nah? Backwards compatibility? No? Let's not start a PC vs Console war in here,

It seems that's what you were kinda going for with this post? This isn't about the viability of each as a game platform and all their relative strengths and foibles; seanspeed and I have merely noted (as Digital Foundry has done many times) that the ~$350 segment of cards has remained relatively weak compared to launch-day consoles.

that perf/$ for GPUs has stopped increasing at the rate that has been programmed into our heads as a seemingly inalienable birthright.

:rolleyes:
 
Nvidia is so obviously upselling us on these GPUs.

I don't want to respond to the comparisons with consoles because this will always end up in the meaningless PC vs console wars, so I'll only respond to this.
Do you have any evidence? I mean, let's not forget that AMD lost market share during this time. If NVIDIA is so "obviously upselling" these GPUs, it should be trivial for AMD to sell their GPUs at much lower prices and gain market share, instead of losing it.
 
@pcchen In the past, when either Nvidia or AMD slipped, the other punished them for it. The fact that both of them are struggling to realize huge gains tells me a lot about how difficult it is. The fact that Sony is now saying that rasterization is nearly a dead end and that CNNs plus ray tracing are needed is just the last confirmation I needed. Nvidia made a bet with the 20 series, and it turned out to be correct. People waiting for an easy or cheap fix have their heads buried in the sand.
 