NVIDIA discussion [2025]

A Titan cost $1,000 in 2013. Nobody is against the existence of expensive GPUs. But what Nvidia GPUs actually offer today, at almost any given price point, is quite a bit worse in relative terms than what we used to get.

It's completely undeniable. At least to anybody outside this bizarre forum. It's more than four years since the new consoles came out, and people are still having to spend $350-400+ just to get a new GPU where you don't have to lower the settings to below console quality. That's just insane. Nvidia is so obviously upselling us on these GPUs. A 4060 is a tiny low-end GPU. The 4060 Ti is similarly pretty dang small, and in both cases their 8GB of VRAM wouldn't be questioned much at all if these were named 4050 and 4050 Ti and priced appropriately. And they've done this upselling across basically the whole range.

That's how Lovelace in particular jacked up prices on us. So yes, people are 100% correct when they say that GPUs have gotten more expensive and worse value, because they simply have. Whether you think there's some justifiable financial need for them to do this is a different argument, but the fact that we're getting less for more these days is undeniable.
How have they gotten more expensive? Do you have to pay more to get the same performance in a newer generation? No. At any dollar amount, every new generation gives you more performance.

What has changed is the amount of “more”. We all understand that. We are all disappointed by it. Can we move the fuck on?
 
How have they gotten more expensive?

Well, he did specify his criteria quite clearly in the very post you quoted:

It's more than four years since the new consoles came out, and people are still having to spend $350-400+ just to get a new GPU where you don't have to lower the settings to below console quality.

Now, it's not a perfect metric - like-for-like comparisons of performance profiles across different platforms are a little harder to make when we have things like DLSS, and far more widespread use of dynamic resolution on console.

However, I can't see anyone not at least recognizing that halfway through a console's lifespan, still having to spend ~70% of the price of an entire console for just a competitive GPU - one that in some cases will result in an even worse experience due to VRAM limitations on these ~$300 cards - is a new thing.
 
First: in what scenario does a console have more VRAM than literally any of the new cards of this gen? Or how about the graphics cards of last gen? Then, how are we treating console graphics, which live within this far more restricted memory footprint in terms of both absolute capacity and bandwidth, as somehow "equal to" the graphics output from the same engine burning through notably more graphics memory on a dGPU on the PC platform?

Second, I fail to understand how anyone thinks the consoles are ahead of the dGPUs at equal graphics settings. Let's ACTUALLY talk about the frametime consistency of a console versus a PC, let's talk about the actual raster resolution (yeah, upscaling has been a thing on consoles for a long time now), let's talk about object density, let's talk about polygon budget, let's talk about particle density, let's talk about dynamic lighting.

And finally, a new dGPU is NOT required (in the slightest) to "match" console performance. A used 1080 Ti can match console performance today, and it's seven years old and costs less than a Switch Lite.

P.S. Are we going to just handwave off how incredibly subsidized consoles are, and the prices you're required to pay for a locked ecosystem of games? Nah? Backwards compatibility? No? Let's not start a PC vs Console war in here, because it's "undeniable" (to borrow seanspeed's word) that the consoles are not on equal footing with a PC for a litany of obvious reasons.
 
Well, he did specify his criteria quite clearly in the very post you quoted:

Now, it's not a perfect metric - like-for-like comparisons of performance profiles across different platforms are a little harder to make when we have things like DLSS, and far more widespread use of dynamic resolution on console.

However, I can't see anyone not at least recognizing that halfway through a console's lifespan, still having to spend ~70% of the price of an entire console for just a competitive GPU - one that in some cases will result in an even worse experience due to VRAM limitations on these ~$300 cards - is a new thing.
These are all arbitrary metrics. You and I can cook up 10 others to express our disappointment, but that doesn't change the root issue -- that perf/$ for GPUs has stopped increasing at the rate that has been programmed into our heads as a seemingly inalienable birthright.

Console vs. PC comparisons only seem interesting because the PS5 and XBSX rode the last major slope of Moore's Law's final gasp. Look at the PS5 Pro's price.
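To make concrete what "perf/$" means here, a minimal sketch in Python -- the scores, prices, and generation names below are made-up placeholders, not real benchmark data:

```python
# Illustrating the perf/$ trend being argued about.
# All numbers are hypothetical placeholders, not benchmarks.

def perf_per_dollar(score: float, price: float) -> float:
    """Performance points per dollar spent."""
    return score / price

generations = {
    # name: (relative performance score, launch price in USD)
    "gen N":   (100, 300),
    "gen N+1": (130, 300),   # the historical expectation: big uplift, same price
    "gen N+2": (145, 330),   # the recent pattern: smaller uplift, higher price
}

for name, (score, price) in generations.items():
    print(f"{name}: {perf_per_dollar(score, price):.3f} perf/$")
```

The metric itself is trivial; the argument is entirely about which scores, prices, and reference points you plug in.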
 
First: in what scenario does a console have more VRAM than literally any of the new cards of this gen?

In any scenario where its shared 16GB allows for texture settings that will have to be lowered on the PC version running on 8GB cards at comparable settings. You'll have to lower texture settings on an 8GB card in titles like Spider-Man (when using RT), Hogwarts Legacy, Diablo IV, Ratchet and Clank, Hellblade 2, just off the top of my head.

Let's ACTUALLY talk about the frametime consistency of a console versus a PC

<cough>#stutterstruggle<cough> We, uh, really want to go down this road?
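For reference, frametime consistency is usually quantified via percentile frametimes -- the "1% lows" you see in reviews. A minimal sketch, with illustrative numbers rather than a real capture:

```python
# Quantifying frametime consistency from a capture.
# The sample values are illustrative, not from any real benchmark run.

def one_percent_low_fps(frametimes_ms: list[float]) -> float:
    """FPS implied by the 99th-percentile (worst 1%) frametime."""
    worst = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99)]
    return 1000.0 / worst

frametimes = [16.7] * 95 + [33.4] * 5   # mostly 60 fps with a few 30 fps stutters
avg_fps = 1000.0 / (sum(frametimes) / len(frametimes))
print(f"average: {avg_fps:.1f} fps, 1% low: {one_percent_low_fps(frametimes):.1f} fps")
```

The point: the average here still reads a healthy ~57 fps while the 1% low sits near 30 fps, which is exactly how a stuttery experience hides inside a decent-looking average.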

let's talk about the actual raster resolution (yeah, upscaling has been a thing on consoles for a long time now), let's talk about object density, let's talk about polygon budget, let's talk about particle density, let's talk about dynamic lighting.

And finally, a new dGPU is NOT required (in the slightest) to "match" console performance. A used 1080 Ti can match console performance today, and it's seven years old and costs less than a Switch Lite.

Well yeah - considering it has 11GB of VRAM, which is the main sticking point with cards in the $300 range. The very fact that your go-to card for this comparison is a 7-year-old flagship just reinforces seanspeed's point! Yes, that card would be decently competitive, precisely because it addresses the main problem we have with ~$300 GPUs from the past two generations!

Regardless, I'm not sure why used products with no warranty and extremely limited availability are being compared to actual products shipping new from the manufacturer.

P.S. Are we going to just handwave off how incredibly subsidized consoles are,

No one is 'handwaving' it off, which is why the discussion is about the price versus 4-year-old consoles, not consoles when they first come out. The PC is always at a big disadvantage in price/performance against a new console out of the gate, but this long into the gen it is indeed a new development that $300 GPUs aren't just wiping the floor with them, and in some cases require quality compromises to boot.

and the prices you're required to pay for a locked ecosystem of games? Nah? Backwards compatibility? No? Let's not start a PC vs Console war in here,

It seems that's what you were kinda going for with this post? This isn't about the viability of each as a game platform and all their relative strengths and foibles; seanspeed and I have merely noted (as Digital Foundry has done themselves many times) how the ~$350 segment of cards has remained relatively weak compared to launch-day consoles.

that perf/$ for GPUs has stopped increasing at the rate that has been programmed into our heads as a seemingly inalienable birth right.

:rolleyes:
 
Nvidia is so obviously upselling us on these GPUs.

I don't want to respond to the comparisons with consoles because this will always end up in the meaningless PC vs console wars, so I'll only respond to this.
Do you have any evidence? I mean, let's not forget that AMD lost market share during this time. If NVIDIA is so "obviously upselling" these GPUs, it should be trivial for AMD to sell their GPUs at much lower prices to gain market share, instead of losing it.
 
@pcchen in the past, when either Nvidia or AMD slipped, the other punished them for it. The fact that both of them are struggling to realize huge gains tells me a lot about how difficult it is. The fact that Sony is now saying that rasterization is nearly a dead end and CNNs plus ray tracing are needed is just the last confirmation I needed. Nvidia made a bet with the 20 series and it turned out to be correct. People waiting for an easy or cheap fix have their heads buried in the sand.
 
People waiting for an easy or cheap fix have their heads buried in the sand.

I think the proposal here is that these companies should eat the increased cost and give us great perf/$ increases while intentionally reducing their profits for no reason. Fairy tale stuff.

Intel seems to be trying that to some extent, but their tech just isn't competitive. AMD may try it with RDNA 4 to buy market share. We'll see if fairy tales do come true.
 
Intel seems to be trying that to some extent, but their tech just isn't competitive.
Intel is doing it because there are plenty of reasons why their GPUs (or CPUs, for that matter) wouldn't sell at the same price as the competition's. So it's not really a validation of the idea so much as the opposite of it: without any solid reason, no GPU maker would reduce their margins to provide the same h/w at lower pricing.

It is also arguably a very destructive idea for the market in general.
Without profits there would be no R&D on future graphics h/w, and we'd be stuck with even smaller perf/price increases.
Also, for every vendor, a lack of GPU division profitability would actually make the argument against that division's existence a lot stronger - something which has been popular on the Internet lately in the form of "Nvidia abandoning gamers for the AI market". Well, if they were pushed into making their gaming GPU division's margins low or even negative, then guess what - that scenario would actually stop being the usual Internet FUD and start making business sense.
 
How have they gotten more expensive? Do you have to pay more to get the same performance in a newer generation? No. At any dollar amount, every new generation gives you more performance.

What has changed is the amount of “more”. We all understand that. We are all disappointed by it. Can we move the fuck on?

In absolute terms there have been significant performance increases. But the complaint is about “relative” performance. So the comparison point is a moving target with lots of arbitrary variables.

Clearly a $300 card today will absolutely destroy a $300 card from 10 years ago, even before accounting for inflation. Account for it, and that 10-year-old card would cost about $400 today.
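A quick back-of-envelope check on that figure, assuming roughly 3% average annual US inflation over the decade (the exact CPI number depends on which years you pick):

```python
# Sanity check on the inflation claim.
# ASSUMPTION: ~3% average annual US inflation; the CPI-based figure
# varies with the exact start and end years chosen.

price_then = 300.0          # USD, 10 years ago
annual_inflation = 0.03     # assumed average rate
years = 10

price_now = price_then * (1 + annual_inflation) ** years
print(f"${price_then:.0f} then is roughly ${price_now:.0f} today")  # ~$403
```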

I think this explains why these “terrible value” cards are still selling like crazy. The people who actually play games on them are having a decent experience and aren’t sitting there doing these arbitrary relative comparisons.
 
Morgan Stanley’s view on TSMC’s CoWoS order fluctuations highlights that some customers, like AMD and Broadcom, are releasing CoWoS-S capacity due to weaker demand. NVIDIA, however, has stepped in and requested TSMC to convert this capacity to CoWoS-L for GB300A production. Despite these shifts, TSMC’s overall CoWoS demand remains steady, with a slight potential increase in GB300A production later this year.

Note: H100 and H200 use CoWoS-S, while B200 and B300 use CoWoS-L.

 
NVIDIA is set to wipe out any hope of competition entirely: the next-gen Rubin architecture is expected to enter "trial production" by H2 2025, as SK Hynix is now focused on supplying HBM4 earlier than scheduled.

The report claims that the HBM4 tape-out to NVIDIA was already done in Q4 2024, meaning that SK Hynix had already completed verification stages with mainstream partners. Along with this, it is said that NVIDIA will be SK Hynix's "exclusive" HBM4 customer, which means that Team Green will get access to the cutting-edge HBM far earlier than the markets expected. This could effectively mean that the HBM4-focused Rubin AI lineup will be released ahead of its timeline, likely in the second half of this year.

Diving into why HBM4 is crucial, and probably a massive catalyst in the growth of AI markets: the standard will integrate memory and logic semiconductors into a single package. This means there won't be a need for separate packaging technology, and, given that the individual dies would sit much closer together with this implementation, it would prove to be much more performance efficient. Coming in with 24 Gb and 32 Gb layers, HBM4 is said to feature speeds of up to 6.4 Gbps, making it much superior to its previous-gen counterparts.

If we do see the debut of Rubin by Q4 2025, it won't be wrong to say that Team Green will wipe out the competition in this segment entirely.
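For context on that 6.4 Gbps figure, per-stack bandwidth follows directly from pin speed times interface width. A rough sketch, assuming the widely reported 2048-bit HBM4 interface (double HBM3's 1024-bit):

```python
# Back-of-envelope HBM4 per-stack bandwidth.
# ASSUMPTION: 2048-bit interface per stack, as widely reported for HBM4.

pin_speed_gbps = 6.4     # per-pin data rate from the article
bus_width_bits = 2048    # assumed HBM4 interface width per stack

bandwidth_gbs = pin_speed_gbps * bus_width_bits / 8   # bits -> bytes
print(f"~{bandwidth_gbs:.0f} GB/s per stack")         # ~1638 GB/s, i.e. ~1.6 TB/s
```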

 
I don't want to respond to the comparisons with consoles because this will always end up in the meaningless PC vs console wars, so I'll only respond to this.
Do you have any evidence?
Yes, I've pretty much proven it before a number of times, but the response from people here, including a couple mods, is that I simply need to STOP bringing up such facts that demonstrate exactly what I'm talking about.

It's absurd, but that's the situation. I'm literally not allowed to argue the obvious, demonstrable facts that prove what I'm saying, because people here don't want to hear it and simply don't think 'greed' is a thing that can even exist in a conceptual sense. Any price that is, must be perfectly logical and justified, always. The free market is perfect.
 
Yes, I've pretty much proven it before a number of times, but the response from people here, including a couple mods, is that I simply need to STOP bringing up such facts that demonstrate exactly what I'm talking about.

It's absurd, but that's the situation. I'm literally not allowed to argue the obvious, demonstrable facts that prove what I'm saying, because people here don't want to hear it and simply don't think 'greed' is a thing that can even exist in a conceptual sense. Any price that is, must be perfectly logical and justified, always. The free market is perfect.

Yet you haven't answered my question. If you have your "evidence", that'd be quite easy.
I don't really want to waste any more time on this topic, but I'll give another example: smartphones. Apple is not willing to lower their prices because they want to preserve their brand value. It's very common; Ferrari is not going to sell a US$20k car either. However, as you can see, in this case other competitors are willing to sell at lower prices and now they have more market share. Apple's global market share is now less than 20%, although they still make a lot of money.
Now, back to my question: what's your theory on why AMD is not willing to grow their market share by lowering their prices even more?
 
NVIDIA is set to wipe out any hope of competition entirely: the next-gen Rubin architecture is expected to enter "trial production" by H2 2025, as SK Hynix is now focused on supplying HBM4 earlier than scheduled.

If true, why would anyone invest in Blackwell, especially given the reports of delays and overheating, unless Rubin is going to be supply-limited for a long time? Hopper got a nice long run.
 
NVIDIA is set to wipe out any hope of competition entirely: the next-gen Rubin architecture is expected to enter "trial production" by H2 2025, as SK Hynix is now focused on supplying HBM4 earlier than scheduled.
If true, why would anyone invest in Blackwell, especially given the reports of delays and overheating, unless Rubin is going to be supply-limited for a long time? Hopper got a nice long run.
Not to mention they've yet to release the "Blackwell Ultra", which Jensen confirmed again just a couple of days ago to be their next AI chip.
 
If true, why would anyone invest in Blackwell, especially given the reports of delays and overheating, unless Rubin is going to be supply-limited for a long time? Hopper got a nice long run.
They have pulled their plans forward in a big way: Blackwell Ultra (B300) is coming 6 months early, and Rubin is the same. It's the same situation as with H100 and H200. NVIDIA keeps pumping out new hardware, and customers buy what's available according to their budget, order volume and development plans.

As for reports of delay:
Taiwan suppliers of Nvidia GB200 servers and components responded to fresh reports of overheating issues on GB200 servers by saying, "How many times is this rumor going to get repeated?", media reported, adding that GB200 shipments are on schedule and have not been impacted.

 