Nvidia GeForce RTX 50-series product value

Ok but, the problem is that people like to make bad faith arguments that go along the lines of, "the economics of advanced nodes, increases in input prices, etc. are leading to increases in prices for GPUs"..... Albuquerque made a similar argument, but it's just false.
Please explain in clear terms how the economies of chip complexity, PCB complexity, and power density are not linked to increases in GPU pricing. Be specific.

You need to be able to concisely describe why the extreme difficulty in creating these absolute monsters of GPUs is somehow a "bad faith" argument in how pricing is determined, because for me it's really damned clear-cut how radically complex systems, with billions of transistors, 12 or more layers of PCB, and multiple hundreds of watts of power consumption crammed into just a few square inches, can radically impact price.

Completely irrespective of your opinion on value, these complexities absolutely drive price.
 
Something has clearly shifted in the value equation. Observable consumer behavior is out of sync with what we’re being told consumers should be doing. I find it incredible that gamers successfully pushed back on the $1200 4080 and the silly 4080 12GB yet are now camping outside in the cold to buy $1500 5080s.
This is what gets me. Only thing I can think of is that it's been effectively impossible to purchase a high end GPU for a couple months now leading to pent up demand. I don't recall other launches (cryptoboom excepted) where stock on the previous generation disappeared months before the replacements arrived. Too bad for AMD that Radeons have also vanished, maybe they could actually move some cards if they're the only option. WTF is this market?

I've seen a couple videos of reviewers (LTT being one) despairing and openly wondering if what they are doing is of any value anymore. TBH I'm not sure. Value judgements of MSRP cards are effectively useless at the moment (even of 40 series cards) and it's unclear if the cards will ever be widely available at MSRP. It will certainly be a good while before that is the case.
 
I've seen a couple videos of reviewers (LTT being one) despairing and openly wondering if what they are doing is of any value anymore. TBH I'm not sure. Value judgements of MSRP cards are effectively useless at the moment (even of 40 series cards) and it's unclear if the cards will ever be widely available at MSRP. It will certainly be a good while before that is the case.

Jay and Steve were really emotional in their coverage this time around. F bombs galore. I particularly enjoyed Jay’s video intentionally shitting all over his NDA. Maybe they’re doing it for clicks but maybe they’re genuinely frustrated by the constant price gouging. Jay also threw his hands up in despair that nothing he says matters anyway. Steve is imploring everyone to not buy a 5070 Ti for $1000 but his message may land on deaf ears which is kinda nuts.

It must not feel good to receive MSRP cards to review that then show up at Microcenter with a $150 markup. They’re absolutely right to call that out and not be willing puppets in whatever game Nvidia/AIB/retailers are playing. ASUS supposedly confirmed that their MSRP 5070 Ti is supposed to retail for $750, so that points the finger at Microcenter for jacking up prices due to low supply.

These guys obviously don’t have crystal balls and don’t know what’s going to happen in the next few weeks and months and can only react to what’s happening now. Hopefully sanity prevails once the 9070XT launches and supply catches up. If that happens the current hysteria will seem a bit childish in hindsight.
 
This is what gets me. Only thing I can think of is that it's been effectively impossible to purchase a high end GPU for a couple months now leading to pent up demand. I don't recall other launches (cryptoboom excepted) where stock on the previous generation disappeared months before the replacements arrived. Too bad for AMD that Radeons have also vanished, maybe they could actually move some cards if they're the only option. WTF is this market?

I've seen a couple videos of reviewers (LTT being one) despairing and openly wondering if what they are doing is of any value anymore. TBH I'm not sure. Value judgements of MSRP cards are effectively useless at the moment (even of 40 series cards) and it's unclear if the cards will ever be widely available at MSRP. It will certainly be a good while before that is the case.
AMD must have been moving cards as well, as the cheapest in stock 7900XTX on pcpartpicker right now is $1099.99 :ROFLMAO:
Not that the markup ends up at AMD. The middlemen are having good times right now.

As far as the value judgements of reviewers are concerned.. they'll have to come to terms with the fact that they're indeed pretty much irrelevant.
In fact, they would do well to stop trying to will the market into what they think it should be. Compare the options that actually exist. I honestly could do without the drama - it's even worse than the attempts at comedy.
 
I have looked at the current listings of the 5070 Ti at some local retailers. Apparently the cheapest ones are priced at NVIDIA's MSRP (NT$26,990, which is what's listed on NVIDIA's Taiwan web page), but the OC ones and those with more bells and whistles go up to NT$35,990. This is a difference of NT$9,000, which is ~US$275. There are of course also some in the middle.
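A quick sanity check on the currency math above, assuming an exchange rate of roughly NT$32.7 per US dollar (the rate is an assumption and fluctuates):

```python
# Sanity-check the NT$ price spread quoted above.
# The exchange rate below is an assumption; it moves around daily.
NTD_PER_USD = 32.7

msrp_ntd = 26990      # cheapest 5070 Ti, at NVIDIA's Taiwan MSRP
top_sku_ntd = 35990   # most expensive OC/bells-and-whistles SKU

diff_ntd = top_sku_ntd - msrp_ntd   # spread in NT$
diff_usd = diff_ntd / NTD_PER_USD   # spread converted to US$

print(f"spread: NT${diff_ntd} ≈ US${diff_usd:.0f}")
```

At that assumed rate the NT$9,000 spread does indeed come out to about US$275, matching the figure in the post.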

There's no 7900XTX, but one 7900XT is available now at NT$30,990. This is comparable to the 5070 Ti's price but generally a bit slower in raster (although it has a bit more VRAM), making the 5070 Ti attractive in comparison.
 
The local retailer I mentioned sells the 5070 Ti in an online shop this time, and apparently all of them are sold out now, even the expensive NT$35,990 ones. I don't know how many cards they had, but I think it must be more than the 5080s they had last time.
 
Please explain in clear terms how the economies of chip complexity, PCB complexity, and power density are not linked to increases in GPU pricing. Be specific.

You need to be able to concisely describe why the extreme difficulty in creating these absolute monsters of GPUs is somehow a "bad faith" argument in how pricing is determined, because for me it's really damned clear-cut how radically complex systems, with billions of transistors, 12 or more layers of PCB, and multiple hundreds of watts of power consumption crammed into just a few square inches, can radically impact price.

Completely irrespective of your opinion on value, these complexities absolutely drive price.
Why are people like this? Like your whole argument is completely founded on a mischaracterization of my initial statement. This is the part you left out from my initial post:
People keep regurgitating stuff they hear on YouTube from people who absolutely don't know what they're talking about. It's true that GPU input costs have increased, but it's certainly not the primary driving factor in cost increases.
If you go back even further to my initial response to your first post in this thread, it reads:
The problem with this argument is that you'd actually have a point if there was a 1-to-1 relationship between the rise in input costs and the rise in GPU prices.
So the context of this discussion is very clear. The initial premise is that the cost increases in GPUs are far outstripping the rise in input costs. Your post attempts to mischaracterize my statement and premise into something far divorced from my intention, and frankly, I don't appreciate that at all.

The problem I have is that when discussing certain topics here, people make some seriously unfounded assertions. They provide absolutely no evidence to back up their claim, then they expect everyone to just accept that the assertion they made is true. Now, when I tabled my initial argument, I provided the groundwork for my belief by referencing Nvidia's balance sheet and financial statements. I compared them to other manufacturers who are subject to similar input costs, and pointed to that as the source of my claims. It's fine if you don't agree with my analysis and believe it's flawed. I'm perfectly OK with that, so long as you show your work. By that I mean: provide your analysis, rooted in evidence, that counteracts my claim. Then we can have a discussion about which approach is correct and how we can maybe refine our approaches.

Like I said, you cannot just present an argument with absolutely no evidence to support the line of thinking, then expect it to face no scrutiny. It just makes the discussion pointless, as it devolves into a contest of who can make the most unfounded claims.
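The "show your work" comparison described above can be sketched in a few lines. A minimal illustration, where every figure is invented purely for the example (the real numbers would come from the companies' published financial statements):

```python
# Hypothetical sketch: compare gross margins across manufacturers that face
# similar input costs. Every figure below is made up for illustration and
# does NOT reflect any company's actual financials.

def gross_margin(revenue: float, cost_of_revenue: float) -> float:
    """Gross margin as a fraction of revenue: (revenue - COGS) / revenue."""
    return (revenue - cost_of_revenue) / revenue

# Invented example figures, in billions of dollars.
companies = {
    "VendorA": (60.0, 16.0),  # hypothetical high-margin GPU vendor
    "VendorB": (23.0, 12.0),  # hypothetical competitor with similar input costs
}

for name, (revenue, cogs) in companies.items():
    print(f"{name}: gross margin {gross_margin(revenue, cogs):.1%}")
```

If two vendors buy from the same suppliers but one's gross margin is far higher, rising input costs alone can't be the whole pricing story; that's the shape of the argument being made, whatever the real numbers turn out to be.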
 
The initial premise is that the cost increases in GPUs are far outstripping the rise in input costs.

I’m sure this is true but without hard facts on those costs the best we can do is speculate. The burden of proof is kinda on you since you kicked off the debate with a bunch of assertions.

If I were to guess the primary driver of increasing prices is increased appetite from buyers to pay more. Hard to say why that is. Maybe all the price sensitive buyers are being priced out of PC gaming and overall discrete GPU shipments are lower as a result. I can’t find a reliable source for discrete shipping volume over time.
 
Why are people like this? Like your whole argument is completely founded on a mischaracterization of my initial statement. This is the part you left out from my initial post:
I literally quoted the part I was responding to. And despite your meandering retort, complexity of design and implementation always drives costs, full stop. There is no IT anything in this world where parts and power assembly complexity doesn't drive cost.

I don't need to play a thirty-seven-quotes game across nine posts. Your assertion was specifically what I quoted: Ok but, the problem is that people like to make bad faith arguments that go along the line of, "the economics of advanced nodes, increase in input prices, etc are leading to increases in prices for gpus"..... Albuquerque made a similar argument but, it's just false.

Either you can concisely describe to us why you say these are bad faith arguments, or you cannot.

Which, if you cannot, makes your own statement a bad faith argument.
 
I literally quoted the part I was responding to. And despite your meandering retort, complexity of design and implementation always drives costs, full stop. There is no IT anything in this world where parts and power assembly complexity doesn't drive cost.

I don't need to play a thirty-seven-quotes game across nine posts. Your assertion was specifically what I quoted: Ok but, the problem is that people like to make bad faith arguments that go along the line of, "the economics of advanced nodes, increase in input prices, etc are leading to increases in prices for gpus"..... Albuquerque made a similar argument but, it's just false.

Either you can concisely describe to us why you say these are bad faith arguments, or you cannot.

Which, if you cannot, makes your own statement a bad faith argument.
I guess this is the part where we all pretend like we have reading comprehension issues? Let me do my best to spell it out in very plain language.

It’s a bad faith argument in the context of the discussion because the discussion is not about whether GPU component price increases cause GPU prices to rise. It’s a discussion asking the question: are component prices the primary driving factor in price increases, or is there another factor acting as the primary force driving prices up? All of this is centred around the initial premise: “the cost increases in GPUs are far outstripping the rise in input costs.”

When you make an argument that goes along the lines of “increases in component prices increase GPU prices”, you’re essentially saying absolutely nothing in the context of the discussion. It’s generally accepted by all parties in the discussion that an increase in input prices will drive up costs. However, that is not what we’re discussing. If you had instead phrased your argument in the form “node, component, labour, and shipping costs have increased, here’s the percentage they’ve increased by, here’s the data showing that the rise in these costs is proportional to the rise in MSRP…”, then you’d actually be making a very well thought out, evidence-backed argument.

But you haven’t done that, have you? You’ve just speculated, provided no evidence to back your speculation, and then expected your statement to go unchallenged. It’s bad faith because you’re not actually contributing anything to the discussion. Instead, you’re derailing the discussion by arguing points that no one is arguing.
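The proportionality test being asked for can be made concrete. A rough sketch, where every number (the bill-of-materials costs and the MSRPs) is invented purely for illustration, since the real BOM figures aren't public:

```python
# Hypothetical check: did the MSRP rise roughly in proportion to input costs?
# All figures below are invented to illustrate the test, not real data.

def pct_increase(old: float, new: float) -> float:
    """Fractional increase from old to new."""
    return (new - old) / old

bom_old, bom_new = 300.0, 390.0      # assumed bill-of-materials cost, old vs new gen
msrp_old, msrp_new = 700.0, 1000.0   # assumed MSRPs, old vs new gen

cost_rise = pct_increase(bom_old, bom_new)     # 30% in this example
price_rise = pct_increase(msrp_old, msrp_new)  # ~42.9% in this example

# If price_rise far exceeds cost_rise, input costs alone can't explain the MSRP.
print(f"input costs up {cost_rise:.1%}, MSRP up {price_rise:.1%}")
```

With these made-up numbers the MSRP grew noticeably faster than the input costs; plugging in real figures is exactly the evidence the post is asking for before the "input costs did it" claim is accepted.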
 
Let's move on from the cost of physical goods bit going up. We all agree it has, and that it's a factor, and we hopefully all agree that it's next to impossible to judge just how much of one from the outside. Given the product and sales model it's just really hard to ever get that bit accurate from the outside, and even if you could, it'd just be accurate for that point in time most likely because it moves about a lot. Plus we argue about it too much and that makes me (genuinely) sad. There's literally zero need to argue or have negative feelings towards other human beings in any discussion about posh sand that draws pretty pictures.

Instead, how about some positive discussion about the other bits your money is paying for when you buy a GPU? The main thing there is even less tangible from a cost analysis or value point of view, but much easier to reason about from the outside, because it's a key part of the value judgement for a consumer, which we all are (even I buy GPUs): the value in the software.

Part of the money you hand over pays for that. It's not just the client 3D API drivers and whatever extra user experience software you get (snazzy control panels for settings and tweaking, streaming software, etc.) to run the GPU in Windows, which is the main bundle of make-it-run-games stuff most people care about, but also the engagement with game developers, tools, engine middleware development, and alternative platform support (I'm typing this on Linux, and very much value the great Linux driver support from my preferred GPU vendor, as an example).

I've spent some time on some (hilariously toxic for the most part, might need some therapy to get over it) tech publication Discord servers recently, taking the temperature of how their readers/viewers feel about the various recent GPU stuff going on, to get a different perspective than here on Beyond3D. Most folks interested in tech to the point they want to discuss GPUs in any detail, outside of playing games on them, also care a lot about what the software gets them.

For people leaning green, that means things like considering what set of DLSS tech is supported and whether the new DLSS has unique benefits on the new products. Whether Reflex works well. Whether their favourite games are going to get integrations of the bespoke ecosystem things Nvidia provides. Those kinds of things are factored quite a bit into real-world assessments of gaming GPU product value by people spending money on them, and are directly paid for by the money you hand over.

You don't just buy the physical good, and part of convincing a customer to hand over money is the aggregate value of the other stuff that enables the posh sand to draw its prettiest pictures. Maybe we can have a go at that part of the value judgement.
 
Moving production to the US will make things even more unaffordable. Higher costs from materials, infrastructure, overhead, and employment aren’t going to be just absorbed by the vendors.
 
Anti-lag + exists and is functionally equivalent to reflex
Anti-Lag+ is not the same as Reflex. It delivers smaller latency reductions and works from the driver, and thus has limited scope; Reflex works from inside the game. Anti-Lag 2 is the equivalent of Reflex, but it's only available in ~3 games vs hundreds with Reflex.
Ray reconstruction is also quite unfinished with numerous well documented problems
The DLSS4 TNN solved 99% of them. This has been well documented across multiple video comparisons and media outlets. In Spider-Man 2, playing the game without RR means losing a lot of the ray tracing image quality. If you are not interested in upscaling, then DLAA and DLDSR will be right up your alley; they are unmatched to this day by any other vendor.
 
If I were to guess the primary driver of increasing prices is increased appetite from buyers to pay more. Hard to say why that is. Maybe all the price sensitive buyers are being priced out of PC gaming and overall discrete GPU shipments are lower as a result. I can’t find a reliable source for discrete shipping volume over time.

PC gaming is growing in popularity, but discrete GPU shipments, I believe, peaked back around 2010 and have trended downwards in unit volumes ever since, with the exception of 2020 and COVID/mining.

There's this general issue in conflating PC gaming enthusiasts with PC gaming hardware enthusiasts. There is of course some overlap but I feel it's much smaller than people in the hardware circles might think.

Just look at this debate: judging the value and potential purchase of new hardware primarily based on how it compares to older hardware is a very hardware-enthusiast-skewed mindset. But for PC gamers as a whole, from a functional standpoint, the hardware and the software (as in the games) are lasting longer than ever. It's not just the hardware that's "good enough" for longer periods but the games themselves. There might be better looking games, but most mainstream games are visually functionally competent and not ugly anymore.

Just look at Kingdom Come 2, for instance: is it flawed visually? Sure it is, and it could be better, even much better. At the same time, for the vast majority of people it's going to look good, much less have the visuals detract from the game experience. We're far past the point at which characters are rendered as blocky figures with hooves for hands, relying on the user to just infer that they are hands.
 
Just look at this debate: judging the value and potential purchase of new hardware primarily based on how it compares to older hardware is a very hardware-enthusiast-skewed mindset.

Completely agree. According to popular opinion the 4060 is a terrible card because it’s not faster than the 3060 Ti. However in my experience it’s a fantastic card because it perfectly fits my living room gaming setup in a tiny case sitting below my TV. Plays older games amazingly well and is cool and quiet. Maybe my use case isn’t common but all of the hate is irrelevant to me as an actual owner.

It’s now the 2nd most popular card on the Steam survey after the 3060. Online discourse seems completely out of touch with how people actually make purchase decisions.
 
You don't just buy the physical good, and part of convincing a customer to hand over money is the aggregate value of the other stuff that enables the posh sand to draw its prettiest pictures. Maybe we can have a go at that part of the value judgement.
Along the lines of the last several replies above mine, this quote is likely the single largest feather in NVIDIA's cap today. There will be purists who decry upscaling as somehow impure; unfortunately for those true believers, upscaling will continue to be the future, as our ability to throw ever more transistors at the problem is quickly finding its end. We are moving quickly to the turning point in compute technology where our relentless desire for increased visual fidelity will no longer be solvable with more transistors. The software will become the differentiator.

Sure, hardware will evolve in different directions. Different ways of accelerating work, which will depend on different optimizations or outright shortcuts, will emerge and evolve. But it's not going to be the same old rasterization trick of throwing ever more transistors at the problem. DLSS is here to stay for a long time, and as evidenced by the most recent transformer models in DLSS4, it's already resulting in better-than-rasterized visuals with ultimately a lower hardware requirement to achieve such results.

AMD's attempts seem to be lackluster at best IMO. Yeah, FSR is better than nothing, but it's still seemingly years away (again, IMO) from the Transformer models in DLSS4. AMD has to figure out how to get better in this space, because NVIDIA is eating their breakfast, lunch, and dinner with all the software refinements they've been belting out.

And these software refinements are very much part of the sale price, too.
 
I just checked one of my distributors (Ingram Micro), and the availability/pricing situation is no different than anywhere else. No surprises there. I do notice that even on IM, the MSRP of all these cards (when it is listed) is 20-50% higher than what NVIDIA advertised. Not the actual selling price; the MSRP is listed in a separate field.
 