Nvidia GeForce RTX 50-series product value

I'm seeing 5080s costing more than my 4090 did ($2,750 AUD). The dollar was around 64 US cents at the time and it's still around that, so the exchange rate can't be the reason. There's a Zotac 5090 on PLE for $6,600 AUD, and keep in mind Zotac is a budget brand and the cooler is a pretty budget one too. So uh, yeah, that's quite an interesting price lol.

Limit 1 per household :nope:
 
Is this the worst GPU launch in a long time? It feels like it to me.

-Dropped support for 32-bit PhysX
-Missing ROPs
-The worst gen-on-gen performance increase in a long time
-Fake MSRPs
-Bad availability at launch
-Power connector design flaws
-Reports of GPUs going up in flames
-Bad performance-per-dollar improvements gen on gen

I feel like there are quite a few I’m missing.
Users feel sad, but this also affects scalpers, who are going to feel even more sad.
 
There are a bunch of 5070Ti listed on Newegg for MSRP ($749.99). All out of stock of course but I wonder what that's about.

There's also a couple 5080 listed for MSRP and even one 5090.

Regarding the defective ROPs, at least it's likely a bunch of those cards went to filthy scalpers. May their time be wasted in RMA hell.
 
There are a bunch of 5070Ti listed on Newegg for MSRP ($749.99). All out of stock of course but I wonder what that's about.

There's also a couple 5080 listed for MSRP and even one 5090.
A month has passed, availability should start to get better now. Like it always does.
 
A month has passed, availability should start to get better now. Like it always does.

The guys blowing their tops on YouTube claim this isn’t usual launch scarcity but that volumes are actually quite low. Is that true or are they just stirring up discontent for clicks? No way to tell right now.
 
A month has passed, availability should start to get better now. Like it always does.
Is it normal for the list prices to be inflated at launch? I genuinely don't recall this being a thing but it's very hard to look up.
 
Is it normal for the list prices to be inflated at launch? I genuinely don't recall this being a thing but it's very hard to look up.
Well, it depends on the launch, of course. There were a couple recently where the market was flooded with excess stock left over from a mining crash that happened a month before, and during those the prices were generally at MSRP because there was no scarcity of GPUs. But outside of those? Yeah, new cards are always sold out, old cards get EOLed prior to launch, and demand pushes market prices up until volume shipments start, at which point demand gets filled and prices fall to where they should be. This launch cycle is usually 2-3 months I'd say, and we're only 1 month in.

Also, I think it's funny that people want to buy a new GPU right at the start of a generation, considering that a generation lasts a couple of years and almost every generation launch over the last ~10 years was marred by issues that got solved within a couple of months after launch. Something about this *need* to be a guinea pig is just fun to watch.
 
Is it normal for the list prices to be inflated at launch? I genuinely don't recall this being a thing but it's very hard to look up.

No, it’s not normal at all. Scalping is normal, but not markups like this from first parties or big retailers.

Given scalping is inevitable without some fair and balanced voucher system, I’m all for retailers and manufacturers doing the scalping instead of some loser who doesn’t have a real job.
 
Is it normal for the list prices to be inflated at launch? I genuinely don't recall this being a thing but it's very hard to look up.

GPU launches have changed quite a bit compared to the past, both on the supply side and the demand side, as well as in the surrounding discourse/coverage. We've also had some extenuating circumstances greatly affect the perception of launches.

Something to remember is that further in the past, all launch cards were basically built to a single reference design but rebadged for each AiB. This was the case up to the 1xxx series for Nvidia. Custom AiB cards did not follow on the same date, and it wasn't out of the ordinary for them to be priced higher, since they were sold as having better heatsinks (reference launch cards were single-fan blowers) and more overclocking capability.

With Turing/2xxx Nvidia moved to the FE as the initial launch model, but back then (time flies) the FE model MSRP was specifically above the normal MSRP (+$200 for the 2080 Ti, +$100 for the rest). So the initial expectation was that it would be more expensive at launch, and there was actually some online commentary initially that there would never be MSRP models from AiBs.

With Ampere/3xxx Nvidia moved the FE model to MSRP, but timing-wise it came during the pandemic demand/supply issues and then had the mining boom start right at launch. Meanwhile Ada/4xxx came off the inventory glut from Covid/mining. We basically haven't had any market stability for the last two launch cycles due to extenuating factors. Not that things are any better with Blackwell/5xxx, as we have both AI and trade/geopolitical factors overhanging the market as well.

Also, we've seen the growth of social media and the accessibility of those communities compared to the past, which is likely also changing the general demand dynamics and perception of these launches. Even with Pascal, I'd wager the enthusiast demographic was smaller and still primarily on forums like this consuming text media. Now the enthusiast demographic is much larger, with discussion and impressions spread over Reddit (e.g. r/nvidia has grown from ~15k members in 2016 to ~1.5m today), Twitter, etc., and with discourse primarily driven by YouTube/influencers. The concept of leaks and secrecy in the leadup, as you can imagine, is also different nowadays.

Another thing to keep in mind is that over the years inventory management for goods in general has changed. Companies these days want to avoid inventory issues and having to clear stock.
 
With Turing/2xxx Nvidia moved to the FE as the initial launch model, but back then (time flies) the FE model MSRP was specifically above the normal MSRP (+$200 for the 2080 Ti, +$100 for the rest).
That was with Pascal actually; with Turing (and later), FE prices were the same as the MSRPs.
 
That was with Pascal actually; with Turing (and later), FE prices were the same as the MSRPs.

Turing had the higher FE prices -


What I misremembered was that the FE branding (and higher-than-MSRP pricing) started with Pascal -


The difference with those gens was that launch cards moved away from AiB branding. E.g. you had AiB GTX 1080s -

But that stopped with Turing.

Ampere though did move the FE MSRP to be uniform -


As an aside, am I dating myself by still using Anandtech links?
 
Out of interest I went back and looked at some of the sentiment from the time of the GTX 1080, as Pascal/1xxx is, I think, generally remembered as the last great-leap generation.


While the graphics card’s starting MSRP is ostensibly $600, the only versions available to purchase online on day one were of the pricier $700 Founders Edition model,

Making matters worse, by 9:09 a.m. Eastern, Nvidia’s stock of Founders Edition cards—the only model Nvidia guaranteed to be available on day one—were already sold out. Founders Edition cards peddled by Nvidia partners like EVGA, Asus, and Zotac were also showed as sold out immediately at Newegg and Best Buy. That means that either demand was sky-high, or supplies of these first-ever 16nm FinFET GPUs were extremely limited, or a mixture of the two.

It feels like this launch may have been a wee bit rushed so Nvidia could beat Radeon cards based on AMD’s new 14nm FinFET “Polaris” GPU to market. AMD may announce Polaris-based graphics cards during a Computex livestream scheduled for May 31.


Seriously 2 weeks and all cards are still out of stock or way above MSRP... EVGA representative confirmed on ocuk forum they get very low amount of GPU's from nVidia, not to mention GDDR5X memory which just moved to mass production. The founder edition fee looks like an early adopter tax for me...

Things were so great and different back in the good old days :ROFLMAO:
 
My own experience: when I got my 3080 Ti, I had to buy it in a pre-built; there was almost no one selling separate cards. The price was not good (I added up the price of each component of the entire system and it was probably over by more than US$300), but not too bad either. For the 4090 I was able to get one (just the card) at a somewhat reasonable price (it's a Gigabyte OC card, about US$400 over the FE price, but directly from a retailer, not at scalper prices). So I guess, at least for me, a US$300-400 markup was common for the last two generations.

This time it's indeed a bit different, mainly in that the 5090 is almost nowhere to be found. I'm currently not interested in getting one because I already have a 4090, but one of my friends wants a 5080 (he uses a 3070 now) and right now they are only available in pre-built systems (and with very limited availability). The price markup, on the other hand, does not look as bad this time.
 
Someone went ahead and tested old PhysX games with 5080 + 1050Ti (PhysX).

Game                 PhysX processor    High   Low
Batman AA            CPU (5800X3D)       45     21
Batman AA            1050 Ti             162    150
Borderlands 2        CPU (5800X3D)       162     25
Borderlands 2        1050 Ti             162    124
Mirror's Edge        CPU (5800X3D)       162     14
Mirror's Edge        1050 Ti             162    157
Metro 2033           CPU (5800X3D)       314     13
Metro 2033           1050 Ti             260     35
Batman AK (64-bit)   5080                162     98
Batman AK (64-bit)   1050 Ti             162     50
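
For a quick sense of scale, here's a tiny sketch (just the low-FPS figures copied from the table above, purely illustrative) of how much the dedicated PhysX card lifts the minimums in the 32-bit titles:

```python
# Low-FPS figures from the table above: CPU PhysX vs dedicated 1050 Ti.
lows = {
    "Batman AA":     {"cpu": 21, "card": 150},
    "Borderlands 2": {"cpu": 25, "card": 124},
    "Mirror's Edge": {"cpu": 14, "card": 157},
    "Metro 2033":    {"cpu": 13, "card": 35},
}

for game, fps in lows.items():
    speedup = fps["card"] / fps["cpu"]
    print(f"{game:14} lows: {fps['cpu']:>3} -> {fps['card']:>3} FPS ({speedup:.1f}x)")
```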

 
I think we have enough info to make some assumptions and predictions on the 5070 vs 9070 matchup.

The 9070 XT is 16% faster than the 9070 without RT based on AMD's numbers. Compared to the 9070 it has the same bandwidth, 18% more fillrate, 35% more flops and 38% more power, so it appears to be somewhat bandwidth limited. The power scaling also isn't great; the 9070 should have better perf/watt.

The 5070 Ti is about 45% faster than the 7900 GRE without RT, so that puts it about 5% faster than the 9070 XT (1.45 / 1.37 ≈ 1.06) based on AMD's own GRE comparison, where the 9070 XT is 37% faster than the GRE. There's some error here as TPU didn't test the same games AMD did, so let's call it a tie for now. That puts the 5070 Ti around 15-20% faster than the 9070.

The 5070 Ti has a 20% higher TDP than the 5070 (300 W vs 250 W), so with perfect scaling the Ti would be 20% faster, putting the 5070 right in line with the 9070. However, Nvidia's bigger N4 chips seem to be more power efficient than the smaller ones by around 10% per tier, looking at the 4080 vs 4070 vs 4060. Applying the same factor to GB203 vs GB205, the 5070 could land around 25% slower than the 5070 Ti.

All in all that puts the 9070 around 5-10% faster than the 5070 in raster. The 5070 would win this one easily if it weren't for that pesky 12 GB framebuffer.
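
To make the chain of percentages easier to follow, here's a quick back-of-the-envelope script. Every number in it is just the rough estimate from this post (the "tie" and the "~25% slower" steps included), so treat it as illustration rather than a prediction:

```python
# Back-of-the-envelope sketch of the chained estimates above.
# All inputs are the rough figures quoted in this post, not measured data.

xt_over_9070 = 1.16   # 9070 XT ~16% faster than the 9070 (AMD numbers)
ti_over_gre  = 1.45   # 5070 Ti ~45% faster than the 7900 GRE
xt_over_gre  = 1.37   # 9070 XT ~37% faster than the GRE (AMD's comparison)

ti_over_xt   = ti_over_gre / xt_over_gre   # ~1.06, treated as a tie
ti_over_9070 = xt_over_9070                # with the tie assumption, ~1.16

# TDP step: 300 W vs 250 W is +20% with perfect scaling; the bigger chip's
# better efficiency stretches the 5070 Ti's lead to roughly 25%.
ti_over_5070 = 1.25

r9070_over_5070 = ti_over_5070 / ti_over_9070   # ~1.08

print(f"5070 Ti vs 9070 XT: {ti_over_xt:.2f}x (call it a tie)")
print(f"5070 Ti vs 9070:    {ti_over_9070:.2f}x")
print(f"9070 vs 5070:       {r9070_over_5070:.2f}x")
```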
 