Nvidia GeForce RTX 4070 TI Reviews

Found this chart in the Gamers Nexus review interesting, especially the GTX780 vs RTX3080.

Looking at it I think it's glaringly obvious that Nvidia are taking the piss with the RTX4080.
 

Attachment: performance-matchup.png (583.5 KB)
While the example is for the 4090, this discussion is not really about it, is it? Especially because the 4090 did not see as much of a price increase as the 4080 and 4070 did. Plus those chips are smaller, so they can get more per wafer and probably higher yields too. So if an AD104 is 295 mm² and they can get 1.5x more per wafer (134) with a yield of, say, 90%, so 120 good dies, the cost per good die would be about $141.

$1,600 for a $288 die (5.5x) vs $799 for a $141 die (5.6x). What does this tell us? That Nvidia is trying to extract as much margin from lower-end parts as from the top halo part. From a lower-end part that traditionally sold more, so economies of scale apply, where margins are usually lower and high profits come from sheer volume. I'm sorry, but I don't buy that this is all just because of high wafer costs. Unless there is so much competition at the foundry that they cannot get as many wafers as they would like, so they can't scale production accordingly? Or worse, Ada has bad yields, although we didn't hear anything about that.
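The per-die arithmetic above can be sketched in a few lines. The ~$17,000 wafer price is an assumption (rumored TSMC "4N" pricing, not a published figure); the 134 candidate dies and 90% yield are the numbers from the post:

```python
# Sketch of the cost-per-good-die estimate. Wafer price is a rumor,
# not a published figure; die count and yield are from the post above.
def cost_per_good_die(wafer_cost, dies_per_wafer, yield_rate):
    good_dies = int(dies_per_wafer * yield_rate)  # whole dies only
    return wafer_cost / good_dies

ad104_cost = cost_per_good_die(17_000, 134, 0.90)  # ~$141 per good die
```

With those inputs you get roughly the $141 figure used in the margin comparison, which is the whole point: the conclusion is only as good as the assumed wafer price.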

Remember that NVIDIA does not sell the cards directly; they sell the GPU chips. Most of a card's cost is probably not the GPU but other parts (such as the memory chips), the PCB, heat sinks, assembly, etc. If, let's say, we use your number of $141 as the cost of a "4070 Ti" chip, then the room for reducing the price of the 4070 Ti is even smaller, probably no more than $100.
 
Remember that NVIDIA does not sell the cards directly; they sell the GPU chips. Most of a card's cost is probably not the GPU but other parts (such as the memory chips), the PCB, heat sinks, assembly, etc. If, let's say, we use your number of $141 as the cost of a "4070 Ti" chip, then the room for reducing the price of the 4070 Ti is even smaller, probably no more than $100.

In that case how do you explain GA104 selling for 300 less?? Did GA104 have a negative cost lol?? That's why chip costs simply cannot justify the huge price hike!
 
I'm sorry, but I don't buy that this is all just because of high wafer costs.

I don’t buy it either. Wafer costs and die sizes likely don’t have much to do with Ada retail pricing. Yes, cost per transistor is no longer scaling, but that’s just one factor.

The only incentive to lower prices is to increase sales and given how soft the PC market is right now there’s no guarantee volumes would increase significantly if prices were lower. It’ll be interesting to see how the 4070 Ti does at ~$850 and whether scalpers will see any joy. By all indications the 4090 was priced too low and scalpers are still in the game.

With all this talk of die sizes and gross margins let’s not forget there are salaries and other operating expenses to be paid. Nvidia’s net income last quarter was in the dirt.
 
In that case how do you explain GA104 selling for 300 less?? Did GA104 have a negative cost lol?? That's why chip costs simply cannot justify the huge price hike!

??
GA104 chips are likely much cheaper. There's no public record of how much NVIDIA pays Samsung per wafer, of course, but the rumored wafer price of TSMC 7nm was about $9,000, and it can be reasonably assumed that Samsung charged quite a bit less for their 8nm wafers. The cards also use slower memory chips, which are cheaper, and the PCB routing is simpler due to the lower frequencies.
 
??
GA104 chips are likely much cheaper. There's no public record of how much NVIDIA pays Samsung per wafer, of course, but the rumored wafer price of TSMC 7nm was about $9,000, and it can be reasonably assumed that Samsung charged quite a bit less for their 8nm wafers. The cards also use slower memory chips, which are cheaper, and the PCB routing is simpler due to the lower frequencies.
141 - 300 = -159.
The GA104 cards' MSRP was around $500 depending on currency, which is $300 less than the AD104 cards'. If the wafer cost were the reason, then the cost at Samsung would have been negative!
 
This is literally the only place on the internet where I'm seeing anybody trying to defend these prices and suggest they aren't absolutely insane.
I don't think anyone is defending the prices. There is disagreement with regard to the source of the price increases.

E.g., the Tesla Model 3's price rose from $35,000 to over $46,000 in a three-year period. Was the reason for the increase purely profit?
 
More like people trying to understand or explain why they are. I much prefer it to the usual reddit/forum discourse where it's "fuck these greedy assholes"
The only reason people are looking away from that more obvious explanation is that they are trying to defend them, though.

Yes, there are more cost pressures at the moment, and nobody was suggesting that we should have had the same prices as last generation. But the leaps we're seeing here are not remotely justifiable. A sub-300 mm² GPU for $800? :/ You're never gonna find the 'explanation' for that sort of absolutely insane price increase if you take away the option of 'greedy assholes'. It's like trying to have a discussion about what 2+2 is, but you're not allowed to answer 4.
 
If the area of a 300 mm wafer is 70,686 mm² and the die of an RTX 4080 is 379 mm², that's what, 186 RTX 4080 dies per wafer?

Factor in that you don't get dies at the edges of the wafer due to its shape, so let's say you get 100 working dies per wafer.

At the rumoured/estimated price of ~$18,000 per wafer, that's $180 per die.

So what's bumping the costs up? Is the charge by TSMC to actually create the chips that expensive?

I admit my understanding on making these things is somewhat limited but while costs are going up they shouldn't be going up that much.
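The "100 working dies" guess above can be tightened with the usual dies-per-wafer approximation (area quotient minus an edge-loss term). The 379 mm² die and ~$18,000 wafer price are taken from the post; the 85% yield is my assumption:

```python
import math

# Dies-per-wafer estimate with the standard edge-loss correction,
# instead of a flat guess. Yield figure below is an assumption.
def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    d, a = wafer_diameter_mm, die_area_mm2
    gross = math.pi * (d / 2) ** 2 / a          # area quotient (~186)
    edge_loss = math.pi * d / math.sqrt(2 * a)  # candidates lost at the rim
    return int(gross - edge_loss)

gross_dies = dies_per_wafer(300, 379)        # ~152 candidate dies
cost_per_die = 18_000 / (gross_dies * 0.85)  # ~$139 per good die
```

So even with edge loss and a defect-yield haircut, the raw silicon comes out in the same ~$140-180 ballpark the post lands on, which is still a small fraction of a $1,200 card.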
 
So what's bumping the costs up? Is the charge by TSMC to actually create the chips that expensive?

I admit my understanding on making these things is somewhat limited but while costs are going up they shouldn't be going up that much.

Costs are up both from supplier costs (not limited to TSMC; those suppliers may also be under pressure from their own suppliers) and because design complexity is way up as well. However, note that not all costs are necessarily up: I would not be surprised if memory costs are actually down, not to mention freight costs are now falling considerably, especially compared to the pandemic peak.

Also, costs for these products have a large fixed component and involve indirect costs that are not intrinsically part of the physical product. For example, DLSS 3 surely cost money to develop and costs money to maintain and update on an ongoing basis; that essentially needs to be accounted for with higher margins. This also means the cost of each individual product is not entirely clear, e.g. how much of the fixed costs get ascribed to the RTX 4090 vs the 4070 Ti?

An important thing to note here is that cost is not the same as price. I feel some people are stuck on the idea that all businesses basically operate like "mom and pop" restaurants/grocery stores/etc., where pricing does work on something akin to a simplistic cost-plus model.
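The fixed-cost point can be made concrete: even under a naive cost-plus view, per-unit cost swings a lot depending on how much fixed cost (R&D for features like DLSS 3, drivers, support) is amortized over how many units. Every number below is made up purely for illustration:

```python
# Illustration only: how amortized fixed costs change per-unit cost.
# All figures are invented; nothing here is Nvidia's actual cost data.
def unit_cost(variable_cost, fixed_cost, units_sold):
    return variable_cost + fixed_cost / units_sold

high_volume = unit_cost(350, 100_000_000, 2_000_000)  # 400.0 per unit
low_volume  = unit_cost(350, 100_000_000, 500_000)    # 550.0 per unit
```

Which also shows why "how much does a 4070 Ti cost to make" has no single answer: the allocation of shared fixed costs across the product stack is a bookkeeping choice, not a physical fact.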
 
Found this chart in the Gamers Nexus review interesting, especially the GTX780 vs RTX3080.

Looking at it I think it's glaringly obvious that Nvidia are taking the piss with the RTX4080.

This whole gen reminds me of the 600 series, tbh. The controversial aspect of that gen was that the 680 had a narrower bus, fewer memory chips, and was much smaller than the 580 it replaced, yet cost the same. It relied on process and architecture efficiency and clock speed. My main concern with the 680 was the 2GB VRAM, matching what AMD offered for their high-end parts the previous gen (hey, familiar again). Then there was of course the original Titan, which was a great product but controversial for its $1,000 price.

They've done something really similar this time, but have doubled down and raised the prices significantly for each tier except the top. Not a great move.

To me the 4070 Ti should be a 4070 at $600. The 4080 should stay what it is but be no more than $900.

Later, introduce a cut-down AD103 as a 4070 Ti for $700 and a cut-down AD102 as a 4080 Ti for $1200.

Those are still price rises from the previous gen, but they aren't insulting, and it increases the VRAM amount further down the stack, which is always nice for peace of mind regarding longevity.
 
Fab cost is obviously going up, and it has been for quite a few years, but I guess the real implications have only shown up recently.
For example, in 2010 one TSMC 'GigaFab' (Fab 15) cost about ~$10B to build. A GigaFab has a capacity of about 100K wafers/month.
In 2018, the newest GigaFab, 'Fab 18', reportedly cost ~$20B to build (Fab 18 is currently partially online and producing 3nm and 5nm wafers).
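Back-of-the-envelope, that capex alone puts a floor under wafer prices. The ~$20B cost and ~100K wafers/month capacity are the figures from the post; the 5-year depreciation window below is my assumption:

```python
# Rough capex-per-wafer amortization. Fab cost and capacity are from
# the post; the 5-year depreciation period is an assumption.
fab_cost = 20e9
wafer_starts = 100_000 * 12 * 5            # wafers over five years
capex_per_wafer = fab_cost / wafer_starts  # ~$3,333 per wafer
```

That's ~$3,300 per wafer before materials, labor, R&D, or margin, so a doubling of fab cost between Fab 15 and Fab 18 does flow through to wafer prices, even if it doesn't explain the whole retail increase.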
 
An important thing to note here is that cost is not the same as price.
I don't know why it's so hard for some people to understand that the GPU business is, well, a business, and the general idea of a business is to make money.
For all the talk about "sky-high" margins, there's also an elephant hidden in this room, called "net income".
Unless we're seeing Nvidia making billions in profit off these new GPUs, I don't see how their margins being at any level is an issue.

I'm just a bit tired of reading from various people on Reddit how they know better how to run Nvidia, or how Nvidia is killing PC gaming, a market which Nvidia very much created and continues to profit greatly from. The general expectation here is that Nvidia wants to keep making profit from this market and is thus actually spending a lot of money on figuring out how to provide the best products possible at the best prices possible. The rest is just uneducated noise to me until I see a product at the 4080's or 4070 Ti's performance/feature level for whatever mythical price a random redditor thinks these should be selling at.
 
An important thing to note here is that cost is not the same as price. I feel some people are stuck on the idea that all businesses basically operate like "mom and pop" restaurants/grocery stores/etc., where pricing does work on something akin to a simplistic cost-plus model.

If their mental model is cost+ they’re not even factoring in all the costs.
 
Well, I got one, and for MSRP at that. But I think I was lucky to manage it, so if the apparent demand, at least at MSRP, is anything to go by, it might end up selling reasonably well.

The MSRP models sold out in literally seconds at the two big retailers I was on (Scan.co.uk and Overclockers UK). Now the cheapest available are at least £50 over the MSRP.

I'm taking £=$ for the MSRP as the exchange rate almost exactly matches our VAT addition.
 