Digital Foundry Article Technical Discussion [2023]

With this card's specs, it should be $399, or at worst $449.99. $600 is a scam, and IMO anyone who buys this card needs to accept that they'll most likely need to upgrade next gen. Apart from the 4090, the Ada lineup has been rather disappointing in price to performance.
As much as I would prefer that price ($399), I am not sure NV or AMD can do that with inflation and the increase in chip manufacturing costs. But I think they could healthily knock $100 off the price and not notice a damn thing.

I don't know if they're selling that well, or if Nvidia is just making much better margins now so they don't care, or if they're just holding on in a sort of game of chicken with PC gamers, hoping we eventually relent.

But yes, if people don't buy, Nvidia will be forced to reconsider their pricing strategy. And they undoubtedly have plenty of room to do so. It's all up to us.
From what I know they are selling well, but a difference from the last-gen Ampere release is that there is a healthy amount of stock always being replenished to keep up with demand. So you really do not see availability issues at the retail portals (unlike with the RTX 3000 series). That makes gauging popularity nearly impossible without asking the retailers directly.
 
As much as I would prefer that price ($399), I am not sure NV or AMD can do that with inflation and the increase in chip manufacturing costs. But I think they could healthily knock $100 off the price and not notice a damn thing.


From what I know they are selling well, but a difference from the last-gen Ampere release is that there is a healthy amount of stock always being replenished to keep up with demand. So you really do not see availability issues at the retail portals (unlike with the RTX 3000 series). That makes gauging popularity nearly impossible without asking the retailers directly.
Inflation is the lie that has been sold to us. If you look at Nvidia's financial statements, they state otherwise. Since 2011 Nvidia has grown its gross margin to as high as 67%. It has now returned to 57% due to a poor Q4. Apple, on the other hand, one of the companies thought to be the most greedy, had a gross margin of 44% at its worst and currently sits at 43%. Apple uses a more advanced node than Nvidia and yet manages to keep its prices in check. What is Nvidia doing that it deserves an extra 14 points over Apple, and at its worst, an extra 24? There's a reason most companies don't like working with them.
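For reference, gross margin is just (revenue − cost of revenue) / revenue. A minimal sketch of the arithmetic behind the comparison, where the dollar figures are placeholders and only the margin percentages quoted above come from the discussion:

```python
def gross_margin(revenue: float, cogs: float) -> float:
    """Gross margin = (revenue - cost of goods sold) / revenue."""
    return (revenue - cogs) / revenue

# Placeholder figures: a $600 card carrying $258 in costs gives the
# ~57% margin cited above; Apple's ~43% on the same price would
# imply ~$342 in costs.
print(f"{gross_margin(600.0, 258.0):.0%}")  # -> 57%
print(f"{gross_margin(600.0, 342.0):.0%}")  # -> 43%
# The gaps cited above: 57 - 43 = 14 points, 67 - 43 = 24 points.
```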

 
Inflation is the lie that has been sold to us. If you look at Nvidia's financial statements, they state otherwise. Since 2011 Nvidia has grown its gross margin to as high as 67%. It has now returned to 57% due to a poor Q4. Apple, on the other hand, one of the companies thought to be the most greedy, had a gross margin of 44% at its worst and currently sits at 43%. Apple uses a more advanced node than Nvidia and yet manages to keep its prices in check. What is Nvidia doing that it deserves an extra 14 points over Apple, and at its worst, an extra 24? There's a reason most companies don't like working with them.

How is the gross margin split across the divisions: automotive, AI, datacentre, consumer gaming?
 
It's not like AD103 is somehow a higher tier than typical 104 parts; it's basically "reduced" from the top 102 die as much as 104 parts usually are, with a 256-bit bus and everything.

AD103 is a bigger chop than 104 parts have been in the past. A 104 is usually 2/3 of the big die; AD103 is just over 1/2 of AD102. It's about the same chop as Ampere's GA104 vs GA102, with one key difference.

Using the second-biggest chip for the x80 when that chip is 2/3 the size of the flagship is OK (2080, 1080). Using the biggest chip for the x80 when the next chip down the line is only 1/2 the size is also OK (3080). With Ada, the second-biggest chip is half the size and Nvidia used it for the x80 series. It's that combination that makes no sense.
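One rough way to sanity-check the "chop" claim is to compare publicly reported CUDA core counts for each die (a proxy for the cut-down; die areas tell a broadly similar story):

```python
# (small die, big die) CUDA core counts as commonly reported.
pairs = {
    "GP104 vs GP102 (Pascal, 1080)": (2560, 3840),
    "TU104 vs TU102 (Turing, 2080)": (3072, 4608),
    "GA104 vs GA102 (Ampere, 3070)": (6144, 10752),
    "AD103 vs AD102 (Ada, 4080)":    (10240, 18432),
    "AD104 vs AD102 (Ada, 4070)":    (7680, 18432),
}
for name, (small, big) in pairs.items():
    print(f"{name}: {small / big:.2f} of the big die")
# Pascal/Turing 104s land at ~0.67 (the usual 2/3 chop); AD103 is
# ~0.56 of AD102, roughly the same chop as GA104 vs GA102 (~0.57).
```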
 
How is the gross margin split across the divisions: automotive, AI, datacentre, consumer gaming?
From what I remember when I reviewed their 10-K, it's hard to say. They don't really tell us, as you can see from the attachments below; they only provide a revenue split.

[Attachments: Nvidia Financial Statements 2023.png, Nvidia Financial Statements 2023 Revenue split.png]
 
And frame generation is so horribly bad at the moment that it's not even something I'd consider using.

That's really not true at all. It can work extremely well. It's basically the difference between playable and unplayable on The Witcher 3 and Cyberpunk for me at my preferred settings.

In about two years, I feel we'll be talking about 12GB of VRAM the way we're talking about 8GB now.

12GB will be completely fine outside of the most extreme corner cases for the rest of this console generation.
 
12GB will be completely fine outside of the most extreme corner cases for the rest of this console generation.
Maybe, but it's not a bet I'd take. If the whole point of paying the Nvidia tax is to use the premium features, then for me 12GB is not enough. Frame generation requires more VRAM, and ray tracing requires more VRAM. If console ports on PC end up using 10-12GB of VRAM without ray tracing, then you're SOL with 12GB if you're interested in RT and frame gen. Again, $600 for 12GB of VRAM is unacceptable.
 
Inflation is the lie that has been sold to us.
It's not a lie that TSMC 5nm is significantly more expensive than Samsung 8nm. We should also expect GDDR6X to be more expensive than GDDR6, as it's a proprietary technology made only by Micron.

Edit: And see Microsoft's statements about cost per transistor no longer going down significantly with new processes; hence they had to launch the Series S up front, because they wouldn't be able to cost-reduce the Series X enough.
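As a back-of-envelope illustration of why the wafer price dominates: both wafer costs below are assumptions for illustration only (neither TSMC nor Samsung publishes pricing), and the yield model is deliberately crude.

```python
import math

def dies_per_wafer(die_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Crude estimate: wafer area / die area, ignoring edge loss and yield."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area / die_mm2)

die_mm2 = 295.0  # roughly AD104-class die size
for node, wafer_cost in [("Samsung 8nm (assumed $5k/wafer)", 5_000),
                         ("TSMC 5nm (assumed $16k/wafer)", 16_000)]:
    n = dies_per_wafer(die_mm2)
    print(f"{node}: ~{n} candidate dies, ~${wafer_cost / n:.0f} each")
# A ~3x wafer price increase for a die of the same size swamps modest
# density gains, which is the cost-per-transistor point above.
```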
 
For me, the GPU load already increased enormously with ray tracing when jungle-like trees with many leaves were nearby. Maybe SER helps more in those places?
 
The lack of a significant uplift for the 4070 over the 3080, despite the use of SER in CP2077's Overdrive mode, is interesting. I'd expected the game to be much faster on Ada.

SER addresses a specific problem when running RT, but I wonder if another part of the RT pipeline is lagging compared to the 3080, which would reduce how effective SER is.
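For anyone unfamiliar, the specific problem SER (Shader Execution Reordering) targets is divergence: rays in the same warp hitting different materials, forcing the warp to run every hit shader serially. A toy illustration of the idea, with made-up shader IDs and warp counting (not how the hardware actually implements it):

```python
import random

random.seed(0)
WARP = 32

# Each incoherent ray resolves to one of 8 hypothetical hit shaders.
rays = [random.randrange(8) for _ in range(256)]

def serial_passes(shader_ids: list[int]) -> int:
    """Distinct shaders per warp ~= serialized shading passes."""
    return sum(len(set(shader_ids[i:i + WARP]))
               for i in range(0, len(shader_ids), WARP))

print("unsorted :", serial_passes(rays))          # ~8 passes per warp
print("reordered:", serial_passes(sorted(rays)))  # mostly 1-2 per warp
# Regrouping rays by shader before shading is the coherence win SER
# provides; dense foliage (many tiny alpha-tested hits) is exactly the
# divergent case where it should help most.
```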
 
The lack of a significant uplift for the 4070 over the 3080, despite the use of SER in CP2077's Overdrive mode, is interesting. I'd expected the game to be much faster on Ada.

It could be, for instance, that CP2077 in Overdrive mode is using a very large acceleration structure, and/or that its accesses can't be served effectively from GPU cache. In that case, gains from SER may be balanced out by losses from the 4070 having only a 192-bit bus compared to the 3080's huge 320-bit bus.
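The raw bandwidth gap is easy to put numbers on from the listed memory specs of both cards (bandwidth = bus width in bytes × per-pin data rate):

```python
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s = (bus width / 8 bits per byte) * Gbps."""
    return bus_bits / 8 * gbps_per_pin

print(f"RTX 4070 (192-bit, 21 Gbps): {bandwidth_gb_s(192, 21):.0f} GB/s")  # 504
print(f"RTX 3080 (320-bit, 19 Gbps): {bandwidth_gb_s(320, 19):.0f} GB/s")  # 760
# Ada's large L2 is meant to cover that ~34% deficit, but a huge,
# cache-unfriendly acceleration structure is exactly the workload most
# likely to spill past the L2 and expose the narrower bus.
```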
 
Inflation is the lie that has been sold to us. If you look at Nvidia's financial statements, they state otherwise. Since 2011 Nvidia has grown its gross margin to as high as 67%. It has now returned to 57% due to a poor Q4. Apple, on the other hand, one of the companies thought to be the most greedy, had a gross margin of 44% at its worst and currently sits at 43%. Apple uses a more advanced node than Nvidia and yet manages to keep its prices in check. What is Nvidia doing that it deserves an extra 14 points over Apple, and at its worst, an extra 24? There's a reason most companies don't like working with them.


For-profit companies are just that, not charities. When companies set gross margin goals, they tend to be a lower-bound target. There is no upper bound set by a heart of gold.

Do you use such logic when valuing your own skills and talent? I'd bet you more than likely have a cutoff, a minimum salary you would accept in return for your time and effort. And I'm willing to bet you have no upper bound, no value above which you'd equate your own pay to greed.

Nvidia isn't selling life-saving pharmaceuticals or other essential goods, forcing people to pay high prices because those products are essential to their survival and they have no other choice.

They are selling gaming GPUs, whose prices are ultimately set by PC users. The fact that AMD sells competing products at lower prices, yet Nvidia dominates the market, shows their pricing is set by a market that has a lot of disposable cash when it comes to high-end wares.

Nvidia isn't a Grapes of Wrath land owner and PC gamers aren't the Joads.
 
Nvidia isn't a Grapes of Wrath land owner and PC gamers aren't the Joads.

I agree with @Dictator that in terms of realistic pricing asks of Nvidia, expecting the 4070 at, say, $400 is naive. AMD will not be offering a competitive product to the 4070, 16GB or not, at that price point. $500? Sure, that's reasonable in terms of value, something that would not significantly impact Nvidia's margins and would likely receive a very positive reception from consumers. Most reviewers are reasonable.

But your post is a rather defensive reply to someone simply trying to rebut the argument, given by Nvidia, that they must price their consumer products as they currently do because of market conditions outside their control. What exactly are you objecting to so strongly here? The use of the word "greed"?

In the context of essentials for life, nothing we argue about here matters, no shit. The vast majority of consumer products are not essential to survival, and yet we argue about their supposed value all the time. Something can be considered a "rip-off" outside the context of baby formula. Nvidia arguing they have no choice but to price their products as they do is probably based on some degree of truth, but also on PR. Every company during this inflationary period, when asked, has responded that it had no choice but to raise prices to where they are, while also making record profits.

Noting that, and in this case noting Nvidia's exceptional gross margins in this supposed time of economic turmoil, is absolutely fair game. What would make the criticism fairer, of course, would be if Nvidia separated out the margins on their consumer and datacenter sales, so we would have an idea of how much of the pie TSMC is taking. But they don't want to do that.
 
We also shouldn't forget that the cost of making a product is not only the BOM; R&D cost is also paramount: R&D for the architecture (performance/efficiency), R&D for features (software and hardware), R&D for cooling, etc. R&D is always factored into the price of any given product.
 
We also shouldn't forget that the cost of making a product is not only the BOM; R&D cost is also paramount: R&D for the architecture (performance/efficiency), R&D for features (software and hardware), R&D for cooling, etc. R&D is always factored into the price of any given product.

If I'm not mistaken, Nvidia includes R&D in their COGS, so it should already be factored into their gross margin. If it were just the BOM, the markup would be several hundred percent.
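For what it's worth, Nvidia's 10-K reports R&D under operating expenses rather than cost of revenue, so in the standard GAAP layout it lands in operating margin, not gross margin. A simplified sketch of that layout, with purely illustrative numbers:

```python
# All figures illustrative, not Nvidia's actual financials.
revenue = 100.0
cost_of_revenue = 43.0           # silicon, memory, board, assembly
gross_profit = revenue - cost_of_revenue       # 57.0 -> 57% gross margin

rnd = 20.0                       # research & development (opex)
sgna = 10.0                      # sales, general & administrative
operating_income = gross_profit - rnd - sgna   # 27.0 -> 27% op margin

print(f"gross margin:     {gross_profit / revenue:.0%}")
print(f"operating margin: {operating_income / revenue:.0%}")
```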
 
Gaming GPUs are luxury items. They'll be priced at a point people are willing to pay. If you're not willing, don't pay. If no one is willing, they'll lower the prices. I get that being "priced out" sucks, but I just don't see it as a real problem. Save your money and wait for something that fits your budget and offers a reasonable upgrade.

Edit: I'm looking at prices near me. 7900 XTs are around $1,200 CAD. 4080s are $1,700-1,800 CAD. Nobody is giving their next-gen cards away. A 3060 is still selling for $500-600 CAD. It's just the reality. The only thing that'll bring prices down is people not buying cards.
 
Gaming GPUs are luxury items. They'll be priced at a point people are willing to pay. If you're not willing, don't pay. If no one is willing, they'll lower the prices. I get that being "priced out" sucks, but I just don't see it as a real problem. Save your money and wait for something that fits your budget and offers a reasonable upgrade.

Edit: I'm looking at prices near me. 7900 XTs are around $1,200 CAD. 4080s are $1,700-1,800 CAD. Nobody is giving their next-gen cards away. A 3060 is still selling for $500-600 CAD. It's just the reality. The only thing that'll bring prices down is people not buying cards.

Nobody is arguing otherwise.

On the 4070 as a whole, the general consensus seems to be "Well, OK then," and that's pretty much my take. It's... fine. That damn 192-bit bus is so frustrating, though. With 256-bit you'd get the future-proofing of 16GB, and the 4K numbers would also be boosted well above the 3080 instead of often falling behind it (Returnal in particular at 4K without RT has the 3080 really stomping on it). A $600 256-bit version of this card would be so much more attractive, but alas.
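The 256-bit/16GB pairing isn't arbitrary, incidentally: GDDR6/6X modules hang off 32-bit channels, and current modules top out at 2GB each, so capacity moves in lockstep with bus width (ignoring clamshell boards, which double it). A quick sketch:

```python
def vram_gb(bus_bits: int, gb_per_module: int = 2) -> int:
    """One GDDR6/6X module per 32-bit channel; 2GB modules are typical."""
    return (bus_bits // 32) * gb_per_module

print(vram_gb(192))     # -> 12GB (RTX 4070 as shipped)
print(vram_gb(256))     # -> 16GB (the hypothetical 256-bit 4070)
print(vram_gb(320, 1))  # -> 10GB (RTX 3080, which used 1GB modules)
```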
 
Nobody is arguing otherwise.

On the 4070 as a whole, the general consensus seems to be "Well, OK then," and that's pretty much my take. It's... fine. That damn 192-bit bus is so frustrating, though. With 256-bit you'd get the future-proofing of 16GB, and the 4K numbers would also be boosted well above the 3080 instead of often falling behind it (Returnal in particular at 4K without RT has the 3080 really stomping on it). A $600 256-bit version of this card would be so much more attractive, but alas.

But then you'd need a 256-bit, 16GB 4070 Ti, which would affect its price and the RTX 4080's.

But the 4070 isn't a 4K card, so IMO the 4K numbers are irrelevant.
 