Speculation and Rumors: Nvidia Blackwell ...

Anything below GB202 would likely be designed to accommodate being put into a laptop in terms of both power and physical restrictions.

If GB203 were designed larger, especially with a 384-bit bus, then it would effectively be a desktop-only chip. But unlike GB202 products, it also wouldn't lend itself to the pro/prosumer market. I'm not sure consumer desktop has enough volume and margin to justify something like that.
 
Why not? Nvidia is selling AD104-based workstation cards.

If you already have the chip designed, then the cost to spin up a new product is much lower. I don't mean such a chip couldn't exist; the question is how much demand there would be that couldn't simply be served by GB202 and GB203 as they are.

It's a question of whether second-tier consumer and workstation parts alone have enough volume and margin to justify designing an entire chip. I'd suspect not, given the fixed costs involved.
 
A recent rumor says at least 1,800 euros for the 5090. I think at least 2,000 if the performance and/or features offer something significant.
 
My RTX 4090 is not powerful enough for ray-traced 4K, so I will upgrade to the 5090 when it launches.

Something neglected in these "what will it cost" discussions is how the number of transistors goes up each generation, coupled with the fact that each new process node is not getting cheaper.

GP102 = 11,800 M transistors, die: 471 mm², MSRP $1199, transistors per $ ≈ 9,841,534
TU102 = 18,600 M transistors, die: 754 mm², MSRP $999, transistors per $ ≈ 18,618,618
GA102 = 28,300 M transistors, die: 628 mm², MSRP $1999, transistors per $ ≈ 14,157,078
AD102 = 76,300 M transistors, die: 609 mm², MSRP $1599, transistors per $ ≈ 47,717,323
GB102 ≈ 90,180 M transistors, die: 744 mm², MSRP $2000, transistors per $ ≈ 45,090,000 *)
*) Estimated numbers, I have no insider information.



The jump from GA102 to AD102 is quite substantial, but you are still getting more transistors per $ than with the GP102.
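
For anyone who wants to check the arithmetic, here's a minimal Python sketch that recomputes the transistors-per-dollar column from the transistor counts and MSRPs listed above (the GB102 row is still only my rough estimate, not insider information):

```python
# Quick check of the transistors-per-dollar column above.
# Transistor counts (in millions) and MSRPs are copied from the table;
# the GB102 row is only an estimate, not insider information.
chips = {
    "GP102": (11_800, 1199),
    "TU102": (18_600, 999),
    "GA102": (28_300, 1999),
    "AD102": (76_300, 1599),
    "GB102": (90_180, 2000),  # estimated
}

for name, (mtransistors, msrp_usd) in chips.items():
    per_dollar = mtransistors * 1_000_000 / msrp_usd
    print(f"{name}: {per_dollar:>12,.0f} transistors per $")
```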
 
My RTX 4090 is not powerful enough for ray-traced 4K, so I will upgrade to the 5090 when it launches.

Something neglected in these "what will it cost" discussions is how the number of transistors goes up each generation, coupled with the fact that each new process node is not getting cheaper.

GP102 = 11,800 M transistors, die: 471 mm², MSRP $1199, transistors per $ ≈ 9,841,534
TU102 = 18,600 M transistors, die: 754 mm², MSRP $999, transistors per $ ≈ 18,618,618
GA102 = 28,300 M transistors, die: 628 mm², MSRP $1999, transistors per $ ≈ 14,157,078
AD102 = 76,300 M transistors, die: 609 mm², MSRP $1599, transistors per $ ≈ 47,717,323
GB102 ≈ 90,180 M transistors, die: 744 mm², MSRP $2000, transistors per $ ≈ 45,090,000 *)
*) Estimated numbers, I have no insider information.



The jump from GA102 to AD102 is quite substantial, but you are still getting more transistors per $ than with the GP102.
Nobody is neglecting anything; most of us have followed this market for over 20 years. The number of transistors has always gone up, ever since the NV Riva, so that's nothing new. Each new process node was always more expensive than the previous one (ignoring the single generation Nvidia made at Samsung). Nvidia's profit margins were always large, and they always tried to push prices up wherever they could, as any company does. There is nothing new except AMD's inability to compete head to head with Nvidia, like they did when they launched the HD 4800 series, forcing Nvidia to back down from their large price increase with the GTX 280 series. For three generations now AMD has not had a truly high-end part, and that's when Nvidia really went to town with pricing.
 
Nobody is neglecting anything; most of us have followed this market for over 20 years. The number of transistors has always gone up, ever since the NV Riva, so that's nothing new. Each new process node was always more expensive than the previous one (ignoring the single generation Nvidia made at Samsung). Nvidia's profit margins were always large, and they always tried to push prices up wherever they could, as any company does. There is nothing new except AMD's inability to compete head to head with Nvidia, like they did when they launched the HD 4800 series, forcing Nvidia to back down from their large price increase with the GTX 280 series. For three generations now AMD has not had a truly high-end part, and that's when Nvidia really went to town with pricing.
The price per transistor was still going down significantly with earlier process nodes, which is no longer the case. That's why the console manufacturers are struggling to reduce the prices of their consoles.
 
Each new process node was always more expensive than the previous one
This isn't true at all. Most node transitions up until 16nm or so were in fact cheaper than the previous node, in both cost per transistor and cost per unit of die area (meaning you would get lower costs not only for a chip of the same complexity but even for a chip of a similar physical size).
This changing dramatically is the main reason we see price/performance stagnation and the addition of higher pricing tiers in GPU lineups over the last 10 years or so.
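
To make that concrete, here is a deliberately simplified sketch of die cost per transistor. All of the wafer prices, densities and yields below are made-up illustrative numbers, not real foundry figures; the only point is the shape of the result: if wafer cost grows faster than density, cost per transistor stops falling.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard rough estimate of gross dies on a round wafer."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_transistor(wafer_cost_usd, die_area_mm2, mtr_per_mm2, yield_rate):
    """Wafer cost divided by the number of good transistors per wafer."""
    good_dies = dies_per_wafer(die_area_mm2) * yield_rate
    transistors_per_die = mtr_per_mm2 * 1e6 * die_area_mm2
    return wafer_cost_usd / (good_dies * transistors_per_die)

# Hypothetical "old" vs "new" node for the same ~600 mm2 die:
# density doubles, but the wafer costs roughly 2.4x as much.
old = cost_per_transistor(wafer_cost_usd=5_000, die_area_mm2=600,
                          mtr_per_mm2=25, yield_rate=0.7)
new = cost_per_transistor(wafer_cost_usd=12_000, die_area_mm2=600,
                          mtr_per_mm2=50, yield_rate=0.7)
print(f"old node: ${old:.2e}/transistor, new node: ${new:.2e}/transistor")
```

With those made-up inputs the "new" node ends up slightly worse per transistor, which is the opposite of how pre-16nm transitions behaved, where the density gain easily outweighed the modest wafer cost increase.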

There is nothing new except AMD's inability to compete head to head with Nvidia
AMD is competing about as well as they can on perf/cost right now; all vendors are mostly limited by production capacity. If AMD "fixes" their ray tracing implementation, they would become very competitive without any other changes, although FSR4 should really use AI to be completely competitive.
 
Nobody is neglecting anything; most of us have followed this market for over 20 years. The number of transistors has always gone up, ever since the NV Riva, so that's nothing new. Each new process node was always more expensive than the previous one (ignoring the single generation Nvidia made at Samsung). Nvidia's profit margins were always large, and they always tried to push prices up wherever they could, as any company does. There is nothing new except AMD's inability to compete head to head with Nvidia, like they did when they launched the HD 4800 series, forcing Nvidia to back down from their large price increase with the GTX 280 series. For three generations now AMD has not had a truly high-end part, and that's when Nvidia really went to town with pricing.
The cost per transistor "broke" at 20 nm:
[attached chart: cost per transistor by process node]

Since 20nm the cost per transistor has increased (see the chart above).
 
The cost per transistor "broke" at 20 nm:
[attached chart: cost per transistor by process node]

Since 20nm the cost per transistor has increased (see the chart above).
That's not true. The cost per transistor stopped dropping, which is not the same thing as an increase. It has remained stable for 4 generations now, and you didn't have the sort of prices you have now on 20nm and 16nm. What you have is AMD not being able to compete. Drop it; this argument has already led to the forum being closed, and you are bringing it up again.
 
Drop it; this argument has already led to the forum being closed, and you are bringing it up again.
That's an overreach. How people handled the discussion got it closed. If there has been a debate with a categorical conclusion on transistor pricing, just link to it and end the argument. In the absence of that, talk about transistor cost can proceed until someone gets bored and decides to agree to disagree. If the thread gets too bogged down with a side topic, signal a clean-up and we'll split out the discussion.

It looks like there is no thread on this subject and it would warrant one...

Discussion on changing cost of transistors moved here
 
Anything below GB202 would likely be designed to accommodate being put into a laptop in terms of both power and physical restrictions.

If GB203 were designed larger, especially with a 384-bit bus, then it would effectively be a desktop-only chip. But unlike GB202 products, it also wouldn't lend itself to the pro/prosumer market. I'm not sure consumer desktop has enough volume and margin to justify something like that.
Laptop cards are becoming more and more popular, so I think you’re right on the money here.
 
Laptop cards are becoming more and more popular, so I think you’re right on the money here.
What is the reason for this? Since Turing, laptop GPUs have become increasingly pathetic compared to their desktop counterparts. The mobile 4070 is a joke compared to the desktop 4070.
 
What is the reason for this? Since Turing, laptop GPUs have become increasingly pathetic compared to their desktop counterparts. The mobile 4070 is a joke compared to the desktop 4070.

I’m pretty sure laptop GPU performance per watt is much better today than it was with Turing. Comparing to desktop isn’t really useful since desktop power budgets are irrelevant for laptop parts.
 
What is the reason for this? Since Turing, laptop GPUs have become increasingly pathetic compared to their desktop counterparts. The mobile 4070 is a joke compared to the desktop 4070.
But it says 4070 on it, unfortunately. I have quite a few friends with kids of gaming age in the tweens and early teens, and they are well, let's just say... 'unsophisticated' buyers. If it says 4070 on the box and kiddo says they want a 4070, then the parents look at the price of the '4070' laptop versus a desktop rig with an actual 4070 + monitor + keyboard and mouse, and the laptop gets bought every single time.

The kids watch an awful lot of gaming youtube, but none of it very technically sophisticated either, they just know they were told they should have a 4070, and not much else.
 
I’m pretty sure laptop GPU performance per watt is much better today than it was with Turing. Comparing to desktop isn’t really useful since desktop power budgets are irrelevant for laptop parts.
I know that; I'm wondering why laptops are increasingly outselling desktops at a time when desktops are clearly superior. Back in the Turing days the difference between laptop and desktop SKUs wasn't so big. I don't think laptops with 4070s in them are particularly cheap, although Black Friday is making it hard to get a bead on current pricing.
 
But it says 4070 on it, unfortunately. I have quite a few friends with kids of gaming age in the tweens and early teens, and they are well, let's just say... 'unsophisticated' buyers. If it says 4070 on the box and kiddo says they want a 4070, then the parents look at the price of the '4070' laptop versus a desktop rig with an actual 4070 + monitor + keyboard and mouse, and the laptop gets bought every single time.

The kids watch an awful lot of gaming youtube, but none of it very technically sophisticated either, they just know they were told they should have a 4070, and not much else.
This is a fair explanation. People just don't know. I guess IHVs wouldn't play so fast and loose with names unless it worked. :(
 
I know that; I'm wondering why laptops are increasingly outselling desktops at a time when desktops are clearly superior. Back in the Turing days the difference between laptop and desktop SKUs wasn't so big. I don't think laptops with 4070s in them are particularly cheap, although Black Friday is making it hard to get a bead on current pricing.

Because every single person that goes to school or has an office job likely has a laptop. In school (college) you're buying your own, and even some workplaces have bring-your-own-device where they give you money to buy whatever you'd like. A lot of people probably can't afford to have a desktop and a laptop.
 
Because every single person that goes to school or has an office job likely has a laptop. In school (college) you're buying your own, and even some workplaces have bring-your-own-device where they give you money to buy whatever you'd like. A lot of people probably can't afford to have a desktop and a laptop.
Has this changed so much since Turing? Thinking about it, Turing was before COVID, so maybe things have changed. During that time laptops were completely out of stock. Maybe so many people got laptops then that they're now used to them and are looking to upgrade.

I'd be pissed if I upgraded from a mobile 2070 to a mobile 4070 thinking I was about to get a huge two-generation improvement.
 
Has this changed so much since Turing? Thinking about it, Turing was before COVID, so maybe things have changed.
I think there is a generational effect as well. I have teenage brothers from a second marriage and they exclusively game on a laptop or console; they marveled at my desktop case like something at the Smithsonian. They’ve grown up being able to do pretty much anything on the go, so bifurcating laptop and desktop like I do seems strange to them (there isn’t a desktop in their house). I also think one shouldn’t underestimate the popularity of low-fi games with that generation. One of my brothers exclusively plays Roblox and Minecraft on his laptop. Admittedly, you probably don’t need a dGPU for that, but the parents bought him a laptop with one because he asked for a gaming laptop!

VRR monitors are de rigueur for a modern gaming laptop, whereas when I bought a top-of-the-line gaming laptop (admittedly Maxwell, not Turing) I don’t recall even having the option, meaning it was 60 Hz V-sync or bust. Now if you can only hit 50 fps, no problem, and then DLSS 3 gets you well over 60. I swore I’d never buy one again; it was essentially a portable desktop. But I’m confident that if I bought one today my experience would be far better (to be clear, I have zero interest in one). Please bear in mind this post is entirely anecdotal and only intended as a partial explanation.
 