NVidia Ada Speculation, Rumours and Discussion

It's hard to parse out because almost all the publicly traded AIBs have broad businesses and may not report individual segments.

But PC Partner (Inno3D, Manli, Zotac; some connection to Sapphire?) - https://www.pcpartner.com/en/subpage.php?sid=70


VGA revenue increase of 110%.

Just playing with the numbers a bit. JPR's numbers, I think, for total discrete GPU shipments in 2021 were 48.9m units vs 41.5m for 2020, an increase of about 18%. Using an ugly (very ugly) estimate, it might mean that ASPs increased about 86%.
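A minimal back-of-envelope sketch of that kind of estimate, assuming (very roughly) that PC Partner's unit growth tracked JPR's overall market figure; the result is sensitive to the unrounded revenue and shipment numbers, so treat it as illustrative only:

```python
# Hypothetical back-of-envelope ASP estimate using the rounded figures above.
# Big assumption: PC Partner's unit growth tracked JPR's overall discrete GPU market.

revenue_growth = 1.10                        # VGA revenue up ~110% year over year
units_2020, units_2021 = 41.5, 48.9          # JPR discrete GPU shipments (millions)
unit_growth = units_2021 / units_2020 - 1    # ~0.18 (about +18%)

implied_asp_growth = (1 + revenue_growth) / (1 + unit_growth) - 1
print(f"Implied ASP increase: {implied_asp_growth:.0%}")  # ~78% with these rounded inputs
```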
 
Was fact 1 ever in contention?
Well, listening to some rumors, you would believe that AIBs are struggling with so much stock on hand that they're asking to delay the Ada launch, when in fact it's just going back to the normal pre-pandemic world with 3-4 months of inventory in the channel. Inventory that AIBs ordered when everything was immediately flying off the shelves...
 
Well, listening to some rumors, you would believe that AIBs are struggling with so much stock on hand that they're asking to delay the Ada launch, when in fact it's just going back to the normal pre-pandemic world with 3-4 months of inventory in the channel. Inventory that AIBs ordered when everything was immediately flying off the shelves...
I can see them wanting a few extra months to move Ampere inventory, but with the prices GPUs were selling for there is no way they weren't making astronomical profits. I find it very unlikely they were paying more to Nvidia per GPU.
 
Was fact 1 ever in contention?

Probably depends on what you consider contention. There hasn't really been any mainstream reporting actually discussing the specifics of the business side of graphics cards and their pricing over the last 2 years. So I would say plenty of people with at least some cursory interest in the industry are likely not fully aware of the intricacies. If you go on broader discussion forums, for instance, you can see people discussing under various impressions of what caused the high prices and where the extra money went.

The likely reality though is that all parties, including the manufacturers' suppliers, saw increased revenue and profits. It shouldn't be overlooked either that graphics cards were actually able to increase shipments in a supply-constrained environment. The higher prices likely contributed to Nvidia/AIBs being able to get some priority on supply/logistics in such an environment. Whereas if you look at the consoles, which had to operate with a tighter fixed price, they not only weren't able to increase shipments in the face of higher demand, they had to revise shipment numbers downwards continually.

Well, listening to some rumors, you would believe that AIBs are struggling with so much stock on hand that they're asking to delay the Ada launch, when in fact it's just going back to the normal pre-pandemic world with 3-4 months of inventory in the channel. Inventory that AIBs ordered when everything was immediately flying off the shelves...

I feel a lot of the public sentiment regarding inventories likely stems from anecdotal impressions after acclimating to the situation over the last two months. The fact that graphics cards are sitting on so-called "shelves" (real or virtual) at MSRP (well, this is a bit of a loose term; some cards, e.g. the 3060 Ti, are not sitting on shelves at MSRP) with ample inventory feels to many people like there is essentially overstock. However, that was the normal situation prior to 2020: you could always basically walk in, or click in, and buy a graphics card. The public view is too surface-level to really know the actual inventory situation.

Not specific to graphics cards, though, there does seem to be general traction around the idea that the broader consumer tech industry is facing lowering demand related to broader macro conditions, leading manufacturers and suppliers to start cutting orders and expansion.

They're Sapphire's contract manufacturer. They're also (or at least have been in recent history) the contract manufacturer for a crapload of other big companies, starting with AMD.

That's what I thought; I'm guessing it's included in their non-branded revenue numbers.

Another interesting data point would be TUL Corporation (PowerColor), which I believe is also graphics-focused. They only seem to have detailed financial report breakdowns in Chinese, though. In terms of general company revenue and profit - https://finance.yahoo.com/quote/6150.TWO/financials?p=6150.TWO

(all numbers in thousands)
Revenue:
2020 - 3,776,428
2021 - 8,790,649
Gross profit:
2020 - 270,852
2021 - 1,866,279

Asus, Gigabyte, and MSI also had very good 2021s, but they are much more diversified companies, so it would be harder to separate out the graphics component (which I couldn't find them reporting specifically). They did not have the same relative growth in revenue and profits as PC Partner or TUL Corporation, which is understandable given that graphics is likely a much smaller relative part of their business.
 
Another interesting data point would be TUL Corporation (PowerColor), which I believe is also graphics-focused. They only seem to have detailed financial report breakdowns in Chinese, though. In terms of general company revenue and profit - https://finance.yahoo.com/quote/6150.TWO/financials?p=6150.TWO

(all numbers in thousands)
Revenue:
2020 - 3,776,428
2021 - 8,790,649
Gross profit:
2020 - 270,852
2021 - 1,866,279

I took a look at TUL's Q4 financial report and it does have revenue breakdowns by department, which look like this:

2021:
Video card manufacturing and sales: $8,572,684
Construction: $38,122
Others: $179,843
2020:
Video card manufacturing and sales: $3,753,160
Construction: $22,015
Others: $1,253

So basically >95% of revenue is from video cards. Last year was a very good year for TUL.
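A quick sanity check on those department figures (a throwaway sketch; the numbers are just the ones quoted above, presumably thousands of TWD given it's a Taiwanese filing):

```python
# Check TUL's video-card revenue share and year-over-year growth
# using the department figures quoted above (presumably thousands of TWD).

rev_2020 = {"video_cards": 3_753_160, "construction": 22_015, "others": 1_253}
rev_2021 = {"video_cards": 8_572_684, "construction": 38_122, "others": 179_843}

total_2020, total_2021 = sum(rev_2020.values()), sum(rev_2021.values())

print(f"Video card share of 2021 revenue: {rev_2021['video_cards'] / total_2021:.1%}")  # ~97.5%
print(f"Total revenue growth 2020 -> 2021: {total_2021 / total_2020 - 1:.0%}")          # ~133%
```

The totals also line up with the company-level revenue figures quoted earlier (3,776,428 and 8,790,649), so the breakdown is internally consistent.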
 
Man, I was just thinking about this as I was looking to replace the refrigerator at my place. I hadn't really thought about it but the high end graphics cards are using more electricity than the average refrigerator by themselves. That just blows my mind and seems so incredibly excessive. That's not even the whole PC, just the graphics card.

I guess it shouldn't surprise me considering that high end graphics cards also put out more heat than the compressor on a refrigerator. Damn!

Regards,
SB
 
Man, I was just thinking about this as I was looking to replace the refrigerator at my place. I hadn't really thought about it but the high end graphics cards are using more electricity than the average refrigerator by themselves. That just blows my mind and seems so incredibly excessive. That's not even the whole PC, just the graphics card.

I guess it shouldn't surprise me considering that high end graphics cards also put out more heat than the compressor on a refrigerator. Damn!

Regards,
SB
That's true, but the work done by a graphics card is hugely more complex than what is done by a refrigerator. On the other hand, three decades ago you had mainframes which were not nearly as powerful, occupied whole rooms and used way more power!

Still, I get your point, and I'm not planning to ever get a graphics card that consumes more than 250W myself, if possible.
 
That's true, but the work done by a graphics card is hugely more complex than what is done by a refrigerator. On the other hand, three decades ago you had mainframes which were not nearly as powerful, occupied whole rooms and used way more power!

Still, I get your point, and I'm not planning to ever get a graphics card that consumes more than 250W myself, if possible.
Don't keep your expectations low; even AMD foresees GPUs with TDPs up to 700W by 2025... :cool:

https://www.tomshardware.com/news/amd-envisions-700w-gpus-by-2025
 
Man, I was just thinking about this as I was looking to replace the refrigerator at my place. I hadn't really thought about it but the high end graphics cards are using more electricity than the average refrigerator by themselves. That just blows my mind and seems so incredibly excessive. That's not even the whole PC, just the graphics card.

I guess it shouldn't surprise me considering that high end graphics cards also put out more heat than the compressor on a refrigerator. Damn!

Regards,
SB
I don't follow the sentiment here at all. What's the point of comparing two things that do completely different things? A hair dryer consumes more energy than either a refrigerator or a GPU. So does a car. It doesn't make any sense to me to use any of these as a benchmark to judge the excessiveness of the other because they solve entirely different problems. A GPU could be consuming 1mW or 1MW, that doesn't make it more or less effective at its job in relation to a refrigerator.

Instead we can compare it against other computing devices. And you know how that story goes -- we've enjoyed a freaking *exponential* increase in efficiency over the past several decades.

Yeah, it sucks that that exponential efficiency growth is slowing down, and it's getting expensive to sustain. We're just spoiled. I doubt there is any other field in the history of human civilization that has seen sustained exponential efficiency growth. And that growth hasn't happened by magic -- it has happened due to the efforts of many, many smart human beings. Let's give them their due credit.
 
I don't follow the sentiment here at all. What's the point of comparing two things that do completely different things? A hair dryer consumes more energy than either a refrigerator or a GPU. So does a car. It doesn't make any sense to me to use any of these as a benchmark to judge the excessiveness of the other because they solve entirely different problems. A GPU could be consuming 1mW or 1MW, that doesn't make it more or less effective at its job in relation to a refrigerator.

Instead we can compare it against other computing devices. And you know how that story goes -- we've enjoyed a freaking *exponential* increase in efficiency over the past several decades.

Yeah, it sucks that that exponential efficiency growth is slowing down, and it's getting expensive to sustain. We're just spoiled. I doubt there is any other field in the history of human civilization that has seen sustained exponential efficiency growth. And that growth hasn't happened by magic -- it has happened due to the efforts of many, many smart human beings. Let's give them their due credit.

That's fine if someone wants a heater on for hours at a time. I'm not going to say what others should or shouldn't want. And I guess if one were to use a hair dryer for hours at a time, that'd also be a valid comparison. Sure, in that sense a refrigerator (100-400 watts) on a daily basis is likely using a similar amount of electricity to a high-end GPU, depending on the efficiency of the refrigerator and how often a person is using their GPU. Basically a normal gamer, and not someone who games 12+ hours a day or has their rig constantly doing crypto.
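As a rough illustration of that daily comparison (all numbers here are assumptions, not measurements):

```python
# Rough, illustrative daily-energy comparison; every value below is an assumption.

fridge_avg_watts = 55          # assumed average draw incl. compressor duty cycle (~480 kWh/year)
fridge_kwh_per_day = fridge_avg_watts * 24 / 1000

gpu_watts_gaming = 350         # assumed high-end GPU board power while gaming
gaming_hours_per_day = 3       # assumed "normal gamer" usage, not 12+ hours or constant mining
gpu_kwh_per_day = gpu_watts_gaming * gaming_hours_per_day / 1000

print(f"Fridge: ~{fridge_kwh_per_day:.2f} kWh/day")  # ~1.3 kWh/day
print(f"GPU:    ~{gpu_kwh_per_day:.2f} kWh/day")     # ~1.05 kWh/day
```

Under those assumptions the two end up in the same ballpark, which is the point being made above.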

But for me, I really don't want a heater like that and I'll be content with lower power GPUs.

Just like while shopping for a refrigerator, I'll prioritize energy efficiency over features that might be nice but that I don't really need. The less excess heat I'm dumping into the house, the less air conditioning I'd need to use during the summer. Sure, an argument could be made that it can help heat your room in the winter, but there are significantly cheaper and more efficient ways to do that.

But that's just me. You do you. :)

Don't keep your expectations low; even AMD foresees GPUs with TDPs up to 700W by 2025... :cool:

https://www.tomshardware.com/news/amd-envisions-700w-gpus-by-2025

BTW - just to clarify, that was WRT a discussion about efforts AMD is making to try to reduce the energy GPUs use by making them more energy efficient. I.e., it's that forecast about energy consumption for single-die GPUs that prompted them to start seriously researching and developing chiplets for GPUs. It's their hope that by going the chiplet route they won't hit that 700 watt prediction.

In other words, spend more silicon (spread across multiple chiplets) to reduce power use. Whether that works out as they hope? Only time will tell. So, I'll wait and see; I've been burned too many times by claims made by AMD, ATi, NV and Intel prior to shipping products, so I only care once a product actually ships. :p

Regards,
SB
 
That's fine if someone wants a heater on for hours at a time. I'm not going to say what others should or shouldn't want. And I guess if one were to use a hair dryer for hours at a time, that'd also be a valid comparison. Sure, in that sense a refrigerator (100-400 watts) on a daily basis is likely using a similar amount of electricity to a high-end GPU, depending on the efficiency of the refrigerator and how often a person is using their GPU. Basically a normal gamer, and not someone who games 12+ hours a day or has their rig constantly doing crypto.

But for me, I really don't want a heater like that and I'll be content with lower power GPUs.

Just like while shopping for a refrigerator, I'll prioritize energy efficiency over features that might be nice but that I don't really need. The less excess heat I'm dumping into the house, the less air conditioning I'd need to use during the summer. Sure, an argument could be made that it can help heat your room in the winter, but there are significantly cheaper and more efficient ways to do that.
You are misreading my critique. This has nothing to do with personal energy/heat budgeting for one's household (to each their own). You were comparing the energy consumption of a GPU with that of a refrigerator, which is a meaningless comparison because there is no way to meaningfully weigh the useful work those two things do with that energy.

The concern that GPU (and silicon in general) power consumption is increasing is valid. It sucks, and it's going to get worse. It's not just raw perf/W scaling; a bigger issue is the increasing $/xtor, which will drive everyone towards narrow-and-fast designs as opposed to more efficient wide-and-slow designs. Or we all just give up our voracious appetite for ever-increasing performance (fat chance).
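A toy sketch of the narrow-and-fast versus wide-and-slow trade-off, using the usual dynamic power approximation P ≈ C·V²·f with made-up voltage/frequency points; the extra silicon the wide design needs is exactly what rising $/xtor penalizes:

```python
# Toy dynamic-power model P ~ C * V^2 * f (all values made up for illustration).
# Both configurations target the same nominal throughput ~ units * frequency.

def dyn_power(cap_units, volts, freq_ghz):
    return cap_units * volts**2 * freq_ghz   # arbitrary units

narrow_fast = dyn_power(cap_units=1.0, volts=1.10, freq_ghz=3.0)  # 1x silicon at 3.0 GHz
wide_slow   = dyn_power(cap_units=2.0, volts=0.85, freq_ghz=1.5)  # 2x silicon at 1.5 GHz

print(f"Narrow-and-fast: {narrow_fast:.2f}")  # ~3.63
print(f"Wide-and-slow:   {wide_slow:.2f}")    # ~2.17 -> ~40% less power, but twice the transistors
```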
 

Samsung Electronics, the world leader in advanced memory technology, today announced that it has begun sampling the industry’s first 16-gigabit (Gb) Graphics Double Data Rate 6 (GDDR6) DRAM featuring 24-gigabit-per-second (Gbps) processing speeds.

This would appear to render GDDR6X unnecessary.
 



This would appear to render GDDR6X unnecessary.
That is, if this G6 is close in its power consumption to G6X.
 
That is, if this G6 is close in its power consumption to G6X.
Supposedly GDDR6X consumes 15% less than GDDR6 per transferred bit, but that was compared to 16Gbps GDDR6. Anyone know if there have been process or other improvements since then?
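For a rough sense of scale, per-pin rate times bus width, plus the quoted ~15% per-bit figure taken at face value (the 21 Gbps GDDR6X point is an assumption based on the fastest parts shipping at the time):

```python
# Rough bandwidth comparison on a hypothetical 384-bit bus.

def bandwidth_gb_s(pin_rate_gbps, bus_width_bits):
    return pin_rate_gbps * bus_width_bits / 8   # GB/s

print(f"24 Gbps GDDR6 : {bandwidth_gb_s(24, 384):.0f} GB/s")  # 1152 GB/s
print(f"21 Gbps GDDR6X: {bandwidth_gb_s(21, 384):.0f} GB/s")  # 1008 GB/s

# The ~15% lower energy-per-bit claim for GDDR6X was made against 16 Gbps GDDR6,
# so whether 24 Gbps GDDR6 actually matches G6X on power depends on process and
# I/O improvements that aren't spelled out in the announcement.
```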
 
So, with that GDDR6 announcement, the unveiling of RTX 4000 will happen during Gamescom in August, then. RTX 2000 was revealed during Gamescom, as well.
 
So, with that GDDR6 announcement, the unveiling of RTX 4000 will happen during Gamescom in August, then. RTX 2000 was revealed during Gamescom, as well.
Would NVIDIA push GDDR6X to the sidelines already?
Samsung didn't really say when the actual "launch" will happen, though, nor with which GPU company (and for what it's worth, they did talk about launches in the plural).
 