NVIDIA discussion [2025]

We actually don't know (or at least I haven't seen any info on) the supply of GDDR7. As far as I know this is the first product to utilize GDDR7, and that may be supply constrained.
It's possible, but it shouldn't be a long-running issue even if it is an issue right now. G7 is an industry standard, which means that every memory maker out there will be making it - in contrast to G6X used on the 30 and 40 series, which was Micron exclusive.

So as I've said, there aren't any reasons for 50 series supply to be any worse than 40 series supply was. It's the same process, same wafers, and the dies are similar or smaller (aside from the 5090, but that got a +35% price increase to remedy that). The fact that perf/price gains will be relatively small also suggests that demand will likely be relatively low - probably similar to the 40 series for everything below the 5090.
 
Okay, but what does any of that have to do with historical pricing?
My guess is that if people have a history of buying a product line by name, and they were satisfied with it, they hope to keep doing so. When they find the current iteration is out of reach, they feel disappointed. They might not gather information from reviews like you do, because that can also be very confusing and time-consuming compared to coasting on the product name alone. The average Joe is much different from the users of this forum.
 
I needed a humidifier.
Yeah, this is the biggest difference between a GPU and a humidifier. One is a fairly cheap purchase you have to make, so you just look at what's available. There are also no 'humidifier enthusiasts' who track humidifier size vs price for fun. A GPU is an expensive luxury purchase you redo every 5 years.
 
Fridges are expensive too, and they improved a lot over the years (at least the high end ones), but I doubt there are similar "fridge enthusiasts" (there probably are, but not as prominent).
 
Fridges are expensive too, and they improved a lot over the years (at least the high end ones), but I doubt there are similar "fridge enthusiasts" (there probably are, but not as prominent).

Ha I’m not so sure. Most of my friends don’t know places like B3D exist and would probably think a fridge enthusiast is as weird as a graphics card enthusiast. I just assume there are enthusiast communities for everything under the sun even if I haven’t heard of them.
 
Fridges are expensive too, and they improved a lot over the years (at least the high end ones), but I doubt there are similar "fridge enthusiasts" (there probably are, but not as prominent).
? Fridges haven't meaningfully changed in the past 20 years lol. Very few people 'upgrade' fridges, they just buy a new one when the old one breaks or they remodel their kitchen and want to fulfill some aesthetic requirement.

Any product that improves as fast as computer hardware is going to have enthusiasts that track pricing gen over gen. For example, TVs, which have cratered in price over the past few decades as they improve.
 
? Fridges haven't meaningfully changed in the past 20 years lol. Very few people 'upgrade' fridges, they just buy a new one when the old one breaks or they remodel their kitchen and want to fulfill some aesthetic requirement.

Any product that improves as fast as computer hardware is going to have enthusiasts that track pricing gen over gen. For example, TVs, which have cratered in price over the past few decades as they improve.

You certainly are not a fridge enthusiast. :runaway:
 
Really want to make a joke about fridge latency and time-to-beer-chilled, but I'd better not add to the derail. The funny thing is I actually did upgrade my fridge about 6 months ago; the old one still works and is in the garage, since I did it basically as an upgrade.
 
Just made a $25+ million infrastructure plan, and then someone dropped the bomb: "We would like to use A.I. for...."
I am going to give a VP a heart attack with the increased cost now (and the delivery schedule) :runaway:

Consumers have it easy, both in cost and timelines 🤷‍♂️
 
Yeah, the "demise of A.I." proclamation was very premature.
At least the usual suspects have moved on to defective 50 series cards, until the next FOMO against a certain company 🤷‍♂️

EDIT: Talking about forums like Anandtech, HardForum, Tom's Hardware etc, before people get their nuts in a sling...
 

The use of 12VHPWR on 40/50 series GPUs with reference design boards is once more blowing up, this time due to a cost optimization where they eliminated a current balancing circuit they still had in the 30 series board designs. The lack of it has now significantly increased the risk of uneven currents, especially when using short and thick (low resistance) cables. Even when correctly inserted, just a slightly better-than-normal fitting pin can already result in a burning wire. There shouldn't be a risk if the cables are long enough, but shorter and thicker custom cables in particular are at risk of being a fire hazard.

The linked video got the differences between the board designs right, but flubbed the "broken cable" explanation. There's an up to 10:1 difference in current (>20 A / <2 A over two of the 16AWG wires in the bundle) even with perfectly good cables and connectors.
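For anyone who wants to sanity-check why cable length matters here, below is a minimal current-divider sketch in plain Python. It's only a back-of-the-envelope model under assumed resistance values (not measurements from the video or from any actual card): with the 12V pins merged into a single rail, the current through each wire is set purely by its path resistance, so a short, thick cable lets one slightly better-fitting pin hog the load, while a longer cable evens things out.

```python
# Minimal current-divider sketch (all resistance values are made-up
# illustrative assumptions, not measurements). Six 12V wires feed one merged
# rail with no per-pin balancing, so current splits in proportion to 1/R of
# each path (pin contact resistance + wire resistance).

TOTAL_CURRENT_A = 50.0   # roughly 600 W at 12 V
WIRE_RES_PER_M = 0.0132  # ohm per metre, ~16 AWG copper
# One unusually good (low resistance) pin among six otherwise ordinary ones:
CONTACT_RES_OHM = [0.002, 0.010, 0.012, 0.012, 0.014, 0.015]

def per_wire_currents(cable_length_m):
    # Path resistance per wire, then split the total current by conductance.
    paths = [c + WIRE_RES_PER_M * cable_length_m for c in CONTACT_RES_OHM]
    conductances = [1.0 / r for r in paths]
    total = sum(conductances)
    return [TOTAL_CURRENT_A * g / total for g in conductances]

for length in (0.1, 0.3, 0.75):
    currents = per_wire_currents(length)
    print(f"{length:.2f} m cable:", ", ".join(f"{i:4.1f} A" for i in currents))
```

In this toy model the assumed 2 mΩ "good" pin pulls roughly 22 A on a 0.1 m cable but only around 14 A at 0.75 m, which is the same qualitative behaviour described above.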

Guess this one is going to be coined "RTX Flame Generation".
 
they eliminated a current balancing circuit they still had in the 30 series board designs,
Whoever made this video doesn't know what they're talking about, the 30 series also had two shunts just like the 40/50
[attached: schematic screenshot]
Full diagram
 
Whoever made this video doesn't know what they're talking about, the 30 series also had two shunts just like the 40/50
Full diagram

Not sure about the schematics, they may be reference boards? I think Buildzoid is talking about the Founders Edition boards unless otherwise specified.
The 3080 FE is known to have 3 rails with 3 shunts while the 3090 Ti FE has 3 rails with 6 shunts IIRC.
 
just like 40/50

Look at the solder points for the 12VHPWR socket on the 5090 FE, they can be clearly seen on the rear. Just how do you think that's supposed to be two rails, with only 2 pin-through slots, one of which is GND? And if you look at the front side of the PCB - there is only a single shunt soldered onto the board close to the socket.

I don't know where you got the info that the 50 series would have two shunts, but it's definitely false.


You are right about the 4090 FE. That's at least two shunts, not one. Might still be only a single rail though, can't tell.

However... https://www.badcaps.net/forum/docum...geforce-rtx-4090-rtx4090-schematic-board-view

Many of the custom designs switched to a single-rail, single-shunt design. Also mind the "Microfit connector with merged 12V inputs" in the schematics.

There are 30 series boards with 2 and 3 rails, but no single-rail ones yet.
 
Raja Koduri (former Radeon and Arc head) calls the GB200 NVL72 a "masterclass in system architecture, showcasing state-of-the-art scale-up and scale-out bandwidth capabilities that set new industry standards".

He also wrote a custom script to test various AI GPUs in raw PyTorch, with no middleware. These are the results.

Across the sweep of different matrix shapes and sizes:
  • Nvidia 8xH100 - 5.3 PF (67% of peak)
  • AMD 8xMI300 - 3.1 PF (30% of peak)
  • Intel 8xPVC - 2.7 PF (40% of peak)
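The thread doesn't link the actual script, but for context here's a minimal single-GPU sketch of what a raw-PyTorch matmul efficiency sweep of that kind typically looks like. The shapes, dtype and assumed peak figure are mine, not his, and a multi-GPU version would simply run one such sweep per device.

```python
# Hedged sketch of a "raw PyTorch, no middleware" matmul efficiency sweep in
# the spirit of the test described above. The shapes, dtype, iteration count
# and the assumed peak number are illustrative, not Koduri's actual script.
import time
import torch

ASSUMED_PEAK_TFLOPS = 989.0  # assumed dense BF16 peak for one H100 SXM; adjust per GPU

def bench_matmul(m, n, k, dtype=torch.bfloat16, iters=50):
    a = torch.randn(m, k, device="cuda", dtype=dtype)
    b = torch.randn(k, n, device="cuda", dtype=dtype)
    for _ in range(5):            # warm-up so the first timed run isn't a cold start
        torch.matmul(a, b)
    torch.cuda.synchronize()
    t0 = time.perf_counter()
    for _ in range(iters):
        torch.matmul(a, b)
    torch.cuda.synchronize()      # wait for all queued kernels before stopping the clock
    dt = (time.perf_counter() - t0) / iters
    return 2 * m * n * k / dt / 1e12   # a GEMM is 2*M*N*K FLOPs

for shape in [(4096, 4096, 4096), (8192, 8192, 8192), (16384, 4096, 16384)]:
    tflops = bench_matmul(*shape)
    pct = 100 * tflops / ASSUMED_PEAK_TFLOPS
    print(f"{shape}: {tflops:7.1f} TFLOPS ({pct:4.1f}% of assumed peak)")
```

The "% of peak" figures above are presumably computed the same way: achieved FLOP/s from the timed matmuls divided by the vendor's rated dense throughput.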
 