Dedicated gaming GPUs bottom of the priority list?

Losing perf/watt would mean losing top performance, which in turn would remove their "halo" status and likely significantly hurt sales of the whole lineup.
Contrary to the oddly popular idea that people would just buy Nvidia "regardless", they certainly would not, just as those same people aren't buying as many Intel CPUs now as they used to.
They could stick with the latest and greatest for the high end and price it wherever they want. Obviously the 4090's MSRP is too low, so manufacturing cost shouldn't be a huge concern in that range.

Unfortunately I don't think RTX4000 is so expensive due to manufacturing costs, so this discussion probably doesn't matter.
 
They could stick with the latest and greatest for the high end and price it wherever they want.
It would require doubling an already expensive physical layout stage for the standard architectural parts.
It would also probably forfeit any pricing incentive they get from producing the whole lineup at one exclusive foundry partner.
In the end it could actually lead to higher retail prices than making all chips of one family on one production line.
There are reasons why we don't see companies "dual sourcing" their chips - or even chip families - much these days.

Unfortunately I don't think RTX4000 is so expensive due to manufacturing costs, so this discussion probably doesn't matter.
Not going to repeat all that again.
 
I have nothing against using a less than cutting-edge process if it brings prices down. It worked well for Ampere.

Ampere used a significantly older/less advanced process, as Samsung's 8nm process is a derivative of their 10nm process, which first entered mass production in 2017, I think. AMD was using TSMC 7nm at the time, and Apple had even started shipping 5nm, so there was likely an outsized cost advantage (at the expense of power/performance, naturally) that is unlikely to be repeated. It seems more likely that AI/HPC will stay on the leading edge and consumer parts will sit a node behind for the foreseeable future.

It did put Nvidia at real risk of being beaten in perf/watt by the competition though, and it's unlikely they'll feel like repeating that gambit any time soon.

Also, technically N5 isn't a cutting-edge process anymore, so it's arguable whether going to an even less advanced node would be all that beneficial even in terms of perf/price. The lowest segment, where this matters most, is still being served by Ampere, which suggests there's nothing between that and N3 that would be a better fit from that point of view.

Agreed, don't see Nvidia repeating it. At best I could see them going Samsung 4nm for some lower end parts.

The N5 family has improved a bit, and N4P today is still pretty much cutting edge I'd say, with not that much of a gap to N3. So much so that only the highest-margin parts can afford to move to N3 at the moment. Perf/power is roughly a 15-20% improvement, and if you look at density, the SRAM density of N3E, which is the most widely used N3 node, is the same as N5's. Only logic density improves by a reasonable amount, but depending on the logic/SRAM/analog mix, I believe it works out to only a ~1.3X improvement for a typical chip. So the benefits are limited despite the significantly higher wafer costs. It may take until N3S, a density-optimized version, for mainstream consumer parts to move to N3.
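For a rough sense of where a ~1.3X figure like that can come from, here's a small back-of-the-envelope sketch in Python. The area mix (60% logic / 30% SRAM / 10% analog) and the ~1.6X logic density gain assumed for N3E over N5 are illustrative assumptions, not measured numbers.

```python
# Back-of-the-envelope chip-level density scaling for N5 -> N3E.
# All inputs below are illustrative assumptions, not vendor data.

# Per-block density scaling factors (new density / old density).
scaling = {
    "logic": 1.6,   # assumed logic density gain for N3E vs N5
    "sram": 1.0,    # N3E SRAM cell size is essentially unchanged vs N5
    "analog": 1.0,  # analog/IO typically barely shrinks
}

# Assumed area mix of a "typical" GPU die on N5.
area_mix = {"logic": 0.6, "sram": 0.3, "analog": 0.1}

# New relative die area: each block's area share divided by its scaling factor.
new_area = sum(area_mix[block] / scaling[block] for block in area_mix)
chip_density_gain = 1.0 / new_area

print(f"Relative die area on N3E: {new_area:.3f}")                     # ~0.775
print(f"Effective chip-level density gain: {chip_density_gain:.2f}x")  # ~1.29x
```

Shift the mix toward SRAM and analog and the effective gain drops further, which is the point: the chip-level benefit is much smaller than the headline logic number.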
 
I literally responded with actual commentary about the actual subject matter, which you are conveniently pretending didn't happen.

If you really cared about keeping things on topic, you wouldn't have ignored that.
You literally responded with a cheap shot. Stop that.

This is precisely what happens: it drowns out whatever point you might be making (in current and future posts), and if you cared even a little bit about keeping things on topic, you would not have done so.
 
You literally responded with a cheap shot. Stop that.

This is precisely what happens: it drowns out whatever point you might be making (in current and future posts), and if you cared even a little bit about keeping things on topic, you would not have done so.
It only drowns it out because it's used as a convenient excuse to ignore and deflect from what I was actually saying. Nobody is really that offended by anything I said, cuz nothing was really that harsh, nor invalid. But you'll pretend that to be the case so you can make it about that instead of addressing any of the substance of my post. smh Y'all are still doing it, too.

Congrats on the effort, though. It was quite successful.
 
NVIDIA has the problem that they kept their tensor cores equivalent across their lineups. Suddenly crippling them on a new generation would be a bad look, but NVLink is more of a luxury than a necessity for most use cases.

Their consumer GPU lineup is an ever-present danger to their AI lineup: at a large enough scale, the lack of support and coolers not designed for good large-scale integration stop mattering ... you can DIY it. I assume a lot of Chinese firms are doing just that.
 
NVIDIA has the problem that they kept their tensor cores equivalent across their lineups.
Do you mean between Ada and Hopper? SM8.9 lacks some crucial bits compared to SM9.0 (TMA, wgmma, double the smem/L1), which means they're not equivalent in practice even if they're both "Fourth-Gen Tensor Cores" and their theoretical numbers look close.
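As a quick illustration of that split, the sketch below just queries the compute capability and restates those differences in comments; it assumes a CUDA-capable GPU and PyTorch, which is used here purely as a convenient query layer (Hopper reports SM 9.0, Ada reports SM 8.9).

```python
# Compute-capability check: Hopper reports SM 9.0, Ada reports SM 8.9.
# Assumes PyTorch is installed; torch is only used to query the device.
import torch

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    name = torch.cuda.get_device_name(0)
    print(f"{name}: SM {major}.{minor}")
    if (major, minor) >= (9, 0):
        # SM 9.0 (Hopper): TMA, wgmma and the larger smem/L1 are available.
        print("Hopper-class tensor core feature set")
    else:
        # SM 8.9 (Ada) and older: same "fourth-gen" tensor core branding,
        # but without TMA/wgmma the practical throughput story differs.
        print("Pre-Hopper tensor core feature set")
else:
    print("No CUDA device visible")
```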
 
I wouldn't exactly call a feature which generates high demand for your products "a problem".
I would call the lack of such a feature "a problem", though.
 
It only drowns it out because it's used as a convenient excuse to ignore and deflect from what I was actually saying. Nobody is really that offended by anything I said, cuz nothing was really that harsh, nor invalid. But you'll pretend that to be the case so you can make it about that instead of addressing any of the substance of my post. smh Y'all are still doing it, too.

Congrats on the effort, though. It was quite successful.
No, when you make cheap shots like that, you should expect to be called out for them.

We have a saying of sorts in my country: it's like pushing an old lady down the stairs and then asking her, "whoa now, what's the rush?"
You just can't go off topic and attack posters (whether it offends or not is irrelevant; if we're not on terms that allow for some small banter, you're out of line already) and then still blame others for not addressing the factual bits in your posts.

And you are way off with your suppositions. I don't pretend anything, and I'd say there's a good chance I'm not the only one. I will intentionally not address anything substantive in your posts if you keep polluting them with irrelevant crap. I'll make this choice every time; I won't spend my time untangling them.

The bare minimum *you* have to do in order to have your posts replied to in good faith is to keep it factual.
 
I wouldn't exactly call a feature which generates high demand for your products "a problem".
I would call the lack of such a feature "a problem", though.
It's a problem for AMD, but since NVIDIA doesn't really need to care about AMD at the moment, it hardly matters to them. They are only competing against themselves.
 
It's a problem for AMD, but since NVIDIA doesn't really need to care about AMD at the moment, it hardly matters to them. They are only competing against themselves.
It would be "a problem" for Nvidia too if they suddenly decided to "cripple" themselves, because apparently having the required h/w in your GPUs is "a problem".
In other words, there is no problem aside from some market positioning shenanigans.
 
It's a balancing act, which is why this thread seems to be speculating they will just delay the consumer cards instead.

I could see it happening, not because of capacity issues but due to internal competition and China/US tension (gaming cards are much more easily diverted). We shall see.
 
if Radeon somehow actually got back to being properly competitive again
Many people want AMD to be competitive on price alone, but that didn't work well for them in the past, and it won't work at all today; demand for Radeon products is at an all-time low (as evidenced by AMD's financial statements and their market share). They are technologically behind, and it's hurting their image. DLSS is huge for NVIDIA now (not just for its quality, but because it is available in significantly more games, whether Super Resolution or Frame Generation), and ray tracing/path tracing are firmly in NVIDIA's hands, along with lots of other cool exclusive features.

AMD is in a vicious circle now: they have to get ahead, reach feature parity, and offer cheaper prices all at the same time. Otherwise, their best hope is to end up like their Instinct AI GPU business: good performance and much cheaper prices, but most people still want NVIDIA, because AMD is behind on technology and features.
 
No, when you make cheap shots like that, you should expect to be called out for them.

We have a saying of sorts in my country: it's like pushing an old lady down the stairs and then asking her, "whoa now, what's the rush?"
You just can't go off topic and attack posters (whether it offends or not is irrelevant; if we're not on terms that allow for some small banter, you're out of line already) and then still blame others for not addressing the factual bits in your posts.

And you are way off with your suppositions. I don't pretend anything, and I'd say there's a good chance I'm not the only one. I will intentionally not address anything substantive in your posts if you keep polluting them with irrelevant crap. I'll make this choice every time; I won't spend my time untangling them.

The bare minimum *you* have to do in order to have your posts replied to in good faith is to keep it factual.
You keep making this strawman that I wasn't staying on topic, when I absolutely was. You are the only one who has taken it completely off-topic, focusing only on me and one part of my comment, making it entirely personal and ignoring the whole rest of it that was absolutely on-topic (which you're still doing). smh The sheer irony is crazy. But again, congrats on the successful deflection of the actual content of my post.

Many people want AMD to be competitive on price alone, but that didn't work well for them in the past, and it won't work at all today
It worked better for them in the past than what they're doing now. If they aren't gonna be competitive in tech, then they need to be competitive somehow. Not doing either is a path to complete irrelevance in the market.
 