NVidia Ada Speculation, Rumours and Discussion

It’s bonkers that the 4090 is the highest perf/$ part. Is Nvidia not trying to sell a lot of 4080s to help clear Ampere inventory? The 12GB model supposedly uses the full AD104 die, so yields may not be great anyway.

I find it odd that the 4080 16GB is using a cut-down AD103; considering the drop from AD102 and the crazy price, you'd expect it to at least use the full die. Maybe they are keeping that for a 4080 Super next year.

Also, I'm not sure where an eventual 4080 Ti would fit in the current lineup; the 4090 is heavily cut down already.

There isn’t enough room left in AD103 for a 4080 Ti. A further cut-down AD102 isn’t unreasonable; 96 SMs could work as a decent middle ground.
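
To put the 96-SM idea in context, here's a throwaway comparison against the SM counts announced or rumoured so far (the 96-SM part itself is purely hypothetical):

Code:
# SM counts for the Ada lineup as announced/rumoured so far.
sm_counts = {
    "AD104 full (4080 12GB)": 60,
    "4080 16GB (cut AD103)": 76,
    "AD103 full": 80,
    "4090 (cut AD102)": 128,
    "AD102 full": 144,
}
hypothetical_4080_ti = 96  # purely speculative middle-ground part

for name, sms in sorted(sm_counts.items(), key=lambda kv: kv[1]):
    ratio = hypothetical_4080_ti / sms
    print(f"{name:24s} {sms:3d} SMs  -> a 96-SM part would be {ratio:.2f}x")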
 
Actually, the GPU is the biggest difference (don’t look at synthetics, where everyone cheats by allowing different power states, but at the games shipping on Android vs iOS). Apple kept the same MSRP in the US, though European prices are a bit stupid, admittedly. That’s one thing Nvidia will never get right: no matter how much they want to be Apple, they keep changing the price points for the same tier of GPUs, and PC gamers are rightfully pissed about it.

To the end consumer it doesn’t matter that TSMC 4N is more expensive to produce on than Samsung 8N; what matters is that the price of the xx80 GPU doubled. Of course the volume Nvidia is pushing is crickets compared to Apple, so maybe they get worse deals, but again, that doesn’t matter to the end consumer.

Looking at benchmarks, Apple would be a bit faster, especially in loading/exporting thanks to NVMe. Otherwise, in gaming the Snapdragon 8+ Gen 1 is close enough, especially in GPU gaming performance, which is what matters. That's what end consumers care about, not Geekbench and AnTuTu.

As for PC gamers being pissed, it seems it's mostly console gamers and Apple users who are the most pissed about it, weird as that sounds. Even weirder is that AMD fans don't seem to care much either.
 
We are, but fighting against Mother Nature is getting exponentially more difficult, so we need more people and more time all the way up and down the chain, starting from ASML and ending with the board partners. To a first-order approximation, the increased cost is just the sum of the people x time needed to fight this battle.

Yes, this probably deserves its own thread, but one of the problems is that the fixed cost of building a new fab has gone up quite significantly, while the production rate (wafers per month) is not much higher.
Since a new node can only enjoy a price premium for a limited time, the fixed cost has now become a significant part of the cost formula, not just the marginal cost.
Again, this is not just because of a lack of competition. The apparent lack of competition in high-end semiconductor manufacturing is actually a result of pricing out: if you don't have the scale, you won't survive. But scale alone can't save you either, because the market is ultimately limited. Even if there were suddenly ten TSMCs, the price wouldn't come down much, unless you were willing to sell at a loss.
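
A back-of-the-envelope sketch of that fixed-vs-marginal point; every number below is an illustrative placeholder, not a real TSMC figure:

Code:
# Toy per-wafer cost model: amortized fixed cost + marginal cost.
# All numbers are made-up placeholders, not actual fab economics.
fab_capex = 20e9            # fixed cost of building/equipping the fab, USD
premium_months = 36         # months the node can command a price premium
wafers_per_month = 100_000  # production rate
marginal_cost = 6_000       # materials, energy, labour per wafer, USD

amortized_fixed = fab_capex / (premium_months * wafers_per_month)
print(f"amortized fixed cost per wafer: ${amortized_fixed:,.0f}")
print(f"total cost per wafer:           ${amortized_fixed + marginal_cost:,.0f}")
# Doubling the capex while the production rate stays flat pushes the
# per-wafer cost up even if the marginal cost doesn't move at all.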
 

Another thing to consider is that fewer and fewer people each year are going into the physical-science and engineering fields (like electrical engineering and materials science) that are needed for advancements in chip production. Most people going into college see software engineering as the more attractive choice, so not only is it getting harder to achieve any breakthroughs, but there are also fewer and fewer people each year capable of helping to achieve them (more people with the necessary training are retiring than are graduating from college).

Regards,
SB
 
[Attachment 7041: 4090 vs 4080 benchmark chart]
The 4090 has 68% more SMs than the 4080 but is only 15-20% faster. What's going on here? A bandwidth bottleneck?
Good observation, but bandwidth should also be 50-ish % higher (384-bit vs. 256-bit). No idea what's limiting here, but barring any further details, there could be CPU work involved: some Blender projects have a significant portion of scene setup and whatnot that doesn't go any faster with fatter GPUs.
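
For what it's worth, a quick Amdahl's-law sketch of the CPU-work theory (the serial fractions are just guesses to show the shape of the effect):

Code:
# Back-of-the-envelope Amdahl's law: if a fraction 's' of the benchmark is
# CPU-side/serial work (scene setup etc.), the bigger GPU can't show its
# full SM advantage.
def observed_speedup(gpu_ratio, s):
    return 1.0 / (s + (1.0 - s) / gpu_ratio)

gpu_ratio = 1.68  # 4090 vs 4080 16GB SM ratio from the post above
for s in (0.0, 0.2, 0.4, 0.6):
    print(f"serial fraction {s:.0%}: {observed_speedup(gpu_ratio, s):.2f}x")
# Getting only ~1.2x out of a 1.68x bigger GPU needs a ~60% serial share,
# so bandwidth or other limits are probably part of the story too.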
 
But even that is up to developers to implement; it's not automatic like what Intel is apparently doing, at least.
It's controllable by developers, not "up to them to implement". Presumably you can enable it only for the shaders which get a speed-up from the feature and not for others, as it probably has an overhead.
 
Meant "up to Devs to implement" as in it's not automatically on, and Devs have to enable it in their game in whatever way they seem fit.
 
A possibly relevant video I came across the other day that maybe touches on this.


That's a very large part of it, but it's also part of a general malaise about having to do hard work if you aren't qualified for the smarter jobs. And governments that encourage you not to work rather than work a difficult job. And it doesn't help that it's not as glamorous or high-paying as a software engineering job.

The problem is with what goes on before it even gets to the foundry stage. The R&D behind the technology that a foundry like TSMC will eventually deploy in a fab is losing the skilled engineers that are needed to drive innovation in those areas. While key companies can still gain talent by poaching from less successful tech R&D companies, that has a limit if the pool of skilled engineers keeps shrinking due to a severe shortage of college students going into the fields that are needed.

IMO, if the US government (or any government) is serious about this, they need to revisit the 50s, when there was a large push for the sciences at the university level. While the arts are nice, it'd be better if the government funneled college funding, and thus students, into fields where students actually have a real-world chance at a career. So things like the arts should have limited government funding, IMO. If you want to spend tens of thousands to earn a degree that will result in you being a burden on society, then you should have to pay for that yourself. Basically, government grants and guaranteed loans should be off limits to anyone pursuing a liberal arts degree, outside of maybe a very limited number of them.

I mean, when the government sends a clear message that it is no longer interested in sponsoring things that can actually advance technology and the human race (see the defunding of NASA under Obama), is it any wonder that there is less interest in getting into the "hard" sciences (materials, electrical, etc.)?

Regards,
SB
 

The majority of governments across the world have sent a clear message about their level of concern with the future of humanity by their response to the climate crisis.

We are not, as a modern civilization, looking down the gun barrel because Obama put a five-year freeze on NASA's planetary exploration budget in 2012.
 
The dismissal of arts funding above is an interesting statement given that we are talking about video cards used for gaming here. The same gaming that relies heavily on the creative arts (or is even seen as an art form itself): visual art, writing and music, among others, perhaps even more so than on technology and graphics. I'm quite sure the gaming industry would not disappear if computer graphics stagnated. On the other hand, if all we have is endless CoD and FIFA due to a lack of creativity, I'm quite sure the industry would not prosper as much.
 
Simplygon post on Micro Mesh: https://simplygon.com/posts/aeec95de-79f1-40d1-b72f-e121e90648c0

I'm guessing the workflow may be similar to how normal maps are used. From the pictures it looks like the triangles are handled individually and possibly with different tessellation factors.

They're not the same and have different limitations, but for compression micro-meshes give numbers like 11x, 28x, 14x and 20x, whereas the Nanite docs cite 7.6x.

I'm not sure if the feature works for rasterization; with the talk of it being open, perhaps a mesh shader fallback is also possible. I think there is a good chance it could work for animated geometry as well.
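
To make the "handled individually with different tessellation factors" guess a bit more concrete, here's a toy sketch of how a displaced micro-mesh is usually described (base triangle + subdivision level + scalar displacements along interpolated directions). The data layout here is invented for illustration; the real format is compressed:

Code:
import numpy as np

# Each base triangle is split into 4**level microtriangles; each microvertex
# stores only a scalar displacement along an interpolated direction.
def microvertices(v0, v1, v2, d0, d1, d2, disp, level):
    n = 2 ** level + 1                      # microvertices per triangle edge
    verts = []
    k = 0
    for i in range(n):
        for j in range(n - i):
            # barycentric coordinates of this microvertex
            b1, b2 = i / (n - 1), j / (n - 1)
            b0 = 1.0 - b1 - b2
            pos = b0 * v0 + b1 * v1 + b2 * v2    # base surface position
            dirn = b0 * d0 + b1 * d1 + b2 * d2   # interpolated direction
            verts.append(pos + disp[k] * dirn)   # apply scalar displacement
            k += 1
    return np.array(verts)

v = [np.array(p, float) for p in [(0, 0, 0), (1, 0, 0), (0, 1, 0)]]
d = [np.array((0.0, 0.0, 1.0))] * 3              # displacement directions
level = 3                                        # 4**3 = 64 microtriangles
count = (2 ** level + 1) * (2 ** level + 2) // 2  # microvertex count
print(microvertices(*v, *d, disp=np.linspace(0, 0.1, count), level=level).shape)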
 
I wonder about the rasterization question as well.
It is described as a feature of Ada's RT core, but would it work without RT (or RT cores, for that matter)?
The key advantage here is that the micromesh is what the RT core traces against while the base geometry stays simple, and thus the BVH is also simple; the micro-geometry is generated by the RT core while tracing a ray.
If this is tied to the RT core, then it may not work without RT.
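
A rough feel for why keeping the BVH at base-triangle granularity matters; the per-triangle and per-microtriangle byte counts below are just illustrative assumptions, not real driver numbers:

Code:
# Rough arithmetic for the "simple geometry, simple BVH" point above.
base_tris = 1_000_000
level = 3
micro_tris = base_tris * 4 ** level

full_bvh_bytes   = micro_tris * 64                  # if every microtriangle sat in the BVH
micro_mesh_bytes = base_tris * 64 + micro_tris * 1  # coarse BVH + compressed displacements

print(f"{micro_tris:,} microtriangles")
print(f"BVH over everything : {full_bvh_bytes / 2**30:.1f} GiB")
print(f"base BVH + micromaps: {micro_mesh_bytes / 2**30:.2f} GiB")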
 