NVidia Ada Speculation, Rumours and Discussion

That's not what we're talking about, though; we're discussing the performance uplift in pure ray tracing loads, not all games in general.

And they didn't specifically compare the two cards on RT performance.
If the 3080 was more than 2X faster than the 2080 in ray tracing workloads, then the relative performance would be advertised to be "up to" more than 2X faster. The fact that 2X is the upper limit implies that this is the limit for ray tracing performance. And that's what testing confirmed.
 
If the 3080 was more than 2X faster than the 2080 in ray tracing workloads, then the relative performance would be advertised to be "up to" more than 2X faster. The fact that 2X is the upper limit implies that this is the limit for ray tracing performance. And that's what testing confirmed.

You're missing the point, NVidia claimed a 2x RT increase for the 3000 series and we get 60-70% on average, with some outliers.

Nvidia have also claimed 2x RT increase for the 4000 series, so we could also likely see the same 60-70% increase, with some outliers.
 
Seems like "12GB" is there for that. Or are we talking about those mythical GPU buyers who base their buying decisions solely on the GPU name on the box?
The amount of VRAM is not some indicator of a performance difference in the GPU itself. It never has been. GPU names like 3070 and 3080 exist for a reason: to differentiate the performance characteristics of the GPU core. Even "Ti" cards.
 
The RT Cores of Ampere have twice the triangle intersection rate of Turing, and the 3080 is more than twice as fast in games like Minecraft and Quake 2 RTX. Lovelace again doubles the triangle intersection rate, adds ~50% higher clock rates, and brings SER (Shader Execution Reordering) for ~25% more performance.
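
Back-of-the-envelope, those claimed factors compound to a theoretical ceiling well above 2x. A rough sketch of the arithmetic (the multipliers are the claims above, not measured numbers):

```python
# Compounding the claimed per-generation RT factors (marketing figures,
# not benchmarks) into a theoretical peak uplift for Lovelace vs Ampere.
intersection_rate = 2.0  # claimed 2x triangle intersection rate
clock_uplift = 1.5       # claimed ~50% higher clock rates
ser_gain = 1.25          # claimed ~25% from Shader Execution Reordering

uplift = intersection_rate * clock_uplift * ser_gain
print(f"Theoretical peak RT uplift: {uplift:.2f}x")  # 3.75x
```

Real games will land well below that peak, of course, since not all frame time is spent in intersection.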
 
You're missing the point, NVidia claimed a 2x RT increase for the 3000 series and we get 60-70% on average, with some outliers.

Nvidia have also claimed 2x RT increase for the 4000 series, so we could also likely see the same 60-70% increase, with some outliers.
The 2X was always an "up to" as far as I can see (even their own slides show an average well below 2X!), and they never specifically said that the boost would apply to every 3000 card. (Obviously the graph was misleading, like their power efficiency figures).

Anyway, I agree that the *average* boost for ray tracing will likely be a lot less.
 
Considering they changed how ray tracing is done in big ways, future RT games might not be specifically optimized for Ampere, and then we may start to see outliers. There's also the case of RT complexity affecting the cards' RT performance: devs might use settings that destroy Ampere to the point where Ada beats it by 2x. It is definitely possible.
 
[Image: GeForce RTX 40 series gaming performance chart]

I'm wondering if DLSS is even enabled on the 3090 Ti in the three rightmost comparisons.
 
The Turing and Ampere launches were not so popular here either (I don't get why console gamers have a stake in this anyway). Really, the only thing truly wrong with these new GPUs is the prices. That has a lot to do with the Ampere backlog and, most of all, zero competition in the GPU space. If AMD launches a competitive enough product, perhaps not as fast or capable in RT/AI etc., but at a much lower price, then that's competition in some ways.
Intel could also bring more competition going forward. Here in the EU we get screwed as usual: we pay 1600 euros for an iPhone, as opposed to 500/600 USD less in the US. The same seems to be happening with the new GPUs. The PS5 got the biggest price increase here too, and the EU has the largest financial problems at the same time.

Also, Ampere GPUs are still great; it's not like they got worse from one day to the next.
 
The fine print says yes.

I also hate the fact that they advertise DLSS performance. DLSS already has issues in Quality mode and these are just exacerbated in Performance mode.

Yes, the numbers Nvidia shared are mostly useless and don't represent how the vast majority of people will use these cards.
 
The fine print leaves room for it not to be enabled TBF.
Not really; they did state that DLSS Super Resolution = DLSS 2.x and thus works on any DLSS-capable card. DLSS 3 is literally DLSS 2.x + Frame Generation (+ Reflex).
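
To make the branding concrete, here's how I'd summarize it (my own reading in a quick snippet, not an official support matrix from Nvidia):

```python
# My reading of the "DLSS 3" bundle (not an official matrix):
# which pieces run on which hardware.
DLSS3_COMPONENTS = {
    "Super Resolution": "any RTX card (this is DLSS 2.x)",
    "Frame Generation": "RTX 40 series only",
    "Reflex": "any GeForce card that supports Reflex",
}

for part, support in DLSS3_COMPONENTS.items():
    print(f"{part}: {support}")
```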
 
About the SER out-of-order thing: isn't this kind of reordering already done by the hardware + driver on current GPUs? Trying to fully use the GPU resources, I mean. It seems like a huge thing, but I can't imagine that it wasn't at least partially done before.
Intel has it:
[Image: Intel slide on their ray tracing thread sorting]
ImgTech had some coherency sorting engine long before RTX, too.
Back then I thought that was ray reordering, but maybe it was only about material sorting as well.
So probably ImgTech was first.
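
For intuition on what that sorting buys you, here's a toy sketch (purely illustrative; the materials and "warp" size are made up, and real SER/thread-sorting hardware reorders threads in flight rather than sorting a list):

```python
from collections import defaultdict

# Toy model: rays are (ray_id, material_id) pairs after their hit is known.
# A SIMD batch that touches fewer distinct materials diverges less.
hits = [(0, "glass"), (1, "metal"), (2, "glass"), (3, "skin"),
        (4, "metal"), (5, "glass"), (6, "skin"), (7, "metal")]

WARP_SIZE = 4  # pretend SIMD width

def batches(rays):
    for i in range(0, len(rays), WARP_SIZE):
        yield rays[i:i + WARP_SIZE]

def divergence(batch):
    # Number of distinct shaders a batch must serialize over.
    return len({mat for _, mat in batch})

unsorted_cost = sum(divergence(b) for b in batches(hits))

# Reorder: bucket rays by material so each batch is (mostly) coherent.
buckets = defaultdict(list)
for hit in hits:
    buckets[hit[1]].append(hit)
reordered = [hit for mat in buckets for hit in buckets[mat]]

sorted_cost = sum(divergence(b) for b in batches(reordered))
print(unsorted_cost, sorted_cost)  # 6 vs 4: fewer shader switches per batch
```

The same idea scales to thousands of threads, where the win from coherent shading and memory access is much bigger.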

Btw, Intel also has support for traversal shaders, which NV still doesn't mention, so I guess they lack it.
Intel's RT really looks advanced and also fast. Maybe they deserve some attention as well.

Regarding Ada, I'd be very interested in the micro polygon feature. It sounds a bit like the BVH is coarse and the geometry in the leaves could be somewhat flexible in the best case.
I hope we hear more about that than we did about 'HW motion blur' for Ampere.
 
Is it only me who watches the two-year-old CP2077 running at 22fps on the still-unreleased most powerful GPU in the world and thinks that maybe RT is still 10 years away?

Nah, it's a good thing. This is 22fps at 4K with RT Psycho++ settings, and you can scale that way down. In 10 years we will have 8-10x the RT performance. That's ~175fps.
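
Spelling out that extrapolation, taking the speculative 8-10x figure at face value:

```python
base_fps = 22            # 4K, RT Psycho settings, per the post above
for uplift in (8, 10):   # hypothetical 10-year RT performance growth
    print(f"{uplift}x -> {base_fps * uplift} fps")  # 176 and 220 fps
```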

I don’t subscribe to this belief that hardware should dominate software. It should always be the other way around since the shelf life of software is much longer. There will be a lot more people playing Cyberpunk 10 years from now than there will be people using 4090s.
 
Your platform of choice will go first, I'm sure.
I do not choose my platform - gamers do.
Let's put the old misunderstandings to rest, please. It really was just that - all along.

My prediction is that PC gaming converges on APUs in general. Steam Deck, Rembrandt, etc. are the first signs, but it will come to desktop as well.
NV looks like they're already preparing for this. Their consumer focus seems to be much more cloud gaming and content creation. At least that's how it looks to me.

But that's just me, and we shall see.
 
The amount of VRAM is not some indicator of a performance difference in the GPU itself. It never has been. GPU names like 3070 and 3080 exist for a reason: to differentiate the performance characteristics of the GPU core. Even "Ti" cards.
The amount of VRAM is a differentiator between models.
Again, let's not pretend that there are people who buy $1000 GPUs based solely on the name alone.
And even if someone did, how exactly is "3080 vs 3080 Ti" any clearer about which card is better than "4080 12GB vs 4080 16GB"?
 
The Turing and Ampere launches were not so popular here either (I don't get why console gamers have a stake in this anyway). Really, the only thing truly wrong with these new GPUs is the prices.

Yeah, the product itself is clearly awesome, but pricing and marketing dictate people's expectations of how awesome it should be, and I think this is where NVIDIA have fallen hard tbh. That 4080 12GB should have been marketed as a 4070. If the price were then even $100 over the 3070, I suspect people would still see that as a good deal. I think even a $150 increase would have been accepted with a few grumblings. But whacking it up by $400 and then renaming it a 4080 to trick people into thinking they're getting better value than they are is where Nvidia have gone wrong IMO. But hey, maybe they simply had no choice on the pricing and people will buy it anyway, in which case what I'm saying doesn't matter.

For my part though, as a pretty serious PC hardware enthusiast of decades who's been itching to get my hands on a new GPU for the last few years and has been waiting on this with bated breath, I'm put off. I was ready to put my money down as soon as these were available before the announcement, but now I'll wait and pray that AMD releases something more compelling. Perhaps that may also have a knock-on effect on Ampere prices, bringing that performance level's pricing in line with what we'd expect following a new generation launch. If that were to happen then I'd be tempted by a high-end Ampere.
 
Is it true that there's no Founders Edition for the not-4070 12GB? So a couple of crappy editions at maybe MSRP, with the majority $50-$150 above MSRP.
 