NVidia Ada Speculation, Rumours and Discussion

Status
Not open for further replies.
The 3090 will also likely be slower than the 4080/12 while costing more, which makes it a terrible value unless you specifically need those 24GB of VRAM.

But we're going in circles. It's like people are so against Lovelace that they are using whatever arguments they can dream up to justify why. The 3090 is NOT a better buy for a typical gamer unless it falls below $900. And no, you don't need more than 12GB of VRAM for gaming, even in 4K.
 
I'm starting to think about getting a 3080Ti myself on the Nvidia side. That is, unless AMD come out with something that offers sufficiently superior value to make up for the lack of DLSS (it'd have to be a LOT!).

If you're interested in going Ampere/3080Ti, I'd consider the 3090, as that GPU sports double the VRAM (24GB vs 12GB). It's basically the same GPU, but the 3090 has more RT cores, shader units, higher clocks etc. at the same TDP. Here in Sweden they're quite close in price. On the used market they're a bargain for what you get; I'm seeing them go for $750.
 
The 3090 will also likely be slower than the 4080/12 while costing more, which makes it a terrible value unless you specifically need those 24GB of VRAM.

But we're going in circles. It's like people are so against Lovelace that they are using whatever arguments they can dream up to justify why. The 3090 is NOT a better buy for a typical gamer unless it falls below $900. And no, you don't need more than 12GB of VRAM for gaming, even in 4K.

I think the problem is that the ante has been upped. Not in raw raster performance, but in other areas: you get much bigger performance leaps, but also at much higher prices. Someone else on this forum shared an article on just that.
 
I think the problem is that the ante has been upped. Not in raw raster performance, but in other areas: you get much bigger performance leaps, but also at much higher prices. Someone else on this forum shared an article on just that.
I think people straight up underestimate Lovelace's performance, going off Nvidia's DLSS3 graphs and tricking themselves into thinking that there will be like zero performance gains on a 4090 without DLSS3 or something. This whole discourse will likely come to a crashing stop the moment independent benchmarks are released, but I wonder how many potential Lovelace buyers will manage to get themselves 3090s at $1000 before then, only to feel "tricked" when a cheaper 4080/12 ends up being considerably faster even in "pure raster", not to mention RT titles.

At this point all I can say is "WAIT FOR BENCHMARKS". Worst case scenario - you will still get a 3090 but it will likely cost you even less than it does now.
 
I wonder how many potential Lovelace buyers will manage to get themselves 3090s at $1000 before then, only to feel "tricked" when a cheaper 4080/12 ends up being considerably faster even in "pure raster", not to mention RT titles.
Not even NVIDIA's own hand-picked benchmarks suggest it would do that.
 
That's at least what I'm doing: waiting for benchmarks and reviews. I also want to see what the 4070, for example, will do. It's too early to judge Lovelace, I think; the thing isn't even out yet. The only complaints are prices, but it all depends on what kind of performance you're getting.
If DLSS 3 is going to give you that much performance over the previous gen, with no graphical differences aside from tech nerds dissecting the images, then the true gain and winner is there. Along with the hugely improved RT performance, together it's the way forward eventually.
I think we're in a paradigm shift, away from the focus on raw raster. Though Ada still did see a healthy improvement there, even without RT and ML.

Intel is confirmed to be working on a DLSS3 equivalent, so they're on to something.

What if a 4060Ti is going to perform close to what a 3080/Ti does now? Heck, even a 4060 could be quite nice. The average GPU now is a 3060; imagine the average GPU being 3080-level later on. That's not bad at all.
 
If that were the case, then everyone would only be buying 3060s and 3090/Tis. Which is obviously not the case.

I'm not sure I follow the logic here. My point was that if you're willing to spend >£1000 on a GPU then you probably care more about raw performance than you do about value, so a 4090 at 80% more cost and performance might be more tempting.

Sometimes I wonder if people genuinely can't understand that there are products below the fastest one and you don't really have to buy the fastest one to get the new features and performance which would be enough for you. Why would I buy a 4090 if I plan to game on a 1080p TV for the foreseeable future? Rhetorical question.

But the cheapest product sporting those new features is over £1000! i.e. it's priced out of range of all but the highest level PC gaming enthusiasts (or rich people). If there were a normally priced 4xxx series available then believe me, I'd be first in line for it. That's exactly why I'm now considering just getting an Ampere instead which is highly disappointing given I've waited two years for something much faster at the 3070/3080 price point.

Come on, be fair, you can't objectively compare a brand new product that is being launched in the market to a last gen end-of-life SKU that is heavily discounted in order to clear the stock...

That's exactly what I've got to do if I'm trying to understand what the best value product that I can get today is though. Comparing back to the launch price of the 3090 might make sense if we want to understand the relative value of the products at launch but then we have to factor in the fact that we should expect better value at the same price point 2 years down the line. How much better value though isn't something that we can unambiguously measure. The 4090 definitely offers more performance/$ in its launch window than the 3090 did, but after two years, that's expected.

The 3090 will also likely be slower than the 4080/12 while costing more, which makes it a terrible value unless you specifically need those 24GB of VRAM.

But we're going in circles. It's like people are so against Lovelace that they are using whatever arguments they can dream up to justify why. The 3090 is NOT a better buy for a typical gamer unless it falls below $900. And no, you don't need more than 12GB of VRAM for gaming, even in 4K.

Who said the 3090 was better value than a 4080 12GB? I said the 3080Ti is better value (probably). And I'm not against Lovelace in any way, shape or form. I think it's an amazing product. I'm against Nvidia's naming and pricing of it.

If you're interested in going Ampere/3080Ti, I'd consider the 3090, as that GPU sports double the VRAM (24GB vs 12GB). It's basically the same GPU, but the 3090 has more RT cores, shader units, higher clocks etc. at the same TDP. Here in Sweden they're quite close in price. On the used market they're a bargain for what you get; I'm seeing them go for $750.

Unfortunately the 3090 is around £300 more than the 3080Ti over here so not an option for me. The 4080 12GB will probably be cheaper than it which would be the better buy from my point of view (but too expensive overall).

What if a 4060Ti is going to perform close to what a 3080/Ti does now? Heck, even a 4060 could be quite nice. The average GPU now is a 3060; imagine the average GPU being 3080-level later on. That's not bad at all.

I don't hold much hope for the lower tier parts tbh, although I hope to be proven wrong. If a 4080 12GB is £1000 here with 3090-level performance, then I can't see the 4070 being much less than £700 with 3080-level performance, and the 4060Ti coming in at around 3070Ti-level performance for say £500-£600. They might end up being better value than those parts, but at this stage I think I want something a bit faster, i.e. 3080Ti or above performance. Also, it could be months before those slower parts launch, given that Nvidia will want to clear out the Ampere inventory first. 2 years' wait was enough for me!

Really looking forward to the real reviews tomorrow though. I hope I'm wrong about all of this and the 4080 12GB ends up faster than expected and more reasonably priced in the UK.
 
But the cheapest product sporting those new features is over £1000! i.e. it's priced out of range of all but the highest level PC gaming enthusiasts (or rich people). If there were a normally priced 4xxx series available then believe me, I'd be first in line for it. That's exactly why I'm now considering just getting an Ampere instead which is highly disappointing given I've waited two years for something much faster at the 3070/3080 price point.

RTX4000 has only seen the high-end and enthusiast announcement; it's almost certain that a 4060/4060Ti and even a 4070 will be much cheaper and at the same time offer performance in the high-end Ampere range.

Unfortunately the 3090 is around £300 more than the 3080Ti over here so not an option for me. The 4080 12GB will probably be cheaper than it which would be the better buy from my point of view (but too expensive overall).

New, the 3090 is about $70 more here than the 3080Ti. Used, a 3090 can be had for as little as a bit above $700. It's quite tempting, even coming from an ancient 2080Ti. I'd wait if I were in your shoes; see what Lovelace does performance-wise and what the market does to Ampere prices.

I don't hold much hope for the lower tier parts tbh, although I hope to be proven wrong. If a 4080 12GB is £1000 here with 3090-level performance, then I can't see the 4070 being much less than £700 with 3080-level performance, and the 4060Ti coming in at around 3070Ti-level performance for say £500-£600. They might end up being better value than those parts, but at this stage I think I want something a bit faster, i.e. 3080Ti or above performance. Also, it could be months before those slower parts launch, given that Nvidia will want to clear out the Ampere inventory first. 2 years' wait was enough for me!

It's indeed an unknown, and probably depends on what RDNA3 is going to do (and to some extent Intel). But I can imagine the 4060 being a very nice baseline versus the 3060 today.
 
I'm hearing more and more that stock should be pretty good for the 4090 launch on Wed.

More confident in my ability to get one now.

However, now I'm less confident that my case will fit the larger editions... particularly with the adapter cable.. I may have to remove the side panel 😬
 
tricking themselves into thinking that there will be like zero performance gains on a 4090 without DLSS3 or something
NVIDIA is doing it on purpose; they are trying to clear the stock of high-end Ampere, even if they have to downplay Ada's fps uplift. However, synthetic benchmarks point to an 80%+ uplift in DX11, DX12, and RT over the 3090.
 
RTX4000 has only seen the high-end and enthusiast announcement; it's almost certain that a 4060/4060Ti and even a 4070 will be much cheaper and at the same time offer performance in the high-end Ampere range.

Actually, they have already shown the 4070 and 4060Ti cards, they just put the 4080 sticker on both of them. I mean, just look at the die size and TFLOP differences to the 4090 and compare them to previous gens.

Die Size:
AD102 matches GA102, AD103 matches GA104, and AD104 matches GA106. The die sizes differ by only ~5%, but last gen the two smaller chips were sold as the 3070 and 3060Ti; now they are both 4080s.
I am quite sure similar area usage can be seen in older generations as well, between the three largest chips and the performance classes they were sold as.
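As a rough sketch of that comparison, using commonly reported die areas (the exact mm² figures below are approximate and should be treated as such):

```python
# Rough die-area comparison between the Ada chips and the Ampere chips
# they line up with. The mm^2 figures are commonly reported numbers
# and should be treated as approximate.
pairs = {
    "AD102 vs GA102": (608.5, 628.4),
    "AD103 vs GA104": (378.6, 392.5),
    "AD104 vs GA106": (294.5, 276.0),
}

for name, (ada, ampere) in pairs.items():
    diff = (ada - ampere) / ampere * 100
    print(f"{name}: {diff:+.1f}% area difference")
```

All three pairs land within single-digit percentages of each other, which is the point being made here.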

TFLOPS:
Then there are the TFLOP differences between the 4090 and the 4080 cards. An 80-class card has never been this weak compared to the "top tier"; in the past they delivered around 80% of the top tier's TFLOPs.
A ~20% performance difference like the one between the two 4080s would also have been classified at least one, if not two, tiers apart in the past, but now it is supposedly the same performance level.
Just for comparison, the 4080 16GB only delivers 59% of the 4090's TFLOPs, which is in line with previous 70-tier cards (55-60% of the top tier). So we have a 70-class card (sub-$500, btw) priced at $1,199 now, because Nvidia put a 4080 sticker on it.
And the same thing goes for the 4080 12GB: it only delivers 48.5% of the 4090's TFLOPs. A similar performance gap can be seen with the 1060Ti, 2060 and 3060Ti. So it uses a comparable die size to those chips and has a comparable relative performance target to those sub-$400 cards, but Nvidia calls it a 4080 and puts an $899 premium price tag on it.
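As a sanity check on those percentages, here's a small sketch computing FP32 TFLOPs from the announced shader counts and boost clocks (one FMA per core per cycle, i.e. 2 FLOPs; the clock figures are the announced boost clocks and are approximate):

```python
# FP32 TFLOPs = shader cores * boost clock (GHz) * 2 FLOPs (FMA) / 1000.
# Core counts and boost clocks are the announced figures (approximate).
cards = {
    "RTX 4090":      (16384, 2.520),
    "RTX 4080 16GB": ( 9728, 2.505),
    "RTX 4080 12GB": ( 7680, 2.610),
}

def tflops(cores: int, boost_ghz: float) -> float:
    return cores * boost_ghz * 2 / 1000

top = tflops(*cards["RTX 4090"])  # ~82.6 TFLOPs
for name, spec in cards.items():
    t = tflops(*spec)
    print(f"{name}: {t:.1f} TFLOPs ({t / top:.1%} of the 4090)")
```

That reproduces the 59% and 48.5% ratios quoted above.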

And those TFLOP numbers even favor the 40 series, because they take the 4090 as the top tier, despite the 4090 probably taking the biggest cut from the full chip of any 90-class or second-best-tier card; the others are compared to the full-chip 90Ti, 80Ti or TITAN of their generation, whichever was fastest.
 
Where is the above wrong? Why does it matter that the account is new? It is math, the numbers do add up, the only thing that is off is calling it a 4080 and the way too high pricing ;)
 
That's exactly what I'm talking about. Because they actually do suggest that.
How exactly does the RTX 4080 12GB losing in every single non-DLSS3 game to the RTX 3090 Ti suggest it would be considerably faster than the RTX 3090, when the difference between the RTX 3090 and 3090 Ti is small to begin with (3-8% at TPU, depending on resolution)?
It might be on average a tad faster, but considerably? How?
 
Where is the above wrong? Why does it matter that the account is new? It is math, the numbers do add up, the only thing that is off is calling it a 4080 and the way too high pricing ;)

A lot went on before you joined the forum and the discussion; it has been kind of disruptive because, instead of talking about the products' technical merits, things derail into how evil NV is, how doomed they are, etc. That 'the 4080 ain't a 4080' has been repeated a thousand times now. It's just not the best to keep pushing it; it may be good to review the topic beforehand. It can be a good thing to await the reviews and more in-depth benchmarks as well.
What I mean with the new account comment is that you probably haven't really gauged the temperature of the room in here.

If that 4080 12GB is a lie and it's really a 4060/4070 as some claim, then it's giving one hell of a performance if it's competing in raw raster with an RTX 3090, let alone in RT and ML performance. It'd be the largest jump ever for an xx60 product?
 
If that 4080 12GB is a lie and it's really a 4060/4070 as some claim, then it's giving one hell of a performance if it's competing in raw raster with an RTX 3090, let alone in RT and ML performance. It'd be the largest jump ever for an xx60 product?

The problem is that nobody would be paying $900 for a 60-tier card or $1,200 for a 70-tier card, even if it is the biggest leap in history. Which, to be honest, had to be expected with more than a full node shrink.
But look at it the other way around: those 4080s are now overpriced and the slowest 80-tier cards in history compared to the top tier, with the full AD102 not even released yet.

And to be honest, where should the performance magically come from? It is the same architecture, the specs are known, and the boost clocks won't surpass the AD102's by enough to compensate for the missing SMs. Reminder: they are at only 60% and 50% of it, where it should normally be ~80%.

Seeing the big gap between the 4080 16GB and the 4090, how many other 4080s is Nvidia planning to release in the next year? Super, Mega, Ultra and Ti? There is enough room for all of them ;)
 
The problem is that nobody would be paying $900 for a 60-tier card or $1,200 for a 70-tier card

Though it isn't a 60-tier GPU. It isn't in performance, that's for sure. I also doubt the 4070 will be $1,200 when it's announced.

the slowest 80-tier cards in history

Let's await benchmarks on that claim.

Seeing the big gap between the 4080 16GB and the 4090, how many other 4080s is Nvidia planning to release in the next year? Super, Mega, Ultra and Ti? There is enough room for all of them ;)

The most important will be the 4060 and the subsequent 4060Ti. These are the mainstream GPUs in the gamer market; right now that's the 3060. To some extent the 3070 lives there too, so imagine the 4070 being interesting. Anything beyond those GPUs and we're talking high end; it's up for debate whether the high-end/enthusiast market is willing to pay the >4080 12GB prices.
Obviously Nvidia seems to think it's going to be a success, so we will see if they're right or wrong. Maybe they're wrong, and that market will soon cease to exist, making way for Intel and to some extent AMD.
 
Though it isn't a 60-tier GPU. It isn't in performance, that's for sure. I also doubt the 4070 will be $1,200 when it's announced.

They actually are, but they are just labeled 4080s. I also have it as bar graphs to visualize what I wrote above, comparing TFLOPs vs. the biggest chip, again in favor of the 40 series because the 4090 is not a full chip.
[attached bar chart: RTX400fake.png]

Lets await benchmarks to that claim.

Just quote the whole sentence, would you? I never said that ;)
Reading the whole text helps to understand it. And there is more to go along with it in the next paragraph, which you somehow just ignored.
 
They actually are, but they are just labeled 4080s. I also have it as bar graphs to visualize what I wrote above, comparing TFLOPs vs. the biggest chip, again in favor of the 40 series because the 4090 is not a full chip.

It's officially a 4080; whether it performs like one we will see, because TFLOPs don't mean much when comparing new architectures and features.
 