NVidia Ada Speculation, Rumours and Discussion

I don't agree with any of that.

A. "Significant" isn't a measurable definition. +30% in CPU ST is significant, right? Why is it not in GPU performance?

True, but if we're now to expect performance/$ improvements more in line with CPU scaling than with historical GPU scaling, then we may also see the replacement frequency of GPUs drop to something more in line with that of CPUs, which is generally quite a bit lower, and I'm not sure that would be good for the enthusiast PC gaming market.

Then what are we using when we count this number? Average performance results from the likes of HUB, who still pretty much ignore RT in 2022? TPU results don't include RT either, so what is the baseline for such comparisons? Do I personally care if some new card provides "only" +30% in 1080p in Diablo 3? Is this "insignificant"?

I'm sure everyone will make their own determinations based on what matters to them but ultimately, less performance/$ is less performance/$ regardless of what metric it's in.

Personally if the 4080 12GB ended up 100% faster than the 3080 in RT heavy games but only 30% faster in non-RT games, then I would consider that a huge performance increase, worth paying a bit more for 2 years after the 3080 launch. But I suspect the uplift in both metrics will be much closer to each other than that.

B. +30% at the same price DOES improve perf/price. Nobody anywhere ever promised you that it will be more (or less) with each new generation. If you don't like it then don't buy the new product, it's really that simple.

I'm certainly not saying that we've been promised or are entitled to a specific performance uplift. But we have been trained what to expect from decades of similar product refreshes, and for many of us it's that rapid progress in performance/$ which is one of the most attractive aspects of PC gaming. As you say above, I think many people won't like it and so won't buy the new product, or will skip more generations than they have done in the past. Hence why that's bad for PC gaming.

C. Pascal to Turing was more or less flat on perf/price (1080Ti was slightly behind 2080 at the same launch price). This isn't the same as a +30% gain in performance at the same price.

I was comparing the 1080 to the 2080, where the price went up by $100 with a ~40% performance uplift. But yeah, I didn't consider the 1080Ti, which in hindsight was an amazing deal compared with the 1080. This is just another example of how much worse the PC market has become in performance/$. Pascal's top-end consumer GPU (excluding Titan-class hardware) was $699, Turing's was $999, Ampere's was $1299, and who knows where Ada will end up.
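To put rough numbers on that (a back-of-the-envelope sketch using the Founders Edition launch prices and my ~40% uplift figure above; the exact result obviously depends on which SKUs and which benchmark average you pick):

```python
# Rough perf/$ comparison, GTX 1080 -> RTX 2080.
# FE launch prices; the ~40% average uplift is my own assumption from above.
price_1080, price_2080 = 699, 799   # USD, Founders Edition launch prices
perf_uplift = 1.40                  # ~40% faster on average (assumed)

perf_per_dollar = perf_uplift / (price_2080 / price_1080)
print(f"perf/$ change: {(perf_per_dollar - 1) * 100:+.0f}%")
# -> roughly +22% perf/$ despite the $100 price hike
```

Decent, but a far cry from what many earlier generational jumps delivered at the same or lower prices.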

Then what products are we even talking about? Have I missed the announcement of MSRPs for the upcoming cards? Do we know what else they will bring to the market besides the oh-so-popular "moar FLOPs"? People are too quick to jump at any opportunity to bash a product before they even know anything about it, price included. Remember when the 4080 was expected to draw 500W and everyone and their mother was running around screaming about how unacceptable that was? What happened to that, I wonder.

Hey, if Ada comes in with a more reasonable MSRP - i.e. the 4080 12GB (AKA 4070) at $499-$599 and the real 4080 (16GB) at $699-$799 - with the rumoured/expected performance uplifts, then no-one will be happier than me. I've been first in line for one of these things for almost 2 years now.
 
True, but if we're now to expect performance/$ improvements more in line with CPU scaling than with historical GPU scaling, then we may also see the replacement frequency of GPUs drop to something more in line with that of CPUs, which is generally quite a bit lower, and I'm not sure that would be good for the enthusiast PC gaming market.
I mean what can you do? Rewrite the laws of physics? Start giving away your products for free because that would be "good for the enthusiast PC gaming market"? Would it actually?

Hey, if Ada comes in with a more reasonable MSRP - i.e. the 4080 12GB (AKA 4070) at $499-$599 and the real 4080 (16GB) at $699-$799 - with the rumoured/expected performance uplifts, then no-one will be happier than me. I've been first in line for one of these things for almost 2 years now.
Hence why I think that everyone should chill and wait for now. We don't know much about the products besides TDPs and (fairly useless) FLOPs/units numbers. We're not even sure that 4080 12GB is a thing yet.
 
I don't think there can be a reasonable assessment of these upcoming architectures using merely rasterised gaming performance. I'd like to see solely max-RT used.

God no. The vast majority of titles on the market today don't have significant RT implementations, and 4k 144hz+ displays aren't as esoteric as they used to be. There is plenty of room for significantly increased rasterized performance to provide a meaningful benefit.

That's not to say reviewers should take the Hardware Unboxed line and be dragged kicking and screaming to cover RT at all, but solely? No way.
 
The vast majority of titles on the market today don't have significant RT implementations
Indeed, however there are several points to consider:
1. The number of RT titles is rapidly increasing; there are now over 90.
2. Relatively old titles are adding RT retroactively (even Death Stranding is adding it, Hitman 3 added it, as did Crysis, Resident Evil, etc.).
3. Very old titles are getting path-tracing mods, which come with hefty requirements (see Doom, Serious Sam TFE and Return to Castle Wolfenstein).
4. Upcoming titles are pushing RT effects harder than before (Atomic Heart, Boundary, Forza Motorsport, STALKER 2, Avatar: Frontiers of Pandora, etc.).
5. So far there are 20 titles with significant (and heavy) RT implementations (Minecraft RTX, Cyberpunk, Dying Light 2, Metro Exodus EE, Control, Fortnite, MechWarrior V, Bright Memory Infinite, The Medium, LEGO Builder's Journey, Ghostwire Tokyo, Crysis 3 Remastered, Industria, Sword and Fairy 7, Marvel's Guardians of the Galaxy, Quake 2 RTX, Amid Evil, Hitman 3, Chernobylite, not to mention the Doom and Serious Sam path-tracing mods). The number of these games is increasing rapidly, and we really need the hardware oomph to run them properly at native 4K or even DLSS Quality (see the sketch below for what those internal render resolutions actually are).
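For context on what "native 4K or even DLSS Quality" actually demands, here's a quick sketch of the internal render resolution per DLSS mode at a 4K output (the per-axis scale factors are Nvidia's standard ratios; the little helper is just my own illustration):

```python
# Internal render resolution for each DLSS mode at a 4K output.
# Per-axis scale factors: Quality 1/1.5, Balanced ~1/1.72,
# Performance 1/2, Ultra Performance 1/3.
DLSS_SCALE = {"Quality": 1.5, "Balanced": 1.72,
              "Performance": 2.0, "Ultra Performance": 3.0}

def render_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w / s), round(out_h / s)

for mode in DLSS_SCALE:
    w, h = render_resolution(3840, 2160, mode)
    print(f"{mode:>17}: {w}x{h}")
# Quality           -> 2560x1440
# Performance       -> 1920x1080
# Ultra Performance -> 1280x720
```

So even DLSS Quality at 4K is still pushing a full 1440p worth of ray-traced pixels every frame.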

 
4K 144fps non-RT should be easy for 4090. "Who cares" easy. Same should be true of Navi 31.

So it will be literally irrelevant as a comparison of these two cards in purchasing decision terms.
Absolutely. The RDNA3/Ada battle will be fought on RT, not on raster. Who cares about 150 or 200fps at 4K pure raster? It's irrelevant. But 60 vs 100fps with RT is another story and the real deciding factor...
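The frame-time math backs that up. A quick sketch of the per-frame time each jump actually buys you:

```python
# Milliseconds of frame time saved by each fps jump.
def frame_time_ms(fps):
    return 1000.0 / fps

for lo, hi in [(60, 100), (150, 200)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} fps: {saved:.2f} ms saved per frame")
# 60 -> 100 fps:  6.67 ms
# 150 -> 200 fps: 1.67 ms
```

The RT jump is worth four times as much frame time as the raster one, which is why it's the one you can actually feel.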
 
4K 144fps non-RT should be easy for 4090. "Who cares" easy. Same should be true of Navi 31.

So it will be literally irrelevant as a comparison of these two cards in purchasing decision terms.

Ah ok, my bad, I read your post incorrectly as 'rasterization performance is irrelevant in this generation of cards', meaning that any reviews of, say, a 4090 just by itself shouldn't even bother with non-RT benchmarks.

For the comparison of architectures between AMD and Nvidia though as you said, yeah it makes sense. Both will be more than close enough in rasterization between them that RT performance should definitely be focused on.
 
8K should be on the horizon as well. Nvidia already tried the BFGPU branding with the 3090's launch for 8K gaming, but was soundly mocked since the card struggled without DLSS Ultra Performance mode, and with it on the image was not up to par with native. However, with DLSS Performance mode upscaling from 4K to 8K, the image quality should be much closer.

Also, review sites will have CPU bottleneck issues in quite a few games at 4K with these cards, so they'd need to jump to a higher resolution as well. RT should be a good differentiator, but even that might need increased CPU power to show the difference.

However, 8K 144Hz monitors don't seem to exist yet, and DP2.0, which would be required for 8K displays to really take off, is not confirmed for these cards.
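A rough bandwidth sketch shows why DP2.0 matters here (the link payload figures are the published VESA numbers; the rest is back-of-the-envelope):

```python
# Uncompressed bandwidth for 8K at 144 Hz, 10-bit RGB (before blanking overhead).
w, h, hz, bpp = 7680, 4320, 144, 30   # 10 bits x 3 channels
gbps = w * h * hz * bpp / 1e9
print(f"8K144 10-bit uncompressed: ~{gbps:.0f} Gbit/s")  # ~143 Gbit/s

dp14_payload = 25.92   # Gbit/s, DP 1.4 HBR3 after 8b/10b encoding
dp20_payload = 77.37   # Gbit/s, DP 2.0 UHBR20 after 128b/132b encoding
# Even UHBR20 needs DSC (~3:1 visually lossless) to carry 8K144;
# DP 1.4 can't get there even with compression at that refresh rate.
```

DP 1.4 with DSC tops out around 8K60, so without DP2.0 these cards would be stuck there regardless of how much GPU grunt they have.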
 
4K 144fps non-RT should be easy for 4090. "Who cares" easy. Same should be true of Navi 31.

So it will be literally irrelevant as a comparison of these two cards in purchasing decision terms.

Cyberpunk says hi :)


Seriously though, there are lots of 4K/120 TVs out there now, so it's arguably a mainstream target. We probably have one or two more generations to go before pure rasterization benchmarks are truly useless. For the big-dog flagship cards it's happening much sooner, especially with DLSS and XeSS becoming more prevalent.
 
I still find the narrative interesting: insisting on testing at "max settings" while putting RT in its own bracket.
Yeah. Max settings without RT are not max settings.

RT has a much higher impact on visual fidelity than going from high to ultra raster settings (in many titles, even from medium to ultra) in games with a great RT implementation.

Often, medium Raytracing + high raster settings look almost a generation ahead of Ultra settings without RT. And nearly every GPU capable of HW-RT can run these at 60 FPS in most games with the help of upscaling.

But RT won't be in its own bracket for long. I suspect in the near future you will have RT integrated into the higher settings of a game (for example, setting reflections to high automatically enables RT reflections) and only that makes sense, given how little difference there is in most games with the raster settings.
 