Nvidia's 3000 Series RTX GPU [3090s with different memory capacity]

Everything else aside, how is that "gamers buying up" supposed to work?
Does it include all cards sold in Launch+6 months or just new architecture products for Launch+6 months?
Since Ampere+6 months has literally zero new-architecture products in the <$299 MSRP bracket. Turing had 2 or 3 and Pascal had 4.
It's only natural for gamers to be "buying up" when you limit what's available at the lower price brackets.
They took a bad situation for DIY gamers - the majority being forced to pay well above MSRP for dGPUs due to shortages - and somehow spun it into good news, as if "gamers are getting richer and paying more for our cards"!
Of course, the moment the 5-year-old GTX 1050 Ti, which debuted at $140, starts selling above $300, you can brag about increasing the number of cards selling for over $299.
At this point they're just celebrating the ripping-off of their largest source of income.


Gamers don't mine too? I don't get the point. Just because Ampere's share is increasing on Steam doesn't mean Ampere isn't being bought by miners as well.
Not only that, they're also skewing the numbers. Ampere has 1.7x the adoption rate of Turing, not 2x.
And it's not like the comparison to Turing is a great accomplishment. They're comparing Ampere to what was probably their family of GPUs with the slowest sales ramp of the past decade. Turing cards released amid harsh criticism over their lower value compared to Pascal cards, which at the time Turing launched were flooding the second-hand market as miners dumped them during the crypto crash.
 
NVIDIA Calls Ampere GeForce RTX 30 Series Graphics Cards Their Best Launch Ever, More Gamers Buying High-End GPUs

https://wccftech.com/nvidia-calls-ampere-geforce-rtx-30-series-graphics-cards-their-best-launch-ever

NVIDIA also shows that Ampere GeForce RTX 30 graphics cards amounted to over twice the share versus Turing on Steam. The results here are obtained for the same time period after launch, and gamers are just flocking to get their hands on RTX 30 series graphics cards regardless of the insane prices that retailers have set in the current market situation.

[Image: NVIDIA GeForce RTX 30 Series "Best Launch Ever" slide]
 
Everything else aside, how is that "gamers buying up" supposed to work?
Does it include all cards sold in Launch+6 months or just new architecture products for Launch+6 months?
Since Ampere+6 months has literally zero new-architecture products in the <$299 MSRP bracket. Turing had 2 or 3 and Pascal had 4.
It's only natural for gamers to be "buying up" when you limit what's available at the lower price brackets.
I think it's pretty clear from the footnote underneath the graphic itself: "Note: Desktop Card ASP based on MSRP of entire $99+ offering, for first 6 months after launch"
 
The situation at 1080p is really absurd. The 3090 jump above the 6900xt at 4K also raises eyebrows.
Why though? The 3090 has always done better against the 6900 at 4K compared to lower resolutions, and it was already pretty close at 1440p.
 
I wouldn't say it's faster when the 6900 offers a 50% higher frametime percentile.

A 0.2% percentile is not a particularly useful metric to represent the performance of a card or the gaming experience. 2 out of 1000 frames being half as fast says very little: even @100 fps we're talking about just 2 frames every 10 seconds being "half speed" (vs. 50% longer on the 6900), and @65 fps (as shown in the charts for the 3090) it's 2 frames every 15 seconds. Even if consecutive, that's hardly noticeable.

It's interesting from an architectural POV though; maybe it's showing the strength of Infinity Cache.
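The back-of-the-envelope math in the post above can be sketched in code. This is a hypothetical helper (the function name and defaults are my own, not from any benchmark tool) that converts a "worst X% of frames" share into how many slow frames you'd actually see per interval:

```python
# Hypothetical sketch: how often a "0.2% low" frame actually occurs
# at a given average frame rate.

def slow_frames_per_interval(fps: float, percentile: float = 0.2,
                             interval_s: float = 10.0) -> float:
    """Expected number of frames falling in the worst `percentile`%
    of frame times within `interval_s` seconds of gameplay."""
    frames = fps * interval_s               # total frames rendered
    return frames * (percentile / 100.0)    # share in the slow tail

# At 100 fps, the slowest 0.2% of frames amount to 2 frames every 10 s.
print(slow_frames_per_interval(100))                  # -> 2.0
# At 65 fps it works out to ~2 frames every ~15 s.
print(slow_frames_per_interval(65, interval_s=15.4))  # ~2.0
```

Which is why a handful of tail frames barely registers at these frame rates, as the post argues.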
 
A 0.2% percentile is not a particularly useful metric to represent the performance of a card or the gaming experience. 2 out of 1000 frames being half as fast says very little: even @100 fps we're talking about just 2 frames every 10 seconds being "half speed" (vs. 50% longer on the 6900), and @65 fps (as shown in the charts for the 3090) it's 2 frames every 15 seconds. Even if consecutive, that's hardly noticeable.

It's interesting from an architectural POV though; maybe it's showing the strength of Infinity Cache.
While not a certainty, it could point to frame times at other low percentiles being much worse as well, which could mean a noticeably less smooth experience.
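For readers unfamiliar with the "0.2% low" figure being debated here, this is a rough sketch of how such a number is typically derived from raw frame times; the exact method varies between reviewers, and the function below is illustrative only:

```python
# Hypothetical sketch of deriving a "0.2% low" frame-time figure
# from raw per-frame render times (milliseconds).

def low_percentile_frametime(frametimes_ms, pct=0.2):
    """Return the frame time (ms) at the start of the worst `pct`%
    of samples, i.e. the threshold of the slow tail."""
    ordered = sorted(frametimes_ms)              # fastest to slowest
    idx = int(len(ordered) * (1 - pct / 100.0))  # index of the slow tail
    idx = min(idx, len(ordered) - 1)             # guard small samples
    return ordered[idx]

# 1000 frames: 998 at 10 ms (~100 fps) plus 2 spikes at 20 ms.
times = [10.0] * 998 + [20.0, 20.0]
print(low_percentile_frametime(times))  # -> 20.0
```

So two 20 ms spikes in 1000 otherwise 10 ms frames is enough to double the reported 0.2% figure, which is why the metric is so sensitive to a handful of outliers.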
 
PCIe Resizable BAR Performance AMD and NVIDIA benchmarks - DX11: Unigine: Superposition (guru3d.com)
Yep, that was close to 500 individual test runs; time to wrap things up. We're not 100% sure what to make of SAM / Resizable BAR just yet. Yes, it has the potential to boost performance a bit, but the plots show that more visibly than you would ever notice on screen. However, the results show that overall, in the bigger picture, the FPS differences are more evident, but often can still fall within normal error margins.
 
Question regarding the LHR versions of GPUs Nvidia is now releasing. I don't know much about how they detect the hashing, but I believe it's based on the mining tools' memory access patterns? I'm wondering if there would ever be a possibility in the near future where gaming, rendering, or other non-mining-related uses could be falsely detected as Ethereum-like mining and cause the GPU to throttle.
 
Question regarding the LHR versions of GPUs Nvidia is now releasing. I don't know much about how they detect the hashing, but I believe it's based on the mining tools' memory access patterns? I'm wondering if there would ever be a possibility in the near future where gaming, rendering, or other non-mining-related uses could be falsely detected as Ethereum-like mining and cause the GPU to throttle.
Unlikely. It's an Ethereum-only limiter, so it probably detects very specific things, likely more than just memory access patterns. Anything other than ETH you can mine away without limitations, so if it were just memory access patterns, what are the chances some useful workload would ever match them while mining software for other, presumably similar coins isn't affected?
 
GeForce & Radeon Ultrawide & 4K Gaming Performance Roundup – Techgage
May 7, 2021
We took a look at 1080p and 1440p gaming across a wide range of current-gen GPUs a few weeks ago, and now, we’re going to dive back in, but with a focus on 4K and ultrawide (3440×1440). With ten games in-hand, we’re going to explore which cards will deliver the kind of performance you’re looking for with either of these grueling resolutions.
 