Nvidia's 3000 Series RTX GPU [3090s with different memory capacity]

There was a previous video of theirs (I think it was just a general news roundup/discussion) where they talked about this, and they basically echoed the NedTechGasm video explanation as the reason: not "this might be why this is happening," but that the 'hardware vs. software scheduler' is why it's happening, and that Nvidia can't 'fix' it until a full hardware refresh, full stop. They seem particularly wedded to that explanation. You would think this would be a good opportunity to reach out to other game developers for their insight, even anonymously.

Again, a simple addition that I think could provide some more insight would be to test all these games at a locked framerate of 60 fps and record per-core CPU usage and frametimes over a run; that way you can see how hard the CPU has to work to deliver the same performance. You still want uncapped results, of course, to determine whether it's actually a 'bottleneck' to the final fps, but this method would at least separate out the CPU workload more clearly, imo.
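
Something like this is what I have in mind for crunching the logs. It's a minimal sketch assuming a hypothetical CSV with one row per frame, a frametime_ms column, and cpu0..cpuN per-core utilization columns; adapt the column names to whatever PresentMon/HWiNFO export you actually capture:

```python
# Sketch: summarize per-core CPU load and frametime consistency from a capped-60fps run.
# Assumes a hypothetical CSV log with columns: frametime_ms, cpu0, cpu1, ..., cpuN (percent busy).
import csv
import statistics

def summarize_run(path):
    frametimes, core_cols, core_loads = [], None, {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if core_cols is None:
                core_cols = [c for c in row if c.startswith("cpu")]
                core_loads = {c: [] for c in core_cols}
            frametimes.append(float(row["frametime_ms"]))
            for c in core_cols:
                core_loads[c].append(float(row[c]))

    frametimes.sort()
    p99 = frametimes[int(0.99 * (len(frametimes) - 1))]
    print(f"avg frametime: {statistics.mean(frametimes):.2f} ms  (99th pct: {p99:.2f} ms)")
    # At a 60 fps cap both vendors should deliver ~16.7 ms; what differs is how busy the CPU is.
    for c in core_cols:
        print(f"{c}: avg {statistics.mean(core_loads[c]):.1f}%  max {max(core_loads[c]):.1f}%")

summarize_run("nvidia_60fps_cap.csv")
summarize_run("amd_60fps_cap.csv")
```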

Yep, I think recording CPU usage at fixed framerates might be illuminating. Maybe run uncapped tests at 720p so the charts don't mix low-end cards that are GPU-limited with high-end cards that are CPU-limited, which would make the scaling more obvious. Maybe take a high-end CPU with a large cache and disable cores to compare against a lower-end CPU with a small cache (R7 vs. R3, i7 vs. i3). Part of the testing was done on an i5 with 9 MB of cache at two different RAM speeds and timings: some games showed Nvidia scaling worse with the slower RAM settings, some were about the same, and in some AMD scaled a little worse. Some games may be more cache-friendly than others, so it might make sense to run that test on an i3 or R3 with a really small cache that will undoubtedly lead to more RAM accesses if one vendor's driver relies more heavily on memory accesses. I feel like there's a lot that could be done to actually figure out what exactly the bottleneck is, but I guess they got the general idea correct enough that people know to be aware of how they pair GPUs with older or slower CPUs if they intend to play competitive games or more CPU-heavy games in the future.
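
To make the cross-configuration comparison concrete, something along these lines would do; the fps numbers below are hypothetical placeholders, not measurements, and the configuration labels are just examples:

```python
# Sketch: quantify how each vendor's driver scales down as the CPU gets weaker.
# Fill in your own uncapped-720p averages per (vendor, CPU configuration), e.g.
# 8-core/big-cache/fast-RAM vs. 4-core/small-cache/slow-RAM on otherwise identical setups.
results = {
    ("nvidia", "8c_32MB_fast_ram"): 180.0,
    ("nvidia", "4c_8MB_slow_ram"):  110.0,
    ("amd",    "8c_32MB_fast_ram"): 175.0,
    ("amd",    "4c_8MB_slow_ram"):  130.0,
}

baseline_cfg, cut_cfg = "8c_32MB_fast_ram", "4c_8MB_slow_ram"
for vendor in ("nvidia", "amd"):
    base = results[(vendor, baseline_cfg)]
    cut = results[(vendor, cut_cfg)]
    print(f"{vendor}: retains {100 * cut / base:.1f}% of baseline fps on the cut-down CPU")
# A noticeably lower retention for one vendor in CPU-bound scenes points at driver overhead
# (scheduling, memory accesses) rather than the GPU itself.
```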
 
GALAX and Gainward release GeForce RTX 3090/3080/3070 and 3060 Ti BIOSes with ResizableBAR support - VideoCardz.com
It is important to note that the software is listed only under very specific GALAX China and Gainward China models, not even the international variants. It is not guaranteed to work, and we strongly advise everyone not to use these tools until their board partner confirms they will work with their card.

The downloaded file does not appear to be digitally signed by NVIDIA, but it does look like the official NVIDIA tool. The software is not really a custom BIOS, but rather a tool that modifies existing BIOSes on the graphics cards. Theoretically, it should work on all RTX 30 graphics cards.

Redditors have already confirmed that the tool works on non-Chinese variants such as GALAX RTX 3090 SG:
[screenshot: GALAX-RTX3090-SG-ReBAR.jpg]
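
For anyone who flashes it, a rough way to sanity-check whether Resizable BAR actually took effect is to look at the BAR1 aperture reported by `nvidia-smi -q`. The heuristic below (BAR1 total at least as large as VRAM means ReBAR is on, versus the usual small 256 MiB aperture without it) is an assumption based on how ReBAR-enabled cards typically report, not anything from the article:

```python
# Sketch: rough check for whether Resizable BAR is active, by comparing the BAR1
# aperture size to total VRAM as reported by `nvidia-smi -q`.
import re
import subprocess

out = subprocess.run(["nvidia-smi", "-q"], capture_output=True, text=True).stdout

def total_mib(section):
    # Grab the "Total : NNNN MiB" line that follows the named memory-usage section.
    m = re.search(rf"{section} Memory Usage\s*\n\s*Total\s*:\s*(\d+) MiB", out)
    return int(m.group(1)) if m else None

vram, bar1 = total_mib("FB"), total_mib("BAR1")
print(f"VRAM: {vram} MiB, BAR1 aperture: {bar1} MiB")
print("Resizable BAR looks", "enabled" if bar1 and vram and bar1 >= vram else "disabled")
```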
 
So what's the story here?
Was it just a laser marking made by mistake? Did Nvidia lower the performance requirements of the RTX 3090 in order to fit GA102-250 GPUs in that card? Does this mean the 3080 Ti is effectively scrapped?
The story is that the 3080 Ti was supposed to launch back in 2020 with the same number of SMs as the 3090 but with a 320-bit bus carrying 20 GB of G6X. That one was scrapped, and some of the chips which were qualified for it went into the 3090 instead; likely not all of them, since some may actually have had defects in two of the disabled MCs.
The "new" 3080 Ti is expected to launch in May with fewer SMs than originally planned, but with a full 384-bit bus carrying only 12 GB of G6X.
 
Steam Data Shows Ampere GPUs Barely Trickling Into Market


As the GPU shortage continues, what constitutes “success” is being rapidly recast. Several publications have recently run stories claiming that an uptick in Ampere GPU deployments according to the Steam Hardware Survey constitutes evidence that these cards are making their way to gamers and that miners aren’t soaking up all the demand.
(...)
The RTX 3070 gained 0.17 percent market share from February 2021 to March 2021. That’s the most market share any GPU gained last month. But according to past Steam data, a single GPU topping out at 0.17 percent adoption isn’t very good at all.

I’ve surveyed multiple data points in the SHS over the past two years. In November 2019, no fewer than nine GPUs gained more than 0.17 percent market share. The RTX 2060 picked up 0.42 percent that month, for example. In February 2020, before the pandemic hit, the GTX 1660 Ti and GTX 1650 gained 0.34 percent and 0.51 percent share, respectively, with other cards above 0.17 percent. Even in March 2020, with the pandemic gearing up, cards like the RTX 2060 (0.51 percent), RTX 2070 (0.31 percent), and RTX 2070 Super (0.28 percent) saw stronger growth than what’s being reported for Ampere today.


Contrary to official statements, this isn't a too-much-demand-for-same-supply issue. It's a supply issue.
Whatever cards are being produced, they're going directly into miners' hands.
 
I was just about to mention that the +0.17% was historically unremarkable for the RTX 3070. It's just the biggest gain right now because not many new GPUs are making it into gaming machines, period.

When I go to Amazon and see Radeon 6800s selling for ~$2,000, it makes me sad. Or seeing listings for OEM 5700 series Radeons selling for $1,400 that are advertised as "optimized for mining." I'd imagine the situation would be the same for NV cards if there were any in stock at Amazon. No gamer in their right mind is paying that much for a card right now, but miners? I'm guessing there are people who think they can make a profit at those prices.
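
The back-of-the-envelope math a miner would be doing looks something like this; every number here is a made-up placeholder, not an actual 2021 price, power draw, or revenue figure:

```python
# Back-of-the-envelope payback estimate a miner might run before paying scalper prices.
# All inputs are hypothetical placeholders: plug in the real card price, daily coin
# revenue, board power, and local electricity cost.
def payback_days(card_price, daily_revenue, watts, usd_per_kwh):
    daily_power_cost = watts / 1000 * 24 * usd_per_kwh
    daily_profit = daily_revenue - daily_power_cost
    return card_price / daily_profit if daily_profit > 0 else float("inf")

days = payback_days(card_price=2000, daily_revenue=8.0, watts=220, usd_per_kwh=0.10)
print(f"~{days:.0f} days to break even (before any coin-price or difficulty changes)")
```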

Regards,
SB
 
Contrary to official statements, this isn't a too-much-demand-for-same-supply issue. It's a supply issue.
Whatever cards are being produced, they're going directly into miners' hands.

If you start with a pre-conceived narrative you can always cherry-pick specific data points that seemingly support that narrative.
Here are some counter-examples to that narrative from the same data source (SHS):
  1. The GTX 1070 (an extremely popular card, $379-$449) launched on 05/27/2016. Five months later, the SHS for 10/2016 shows its adoption at 1.30%: http://web.archive.org/web/20161106071852/https://store.steampowered.com/hwsurvey/videocard/
  2. The RTX 2070 Super (a popular Turing/RTX card, $499) launched on 07/09/2019. Five months later, the SHS for 12/2019 shows its adoption at 0.63%. http://web.archive.org/web/20200103034839/https://store.steampowered.com/hwsurvey/videocard/
The 3070 ($499) reached 1.29% of a presumably larger install base than either of those cards within a similar time frame (launched 10/29/2020, SHS 03/2021): nearly identical to the 1070 and roughly 2x the 2070 Super.
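
Restating just the figures cited above so the comparison is side by side (no numbers beyond the ones already quoted from the SHS snapshots):

```python
# Adoption roughly five months after launch, using only the SHS figures cited above.
five_month_adoption = {
    "GTX 1070 (05/2016 launch, 10/2016 SHS)":        1.30,
    "RTX 2070 Super (07/2019 launch, 12/2019 SHS)":  0.63,
    "RTX 3070 (10/2020 launch, 03/2021 SHS)":        1.29,
}
for card, pct in five_month_adoption.items():
    print(f"{card}: {pct:.2f}% of surveyed systems")
# On this cut of the data the 3070's ramp looks comparable to the 1070's and roughly
# double the 2070 Super's, despite the shortage.
```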
 
No gamer in their right mind is paying that much for a card right now, but miners? I'm guessing there are people who think they can make a profit at those prices.
The minds of gamers matter very little when there are medium-to-large corporations, backed by venture capital, setting up camp in the industrial complexes where the cards are made, making deals for tens of thousands of cards at a time and cutting out all the transport and distribution costs of the "gaming" cards.

If you start with a pre-conceived narrative you can always cherry-pick specific data points that seemingly support that narrative.
Here is Joel Hruska's contact so you can complain to him instead:
https://www.extremetech.com/author/jhruska

Perhaps your post was a reaction to what you interpreted as an anti-Nvidia news post. It wasn't.
He just picked the RTX 3070 because it has recently been lauded as a "market penetration success story". It would be pointless to make such a story about an AMD card because those tend to bug out in the Steam hardware survey.
 
I was just about to mention that the +0.17% was historically unremarkable for the RTX 3070. It's just the biggest gain right now because not many new GPUs are making it into gaming machines, period.

When I go to Amazon and see Radeon 6800s selling for ~$2,000, it makes me sad. Or seeing listings for OEM 5700 series Radeons selling for $1,400 that are advertised as "optimized for mining." I'd imagine the situation would be the same for NV cards if there were any in stock at Amazon. No gamer in their right mind is paying that much for a card right now, but miners? I'm guessing there are people who think they can make a profit at those prices.

Regards,
SB

If I were to upgrade now, I would not look at the price...I never do.

Hobbies cost money and gaming is cheaper than my martial arts hobby.
 