Nvidia's 3000 Series RTX GPU [3090s with different memory capacity]

I'm not sure how amazing current implementations can even be when people aren't able to discern when a game has it.
How is anybody supposed to discern something on static screenshots when screen-space artifacts manifest themselves mostly in motion?
Try paying attention to screen-space effects during real gameplay and you will easily spot the difference in a second.
 
How is anybody supposed to discern something on static screenshots when screen-space artifacts manifest themselves mostly in motion?
Try paying attention to screen-space effects during real gameplay and you will easily spot the difference in a second.
It was patently obvious there wasn't any RT in the Nioh 2 screenshot and it looked the same as on any other platform. When people look at that extremely unimpressive screenshot and claim the amount of detail is on another level just because it was posted by Nvidia marketing, it casts a huge shadow of doubt over all the subjective claims about what RT currently brings to the table.
 
It was patently obvious there wasn't any RT in the Nioh 2 screenshot and it looked the same as on any other platform.
It can't be patently obvious, because there are lots of effects, such as AO or GI, which are hard to discern without alt-tabbing between screenshots or, better, playing the game.
Any diffuse lighting is hard to discern precisely because it's diffuse. You can tell these effects apart only if you are aware of all the shortcomings screen-space methods have, and most people obviously aren't aware of screen-space techniques at all.
 
it casts a huge shadow of doubt over all the subjective claims about what RT currently brings to the table.
RT brings reflections/GI/etc. for off-screen and backface geometry, that's the obvious part.
You can cast the rays in screen space, but that won't work for off-screen and backface geometry, hence all the artifacts and limitations.
You can combine both to get the best of both worlds: reuse already shaded on-screen pixels where applicable, and capture world-space geometry with BVH tracing to get rid of the screen-space artifacts. In fact, that's how it's done in most games today.
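Roughly, the hybrid control flow looks like the sketch below: reuse a shaded on-screen pixel when the screen-space march finds one, and only pay for the world-space trace on a miss. This is a minimal Python toy; all names and data structures are made up for illustration, and real engines do this in shaders against the depth/color buffers and a hardware BVH.

```python
# Toy illustration of hybrid reflections: screen-space reuse with a
# world-space fallback for rays that leave the screen.
from dataclasses import dataclass

@dataclass
class Hit:
    color: tuple   # shaded color, reused or freshly computed
    source: str    # "screen-space" or "world-space"

def trace_screen_space(ray, depth_buffer, color_buffer):
    """March the ray in screen space; return None if it goes off-screen
    or never crosses the depth buffer (the screen-space failure cases)."""
    x, y = ray["pixel"]
    for _ in range(32):                      # fixed number of march steps
        x += ray["step"][0]
        y += ray["step"][1]
        if not (0 <= x < len(depth_buffer[0]) and 0 <= y < len(depth_buffer)):
            return None                      # ray left the screen -> miss
        if depth_buffer[y][x] <= ray["depth"]:
            return Hit(color_buffer[y][x], "screen-space")  # reuse shaded pixel
    return None

def trace_world_space(ray, scene):
    """Stand-in for a BVH trace: just shades the closest object in a
    world-space scene list (a real implementation traverses a BVH)."""
    closest = min(scene, key=lambda obj: obj["distance"])
    return Hit(closest["albedo"], "world-space")

def shade_reflection(ray, depth_buffer, color_buffer, scene):
    # Prefer the cheap screen-space result; fall back to world space on a miss.
    hit = trace_screen_space(ray, depth_buffer, color_buffer)
    if hit is None:
        hit = trace_world_space(ray, scene)
    return hit

if __name__ == "__main__":
    depth = [[1.0] * 8 for _ in range(8)]
    color = [[(0.5, 0.5, 0.5)] * 8 for _ in range(8)]
    scene = [{"distance": 3.0, "albedo": (0.9, 0.2, 0.2)}]
    # A ray marching off the right edge of the screen -> world-space fallback.
    ray = {"pixel": (6, 4), "step": (1, 0), "depth": 0.5}
    print(shade_reflection(ray, depth, color, scene).source)
```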
 
I can only guess how they were able to measure perf drops in games where Resizable BAR is not supposed to work at all without a corresponding game profile.
NVIDIA Inspector or some similar tool allows you to enable/disable it on a per-application basis at will.
 
NVIDIA Inspector or some similar tool allows you to enable/disable it on a per-application basis at will.
Yeah, but they didn't actually check what would happen if they enabled it in games where they were seeing performance degradation.
AFAIU NV doesn't "disable" ReBAR for games which aren't explicitly allowed to use it - they simply limit the CPU-visible VRAM heap to 256MB. ReBAR is still enabled, and apps which can use it may use it at will.
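To make that concrete: on Linux you can at least sanity-check whether the large host-visible VRAM aperture is exposed at the system level by looking at the PCI BAR sizes in sysfs. This is just a rough sketch under the assumption that the biggest BAR on an NVIDIA card is the VRAM aperture; the per-game 256MB heap limit described above is driver policy on top of that and won't show up here.

```python
# Check whether an NVIDIA GPU's largest PCI BAR exceeds the classic 256 MB
# window, i.e. whether Resizable BAR is active at the system level.
from pathlib import Path

NVIDIA_VENDOR_ID = "0x10de"

def bar_sizes(device_path: Path):
    """Parse /sys/bus/pci/devices/<addr>/resource: one 'start end flags'
    line per BAR region; size = end - start + 1 for non-empty regions."""
    sizes = []
    for line in (device_path / "resource").read_text().splitlines():
        start, end, _flags = (int(x, 16) for x in line.split())
        if end > start:
            sizes.append(end - start + 1)
    return sizes

for dev in Path("/sys/bus/pci/devices").iterdir():
    if (dev / "vendor").read_text().strip() != NVIDIA_VENDOR_ID:
        continue
    largest = max(bar_sizes(dev), default=0)
    status = "resizable BAR likely active" if largest > 256 * 1024**2 else "256 MB window"
    print(f"{dev.name}: largest BAR {largest // 1024**2} MB -> {status}")
```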
 
NVIDIA Inspector or some similar tool allows you to enable/disable it on a per-application basis at will.
ReBAR is listed as an "Unknown" setting in the Inspector, so I'm not sure whether changing it would enable ReBAR at all.
They obviously didn't enable it via Inspector for unsupported games; doing something like that would have been very stupid of them.
 
I'm not sure how amazing current implementations can even be when people aren't able to discern when a game has it.
As I've said in the past, visual fidelity improvements are often discerned by their absence, not their presence. Turning on RT might make you acknowledge some improvement, but turning it off after playing with it for a while will probably elicit a stronger reaction. Of course this depends on the aesthetic sensitivity of the viewer, and also yes the quality of the implementation.

We've seen excellent implementations of real-time RT at multiple degrees of complexity, ranging from full-on path tracing (Minecraft, Q2) to heavy hybrid (Control, Cyberpunk) to well-done light-touch implementations (Insomniac's Spider-Man on PS5). That last example in particular shows what a talented developer can do with relatively modest hardware. So while I agree with your point that the implementation definitely matters, I object to your lumping of "current implementations" into one big bucket in an attempt to disparage the present-day RTRT push.
 
For this very reason, NVIDIA will be releasing a new version of the GeForce RTX 3060 to suppress ETH mining behavior once again: it carries a new hardware ID, and they will probably be paying closer attention to the drivers in the wild this time around. Namely, according to a Taiwanese graphics card manufacturer, NVIDIA has discontinued the old version of the GeForce RTX 3060 with the GA106-300 GPU and replaced it with the new GA106-302 GPU.
https://www.igorslab.de/en/beat-the...nd-hardware-id-against-miner-and-beta-driver/
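The "new hardware ID" is essentially a new PCI device ID for the revised board, which drivers and tools can key on. As a purely illustrative sketch, this is where such an ID shows up on a Linux system (the actual ID values used by GA106-302 boards aren't listed here):

```python
# List the PCI device IDs of installed NVIDIA GPUs via sysfs.
from pathlib import Path

for dev in Path("/sys/bus/pci/devices").iterdir():
    if (dev / "vendor").read_text().strip() == "0x10de":   # NVIDIA's PCI vendor ID
        device_id = (dev / "device").read_text().strip()
        print(f"{dev.name}: device ID {device_id}")
```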
 
That data shows Ampere selling more, relative to Turing, than the Steam share it gained relative to Turing would account for.

Neither AMD nor Nvidia really track actual end usage (or at least they don't disclose it), nor do they want to (plausible deniability). They don't want to associate a revenue boom with something volatile like crypto. With the last mining boom, Nvidia's "gaming" quarterly revenue dropped $700m from the previous quarter, despite that quarter typically being the highest of the year (due to the holidays). Nvidia's official reporting of mining-related revenue, which was rolled into OEM sales, was <$150m in the prior quarter and decreased to $116m.

I guess it's possible that some sort of collective phenomenon happened during the 2018 holidays causing "gamers" to massively shun buying gaming GPUs compared to prior quarters, as opposed to buying more compared to every other holiday quarter before. Maybe some similar phenomenon is causing "gamers" to also surge in demand (even paying multiple times MSRP) for "gaming" GPUs after the 2020 holidays as well.

ETH network hashrate for the first 6 months of Ampere increased 84%. If you only look at the hashrate increase from January to now, it's increased by 80% in 3.5 months. That's in comparison to only 30% for the 4-month period from September to January.

ETH network hashrate for the first 6 months of Turing decreased 43%.

Are literally all GPUs going into mining? I don't think anyone reasonable is suggesting that. But I don't see how people can actually rationalize that the current GPU situation is primarily due to the mining factor unless there is some agenda involved. Even with tight supply, tariffs and other factors affecting the entire industry in general there is a clear outlier of demand/willingness to pay for 1 type of component.
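As a rough sanity check on that line of argument, a hashrate increase can be translated into "GPU equivalents". Every number in the example below is a hypothetical placeholder (none of them come from the figures above); real per-card hashrates vary by model, memory clocks and LHR limiters.

```python
# Back-of-the-envelope: how many cards of a given hashrate would account
# for a network hashrate increase. All inputs here are made-up examples.
def gpu_equivalents(hashrate_increase_mhs: float, per_card_mhs: float) -> int:
    return round(hashrate_increase_mhs / per_card_mhs)

# Hypothetical: a 100 TH/s network increase at an assumed 60 MH/s per card
# would correspond to on the order of ~1.7 million cards.
print(gpu_equivalents(hashrate_increase_mhs=100e6, per_card_mhs=60))
```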
 
They don't want to associate a revenue boom with something volatile like crypto.
AFAIK it's not just about crypto being volatile, it's that that market has no customer retention.
Crypto miners will only buy dGPUs as long as they're profitable. They don't give a crap about drivers and gaming features, and they'll all disappear overnight once mining ceases to be profitable (and worse: they'll start selling all their GPUs in second-hand channels that compete with the IHVs' / OEMs' new releases).

Unless they're e.g. Bitmain, who produce their own hardware to mine their own coins and are able to tune their production according to hash output and crypto market value, anyone who gets a substantial share of their revenue from supplying dGPUs to miners is constantly operating with a rope around their neck.


Nvidia is trying to avoid this by launching GPUs "just for mining", i.e. planned-obsolescence products that can only become e-waste once mining loses its profitability. However, that isn't making a dent in the number of dGPUs being made available for gamers.


I guess it's possible that some sort of collective phenomenon happened during the 2018 holidays causing "gamers" to massively shun buying gaming GPUs compared to prior quarters, as opposed to buying more compared to every other holiday quarter before.

Scott Herkelman from AMD said their market studies pointed exactly to this: whenever dGPU customers go through a long period of not being able to purchase a graphics card, they start to distance themselves from the hobby and a lot of long-time clients tend to not come back.


But I don't see how people can actually rationalize that the current GPU situation is primarily due to the mining factor unless there is some agenda involved.
Wasn't this sentence supposed to be a double negative?
How can the GPU situation not be primarily due to the mining factor? AMD claims the dGPU market will only stabilize once/if ETH goes below $800. Gamers in general won't stand a chance if that doesn't happen (or until ETH goes proof of stake, which IIRC was supposed to have happened a long time ago but somehow it hasn't).
 
Wasn't this sentence supposed to be a double negative?
How can the GPU situation not be primarily due to the mining factor? AMD claims the dGPU market will only stabilize once/if ETH goes below $800. Gamers in general won't stand a chance if that doesn't happen (or until ETH goes proof of stake, which IIRC was supposed to have happened a long time ago but somehow it hasn't).

This has a bigger impact:
The Substrate Crisis Deepens | SEMI

The October 2020 fire at Unimicron’s IC package substrate plant in Taiwan exposed the serious nature of the capacity shortage for IC package substrates. Substrate makers have been reluctant to make large investments in capacity over the last few years due to the fear that demand could decline and they would have excess capacity. Relentless price pressures by customers and the resulting low margins have weakened the finances of substrate suppliers.
With tight capacity, substrate prices have increased and lead times are 14 weeks or more. The most critical shortage is for flip chip ball grid array (FC-BGA) substrates. In addition to increased demand in units, applications such as servers and networking products are seeing requirements for larger body sizes and increased layer counts.


Add in demand like at no time before, a pandemic, water shortages in Asia (affecting fabs)...and mining is not the primary reason.
I could point you to NVIDIA's numbers....but facts seem to not be something used in your guesswork /shrugs

Hell, it is even affecting car production (who mines on their car's "entertainment system"?)

Chip shortage disrupts auto production, but trader has one hedge play (cnbc.com)

Like I said..."reasonable" is sorely lacking online...even in this forum.
 
Everything else aside, how is that "gamers buying up" supposed to work?
Does it include all cards sold in Launch+6 months or just new architecture products for Launch+6 months?
Ampere+6 months has literally zero new-architecture products in the <$299 MSRP bracket. Turing had 2 or 3 and Pascal had 4.
It's only natural for "gamers to buy up" when you limit what's available at the lower price brackets.
 