Seems that miners are also fanatic gamers using Steam.
Yeah... the fallacies in use are hilarious.
Seems that miners are also fanatic gamers using Steam.
They took a bad situation for DIY gamers - the majority being forced to pay well above MSRP for dGPUs due to shortages - and somehow spun it into good news, as if "gamers are getting richer and paying more for our cards"!

Everything else aside, how is that "gamers buying up" supposed to work?
Does it include all cards sold in Launch+6 months or just new architecture products for Launch+6 months?
Since Ampere at launch + 6 months has literally zero new-architecture products in the <$299 MSRP bracket, while Turing had 2 or 3 and Pascal had 4.
It's only natural for "gamers buying up" when you limit what's available at the lower price brackets.
Not only that, they're also skewing the numbers. Ampere has 1.7x the adoption rate of Turing, not 2x.

Gamers don't mine too? I don't get the point. Because Ampere is increasing on Steam, Ampere isn't being bought by miners too?
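Whether the right multiplier is 1.7x or 2x is just a ratio of the two survey shares; the figures below are made-up placeholders (NVIDIA's underlying numbers aren't given in the thread), purely to show the arithmetic being disputed:

```python
# Hypothetical Steam-survey shares at launch + 6 months (placeholder values,
# NOT NVIDIA's actual data) to show how the adoption-rate ratio is computed.
ampere_share = 3.4  # % of Steam users on Ampere six months in (made up)
turing_share = 2.0  # % on Turing at the same point after its launch (made up)

ratio = ampere_share / turing_share
print(f"{ratio:.1f}x adoption rate")  # 1.7x adoption rate
```

With these placeholder shares the ratio comes out to 1.7x, well short of the "over twice" claim.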
NVIDIA also shows that Ampere GeForce RTX 30 graphics cards amounted to over twice the share versus Turing on Steam. The results here are obtained for the same time period after launch, and gamers are flocking to get their hands on RTX 30 series graphics cards regardless of the insane prices that retailers are charging in the current market situation.
I think it's pretty clear from the footnote underneath the graphic itself: „Note: Desktop Card ASP based on MSRP of entire $99+ offering, for first 6 Months after launch“ [edit: missed an f]

Everything else aside, how is that "gamers buying up" supposed to work?
Does it include all cards sold in Launch+6 months or just new architecture products for Launch+6 months?
Since Ampere at launch + 6 months has literally zero new-architecture products in the <$299 MSRP bracket, while Turing had 2 or 3 and Pascal had 4.
It's only natural for "gamers buying up" when you limit what's available at the lower price brackets.
Seems the increase would be even bigger if not for an "active rBAR" problem on RTX cards.

ComputerBase's new Watch Dogs Legion benchmarks.
Performance has improved all around, but more so on RTX GPUs. RX GPUs still have much better frame times.
Why though? The 3090 has always done better against the 6900 at 4K compared to lower resolutions, and it was pretty close at 1440p already.

The situation at 1080p is really absurd. The 3090's jump above the 6900 XT at 4K also raises eyebrows.
I wouldn't say it's faster when the 6900 offers a 50% higher frametime percentile.

The situation at 1080p is really absurd. The 3090's jump above the 6900 XT at 4K also raises eyebrows.
I wouldn't say it's faster when the 6900 offers a 50% higher frametime percentile.
I don't think that IC has anything to do with it, since RDNA1 cards and even GCN cards show similar behavior.

It's interesting from an architectural POV though, maybe showing the strength of Infinity Cache.
While not a certainty, it could point to frame times at various low percentiles being much worse as well. Could be a noticeably less smooth experience.

A 0.2% percentile is not a particularly useful metric for representing the performance of a card or the gaming experience. 2 out of 1000 frames being half as fast says very little: even at 100 fps we're talking about just 2 frames every 10 seconds being "half speed" (vs. 50% longer on the 6900), and at 65 fps (as shown in the charts for the 3090) it's 2 frames every 15 seconds. Even if consecutive, hardly noticeable.
It's interesting from an architectural POV though, maybe showing the strength of Infinity Cache.
Yep, that was close to 500 individual test runs; time to wrap things up. We're not 100% sure what to make of SAM / Resizable BAR just yet. Yes, it has the potential to boost performance a bit, but the plots show that more visibly than you would ever notice on screen. Overall, in the bigger picture, the FPS differences are more evident, but often still fall within normal error margins.
Unlikely. It's only an Ethereum limiter, so there are probably very specific things it detects, probably more than just memory access patterns. Other than ETH you can mine away without limitations, so if it were just memory access patterns, what are the chances something useful would ever match that while broadly similar mining software for other coins isn't affected?

Question regarding the LHR versions of GPUs NVIDIA is now releasing. I don't know much about how they detect the hashing, but I believe it's based on the mining tools' memory access patterns? I'm wondering if there could ever be a possibility in the near future where gaming, rendering, or other non-mining uses could be falsely detected as Ethereum-like mining and cause the GPU to throttle.
We took a look at 1080p and 1440p gaming across a wide range of current-gen GPUs a few weeks ago, and now we're going to dive back in with a focus on 4K and ultrawide (3440×1440). With ten games in hand, we'll explore which cards deliver the kind of performance you're looking for at either of these grueling resolutions.