Actually, I'm more worried the situation becomes the reverse, because couch gamers get more chips than chair gamers.
This is actually an important point that Scott Herkelman raised in his PC Gamer live conversation.
He said their market research found that people start to mentally distance themselves from buying PC gaming hardware - and from PC gaming itself - if they go through a long period without being able to buy new hardware. They observed it in 2017 and they're observing it again now.
In other words, mining is taking PC gamers away for good, as apparently they tend not to come back once the market stabilizes.
I'd guess there's also a cyclical tendency for PC gaming's market share to dip for a couple of years whenever a new generation of consoles comes out. However, this time it's a perfect storm:
1 - the new consoles are offering high-end gaming experiences;
2 - the PC CPUs and graphics cards that would be equivalent to said consoles are overpriced. For example, the 6700 XT launched at an MSRP similar to the PS5 and Series X, while back in 2013 the HD 7870 GHz Edition was going for less than $280;
3 - the CPUs/GPUs that would offer a substantial upgrade over the consoles are nowhere to be found (and/or reaching ridiculous prices).
So while AMD and Nvidia are making record profits from GPU sales these quarters, their serviceable available market is shrinking.
Speaking on a personal level, I had planned a major upgrade at the beginning of this year (probably a Ryzen 5900 plus the best deal between Navi 21 or GA102 at around $600), but of course I couldn't get any of those.
Nowadays I'm progressively less inclined to deal with the miner/scalper/availability shitshow, and I certainly don't have the time or patience to follow stock alerts on select Twitter accounts just to rush to an e-store and see the "unavailable" red sign on a product that was already way pricier than my initial budget.
If push comes to shove and they start releasing games I want to play that aren't available on the PS5 (e.g. Elder Scrolls) and don't run well on my old PC hardware, I think I'll just give up on PC hardware and buy an Xbox.
Though, who cares. High-spec hardware no longer seems to make such a big difference anyway.
It usually doesn't at the start of a new console generation (which tends to raise the baseline by ~8x over the previous one), but I do agree the "graphical ROI" has been going down.
Regardless, it's also a good thing for us consumers that high/top-end graphics cards don't offer a big leap over mid-range offerings, because the price of high-end GPUs has been rising steadily at a pace well above inflation and manufacturing costs.
More curious to me is that the rasterisation recommendation is a 1070 or a 5700. The latter is normally ahead of a 1080, isn't it?
Yes, the 5700 is around 25% faster than the 1070, and even the 5600 XT is some 10-15% faster. It's a bit strange they're not mentioning the Vega cards that are contemporaneous with the Pascal models. It's a cross-gen game, so there's probably a very optimized path for GCN GPUs that puts the Vega 56/64 on par with the 1070/1080.
Though as we've been seeing, system requirements lists often have these weird, nonsensical comparisons. Cyberpunk 2077's recommended specs list something like a 4-core Skylake from 2015 or a 6-core Zen 2 from 2019.
Don't read too much into it.
I simply can't figure out the logic behind your claim.
I bet you can.