At playable settings, 8GB was useless for a very long time unless the user purposefully did something to the game that made it need 8GB of VRAM.
It certainly took a while before it started to deliver. 2019 / 2020 was probably when it started to show a benefit beyond outliers, and obviously cross-gen games started to change the landscape. I know many gamers would have moved on to newer cards by then, but I tend to keep my hardware a long time and donate it when I'm done - my GTX 560 1GB is still in use in a friend's computer!
I can totally see why, at the launch of the 570 / 580, many people concluded that the 4GB variants were optimal for them. The GTX 1060 6GB was a good compromise, and a pretty solid card all round. I'd have been very happy with one.
That's such a specific and small use case that I feel it's irrelevant.
To a lot of people it probably is, but for me the capability is a nice bonus.
VRAM is, and always has been, a balancing act. We've had periods where AMD haven't given their GPUs enough VRAM to complement their GPU core performance (HD 5800 series).
And there have also been instances where Nvidia haven't had enough VRAM for their GPU core performance (GTX 670/680).
And then there are bad ports. Is it fair to use those games to gauge whether a card has enough VRAM, or do you go with what the other 99% are using?
The number of AAA games in 2023 that require more than 8GB of VRAM at 1080p with playable settings can be counted on your fingers.
8GB is fine for 1080p with playable settings, and 8GB is fine for 1440p in 99% of games at playable settings.
I agree on pretty much all these points. I do think VRAM requirements will continue to drift up over time for matched console settings, and that 1440p will become a bit tight on 8GB, but absolute stinker ports that look like shit with 8GB of VRAM are a developer-side problem.