A GTX 580 doesn't play 2012 or 2013 titles at full details in 1080p anyway, so if you have to lower details/AA, the required memory amount drops too. I don't think the smaller memory hurts longevity unless you want to play at < 30 fps or have very specific games/requirements. For the vast majority of the market, the memory amount on Nvidia cards is perfectly sufficient. If you're serious about gaming, you should upgrade every 2-3 years anyway, at least in my opinion. And then the memory argument is quite moot, since you get a new card with more.
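For context, here's a rough sketch of how much the AA level alone moves the framebuffer footprint at 1080p. The numbers are my own simplified assumptions (32-bit color and 32-bit depth/stencil per MSAA sample, plus two resolved 32-bit buffers), not how any real driver actually allocates:

```python
# Back-of-envelope framebuffer math: my own simplified model, assuming
# 32-bit color and 32-bit depth/stencil per MSAA sample plus two resolved
# 32-bit buffers. Real drivers allocate differently; this only shows the
# trend of AA level vs. memory, before textures and geometry are counted.

def framebuffer_mb(width, height, msaa_samples):
    per_sample = 4 + 4                       # color + depth/stencil, bytes
    msaa_bytes = width * height * msaa_samples * per_sample
    resolved_bytes = width * height * 4 * 2  # front + back buffer
    return (msaa_bytes + resolved_bytes) / (1024 ** 2)

for aa in (1, 2, 4, 8):
    print(f"1080p at {aa}x MSAA: ~{framebuffer_mb(1920, 1080, aa):.0f} MB")
```

Even at 8x MSAA that's on the order of 150 MB, so the framebuffer itself is only a small part of the picture; textures are where the big memory numbers come from.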
Stating that the GTX 580 can't play everything from 2012/2013 maxed out, and that it therefore doesn't need more than 1.5GB of memory, is an awfully generic and ultimately wrong assumption.
First of all, how much a card's longevity matters doesn't depend only on being serious about gaming. It also depends on whether the person can afford to upgrade. Longevity also influences the card's second-hand market price, which is yet another factor that determines if/when a person can upgrade.
Secondly, configuring a game's image quality (IQ) settings isn't a binary "can play/can't play" choice.
It's perfectly possible for the GTX 580 to render spectacular graphics at >60 FPS in recent games while being bottlenecked only by its memory amount.
One such example is Skyrim: it plays completely maxed out on any GPU with performance comparable to the GTX 580's, and here the card's low memory amount ends up limiting the high-resolution texture mods that can be applied. A slower 2GB GeForce GTX 650 Ti will be able to do more than a GTX 580 in this situation, as the rough numbers below show.
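To put rough numbers on the texture-mod point, here's a sketch with my own assumptions (DXT5 compression at 1 byte per texel, a full mipmap chain adding about one third, and an arbitrary round figure of 200 textures resident at once), not measured Skyrim data:

```python
# Rough texture memory sketch: my own assumptions, not measured Skyrim
# data. DXT5 compression costs 1 byte per texel; a full mipmap chain adds
# roughly one third on top; "200 resident textures" is an arbitrary
# round number for illustration.

def texture_mb(side, bytes_per_texel=1.0, mip_overhead=4 / 3):
    return side * side * bytes_per_texel * mip_overhead / (1024 ** 2)

for side in (1024, 2048, 4096):
    each = texture_mb(side)
    total_gb = each * 200 / 1024
    print(f"{side}x{side}: ~{each:.1f} MB each, ~{total_gb:.2f} GB for 200 resident")
```

Under those assumptions a full set of 4K replacements lands around 4 GB resident, which simply doesn't fit in 1.5GB no matter how fast the GPU is, while a 2GB or 3GB card has room to spare for the 2K case.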
Third, the new consoles have iGPUs with performance characteristics very close to the GTX 580's, yet they will have access to over 6GB of memory for graphics. Even if the lack of low-level optimizations stops the GTX 580 from ever performing as well as the consoles, the lack of memory is the only thing that will eventually stop it from playing most games of this generation, even at reduced settings.