You can't be serious bringing that poll, with fewer than 50 people in an enthusiast forum, as evidence... and expecting me to take it anywhere near as seriously as the Steam hardware survey.
Does the Steam survey report what settings people prefer? My little poll shows there's no clear standard, even amongst hardcore tech enthusiasts. Your data shows what monitors and GPUs people have. You need to assume a lot about how that hardware is used to go from hardware choices to rendering resolutions, so although my little poll is far from statistically significant, the information is at least transparent and doesn't require interpretation.
I also don't see how this line of reasoning applies to the qualification of a card as 4K or not. That the majority of 4K gamers may be using high-end cards doesn't mean no-one will buy a 2060 to play at 4K. In fact, it can be argued that until now no mid-range card has really been that great for 4K gaming. It'd be more useful to look back at 1080p adoption and what cards people bought and played at 1080p with. Given the sales of mid-range and low-end GPUs like the 750 Ti and 1060, I think that matches the notion that people will choose settings to suit their preferences, as per my poll. The only reliable extrapolation is that elite gamers wanting the best will buy high-end cards. The moment a gamer moves outside that elite bracket, it becomes a conventional cost/benefit consideration: how much GPU can they get for their money, and what experience can they get from it by tweaking settings, which is subjective.
If you have a valid argument as to why people keep upgrading their GPU but not their monitor, and that has nothing to do with IQ and fps expectations, please explain.
I don't have a counterargument, because that's what happened and I wouldn't suggest otherwise. Monitors with resolutions higher than 1080p were rare and expensive until recently, as I understand it. Monitors have been stuck at 1080p for a long time, and obviously people have been content to stick with that except for multi-monitor gaming. That didn't stall GPU progress, though, because new GPUs still meant playing games in better quality. Now GPU progress is being matched by display progress, and there's another quality consideration when upgrading a GPU.
Again, I don't see how that applies to GPU classification.
I've said multiple times that I'm against labelling the card as a 4K-capable card, whether by reviewers or the community. I'm against the labelling!!!!
Okay. Your posts have been read (certainly by me) as being in favour of the labels because you were replying to replies to pro-distinction posts. The context was established as "2060 isn't a 4K card," the counterargument to that being "any GPU can be a 4K card," and a counter to that counterpoint naturally reads as support for the original position.
I think people often forget that the ideas they post aren't necessarily seen in the context they intend, because fora consist of multiple people presenting different positions in a flow of ideas that implies connection. I guess it's always good practice to be explicit about what point one is making, and whether one is in agreement, in disagreement, or exploring a different perspective.
This is one of the many reasons why discussions should remain civil: to account for the difficulties of managing a conversation in such a structure, to give people the benefit of the doubt, and, first and foremost, to ask for clarification or proof rather than accusing them of being obtuse, moronic, or trollish.