What's the definition of a 4K capable GPU? *spawn *embarrassment

My post somehow remained in the other thread, so I'm just going to copy it here; please delete the other one. I can't even edit it now.



1.42% of Steam users have a 4K display.

1.88% have a 1080 Ti
2.85% have a 1080
1.14% have a 1070 Ti
4.16% have a 1070

Wow! It would seem that a majority of people buying $500 graphics cards don't do it for 4K gaming... Surely they are all idiots.

OR

they have other standards they consider more important and worthy of a $500+ upgrade than 4K. Any guesses about what those could be, besides IQ settings and frames per second?
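To make the comparison behind those numbers explicit, here's a minimal Python sketch using only the survey figures quoted above (treating these four cards as the "high-end $500-class" group is my own assumption):

```python
# Steam hardware survey shares quoted above, in percent of all Steam users.
FOUR_K_DISPLAY_SHARE = 1.42

HIGH_END_CARD_SHARES = {
    "GTX 1080 Ti": 1.88,
    "GTX 1080": 2.85,
    "GTX 1070 Ti": 1.14,
    "GTX 1070": 4.16,
}

high_end_total = sum(HIGH_END_CARD_SHARES.values())  # 10.03%

print(f"High-end card share: {high_end_total:.2f}%")
print(f"4K display share:    {FOUR_K_DISPLAY_SHARE:.2f}%")
print(f"High-end cards per 4K display: {high_end_total / FOUR_K_DISPLAY_SHARE:.1f}x")
```

Roughly seven of these cards in the survey for every 4K display, which is the disparity the post is pointing at.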

Steam statistics aren't a great talking point for something like this, as Steam's surveys are increasingly influenced by China. The influence is enough that it's getting Chinese games with no English localization into their recommendations for me, purely based on tags and how many people are viewing the game (i.e., Chinese-only views are so large that those games end up moving up the recommendation chart over games with an English localization).

As Chinese systems trend towards the bottom end of the spectrum, Steam isn't a reliable predictor of Western hardware install bases anymore, where people are more likely to be able to afford and desire both a 4K display and a GPU capable of playing games at 4K (regardless of whether that is at Ultra IQ settings, low-end IQ settings, or anything in between).

If Steam ever starts to break things down by region, it'll be more relevant to make localized generalizations with it again, but that time isn't now.

Regards,
SB
 

I don't agree with the conclusion that Steam stats aren't a good talking point, not for the reason you gave, anyway. If Western people "are more likely to be able to afford and desire both a 4K display and a GPU capable of playing games at 4K", then the ratio of 4K displays to 4K-capable GPUs stays the same no matter how small both groups are compared to the total. And right now, the percentage of Steam users with high-end cards who also have a 4K display is at most* 14%, which is clear evidence that when the majority of people pay such a high amount of money they are looking for something, and that that something is not 4K gaming. That would be high quality settings and/or fps, because those are the only things a high-end card offers over a lower-end card of the same family.

*That requires every single 4K monitor owner to also be a high-end card owner, which is quite an assumption to make.
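A quick sketch of where that "at most 14%" upper bound comes from, under the footnote's (generous) assumption that every 4K monitor owner also owns one of the high-end cards listed earlier:

```python
# Survey shares quoted earlier, in percent of all Steam users.
four_k_share = 1.42
high_end_share = 1.88 + 2.85 + 1.14 + 4.16  # 1080 Ti + 1080 + 1070 Ti + 1070 = 10.03%

# Generous assumption from the footnote: every 4K monitor owner also owns one of these cards.
upper_bound = four_k_share / high_end_share
print(f"At most {upper_bound:.0%} of high-end card owners can be gaming on a 4K display")
```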
 

Which, if we were to exclude China, might be as high as 28%, give or take. While not a majority, 1/4 of installed hardware isn't something to be hand-waved away.

Considering that AAA games sell to less than 1/4 of the Steam user base (going by numbers I used to track when we could reliably get somewhat accurate Steam software ownership numbers)... What does that say?

The only thing I can say is that I'm one of that potential 25% with a 4K display and a card (GTX 1070) capable of playing at 4K at 60 FPS (I don't need or desire max IQ, but 60 FPS is a must), and I only buy AAA games when they are 75% off or more. The lowest resolution I'll game at is 2400x1500 (almost never anymore), with my average game resolution being 3200x1800, both windowed resolutions.

I'm certainly not in the majority, but no one buying a 2060/2070/2080 is. And that's what this is about. The 2060 is currently NOT a mainstream card. It's priced way too high for that. It's priced like an upper-midrange performance card, just 50 USD less than the GTX 1070 launched at. It's basically targeting the same buyers that were interested in the 1070 last generation. It certainly isn't targeting the buyers that got a 1060 (249 USD launch price), which was at the upper end of the mainstream price bracket.

Regards,
SB
 
This is why moderators have strong urges to ban.
 
Which, if we were to exclude China, might be as high as 28%, give or take. While not a majority, 1/4 of installed hardware isn't something to be hand-waved away.

Are the Chinese more likely to spend $500++ on a card than Western people? Are they somehow more likely to spend $500++ on a GPU than $300 on a 4K monitor, unlike Western buyers? Unless the answer to either of those is a resounding yes, the absence of the Chinese has no effect on the stats.
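A minimal numerical sketch of that point (the user counts below are invented purely for illustration, not survey data): if the excluded group owns high-end GPUs and 4K monitors in the same proportion as everyone else, dropping them leaves the ratio untouched.

```python
# Hypothetical user counts in millions, invented purely to illustrate the argument.
west  = {"4k_displays": 1.0, "high_end_gpus": 5.0}
china = {"4k_displays": 0.3, "high_end_gpus": 1.5}  # same 1:5 ratio, just lower absolute numbers

def displays_per_gpu(*groups):
    """Ratio of 4K displays to high-end GPUs across the given groups."""
    return sum(g["4k_displays"] for g in groups) / sum(g["high_end_gpus"] for g in groups)

print(displays_per_gpu(west, china))  # 0.2 with China included
print(displays_per_gpu(west))         # 0.2 with China excluded -> the ratio doesn't move
```

The ratio only shifts if the excluded group's GPU-to-monitor buying habits differ from everyone else's, which is exactly the question posed above.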
 
You can't be serious bringing up that poll, with fewer than 50 respondents on an enthusiast forum, as evidence... and expecting me to take it anywhere near as seriously as the Steam hardware survey.
Does the Steam survey report on what settings people prefer? My little poll shows there's no clear standard, even amongst hard-core tech enthusiasts. Your data shows what monitors and GPUs people have. You need to assume a lot about hardware use to go from hardware choices to rendering resolutions, so while my little poll is far from statistically significant, the information is at least transparent and doesn't require interpretation.

I also don't see how this line of reasoning applies to the qualification of a card as 4K or not. That the majority of 4K users may be on high-end cards doesn't mean no one will buy a 2060 to play at 4K. In fact it can be argued that until now, no mid-range card has really been that great for 4K gaming. It'd be more useful to look back at 1080p adoption and what cards people bought and played 1080p games on. Given the sales of mid-range and low-end GPUs like the 750 Ti and 1060, I think that matches the notion that people will choose settings to match their preferences, as per my poll. The only reliable extrapolation is that elite gamers wanting the best will buy high-end cards. The moment a gamer moves outside that elite bracket, it's a conventional cost/benefit consideration: how much GPU can they get for their money and what experience can they get from that by tweaking settings, which is subjective.

If you have a valid argument for why people keep upgrading their GPU but not their monitor, and it has nothing to do with IQ and fps expectations, please explain.
I don't have a counter-argument because that's what happened and I wouldn't suggest otherwise. Monitors with higher resolutions than 1080p have been rare and expensive until recently, as I understand it. Monitors have been stuck at 1080p for a long time, and obviously people have been content to stick with that except for multi-monitor gaming. That didn't stall GPU progress, though, because new GPUs meant playing games at better quality. Now GPU progress is being matched by display progress, and there's another quality consideration when upgrading a GPU.

Again, I don't see how that applies to GPU classification.

I said multiple times I'm against the labelling of the card as a 4K capable card, either by reviewers or the community. I'm against the labelling!!!!
Okay. Your posts have been read (certainly by me) as being in favour of the labels because you were replying to replies to pro-distinction posts. The context was established as "2060 isn't a 4K card," the counterargument to that being "any GPU can be a 4K card", and the natural counter to that counterpoint reads as implied support of the first suggestion.

I think people often forget that the ideas they post aren't necessarily seen in the context they intend, because fora consist of multiple people presenting different positions in a flow of ideas that implies connection. I guess it's always good practice to be explicit about what point one is making, and whether one is in agreement, disagreement, or exploring a different perspective.

This is one of the many reasons why discussions should maintain civility: to account for the difficulties of managing a conversation in such a structure, to give people the benefit of the doubt and, first and foremost, to ask for clarification or proof rather than accusing them of being obtuse/moronic/trollish.
 
I also don't see how this line of reasoning applies to the qualification of a card as 4K or not. That the majority of 4K users may be on high-end cards doesn't mean no one will buy a 2060 to play at 4K.

Because I'm not talking about what people will do with them; I never said people shouldn't use it for 4K. I'm talking about the way vendors, reviewers and people label the cards when "marketing" or presenting them to the audience. IMO (and I have stated multiple times that this is just my opinion) they should only do this (label the card as xxx-capable) when there's a pretty good guarantee that it will remain true, or mostly true to the same extent* as it was at review time, for as long as the card stays in stores.

*By this I mean that if a review shows a bunch of tests at Max/Ultra settings, as they usually do, and the conclusion says "4K capable at max settings", people are going to expect that to still be true two years later, when a buyer reads the review before making a purchasing decision (because there will be no newer reviews, or they'll be scarce anyway). That "reality" should remain mostly true. What happens after the card is taken out of the stores doesn't matter, because no purchasing decision can be made on the card. That's why the timeframe is important. And that's why being a midrange card is also important: games will rarely target hardware more capable than what exists (the high end), but they will adjust to it; they will surely max it out. So a card like the 2080 Ti, which starts well above the threshold, is far more likely to remain above it two years later as games become more demanding. A midrange card will simply fall below the threshold, whatever that threshold is.

EDIT: And regarding the classification, would you be OK with Nvidia selling 1050 cards as "4K Gaming Cards"? Surely there are some settings in some games where that would be true, but do you think it would be honest? Would it be responsible of reviewers to agree?
 
I don't care about marketing. This discussion arose regarding benchmarks and whether a card should be evaluated for its 4K performance, with a view to people buying a card for 4K.

Post one, "I wouldn't buy a $350 GPU for 4K gaming."
Post two, "People are 4K gaming on $350 consoles so why shouldn't a $350 GPU be a suitable option for 4K gaming (and thus benchmarked)?"

That has been the discussion as far as I'm concerned, regardless of what stickers IHVs put on boxes to sell them. For a 1050, consumers should be able to find benchmarks and make their own choices. If a benchmark says it can run a game they want at 4K, at quality settings they are happy with, at a framerate they can play at, at a price they can afford, they can buy it for 4K gaming even if the box doesn't recommend it.
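As a toy illustration of that buying logic (the card names, prices and framerates are made-up placeholders, not benchmark results), the "is it a 4K card for me?" question reduces to the buyer's own thresholds:

```python
# Hypothetical benchmark rows: (card, price in USD, average fps at 4K at the buyer's chosen settings).
benchmark_results = [
    ("Card A", 140, 18),
    ("Card B", 350, 42),
    ("Card C", 1200, 75),
]

def four_k_card_for_me(price, fps, budget=400, min_fps=30):
    """Purely personal criteria: affordable and fast enough at the settings I picked."""
    return price <= budget and fps >= min_fps

print([card for card, price, fps in benchmark_results if four_k_card_for_me(price, fps)])
# -> ['Card B']
```

Different buyers plug in different budgets, settings and framerate floors, which is why the answer varies per person rather than per sticker.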
 

Yeah, but it wasn't a reply to those posts that spawned our particular discussion. It was one where you replied to my post explaining why, despite the cards being equally fast, the "4K label" was somewhat more bearable (justifiable, if you want) for the GTX 1080 in 2016 than it is for the 2060 in 2019. And the content of that post was entirely about labels/classification.

I thought that testing at 4K in reviews had been settled. Of course they should, and the good news is that they mostly do test at 4K. As I said in a previous post, AnandTech, Guru3D, PCPer and TechPowerUp all did 4K testing, and only Tom's didn't out of the sites I frequent, but then again they didn't test 4K in their RX 590 review either, so no conspiracy IMO. I'm pretty sure those are some of the largest and oldest sites. I feel like I've been frequenting them since forever. I might be exaggerating if I say I've been reading at least some of them for 20 years, but maybe not.
 