What's the definition of a 4K capable GPU? *spawn *embarrassment

Pick a 1080p GPU from yesteryear. Can it play modern games at 1080p? Take Battlefield 1: the minimum spec is a GTX 660, with 1080p performance at >30 fps. If you bought a GTX 660 in 2012 to play 1080p games, you'd still be playing 1080p versions of the latest games on it 6 years later.

When the GTX 660 was launched, 1080p was no longer the holy grail of graphics. I actually had a GTS 450 at the time, a much, much weaker card, and I could play games at 1080p on it at low/medium settings, including Batman: Arkham Asylum with PhysX enabled at >30 FPS. Can you currently do that with a GTX 1050 Ti at 4K? I doubt you can.

Moving on, the GTX 660 could play most games at 1200p with full graphics settings and anti-aliasing at ~60 FPS (even reaching 90 FPS). And what is most impressive, it did so while hamstrung by only 2 GB of VRAM. There were a few 4 GB cards in the wild, and I would have liked to see what it could do with double the memory in the few games that broke its neck (but still ran at >30 fps).

"The GTX 660 provides a surprisingly high level of performance for its MSRP of $230. It's roughly 23% cheaper than last month's GTX 660 Ti while being only 14% slower when running games at 1920x1200."

Can the RTX 2060 play current games at 4K with all the bells and whistles and AA turned on, above 60 FPS? No, it can't, and neither can the RTX 2080 Ti most of the time.

The point is, where the GTX 660's performance stood relative to 1080p is not at all comparable to where the RTX 2060 stands relative to 4K today, so it's totally legitimate to assume that the RTX 2060 will not be as long-lived at 4K as the GTX 660 was at 1080p.
 
I think people are applying their own personal perspectives on what qualifies as a "4K card" a little too much.

Suffice it to say that the RTX 2060 can play games at 4K, at a frame rate somewhat determined by the end user through reducing various quality settings in a particular game.

If you want maximum quality settings in every game at 4K at a minimum of 60 fps, pick a different card.
 
You're reducing the argument to make it fit. Anthem will run on the 2060 with reduced quality. Anthem 2 will probably run on the 2060 at 4K with reduced quality, too.

Pick a 1080p GPU from yesteryear. Can it play modern games at 1080p? Take Battlefield 1: the minimum spec is a GTX 660, with 1080p performance at >30 fps. If you bought a GTX 660 in 2012 to play 1080p games, you'd still be playing 1080p versions of the latest games on it 6 years later.

That's science, and data. Real, hard data showing that a GPU bought for a resolution still games at that resolution a good 6 years later, which is as long as a console generation and a fair lifetime for a GPU. The GTX 1080 is still an option for 4K gaming now and for a few years yet. The RTX 2060 is an option for 4K gaming now and for a few years yet.

That's actually a good example, because no, a 660 is not considered able to play Battlefield 1 by the majority. 30 fps is not considered enough to play a fast-paced multiplayer game by the standards of the majority. Low settings are not the standard of the majority. You can absolutely take any car to your local circuit and race with it, but no one sane would call every car a racing car.

The discussion has never been about the "technicality" that every graphics card can play at 4K if you lower the IQ settings enough, just as the definition of a "racing car" would never revolve around a car "technically" being able to enter a circuit and race. Rather, the discussion was about the responsibility of reviewers who portray those cards as "4K cards", with the implication that they are really good at 4K gaming without sacrificing IQ in games to do so.

No one said reviews shouldn't include 4K testing. Most reviews actually do include it, despite the accusations from some guys here.
 
Looking at the mobile versions in the just-announced MSI laptops, it seems the RTX 2060 model uses the same 180 W adapter that was needed for the GTX 1070 Max-Q model.

It'll be interesting to see how the GTX 1070 Max-Q compares to the laptop RTX 2060, but so far it points to Turing being substantially more power-efficient than Pascal at lower TDPs.

Oh, that. I watched until his complaining got on my nerves.

pharma - "Do you have a link as proof of that statement or is it just bullshit?"
- Link provided.
pharma - "But it's not a mainstream site!"
- With 320k subscribers it's as mainstream as mainstream will ever be.
pharma - "But I don't know the details!"
- Details are in the link you asked for.
pharma - Oh that? I won't watch it because he was complaining!


I should have known better than to go down this hole...


30 fps is not considered enough to play a fast-paced multiplayer game by the standards of the majority. Low settings are not the standard of the majority.

I doubt you have the slightest idea of what the majority does or thinks. You may know a lot about your own anecdotal experiences among close friends, sure. But the majority? Naah...
In my anecdotal experience, most people don't even touch the graphics settings in the PC games they play, from beginning to end. People with lower-end GPUs may just play at low with whatever framerate their GPU can manage.
 
You really have problems, don't you? You need to stop talking out of your arse; the entire point of this discussion is that mainstream review sites received RTX 2060 cards a week before reviews were released.
Continue self-promoting your beloved site as mainstream ... every click helps!
 
So this whole long-winded, derailing argument is because some say "4K gaming" to mean "4K at 60 fps with high-quality settings", and instead of just clarifying that, they harp on about other people being wrong and ridiculous.

This thread is shameful.

The petty bickering over ill-defined semantics, where no one could fairly assume everyone shared their interpretation, and far more interest in throwing out accusations and starting a fight than in explaining a POV. If people were talking about the viability of the 2060 for 4K gaming at high framerates and high detail, why wasn't that explained in one line, negating 65 replies of shit? Instead we got a useless conversation that entailed calling people ridiculous for thinking 4K gaming includes 30 fps RTS and single-player games, and shooters like PUBG and Fortnite at lower quality settings.

Utterly shameful.
 
My post somehow remained in the other thread, so I'm just going to copy it here; please delete the other one. I can't even edit it now.

I doubt you have the slightest idea of what the majority does or thinks. You may know a lot about your own anecdotal experiences among close friends, sure. But the majority? Naah...
In my anecdotal experience, most people don't even touch the graphics settings in the PC games they play, from beginning to end. People with lower-end GPUs may just play at low with whatever framerate their GPU can manage.

1.42% of Steam users have a 4K display.

1.88% have 1080 Ti
2.85% have 1080
1.14% have 1070 Ti
4.16% have 1070

Wow! It would seem that a majority of people buying $500 graphics cards don't do it for 4K gaming... Surely they are all idiots.

OR

they have other standards they consider more important and worthy of a $500+ upgrade than 4K. Any guesses about what those could be, besides IQ settings and frames per second?
 
So this whole long-winded, derailing argument is because some say "4K gaming" to mean "4K at 60 fps with high-quality settings", and instead of just clarifying that, they harp on about other people being wrong and ridiculous.

High fps and high-quality settings are what most people expect when buying a new card. Why even buy a new card otherwise, if you can just lower the settings?

The GTX 960 was once well over 15% of the Steam user base; now it's only 3.56%. How do you explain that other than people upgrading to, e.g., the 1060, which is now over 15%? Did they do it for 4K, which is only 1.42% of Steam? Clearly not. 1440p even (3.89%)? Clearly not. So what is left that all those people upgraded for, if not fps and settings? I had a 960 until about 6 months ago and could play everything at combinations of med/high/very high, so that's clearly not why >>12% of Steam users upgraded.
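A rough sketch of the shift implied by those quoted shares (using 15% as a conservative peak, since the exact peak figure isn't given in the thread):

```python
# Sketch: lower bound on the share of Steam users who moved off the GTX 960,
# taking the shares quoted above at face value ("well over 15%" peak, 3.56% now).
gtx960_peak_share = 15.0   # conservative; the actual peak was higher than this
gtx960_now_share = 3.56
moved_off = gtx960_peak_share - gtx960_now_share
print(f"At least ~{moved_off:.1f}% of Steam users moved off the GTX 960")  # ~11.4%+
```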
 
The 1080 was sold as a 4K card to lots of people who didn't have 4K displays, because they used the same hardware to run games at higher framerates and quality settings instead. A resolution choice says nothing about what settings people use. I asked on this forum what people preferred and the choices were very varied - there's no obvious, clear, de facto standard.

[Attached: forum poll results]

As resolution is just one setting from the triumvirate of res/quality/framerate, I see zero logic in assuming someone wanting a 4K game experience (playing a game at 4K resolution) will be mandating 60 fps at high or better quality. That may be the expectation of elite and competitive PC gamers, but PC gaming is way, way bigger than that. Here's a list of the world's most popular PC games -

https://newzoo.com/insights/rankings/top-20-core-pc-games/

If a GPU can run these games at 4K, 30 fps (60 fps for competitive games) at lowest quality or above, why isn't it a valid option for someone wanting to play these games on a 4K display?

This labelling thing is as silly as calling a GPU a 120 fps card without clarifying at what resolution or quality settings.

Feel free to select GPUs for different purposes based on different criteria - it's perfectly acceptable and logical to talk about GPUs capable of running the latest games at best quality at 4K60 - but don't assume a single parameter is sufficient to qualify that selection and expect everyone to know your selection criteria. If people want to argue the 2060 isn't a valid purchasing choice for those wanting 4K gaming at higher framerates and high quality, be clear on that position.

And most importantly, the response wasn't, "oh yeah, if we'd just been a bit clearer all this mess could have been avoided. We won't repeat that mistake again," but, "no, no. I'm still right to make these assumptions and expect everyone to make them too, and you're just stupid for not making the same assumptions as me."
 
High fps and high-quality settings are what most people expect when buying a new card. Why even buy a new card otherwise, if you can just lower the settings?

The GTX 960 was once well over 15% of the Steam user base; now it's only 3.56%. How do you explain that other than people upgrading to, e.g., the 1060, which is now over 15%? Did they do it for 4K, which is only 1.42% of Steam? Clearly not. 1440p even (3.89%)? Clearly not. So what is left that all those people upgraded for, if not fps and settings? I had a 960 until about 6 months ago and could play everything at combinations of med/high/very high, so that's clearly not why >>12% of Steam users upgraded.

You can't be serious? Some people want max graphics all the time. Some, like me, can live with lowered settings over the years.

The Steam statistics don't back any of your claims. PC gaming is growing, and more users enter the market with entry-level GPUs like the 1060. Your Steam stats don't imply a reason why people upgrade; we could guess anything here.

4K wasn't really a thing for the 960 either. It's a 1080p GPU to begin with. Same as for people upgrading from the base PS4 to the Pro, then.
 
The 1080 was sold as a 4K card to lots of people who didn't have 4K displays, because they used the same hardware to run games at higher framerates and quality settings instead. A resolution choice says nothing about what settings people use. I asked on this forum what people preferred and the choices were very varied - there's no obvious, clear, de facto standard.

[Attached: forum poll results]
You can't be serious, bringing that poll with fewer than 50 respondents on an enthusiast forum as evidence... and expecting me to take it anywhere near as seriously as the Steam hardware survey.

As resolution is just one setting from the triumvirate of res/quality/framerate, I see zero logic in assuming someone wanting a 4K game experience (playing a game at 4K resolution) will be mandating 60 fps at high or better quality. That may be the expectation of elite and competitive PC gamers, but PC gaming is way, way bigger than that. Here's a list of the world's most popular PC games -

https://newzoo.com/insights/rankings/top-20-core-pc-games/

If a GPU can run these games at 4K, 30 fps (60 fps for competitive games) at lowest quality or above, why isn't it a valid option for someone wanting to play these games on a 4K display?

This labelling thing is as silly as calling a GPU a 120 fps card without clarifying at what resolution or quality settings.

Feel free to select GPUs for different purposes based on different criteria - it's perfectly acceptable and logical to talk about GPUs capable of running the latest games at best quality at 4K60 - but don't assume a single parameter is sufficient to qualify that selection and expect everyone to know your selection criteria. If people want to argue the 2060 isn't a valid purchasing choice for those wanting 4K gaming at higher framerates and high quality, be clear on that position.

I am not making any weird selection of criteria; it's what the cold, hard data (which you supposedly love) suggests. Explain the Steam data before you dismiss my criteria. If you have a valid argument for why people keep upgrading their GPU but not their monitor, and why that has nothing to do with IQ and fps expectations, please explain.

And most importantly, the response wasn't, "oh yeah, if we'd just been a bit clearer all this mess could have been avoided. We won't repeat that mistake again," but, "no, no. I'm still right to make these assumptions and expect everyone to make them too, and you're just stupid for not making the same assumptions as me."

I didn't insult you, nor did I make any claim that resembles the above statement... If someone has made this thread shameful, it's you right now, with such claims.
 
why isn't it a valid option for someone wanting to play these games on a 4K display?

I'll answer this more calmly. No one said anything to the contrary. You've not read or understood my position this whole time. I said multiple times that I'm against the labelling of the card as a 4K-capable card, either by reviewers or by the community. I'm against the labelling!!!! Not against whatever use a user wants to give the card.

EDIT: i.e. look at vipa's posts above.
EDIT2: And I didn't think it was necessary, but maybe it's best not to assume anything about what labelling means. If, as you say, "every card is a 4K card as long as it has a 4K output", then the label "4K capable card" has no meaning at all. Since it actually is a label that the media and users often use to refer to not every card but only a selection of them, it has to have some criteria. And I'm using the criteria of what most people choose. Or what I'm 99.99% sure people choose, based on Steam data. You disagree? OK, please debate that.
 
I see this thread is really going places and I don't want to detract from that, but it should be noted that 4K isn't really the be-all and end-all of screen resolutions anymore.

5K and 8K are a thing now, not even counting multi-monitor setups and ultrawides, the latter of which are very mainstream these days.

The point being that you don't, and shouldn't, need the absolute most expensive and most powerful bleeding-edge card just to play at 4K.

I feel like most people with a 4K monitor are just going to buy whatever they can afford and reduce settings until they reach a comfortable frame rate.

Like they always have.
 
He doesn't even recognize the difference between 660>960 and 960>1060. The 10 series came in an era where 4K started to become 'mainstream', as did the mid-gen consoles; in his mind there would be 'irony' in people upgrading from a base PS4 to a Pro to get 4K.
There's no point in discussing this, as there's no seriousness in the whole thread; I have no idea what they gain from the whole thing.

4K is not even remotely close to mainstream today. Just another example of pure nonsense coming out of your mouth.

And see how you can't understand the irony in your claim? No, it has nothing to do with consoles. The irony is that you are debating against my point of not wanting to label cards as "XX capable" or "x resolution cards" based on arbitrary criteria that are only true at a certain point in time, by labelling the 960 a "1080p card". And of course at the same time you're shitting on Shifty Geezer's entire point of view, which is again ironic considering you keep liking his posts. Because the one thing we both agree on, and completely disagree with you about, for obvious reasons, is whether a certain card should be labelled as an "xx resolution card".
 
The point being that you don't, and shouldn't, need the absolute most expensive and most powerful bleeding-edge card just to play at 4K.

I feel like most people with a 4K monitor are just going to buy whatever they can afford and reduce settings until they reach a comfortable frame rate.

Like they always have.

Yes, but that is true of pretty much all cards, not just the 2060 and up, or the 1070 and up, as vendors and reviewers and gamers label the cards. So then, you can't just take the 2060 or 1070 or whichever card you choose to draw the line at, based on 5 games from 2017, and say "oh, 4K card, because it does over 30 fps in 5 games" but at the same time say "you know what, the 1060 is NOT a 4K card, because it does not do 30 fps in those same benchmarks". Either "every card is 4K", as Shifty said, or, if you absolutely must label a card like vipa does ("the 960 is a 1080p card"), as many reviewers and the PC community in general also do, then it is MY OPINION that as a reviewer, before you label a card as "XX capable", you have to be 100% sure that it will meet the requirements expected by your readership, at least while the card is in stores.

And my point regarding the 2060 was that, since they absolutely must label cards as "xx capable" or "non-capable" (since they won't stop labelling them A or B), then based on current and expected future performance the 2060 should fall in the non-4K-capable group, just like the 1060, to name one.
 
The Steam statistics don't back any of your claims. PC gaming is growing, and more users enter the market with entry-level GPUs like the 1060. Your Steam stats don't imply a reason why people upgrade; we could guess anything here.

The 1060 is not an entry-level GPU; the GTX 1050 is. The fact that the GTX 1060 is more popular than the latter shows you that a significant number of people care about graphical quality and performance. Otherwise, why would they spend more money than on a GTX 1050? They could still run many games at low settings at 1080p!
 
I commented on your broad generalization about what the majority of gamers demand in IQ settings and framerates, and you justify that with the 1.42% of Steam users who have 4K displays, or the aggregate of 10% who own high-end GPUs?

Hmm, yes? 1.42% on 4K vs the 10+% (I didn't include AMD, nor the 9 series, nor the RTX series) who own a high-end graphics card. In case you failed math, that means that IF all 4K owners also had a high-end card (which is probably false, even if there's arguably a high attach rate), only 14.2% of high-end card owners are playing at 4K*, while the majority are playing at lower resolutions. Now, what would be the reason to spend $500 to upgrade a graphics card while keeping your low-resolution monitor?

* I mean, that's not even completely true. They have a 4K monitor, which does not automatically translate to them playing at 4K.
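To spell out the arithmetic behind that 14.2% figure (taking the Steam survey shares quoted earlier at face value, and the deliberately generous assumption that every 4K display owner also owns one of those cards):

```python
# Sketch: upper bound on the fraction of high-end Pascal owners gaming at 4K,
# using the Steam survey shares quoted earlier in the thread.
four_k_share = 1.42                          # % of Steam users with a 4K display
high_end_share = 1.88 + 2.85 + 1.14 + 4.16   # 1080 Ti + 1080 + 1070 Ti + 1070 = 10.03%

# Assume (generously) that every 4K display owner also owns one of these cards.
upper_bound = four_k_share / high_end_share * 100
print(f"At most ~{upper_bound:.1f}% of high-end card owners could be gaming at 4K")  # ~14.2%
```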
 
Maybe they should just test all the cards at the same resolution and settings and not make arbitrary decisions about what resolution tier a card belongs to.

At least AnandTech still pits the reviewed card against its predecessors, although they have been less consistent with that lately.

Look at Tom's review of the 590: should a person with a 290X buy that? Is it an upgrade? No good, lol.
 