What's the definition of a 4K capable GPU? *spawn *embarrassment

Absolutely no one here made such a statement.

Not as clear cut, but implied by yourself:

It's crystal clear that nvidia wants to convince graphics card customers to spend more money in a product with a much higher ASP if they say they want to play at 4K resolutions.

If it's clear cut that nvidia wants to convince graphics card customers to spend more money to play at 4K, like you said, then you are in fact arguing that nvidia considers the RTX 2060 perfectly fine for 4K, which would let gamers spend less, and hence that it should be branded as such.

But there is more:

The RTX 2060 looks like a pretty competent card to play at 4K

Regarding my remark about chasing windmills, I'm sorry you took it the wrong way, but I was not explicit enough either.

The other part being the completely unnecessary flamebait jabs like this:
Now I'm chasing imaginary windmills because I'm considering the card for myself to play some games at a specific resolution.

I did not intend that as flamebait, but only as an expression of something you finally came to grips with in this answer: that there is no conspiracy here to hide RTX 2060 4K results.

First the accusation was me having an agenda against the card.

Not against the card, no. But let's drop that, yes; it is just "déjà vu" and I'm not expecting any sort of closure.

On that note:

To be honest, I don't think nvidia finds anything inherently wrong with reviewers showing positive or neutral 4K results.

I'm glad you have finally seen that, but that was not what you defended until now:

"I don't believe for a second that nvidia encouraged reviewers to test games at 4K. On the contrary."

PS - It is funny how I'm a bit too expressive sometimes (not always intended as flamebait), but in all honesty you are not that far behind, with very strong affirmations (involving the word "clear" a lot, or "I don't believe for a second"), only to backtrack a short time later on what you said (a little bit, come on, it's true, I'm not trying to flamebait...).
 
Ladies and gentlemen, say hello to the Schrödinger's RTX 2060. Either it can run 4K but NVIDIA doesn't want you to know, or it can't run 4K but NVIDIA doesn't want you to know. The baseline is that NVIDIA doesn't want you to know "something", but to see the performance you really need to open the box!
:???: 4K's not a binary switch. If a video card can output 4K to a 4K display, it's a 4K card. It'll play games at whatever framerate based on the user's settings.

I'd have thought it helpful for consumers to have info on how well any GPU runs at different resolutions and settings, to make an informed decision over which card to buy: whether an alternative in the same family offers better bang-per-buck, whether it's worth stretching for more frames, or whether the competing company offers a better option, particularly if one can grab a good deal somewhere.

What other cards are priced <$350? Are these cards options for someone with a limited budget to play games on a 4K display, or are 2070+ now the only options for people wanting to play 4K?

In terms of cost, a 4K monitor can be got for <£300. A 2060 is £350. A 2070 is £460. I don't believe that people who have £650 to spend on a 4K display and an RT enabled GPU can be assumed to have an extra £100+ to get a 2070. If 4K monitors were £1000, sure, but they're not, and I don't see a fair argument to say people who are price sensitive aren't going to be interested in gaming at 4K.
 
In terms of cost, a 4K monitor can be got for <£300. A 2060 is £350. A 2070 is £460. I don't believe that people who have £650 to spend on a 4K display and an RT enabled GPU can be assumed to have an extra £100+ to get a 2070. If 4K monitors were £1000, sure, but they're not, and I don't see a fair argument to say people who are price sensitive aren't going to be interested in gaming at 4K.
To that, I'll add that many people have 4K TVs and the budget for that is considered "home appliance" money and not gaming PC money.
In practice, it means gamers will often split the bill for a new TV with their spouses/girlfriends/boyfriends, whereas PC monitors come from their own nerd stuff budget.
And.. people often connect their gaming PCs to TVs. It's not that far-fetched.


I do not know the facts,
You could have watched the video you so vehemently asked for the link to...
 
4K's not a binary switch. If a video card can output 4K to a 4K display, it's a 4K card. It'll play games at whatever framerate based on the user's settings.

What's so hard for people to understand about this? If it can run games at lowered settings @4K, at perhaps 30fps with dips, it's still a 4K capable card. It's like some think 4K has become the standard now? It's still a very high resolution, with steep hardware requirements.
If the 1080 was considered a 4K capable GPU then the 2060 is too. Its performance is very close, even @4K.
What about the PS4 Pro/One X? They have much less capable hardware, but they are considered 4K capable, and rightfully so, as both can output 4K games.

I know that we're in a graphics-centric forum, but honestly, why is the discussion about the benefits of RT reflections so limited to IQ?

IQ/visuals probably are the most prominent thing to most; ray tracing on the new GPUs impresses everyone, well, not the naysayers, but many. Played BFV once and I just hope I'll get an RTX GPU sooner or later, perhaps the RTX 30xx series. It's too expensive just now. Screen space reflections are a generation behind now.

Ray tracing isn't only about reflections; there are also shadows, lighting, AO and refractions.

Turing also adds DLSS, variable rate shading and things like AI/deep learning. Much can be done with the new Turing architecture; we just need more software to show it in games, I think. There's a rather large list, but I would like to see that trend continue in the future.
 
If the 1080 was considered a 4K capable GPU then the 2060 is too. Its performance is very close, even @4K.

And again, I disagree vehemently. The 1080 was a 2016 product and could play (some) 2016-2018 games at 4K fairly well. It still plays many, but far fewer of them than when it launched. In a year or 2 will it play new games at 4K? Nope. Highly unlikely. However, it has been a card capable of "being a 4K card" for most of its life. In 2020 the RTX 2060 will most likely still be selling; will it play most games at 4K then, especially with its 6GB of VRAM? Hell no.
 
:???: 4K's not a binary switch. If a video card can output 4K to a 4K display, it's a 4K card. It'll play games at whatever framerate based on the user's settings.

Sure, my Schrödinger's remark was regarding the flip-flopping of opinions about RTX 2060 performance at 4K and the associated shenanigans of NVIDIA constraining reviewers' actions, i.e. claims that nvidia was controlling reviewers either because 4K performance is good enough (therefore competing with the RTX 2080) or because performance is bad (e.g. RE7).

In terms of cost, a 4K monitor can be got for <£300. A 2060 is £350. A 2070 is £460. I don't believe that people who have £650 to spend on a 4K display and an RT enabled GPU can be assumed to have an extra £100+ to get a 2070. If 4K monitors were £1000, sure, but they're not, and I don't see a fair argument to say people who are price sensitive aren't going to be interested in gaming at 4K.

This is where we will have to agree to disagree then. From my own experience and that of my friends, anything north of £150 for a monitor is hugely expensive, never mind £300!

Yes, people who buy TVs might spend a bit more, but they are taking into account the size of the screen which will be much larger than the average monitor, as well as the fact that it has a tuner, smart apps (especially with Android TV now), etc, something most monitors won't have. The difference in value proposition between both product types is huge, really.

I would definitely expect someone who buys a £250+ monitor to get at least an RTX 2070 to go with it. It's not like they have to spend the money all at once! If you have a decent PC already and the monitor, you are just upgrading the GPU...

Edit - Looking at the Steam Survey, FWIW, users with 4K monitors represent only 1.42% of all the users. This should give you an idea of how niche a market 4K still is for PC gaming.
 
Neither will the Pro or One X, yet they are classified as 4K capable now. Even a 2080 will suffer at 4K in about 2/3 years.

Yes they will be able to do 4K, there's no going back from what they are offering right now (which is not true 4K anyways). Console hardware specs are not a moving target, unlike PC.

As for the 2080, in 2/3 years it will be replaced, and by then it would have done its expected job. The 2060 will not be replaced in just a year, and it will most likely start to fail at doing 4K in that timeframe. No one should expect a card like the 2060, with its 6GB and bandwidth, to be able to do 4K in the near future. It doesn't even do such a great job right now...
 
And again, I disagree vehemently. The 1080 was a 2016 product and could play (some) 2016-2018 games at 4K fairly well. It still plays many, but far fewer of them than when it launched. In a year or 2 will it play new games at 4K? Nope. Highly unlikely.
Again, it's not a binary switch. You can lower quality settings to hit playable framerates, or play 4K at lower framerates. You can also play your old and favourite games in 4K. So unless future games absolutely won't run at 4K (fat G buffer that doesn't fit the VRAM, sort of thing), it's still a 4K card.

Technically (and this is still supposed to be a technical forum ;)) a 4K GPU is simply any card that can output 4K, and has enough VRAM to be able to render a 4K output, even if that's 1 fps in some game. If you want to be more exclusive, you can mandate it has to be able to run 30 fps minimum on the lowest quality settings.
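To put that rule in concrete terms, here is a rough Python sketch of it (the 4 GB VRAM floor, the fps figures and the thresholds are made-up illustrations, not any official definition):

def is_4k_capable(can_output_4k, vram_gb, min_fps_lowest_settings, fps_floor=30):
    # Loosest technical reading: the card can drive a 3840x2160 output
    # and has enough VRAM to render a 4K frame at all.
    # The 4 GB VRAM floor is an assumption for illustration, not a spec.
    if not can_output_4k or vram_gb < 4:
        return False
    # Stricter optional test: at least fps_floor fps at the lowest quality settings.
    return min_fps_lowest_settings >= fps_floor

# Made-up numbers for a 2060-class card, purely illustrative:
print(is_4k_capable(True, 6, 35))                  # True with a 30 fps floor
print(is_4k_capable(True, 6, 35, fps_floor=60))    # False if you demand 60 fps

The same card flips from "yes" to "no" purely by changing fps_floor, which is exactly why the definition is fuzzy.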
 
Yes they will be able to do 4K, there's no going back from what they are offering right now (which is not true 4K anyways). Console hardware specs are not a moving target, unlike PC.

The 2060 is faster than the One X and Pro, and will still be, even in 20 years. Btw, RDR2 on One X is offering native 4K; it's the most impressive game so far.
 
Yes they will be able to do 4K, there's no going back from what they are offering right now (which is not true 4K anyways). Console hardware specs are not a moving target, unlike PC.

As for the 2080, in 2/3 years it will be replaced, and by then it would have done its expected job. The 2060 will not be replaced in just a year, and it will most likely start to fail at doing 4K in that timeframe. No one should expect a card like the 2060, with its 6GB and bandwidth, to be able to do 4K in the near future. It doesn't even do such a great job right now...

Just forget it, I said exactly the same things you are saying now pages ago. This will go in circles until everyone is exhausted...
 
Again, it's not a binary switch. You can lower quality settings to hit playable framerates, or play 4K at lower framerates. You can also play your old and favourite games in 4K. So unless future games absolutely won't run at 4K (fat G buffer that doesn't fit the VRAM, sort of thing), it's still a 4K card.

Technically (and this is still supposed to be a technical forum ;)) a 4K GPU is simply any card that can output 4K, and has enough VRAM to be able to render a 4K output, even if that's 1 fps in some game. If you want to be more exclusive, you can mandate it has to be able to run 30 fps minimum on the lowest quality settings.

Come on now. That's not what's being discussed here. I don't think it's in anyone's mind to buy a new card to play pong at 4K...
 
Yay, the lowest of low Intel iGPUs are now considered 4K gaming GPUs!

What's so hard for people to understand about this? If it can run games at lowered settings @4K, at perhaps 30fps with dips, it's still a 4K capable card. It's like some think 4K has become the standard now? It's still a very high resolution, with steep hardware requirements.
What's your break point? If I can run CS:GO at low settings at 15fps, is that considered a 4K gaming GPU then? Generally, neither PC gamers nor reviewers consider 4K30 at high settings on average across titles to be a true 4K gaming GPU. The 60fps target has always been considered the true measure, and it has been that way for a long time. "Can it run Crysis?" has never been about achieving 30fps...

Yes, it's somewhat subjective, and yes, depending on the game and the level of settings you're content with, you could indeed be happy at 4K with a lower-mid GPU. But it's generally not what is considered to be truly a PC 4K gaming experience.
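Just to make the break-point idea concrete, here's a trivial sketch with invented per-title fps numbers (not real benchmarks): the exact same data passes a 30fps bar and fails a 60fps bar.

# Invented per-title 4K "high settings" results, purely illustrative.
fps_by_title = {"Title A": 42, "Title B": 55, "Title C": 38, "Title D": 61}

avg_fps = sum(fps_by_title.values()) / len(fps_by_title)

for target in (30, 60):
    verdict = "passes" if avg_fps >= target else "fails"
    print(f"average {avg_fps:.0f} fps {verdict} a {target} fps target")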
 
Come on now. That's not what's being discussed here. I don't think it's in anyone's mind to buy a new card to play pong at 4K...
You're reducing the argument to make it fit. Anthem will run on a 2060 at 4K with reduced quality. Anthem 2 will probably run on a 2060 at 4K with reduced quality.

Pick a 1080p GPU from yesteryear. Can they play modern games at 1080p? Battlefield 1. Minimum specs: GTX 660. Performance: 1080p at >30fps. If you bought a GTX 660 in 2012 to play 1080p games, you'd still be playing 1080p versions of the latest games on it 6 years later.

That's science, and data. Real hard data that a GPU bought for a resolution still games at that resolution a good 6 years later, which is as long as a console generation and a fair time for a GPU. GTX 1080 is still an option for 4K gaming now and for a few years yet. RTX 2060 is an option for 4K gaming now and a few years yet.
 
Yay, the lowest of low Intel iGPUs are now considered 4K gaming GPUs!

What's your break point?
Precisely. It's stupid to hold GPUs to fuzzy definitions, because users choose their own settings. It's ridonkulous that everyone's so caught up on this and has utterly derailed this thread, which I haven't the time to clean up and spawn into a "what's the definition of a 4K GPU?" thread.

It should be dropped. The facts are that people can choose to buy a 2060 for playing 4K games at a decent framerate, and it will be able to play 4K games at a decent framerate for years to come on the latest software, as borne out by all the GPUs that have come before and continue to game at whatever resolutions. There's no point arguing that. Attention should return to how well the 2060 is performing in games and whatever else these icky PC threads attempt to discuss.
 
Come on now. That's not what's being discussed here. I don't think it's in anyone's mind to buy a new card to play pong at 4K...

The GPU in the Xbox One X provides native 4K, with a quite stable 30fps in RDR2. It has some of the most impressive graphics out there, up there with Sony's AAA games, and that says a lot. That's RX 580/1060 performance.
The 2060 will do just fine at lowest settings at 4K in two years, if not more.

The 60fps target has always been considered the true measure

DF found RDR2 technically very impressive, running at 4K 30fps. 30fps isn't always a bad thing, especially in SP games.

But it's generally not what is considered to be truly a PC 4K gaming experience.

People who buy entry-level Turing products for their price tag might be content with 4K 30fps. You're raising the bar to a minimum of 60fps somehow; that's a steep requirement even for mid-range GPUs. You sure don't like gaming on consoles, do you? There we live with 30fps and upscaled 4K for the most part. There's nothing wrong with that, as 4K is a huge resolution with huge requirements even on today's hardware.

Pick a 1080p GPU from yesteryear. Can they play modern games at 1080p? Battlefield 1. Minimum specs: GTX 660. Performance: 1080p at >30fps. If you bought a GTX 660 in 2012 to play 1080p games, you'd still be playing 1080p versions of the latest games on it 6 years later.

Can confirm this; my PC that's connected to the TV has an MSI GTX 660, and I'm able to run Wolfenstein 2 at 1080p 30fps, quite stable too. Yes, I have to reduce settings, but nowhere below base PS4 settings, which is very acceptable IMO. Same for Doom, which looks and plays even better.
Not my video, but a 760 isn't far off from a 660.



Edit: Saw Shifty's last post after I wrote this one. Won't continue about it :)
 
I disagree with the second sentence. Performance at release may not hold true on future titles throughout the life of a card, thus painting a false picture of the card's capabilities when people, down the line, go read launch-day reviews and expect the 4K performance at launch to still be true many months/2 years later. Cards tested and capable of 4K at launch were "labeled" as "4K cards" in the past, and that label has remained for their lifetime despite not being true anymore. I find painting a mid-range card (regardless of price, it is still mid-range Turing) as 4K capable quite problematic IMHO. Just my opinion though, and I'm not against testing at 4K as an extra data point; I'm kinda against the conclusions that inevitably arise from testing at such a resolution, with cards that may appear to punch above their weight on games that are or will be old through the card's lifetime.

You mean like the 980 Ti? Where NV wanted it tested at 4K?

Or the 1070? Which performs worse at 4K than the 2060, but was tested at 4K at launch?

This is a ridiculous assertion as no graphics card performs the same on future games as it does on past games. :p

This would be like saying no card should ever be tested at anything other than 640x480 because in the future games it won't perform as well at higher resolution as it does now. Oh wait, maybe 640x480 is too high? Perhaps 320x240 would be better? :)

Regards,
SB
 
Precisely. It's stupid to hold GPUs to fuzzy definitions, because users choose their own settings. It's ridonkulous that everyone's so caught up on this and has utterly derailed this thread, which I haven't the time to clean up and spawn into a "what's the definition of a 4K GPU?" thread.

It should be dropped. The facts are that people can choose to buy a 2060 for playing 4K games at a decent framerate, and it will be able to play 4K games at a decent framerate for years to come on the latest software, as borne out by all the GPUs that have come before and continue to game at whatever resolutions. There's no point arguing that. Attention should return to how well the 2060 is performing in games and whatever else these icky PC threads attempt to discuss.


But that is the point. The RTX 2060 is not a 4K card. Because the RTX 2070 isn't either...

Just because someone is able to, or decides to, hook an RTX 2060 up to their 4K television for cinematic movies & arcade games doesn't mean it can push 3840 x 2160 pixels at stable frames. And yes, a stable 60 frames/Hz is the de facto standard for being able to push a particular resolution.

My RTX 2080 is barely ahead of the game at 3440 x 1440 and my Ti struggled too... so how is an RTX 2060 going to handle 4K when an 80 Ti can't handle 2K at stable frames?


I think an RTX 2060 would make a great 4K desktop computer & media machine, for light cinematic gaming or downscaled 1080p stuff.
 