What's the definition of a 4K capable GPU? *spawn *embarrassment

He is obviously arguing that the GTX2060 should run 4K at that price point. It can't be any clearer than that. Why do you focus only on the price point when he clearly mentions the GTX2060?

I still can't see how I got his argument backwards. What he meant is clear as water.

What I meant is clear as water, yes.
It's just that you didn't get it.

First of all, 4K is not an absolute term that perfectly defines all the variables that influence IQ and framerate. It just says how many pixels are being rendered per frame.
Therefore, I'd never say that card X or Y "should run 4K" no matter what the price point. It's a hollow argument from the get-go.

A $350 discrete graphics card should have been tested at 4K, and the price isn't a valid argument because consoles that cost $300-400 run at that resolution. That holds especially true for the Xbox One X, which AFAIK doesn't even have an ID buffer for checkerboard rendering.


New games are most probably heavier to run, so the fact that the GTX2060 has more power than a flagship from two generations ago says absolutely nothing about how it should perform in current games.
What's the excuse for GTA V then?
For example, Tom's Hardware tested GTA V at 1440p and 4K with the 980 Ti, but with the RTX 2060 they tested the same game at 1080p and 1440p.


@Shifty Geezer you still think Totz does not have an axe to grind here? Was he / is he innocently arguing a legit point about a price point, or is he actively just here to cry wolf about an nvidia product like he always does?
The last time I suggested someone purchase a graphics card, it was an RTX 2070, which a good friend of mine did buy and is using ATM. It was a great purchase at the time, even despite the unused features.
I'm interested in the frametimes for the RTX 2060 at 4K because I'm considering buying one for my HTPC, which, lo and behold, is connected to a 4K TV (and currently has an RX 480). On my HTPC I play mostly third-person action games like Tomb Raider, so I don't really care about a solid 60 FPS, and the RT hardware would be convenient for offline renders in my job.


Am I really the one with an axe to grind here? Your very first post earlier today was already loaded, and this last one is 80% flamebait and personal accusations.
Kindly back off or just focus on the arguments, please.
 

Read my post again and you will get your answers.
 
He is obviously arguing that the GTX2060 should run 4K at that price point.
It should run at 4K at that price point and it can run at 4K at that price point. Just change the game settings (to match the consoles). So, as the GPU can run 4K games, why should benchmarks of its 4K performance be ignored? What's needed are benchmarks comparing it at different settings and framerates versus the other cards, which is the very purpose of benchmarks.

Malo is right in that benchmarks versus higher-end cards running high-end settings are not a basis for saying the 2060 is a crap option, but the implication that a $350 card shouldn't be considered for 4K isn't valid either. The 2060 should be benchmarked at different resolutions and different quality settings for fair, widespread comparisons. Although that is predicated on the purpose of benchmarks being fair comparisons and not just fanboy internet pissing contests!

@Shifty Geezer you still think Totz does not have an axe to grind here? Was he / is he innocently arguing a legit point about a price point, or is he actively just here to cry wolf about an nvidia product like he always does?
I don't give a shit about 'axes to grind'. With none of the PC history baggage the long-time posters here are carrying, I'm just seeing an argument about 2060 benchmarking.

Amiable posters should assume a misunderstanding rather than an agenda. This line..."You are just grasping at straws, even going to the point of defending that reviewers should test old games to see if a new card can run them. Do you know how ridiculous that sounds?" is just making an argument. Totz hasn't even made a strong point to be 'grasping at straws' with. He just said the 2060 should be benchmarked at 4K as it's a viable option for PC gamers in the market for a new GPU, wondering what they can get for their money and whether $350 is good enough for a reasonable 4K experience.

I'm even somewhat bemused as to your response and position. Should the 2060 be exempt from 4K benchmarks? How does that help consumers?
 
I'm even somewhat bemused as to your response and position. Should the 2060 be exempt from 4K benchmarks? How does that help consumers?

I'm not saying that it should be exempt, but I'm not shocked reviewers don't do it either. How many people who are interested in and have the money to be on the cutting edge, which 4K still is, would be looking at a $349 card? Or, in other words, for whoever's spending limit is $349, running 4K is a bonus, not the main objective.

This is the kind of thinking reviewers do to better optimise the time they spend on reviews. There is no conspiracy here; it's not significantly different from what reviewers have always done. Equally, GPU manufacturers, both Nvidia and AMD, have always tiered products by target resolution. Why is this concept alien and, OMG, the spawn of Al <ModEdit>, all of a sudden?

Do you think that reviewers should test a GTX1050Ti at 4K as well? Maybe they can reduce the settings and achieve 30 FPS? What about a GT1030? No, connect a laptop that costs more than a console and try that MX150! After all, they are both full systems! Consumers need to know!!!!

See how ridiculous this argument is now?

Edit - If anything, what we should be discussing is the huge increase in price for an x60-tier card, which overshadows the fact that it performs like a GTX1070Ti. Barely any improvement in price/performance.
 
Last edited by a moderator:
It's not ridiculous. It's a fair argument and you present a fair counter-argument. In an ideal world, the 2060 would be benchmarked at a wide range of resolutions and quality settings, but time constraints mean websites are going to limit their attention to targets they think represent their audience. Fair points. It can still be argued that it's reasonable for someone buying a $350 GPU to be interested in gaming at 4K. If that discussion were to proceed in an orderly fashion, people would first present anecdotal evidence as the go-to, followed by stats, if someone finds them, for PC hardware expenditure, monitor resolutions, and even national average incomes and PC gaming populaces.

Plenty of sane debate to be had without needing to make it personal or claiming the view different from yours is absurd. Most importantly, you could have presented your counterpoint in the above fashion without jumping straight to accusations. ;) Would have saved on typing and internet electrons too, making polite discussion the more efficient, eco-friendly option. :D
 
Last edited:
Spawn of Al? <ModEdit>
...
Ok bran whatever. <ModEdit>



Computerbase.de has done a frametime comparison between the 2060 and 2070 on games where they found the 2GB less VRAM makes a substantial difference:
https://www.computerbase.de/2019-01...gramm-final-fantasy-xv-2560-1440-6-gb-vs-8-gb

They only did it at 1080p and 1440p, but in both Final Fantasy XV and CoD: WW2 there are some pretty tall spikes. FFXV shows a ~18ms average with spikes up to 60-70ms, so this is bound to produce stutter.

[Image: Computerbase frametime graph, RTX 2060 vs RTX 2070 in Final Fantasy XV at 1440p]
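To put some rough numbers on why that reads as stutter, here's a minimal sketch with hypothetical frametimes (not Computerbase's raw data): a single 60-70ms frame covers four or more refresh intervals on a 60Hz display, so it shows up as a visible hitch even when the average still looks healthy.

```python
# Minimal sketch with hypothetical frametimes (not Computerbase's data):
# why 60-70 ms spikes in an ~18 ms trace are perceived as stutter.
from statistics import median

frametimes_ms = [18, 17, 19, 18, 18, 65, 18, 17, 18, 18, 70, 19]  # made up

baseline = median(frametimes_ms)                          # typical frame cost
spikes = [t for t in frametimes_ms if t > 3 * baseline]   # crude spike rule

print(f"typical frame: {baseline:.0f} ms (~{1000 / baseline:.0f} FPS)")
for t in spikes:
    # A 70 ms frame is ~14 FPS for that instant and spans 4+ refresh
    # intervals at 60 Hz, i.e. a visible hitch.
    print(f"spike: {t} ms (~{1000 / t:.0f} FPS, "
          f"{t / (1000 / 60):.1f} refresh intervals at 60 Hz)")
```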


Since the RTX 2070 has the same architecture and doesn't show any of those spikes, I'm guessing this is the VRAM filling up and the card sitting idle while waiting for data to stream from main system RAM through the PCIe 3.0 bus.

This is seemingly happening with only a handful of games (high-profile games aren't showing this behavior), so I wonder if nvidia is hard at work doing manual driver optimizations on a per-game basis to avoid filling the VRAM with latency-sensitive data.
Since Turing doesn't have a multi-level memory access organizer like HBCC, maybe they're doing the same as AMD did when they launched Fiji (which was also tight-ish at 4GB at the time of release).
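For a sense of scale on that hypothesis, here's a back-of-envelope sketch; the bandwidth and efficiency figures are my assumptions, not measurements, but they show that shuffling a few hundred MB of assets over PCIe 3.0 x16 mid-game already costs tens of milliseconds, the same ballpark as those spikes.

```python
# Back-of-envelope sketch for the VRAM-overflow idea above.
# Assumed figures (mine, not measured): PCIe 3.0 x16 peaks around ~16 GB/s,
# and real-world streaming reaches only a fraction of that.
PCIE3_X16_PEAK_GBPS = 16.0   # theoretical peak bandwidth, GB/s
EFFECTIVE_FRACTION = 0.7     # assumed practical efficiency

def transfer_ms(megabytes: float) -> float:
    """Rough time to move `megabytes` of assets from system RAM to VRAM, in ms."""
    gigabytes = megabytes / 1024
    return gigabytes / (PCIE3_X16_PEAK_GBPS * EFFECTIVE_FRACTION) * 1000

for mb in (128, 256, 512, 1024):
    # A few hundred MB already takes tens of milliseconds, the same
    # order of magnitude as the observed frametime spikes.
    print(f"{mb:>5} MB over PCIe 3.0 x16 ≈ {transfer_ms(mb):.0f} ms")
```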
 
Last edited by a moderator:
Since the 2060 is more capable at 4K than the 4K consoles, it should be tested at that res too. It's as much a 4K GPU as a 1070 Ti/1080. It will do fine at 4K if you can accept reduced fps/settings.

Comparing spending $349-399 on a console that will always perform at "4K", no matter what, to a $349-399 GPU that will likely struggle in 6 months to keep up with new software pushing the boundaries is obtuse and senseless.

My GTX 670 2GB / i7 920 PC is older than the base PS4 but can still run games at higher fps/settings. 2008 CPU, early 2012 GPU. My GPU was in that price range then.
Wolfenstein and Doom both give a better experience on the 670 PC. Upgrading every 6 months is a thing of the past.
That 2060 is probably in line with PS5 performance.
 
Upgrading every 6 months is a thing of the past.
That 2060 is probably in line with PS5 performance.

Not for enthusiasts, it isn't (as long as there are new cards every 6 months, lol). And even if you are right about slow upgrades, that's precisely why going for the top GPU makes sense for enthusiasts. That's the whole point from the beginning. 4K monitors are still expensive, so whoever has them will surely not settle for a $349 GPU that can play 4K at medium settings in the short term, low settings in the medium term. The x60 range was never for enthusiasts.

I sure hope the PS5 is at least 30% more powerful than a GTX2060.
 
Last edited:
I hope not...

Same here, but it's not too unrealistic, the 2060 being like a 1080, even beating it. A while ago people thought ballpark 1080 performance wasn't far off what could be in there. TFLOPs don't say much.
It all depends on what AMD has with Navi.
I'm off-topic with this though :)
 
Spawn of evil?
...
Ok bro whatever.

Yup, that's a hyperbole reflecting how hyperbolic I considered your arguments to be, in light of what has always been the practice (most reviewers didn't test the GTX1060 at 4K either, for example, and that includes the Tom's Hardware from your argument). You implied that reviewers were somehow in bed with nvidia for not testing the GTX2060 at 4K so nvidia can direct buyers to the more expensive options. In my view that's nonsense, hence my sassiness.
 
Last edited:
Same here, but it's not too unrealistic, the 2060 being like a 1080, even beating it. A while ago people thought ballpark 1080 performance wasn't far off what could be in there. TFLOPs don't say much.
It all depends on what AMD has with Navi.
I'm off-topic with this though :)
So the RTX 2060 is faster than the GTX 1080 already? Yesterday, when I checked the reviews, average performance was at the level of the GTX 1070 Ti. I'd say the RTX 2060 is faster in high-FPS situations, slower in low-FPS situations. Significantly faster than the GTX 1070, but in no way better than the GTX 1080. As for the GTX 1070 Ti, its 8GB memory and more stable performance seem to make it a better solution.
 
Time constraints intentionally forced upon reviewers by Nvidia, who know what they're doing and get the cards to them just before CES, when any embargoes are lifted. So they get a day to do tests and record content, knowing they have to get the review out and then attend CES.

Ok, just to clear up the 4K conversation, it started with me writing this:

It also seems very clear that nvidia gave strong instructions to reviewers to stop them from showing results of the RTX 2060 at 4K, probably to avoid graphs like this:

[Image: 4K benchmark graph showing RTX 2060 results]




When sending out new GPUs to reviewers, IHVs provide them with guidelines on how to review the hardware. It's a fact that these guidelines exist, as it's also a fact that the tables they provide show the resolutions/settings they want the cards to be tested at.
I'll stand by what I wrote about these guidelines for the RTX2060 being clear about testing at 1080p and 1440p. I don't believe for a second that nvidia encouraged reviewers to test games at 4K. On the contrary.

And they did so to avoid showing some embarrassing results like that one on Resident Evil, and not because the RTX 2060 is inherently unable to run many games at 4K with high/ultra settings (it's not).
It's also not because it has "60" in the name, nor because it costs $350.
Name and price don't fully determine how far a card can go. Performance does.
 

Performance has always dictated name and price, so name and price create expectations of what the performance is. No matter how you twist it, you know it's true.
 
Point is, neither name nor price is an indicator of what resolution should be used for testing. Performance at time of release is.

I disagree with the second sentence. Performance at release may not hold true on future titles throughout the life of a card, thus painting a false picture of the card's capabilities when people, down the line, go read launch-day reviews and expect that 4K performance at launch to still be true many months or two years later. Cards tested and found capable of 4K at launch were "labeled" as "4K cards" in the past, and that label has remained for their lifetime despite no longer being true. I find painting a mid-range card (regardless of price, it is still mid-range Turing) as 4K capable quite problematic IMHO. Just my opinion tho, and I'm not against testing at 4K as an extra data point; I'm kinda against the conclusions that inevitably arise from testing at such a resolution, cards that may appear to punch above their weight on games that are or will be old through the card's lifetime.
 
If we take a 40 FPS average as "minimum acceptable performance", the $300 GTX 260 was a card for playing demanding titles in 2008 at 1280*1024. In 2011 the $200 GTX 560 raised those stakes to 1680*1050. In 2013 the $250 GTX 760 raised the resolution to 1920*1080. In 2016 the $300 GTX 1060 drove that resolution threshold up again to 2560*1440.
There's nothing in the "xx60" name or its $350 price that says it can't/shouldn't run games at whatever resolution. If nvidia keeps up this naming scheme (and we don't all transition to GaaS), a point in time will come where the xx80 card is meant for dual 5K VR at 90FPS + reprojection, and it won't even make any sense to test the $99-$999 xx60 of that family at anything lower than 4K.

And yet none of those jumps happened between two consecutive generations.

Between the GTX260 and the GTX560 there was the GTS8600, the undying variations of G92 and the GTX460.

Between the GTX560 and the GTX760 there was the GTX660.

Between the GTX760 and the GTX1060 there was the GTX960.

In the context of the RTX series, where 4K is still the top "doable" resolution with a single GPU, it's only natural that a mid-range GPU (it can be argued whether a $349 price is dead-centre middle, but it's not high end) is not expected to perform great at 4K.

Just like you said, there will be a point in time when it will not make sense not to test an xx60 at 4K. But it is not today, when even a GTX2080Ti has trouble running some games at 4K with all the bells and whistles!

Why do we keep beating around the bush with this? It's just common sense.

Look, if you think that the GTX2060 should be tested at 4K, fine. Build your own website, buy the card and test it yourself. Enough with tilting at imaginary windmills.
 
Last edited:
I disagree with the second sentence. Performance at release may not hold true on future titles throughout the life of a card, thus painting a false picture of the card's capabilities when people, down the line, go read launch-day reviews and expect that 4K performance at launch to still be true many months or two years later. Cards tested and found capable of 4K at launch were "labeled" as "4K cards" in the past, and that label has remained for their lifetime despite no longer being true. I find painting a mid-range card (regardless of price, it is still mid-range Turing) as 4K capable quite problematic IMHO. Just my opinion tho, and I'm not against testing at 4K as an extra data point; I'm kinda against the conclusions that inevitably arise from testing at such a resolution, cards that may appear to punch above their weight on games that are or will be old through the card's lifetime.

Exactly! You can bet that if nvidia had branded the GTX2060 as 4K-ready, we would now be discussing whether nvidia was misleading customers.

I mean, how many discussions have there been here already about Nvidia cards not performing as well over time (especially Kepler ones) compared to AMD cards of the same age?

Now people want a mid-range card to be branded as 4K when 4K is still a tough nut to crack? Madness, madness, I tell yah!
 
The increasing fear of falling out of Nvidia's good graces, perhaps. There are more tech sites being left out of 2060 sampling than before, for example, due to less-than-stellar reviews of the initial RTX cards.

Why would NVIDIA care in this situation? They did not say the GTX 2060 was 4K capable. If they had, and reviews then showed that its 4K performance is limited, I would understand you. But that's not the case.

Or do you really think that the usual buyer of x80 and above tiers would suddenly go "oh, hold on I'll get the GTX2060 instead because I do not need an uncompromised solution"?

This conversation is borderline schizophrenic.

Totz first says that the GTX2060 can run 4K and that NVIDIA is not letting reviewers test it because they want people to buy the GTX2080 instead.

But then the same Totz says "oh, they don't let reviewers test it in 4K so people don't see how badly it runs RE7 at 4K".

If you don't understand how these two sentences are mutually exclusive, you need to go back to school.

Ladies and gentlemen, say hello to the Schrödinger's GTX2060. Either it can run 4K but NVIDIA doesn't want you to know, or it can't run 4K but NVIDIA doesn't want you to know. The bottom line is NVIDIA doesn't want you to know "something", but for the performance you really need to open the box!
 
Never considered that review site mainstream, so I'm not terribly surprised.
Usually when this happens the common excuse from Nvidia and AMD is a "lack of cards" to go around. I recall Vega did something along those lines.

Hardware Unboxed is the vlog arm of TechSpot. They've been around for 3 years and already have 320k subscribers.
Gamers Nexus is 9 years old and has 500k subscribers.
If an audience of several hundred thousand isn't "mainstream" then I don't know what is.

And if AMD punished tech sites for unfavorable review scores then that's terrible and it doesn't excuse nvidia for doing the same.
 