Nvidia Turing Product Reviews and Previews: (Super, TI, 2080, 2070, 2060, 1660, etc)

Exactly. The way I see it, the 2060 serves only one purpose: a cheap entry into ray tracing. It's only mildly slower than the 2070 in ray tracing, and even though it's a capable 1440p card, its future prospects dictate it should be used as a 1080p card, as its 6GB framebuffer will last much longer at that resolution, just like the 1060's.
 
The 6GB 2060 has a limited future regarding 4K/highest settings then. I don't think an 8GB to 11GB 2060 is going to happen.

This is exactly why a $349 card should not be presented as capable of running 4K. If Nvidia had said that, we would now be discussing whether they were being too generous with the truth or misleading consumers. It's a glass half full, half empty situation.

It's the perfect type of argument for whoever has an evil furry cat. <ModEdit>
 
Time constraints intentionally forced upon reviewers by Nvidia, who know what they're doing and get the cards to them just before CES, when the embargoes are lifted. So reviewers get a day to run tests and record content, knowing they have to get the review out and then attend CES.

Did that also happen with the GTX 1060? Because it was also not tested at 4K by most outlets.
 
Thread needs cleaning up
Time constraints intentionally forced upon reviewers by Nvidia, who know what they're doing and get the cards to them just before CES, when the embargoes are lifted. So reviewers get a day to run tests and record content, knowing they have to get the review out and then attend CES.
BS, no time constraints were forced upon reviewers by Nvidia.
HH at Guru3D mentioned they received and tested the RTX 2060 more than a week before the published review.
Hilbert Hagedoorn: Where did that come from? The board arrived over a week prior to article release.
 
Edit: side note. I wish more games were good at telling me when I'm exceeding VRAM.
There does seem to be a trend in recent AAA games of including those metrics either in the benchmark screen or in the graphics settings. I hope it continues (and gets more detailed), and maybe it's a sign of the tools getting streamlined enough that it's worthwhile for developers to expose them for QA to test on various hardware; it would make life rather easy for a seemingly simple metric.
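In the meantime, just polling NVML from outside the game gets you most of the way there. A rough sketch, assuming the pynvml bindings are installed and GPU 0 is the Nvidia card:

```python
# Rough VRAM monitor: prints board-wide memory use once a second while a game runs.
# Assumes the pynvml bindings for NVML are installed and GPU 0 is the Nvidia card.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"VRAM: {used_gb:.2f} / {total_gb:.2f} GB ({100 * mem.used / mem.total:.0f}%)")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

It only shows total board allocation rather than what the game itself has committed, but it's enough to tell when a 6GB framebuffer is getting tight.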

Thread needs cleaning up
Remember kids, fibre is all part of a healthy balanced diet. Source: Bran. This message is brought to you by the letters A and L, the irrationAl number pi, and paid by some Toilet Hardware Company for the low low price of a shilling.
 
There does seem to be a trend in recent AAA games of including those metrics either in the benchmark screen or in the graphics settings. I hope it continues (and gets more detailed), and maybe it's a sign of the tools getting streamlined enough that it's worthwhile for developers to expose them for QA to test on various hardware; it would make life rather easy for a seemingly simple metric.
...

I find it particularly annoying on PC because most optimization guides are written by people testing with the highest-end cards. Turn lighting to ultra because it had very little effect on performance on my 1080 Ti! Games should do a better job of informing a player which settings will have an impact, maybe by profiling cards with micro-benchmarks (fillrate, overdraw, shading, etc.). R6 Siege has a benchmark, but you still have to twist the knobs yourself and figure out which settings will have the biggest impact on your particular PC.
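The kind of thing I mean is roughly the sketch below: drop one setting at a time from a baseline and rank the frame-time deltas. run_benchmark() here is just a stand-in for however a game exposes its built-in benchmark, not a real API.

```python
# Toy settings-impact profiler: lowers one setting at a time from a "high"
# baseline and ranks settings by how many milliseconds of frame time they cost.
# run_benchmark() is a placeholder for a game's built-in benchmark pass.

BASELINE = {"shadows": "high", "lighting": "high", "post_fx": "high", "textures": "high"}

def run_benchmark(settings: dict) -> float:
    """Placeholder: run the built-in benchmark with these settings and
    return the average frame time in milliseconds."""
    raise NotImplementedError

def rank_settings_impact():
    base_ms = run_benchmark(BASELINE)
    savings = {}
    for name in BASELINE:
        trial = dict(BASELINE, **{name: "low"})
        savings[name] = base_ms - run_benchmark(trial)  # ms saved by lowering this setting
    for name, saved in sorted(savings.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{name:9s} saves {saved:5.2f} ms/frame when lowered")
    return savings
```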

But back to the 2060: the limit of 6GB is going to be non-obvious to many gamers. They know that 8GB is better than 6GB, but they might not know why, or what impact that's going to have. At least if I were playing a game that showed me VRAM consumption, I could see that I need at least 8GB if I want to turn up the various knobs. 6GB does not turn the 2060 into a bad product, but if you expect to use it for 2-4 years, the 6GB limit has real considerations, even at 1080p. I think most people buying it are on a budget and would probably hope for that lifetime.
 
I find it particularly annoying on PC because most optimization guides are written by people testing with the highest-end cards. Turn lighting to ultra because it had very little effect on performance on my 1080 Ti! Games should do a better job of informing a player which settings will have an impact, maybe by profiling cards with micro-benchmarks (fillrate, overdraw, shading, etc.). R6 Siege has a benchmark, but you still have to twist the knobs yourself and figure out which settings will have the biggest impact on your particular PC.

Yeah, I guess it'd have to be more like an actual profiling tool to examine the nitty-gritty breakdown of a frame. You'd need snapshots for various scenes, but it might be pretty nice to just see the make-up of a frame showing which settings affect which part of it (shadow pass, lighting, each post-process step), which is perhaps something the driver teams already do to some extent. It could make life a little easier for troubleshooting too: "such and such settings aren't running properly versus other folks' experience, and this is what the frame breakdown shows as fubar on X driver, etc."
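Even a crude per-pass readout would go a long way. A toy example of the kind of breakdown I mean, with the pass names and numbers purely made up:

```python
# Toy frame-breakdown report: per-pass GPU time for one captured frame.
# The passes and millisecond figures are illustrative, not measured.
frame_ms = {
    "shadow maps":  2.4,
    "g-buffer":     3.1,
    "lighting":     4.0,
    "post-process": 1.8,  # bloom, tonemap, etc.
    "UI":           0.3,
}

total = sum(frame_ms.values())
print(f"frame time: {total:.1f} ms ({1000 / total:.0f} fps)")
for name, ms in sorted(frame_ms.items(), key=lambda kv: kv[1], reverse=True):
    print(f"  {name:13s} {ms:4.1f} ms  ({100 * ms / total:4.1f}%)")
```

Hook that up to actual per-pass GPU timer queries and you'd immediately see whether it's the shadow pass or the post chain that a given setting is blowing up.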
 
Anandtech included a GTX 980 Ti in their review of the RTX 2060. I'll give it a look.

https://www.anandtech.com/show/13762/nvidia-geforce-rtx-2060-founders-edition-6gb-review

They checked frame times on some of the games, and both the 980 Ti and the 2060 felt the effect of Wolfenstein 2's Mein Leben! setting when running at 3840 x 2160 (the 980 Ti and the 2060 becoming equally bound). The other games tested for frame times, Battlefield 1 and Ashes of the Singularity: Escalation, appear to have been fine.

Edit: And GTA 5 was fine as well.
 
It's funny that there's contention about 6GB being limited while there are rumors of 4GB and 3GB variants, with both GDDR5X and GDDR6, coming out for the 2060.
 
All reviews are saying the 2060 is on par with the 1070 Ti at 1080p and 1440p; am I missing something?

Oh hell, you're right, I messed up. When I looked at the Anandtech and Tom's Hardware reviews, I didn't have much time, so I just glanced at the graphs. For some reason (probably because I was in a hurry), I read the 1060 as the 2060. Big oops.

There are some weird results for the card though. GTA V at 1080p is way slower on the 2060 than the 1070, but faster at 1440p and 2160p. Not that it matters since even at 1080p it was pulling 123 FPS.

https://www.guru3d.com/articles_pages/geforce_rtx_2060_review_(founder),22.html

Regards,
SB
 
When sending out new GPUs to reviewers, IHVs provide them with guidelines on how to review the hardware. It's a fact that these guidelines exist, as it's also a fact that the tables they provide show the resolutions/settings they want the cards to be tested at.
Those guidelines don't dictate what sites should test though. If a website wants to test 4K, they can, unless one wants to go into the territory of companies controlling the media by refusing samples etc. There's no point in that discussion in this thread. The benchmarks and reviews are what they are.
 
Those guidelines don't dictate what sites should test though.


They do.
https://videocardz.com/77983/nvidia-geforce-rtx-2080-ti-and-rtx-2080-official-performance-unveiled

In GeForce RTX reviewer’s guide, NVIDIA is not using any other resolution than 4K. So all benchmarks (except VRMark Cyan Room) were performed at 3840×2160 resolution. In fact, the RTX 2080 series were ‘designed for 4K’, as the document claims.

I'm happy to leave the conversation as it is, though. It's not important for this thread.
All I said was that the RTX 2060 wasn't tested more thoroughly at 4K because the card's reviewer guide says so. We could discuss the semantics of what constitutes a "suggestion" or an "instruction" to test at resolution X or Y, as well as the repercussions (or lack thereof) for not following said implicit/explicit guidelines, but that's not important for this thread either.


Point is, neither name nor price is an indicator of what resolution should be used for testing. Performance at time of release is.
If we take a 40 FPS average as "minimum acceptable performance", the $300 GTX 260 was a card for playing demanding titles in 2008 at 1280*1024. In 2011 the $200 GTX 560 raised those stakes to 1680*1050. In 2013 the $250 GTX 760 raised the resolution to 1920*1080. In 2016 the $300 GTX 1060 drove that threshold up again to 2560*1440.
There's nothing in the "xx60" name or its $350 price that says it can't/shouldn't run games at whatever resolution. If nvidia keeps up this naming scheme (and we don't all transition to GaaS), a point in time will come where the xx80 card is meant for dual 5K VR at 90 FPS + reprojection, and it won't even make sense to test the $99-$999 xx60 of that family at anything lower than 4K.
Besides, Anandtech's title for the RTX 2060 review even says "Not Quite Mainstream", as they reckon the 2060 marks an even larger departure from the usual target market of the xx60 cards.
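To put that resolution progression in perspective, a quick scratchpad for the raw pixel counts involved:

```python
# Pixel counts for the resolutions mentioned above, relative to 1280*1024.
resolutions = {
    "GTX 260 (2008)":  (1280, 1024),
    "GTX 560 (2011)":  (1680, 1050),
    "GTX 760 (2013)":  (1920, 1080),
    "GTX 1060 (2016)": (2560, 1440),
    "4K":              (3840, 2160),
}

base = 1280 * 1024
for card, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{card:16s} {w}x{h}  {pixels / 1e6:4.2f} MP  ({pixels / base:.1f}x)")
```

4K is over 6x the pixels the GTX 260 was expected to push, which is the whole point: the resolution a xx60-class card targets keeps moving up.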


The RTX 2060 looks like a pretty competent card to play at 4K (or eventually 1440p + DLSS when/if that ever gets widely adopted) and it'll fit quite nicely in a small HTPC case with a modest 500W power supply. Which is the case of my living room PC.
Don't like that I'm considering one for myself? Sue me :p
 
It's a guide, not a set of orders. You don't have to follow it, unless I'm mistaken and there's a contract stating what reviews are and are not allowed to report on. What was stopping videocardz.com from benchmarking at resolutions other than the guide's recommendations?
 
It's a guide, not a set of orders. You don't have to follow it, unless I'm mistaken and there's a contract stating what reviews are and are not allowed to report on. What was stopping videocardz.com from benchmarking at resolutions other than the guide's recommendations?
The increasing fear of falling out of Nvidia's good graces, perhaps. More tech sites were left out of 2060 sampling than before, for example, due to less-than-stellar reviews of the initial RTX cards.
 
The increasing fear of falling out of Nvidia's good graces, perhaps. More tech sites were left out of 2060 sampling than before, for example, due to less-than-stellar reviews of the initial RTX cards.
Do you have a link? Or is this more BS, similar to what you posted before regarding the RTX 2060 reviewers' timeline?
 
Mini-ITX builders rejoice:
https://www.gigabyte.com/Graphics-Card/GV-N2060IXOC-6GD#kf




I disagree with the second sentence. Performance at release may not hold true in future titles throughout the life of a card, thus painting a false picture of the card's capabilities when people down the line go read launch-day reviews and expect that launch 4K performance to still hold many months or two years later. Cards tested and shown capable of 4K at launch were "labeled" as "4K cards" in the past, and that label stuck for their lifetime despite no longer being true. I find painting a mid-range card (regardless of price, it is still mid-range Turing) as 4K capable quite problematic, IMHO. Just my opinion though, and I'm not against testing at 4K as an extra data point; I'm kind of against the conclusions that inevitably arise from testing at such a resolution, where cards may appear to punch above their weight on games that are or will be old throughout the card's lifetime.

There's no mid-range Turing at the moment IMO. The TU106 is a 445mm^2 chip. It's 2.2x larger with 2.45x more transistors than the GP106 in the GTX 1060.
$350 is also the most expensive a xx60 card has ever been at launch; for perspective, the GTX 970 launched with a $330 MSRP.
The mid-rangers right now are the Polaris 10/20 and GP106 cards.
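Quick sanity check on those ratios, assuming the commonly cited ~200 mm^2 and 4.4B transistor figures for GP106:

```python
# Die area and transistor-count ratios behind the "this isn't mid-range silicon" point.
# GP106 figures are the commonly cited ~200 mm^2 and 4.4B transistors.
gp106_area_mm2, gp106_transistors = 200, 4.4e9   # GTX 1060
tu106_area_mm2, tu106_transistors = 445, 10.8e9  # RTX 2060 / 2070

print(f"area ratio:       {tu106_area_mm2 / gp106_area_mm2:.2f}x")        # ~2.2x
print(f"transistor ratio: {tu106_transistors / gp106_transistors:.2f}x")  # ~2.45x
```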


IMO, the Turing chips are just a byproduct of nvidia cancelling a full family of Volta chips (GTX 11-series?) that were planned to launch in mid 2018 using 12FFN. This would be akin to a Kepler -> Maxwell transition and nvidia was ready to implement a tick-tock strategy.
AMD's inability to compete and regain marketshare, plus the mining boom during 2017, led nvidia to cancel the development of the whole Volta family except for GV100. With this, nvidia saved a bunch of money on R&D, marketing, and whatever they were going to spend replacing Pascal production lines with Volta. This obviously gave them considerable YoY revenue increases which are now impossible to keep up with, hence their latest stock value "crash" down to Q2 2017 levels.
Regardless, with all of the above, nVidia found themselves with the time and money to make a line of dedicated Quadro chips that will be unbeatable at offline rendering for years to come. There's dedicated hardware for raytracing plus tensor units for denoising, plus Volta's new shader modules.
And why is Turing coming for consumers after all? Because post mining crash the 2nd-hand market is being flooded with cheap Pascal cards and nvidia needs some steady revenue from gaming GPUs (which is their primary source of revenue by far). So they had to release something with an increased value proposition over those existing Pascal chips.

That said, I don't think Turing was initially meant for real-time rendering and gaming. The RT hardware isn't fast enough to provide a clear-cut advantage over screen space reflections (and probably never will be) and no one seems to know what to do with the tensor units in games, as DLSS implementations keep being pushed back month after month. I'll be happy to be proven wrong, but at the moment I'll stick to the same opinion as Gamers Nexus on BFV.
They're very fast at rasterization, sure, but that comes from their Volta heritage and the fact that they're all large chips, with the smallest being almost as big as a GP102.

I don't think for a second that a group of engineers at nvidia thought "what would be a great mid-range chip for 2019?" and came up with a partially disabled TU106.



Now people want a mid-range card to be branded as 4K when 4K is still a tough nut to crack?
Absolutely no one here made such a statement.
Which is part of the reason why a discussion with you seems so exhausting from the get go. The other part being the completely unnecessary flamebait jabs like this:
Enough with chasing imaginary windmills.
First the accusation was me having an agenda against the card. Now I'm chasing imaginary windmills because I'm considering the card for myself to play some games at a specific resolution. Next will be..?
Look, I might've had the patience (maybe even eagerness I confess) for this in the past, but I certainly don't have it now. I might be better off just hitting the ignore button..



Do you have a link? Or is this more BS, similar to what you posted before regarding the RTX 2060 reviewers' timeline?
Here's your link.
To be honest, I don't think nvidia finds anything inherently wrong with reviewers showing positive or neutral 4K results.
That Chapuzas Informático graph on the other hand...



Honestly man you need to accept that this place bleeds red when cut. It always has. Accept it and move on, there's no point getting frustrated about it.
Sigh..
If the mods could have a dollar for all the times this is said about either side, they'd be too busy sipping a 1973 Port on a secluded Hawaiian beach to moderate the forum.
 
There's no mid-range Turing at the moment IMO. The TU106 is a 445mm^2 chip. It's 2.2x larger with 2.45x more transistors than the GP106 in the GTX 1060.
$350 is also the most expensive a xx60 card has ever been at launch; for perspective, the GTX 970 launched with a $330 MSRP.
The mid-rangers right now are the Polaris 10/20 and GP106 cards.

Apples and oranges. First of all, the GTX 1060 was the full GP106; the RTX 2060 isn't the full TU106. Not to mention the addition of Tensor and RT cores. TU104 and TU102 are also significantly bigger than their Pascal counterparts, so that's hardly of relevance. There's no point comparing die sizes.

As for price, Turing has seen an increase in all segments, of which the 2060's is the smallest. That's what lack of competition does. People were expecting a Vega 20 consumer part for less than $600, but it's $700 instead, consumes more power, and does not have new features. So the landscape is what it is, sadly.
 