Mini-ITX builders rejoice:
https://www.gigabyte.com/Graphics-Card/GV-N2060IXOC-6GD#kf
I disagree with the second sentence. Performance at release may not hold up in future titles throughout the life of a card, which paints a false picture of the card's capabilities when people go back and read launch-day reviews months or even years later and expect the 4K performance from launch to still be true. Cards tested and found capable of 4K at launch were labeled "4K cards" in the past, and that label stuck for their whole lifetime despite no longer being true. I find painting a mid-range card (regardless of price, it's still mid-range Turing) as 4K capable quite problematic, IMHO. Just my opinion though, and I'm not against testing at 4K as an extra data point; what I'm against are the conclusions that inevitably arise from testing at that resolution, with cards that may appear to punch above their weight in games that are or will be old through the card's lifetime.
There's no mid-range Turing at the moment IMO. The TU106 is a 445mm^2 chip. It's 2.2x larger with 2.45x more transistors than the GP106 in the GTX 1060.
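For reference, those ratios follow from the commonly cited die specs (assuming roughly 200 mm² and 4.4 billion transistors for GP106 versus 445 mm² and 10.8 billion for TU106):

445 mm² / 200 mm² ≈ 2.2x
10.8B / 4.4B ≈ 2.45x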
$350 is also the most expensive an xx60 card has ever been at launch. For perspective, the GTX 970, an xx70 card, launched with a $330 MSRP.
The mid-rangers right now are the Polaris 10/20 and GP106 cards.
IMO, the Turing chips are just a byproduct of nvidia cancelling a full family of Volta chips (GTX 11-series?) that was planned to launch in mid-2018 on 12FFN. That would have been akin to the Kepler -> Maxwell transition, and nvidia was ready to implement a tick-tock strategy.
AMD's inability to compete and regain market share, plus the mining boom during 2017, led nvidia to cancel development of the whole Volta family except for GV100. With this, nvidia saved a bunch of money on R&D, marketing, and whatever they were going to spend replacing Pascal production lines with Volta ones. That obviously gave them considerable YoY revenue increases which are now impossible to keep up with, hence their latest stock value "crash" down to Q2 2017 levels.
Regardless, with all of the above, nvidia found themselves with the time and money to make a line of dedicated Quadro chips that will be unbeatable at offline rendering for years to come: dedicated hardware for ray tracing, tensor units for denoising, plus Volta's new shader modules.
And why is Turing coming for consumers after all? Because after the mining crash the second-hand market is being flooded with cheap Pascal cards, and nvidia needs steady revenue from gaming GPUs (by far their primary source of revenue). So they had to release something with an increased value proposition over those existing Pascal chips.
That said, I don't think Turing was initially meant for real-time rendering and gaming. The RT hardware isn't fast enough to provide a clear-cut advantage over screen-space reflections (and probably never will be), and no one seems to know what to do with the tensor units in games, as DLSS implementations keep being pushed back month after month. I'll be happy to be proven wrong, but at the moment I'll stick to the same opinion as Gamers Nexus on BFV.
They're very fast at rasterization, sure, but that comes from their Volta heritage and the fact that they're all large chips, the smallest being almost as big as a GP102.
I don't think for a second that a group of engineers at nvidia thought "what would be a great mid-range chip for 2019?" and came up with a partially disabled TU106.
Now people want a mid-range card to be branded as 4K when 4K is still a tough nut to crack?
Absolutely no one here made such a statement.
Which is part of the reason why a discussion with you seems so exhausting from the get-go. The other part is the completely unnecessary flamebait jabs like this:
Enough with chasing imaginary windmills.
First the accusation was that I had an agenda against the card. Now I'm chasing imaginary windmills because I'm considering the card for myself to play some games at a specific resolution. What's next..?
Look, I might've had the patience (maybe even the eagerness, I confess) for this in the past, but I certainly don't have it now. I might be better off just hitting the ignore button...
Do you have a link? Or is this more BS, similar to what you posted before regarding the RTX 2060 reviewers' timeline?
Here's your link.
To be honest, I don't think nvidia finds anything inherently wrong with reviewers showing positive or neutral 4K results.
That Chapuzas Informático graph, on the other hand...
Honestly, man, you need to accept that this place bleeds red when cut. It always has. Accept it and move on; there's no point getting frustrated about it.
Sigh...
If the mods had a dollar for every time this is said about either side, they'd be too busy sipping a 1973 Port on a secluded Hawaiian beach to moderate the forum.