Nvidia Turing Product Reviews and Previews: (Super, TI, 2080, 2070, 2060, 1660, etc)

Yes, but the third option is nVidia steps up and bears the cost of the software implementations. They fund the applications implementing RTX, which gives users a reason to buy the cards, which leads to an install base, which leads to more devs adding RTX acceleration on their own because there's now an established market for it.

Many new consumer techs face a chicken-and-egg problem. Businesses trying to launch them need to do something to overcome that, or else the tech flunks out (or, in this case, costs money). By and large that means up-front investment to secure content and give people a reason to buy, or acceptance of slow growth, which one should factor into one's sales expectations.
Indeed. On the other hand, it would require nV to do it on a unilateral, IHV-independent basis, and as we've seen with GameWorks and TWIMTBP, that's incompatible with their self-serving/competitive business. *ahem* :|
 
The customer in this case could have been professionals. Leave gaming alone for a generation, sell high-margin products to pro imaging, work out best practices in the R&D department, then roll out RTX2 to gamers in affordable cards so that all that research gets used.
That would have made zero, ZERO sense. All the devs would need to buy Quadro cards (= more expensive). They would still need to produce Game Ready Drivers so devs can test their stuff. And a lot of enthusiasts would have bought the Quadro cards anyway because of the better performance, and would have felt ripped off, rightly so. The backlash would have been enormous. And the stock market would have shown nVidia the bird, because the cards would yield no profit. Sorry, this strategy of yours would have been a disaster of epic proportions for nVidia. Plus, Turing is not only RTX; there are a couple of other worthwhile features like tensor cores, mesh shaders and variable-rate shading that need to be in the wild to gather support.
 
Game devs buy Quadros? Why? I thought they were very different drivers and use case GPUs.
 
Game devs buy Quadros? Why? I thought they were very different drivers and use case GPUs.
In Shifty's opinion nVidia should have released Turing as "high-margin products to pro imaging". That means Quadros. Please correct me if I'm wrong.

Quadros usually employ exactly the same chips as consumer cards, but have special hardware and BIOS that enable some additional functionality. You can install Game Ready Drivers on Quadro GPUs, but not the other way round. I currently have a notebook with an nVidia Quadro P5000, but I use the latest Game Ready drivers.
 
Game devs buy Quadros? Why? I thought they were very different drivers and use case GPUs.
Depends on what type of game dev it is... Artists (modelers, animators, etc.) will/can buy Quadros because of the "better" drivers for DCC apps... in reality nobody in the game dev world buys Quadros, because they are too damn expensive for what they offer compared to "regular" models. It's a bit different in the VFX & ArchViz sector, where the use of DCC apps is more prominent and there's a need for more VRAM.
 
All the devs would need to buy Quadro cards (= more expensive). They would still need to produce Game Ready Drivers so devs can test their stuff.

1 - Purchase of new GPUs is a drop in the bucket of the overall investment for developing any major professional application.

2 - Haha, no. nVidia doesn't have to produce Game Ready drivers for developers of professional applications to test their stuff. They don't even have to do that for game developers; there are lots of driver releases that never reach the public and are aimed exclusively at game developers.
 
The "better driver" part is also BS. In reality, it's only some hardware error concealing features activated (ECC RAM, error concealing in video decoder etc.), and a couple of virtualization features unlocked in the firmware.

Other than that, NVidia claims that you get better support for Quadro than for GeForce, but let's be real: it's a blatant lie. There is effectively zero support for GeForce cards, no matter how badly the driver or hardware is failing. And for Quadro cards, if you bring them less than 100k in revenue annually, don't even bother asking for any real support either.

Often the very same bugs will hit you with the "certified" driver as well, if it's actually a driver bug (or even just a lack of validation, with a fault on the application side causing the driver to crash). Faulty GPUs are less common, but some nasty hardware flaws are obviously also reproducible on the Quadro or Tesla series GPUs.

You have to realize that the distinction isn't actually professional/consumer market either. The split is between graphics (DirectX, OpenGL and Vulkan for all use cases are handled by the same team) on one side, and CUDA/virtualization/professional features on the other.

And the latter group is effectively telling you as a developer, "we refuse to test in-house whether it works as designed with consumer GPUs". Chances are good that some stuff is broken on some cards even within the same architecture, due to the aggressive feature masking in that division.
 
The 2060 is in the same league.

Close, but around 20% costlier at the $360-390 range, depending on the model, store and specials. Once you see it hitting $300 or below, it'll take off. Of course that's assuming there aren't better bargains out there on higher-performing older-gen cards (like the 1070 / 1080).

Of course that also assumes the 6GB memory footprint isn't a limitation that negatively impacts product desirability.
 
970 wasn't available for $329 until quite some time after the launch either. You can buy the 2060 directly from nVidia as we speak for $349.

https://www.nvidia.com/en-us/shop/geforce/?page=1&limit=9&locale=en-us

The Turing line is a little overpriced, but other than that it's a solid lineup and far more appealing than what the competition has out there. The 2060 is a pretty good product at the current price; it would be nicer at $329. I'd like to see the 2070, 2080 and 2080 Ti being $50, $100 and $150 cheaper than their current MSRPs to feel like they're hitting the right spots. Hopefully they get there.
 
The 2060 is in the same league.

Right. They either realized they screwed up with the pricing on the higher-end SKUs, properly evaluated the price sensitivity of buyers of this class of card from the beginning, or realized that AMD already has, and will shortly have even more, competition for them at this performance level.

If I were looking for a card in this price range, and I was committed to buying now, this is the one I would get without reservation. Whatever the RTX component can deliver in this SKU is just a bonus.
 
Eh?

Several other sources also list $329 as the launch price.

As far as I can remember there really wasn't availability at that launch price for quite some time after the launch, or there was price gouging soon after launch... It's been a while, but I remember 970s being sold for quite a bit higher than MSRP at some point.
 
https://www.fool.com/investing/2019/01/31/3-reasons-gamers-arent-buying-nvidias-newest-gpus.aspx
Motley Fool's reasons for the RTX line not selling well:

1 - Current AAA games are still targeting the specs of the 2011 consoles first, so the image-quality gains over the current mid-range $200-250 cards are generally small.

2 - Gamers are postponing their big GPU investments, opting for cheap 2nd-hand GTX1060/1070 instead.

3 - RTX's features are barely exposed in the market.


Close, but around 20% costlier at the $360-390 range, depending on the model, store and specials. Once you see it hitting $300 or below, it'll take off. Of course that's assuming there aren't better bargains out there on higher-performing older-gen cards (like the 1070 / 1080).
The problem is that nVidia is already acknowledging the TU106 can't be scaled down in price very much, by launching the TU116 cards.
If the GTX 1660 Ti offers rasterization performance close to the RTX 2060 at $280, we may never see the RTX 2000 line really take off in the gaming space.

Especially if the GTX1660 Ti clocks close to 1900-2000MHz (or more), which it could because it's a significantly smaller chip made on the 3rd generation of 12FFN.
 
Of course that also assumes the 6GB memory footprint isn't a limitation that negatively impacts product desirability.
Judging by the popularity of the 4GB 970 and the 6GB 1060, it's not.
Close, but around 20% costlier at the $360-390 range, depending on the model, store and specials.
The 970 and 1060 had the same things applied to them as the 2060; in reality they are $10 apart.
 
TechSpot: Is 6 GB of VRAM enough for 1440p Gaming? Testing Usage with RTX 2060
January 31, 2019
Today we're investigating claims that the new GeForce RTX 2060 is not a good buy because it only features 6GB VRAM capacity. The RTX 2060 offers performance similar to the GTX 1070 Ti, but that card packs an 8GB memory buffer, as did its non-Ti counterpart.
...
It's clear that right now, even for 4K gaming, 6GB of VRAM really is enough. Of course, the RTX 2060 isn’t powerful enough to game at 4K, at least using maximum quality settings, but that’s not really the point. I can hear the roars already, this isn't about gaming today, it’s about gaming tomorrow. Like a much later tomorrow…

The argument is something like, yeah the RTX 2060 is okay now, but for future games it just won’t have enough VRAM. And while we don’t have a functioning crystal ball, we know this is going to be both true, and not so true. At some point games are absolutely going to require more than 6GB of VRAM for best visuals.

The question is, by the time that happens will the RTX 2060 be powerful enough to provide playable performance using those settings? It’s almost certainly not going to be an issue this year and I doubt it will be a real problem next year. Maybe in 3 years, you might have to start managing some quality settings then, 4 years probably, and I would say certainly in 5 years time.
https://www.techspot.com/article/17...00248&usg=ALkJrhhBQySIeVLh97SaGYv-7VZ-SRf1tg/
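For a bit of context on where VRAM actually goes, here's a rough back-of-envelope on render-target footprint at 1440p and 4K. The deferred-style layout below is a made-up but plausible example, not taken from any particular engine or from the TechSpot article: even a fairly fat set of render targets only accounts for a few hundred MB, so most of a 6GB budget goes to texture and geometry streaming, which engines can usually scale.

```python
# Back-of-envelope render-target footprint. The layout (four RGBA16F
# G-buffer targets + D32 depth + RGBA16F lighting buffer) is a
# hypothetical but typical-looking deferred setup, purely for illustration.
def render_target_mb(width, height):
    px = width * height
    gbuffer = 4 * px * 8          # four RGBA16F targets, 8 bytes/pixel
    depth   = px * 4              # D32 depth buffer
    hdr_out = px * 8              # RGBA16F lighting/output buffer
    return (gbuffer + depth + hdr_out) / (1024 ** 2)

for w, h in [(2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: ~{render_target_mb(w, h):.0f} MB in render targets")
# 2560x1440: ~155 MB, 3840x2160: ~348 MB -- a small slice of a 6GB card;
# textures and streaming budgets dominate, and those are usually scalable.
```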
 
Usable at what quality level? Can they provide true reflections? Soft PCF shadows? Area shadows? Dynamic GI? Proper refractions? Nope. RT is an elegant solution that encompasses everything. See Quake 2 on Vulkan RTX for a proper demonstration of a complete path tracing solution.
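To illustrate the "encompasses everything" point, here's a deliberately tiny toy path tracer in the spirit of the Quake 2 RTX example. Everything in it is hypothetical (one diffuse sphere under a uniform sky, made-up helper names), and it is not physically rigorous; the point is only the structure: reflections, refractions, shadow tests and indirect lighting are all just more calls into the same trace() function.

```python
# Toy path tracer sketch: a single trace() entry point, where reflections,
# shadow tests and indirect light (GI) are all just more recursive rays.
# Scene and names are hypothetical, for illustration only.
import math
import random

def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def add(a, b): return (a[0]+b[0], a[1]+b[1], a[2]+b[2])
def mul(a, s): return (a[0]*s, a[1]*s, a[2]*s)
def norm(a):
    l = math.sqrt(dot(a, a))
    return mul(a, 1.0 / l)

# Hypothetical scene: one diffuse sphere, lit by a uniform "sky".
SPHERE_C, SPHERE_R, ALBEDO, SKY = (0.0, 0.0, -3.0), 1.0, 0.7, 1.0

def intersect(origin, direction):
    """Ray/sphere test (direction assumed normalized); hit distance or None."""
    oc = add(origin, mul(SPHERE_C, -1.0))
    b = dot(oc, direction)
    c = dot(oc, oc) - SPHERE_R * SPHERE_R
    disc = b * b - c
    if disc < 0.0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def sample_hemisphere(n):
    """Uniform hemisphere direction around normal n, via rejection sampling.
    A real renderer would importance-sample the BRDF and weight by cos/pdf."""
    while True:
        d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        if 0.0 < dot(d, d) <= 1.0:
            d = norm(d)
            return d if dot(d, n) > 0.0 else mul(d, -1.0)

def trace(origin, direction, depth=0):
    """The one primitive: misses gather sky light, hits spawn more rays.
    A mirror, a refractive surface, or an explicit shadow ray toward a light
    would all reuse this same call with a different direction."""
    if depth > 3:
        return 0.0
    t = intersect(origin, direction)
    if t is None:
        return SKY                                  # environment light
    hit = add(origin, mul(direction, t))
    n = norm(add(hit, mul(SPHERE_C, -1.0)))
    bounce = sample_hemisphere(n)                   # indirect (GI) ray
    return ALBEDO * trace(hit, bounce, depth + 1)   # toy diffuse bounce

if __name__ == "__main__":
    # Average a few paths through a single primary ray.
    ray_o, ray_d = (0.0, 0.0, 0.0), norm((0.0, 0.0, -1.0))
    samples = [trace(ray_o, ray_d) for _ in range(256)]
    print("radiance ~", sum(samples) / len(samples))
```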

Yes, better, better, better, etc. Sphere tracing is just raytracing with a different acceleration structure, one not as amenable to animation, and without triangles it isn't as directly usable today straight from modeling programs.

I mean, if you really want to know what sphere tracing is: http://mathinfo.univ-reims.fr/IMG/pdf/hart94sphere.pdf is a decent overview plus a look at an interesting use case of sphere tracing that BVH RT can't do.
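For anyone not inclined to read the paper, the core loop is tiny: the SDF itself acts as the acceleration structure, telling you at each sample how far you can safely step along the ray. A minimal sketch, with a hypothetical one-sphere scene SDF standing in for real geometry:

```python
# Sphere tracing / SDF ray marching in a nutshell: each SDF sample is a
# guaranteed-empty step size. The scene SDF here is a hypothetical example.
import math

def scene_sdf(p):
    """Signed distance to a unit sphere at (0, 0, -3); a full scene would
    compose more primitives with min()."""
    dx, dy, dz = p[0], p[1], p[2] + 3.0
    return math.sqrt(dx*dx + dy*dy + dz*dz) - 1.0

def sphere_trace(origin, direction, max_steps=128, eps=1e-3, t_max=100.0):
    """March along the ray; returns the hit distance or None on a miss."""
    t = 0.0
    for _ in range(max_steps):
        p = (origin[0] + direction[0] * t,
             origin[1] + direction[1] * t,
             origin[2] + direction[2] * t)
        d = scene_sdf(p)          # largest step guaranteed not to tunnel
        if d < eps:               # close enough to the surface: call it a hit
            return t
        t += d
        if t > t_max:             # ray left the scene
            break
    return None

print(sphere_trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0)))  # prints ~2.0
```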

I wouldn't necessarily want to use it for reflections due to its scaling problem; a hybrid solution of larger-scale SDF tracing with low-level polys still used might overcome the exponential data-update problem that makes small, non-basic shape details too hard to do with SDFs. But programmability is always preferred over narrow, fixed-function hardware, given just a choice between the two without consideration for other externalities.

Given that AMD has stated they're [doing raytracing in conjunction with game developers], I take this to mean they're asking game devs what they want to do with raytracing, and how they want to do it, rather than the MS and Nvidia strategy of doing so by fiat. The former is probably the better approach: AMD put tessellation hardware in way back, before any open standard existed and in fairly fixed-function form, without asking game devs how they'd want to use it, and I can't remember much use being made of it.

Some games might use pure BVH and support RTX hardware for doing so. But others might use pure signed distance fields or cone tracing, or some hybrid of all three. Having purely fixed-function hardware for only one of these might not be worth the cost of silicon, relatively fixed as that cost has become versus the past, compared to hardware with more multi-use capabilities.
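To make the hybrid idea a little more concrete, here's a toy sketch: a coarse, conservative SDF is marched to skip empty space cheaply, and an exact ray/triangle test only runs once the ray gets near the surface. The SDF bound, the single stand-in triangle and the "near" threshold are all made-up illustration, not any shipping scheme:

```python
# Hypothetical hybrid: march a coarse, conservative SDF to skip empty space,
# then switch to an exact ray/triangle test only near the surface.
import math

# One triangle standing in for the detailed polygon geometry.
TRI = ((-1.0, -1.0, -3.0), (1.0, -1.0, -3.0), (0.0, 1.0, -3.0))

def coarse_sdf(p):
    """Conservative bound: distance to a sphere enclosing the triangle."""
    dx, dy, dz = p[0], p[1], p[2] + 3.0
    return math.sqrt(dx*dx + dy*dy + dz*dz) - 1.5

def ray_triangle(o, d, tri):
    """Moller-Trumbore ray/triangle intersection; returns t or None."""
    a, b, c = tri
    e1 = tuple(b[i] - a[i] for i in range(3))
    e2 = tuple(c[i] - a[i] for i in range(3))
    pv = (d[1]*e2[2] - d[2]*e2[1], d[2]*e2[0] - d[0]*e2[2], d[0]*e2[1] - d[1]*e2[0])
    det = sum(e1[i] * pv[i] for i in range(3))
    if abs(det) < 1e-8:
        return None
    tv = tuple(o[i] - a[i] for i in range(3))
    u = sum(tv[i] * pv[i] for i in range(3)) / det
    if u < 0.0 or u > 1.0:
        return None
    qv = (tv[1]*e1[2] - tv[2]*e1[1], tv[2]*e1[0] - tv[0]*e1[2], tv[0]*e1[1] - tv[1]*e1[0])
    v = sum(d[i] * qv[i] for i in range(3)) / det
    if v < 0.0 or u + v > 1.0:
        return None
    t = sum(e2[i] * qv[i] for i in range(3)) / det
    return t if t > 1e-4 else None

def hybrid_trace(o, d, max_steps=64, near=0.25):
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o[i] + d[i] * t for i in range(3))
        dist = coarse_sdf(p)
        if dist < near:                 # near the surface: do the exact test
            return ray_triangle(o, d, TRI)
        t += dist                       # far away: take the cheap SDF step
        if t > 100.0:
            break
    return None

print(hybrid_trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0)))  # prints ~3.0
```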
 
The "better driver" part is also BS. In reality, it's only some hardware error concealing features activated (ECC RAM, error concealing in video decoder etc.), and a couple of virtualization features unlocked in the firmware.
It's not BS. Workstation sales pay for driver optimizations for professional apps. These optimizations are the main thing you're paying for in markets like CAD/CAM.
 
Some games might use pure BVH and support RTX hardware for doing so. But others might use pure signed distance fields or cone tracing, or some hybrid of all three. Having purely fixed-function hardware for only one of these might not be worth the cost of silicon, relatively fixed as that cost has become versus the past, compared to hardware with more multi-use capabilities.
They're not removing the compute units so no need to panic about "purely fixed function hardware".
 