Nvidia Turing Product Reviews and Previews: (Super, TI, 2080, 2070, 2060, 1660, etc)

Launching the Titan RTX first, and on its own (along with Quadro RTX), would probably have helped.
 
Exactly. Until software is using the RTX cores to accelerate raytracing, it's redundant hardware. Sales should pick up once the software is up to speed, but who's going to update the software for RTX-specific paths if no one's using it? The old chicken-and-egg issue. Once DXR is widespread across hardware, it'll make sense for the software companies to target it.
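
To be concrete about what "targeting DXR" means on the software side, here's a minimal sketch of the capability check an app does. It assumes an already-created ID3D12Device and a Windows 10 SDK new enough to declare the OPTIONS5 feature struct; error handling is omitted:

```cpp
// Minimal sketch: how an application asks D3D12 whether DXR is available.
// Assumes a created ID3D12Device and a Windows 10 SDK (1809+) that
// declares D3D12_FEATURE_D3D12_OPTIONS5. Error handling omitted.
#include <d3d12.h>

bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    // TIER_1_0 just means the runtime/driver expose raytracing; it says
    // nothing about whether it runs on dedicated RT cores or a
    // compute-shader fallback.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

Which is exactly why it makes sense for software vendors to wait for DXR to be widespread: one code path, and whatever hardware acceleration sits underneath is the driver's problem.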
Not to get too much OT for a product review thread: aside from Adobe utilizing RT in their professional software, the latest collaboration (Jan. 2019) is with AutoDesk to use RT core acceleration with Arnold, Maya and 3ds Max. The companies that initially planned to use RTX cores in some capacity, and the current application list that keeps growing, indicate Quadro Pro sales are quite healthy. Any links regarding your professional RTX software penetration claims, or is it that professional segment sales are down industry-wide?
 
"Adobe utilizing RT in their professional software" <- What does that even mean ?

" the latest collaboration (Jan. 2019) is with AutoDesk to use RT core acceleration with Arnold, Maya and 3ds Max" <- Arnold is currently a CPU Ray-Tracer which has the option to use OptiX for denoising using CUDA (on ALL NVidia GPUs). Arnold GPU has been announced by Marcos in 2011 we are on 2019 nobody outside of Solid Angle/Autodesk knows the full extend of it's GPU acceleration (it was functional in 2017/2018 way before Turing GPUs were on the map so it obviously already supports either or both CUDA & OpenCL besides RT Cores).

"Companies that initially planned to use RTX cores in some capacity"
As of today, January 29 2019, only ONE publicly available piece of software uses the RT cores (Substance Designer, for baking maps). OptiX 6, which will bring support for Turing GPUs, is not finalised yet and there's no release date (should be in the coming months...). Every single piece of software featuring OptiX denoising is running it on CUDA cores. At this moment the RT cores & Tensor cores are totally useless (unless some lunatic specifically bought a Turing GPU just to bake maps in Substance...).
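
For reference, this is roughly how that OptiX denoising path is driven from application code. It's a sketch following the optixDenoiser SDK sample pattern (the "DLDenoiser" stage and the buffer variable names come from that sample; context and buffer setup are assumed to exist), so treat the exact API details as approximate:

```cpp
// Rough sketch of the OptiX 5.x/6.0 built-in AI denoiser, which is what
// current "OptiX denoising" features run on (CUDA cores, any supported
// NVidia GPU). Names follow the optixDenoiser SDK sample ("DLDenoiser",
// "input_buffer", "output_buffer"); context/buffer setup is assumed.
#include <optixu/optixpp_namespace.h>

void DenoiseBeautyPass(optix::Context context,
                       optix::Buffer  noisyBeauty,  // RGBA32F render result
                       optix::Buffer  denoisedOut,  // RGBA32F output
                       unsigned width, unsigned height)
{
    // The denoiser is a built-in post-processing stage, not an RT-core path.
    optix::PostprocessingStage denoiser =
        context->createBuiltinPostProcessingStage("DLDenoiser");
    denoiser->declareVariable("input_buffer")->set(noisyBeauty);
    denoiser->declareVariable("output_buffer")->set(denoisedOut);

    // Post-processing stages are executed through a command list.
    optix::CommandList commands = context->createCommandList();
    commands->appendPostprocessingStage(denoiser, width, height);
    commands->finalize();
    commands->execute();
}
```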
 
Any links regarding your professional RTX software penetration claims, or is it that professional segment sales are down industry-wide?
More an absence of links announcing RTX integration into renderers. I know Blender's Cycles only recently got a build that even runs on RTX cards, let alone uses custom RTX paths for acceleration. Your post says that as of just this month, RTX will come to Adobe, meaning it's not there yet. So if you are a professional using Arnold, how much incentive was there to rush out and get an RTX card if acceleration for it isn't present in Arnold yet? Surely it makes more sense to wait until the software is upgraded.

It's common sense from that point on, the old chicken-and-egg argument: something needs to incentivise pro software to add an RTX path. That probably ought to be nVidia funding it, and they really should have done that for the hardware launch. Imagine if, the day RTX released, companies could buy a card and get 10x faster raytracing then and there. Sales would have been stronger than they have been (nVidia saying they've missed targets shows sales aren't that great, unless they really did overestimate demand).
 
The only thing wrong with Turing/RTX is the price, way too high.
No. Price doesn't matter if the card doesn't bring advantages. If you are a pro imaging firm running Arnold on 1080s, and the 2080 comes out at a decent price but is no faster at raytracing, then there's no point spending money to upgrade your GPUs. The price isn't at all unrealistic for the professional markets; a 10x speed-up in raytracing is phenomenal. The problem here is that the software isn't using the hardware to be accelerated, meaning no matter what the price, the GPU is a dumb purchase, as you're spending money for no gains. The cost/gains ratio is infinitely bad (see the toy numbers below).

The moment pro imaging utilises RTX properly, it'll be well worth the money (not necessarily for gaming).
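
To put toy numbers on that cost/gains point, here's a trivial sketch; every figure in it (card price, daily render load, value of a saved GPU-hour) is made up purely for illustration:

```cpp
// Toy payback calculation; every number here is made up for illustration.
// The point: with no software support the effective speedup is 1.0x,
// savings are zero, and the payback period diverges.
#include <cstdio>
#include <initializer_list>

int main()
{
    const double cardCost       = 1200.0; // hypothetical RTX card price, USD
    const double gpuHoursPerDay = 8.0;    // hypothetical daily render load
    const double valuePerHour   = 5.0;    // hypothetical value of a saved hour

    for (double speedup : {1.0, 10.0}) {
        double hoursSaved  = gpuHoursPerDay * (1.0 - 1.0 / speedup);
        double savedPerDay = hoursSaved * valuePerHour;
        if (savedPerDay <= 0.0)
            std::printf("%5.1fx speedup: zero gains, payback never\n", speedup);
        else
            std::printf("%5.1fx speedup: payback in %.0f days\n",
                        speedup, cardCost / savedPerDay);
    }
    return 0;
}
```

With the 10x path working, the hypothetical card pays for itself in about a month; without software support, it never does.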
 
Has that demo been rolled out into the final product?

https://www.pugetsystems.com/labs/a...IA-GeForce-RTX-2080-2080-Ti-Performance-1244/
What makes these new RTX cards hard to review and test is the fact that Premiere Pro currently does not use either of these new types of cores. We can (and will) look at straight performance gains with the current version of Premiere Pro, but really what you are paying for is technology that might give you significant performance gains in the future.

As for the factual links, just link to the products that currently feature RTX acceleration. My Googlage hasn't found any.
 
So if you are a professional using Arnold, how much incentive was there to rush out and get an RTX card if acceleration for it isn't present in Arnold yet? Surely it makes more sense to wait until the software is upgraded.
Agreed. The main point is I don't think any company is waiting to incorporate RT into their software products. I believe work on Arnold started around August, and it likely takes time for these companies to modify their existing software.
 
Adobe Dimension CC

As for your other comments, kindly back them up with factual links, as anything else is just rumor and speculation.

Dude, you are just repeating PR material from Nvidia in every single one of your posts. Not a single piece of software on the market right now besides Substance Designer supports RT cores (or Tensor cores, because OptiX 6 isn't out yet!). That PR video you posted again is nothing more than a PR video. It was an R&D project done by Dimitri Diakopoulos while he was at Adobe last year to showcase RTX at Siggraph 18 and AdobeMAX 18. Nothing in the Adobe CC suite uses the RT cores & Tensor cores today.
 
I see it as the PS2 (or any console with capable hardware): nice hardware but not many games/software, and overpriced units. Software takes a while to catch up with new hardware.
RTX/Turing for gaming is great, just priced too high; that's what one reads everywhere, not many complaints about performance. Aside from naysayers, then.

Ike Turner, your caps lock was left on.
 
Pharma, please stop posting hollow PR statements about the future as if they are the word of God. It really makes you come across as hostile towards open, rational discussion.
 
Try google.
I already mentioned I didn't find much on Google. Your Google result is a repeat of what you already said about Arnold, which isn't available yet, and as others state is just an nVidia promo piece about what could be done in the future.

What applications have RTX acceleration? If you're a professional artist, which applications are driving your purchase of RTX boards? Or are there no applications available yet, resulting in lacklustre adoption of RTX while professionals wait for the software to justify it?
 
As I said earlier, Nvidia's biggest mistake was not just Turing pricing, but specifically Turing pricing in the face of the cresting tsunami of Pascal cards coming onto the secondary market out of mining rigs. They were used to dealing with customers who would spend $250-700 on GPUs every 2-3 years, not ones selling cards by the container after only a few months. Their models for upgrade forecasting, product EOL, etc. were all nuked; if there was ever a time to focus on value, it was now. Instead, they allowed themselves to envision a world where people were lining up to pay $800-1200 for GPUs. Talk about zigging when they should have zagged.
 
Nicely summed up; some don't understand this.
Once AMD and possibly Intel are more competitive, we will see prices adjust too.
 
People keep mentioning Intel like they're new to the market, but they've been selling socketed GPUs with an integrated CPU for years now.
 
Intel will compete with both Nvidia and AMD in the (near) future; between this and AMD's performance getting better and better, we will see prices come down eventually.
 