Nvidia Turing Product Reviews and Previews (Super, Ti, 2080, 2070, 2060, 1660, etc.)

At least the feature works, as confirmed by developers, which is a good thing.
According to some tweets today, some game areas still need to be polished. A patch enabling the RTX feature may be released after launch.
 
I was blown away by the initial presentation of Turing and am awed by the power of the 2080 Ti, but I find the pricing really problematic in the context of the 1080 Ti's current price/performance. I was looking hard at the 2080, but I just couldn't stomach both sacrificing performance and paying more for the RTX capabilities, and the 2080 Ti costs more than I care to spend. The 2070 seems like a good value but doesn't hit the performance level I want. So I ended up jumping on an EVGA SC2 1080 Ti for $599 instead to get the performance I'm looking for at 4K. I will still have the option to use EVGA's Step-Up program for the next 90 days if I change my mind after reviews come out, though.
 
Oh, I was not expecting the card to struggle at 1080p, even if these are not fully optimised demos. I can't see players going from 1440p or 4K down to 1080p at sub-60 fps for better lighting.

Anyway, you need to get a first product out... Still impressed by the concept, but yeah, maybe they needed to wait for 7nm. At the same time, they're alone in the market and can try whatever they want without big risks.

Now, I'm very curious about non-RT performance.
 
Yup, here's AnandTech's take on performance:

https://www.anandtech.com/show/13261/hands-on-with-the-geforce-rtx-2080-ti-realtime-raytracing

If the $1,200 2080 Ti struggles to maintain 60 fps at 1080p, I wonder how useful RT will be on the 2070, with a fraction of the former's performance, let alone on the lower-range chips (TU106?) that may appear.


Is there some form of smart upscaling (checkerboard-like) planned for RTX? That could help somewhat, at least with the 2080 Ti.
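
To be clear about what I mean by "checkerboard-like", here's a rough plain-C++ sketch (made-up buffers and a placeholder trace function, nothing from any actual RTX API): trace only half the pixels each frame in an alternating pattern and fill the holes from the previous frame.

```cpp
#include <cstddef>
#include <vector>

struct Color { float r, g, b; };

// Hypothetical stand-in for the expensive ray-traced shading of one pixel.
Color tracePixel(int x, int y)
{
    return { x * 0.001f, y * 0.001f, 0.5f };
}

// Trace only half the pixels each frame in an alternating checkerboard
// pattern; fill the other half from the previous frame's result.
void checkerboardFrame(std::vector<Color>& current,
                       const std::vector<Color>& previous,
                       int width, int height, int frameIndex)
{
    const int parity = frameIndex & 1;   // which half gets fresh rays

    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            const std::size_t i = std::size_t(y) * width + x;
            if (((x + y) & 1) == parity)
                current[i] = tracePixel(x, y);   // freshly traced
            else
                current[i] = previous[i];        // reused from last frame
        }
    }
}
```

Half the rays per frame for (ideally) little visible loss, which is why it helps consoles so much at 4K.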
 
It's already doing that "smart upscaling" by shooting far fewer rays than necessary and denoising the heck out of the result (which is the right way to do it). But we still don't know how Nvidia calculated the performance metrics, and we probably won't until a third party tests them. 10 Gigarays/s? OK. But when/where? In the Cornell Box? A "real" game scene? Microsoft's DXR samples? From the presentation it was clear that the Cornell Box demo was not in 4K and not running at 60 fps either (it looked like 25/30 fps).
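
Roughly the idea, as a toy C++ sketch (traceOneRay is a made-up placeholder, and the naive blur just stands in for the far smarter AI/temporal denoisers Nvidia showed):

```cpp
#include <cmath>
#include <cstddef>
#include <random>
#include <vector>

struct Color { float r, g, b; };

// Placeholder: pretend this shoots ONE jittered ray and shades the hit.
Color traceOneRay(float px, float py)
{
    return { std::fmod(px * 0.13f, 1.0f), std::fmod(py * 0.07f, 1.0f), 0.2f };
}

std::vector<Color> render(int w, int h)
{
    std::mt19937 rng(42);
    std::uniform_real_distribution<float> jitter(0.0f, 1.0f);

    // Pass 1: a single noisy sample per pixel instead of the hundreds
    // a converged path trace would need.
    std::vector<Color> noisy(std::size_t(w) * h);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            noisy[std::size_t(y) * w + x] =
                traceOneRay(x + jitter(rng), y + jitter(rng));

    // Pass 2: stand-in "denoiser" -- a naive 3x3 box blur. The real
    // denoisers are edge-aware and temporal, but the principle is the
    // same: spend few rays, reconstruct the rest.
    std::vector<Color> out(noisy.size());
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float r = 0, g = 0, b = 0;
            int n = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    const int nx = x + dx, ny = y + dy;
                    if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
                    const Color& c = noisy[std::size_t(ny) * w + nx];
                    r += c.r; g += c.g; b += c.b; ++n;
                }
            out[std::size_t(y) * w + x] = { r / n, g / n, b / n };
        }
    return out;
}
```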


 
People were really expecting that real-time raytracing would debut with open-world games running at 60+ fps and 4K resolution right off the bat?
They thought we would go from “no raytracing at all” straight to “raytraced Far Cry at 4K/60fps” after a single GPU generation?
 
@Ryan Smith, did you try any games that weren't running the ray-tracing feature?
In terms of frame rate, Shadow of the Tomb Raider ran at a mostly consistent 50-57 fps, which is impressive given the game is running on a single GPU and in such an early state – on top of all the new ray tracing techniques.

We also played a variety of other PC games that shall not be named, and saw performance run in excess of 100 fps at 4K and Ultra settings. Unfortunately, we also don’t know how much power these GPUs had to draw to reach this level of performance.
https://www.techradar.com/au/reviews/nvidia-geforce-rtx-2080-ti
 
People were really expecting that real-time raytracing would debut with open-world games running at 60+ fps and 4K resolution right off the bat?
They thought we would go from “no raytracing at all” straight to “raytraced Far Cry at 4K/60fps” after a single GPU generation?


Not 4K60... but let's say a 20% performance drop? With dedicated hardware/cores, I was expecting more. It's not like they were using classic shader cores to do this.
 
I rather think your expectations are unrealistic.

Anyway, what is a dedicated raytracing core? From what I'm told, memory architecture is also rather crucial for raytracing. Throwing "moar cores!!!11" (dedicated or not) at the problem won't solve that much.

(Later edit: I do think that being able to showcase hybrid rendering techniques in shipping games is quite impressive on nV's part.)
 
Well, I guess the RT cores have a very particular way to manage memory access, since yes, that's the crucial point for raytracing. And I don't know, Caustic / PowerVR were showing real-time demos in what, 2014 or 2016? I hoped Nvidia could do better than what they showed a few days ago. Still, don't get me wrong, it's an impressive step.
 
Yeah, I am sticking with my Asus Strix 1080 Ti, which is highly overclocked out of the box. I honestly have yet to hit a game that this thing can't tear through. Playing Witcher 3 at 4K maxed out is an awe-inspiring experience, even with high-res texture mods etc. applied. Perhaps next summer, when I build a new gaming rig, something new will be out on 7nm and it will be worth upgrading then. Still keen on seeing what the gaming benchmarks show when the product finally hits the market. That said, I loved the Nvidia presentation and demos. Just wow.
 
I love speculating on future products and I wish Nvidia would share their roadmap like they used to, but I don't expect a 7nm product for consumers in 2019.

Sure, this generation of cards could have a shorter run, but if it does, it will be 18 months, not less than a year (maybe a spring 2020 launch, like Pascal's?). Next year, I bet they announce a V100 successor on 7nm that will ship at the end of the year, with no consumer-derivative parts.
 
From what I'm told, memory architecture is also rather crucial for raytracing.

That’s a good point. While the BVH and intersection acceleration are critical components, they still depend on the same shared caches and bandwidth as the rest of the chip. And raytracing is just one of many things happening during the rendering of a frame.
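
To make that concrete, here's a rough plain-C++ sketch of BVH traversal (made-up node layout, nothing like the actual hardware format); note that every step is a load whose address depends on the previous one:

```cpp
#include <algorithm>
#include <cstdint>
#include <utility>
#include <vector>

struct AABB { float min[3], max[3]; };
struct Ray  { float origin[3], invDir[3]; };  // invDir = 1 / direction

struct BVHNode {
    AABB    bounds;
    int32_t left;      // child index, -1 marks a leaf
    int32_t right;
    int32_t firstTri;  // leaf payload (triangle range)
    int32_t triCount;
};

// Slab test: cheap ALU work compared to the cost of fetching the node.
bool intersectAABB(const Ray& r, const AABB& b)
{
    float tmin = 0.0f, tmax = 1e30f;
    for (int a = 0; a < 3; ++a) {
        float t0 = (b.min[a] - r.origin[a]) * r.invDir[a];
        float t1 = (b.max[a] - r.origin[a]) * r.invDir[a];
        if (t0 > t1) std::swap(t0, t1);
        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
    }
    return tmin <= tmax;
}

// Each iteration loads a node whose address depends on the previous
// load (classic pointer chasing), and incoherent rays visit different
// nodes, so locality across rays is poor as well.
int countLeafHits(const std::vector<BVHNode>& nodes, const Ray& ray)
{
    if (nodes.empty()) return 0;
    int hits = 0;
    std::vector<int32_t> stack{0};               // start at the root
    while (!stack.empty()) {
        const BVHNode& n = nodes[stack.back()];  // dependent load
        stack.pop_back();
        if (!intersectAABB(ray, n.bounds)) continue;
        if (n.left < 0) { ++hits; continue; }    // leaf: would test tris
        stack.push_back(n.left);
        stack.push_back(n.right);
    }
    return hits;
}
```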

Once we see actual performance of games with and without DXR enabled, we’ll have a much better idea of whether the transistors were worth it. I suspect there are enough graphics whores out there who would appreciate better visuals even if it means settling for 60 fps, especially in slower-paced games.

What we need are talented devs who can really showcase this stuff. I don’t have high hopes for impressive visuals from 3DMark given their last few benchmarks, but fingers crossed.
 