Nvidia Turing Product Reviews and Previews (Super, Ti, 2080, 2070, 2060, 1660, etc.)

It's already doing that "smart upscaling" by casting far fewer rays than necessary and denoising the heck out of the result (which is the right way to do it). But we still don't know how Nvidia's performance metrics were calculated, and we probably won't until a third party tests them. 10 GRays/s? OK, but when and where? In the Cornell Box? A "real" game scene? Microsoft's DXR samples? From the presentation it was clear that the Cornell Box demo was not in 4K and not running at 60 fps either (it looked like 25-30 fps).
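For some napkin math on why casting few rays and denoising is the only workable approach, here is what a 10 GRays/s budget buys at a few assumed resolutions and frame rates (my assumptions, not Nvidia's stated test conditions):

Code:
# Napkin math: rays per pixel per frame from a 10 GRays/s budget.
# The resolutions and frame rates below are assumptions, not Nvidia's test setup.
rays_per_second = 10e9

targets = {
    "4K @ 60 fps":    (3840, 2160, 60),
    "1440p @ 60 fps": (2560, 1440, 60),
    "1080p @ 30 fps": (1920, 1080, 30),
}

for label, (w, h, fps) in targets.items():
    rays_per_pixel = rays_per_second / (w * h * fps)
    print(f"{label}: ~{rays_per_pixel:.0f} rays per pixel per frame")
# 4K @ 60 fps: ~20, 1440p @ 60 fps: ~45, 1080p @ 30 fps: ~161

Roughly 20 rays per pixel at 4K60 has to cover primary visibility, shadows, reflections and any GI bounces, so sparse sampling plus aggressive denoising is pretty much mandatory.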


There's been a lot of chatter on Twitter from devs about Nvidia's 10 GRays/s figure in the past few days, with some of them wondering how/when/where it was achieved, given that there are a shit ton of ways to "ray trace", and WTH those RT Cores are all about exactly. Sebbi just blessed us with the following:

 
Apologies if already posted. Nvidia claims 2x the performance of the 1080 Ti, though it's not exactly apples to apples (TAA vs DLSS).



https://blogs.nvidia.com/blog/2018/08/20/geforce-rtx-real-time-ray-tracing/
Well, that's a four-year-old, non-interactive demo built on an engine version that has since been deprecated (UE 4.9) and which is freely available to the public and to NV (assets, shaders, etc.), so they could have tweaked a ton of stuff in there to make their case. You would hope that they had more to show. Any other company would have been ridiculed for claiming better performance this way. :cry:
 
So around 50% better performance at 4K? Of course there's little information here, and we all love marketing graphs, but it's something.

Seems to be faster in HDR games and async compute workloads. Weird selection of games, though. The 40% increase in bandwidth and 15% increase in CUDA cores over the 1080 seem to line up nicely with that (especially considering the bandwidth difference and those benchmarks being run at 4K).
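Quick sanity check on those ratios, assuming the reference board specs (GTX 1080 at 2560 CUDA cores and 320 GB/s, RTX 2080 at 2944 CUDA cores and 448 GB/s):

Code:
# Ratio check between reference GTX 1080 and RTX 2080 specs.
gtx_1080 = {"CUDA cores": 2560, "bandwidth (GB/s)": 320}  # 10 Gbps GDDR5X, 256-bit
rtx_2080 = {"CUDA cores": 2944, "bandwidth (GB/s)": 448}  # 14 Gbps GDDR6, 256-bit

for key in gtx_1080:
    gain = rtx_2080[key] / gtx_1080[key] - 1
    print(f"{key}: +{gain:.0%}")
# CUDA cores: +15%, bandwidth (GB/s): +40%

Clock speeds and any per-core architectural changes aren't captured here, so this only roughly lines up with the ~50% marketing figure at bandwidth-hungry 4K.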
 
Some more marketing numbers ....



Deep Learning Super Sampling
Performance measurements will be a bit different this round, though, as NVIDIA is introducing DLSS, a supersampling anti-aliasing method that runs as AI directly on the Tensor cores after the image frame has been finished. The new technology looks promising and should offer big performance increases, as you no longer apply, say, TAA in the rendering engine; DLSS runs on the Tensor cores instead, as AI-based AA.

We've seen demos of the same game running on a 1080 Ti and on a 2080 Ti, and performance was often doubled with close to the same image quality. DLSS is short for Deep Learning Super Sampling, and it looked quite impressive. DLSS is not game-specific, and in the future it should work on most game titles.

https://www.guru3d.com/news-story/nvidia-releases-performance-metrics-and-some-info-on-dlss.html
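Nvidia hasn't said what the DLSS network actually looks like, but as a rough illustration of the idea described above (a model fitted offline to map lower-resolution renders to higher-resolution ground truth, then applied as a cheap post-process), here's a toy 2x "learned upscaler" in plain numpy. Everything in it is illustrative; it is not Nvidia's pipeline.

Code:
import numpy as np

# Toy "learned 2x supersampler": fit one linear model mapping a 3x3 low-res
# neighbourhood to the 2x2 high-res pixels it covers. A stand-in for the idea
# of offline-trained upscaling/AA, not for DLSS's actual (unpublished) network.

def make_frame(h, w, phase=0.0):
    """Synthetic 'render': smooth gradients plus some hard edges."""
    y, x = np.mgrid[0:h, 0:w]
    img = np.sin(x / 17.0 + phase) + np.cos(y / 23.0)
    img += (x % 40 < 20).astype(float)  # hard vertical edges to stress the AA
    return img

def downsample(img):
    """2x box filter, standing in for rendering at half resolution."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def to_pairs(hi, lo):
    """Collect (3x3 low-res patch, 2x2 high-res block) training pairs."""
    X, Y = [], []
    for i in range(1, lo.shape[0] - 1):
        for j in range(1, lo.shape[1] - 1):
            X.append(lo[i - 1:i + 2, j - 1:j + 2].ravel())
            Y.append(hi[2 * i:2 * i + 2, 2 * j:2 * j + 2].ravel())
    return np.array(X), np.array(Y)

# "Training" (offline, against ground-truth frames): least squares instead of a CNN.
hi_train = make_frame(256, 256)
X, Y = to_pairs(hi_train, downsample(hi_train))
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# "Inference" (per frame, as a post-process) on a frame not seen during training.
hi_test = make_frame(256, 256, phase=1.3)
lo_test = downsample(hi_test)
recon = np.zeros_like(hi_test)
for i in range(1, lo_test.shape[0] - 1):
    for j in range(1, lo_test.shape[1] - 1):
        patch = lo_test[i - 1:i + 2, j - 1:j + 2].ravel()
        recon[2 * i:2 * i + 2, 2 * j:2 * j + 2] = (patch @ W).reshape(2, 2)

print(f"mean error vs ground truth: {np.abs(recon - hi_test)[2:-2, 2:-2].mean():.3f}")

The real thing presumably uses a deep network, temporal inputs and far better training targets, but the split is the same: expensive fitting offline, cheap per-frame evaluation (on the Tensor cores, in Turing's case).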
 
Why would they put 60 Hz in the slide specification? How is that relevant for unlocked fps numbers?
 
Why would they put 60 Hz in the slide specification? How is that relevant for unlocked fps numbers?

Because you can maintain at least 60 fps in all titles on that slide. Them not including settings other than res makes this a bit pointless. Benchmarks are going to be interesting.
 
It's so stupid of NVIDIA not to include this comparison in the conference, instead of focusing exclusively on RTX.
Them not including settings other than res makes this a bit pointless. Benchmarks are going to be interesting.
NV always tests at Ultra quality; that's how you get the regular 1080 to choke at 4K like that anyway.
 
NV always tests at Ultra quality; that's how you get the regular 1080 to choke at 4K like that anyway.

It's very unlikely that they've tested FFXV at Ultra. It can barely maintain 30 fps on an overclocked 1080 Ti with everything maxed out. The Destiny 2 slide is also a joke, as it can easily reach 100 fps on a 1080 Ti. I'll say it again: benchmarks are going to be interesting. :p
 
Excluding DLSS and RTX ray tracing, we are seeing gains of no less than 30% for the RTX 2080 at 4K HDR 60 Hz.
 
It's very unlikely that they've tested FFXV at Ultra. It can barely maintain 30 fps on an overclocked 1080 Ti with everything maxed out. The Destiny 2 slide is also a joke, as it can easily reach 100 fps on a 1080 Ti. I'll say it again: benchmarks are going to be interesting. :p
This isn't the 1080 Ti; they are comparing it to a regular 1080: regular 2080 vs regular 1080.
And no, Destiny 2 at Ultra/max settings and 4K can't maintain 60 fps on a 1080 Ti, especially with 3D AO and Ultra DoF.

As for the 2080 Ti vs the 1080 Ti:
We've seen demos of the same game running on a 1080 Ti and on a 2080 Ti, and performance was often doubled with close to the same image quality.
https://www.guru3d.com/news-story/nvidia-releases-performance-metrics-and-some-info-on-dlss.html
 
Some German pages rate the new DLAA/DLSS positively. I would like to see this for myself.

"ComputerBase also had the opportunity to take a look at the new DLSS edge smoothing (Deep-Learning-Super-Sampling) itself. A comparison system was set up that showed the Epic infiltrator demo. One monitor was running the demo with the classic TAA, the other with the new DLSS. And indeed, the result looked much better with DLSS - in two respects

At first, the edge smoothing itself was significantly better. The TAA treats some edges very well in the demo, but others less so that the image flickers. With DLSS, on the other hand, the edges are smoothed almost throughout, so that the image appears smoother. Furthermore, the demo with TAA loses some of its sharpness. The problem does not appear with DLSS. And last but not least, there are smaller picture elements that are no longer displayed correctly with TAA, but with DLSS. Finally, the DLSS in Epic's infiltrator demo is visibly better than Temporal Anti-Aliasing. Whether DLSS looks better in current games, however, remains to be seen.

In addition, a frame counter was displayed during the demo. DLSS was calculated on a GeForce RTX 2080 Ti, the TAA system was equipped with a GeForce GTX 1080 Ti. The new Nvidia graphics card was about 20 to 100 percent faster, depending on the scene. Up to twice the performance can be seen in the first Nvidia film in games that also use DLSS. However, this is not directly comparable with each other, so the "DLSS" bars should be ignored in this respect. ComputerBase will clarify exactly what is behind DLSS in the launch review at a later date."

https://www.computerbase.de/2018-08/nvidia-geforce-rtx-2080-performance-benchmark/


"A 4K presentation of DLAA instead of TAA with Epics infiltrator demo showed a drastically smoother frame rate on the Geforce GTX 2080 than on the Geforce GTX 1080 with a subjectively similarly smooth Image."
https://www.golem.de/news/nvidia-tu...rechnet-50-prozent-schneller-1808-136141.html
 
German pages rate the new DLAA/DLSS positively. I would like to see this for myself. [...]

https://www.computerbase.de/2018-08/nvidia-geforce-rtx-2080-performance-benchmark/
https://www.golem.de/news/nvidia-tu...rechnet-50-prozent-schneller-1808-136141.html
Would be interesting to see how well DLSS works on other titles during real, live gameplay. The DLSS "AI" has probably been trained by running the Infiltrator demo (which is 100% predictable content, given that it's simply a real-time "movie") at super-high resolutions on the DL clusters for the ground truth, producing a near-perfect ML model which is then integrated into the driver. What I'm trying to say is that in the case of the Infiltrator demo (and the Porsche demo, which is the only other DLSS demo shown) it's super easy to train the AI, since the content is 100% predictable (unlike actual gameplay), and to then ship a near-perfect model in the driver for nearly perfect IQ.
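To make the "predictable content" point concrete: with a scripted, deterministic fly-through you can regenerate the exact same (input, ground truth) pairs at will. A little sketch with a toy stand-in renderer (the camera path and renderer are made up, not real tooling):

Code:
import hashlib
import numpy as np

# Why canned demo content is the easy case: a scripted camera path makes every
# frame bit-for-bit reproducible, so (runtime-res input, supersampled ground
# truth) pairs can be regenerated at will. toy_render is a made-up stand-in
# for an engine hook, not real tooling.

def camera_path(n_frames):
    """Deterministic fly-through, like a scripted demo (no player input)."""
    for t in np.linspace(0.0, 1.0, n_frames):
        yield {"x": 10.0 * t, "z": 5.0 * np.sin(t * np.pi)}

def toy_render(cam, width, height):
    """Stand-in 'renderer': any pure function of camera + resolution will do."""
    y, x = np.mgrid[0:height, 0:width]
    return np.sin(x * 0.05 + cam["x"]) * np.cos(y * 0.05 + cam["z"])

def dataset_digest(width, height):
    """Hash all (low-res, ground-truth) pairs to show the dataset is replicable."""
    digest = hashlib.sha256()
    for cam in camera_path(30):
        digest.update(toy_render(cam, width, height).tobytes())          # runtime input
        digest.update(toy_render(cam, width * 2, height * 2).tobytes())  # ground truth
    return digest.hexdigest()

# Two independent runs produce an identical dataset -- unlike player-driven gameplay.
print(dataset_digest(320, 180) == dataset_digest(320, 180))  # True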
 
If DLSS isn’t robust enough to have one model work for all visual styles, they can always train game-specific models and add them to the driver. Not that big a deal, and generating training data for games is laughably trivial compared to other ML use cases.

PS. It’s pretty freaking amazing that we’re even discussing machine learning based AA right now :D
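If one universal model doesn't hold up, the driver-side part really is trivial: something like a per-title lookup with a generic fallback. A minimal sketch; every name and path below is hypothetical.

Code:
# Hypothetical per-title model selection in the driver; none of these
# file or executable names are real.
GENERIC_MODEL = "models/dlss_generic.bin"

PER_TITLE_MODELS = {
    "infiltrator_demo.exe": "models/dlss_infiltrator.bin",
    "some_game.exe":        "models/dlss_some_game.bin",
}

def select_model(executable_name: str) -> str:
    """Use a game-specific model if one shipped with the driver, else the generic one."""
    return PER_TITLE_MODELS.get(executable_name.lower(), GENERIC_MODEL)

print(select_model("Infiltrator_Demo.exe"))  # models/dlss_infiltrator.bin
print(select_model("unknown_title.exe"))     # models/dlss_generic.bin

The hard part is obviously training and shipping those models, not the dispatch.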
 
If DLSS isn’t robust enough to have one model work for all visual styles, they can always train game-specific models and add them to the driver. Not that big a deal, and generating training data for games is laughably trivial compared to other ML use cases.

PS. It’s pretty freaking amazing that we’re even discussing machine learning based AA right now :D

That's exactly what Jensen said they would be doing. It's just that you will never get a better-case scenario than "pre-recorded", 100% replicable content like the Infiltrator and Porsche demos shown, compared to real gameplay.
 
Maybe a silly question about some of the demonstration videos posted here... For instance, in the Tomb Raider video: is everything ray traced, or just the lighting and shadows?
 