Nvidia Turing Product Reviews and Previews: (Super, TI, 2080, 2070, 2060, 1660, etc)

Maybe a silly question about some of the demonstration videos posted here... For instance, the Tomb Raider video: is everything raytraced, or are just the lighting and shadows RTed?
Only the shadows. Oh and DLSS doesn't universally work on all games but will require driver support for each title (each game will have its own ML model trained by Nvidia, which will then be included in the driver).
 
According to videocardz, Sept 14th for benchmarks.

https://videocardz.com/77696/exclusive-nvidia-geforce-rtx-2080-ti-editors-day-leaks

No one has actual drivers yet for the cards. Don't look at this pic. It's embargoed until 9/14.


[Image: NVIDIA Turing vs Pascal shader performance slide]
 
If true, a nice departure from the usual same-day release / embargo-lift scenario!
 
What I'm trying to say is that in the case of the Infiltrator demo (and the Porsche demo, which is the only other DLSS demo shown) it's super easy to train the AI on it, given that the content is 100% predictable (unlike playing a game), and then have a perfect model in the driver for nearly perfect IQ.
The network has probably only a few thousand/ten thousand/whatever parameters.

That’s not nearly enough for it to matter whether it’s a predictable movie or a game.

They likely trained it on lots of data from different games and have a universal set of parameters. That’s what anybody would reasonably do, just like what everybody does for DNN picture upscalers and denoisers.

https://engineering.flipboard.com/2015/05/scaling-convnets

What you’re describing is an overtrained network.
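To put a rough number on that, here's a minimal sketch of the kind of generic DNN upscaler I mean: one small SRCNN-style network in PyTorch, trained on low/high-res frame pairs pooled from many different titles. The architecture and the random placeholder data are entirely my own illustration and have nothing to do with Nvidia's actual DLSS network.

# Minimal sketch of a "universal" DNN upscaler (hypothetical, not DLSS):
# one small network trained on frame pairs pooled from many games.
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 9, padding=4), nn.ReLU(),
            nn.Conv2d(64, 32, 5, padding=2), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1),  # 2x upscale via pixel shuffle
            nn.PixelShuffle(2),
        )

    def forward(self, x):
        return self.net(x)

model = TinyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()

# Placeholder batches standing in for crops pooled from many titles;
# a real pipeline would load (low-res, high-res ground truth) pairs.
for step in range(100):
    low = torch.rand(8, 3, 128, 128)    # downscaled input crops
    high = torch.rand(8, 3, 256, 256)   # matching ground-truth crops
    opt.zero_grad()
    loss = loss_fn(model(low), high)
    loss.backward()
    opt.step()

print(sum(p.numel() for p in model.parameters()), "parameters")

Even a toy this small lands in the tens of thousands of parameters, and that budget gets spread across everything the network ever sees rather than memorised per scene.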
 
They are actually going to train the network on a game-by-game basis, with each game having its own model which will then be included in the drivers. So it looks like they are not only going to have a universal data set but also one for each game. That's how it was also described during the Quadro RTX launch keynote at SIGGRAPH (the Porsche demo was rendered at super high res on the cluster to train the model, which was then used to denoise and AA the real-time version).
 
Sounds like something that game developers might eventually do once the process is better known.
DLSS technology applies deep learning and AI to rendering techniques thanks to the AI tensor cores for crisp, smooth edges on rendered objects in supporting game titles. DLSS will be part of the driver and NVIDIA will use game titles that developers give them for the deep learning / AI processing in NVIDIA’s labs. There will be no fee for developers for this and if a patch comes out that requires changes, NVIDIA will need to re-learn the title and that could take anywhere from a number of days to a week or so. NVIDIA gets game titles and patches ahead of the public, so they don’t see the AI processing time being an issue. As of right now the only cards that will support DLSS would be the GeForce RTX 2070, RTX 2080, and RTX 2080 Ti graphics cards.
....
NVIDIA showed us a demo that had a GeForce GTX 1080 Ti running the Temporal Anti-Aliasing (TAA) method right next to a system with a GeForce RTX 2080 Ti. The demo itself was the Unreal Engine 4 Infiltrator tech demo in 4K on NVIDIA G-Sync displays. Right off the bat we noticed that the system with the RTX 2080 Ti card was running the demo smoother, but it was also crisper. The frame rate meter in the upper left-hand corner had the GeForce GTX 1080 Ti running at 31 FPS and the GeForce RTX 2080 Ti running at 71 FPS in the scene below. That makes the 2080 Ti roughly 2.3x as fast as the 1080 Ti, with better looking images being rendered. NVIDIA has shown a 50% increase in performance going from a 1080 Pascal based card to a 2080 Turing based card, so it looks like enabling DLSS helps performance and doesn’t hurt it. All of the technical discussion around DLSS is still under embargo, but it looks impressive.
http://www.legitreviews.com/nvidia-deep-learning-super-sampling-dlss-shown-to-press_207461
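For reference, a quick check of the Infiltrator figures quoted above (only the 31 and 71 FPS numbers come from the demo; the arithmetic is mine):

# Quick check of the Infiltrator demo figures quoted above.
fps_gtx_1080_ti = 31.0   # TAA at 4K
fps_rtx_2080_ti = 71.0   # DLSS at 4K

ratio = fps_rtx_2080_ti / fps_gtx_1080_ti
print(f"{ratio:.2f}x the frame rate")        # ~2.29x
print(f"{(ratio - 1) * 100:.0f}% faster")    # ~129% faster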
 
If DLSS really is all about upsampling a low resolution image I can see it becoming a baseline requirement for consoles next generation. It has to be a lot cheaper than shipping hardware that can handle native 4K.

To be honest I don’t know how I feel about it after all these years of “real” pixels. Need to learn more about the science behind it.
 
Would be interesting to see how well DLSS works on other titles during real live gameplay. The DLSS "AI" has probably been trained by running the Infiltrator demo (which is 100% predictable content, given that it's simply a real-time "movie") at super high resolutions on the DL clusters for the ground truth, generating a perfect ML model which in turn is integrated into the driver. What I'm trying to say is that in the case of the Infiltrator demo (and the Porsche demo, which is the only other DLSS demo shown) it's super easy to train the AI on it, given that the content is 100% predictable (unlike playing a game), and then have a perfect model in the driver for nearly perfect IQ.

Of course this is an ideal scenario. Nevertheless, what gets trained is the game's visual style, and that mostly doesn't change significantly.

Since the training is based on real ground truth, I have read that the method also has the potential to correctly suppress artifacts that are present in a rasterized image but are still wrong, so the result would be closer to what higher resolution or supersampling produces.
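Just to illustrate what that ground-truth rendering could amount to on the data side, a tiny sketch: offline frames dumped at very high quality, then downsampled to produce the network input. The directory names, the 2x factor and the plain bilinear downscale are all made up for illustration; this is not Nvidia's pipeline.

# Illustrative only: building (input, ground truth) training pairs from
# offline high-quality renders, assuming frames dumped as PNGs.
# Hypothetical paths and filenames, not Nvidia's actual pipeline.
from pathlib import Path
from PIL import Image

GROUND_TRUTH_DIR = Path("renders/infiltrator_supersampled_4k")  # hypothetical
OUT_DIR = Path("training_pairs")
OUT_DIR.mkdir(exist_ok=True)

for frame in sorted(GROUND_TRUTH_DIR.glob("*.png")):
    hi = Image.open(frame)                                 # e.g. 3840x2160 ground truth
    lo = hi.resize((hi.width // 2, hi.height // 2), Image.BILINEAR)
    lo.save(OUT_DIR / f"{frame.stem}_input.png")           # what the network sees
    hi.save(OUT_DIR / f"{frame.stem}_target.png")          # what it should reproduce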
 
They are actually going to train the network on a game-by-game basis, with each game having its own model which will then be included in the drivers.
So it looks like they are not only going to have a universal data set but also one for each game. That's how it was also described during the Quadro RTX launch keynote at SIGGRAPH (the Porsche demo was rendered at super high res on the cluster to train the model, which was then used to denoise and AA the real-time version).
Here’s how things could work:
You typically train with a large set of disparate images from different sources. These should be sufficient to get good quality. If you have very specific high quality needs, you could do incremental training steps for a specific scenario.

It’s possible that Nvidia does this for their professional rendering, but I doubt that it is even the case for the Porsche example. My reading of the Siggraph presentation was that he just gave an example about how training works in general, not that they retrain the network for each particular case. It seems like it would be prohibitively time consuming to do so.

Edit: the paragraph above has already been invalidated by a later comment.

But, hey, if they do this even for games, why not? Even there my original point stands: the network would not be able to create a perfect image because the number of parameters is way too low. You’d just have an additional boost in quality.
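If both are true (a universal data set plus one model per game), the workflow could plausibly be "train once generically, then fine-tune briefly per title". A rough sketch of that idea; the stand-in network, the checkpoint names and the dummy frame loader are all invented for illustration:

# Sketch of a "generic model + per-title fine-tune" workflow (hypothetical).
import torch
import torch.nn as nn

# Stand-in for a generic 2x upscaler network.
model = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 12, 3, padding=1), nn.PixelShuffle(2),
)
# In practice you would start from the universal weights, e.g.:
# model.load_state_dict(torch.load("upscaler_generic.pt"))  # hypothetical file

def per_game_frame_pairs(title, steps=50):
    """Hypothetical loader of (low-res, ground-truth) crops captured from one
    specific title; random tensors here just keep the sketch runnable."""
    for _ in range(steps):
        yield torch.rand(4, 3, 128, 128), torch.rand(4, 3, 256, 256)

opt = torch.optim.Adam(model.parameters(), lr=1e-5)   # low LR: small correction only
loss_fn = nn.L1Loss()

for low, high in per_game_frame_pairs("shadow_of_the_tomb_raider"):
    opt.zero_grad()
    loss_fn(model(low), high).backward()
    opt.step()

torch.save(model.state_dict(), "upscaler_sottr.pt")   # one checkpoint per title

A short fine-tune like this would be cheap compared with the initial training run, which would fit the "days to a week or so" turnaround mentioned earlier.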
 
If DLSS really is all about upsampling a low resolution image I can see it becoming a baseline requirement for consoles next generation. It has to be a lot cheaper than shipping hardware that can handle native 4K.
It’s not exactly the same if they use stochastic sampling for the ray tracing part. With regular upsampling, you know that the known pixels are in a regular grid.
Whether this makes it easier or harder to achieve good quality? No idea.
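To make the sampling-pattern point concrete, a toy illustration of my own (nothing to do with how DLSS is actually fed):

# Regular-grid vs. jittered (stochastic) sample positions.
import numpy as np

rng = np.random.default_rng(0)
w = h = 4                                 # tiny low-res grid for readability

xs, ys = np.meshgrid(np.arange(w) + 0.5, np.arange(h) + 0.5)
regular = np.stack([xs, ys], axis=-1)     # known samples at fixed, predictable positions
jittered = regular + rng.uniform(-0.5, 0.5, regular.shape)  # per-sample jitter

print(regular.reshape(-1, 2)[:4])         # [0.5 0.5], [1.5 0.5], ...
print(jittered.reshape(-1, 2)[:4])        # same pixels, offset positions

With the regular grid the reconstruction filter always knows exactly where its neighbours are; with jittered samples it does not.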

To be honest I don’t know how I feel about it after all these years of “real” pixels. Need to learn more about the science behind it.
Think of the difficulty of benchmarking this. Just like Rage, which modulated render quality to keep constant frame rates. So many opportunities for online controversies!
 
Tom's Hardware's take on the marketing slides ...

[Image: NVIDIA GeForce RTX 2080 vs GTX 1080 performance slide]



Its comparison necessitates a bit of analysis, though.

Right out of the gate, we see that six of the 10 tested games include results with Deep Learning Super-Sampling enabled. DLSS is a technology under the RTX umbrella requiring developer support. It purportedly improves image quality through a neural network trained by 64 jittered samples of a very high-quality ground truth image. This capability is accelerated by the Turing architecture’s tensor cores and not yet available to the general public (although Tom’s Hardware had the opportunity to experience DLSS, and it was quite compelling in the Epic Infiltrator demo Nvidia had on display).

The only way for performance to increase using DLSS is if Nvidia’s baseline was established with some form of anti-aliasing applied at 3840x2160. By turning AA off and using DLSS instead, the company achieves similar image quality, but benefits greatly from hardware acceleration to improve performance. Thus, in those six games, Nvidia demonstrates one big boost over Pascal from undisclosed Turing architectural enhancements, and a second speed-up from turning AA off and DLSS on. Shadow of the Tomb Raider, for instance, appears to get a ~35 percent boost from Turing's tweaks, plus another ~50 percent after switching from AA to DLSS.

In the other four games, improvements to the Turing architecture are wholly responsible for gains ranging between ~40 percent and ~60 percent. Without question, those are hand-picked results. We’re not expecting to average 50%-higher frame rates across our benchmark suite. However, enthusiasts who previously speculated that Turing wouldn’t be much faster than Pascal due to its relatively lower CUDA core count weren’t taking underlying architecture into account. There’s more going on under the hood than the specification sheet suggests.


[Image: second slide with 4K HDR performance figures for GeForce RTX 2080]


A second slide calls out explicit performance data in a number of games at 4K HDR, indicating that those titles will average more than 60 FPS under GeForce RTX 2080.

Nvidia doesn’t list the detail settings used for each game. However, we’ve already run a handful of these titles for our upcoming reviews, and can say that these numbers would represent a gain over even GeForce GTX 1080 Ti if the company used similar quality presets.


https://www.tomshardware.com/news/nvidia-rtx-2080-gaming-benchmarks-rasterized,37679.html
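For what it's worth, the way those two gains would stack for Shadow of the Tomb Raider works out roughly like this (the ~35% and ~50% figures are Tom's Hardware's estimates from the slide; the multiplication is mine):

# How the two gains Tom's Hardware reads off the slide would compound.
arch_gain = 1.35   # Turing architectural improvement over Pascal
dlss_gain = 1.50   # swapping TAA for tensor-core-accelerated DLSS

total = arch_gain * dlss_gain
print(f"combined: {total:.2f}x, i.e. ~{(total - 1) * 100:.0f}% over GTX 1080")  # ~2x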
 
Only the shadows. Oh and DLSS doesn't universally work on all games but will require driver support for each title (each game will have its own ML model trained by Nvidia, which will then be included in the driver).

Well that would likely make it fairly useless to me as I mostly play indie games or high-A games nowadays. It's fairly unlikely that NV will go through the trouble of doing this for the hundreds or thousands of indie and high-A games out there.

Regards,
SB
 
Well that would likely make it fairly useless to me as I mostly play indie games or high-A games nowadays. It's fairly unlikely that NV will go through the trouble of doing this for the hundreds or thousands of indie and high-A games out there.
Stay tuned!
 
I presume that VRAM utilization is going to be much higher for most game engines given that the entirety of the scene is going to have to be transformed and sorted in a BVH that's resident in VRAM? You couldn't do any sort of early frustum culling on the CPU, correct?
 
Are you an Nvidia rep with privileged information?
With all the stock options? ... yeah, I wish! :LOL:

Actually I'm dying for more info, but something Tom's Hardware said about DLSS made me think the process could possibly incorporate a "generic" algorithm in the future for those old games no longer supported. While not perfect, it may provide better quality than the original game.
DLSS is a technology under the RTX umbrella requiring developer support. It purportedly improves image quality through a neural network trained by 64 jittered samples of a very high-quality ground truth image. This capability is accelerated by the Turing architecture’s tensor cores and not yet available to the general public (although Tom’s Hardware had the opportunity to experience DLSS, and it was quite compelling in the Epic Infiltrator demo Nvidia had on display).
https://forum.beyond3d.com/posts/2040706/
 