Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

Does nVidia have a VR eye-tracking solution?
hm...
https://uploadvr.com/nvidia-smi-siggraph-foveated-rendering-eye-tracking/ :?: (2016, though)

still googling:

https://blogs.nvidia.com/blog/2016/08/30/eye-tracking-deep-learning/

And in this scenario, I'm not sure how much power is needed to run AI models, or what the expectation should be. I can assume that more power will run things faster, i.e. higher frame rate or higher resolution, but it's not exactly 1:1. If you put in too much power, some of it may never be used.
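As a rough, hypothetical back-of-envelope (every number below is an assumption for illustration, not a measurement of DLSS), here's why the scaling isn't 1:1: the ML pass is only one slice of the frame, so piling on tensor throughput shrinks that slice but can't speed up the rest.

```python
# Back-of-envelope: how an AI pass eats into the frame budget.
# All numbers are illustrative assumptions, not real DLSS figures.

def ml_pass_ms(pixels: int, flops_per_pixel: float, tflops: float) -> float:
    """Time for one inference pass, assuming perfect utilisation."""
    return pixels * flops_per_pixel / (tflops * 1e12) * 1e3  # milliseconds

frame_budget_ms = 1000 / 60           # 60 fps target
pixels_4k = 3840 * 2160
flops_per_pixel = 50_000              # assumed network cost per output pixel

for tflops in (10, 55, 110):          # assumed tensor throughput levels
    t = ml_pass_ms(pixels_4k, flops_per_pixel, tflops)
    print(f"{tflops:>3} TFLOPs -> ML pass {t:5.2f} ms "
          f"({t / frame_budget_ms:5.1%} of a 60 fps frame)")
```

Past a certain point the ML pass is a sliver of the frame and the extra tensor throughput sits idle, which is the "too much power may never be used" case.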

Superscaling GameDVR/streams?
 
lol sure.

The discussion of DLSS vs DirectML is an interesting topic. The more I think about it, DLSS is clearly not something you can program for. It's built entirely into the driver, a bit like forced supersampling.

Nvidia will be responsible for improving that algorithm and its performance; DLSS is a black box.

DirectML, by contrast, is something programmers must code into the render path, leaving it up to developers to modify or change the algorithm to their needs, so we may see the same kind of progression in AI up-res as we have with algorithmic upscaling.
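To make the contrast concrete, here is a minimal numpy sketch of the kind of developer-controlled up-res pass DirectML would let you slot into a render path. This is only the shape of the idea: real DirectML is a C++/Direct3D 12 API, and the 3x3 kernel here is a made-up placeholder standing in for trained network weights.

```python
import numpy as np

# Conceptual sketch of a developer-owned "AI up-res" pass. Not DirectML
# itself, just the structure a developer controls: upsample, then a
# learned refinement. The kernel is a placeholder for trained weights.

def upscale_2x(frame: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Nearest-neighbour 2x upscale followed by a 3x3 refinement filter."""
    up = frame.repeat(2, axis=0).repeat(2, axis=1)   # cheap 2x upsample
    pad = np.pad(up, 1, mode="edge")                 # same-size convolution
    out = np.zeros_like(up)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * pad[dy:dy + up.shape[0],
                                        dx:dx + up.shape[1]]
    return out.clip(0.0, 1.0)

# Placeholder "weights": a mild sharpen, standing in for a trained model.
kernel = np.array([[ 0.0, -0.1,  0.0],
                   [-0.1,  1.4, -0.1],
                   [ 0.0, -0.1,  0.0]])

low_res = np.random.rand(540, 960)          # pretend 960x540 render target
high_res = upscale_2x(low_res, kernel)
print(low_res.shape, "->", high_res.shape)  # (540, 960) -> (1080, 1920)
```

The point is that the whole pass, the upsample, the weights, and where it sits in the pipeline, is the developer's code to iterate on, unlike the driver-side DLSS path.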
 
How about training bots to fill in the gaps for larger multiplayer games?

300 Spartans :v
 
Thanks, didn't realize there was a discussion. :oops:

/hangs up mod cape
for 3 seconds
 
To make room for the DLSS to be used on the "Tensor Cores".
I don't believe that was the reason at all. They started implementing RT into their engine long before (8+ months) Turing was available, so their denoising was developed on compute.
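For context, compute-based denoising of this kind usually leans on temporal accumulation: blend each noisy low-sample-count frame into a running history. A minimal, hypothetical numpy stand-in (not anything from the actual engine) looks like:

```python
import numpy as np

# Hypothetical sketch of temporal accumulation, the core of many real-time
# RT denoisers. A real engine runs this as a compute shader with motion
# reprojection and variance clamping; this shows only the blend itself.

def accumulate(history: np.ndarray, noisy: np.ndarray,
               alpha: float = 0.1) -> np.ndarray:
    """Exponential moving average: low alpha = smoother but more ghosting."""
    return (1.0 - alpha) * history + alpha * noisy

rng = np.random.default_rng(0)
truth = np.full((270, 480), 0.5)          # pretend converged radiance
history = np.zeros_like(truth)

for frame in range(60):                   # 60 frames of noisy 1-spp input
    noisy = truth + rng.normal(0.0, 0.3, truth.shape)
    history = accumulate(history, noisy)

print("mean error after 60 frames:", float(np.abs(history - truth).mean()))
```

The accumulation itself is ordinary shader math and needs no Tensor Cores, which is consistent with the denoising having been built on compute first.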
 
They developed it on Volta, which has Tensor Cores as well. NVIDIA was probably planning to fold DLSS into the mix long before Turing was unveiled.
 
Considering that we have a demo of an iGPU that can do it, I doubt it requires 110 tensor TFLOPs of throughput to perform DLSS. I'm pretty sure ML in the middle of a render pipeline isn't ready until DirectML is released.
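Flipping the earlier budget arithmetic around supports that doubt: assume a modest iGPU and ask what per-pixel network cost it could afford (the throughput and budget figures are assumptions, not measurements of the demo).

```python
# Inverse of the earlier budget sketch: if an iGPU can run an up-res
# network, what per-pixel cost can it afford? All numbers are assumptions
# for illustration, not measurements of the demo in question.

igpu_tflops = 2.0                 # assumed FP16 throughput of an iGPU
ml_budget_ms = 4.0                # assumed slice of a 33 ms (30 fps) frame
pixels_1080p = 1920 * 1080

affordable = igpu_tflops * 1e12 * ml_budget_ms / 1e3 / pixels_1080p
print(f"~{affordable:,.0f} FLOPs per output pixel")   # ~3,858
```

A few thousand FLOPs per pixel is enough for a small upscaling network, so an iGPU demo doesn't imply anything close to 110 TFLOPs is required.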
 
An iGPU running a DLSS-like feature? Care to link? I want to see this :cool:
 
Cheers, they didn't mention what it was running on though? (at least when skipping to 14o_Ox)
It looked like a Lenovo laptop. Lol. That’s more or less what I thought it was. I could be wrong though.

Seemed to be running the game at low settings and not super well either. Honestly not sure. They mentioned that it was a seemingly regular setup.
 
Yeah, but many Lenovo laptops include dGPU ;)
 