> I spoke a little too soon, that AI based video upscaling sounds pretty cool.

Yea, I just watched the gaming section of the presentation and saw that. The Super Resolution video upscaling tech looks pretty great, and there are a lot of streams and content out there that could be vastly improved by it, so I'm pretty excited to test that out.
And that eye repositioning tech is both cool and creepy at the same time. I can see public speakers jumping all over that, which isn't necessarily a good thing.
> I assume it's automatic or at least can be set globally through the NVCP. Definitely a cool addition with everyday applications.

Yea, pretty sure it will be a global setting and will automatically detect supported videos/streams from supported browsers. Essentially I'm thinking of it being like how Super Resolution works on the Shield devices: if content isn't supported it doesn't engage, and if it is, you can very quickly toggle it on and off with the press of a button.
> Essentially I'm thinking of it being like how the Super Resolution is on the Shield devices.

I'm assuming so as well, probably even better. There's some content that looks great using the AI upscaling. It doesn't suit all content, but overall it's excellent to have on my Shield, and I can toggle it on the remote.
> Does anyone know how this works on the Shield? Specifically, is it operating at the output pixel level, or is it operating further upstream in the decode pipeline, thus having access to motion vectors?

It has no access to anything but the final output.
> Over half of RTX users run their display at resolutions above 1080p, but over 90% of online video is 1080p or lower, resulting in upscaling that further degrades the picture. In February, we're releasing RTX Video Super Resolution, which uses AI to improve the quality of any video streamed through Google Chrome and Microsoft Edge browsers on GeForce RTX 30 and 40 Series GPUs. Support for GeForce RTX 20 Series GPUs will come in a later update.
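As a rough illustration of why that combination degrades the picture: stretching 1080p video to common higher-resolution RTX displays usually involves a non-integer scale factor, which forces the display scaler to interpolate between source pixels and blur detail. A minimal sketch (the resolutions and the framing are assumptions for illustration, not anything from NVIDIA's announcement):

```python
# Scale factors when stretching 1080p video to common higher-res displays.
# Non-integer factors mean each output pixel blends several source pixels,
# softening the image -- the degradation RTX Video Super Resolution targets.

SOURCE_WIDTH = 1920  # typical width of online 1080p video

displays = {
    "1440p": 2560,
    "4K": 3840,
}

for name, width in displays.items():
    factor = width / SOURCE_WIDTH
    kind = "integer" if factor.is_integer() else "non-integer"
    print(f"1080p -> {name}: {factor:.2f}x ({kind} scaling)")
```

So a 1440p display gets an awkward 1.33x stretch, while 4K at least gets a clean 2x factor; either way, no new detail is created by naive scaling, which is the gap an AI upscaler tries to fill.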
> Turing support comes later

Still, you would expect the model to just work on Turing. Or is it too complex for Turing to handle, but a 3050 will do fine? Weird.
No confirmation yet, though it would be interesting if true.
> That's crazy if so!

Not really that crazy. They have already used AI to optimize physical chip layout, so using it to optimize parts of the GPU driver seems logical. In fact, I wonder why they are only doing it now - probably more marketing than an actually new thing?
> Not really that crazy. They have used AI to optimize physical chip layout already. Using it to optimize (parts) of the GPU driver seems logical. In fact I wonder why they only do it now - probably more marketing than actual new thing?

Layout and programming are still quite different things. So far, AI has merely been a helpful tool for doing the most common things programmers don't want to type n times a day. But maybe I am behind the times and should seek a new career.
Curious if creating drivers this way necessitates more or less input from game developers as graphics/rendering techniques advance.