Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

Lol, you don't need a 3090 to play raytraced games. A 2060 can play Metro Exodus Enhanced at 1080p with DLSS on and all of the settings jacked up. The 2080 is not a slide show. I don't know where you've been getting your info, but it's wrong.

This is a 2060 LAPTOP

I am not playing into your game. I own 4-5 Nvidia cards, two of them RTX. Ray tracing is a novelty; not one pro gamer streams with it on, and it is not why people own RTX cards. When the consoles start getting it in games on a regular basis it will matter; until then, less is more. Cyberpunk proved that ray tracing is a joke... The consoles will get a chance to show the same game with ray tracing on once Nvidia's eight-month exclusivity contract is up. Will the difference in ray tracing on the consoles versus the 3090 even matter to 90% of people?

I am just being objective and I am not taking anything away from NVidia. DLSS is great, I use it in Warzone, I am just pointing out how DLSS really doesn't have much of a future, being proprietary and exclusive to $700+ hardware...
 
I am just being objective and I am not taking anything away from NVidia. DLSS is great, I use it in Warzone, I am just pointing out how DLSS really doesn't have much of a future, being proprietary and exclusive to $700+ hardware...
Even if you're right, who cares? RTX owners get to have used DLSS for many years before some potential DirectML solution that "someone" would create.
 
I am not playing into your game. I own 4-5 Nvidia cards, two of them RTX. Ray tracing is a novelty; not one pro gamer streams with it on, and it is not why people own RTX cards. When the consoles start getting it in games on a regular basis it will matter; until then, less is more. Cyberpunk proved that ray tracing is a joke... The consoles will get a chance to show the same game with ray tracing on once Nvidia's eight-month exclusivity contract is up. Will the difference in ray tracing on the consoles versus the 3090 even matter to 90% of people?

I am just being objective and I am not taking anything away from NVidia. DLSS is great, I use it in Warzone, I am just pointing out how DLSS really doesn't have much of a future, being proprietary and exclusive to $700+ hardware...

By this logic, high and ultra settings are a novelty because pro gamers play pretty much exclusively low settings.
 
Sorry, but I almost spilled my drink when reading your post....


How many of those "DLSS" games will the 100 million people on the XSX & PS5 be using? So clearly, no game developer is going to worry about some 50k ~ 100k RTX gamers who happen to have a $700+ dGPU!

It does not matter if DLSS is better, when FSR will have superior adoption rate. (see: Betamax vs VHS)

What do the PS5/Xbox have to do with DLSS, which is only for the PC space? You're lost in the wrong section of the forum.

50-100k RTX gamers? I'm guessing you didn't fact check that before posting it?

Try 20 million as of February...

https://www.hardwaretimes.com/nvidi...eries-shortages-to-continue-thru-q2-2020/amp/

Are there that many RTX GPU owners? Wow... much more than all next-gen consoles combined, and AMD GPUs aren't even counted.

lol...
How many of those were $700+ RTX cards that can actually do ray tracing? Because we know the 3090 struggles with RT and the 2080 is a slide show. Game developers know this, even if you feign ignorance of it.

Again, you are NOT being objective.

Lol. My 2080 Ti does ray tracing just fine. Even a 2060 suffices when using DLSS; a 2060 will run CP2077 at high/ultra DXR with the appropriate DLSS modes. There are many videos on YT lol.
 
Even if you're right, who cares? RTX owners get to have used DLSS for many years before some potential DirectML solution that "someone" would create.
I agree, and there was no argument in that.
RTX cards are DirectX 12 Ultimate compliant and can do DirectML anyway. The obvious point is that DirectML is the common denominator between PC and consoles, and game developers may look past DLSS as the industry shifts to agnostic solutions with support for each brand of hardware (i.e. DirectML, metacommands, etc.).


Again, don't mistake me for knocking Nvidia's technology: I think DLSS is great and I have two gaming rigs based on RTX cards. I use DLSS in COD at 3440 x 1440 for added frames, and many of my friends use DLSS as well. But I also understand that Nvidia's way is not the only way, and I see Microsoft's goals.

Machine learning is going to be a huge part of games in the future; with many different engines and needs, dGPUs are going to have to be flexible.
I welcome it all.
 
Yes, but how is this relevant to DLSS? Unless Microsoft is including a free temporal upscaling ML model in the DirectML SDK that I haven't heard about.
The difference between an API and an implementation of an algorithm is well understood by anyone with some basic undergrad CS training. It's great that these forums give casual tech enthusiasts (who may be formally untrained) an opportunity to learn from others and gradually take part in deep technical discussions. Such discussions may even encourage them to take up formal training and contribute to the field. The problem arises when a select few learn a couple of buzzwords, jump onto the Dunning–Kruger rollercoaster and somehow fly off on a tangent at the peak, never to come down to earth again.
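The API-versus-implementation distinction can be sketched with a toy Python example (all class and function names here are hypothetical illustrations, not any real SDK's API): something like DirectML plays the role of the abstract interface, while DLSS or FSR would be concrete implementations sitting behind such an interface.

```python
from abc import ABC, abstractmethod

class Upscaler(ABC):
    """The 'API': a contract that says WHAT an upscaler must do."""
    @abstractmethod
    def upscale(self, pixels: list, factor: int) -> list: ...

class NearestNeighbour(Upscaler):
    """One implementation of HOW: naive pixel repetition."""
    def upscale(self, pixels, factor):
        return [p for p in pixels for _ in range(factor)]

class Averaging(Upscaler):
    """A different implementation behind the same API: linear blend."""
    def upscale(self, pixels, factor):
        out = []
        # Pair each pixel with its right neighbour (edge pixel repeats).
        for a, b in zip(pixels, pixels[1:] + pixels[-1:]):
            for i in range(factor):
                t = i / factor
                out.append(a * (1 - t) + b * t)
        return out

def render(upscaler: Upscaler, frame, factor=2):
    # Caller codes against the API and never cares which implementation runs.
    return upscaler.upscale(frame, factor)
```

Swapping `NearestNeighbour` for `Averaging` changes the result, not the calling code; that is the sense in which an API and an algorithm are different things.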
 
There’s also this.

“Xbox is leveraging machine learning to make traditional rendering algorithms more efficient and to provide better alternatives. The Xbox graphics team is seeking an engineer who will implement machine learning algorithms in graphics software to delight millions of gamers. Work closely with partners to develop software for future machine learning hardware. ”

https://www.techspot.com/news/90323-microsoft-hiring-engineers-develop-ai-based-upscaling-tech.html
So, something not open source and exclusive to MS consoles and gaming partners.
 
It's not in the SDK, but there actually is a free DirectML sample for super resolution:
https://github.com/microsoft/DirectML/tree/master/Samples/DirectMLSuperResolution
referenced here (timestamped link):

It's more of a starter though, not really optimized for realtime performance and such.

It's a single-image ML super-resolution approach, the kind used in DLSS 1.0 but since abandoned.
MS demoed it at SIGGRAPH 2018 on an Nvidia Titan V with an Nvidia-trained model, before Turing's release.
 
Ray tracing is a novelty; not one pro gamer streams with it on, and it is not why people own RTX cards.

What is this streaming malarkey? Is that something I need to look into if I'm considering PRO Gaming as a long-term viable career option?

This is all very confusing news. I was enjoying ray tracing in Control, Metro and Doom and such but now I feel like I’ve been doing it wrong.
 
It's a single-image ML super-resolution approach, the kind used in DLSS 1.0 but since abandoned.
MS demoed it at SIGGRAPH 2018 on an Nvidia Titan V with an Nvidia-trained model, before Turing's release.
Yes, it's a starting point, not a turnkey solution. Apparently, they did some research into this area and then, for whatever reason, abandoned it.

Also, AMD did not take this approach for FSR. I guess it has good reasons for not going that way on the hardware it is developing FSR for. Who knows, maybe getting from 60 ms down to 16 ms was already most of the optimization possible when running this without specialized hardware? Maybe we'll see a revival of this approach with DirectML once more hardware has the TFLOPS to spare.
 