Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

Sure it was, since it had essentially the same shading h/w, which was also not compatible with DX8's PS 1.0.
They got more complex? Everyone "copies" designs, btw, because you don't really have a lot of options if you want to be compatible with DX and VK/OGL. Remember S3TC? That was "copied" by everyone.
Seriously? You're actually suggesting that ATi, in about 6 months, analyzed what NVIDIA did, designed their own hardware capable of the same, designed a chip around it, and mass produced the damn thing ready for market?

edit: and I'm not putting my head on the table for it, but IIRC R100's pixel-shader'esque capabilities were more advanced than the GF256's
edit2: as for S3TC, it wasn't copied by everyone, it was licensed by everyone.
 
Yes, seriously. It is you, however, who keeps coming up with the "6 months" figure.
What was added in NV10 was a logical evolution of what was there in NV5. That evolution could have been done by anyone, especially since MS was actually planning on standardizing the PS 0.5 spec up until some point.
Also note that it was about 6 months between NV5 and NV10, and then another 6 months till R100. So...
 
Alan Wake Remastered will use DLSS, but no RT

PC specific features

  • PC version will be x64 and support DX12 only
  • No Ray Tracing
  • DLSS
    Nvidia DLSS – Off, Ultra-Performance, Performance, Balanced, Quality
  • Ultra-wide screen support
    Yes, 21:9 aspect ratio
  • Caveats: pre-rendered cut scenes will not render in ultra-wide – they are 16:9
  • Unlocked frame-rate
    Yes – as in the original game
  • Display: full screen / window / borderless?
    All supported

    https://www.alanwake.com/story/faq/
 
Elder Scrolls Online will be implementing DLSS and something "new" termed DLAA (Deep Learning Anti-Aliasing). Functionally, DLAA seems akin to the originally announced DLSS 2X mode that never made an official appearance. However, from the wording I'm not exactly clear whether this refers to a new mode/function going forward or an ad-hoc implementation specific to Bethesda. I also have some recollection of Nvidia using the term DLAA before, but I think it was in the context of a presentation geared towards the production side?

 


Certainly sounds like an ad-hoc implementation by Bethesda, but very interesting nonetheless. I can't wait to see what kind of image quality-to-performance ratio this is able to push out. One can imagine something pretty insane when you consider that Quality mode targets 4x the native pixels and is generally considered as good as native even though it also has to upscale. Could image quality at 4K be something akin to supersampled 7680x4320, at the relatively cheap cost of DLSS Quality (only 2.5 ms on an RTX 2060)? Ultra Quality mode could be even more insane with 9x the pixels. Hopefully this will prompt similar implementations in more games, perhaps even the actual standardised mode from Nvidia that we've been waiting for.
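For reference, the internal render resolutions behind these pixel-count comparisons can be sketched in a few lines of Python. The per-axis scale factors are the commonly cited ones for each DLSS 2.x mode (Quality 66.7%, Balanced 58%, Performance 50%, Ultra Performance 33%), not something stated in the post; DLAA renders at native resolution, so its factor is 1.0.

```python
# Internal render resolution and output/input pixel ratio per mode at 4K.
# Scale factors are the commonly cited per-axis values, not official spec.

OUTPUT_W, OUTPUT_H = 3840, 2160  # 4K output

SCALES = {
    "DLAA (native)": 1.0,
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

for mode, s in SCALES.items():
    w, h = round(OUTPUT_W * s), round(OUTPUT_H * s)
    ratio = (OUTPUT_W * OUTPUT_H) / (w * h)
    print(f"{mode:>17}: {w}x{h} internal -> {ratio:.2f}x output/input pixels")
```

This puts Quality at 2560x1440 internal (2.25x), Performance at 1920x1080 (4x), and Ultra Performance at 1280x720 (9x) for a 4K output.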
 
Hmm, isn't that the mythical "DLSS 2X" announced in 2018?
In addition to the DLSS capability described above, which is the standard DLSS mode, we provide a second mode, called DLSS 2X. In this case, DLSS input is rendered at the final target resolution and then combined by a larger DLSS network to produce an output image that approaches the level of the 64x super sample rendering – a result that would be impossible to achieve in real time by any traditional means. Figure 21 shows DLSS 2X mode in operation, providing image quality very close to the reference 64x super-sampled image.

NV Dev Blog - NVIDIA Turing Architecture In-Depth
 
If it actually worked we would’ve seen it in games by now.
It died because NVIDIA allowed DLSS only in cases where it improves performance. That's why DLSS was initially limited to certain resolutions on certain cards, too.
 
Exactly. So why haven't we seen it in games yet?
Because the point of DLSS is to provide performance benefits at close-to-native image quality. Such a mode would provide AA but at a performance cost. Possibly it wouldn't be anything special over your typical TAA in either respect.
 
I don't see why it wouldn't work. Render at native, do temporal accumulation, use AI to resolve the final pixels. It's basically TAA.
You could also have a scenario where you could be running DLAA + DSR for the added benefits.
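A minimal sketch of the pipeline described above, using NumPy and a plain exponential moving average as a stand-in for the learned resolve that DLAA would apply; it assumes a static camera, so there is no motion-vector reprojection or history clamping.

```python
import numpy as np

def taa_accumulate(history: np.ndarray, current: np.ndarray,
                   alpha: float = 0.1) -> np.ndarray:
    """Blend the current jittered frame into the accumulated history.

    alpha is the blend weight of the new frame; a real TAA would also
    reproject `history` with motion vectors and clamp it against the
    current frame's neighbourhood to reject stale samples.
    """
    return (1.0 - alpha) * history + alpha * current

# Usage: feeding successive noisy renders of the same static scene
# converges toward the noise-free image.
rng = np.random.default_rng(0)
truth = rng.random((4, 4, 3))                     # ideal resolved frame
frame = truth + rng.normal(0, 0.1, truth.shape)   # first noisy sample
history = frame
for _ in range(63):
    sample = truth + rng.normal(0, 0.1, truth.shape)
    history = taa_accumulate(history, sample)
```

After enough frames the accumulated image is much closer to the noise-free reference than any single sample, which is the same mechanism TAA (and, with a learned resolve, DLAA) exploits.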
 
What Degustator posted makes sense. Perhaps there was minimal quality benefit, so they focused all efforts on improving DLSS reconstruction instead.
 
You can even turn it on in UE4, should you please.
It doesn't work in UE. UE is capped at 67% input resolution just like everything else (except Control). The console variable for the ultra quality mode does exist, but it just falls back to balanced mode when you try to enable it.

If this DLAA thing is true, it looks like the competition pressured Nvidia into finally unlocking higher input resolutions. About bloody time Jensen kept his promise...
 
Elder Scrolls Online will be implementing DLSS and something "new" termed DLAA (Deep Learning Anti-Aliasing).

I'd love to see that, as ESO has some serious temporal aliasing problems, even at 4K. It has long view distances and some nice architecture, so things like staircases or wall linings tend to show a lot of temporal aliasing.
I tried 4K DSR at 1080p (basically 4x SSAA) and the aliasing is still quite bad. Final Fantasy XIV has a similar aliasing problem.
 
Looks like this might be a new Nvidia DLSS mode rather than a custom implementation by Bethesda after all:

https://wccftech.com/nvidia-dlaa-deep-learning-antialiasing-to-debut-in-elder-scrolls-online/

Creative Director Rich Lambert said:
While we were working on adding NVIDIA DLSS, we also worked with them on some new tech that we're going to be the first game that's ever done this before. This is debuting their new tech, which is called NVIDIA DLAA. It's the same kind of concept, you won't get a performance boost out of this but what you get is absolutely incredible anti-aliasing. It's unbelievable, it's crazy how good it is.

Sounds like it might be a significant step up over TAA. Consider me very excited!!
 