Nvidia DLSS antialiasing discussion *spawn*

Discussion in 'Architecture and Products' started by DavidGraham, Sep 19, 2018.

  1. glow

    glow Newcomer

    You linked a tech talk slide, so you had to have read it. Using an API =/= source code for whatever is behind that API.

    Exactly. It should have been obvious that the DLSS implementation in a game engine (the topic of the post I quoted from Dampf) was the issue at fault here. IMO, Nvidia has to get ahead of half-baked DLSS work based on cargo-culting the CDPR leaks, since before that it was more or less (mostly less, IMO) Nvidia-invite-only to use it in your project.
     
  2. OlegSH

    OlegSH Regular

    That's a GPL license issue; it has nothing to do with DLSS's technical implementation, and obviously the CDPR leaks have nothing to do with the DLSS plugin release. Do you imagine them making this plugin in a week, just because someone's source leaked?
    Of course that's not the case. I can only suggest rereading manux's posts.
     
    HLJ, pharma, DegustatoR and 2 others like this.
  3. manux

    manux Veteran

    I'm interested in technology. This recent discussion was about whether the CDPR 2077 leak led to the Unreal plugin being published for all. No, it did not.

    "Nvidia/closed source/... bad" and other non-technical stuff has been gone over many times before. I'm not participating in that, as it is a waste of time: no conclusion will be found, and haters and fanboys will emerge. Those things belong in another, non-technical thread.
     
    jlippo, HLJ and PSman1700 like this.
  4. PSman1700

    PSman1700 Legend

    There's also tensor/DLSS hardware involved on the GPUs themselves.
     
  5. manux

    manux Veteran

    If one reverse engineered the neural net, it would be possible to run an optimized version on shader cores. Where that would lead, who knows. Likely slower than on tensor cores, but how much slower is another question.

    The great thing about the Unreal plugin is that AMD can use the same APIs to create their own super resolution plugin. If AMD is smart, whatever they do will work with all Unreal 4.26 titles using the same plugin mechanism and APIs that Nvidia now uses. This could be one reason why the plugin took time to make. Perhaps Epic also consulted Intel and AMD on how to expose the necessary data to a plugin doing scaling before committing to a specific approach.
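    To make the idea concrete, here is a minimal sketch of what such a vendor-neutral upscaler hook could look like. All names here (IUpscalerPlugin, UpscaleInputs, NearestNeighbourUpscaler) are hypothetical and are not Unreal Engine's real API; the point is only the shape: the engine hands every plugin the same inputs (low-res color, motion vectors, depth), and any vendor supplies its own implementation behind the interface.

    ```cpp
    // Hypothetical sketch of a vendor-neutral upscaler plugin interface.
    // These names are illustrative, NOT Unreal Engine's actual API.
    #include <cassert>
    #include <cstdio>
    #include <vector>

    struct Image {
        int width = 0, height = 0;
        std::vector<float> pixels;  // single channel for brevity
    };

    // What a temporal upscaler generally needs from the renderer.
    struct UpscaleInputs {
        Image color;          // low-resolution color buffer
        Image motionVectors;  // per-pixel motion (stubbed here)
        Image depth;          // depth buffer (stubbed here)
    };

    // The engine-side contract every vendor plugin would implement.
    class IUpscalerPlugin {
    public:
        virtual ~IUpscalerPlugin() = default;
        virtual Image Upscale(const UpscaleInputs& in, int outW, int outH) = 0;
    };

    // Trivial stand-in "vendor" implementation: nearest-neighbour resample.
    // A real plugin would run its own reconstruction (e.g. a neural net).
    class NearestNeighbourUpscaler : public IUpscalerPlugin {
    public:
        Image Upscale(const UpscaleInputs& in, int outW, int outH) override {
            Image out;
            out.width = outW;
            out.height = outH;
            out.pixels.resize(static_cast<size_t>(outW) * outH);
            for (int y = 0; y < outH; ++y) {
                for (int x = 0; x < outW; ++x) {
                    int sx = x * in.color.width / outW;
                    int sy = y * in.color.height / outH;
                    out.pixels[static_cast<size_t>(y) * outW + x] =
                        in.color.pixels[static_cast<size_t>(sy) * in.color.width + sx];
                }
            }
            return out;
        }
    };

    int main() {
        UpscaleInputs inputs;
        inputs.color.width = 960;
        inputs.color.height = 540;
        inputs.color.pixels.assign(960 * 540, 0.5f);

        NearestNeighbourUpscaler plugin;  // engine code only sees IUpscalerPlugin
        Image out = plugin.Upscale(inputs, 1920, 1080);

        assert(out.width == 1920 && out.height == 1080);
        assert(out.pixels[0] == 0.5f);
        std::printf("upscaled %dx%d -> %dx%d\n",
                    inputs.color.width, inputs.color.height, out.width, out.height);
        return 0;
    }
    ```

    Because the engine only talks to the abstract interface, swapping DLSS for some other vendor's upscaler would not require per-title integration work, which is exactly the appeal being described.
    
    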
     
  6. Kaotik

    Kaotik Drunk Member Legend

    Yes and no.
    Yes, as in current version of DLSS is running on tensor cores.
    No, as in nothing would prevent running the same code on CUDA cores, or AMD's Compute Units for that matter (or rather, the same calculations; they could use some internal API which obviously wouldn't run on AMD hardware as-is).
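    The "same calculations" point can be illustrated with plain scalar code. A network layer is just arithmetic; below is one 3x3 convolution (the core op of an image-reconstruction net) written in ordinary C++ as a sketch. Tensor cores accelerate exactly this kind of multiply-accumulate, but generic CUDA cores or AMD Compute Units can execute the identical math, only slower.

    ```cpp
    // A 3x3 "valid" convolution over a single-channel image, written as
    // plain C++: the same multiply-accumulate math that tensor cores
    // accelerate, runnable on any compute hardware.
    #include <cassert>
    #include <cstdio>
    #include <vector>

    std::vector<float> Conv3x3(const std::vector<float>& img, int w, int h,
                               const float k[9]) {
        int ow = w - 2, oh = h - 2;  // no padding, so output shrinks by 2
        std::vector<float> out(static_cast<size_t>(ow) * oh, 0.0f);
        for (int y = 0; y < oh; ++y)
            for (int x = 0; x < ow; ++x) {
                float acc = 0.0f;  // the multiply-accumulate tensor cores speed up
                for (int ky = 0; ky < 3; ++ky)
                    for (int kx = 0; kx < 3; ++kx)
                        acc += img[static_cast<size_t>(y + ky) * w + (x + kx)]
                               * k[ky * 3 + kx];
                out[static_cast<size_t>(y) * ow + x] = acc;
            }
        return out;
    }

    int main() {
        // 4x4 image of ones convolved with a box kernel of ones:
        // every output element sums nine ones, i.e. 9.
        std::vector<float> img(16, 1.0f);
        const float kernel[9] = {1, 1, 1, 1, 1, 1, 1, 1, 1};
        std::vector<float> out = Conv3x3(img, 4, 4, kernel);
        assert(out.size() == 4);           // (4-2) x (4-2) outputs
        for (float v : out) assert(v == 9.0f);
        std::printf("conv ok: %zu outputs, each %.0f\n", out.size(), out[0]);
        return 0;
    }
    ```

    The gap between running this on dedicated matrix hardware and on general shader ALUs is throughput, not capability, which is why the yes-and-no framing above is the right one.
    
    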
     
  7. PSman1700

    PSman1700 Legend

    Not disputing that. But I can imagine DLSS being the most performant on the dedicated hardware for a while to come; using CUs/generic compute obviously eats into performance there.
    On the other hand, I welcome the idea of seeing DLSS across AMD and Intel GPUs, which is a good thing for everyone, even if it will most likely remain most performant on NV's hardware.
     
  8. DegustatoR

    DegustatoR Veteran

    This is a bit like saying that nothing prevents you from running Minecraft RTX on a GTX9800.
     
    PSman1700 likes this.
  9. Malo

    Malo Yak Mechanicum Legend Subscriber

    That's a bit extreme of an example :) Control showed DLSS 2 performance on compute was feasible but obviously somewhat faster on Tensors.
     
  10. Dictator

    Dictator Regular

    That was not running the network, though; rather an approximate image scaler (i.e., a simple history buffer).
     
    iroboto, HLJ, Malo and 4 others like this.
  11. DegustatoR

    DegustatoR Veteran

  12. pharma

    pharma Veteran

    NVIDIA DLSS for Unreal Engine (pugetsystems.com)

    Written on February 17, 2021 by Kelly Shipman
     
  13. pharma

    pharma Veteran

     
    Lightman likes this.
  14. HLJ

    HLJ Regular

  15. dorf

    dorf Newcomer

    That's because it doesn't exist yet and when you try to enable it, it just falls back to Balanced mode.
     
    Lightman, CeeGee, Dictator and 3 others like this.
  16. HLJ

    HLJ Regular

    Roger, I had like a divide by zero moment there :-D
     
  17. MfA

    MfA Legend

  18. Kaotik

    Kaotik Drunk Member Legend

    No, TXAA and TAAU are different techniques.
    TXAA, from all I could find, refers to Unreal Engine's Temporal AA without the upsampling component, while TAAU is Temporal AA with upscaling.
     
  19. DavidGraham

    DavidGraham Veteran

    TXAA is originally a technique developed by NVIDIA combining TAA and MSAA components. It preceded TAA during the Kepler architecture era, had 2X, 4X and 8X levels, and was featured in games such as GTA V, Assassin's Creed Unity, and Crysis 3.
     