Nvidia DLSS antialiasing discussion *spawn*

Discussion in 'Architecture and Products' started by DavidGraham, Sep 19, 2018.

  1. glow

    Newcomer

    Joined:
    May 6, 2019
    Messages:
    38
    Likes Received:
    28
    You linked a tech talk slide, so you had to have read it. Using an API is not the same as having the source code for whatever is behind that API.

    Exactly. It should have been obvious that DLSS implementation in a game engine (the topic of the post I quoted from Dampf) was the issue here. IMO, Nvidia has to get ahead of half-baked DLSS work based on cargo-culting the CDPR leaks, since before that it was more or less (mostly less, IMO) Nvidia-invite-only to use it in your project.
     
  2. OlegSH

    Regular Newcomer

    Joined:
    Jan 10, 2010
    Messages:
    609
    Likes Received:
    1,061
    That's a GPL license issue; it has nothing to do with the technical implementation of DLSS, and the CDPR leak obviously has nothing to do with the DLSS plugin release. Do you imagine them making this plugin in a week, just because someone's source leaked?
    Of course that's not the case. I can only suggest you reread manux's posts.
     
    HLJ, pharma, DegustatoR and 2 others like this.
  3. manux

    Veteran Regular

    Joined:
    Sep 7, 2002
    Messages:
    3,034
    Likes Received:
    2,270
    Location:
    Self Imposed Exile
    I'm interested in the technology. This recent discussion was about whether the CDPR Cyberpunk 2077 leak led to the Unreal plugin being published for everyone. No, it did not.

    The "Nvidia/closed source/... bad" angle and other non-technical stuff has been gone over many times before. I'm not participating in that, as it is a waste of time: no conclusion will be reached, and haters and fanboys will emerge. Those things belong in another, non-technical thread.
     
    jlippo, HLJ and PSman1700 like this.
  4. PSman1700

    Legend Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    5,002
    Likes Received:
    2,235
    There's also tensor/DLSS hardware involved on the GPUs themselves.
     
  5. manux

    Veteran Regular

    Joined:
    Sep 7, 2002
    Messages:
    3,034
    Likes Received:
    2,270
    Location:
    Self Imposed Exile
    If one reverse engineered the neural net, it would be possible to run and optimize a version for shader cores. Where that would lead, who knows. Likely slower than on tensor cores, but how much slower is another question.

    The great thing about the Unreal plugin is that AMD can use the same APIs to create their own super resolution plugin. If AMD is smart, whatever they do will work with all Unreal 4.26 titles that use the same plugin mechanism and APIs Nvidia now uses. This could be one reason why the plugin took time to make: perhaps Epic also consulted Intel and AMD on how to expose the necessary data to a plugin doing scaling before committing to a specific approach.
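    A minimal sketch of what such a vendor-neutral upscaling hook could look like (hypothetical, not the actual UE 4.26 or NVIDIA plugin API; every name below is made up for illustration). The point is that the engine mostly just has to hand the plugin the right per-frame data:

        // Hypothetical vendor-neutral upscaler hook -- NOT the real UE 4.26 or
        // NVIDIA plugin interface, just an illustration of the data it would need.
        struct UpscaleInputs {
            const float* color;          // low-resolution HDR color
            const float* depth;          // low-resolution depth buffer
            const float* motionVectors;  // per-pixel motion vectors for reprojection
            float jitterX, jitterY;      // sub-pixel jitter applied this frame
            int renderWidth, renderHeight;
            int outputWidth, outputHeight;
        };

        class IUpscaler {
        public:
            virtual ~IUpscaler() = default;
            // Each vendor (NVIDIA, AMD, Intel) would supply its own implementation;
            // the engine only cares that an output-resolution image comes back.
            virtual void Upscale(const UpscaleInputs& in, float* outputColor) = 0;
        };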
     
  6. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    9,878
    Likes Received:
    4,053
    Location:
    Finland
    Yes and no.
    Yes, as in the current version of DLSS runs on tensor cores.
    No, as in nothing would prevent one from running the same code on CUDA cores or AMD's Compute Units for that matter (or rather, the same calculations; they could be using some internal API which obviously wouldn't run on AMD hardware as-is).
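    To illustrate "same calculations": the bulk of a network like this is matrix multiply-accumulate work. A rough sketch of that arithmetic in plain C++ (standing in for a generic compute path; nothing here is vendor-specific) shows there is nothing a regular ALU cannot express, it just does one multiply-add at a time:

        #include <cstddef>

        // The core arithmetic a tensor core accelerates: C += A * B on small tiles.
        // A generic compute path performs the same multiply-accumulates serially,
        // which is why it still runs, just more slowly.
        void matmul_accumulate(const float* A, const float* B, float* C,
                               std::size_t M, std::size_t N, std::size_t K) {
            for (std::size_t i = 0; i < M; ++i) {
                for (std::size_t j = 0; j < N; ++j) {
                    float acc = C[i * N + j];
                    for (std::size_t k = 0; k < K; ++k)
                        acc += A[i * K + k] * B[k * N + j];
                    C[i * N + j] = acc;
                }
            }
        }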
     
  7. PSman1700

    Legend Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    5,002
    Likes Received:
    2,235
    Not disputing that. But I can imagine DLSS being most performant on the dedicated hardware for a while to come; obviously running it on CUs/compute eats into performance there.
    On the other hand, I welcome the idea of seeing DLSS across AMD and Intel GPUs, which is a good thing for everyone, even though it will most likely remain most performant on NV's hardware.
     
  8. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    2,370
    Likes Received:
    1,888
    Location:
    msk.ru/spb.ru
    This is a bit like saying that nothing prevents you from running Minecraft RTX on a 9800 GTX.
     
    PSman1700 likes this.
  9. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    8,349
    Likes Received:
    4,786
    Location:
    Pennsylvania
    That's a bit of an extreme example :) Control showed DLSS 2 performance on compute was feasible, though obviously somewhat faster on Tensors.
     
  10. Dictator

    Regular Newcomer

    Joined:
    Feb 11, 2011
    Messages:
    459
    Likes Received:
    2,683
    That was not running the network, though - rather an approximation done with an image scaler (i.e., a simple history buffer).
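    For context, a "simple history buffer" upscale in that spirit boils down to reprojecting last frame's accumulated output with motion vectors and blending in the current low-resolution sample. A minimal per-pixel sketch (my own illustration, not Remedy's or NVIDIA's actual code):

        // Minimal temporal-accumulation upscale sketch (illustrative only).
        struct Vec3 { float r, g, b; };

        Vec3 lerp(Vec3 a, Vec3 b, float t) {
            return { a.r + (b.r - a.r) * t,
                     a.g + (b.g - a.g) * t,
                     a.b + (b.b - a.b) * t };
        }

        Vec3 temporal_upscale_pixel(Vec3 currentLowResSample,  // bilinear tap at the jittered position
                                    Vec3 reprojectedHistory,   // history sampled at (uv - motionVector)
                                    bool historyValid)         // e.g. passed a depth/velocity rejection test
        {
            if (!historyValid)
                return currentLowResSample;   // disocclusion: fall back to the current frame
            const float blend = 0.1f;         // small weight -> detail accumulates over many frames
            return lerp(reprojectedHistory, currentLowResSample, blend);
        }

    Detail builds up over time because each frame's sub-pixel jitter samples a slightly different position; the history buffer carries that accumulated information, no neural network required.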
     
    iroboto, HLJ, Malo and 4 others like this.
  11. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    2,370
    Likes Received:
    1,888
    Location:
    msk.ru/spb.ru
  12. Man from Atlantis

    Regular

    Joined:
    Jul 31, 2010
    Messages:
    923
    Likes Received:
    744
  13. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    4,278
    Likes Received:
    3,536
    NVIDIA DLSS for Unreal Engine (pugetsystems.com)

    Written on February 17, 2021 by Kelly Shipman
     
  14. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    4,278
    Likes Received:
    3,536
  15. HLJ

    HLJ
    Regular Newcomer

    Joined:
    Aug 26, 2020
    Messages:
    421
    Likes Received:
    700
  16. dorf

    Newcomer

    Joined:
    Dec 21, 2019
    Messages:
    110
    Likes Received:
    346
    That's because it doesn't exist yet; when you try to enable it, it just falls back to Balanced mode.
     
    Lightman, CeeGee, Dictator and 3 others like this.
  17. HLJ

    HLJ
    Regular Newcomer

    Joined:
    Aug 26, 2020
    Messages:
    421
    Likes Received:
    700
    Roger, I had like a divide by zero moment there :-D
     
  18. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    7,347
    Likes Received:
    687
  19. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    9,878
    Likes Received:
    4,053
    Location:
    Finland
    No, TXAA and TAAU are different techniques.
    TXAA, from what I could find, refers to Unreal Engine's temporal AA without the upsampling component, while TAAU is temporal AA with upscaling.
     
  20. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,578
    Likes Received:
    4,289
    TXAA is originally a technique developed by NVIDIA that combines TAA and MSAA components. It preceded TAA, dating from the Kepler architecture era, came in 2X, 4X and 8X levels, and was featured in games such as GTA V, Assassin's Creed Unity, and Crysis 3.
     