AMD Radeon RDNA2 Navi (RX 6700 XT, RX 6800, 6800 XT, 6900 XT) [2020-10-28, 2021-03-03]

Discussion in 'Architecture and Products' started by BRiT, Oct 28, 2020.

  1. troyan

    Regular Newcomer

    Joined:
    Sep 1, 2015
    Messages:
    328
    Likes Received:
    630
    So like 16GB VRAM?
     
    pharma, pjbliverpool and PSman1700 like this.
  2. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    4,520
    Likes Received:
    2,074
    Maybe, but I doubt NV was ever in consideration for Sony/MS, if that's what you mean? I think AMD was always going to enter the ray tracing market, push from Sony/MS or not. Sony/MS probably had good insight into AMD's tech roadmap, though, while keeping to the seven-year generation shifts.

    The pivot from Vega to RDNA most likely didn't have much to do with MS or Sony either. Rather, Vega wasn't all that great, and a better architecture was needed. Looking at it like this, they have come a long way from the Vega days.
    GCN was very competitive (or even better than NV), then things took a turn for the worse; now they are keeping up in rasterization at the least (IC helping out the BW, though 4K can take hits).
     
  3. neckthrough

    Newcomer

    Joined:
    Mar 28, 2019
    Messages:
    66
    Likes Received:
    159
    Even though we do not have Alex's training to meticulously tear each frame apart, our brains do subconsciously detect the inconsistencies. Decades of living in the real world have trained our brains to recognize what feels "right". We may feel a scene is "gamey" without being able to precisely articulate what's wrong. Imperfections in geometry are easy to articulate even to the untrained eye, but lighting and shadows are much more subconscious. Subconscious does not mean insignificant. To me personally, switching from inaccurate to more accurate lighting yields a "hmm, yeah, I suppose this is better" response. But switching back from accurate to inaccurate lighting yields an "oh, this is garbage!" response. The more time you spend with the accurate modeling, the stronger this response is. There's simply no going back.
     
    nAo, sonen, DavidGraham and 6 others like this.
  4. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    2,193
    Likes Received:
    1,560
    Location:
    msk.ru/spb.ru
    They definitely did, and if you look into the timeline of AMD's RT patent filings you'll see that they happened after the launch of Turing. There's also the rumor of the PS5 initially being planned for launch at the end of 2019 on RDNA1, but delayed to 2020 to re-base on RDNA2.
     
    PSman1700 likes this.
  5. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    8,545
    Likes Received:
    2,888
    Location:
    Guess...
    I think it's more likely that the console win got significantly more likely around that time and so they realised their priority should be with Navi rather than Vega.
     
    PSman1700 likes this.
  6. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    11,200
    Likes Received:
    1,748
    Location:
    New York
    Exactly! Advances in rendering are much more obvious looking backward. That's just how the human brain works, we don't appreciate anything until it's gone.
     
    sonen, PSman1700 and HLJ like this.
  7. HLJ

    HLJ
    Regular Newcomer

    Joined:
    Aug 26, 2020
    Messages:
    352
    Likes Received:
    589
    I think NVIDIA simply surprised AMD.
    Not with ray tracing features; AMD knew those were coming in NVIDIA's GPUs.
    But with the fact that DLSS made realtime hybrid rendering possible.

    The whole DXR thing (without DLSS) would have been 2-3 generations out from where we are now.

    That is also why AMD scrambled to announce they would also have "upscaling technology"...and has since shown nothing...because they got caught on the wrong foot.

    NVIDIA dangled DXR (going all-in and pushing the RTX branding so hard that even less informed posters call DXR "RTX" to this day) right in front of AMD...and AMD never saw DLSS coming in from the side, IMHO.
     
  8. HLJ

    HLJ
    Regular Newcomer

    Joined:
    Aug 26, 2020
    Messages:
    352
    Likes Received:
    589
    Like going from AA to no AA...terribad!
    Or sitting at a faster PC, then going back to a slower one...unpossible!

    DXR lighting is very hard to ignore once it has been seen...going back looks more "fake" and my brain picks that up right away.
     
  9. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    17,687
    Likes Received:
    7,681
    No demanding necessary. All MS did was state that RT would be in an upcoming version of DX (2018).

    AMD likely planned to have RT at launch in RDNA in the same timeframe as NV, 2018. That was the original target date for RDNA with PS5 rumored to be targeting a holiday 2019 launch.

    Basically, both NV and AMD knew RT would be coming to DX in 2018, and it was up to them to get the hardware ready. BTW, for those that get their panties in a bunch: this doesn't mean that NV and/or AMD weren't already looking into hardware-accelerated RT prior to this, but it's when MS decided (likely in consultation with both NV and AMD) that silicon with hardware-accelerated RT would be feasible to launch.

    Obviously something went wrong: not only was RDNA delayed a year, but RT got delayed two years. Meanwhile NV, as it had been doing for the past few years, executed well and had hardware ready for RT's introduction into DX.

    Regards,
    SB
     
    tinokun, NightAntilli and PSman1700 like this.
  10. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    2,193
    Likes Received:
    1,560
    Location:
    msk.ru/spb.ru
    MS doesn't decide when some silicon is ready to launch in GPUs; GPU vendors do that, and MS then standardizes their proposals, if that's even possible. The inclusion of DXR in DX was the result of a GPU vendor coming to MS and saying that they would have RT h/w in their next-gen GPUs which they would like to be accessible via DX.
     
    tinokun, NightAntilli, pharma and 3 others like this.
  11. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    4,520
    Likes Received:
    2,074
    It's not only DLSS, I think; even without it the RT gap is rather large, to the point where playing games with RT and no DLSS is actually plausible if you can live with 4K/30, 1440p, etc. (CP2077).
    DLSS is just another enabler for those who crave the highest resolutions and framerates while ray tracing at a high level.
    A balance between those is killer, of course.

    In raster performance they are doing well, IMO, just with the 4K/128MB limit in high-bandwidth situations.

    Consoles are basically behind in every category, quite a lot so, and this is just now. RDNA3/RTX 4000 will improve a lot, I guess; NV feels even more need to improve, and AMD is on the right track.
     
  12. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,286
    Likes Received:
    1,551
    Location:
    London
    The idea that RDNA 1 has utterly broken DXR support is fun. But I'm dubious. Though it might explain why RDNA 1 and RDNA 2 compute units appear to be pretty much exactly the same size.

    Then again the dedicated ray acceleration hardware in RDNA seems to be so low in complexity that there's practically nothing to see anyway.

    So I'm going with AMD being blindsided. NVidia spent quite a long time and employed seemingly dozens if not hundreds of ray-tracing-focused engineers, long enough that it was able to present Microsoft with a fait accompli.

    I'm going to guess that inline tracing (DXR 1.1) is something that AMD asked for, because it suits them.

    Once GDC arrives I suppose we'll get a clearer idea about which techniques run well on AMD, and perhaps some historical insight.
     
    Dictator, Qesa, PSman1700 and 4 others like this.
  13. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    2,044
    Likes Received:
    1,472
    Location:
    France
    Inline RT helps nvidia too, right ?
     
  14. HLJ

    HLJ
    Regular Newcomer

    Joined:
    Aug 26, 2020
    Messages:
    352
    Likes Received:
    589
  15. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,500
    Likes Received:
    4,108
    Yeah, NVIDIA is faster in the 3DMark DXR 1.1 test and in Dirt 5 as well (RT-wise).
     
    PSman1700, pharma and Rootax like this.
  16. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    2,193
    Likes Received:
    1,560
    Location:
    msk.ru/spb.ru
    It simplifies the programming side of things. H/w side is basically unaffected by this addition.
     
  17. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,286
    Likes Received:
    1,551
    Location:
    London
    Dunno. Is there evidence that NVidia performance is improved by inline techniques?
     
  18. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    12,986
    Likes Received:
    15,717
    Location:
    The North
    I was under the assumption that inline was the ideal method of casting rays going forward, since that way you wouldn't need separate draw calls for RT. Nvidia shouldn't perform any worse going with inline; it just may not perform better than it currently does.
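    For anyone unfamiliar with the distinction being discussed: DXR 1.0 traces rays via DispatchRays with separate raygen/hit/miss shader tables, while DXR 1.1's RayQuery lets any shader stage trace rays inline within its own control flow. A minimal HLSL sketch of the inline path, a shadow ray from a compute or pixel shader (the register binding and function name are illustrative, not from any post above):

```hlsl
// Illustrative binding; slot t0 is an assumption for this sketch.
RaytracingAccelerationStructure g_tlas : register(t0);

bool PointIsShadowed(float3 origin, float3 toLight, float maxDist)
{
    RayDesc ray;
    ray.Origin    = origin;
    ray.Direction = toLight;
    ray.TMin      = 0.001;   // offset to avoid self-intersection
    ray.TMax      = maxDist;

    // For shadows, any hit terminates the search early.
    RayQuery<RAY_FLAG_ACCEPT_FIRST_HIT_AND_END_SEARCH> q;
    q.TraceRayInline(g_tlas, RAY_FLAG_NONE, 0xFF, ray);
    q.Proceed();  // traversal runs inline in this shader's control flow

    return q.CommittedStatus() == COMMITTED_TRIANGLE_HIT;
}
```

    No shader tables or separate RT pipeline state are involved, which is the "simplifies the programming side" point made earlier in the thread.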
     
    Krteq likes this.
  19. ToTTenTranz

    Legend Veteran

    Joined:
    Jul 7, 2008
    Messages:
    12,045
    Likes Received:
    7,005
    Not sure if someone thought this would be some gotcha comment, but yes, of course 16GB in a GPU is mostly useless at the moment.


    Difference being, the current and next >180 million 9th-gen consoles' userbase will have >12GB of available VRAM and RDNA2 levels of RT performance, not 8/10GB of available VRAM with RTX 30 levels of RT performance.

    VRAM utilisation in PC games started to skyrocket after the 8th-gen consoles released, and 4GB high-end cards like the R9 290 and GTX 980 had pretty bad performance from 2015 onwards.
     
  20. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    2,193
    Likes Received:
    1,560
    Location:
    msk.ru/spb.ru
    8th gen consoles bumped the RAM sizes by 16x, from 512MB to 8GB while staying with HDDs as the main storage.

    9th gen consoles bumped the RAM sizes only by 2x, from 8 to 16 GBs while moving to ultra fast SSDs for storage.

    If you're expecting this gen to have a similar effect on PC-side RAM and VRAM sizes, then you should really think more on this.

    I imagine that no cards with 8+ GB of VRAM will ever have any issues running multiplatform games at console-level IQ. Devs will have to be careful and "creative" to support cards with less than 16 GB (with 16 they can just not care at all), but these cards will likely do fine, just as 4GB cards in fact did for the overwhelming majority of 8th-gen titles.
     
    PSman1700 likes this.


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.