AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

Discussion in 'Architecture and Products' started by BRiT, Oct 28, 2020.

  1. Leovinus

    Newcomer

    Joined:
    May 31, 2019
    Messages:
    142
    Likes Received:
    73
    Location:
    Sweden
Weird question perhaps, but would you gals and gents know whether or not RDNA is better suited to Apple's Metal API thanks to the Infinity Cache and other architecture enhancements? My mind is drawing a connection between Infinity Cache and the likes of eDRAM for tiled rendering, which Metal is geared towards. Just a spurious thought.
     
  2. pTmdfx

    Regular

    Joined:
    May 27, 2014
    Messages:
    415
    Likes Received:
    379
    Immediate mode GPU is still immediate mode in Metal. TBDR support at API level is a reflection of the graphics pipeline implementation in the hardware, not the other way round.

    So no. You still won’t get stuff like memoryless render targets and imageblocks (abstractions for the TBDR tile memory) when using RDNA2 on Metal, even if RDNA2 is supported by the macOS AMD GPU driver.

    TBDR tile memory also isn’t a transparent hardware cache like Infinity Cache. It is a tile-private scratchpad memory, which is an alien concept to an immediate mode pipeline.
     
    #1582 pTmdfx, Dec 3, 2020
    Last edited: Dec 3, 2020
  3. Remij

    Regular

    Joined:
    May 3, 2008
    Messages:
    677
    Likes Received:
    1,256
    How exactly does Nvidia tank performance on AMD GPUs?

    The blame needs to go on the developers of the games.
     
    PSman1700 and Rootax like this.
  4. joesiv

    Newcomer

    Joined:
    Nov 1, 2012
    Messages:
    57
    Likes Received:
    5
NVIDIA will offer development support to developers, often working in the developers' offices to "optimize" for NVIDIA hardware. What effect their "optimizations" have on AMD hardware could vary, I would guess.

    Nvidia has a lot more money to offer developer support than AMD does.

I'm more of a lurker, but I figured I'd chime in on raytracing performance on AMD cards, specifically Minecraft, which is horrible on RDNA2. NVIDIA has spent a lot of time/money developing and reworking the render pipeline in Minecraft to work well with RTX cards:


It's a good listen since it is an interview with 4 NVIDIA developers who are working full time on Minecraft RTX, especially if you listen to it from the perspective of "does it just work?", or whether optimizations are required for a specific hardware architecture. You'll find that, even though it's path traced, there are certain things that needed to be done, and still need to be done, to make it work best with RTX hardware. Obviously, the same would be true for RDNA2; since the optimizations are being done by NVIDIA staff, on NVIDIA hardware, I think it's obvious why performance stinks on RDNA2.

    Once it's out of beta, and perhaps even has a console release, I wonder if the console RDNA optimizations could be ported over to PC, and improve the performance.
     
  5. ECH

    ECH
    Regular

    Joined:
    May 24, 2007
    Messages:
    692
    Likes Received:
    30
As posted above, you cannot force someone to believe in plausible deniability when we actually know you cannot plausibly deny it.
    :grin:
     
    #1585 ECH, Dec 3, 2020
    Last edited: Dec 3, 2020
  6. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,502
    Likes Received:
    24,398
    Now, now. Developers don't make decisions, that's the job of Project Management and Management...
     
  7. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,451
    Likes Received:
    471
    Isn't RDNA 3 planned for 2022?
     
  8. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,397
    Same way AMD tanks performance on Nv GPUs - by providing 3rd party developers with solutions to problems which are best suited to their h/w.

    In a perfect world of unlimited time and money budgets? Sure.

The point was more about the laughable idea that Nv needs something like a post on GPUOpen to find out the weaknesses of competitors' products though.
     
    PSman1700 likes this.
  9. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,080
    Likes Received:
    997
    Location:
    Planet Earth.
    First hand experience ? Or examples please...

My experience as an AAA dev is the opposite: AMD provides code that works well on NV/AMD, but NV provides code that works well on NV and is slow on AMD. (And changes could be made to improve AMD perf with little to no impact on NV perf...)
     
    Silent_Buddha, Kej, Alexko and 9 others like this.
  10. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,397
    Okay. What I've heard from AAA devs is the opposite of what you're saying.
    And as for examples - where's the D3D11 renderer in Valhalla? Which D3D12 exclusive features does this game use?
    Now could you give me an example of the opposite sort?
     
    pharma, xpea and PSman1700 like this.
  11. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,887
    Likes Received:
    4,534
Usually some outlier results that buck the trend make you wonder if there is something else going on.
    Devil May Cry 5 Benchmark Performance Analysis | TechPowerUp
     
    PSman1700 likes this.
  12. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
    AMD says the years in their roadmaps are inclusive (so it could be 2022), but so far every single architecture under current roadmap style (both CPU and GPU) has released as if it was exclusive (which would suggest 2021).
Also, Wang or some other Radeon bigwig said (or even promised?) they'd deliver new products every year, be it a new architecture, a tweaked architecture, or a new process.
     
    PSman1700 likes this.
  13. Sometime toward the end of next year, the Exynos with RDNA should be ready for industrialization for the next-gen Galaxy in the following spring. Although it's unclear if Samsung wants the latest and greatest RDNA at that point in time.
    But for a mobile form factor, they had better have more than the 50% perf/watt they are aiming for; 5nm will help. So most likely RDNA3 should be ready then. Probably launching in early 2022.
     
    pharma and PSman1700 like this.
  14. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,090
    We are three weeks away from 2021, so 2022 is almost 'next year'. Anyway, like the above post says, RDNA3 could actually see the light of day in 2021. About a year left at most doesn't seem so unrealistic.
     
  15. techuse

    Veteran

    Joined:
    Feb 19, 2013
    Messages:
    1,424
    Likes Received:
    908
    It's worse than that. NVIDIA code only runs well on their latest GPU; their own previous GPUs run poorly too. Just look at Pascal in the majority of titles since Turing's release.
     
    no-X likes this.
  16. NightAntilli

    Newcomer

    Joined:
    Oct 8, 2015
    Messages:
    104
    Likes Received:
    131
    They definitely don't need it. But it can make it a lot easier. I do hope that this is not the case this time, although as someone mentioned previously, Minecraft RTX says quite a bit. And it's actually interesting that we already saw path traced Minecraft running on an Xbox Series X, but somehow the 6800 cards perform abysmally in comparison...
    Hopefully the newer APIs will finally get on their feet on PC, so that more optimizations from RDNA2 in the consoles are translated to RDNA on the PC.
     
  17. Frenetic Pony

    Regular

    Joined:
    Nov 12, 2011
    Messages:
    807
    Likes Received:
    478
    What on earth are you even talking about? Valhalla has pretty much no vendor code anywhere. The only partnership was for marketing with AMD CPUs.
     
    chris1515 likes this.
  18. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,397
    I've been pretty clear on what I've said.
     
    pharma and PSman1700 like this.
  19. OlegSH

    Regular

    Joined:
    Jan 10, 2010
    Messages:
    797
    Likes Received:
    1,624
    Could you elaborate on this a little bit more?
    What are you doing as an AAA dev?
    What AMD code was working well on NV and what NV code was slow on AMD?

    Vendor code is not required at all.
    In the Pascal time frame, it was enough to just port code from consoles without aligning structured buffers to tank Pascal's performance by 5-10% in DX12 games - https://developer.nvidia.com/pc-gpu-performance-hot-spots
    By aligning these structures, or better, replacing them with constant buffers, one could easily get 5-10% gains in overall performance out of Pascal, with zero visual impact of course.
    Now, it's enough to just port code from consoles without optimizing constants and descriptors to tank performance on every GPU without CPU-writable video memory support, either in hardware or via driver profiles.
    SAM allows writing more descriptors into video memory, and the gain from SAM is a good indicator of how badly devs have optimized constant and descriptor usage on PC.
    Do you know why, all of a sudden, recent AMD-aligned titles all have integrated benchmarks (all were pushed by AMD in recent reviews), all benefit from SAM, and all are performance outliers (the RX 5700 XT is 8-10% slower than the 2070 Super when you test something other than AMD titles), sometimes to a ridiculous extent where the RX 5700 XT is capable of competing with the far faster RTX 2080 Ti in those titles, with pretty much no vendor code anywhere? :roll:
    Doesn't it look suspicious to you?
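
    [Editor's note: the structured-buffer alignment issue described above can be sketched with plain stride arithmetic. The sketch below is host-side C, not GPU code; the struct names and field layout are hypothetical, purely to illustrate why a per-element stride that is not a multiple of 16 bytes makes consecutive elements straddle 16-byte boundaries, which is the hot spot the linked NVIDIA article attributes to a 5-10% loss on Pascal.]

    ```c
    #include <stdio.h>

    /* Hypothetical structured-buffer element, unpadded: 28 bytes per
       element. With a 28-byte stride, only every 4th element starts on
       a 16-byte boundary. */
    typedef struct {
        float position[3]; /* 12 bytes */
        float radius;      /*  4 bytes */
        float color[3];    /* 12 bytes */
    } LightUnpadded;

    /* Same data padded to a 32-byte stride (a multiple of 16), so every
       element starts on a 16-byte boundary. */
    typedef struct {
        float position[3];
        float radius;
        float color[3];
        float _pad;        /* explicit padding: 28 -> 32 bytes */
    } LightPadded;

    int main(void) {
        /* Print where each element would begin inside the buffer. */
        for (int i = 0; i < 4; i++) {
            unsigned long off_u = i * sizeof(LightUnpadded);
            unsigned long off_p = i * sizeof(LightPadded);
            printf("element %d: unpadded offset %3lu (16B-aligned: %s), "
                   "padded offset %3lu (16B-aligned: %s)\n",
                   i,
                   off_u, off_u % 16 == 0 ? "yes" : "no",
                   off_p, off_p % 16 == 0 ? "yes" : "no");
        }
        return 0;
    }
    ```

    Padding the stride wastes 4 bytes per element but keeps every element within aligned 16-byte segments, which is the same trade-off as the constant-buffer replacement mentioned above.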
     
    #1599 OlegSH, Dec 4, 2020
    Last edited: Dec 4, 2020
    DegustatoR, PSman1700 and pharma like this.
  20. Dampf

    Regular

    Joined:
    Nov 21, 2020
    Messages:
    283
    Likes Received:
    474
    Well, Valhalla clearly doesn't run as well as it could on NVIDIA hardware; the 5700 XT reaching 2080 Ti performance is very suspicious, and Dirt 5 as well as Godfall underperform too. I have never seen a 5700 XT perform below a regular 2060 (without DX12U features, DLSS, and RT of course) in NVIDIA titles, which would be the equivalent of this behaviour for AMD on the NVIDIA-optimized side...
     
    DegustatoR, BRiT, PSman1700 and 2 others like this.