AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

Discussion in 'Architecture and Products' started by BRiT, Oct 28, 2020.

  1. Putas

    Regular

    Joined:
    Nov 7, 2004
    Messages:
    737
    Likes Received:
    354
    Don't think so. Competitive gamers mostly don't care about graphical fidelity enough to buy such expensive cards anyway.
     
  2. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    Psychologically, I like that you get the full chip with 6900 XT - unlike RTX 3090 which still is technically a salvage part.
     
    Lightman and PSman1700 like this.
  3. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,451
    Likes Received:
    471
    Mostly according to reviews testing only 5-8 games. The review at ComputerBase, which tested 17 games, shows the same RX 6900 XT-RTX 3090 performance delta (7%) at both 1440p and 4K.
     
  4. I think it's quite the feat that the 128MB of IC works so well on day 0, considering how skeptical many tech reviewers/analysts seemed of its use when these cards were announced. Also, frametime consistency on Navi 21 cards is unparalleled, which translates to a better gaming experience.
    There's probably a bit of headroom left in driver and game optimization, too.

    Regardless, 4K is absolute overkill in 99.99% of cases, and AMD needs to put out their new upscaler ASAP so that people can play at 1440-1800p + upscaling.

    Maybe with RDNA3 at 5nm we'll see 192/256MB of IC in the next top-end GPU, which could reach a significantly higher hitrate. That slide from AMD with the cache/resolution curves shows that the 4K curve isn't anywhere near flatlining at the maximum 150MB on the scale.
    Though I guess eventually AMD will also need faster VRAM, of course.
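    (A toy illustration of the saturation argument in the post above, with entirely invented working-set sizes rather than anything from AMD's slide: hit rate climbs steeply while the cache is smaller than the per-frame working set, then flattens once it covers it.)

```python
# Purely illustrative toy model of why cache hit rate flattens with size:
# once the cache covers the per-frame working set, extra capacity adds
# little. The working-set sizes below are invented for illustration.

def toy_hit_rate(cache_mb, working_set_mb):
    # Fraction of the working set resident in cache, capped at 100%.
    return min(1.0, cache_mb / working_set_mb)

working_sets = {"1080p": 96, "1440p": 160, "4K": 384}  # invented numbers
for res, ws in working_sets.items():
    for cache in (64, 128, 192, 256):
        print(f"{res}: {cache} MB -> {toy_hit_rate(cache, ws):.0%}")
```

    Under this (crude) model, 1080p saturates by 128MB while the 4K curve is still climbing at 256MB, which is the shape the slide suggests.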
     
    Lightman likes this.
  5. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,397
    That's because once you get out of the typical modern benchmarking suite, Navi 21 tends to show weaker performance at 1440p. That probably has something to do with driver optimizations for effective use of IC. I won't be surprised if Navi 21's 4K performance generally improves over time.
     
  6. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Hardware Unboxed finally manned up and tested some proper RT titles this time, and it's a bloodbath: the 3090 is between 60% and 120% faster at 1440p without DLSS.

     
    Babel-17 and PSman1700 like this.
  7. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,887
    Likes Received:
    4,534
  8. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    15,134
    Likes Received:
    7,679
    Competitive gamers have been running overclocked 2080 Tis with game settings on low at 1080p. They buy whatever can push the most frames. I think benchmarks might show AMD pulling ahead there.
     
    PSman1700 likes this.
  9. Erinyes

    Regular

    Joined:
    Mar 25, 2010
    Messages:
    808
    Likes Received:
    276
    They did say they wanted to do yearly refreshes, but it's not clear whether that means a Zen 2 XT-style refresh or something else. Given that they apparently have 4 GPUs in the RDNA2 stack (N21, N22, N23 and N24), and the entire stack will only have launched by Q2'21, I have a hard time seeing RDNA3 by Q4. If they truly are targeting another 50% perf/W improvement, it's a bigger effort than Zen 3 over Zen 2, and that took 16 months with a much smaller die. Even Nvidia has taken ~2 years between Pascal, Turing and Ampere. I would still bet on RDNA3 only in 2022.

    Is it actually official that Samsung is using RDNA2 and not 3? I haven't read that anywhere. If I remember correctly, the rumour was that they're using RDNA3.

    Regarding R&D, it is my understanding that MS & Sony pay a significant part of the R&D up front/during development in exchange for lower royalties later, and this is the route AMD has been taking with the consoles. So saying that RDNA2 was partly funded by MS/Sony is for the most part correct.
    Tbh, 4K isn't really a popular gaming resolution on PC (maybe on consoles, as far more people have 4K TVs than 4K monitors). I just checked the latest Steam survey for November 2020, and 4K is a whopping 2.25% of the install base. The vast majority of gamers are on 1080p and 1440p, and this is unlikely to change much in the near future (especially on mobile, which is mostly 1080p and a sizeable chunk of the gaming market today). I don't see a significant ROI for AMD in increasing the IC as much as you suggest, especially early in the 5nm process lifecycle.
     
    Deleted member 13524 likes this.
  10. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
    It's a cache, so conventionally it should work about as well now as it can work in general.
    At least for now, the RDNA2 doc regards it as architecturally invisible, and at the driver level the publicly visible changes are mostly limited to a handful of flags related to not using the cache, most of which don't seem to be performance-critical. Some, like the ones covering metadata for things like HiZ and compression, seem like they would cost performance if used, but that's speculation on my part.
    Flagging specific pages to skip the cache is something the virtual memory entries for Sienna Cichlid have as an option, but trying to hunt down resources to avoid caching seems like approaching the problem from a less efficient direction. It's not clear yet whether anything but the driver can make this change, or whether there are many opportunities to use it.

    Perhaps we'll find out that some of the offenders haven't been filtered by the drivers yet, or there's a better way to massage access patterns and cache use.

    That graph seems to leave some context out, such as why there are so many points on it (3 resolutions, more than 2 colors for 1440p, dots and x's). Endnotes are mentioned for it, which I've seen indicate that part of the extrapolation is based on CU count. How exactly that maps to what's on display is unclear to me.
    4K still seems to be in the "steep" part of its curve that the other resolutions had prior to leveling off. So far the pattern has been that a higher resolution levels off before meeting the other curves, so 4K's steep rise is somewhat unpromising. 1440p has a more pronounced flattening at the far end, and I wonder if there's a later point where it might start to rise slightly again, as the HD curve does.

    I wonder if there are some ancient games, or aggressively modded older games, that could be tweaked to fit their graphics contexts wholly on-die. They'd bottleneck on something else far earlier, but it would be a somewhat funny data point for a game that fit in VRAM on ATI 9800-era hardware to achieve a practical minimum in memory bandwidth consumption.
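    As a rough back-of-the-envelope check (all numbers below are assumptions typical of the era, not measurements): a 2003-era frame's buffers plus a generous texture budget come in far below 128MB.

```python
# Back-of-the-envelope: could a Radeon 9800-era frame fit in 128 MB?
# Assumed figures: 1024x768 with 32-bit color, 32-bit depth-stencil,
# double buffering, and a generous 64 MiB texture budget.

def mib(nbytes):
    return nbytes / (1024 * 1024)

w, h = 1024, 768
color    = w * h * 4          # 32-bit front color buffer
back     = w * h * 4          # back buffer (double buffering)
depth    = w * h * 4          # 24-bit depth + 8-bit stencil
textures = 64 * 1024 * 1024   # generous texture budget for the era

total = color + back + depth + textures
print(f"Estimated frame working set: {mib(total):.1f} MiB")
# Comfortably under the 128 MiB Infinity Cache capacity.
```

    So under these assumptions the entire working set would indeed be resident on-die, leaving memory bandwidth nearly idle.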
     
    Pete, CarstenS, Lightman and 2 others like this.
  11. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    15,134
    Likes Received:
    7,679
    CS 1.6, old RTS games, Quake 3 (maybe) see competitive play still. Would definitely be an interesting experiment.
     
    PSman1700 and yuri like this.
  12. pTmdfx

    Regular

    Joined:
    May 27, 2014
    Messages:
    415
    Likes Received:
    379
    They could pull the same staggered launch, or launch only a new halo product above Navi 21 without dropping it.
     
  13. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    4,027
    Likes Received:
    90
    A fine showing from AMD today. Now if only you could actually buy a card at or near MSRP... Same goes for Nvidia.
     
    Lightman likes this.
  14. kalelovil

    Regular

    Joined:
    Sep 8, 2011
    Messages:
    568
    Likes Received:
    104
    They are leaving a conspicuous gap between the 80cu and 40cu dies.
     
  15. Erinyes

    Regular

    Joined:
    Mar 25, 2010
    Messages:
    808
    Likes Received:
    276
    There is the 60CU salvage die, of course. And they're apparently clocking the 40CU die ~10-15% higher while giving it 75% of the memory bandwidth of the 80CU die. That should put it within ~20% of the 60CU die.
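    A quick sanity check of that arithmetic, using only the figures assumed in the post above (a naive CU-times-clock throughput model, not measured performance):

```python
# Rough sanity check of the relative-throughput estimate above.
# All numbers are the post's assumptions, not measured data.

def relative_throughput(cus, clock_scale):
    """Naive model: raw shader throughput ~ CU count x relative clock."""
    return cus * clock_scale

cut_die = relative_throughput(60, 1.0)    # 60CU salvage die
n22     = relative_throughput(40, 1.125)  # 40CU die, ~12.5% higher clock

print(f"40CU die vs 60CU die: {n22 / cut_die:.0%} of raw throughput")
# ~75% of raw throughput and 75% of the bandwidth; since games rarely
# scale linearly with CU count, landing within ~20% in practice is
# plausible.
```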
     
  16. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
    Had to double-check; the official wording is "custom IP based on RDNA architecture" with no version number mentioned. But I'm pretty sure RDNA2 was mentioned somewhere when this came out.
    https://www.amd.com/en/press-releas...ce-strategic-partnership-ultra-low-power-high
    https://news.samsung.com/global/amd...-power-high-performance-graphics-technologies
     
  17. yuri

    Regular

    Joined:
    Jun 2, 2010
    Messages:
    283
    Likes Received:
    296
    If this was the case the AMD marketing team would base the whole launch theme on that fact, since presenting the cards as 4k game changers would be silly and incompetent. Oh wait...
     
    pharma likes this.
  18. manux

    Veteran

    Joined:
    Sep 7, 2002
    Messages:
    3,034
    Likes Received:
    2,276
    Location:
    Self Imposed Exhile
    Another view on the 6900 XT, scalping, Cyberpunk, ray tracing... I really like PCWorld's Full Nerd podcast. I hope others like it as well.

     
    Lightman likes this.
  19. techuse

    Veteran

    Joined:
    Feb 19, 2013
    Messages:
    1,424
    Likes Received:
    908
    Seems likely RDNA 2 will pull ahead as more current gen games fill these benchmarking suites.
     
  20. Jay

    Jay
    Veteran

    Joined:
    Aug 3, 2013
    Messages:
    4,029
    Likes Received:
    3,428
    Current or previous gen?
    Current gen will get more DXR-enabled games as time goes on.
    Previous gen without DXR will fare much better on RDNA2.
     
    PSman1700 likes this.