AMD Radeon RDNA2 Navi (RX 6800, 6800 XT, 6900 XT) [2020-10-28]

Discussion in 'Architecture and Products' started by BRiT, Oct 28, 2020.

  1. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    1,948
    Likes Received:
    1,077
    Location:
    msk.ru/spb.ru
    So what you're saying right now is that 3060 will run "most multiplatform titles in the long run" better than either 3070 or 3080?
     
    PSman1700 likes this.
  2. Lightman

    Veteran Subscriber

    Joined:
    Jun 9, 2008
    Messages:
    1,913
    Likes Received:
    835
    Location:
    Torquay, UK
    Sorry, I wasn't clear in that sentence. What I was thinking was that MM on PS5 already has reflections, and with the 6800 XT we have roughly 2080 (Ti)-level performance. So Navi 21 has enough performance to do more than shadows, but then there are the lower-end cards and consoles that developers have to take into account.

    Hope this clears it a bit more.
     
    PSman1700 likes this.
  3. Lightman

    Veteran Subscriber

    Joined:
    Jun 9, 2008
    Messages:
    1,913
    Likes Received:
    835
    Location:
    Torquay, UK
    See my clarification. Typing on the phone when a call interrupts your thought process resulted in a confusing statement.

    With the 6800 (XT), which was referred to as AMD's RT card, we should be able to see results closer to full FPS at half resolution in a title like that.
     
  4. neckthrough

    Newcomer

    Joined:
    Mar 28, 2019
    Messages:
    47
    Likes Received:
    93
    Yeah, I think we can expect to see some good use of RT from talented devs like Insomniac, R*, Nixxes, etc. They do have to put additional effort into PC ports though, which has been hit or miss. I don't expect to see anything good from Ubisoft.
     
  5. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    4,270
    Likes Received:
    1,915
    Yes, a 6800 XT is double the PS5 GPU; I'd expect better RT performance from that.
     
    Lightman likes this.
  6. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,266
    Likes Received:
    1,524
    Location:
    London
    Which games run their ray tracing effects at "full resolution" and full frame rate?
     
    Lightman likes this.
  7. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    17,555
    Likes Received:
    7,467
    Yes and MS only includes that into DX if multiple hardware vendors (in this case at least 2) can deliver on those features. Nothing I wrote contradicts what you just replied with. It's highly likely that MS had been discussing the potential for RT hardware acceleration with both NV and AMD for over a decade as they've been doing R&D WRT RT in various forms for over 2 decades now.

    Similar to how Tessellation wasn't included into DX until it was possible for NV to support it, despite AMD having support for hardware accelerated tessellation for years. NV was also unlikely to support tessellation unless MS pushed them on it. MS had been using tessellation (in the X360) prior to it being included in DX.

    RT is no different. It isn't in the best interest of MS or consumers if only one hardware vendor can support a key feature of DX. This wasn't an optional feature flag, but a cornerstone of that version of DX.

    Regards,
    SB
     
  8. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    8,446
    Likes Received:
    2,626
    Location:
    Guess...
    Fair enough. If you're arguing that VRAM usage at high settings/resolutions will go up beyond 8 GB or even 10 GB over the course of the next few years, then I don't disagree at all. I thought you were making the argument that 8 or 10 GB would hamstring today's high-end GPUs relative to the new consoles, which I think is far from certain based on how 4 and 6 GB cards performed last generation, although it can't be discounted entirely, as I mentioned above, due to SSDs acting as memory amplifiers.

    I read this after writing the above response to your earlier post, so it seems I was in fact correct that you are making this claim. So I ask you to support it with a precedent from the last generation: show me one multiplatform game that proves a significantly more powerful GPU with only 50% of the consoles' memory (4 GB) is insufficient to maintain parity with those consoles. The GTX 980 or 290X seem like appropriate starting points given your post above.

    Just because developers can't guarantee a specific base performance level of PC IO doesn't mean they can't take full advantage of the higher-performing systems. Streaming bandwidth requirements are trivially easy to scale. Simply halving texture resolution (from 4K to 2K) would drop streaming requirements by almost 75%. And there are plenty of other ways to scale things too (LOD levels, draw distance, etc.). Developers could accommodate max settings on PCs with less VRAM provided they have a fast SSD, or slow SSDs provided they have more VRAM. And if you have neither, you simply drop texture resolution or some other scalable element. And that's without considering the impact of system RAM acting as an additional cache between the SSD and VRAM.
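    The "halving texture resolution drops streaming by almost 75%" claim is just square-law arithmetic; a minimal sketch (the function name and resolutions are illustrative, not from the post):

    ```python
    # Back-of-envelope sketch of the texture-streaming scaling argument above.
    # Assumes streamed data volume scales with texel count (ignores compression
    # overheads, which is why the post hedges with "almost").

    def streaming_ratio(src_res: int, dst_res: int) -> float:
        """Fraction of streaming bandwidth needed after changing texture resolution.

        Texel count scales with the square of linear resolution, so halving
        resolution (4K -> 2K) quarters the data streamed.
        """
        return (dst_res / src_res) ** 2

    ratio = streaming_ratio(4096, 2048)
    print(f"bandwidth needed: {ratio:.0%} of original")  # 25%, i.e. a 75% cut
    ```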
     
    PSman1700 likes this.
  9. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    1,948
    Likes Received:
    1,077
    Location:
    msk.ru/spb.ru
    System RAM doesn't exist in this world. A PC with 16GB RAM and 8GB VRAM has apparently less RAM than a console with 16GB RAM.
     
    PSman1700 and DavidGraham like this.
  10. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,692
    Likes Received:
    789
    Location:
    WI, USA
    Has anyone seen a game use more than 16GB system RAM? Dishonored 2 was the first game that forced me to go up from 8GB because it was causing disk swapping. Having lots of room for system file caching certainly has benefits too.
     
    #2190 swaaye, Jan 14, 2021 at 9:46 PM
    Last edited: Jan 14, 2021 at 9:52 PM
  11. techuse

    Regular Newcomer

    Joined:
    Feb 19, 2013
    Messages:
    620
    Likes Received:
    358
    The 8 GB GPU of today is 25% faster than a PS5. The 980 was 200% faster than a PS4 (before the Nvidia performance tax brought it down to <2x). Not really a legit comparison.
     
    ToTTenTranz likes this.
  12. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    4,270
    Likes Received:
    1,915
    Which 8 GB GPU is 25% faster than the PS5's 10 TF (at max clocks) RDNA2 GPU?
     
  13. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    8,446
    Likes Received:
    2,626
    Location:
    Guess...
    I wasn't comparing to any specific contemporary GPU. I was making the more general point that having 50% of the consoles' VRAM didn't appear to act as a hard barrier to parity, irrespective of GPU power, last generation. If it did, we'd see GPUs like the 980 falling behind the last-generation consoles regardless of how much more powerful they are. Could the lack of VRAM be impacting their relative performance even at console settings? Yes, perhaps. But it's not a hard barrier that justifies a binary statement like "8 GB (or 10 GB) is insufficient for parity in multiplatform titles this generation". It may well turn out to be, but there's no precedent at the moment for making such a definitive statement.
     
    pharma, DegustatoR and PSman1700 like this.
  14. NightAntilli

    Newcomer

    Joined:
    Oct 8, 2015
    Messages:
    104
    Likes Received:
    131
    To me, the interesting part is not that the RTX 3080 is 34% faster at 4K. That was already the case without RT;

    RT Shadows OFF:
    RTX 3080: 86.7 avg, 67.0 min
    RX 6800XT: 69.6 avg, 57.0 min

    The RTX 3080 is already 25% faster here... Now let's compare their own internal scaling with RT;

    RTX 3080
    RT Shadows Off; 86.7 avg, 67.0 min
    RT Shadows Fair; 74.6 / 56
    RT Shadows Good; 74.3 / 55.0
    RT Shadows High; 67.1 / 49.0

    That translates in performance percentage to;
    Fair; 86.0% / 83.6%
    Good; 85.7% / 82.1%
    High; 77.4% / 73.1%

    The same for the 6800XT gives us;

    6800XT
    RT Shadows Off; 69.6 avg, 57.0 min
    RT Shadows Fair; 60.2 / 49.0
    RT Shadows Good; 60.0 / 48.0
    RT Shadows High; 50.2 / 40.2

    That translates in performance percentage to;
    Fair; 86.5% / 86.0%
    Good; 86.2% / 84.2%
    High; 72.1% / 70.5%

    So in actuality, RDNA2 is scaling equally to Ampere here (and better on minimums), except at the High setting, where it falls behind. And even at the High setting, it's not THAT much worse... That is what I find the most interesting about this.

    nVidia will definitely push to crank RT higher to put their cards in the best light, even when not necessary. I'm getting GameWorks/Tessellation vibes.
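    The percentages quoted in this post follow directly from the FPS figures; a quick sketch that reproduces them (the helper function is mine, the numbers are the post's):

    ```python
    # Reproduce the RT-scaling percentages above from the raw FPS figures.
    # "retained" = performance at a given RT Shadows setting relative to Off.

    fps = {  # gpu -> setting -> (avg, min)
        "RTX 3080":  {"Off": (86.7, 67.0), "Fair": (74.6, 56.0),
                      "Good": (74.3, 55.0), "High": (67.1, 49.0)},
        "RX 6800XT": {"Off": (69.6, 57.0), "Fair": (60.2, 49.0),
                      "Good": (60.0, 48.0), "High": (50.2, 40.2)},
    }

    def retained(gpu: str, setting: str) -> tuple:
        """Percent of RT-off avg/min FPS retained at the given RT setting."""
        base_avg, base_min = fps[gpu]["Off"]
        avg, mn = fps[gpu][setting]
        return round(100 * avg / base_avg, 1), round(100 * mn / base_min, 1)

    for gpu in fps:
        for setting in ("Fair", "Good", "High"):
            a, m = retained(gpu, setting)
            print(f"{gpu} {setting}: {a}% avg / {m}% min")
    ```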
     
  15. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    1,948
    Likes Received:
    1,077
    Location:
    msk.ru/spb.ru
    Yeah, because RT is not necessary now apparently.
     
    pharma and PSman1700 like this.
  16. NightAntilli

    Newcomer

    Joined:
    Oct 8, 2015
    Messages:
    104
    Likes Received:
    131
    I held and still hold the position that all current hardware is not strong enough to use RT properly.

    You can go look at the image quality comparison between Off, Fair, Good and High. All three look better than Off, but, considering the performance drop, does the High setting justify its cost over Good? Good justifies it over Fair, because the framerate cost is practically zero, and the framerates are playable.

    And then you have to remind yourself that WoW is not a recent game at all, and has extremely outdated graphics. Any game that has graphics that are up to modern standards and also enables RT, inevitably means either atrocious performance, or having to lower settings or resolution. I won't pay $700+ to play at 1080p/60, if that.
     
    ToTTenTranz likes this.
  17. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    1,948
    Likes Received:
    1,077
    Location:
    msk.ru/spb.ru
    This has been proven false several times already, even on the new consoles.
     
  18. Dampf

    Newcomer

    Joined:
    Nov 21, 2020
    Messages:
    42
    Likes Received:
    90
    That is not how this works at all. The RAM is shared between CPU and GPU, so the CPU will take a considerable amount of those 13.5 GB. In fact, Microsoft expects most games to use 3.5 GB as DRAM; that is why they connected 10 GB via the faster bus to act as video memory. Going by that, 10 GB of VRAM is enough for parity.

    However, that is not the end of the story. Next-gen games definitely will use more than 3.5 GB, as CPU-related tasks will increase dramatically given that the new Ryzen CPUs are a much more performant baseline than the old Jaguar cores. Watch Dogs Legion demonstrates that this is already the case even for cross-gen games: it has to run at Medium texture settings with DXR at dynamic 4K (mostly between 1440p and 1600p) on the Series X, while a PC with 16 GB DRAM + 8 GB VRAM has zero issues running the game at those resolutions with DXR and the high-res texture pack installed, and that pack makes a big difference to the game's overall visual quality.

    If the whole 10 GB were accessible as VRAM on the Series X, the game could run the high-res texture pack with ease, just like the 3080, but that is obviously not the case. This means the CPU takes more RAM than expected, leaving less memory as VRAM since it's a shared configuration; on PC you obviously don't have that issue, as the CPU has separate DRAM.
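    The memory budget being argued over here can be laid out numerically. A minimal sketch, using Microsoft's published Series X figures plus the 3.5 GB CPU-side expectation cited in the post:

    ```python
    # Series X memory-budget arithmetic underlying the post above.
    # Pool sizes and OS reserve are Microsoft's public Series X numbers;
    # the 3.5 GB CPU-side share is the expectation cited in the post.

    TOTAL_GB      = 16.0
    OS_RESERVE_GB = 2.5    # reserved by the system software
    FAST_POOL_GB  = 10.0   # "GPU-optimal" memory on the 560 GB/s bus
    SLOW_POOL_GB  = 6.0    # "standard" memory on the 336 GB/s bus

    game_budget_gb = TOTAL_GB - OS_RESERVE_GB       # what a game can touch
    cpu_share_gb   = 3.5                            # expected CPU/game-logic use
    vram_like_gb   = game_budget_gb - cpu_share_gb  # left over as "VRAM"

    print(f"game budget: {game_budget_gb} GB")   # 13.5 GB
    print(f"VRAM-like share: {vram_like_gb} GB") # 10.0 GB, matching the fast pool
    ```

    The point of contention is that `cpu_share_gb` is an expectation, not a partition: if a game's CPU side grows past 3.5 GB, the effective VRAM shrinks below the 10 GB fast pool.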
     
  19. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    4,270
    Likes Received:
    1,915
    Also, DDR4/5 system RAM has better latency timings too.
     
    pharma and Dampf like this.
  20. NightAntilli

    Newcomer

    Joined:
    Oct 8, 2015
    Messages:
    104
    Likes Received:
    131
    I disagree. But this is not the place to discuss this topic, and even if it were, I'm not interested. The fact that scaling seems about equal up to a certain point is a much more interesting topic to discuss, because it actually concerns the RDNA2 architecture, rather than pointless repetition that RT is the next messiah of gaming. After all, this is the 6000 series thread.
     
    ethernity, ToTTenTranz and swaaye like this.
