AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

Discussion in 'Architecture and Products' started by BRiT, Oct 28, 2020.

  1. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,395
    Mining.
     
    PSman1700 likes this.
  2. neckthrough

    Newcomer

    Joined:
    Mar 28, 2019
    Messages:
    138
    Likes Received:
    388
    You're all ignoring the harsh reality of Moore's Law leveling off. We're still getting area shrinks and power reduction, but $/xtor is the problem. Pandemic, mining and tariffs are blips but physics is the long-term issue. Historical GPU perf scaling rates can only be sustained with increased prices.
     
  3. Frenetic Pony

    Regular

    Joined:
    Nov 12, 2011
    Messages:
    807
    Likes Received:
    478
    Too damned right, we're looking at 2nm being the last node ever, even if we get there. There's no roadmap past High-NA EUV, among a thousand other problems. The good news is that some people with funding seem to realize this: there's private-company research into carbon nanotube transistors, as well as a really promising-sounding DARPA program funding another project.

    CNTs offer much higher clockspeeds for a given current/heat budget, and while they don't reach the insane speeds of graphene, they also seem much closer to practical. I'm particularly interested in the DARPA program, which aims to do the basic research and develop the industrial processes foundries could then use to build CNT chips, while also targeting stacked logic/RAM at the same time. That makes a lot of sense for clockspeeds measured in the tens of gigahertz. Current processors are constantly struggling against latency, and often bandwidth, as it is; raising clockspeeds several-fold without offering a new solution for latency and bandwidth is asking for severe underutilization.
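The latency argument above is easy to put in numbers. A back-of-envelope sketch (all figures are illustrative assumptions, not measurements of any real part): if DRAM round-trip latency stays fixed in nanoseconds, the number of core cycles wasted per miss scales linearly with clock frequency.

```python
# Back-of-envelope: a fixed memory latency costs more core cycles as clocks rise.
# All numbers are illustrative assumptions, not measurements.

DRAM_LATENCY_NS = 100  # assumed DRAM round-trip latency in nanoseconds


def stall_cycles(clock_ghz: float, latency_ns: float = DRAM_LATENCY_NS) -> int:
    """Cycles a core idles waiting on one memory access at a given clock.

    cycles = clock (GHz, i.e. cycles per ns) * latency (ns)
    """
    return round(clock_ghz * latency_ns)


for ghz in (4, 10, 40):
    print(f"{ghz} GHz: ~{stall_cycles(ghz)} cycles per DRAM access")
```

At an assumed 100 ns latency, a 4 GHz core loses ~400 cycles per miss, while a hypothetical 40 GHz CNT-class core would lose ~4000, which is the underutilization the post is pointing at.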
     
    Lightman likes this.
  4. Frenetic Pony

    Regular

    Joined:
    Nov 12, 2011
    Messages:
    807
    Likes Received:
    478
    Which coincidentally got me thinking. If AMD has chiplets and TSMC has ultra-dense SRAM caches, why not make the big caches even bigger for the CPU? The GPU is already maxed out on cache size versus use, but surely the CPU could make good use of a giant 256 MB+ LLC.

    Also, most relevant to this thread: I don't see why they're using a cache structure for the BVH. I get why the cache memory itself is being used, but treating it as a cache for such a giant working set means latency is severely hurt by sorting through all the tags. Why not use the static analysis that already finds the BVH in the code to virtualize that part of the cache as a directly addressed memory range? That would lessen the latency penalty purely through whatever microcode/drivers control it, without any hardware changes. The same static analysis could also be used to see whether the separate voltage planes/clocks controlling the areas that bottleneck RT the most can be driven as high as they'll go. If they're redesigning cooling for higher power delivery anyway...
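A toy model of the distinction the post is drawing: a set-associative cache has to index a set and compare stored tags before it can return data, while a directly addressed scratchpad region needs only a bounds check and no tag search. This is a minimal sketch with made-up geometry (line size, ways, sets are assumptions); real RDNA2 Infinity Cache behaviour is not public at this level of detail.

```python
# Toy contrast between a set-associative cache lookup (tag search) and a
# directly addressed scratchpad (no tags). Geometry below is assumed, purely
# for illustration -- not RDNA2's actual cache organization.

LINE = 64    # bytes per cache line (assumption)
WAYS = 16    # associativity (assumption)
SETS = 1024  # number of sets (assumption)


def cache_lookup(cache, addr):
    """Return (hit, tag_compares): search every way of the addressed set.

    `cache` is a list of SETS sets, each a list of stored tags.
    """
    index = (addr // LINE) % SETS
    tag = addr // (LINE * SETS)
    for compares, stored_tag in enumerate(cache[index], start=1):
        if stored_tag == tag:
            return True, compares  # hit after `compares` tag comparisons
    return False, WAYS  # miss: a full set costs up to WAYS comparisons


def scratchpad_lookup(addr, base, size):
    """Directly addressed region: one bounds check, zero tag compares."""
    return (base <= addr < base + size), 0
```

In this model a hit in a full 16-way set can cost up to 16 tag comparisons, while the scratchpad path always reports zero, which is the latency delta the post speculates a BVH-aware driver could claw back.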
     
  5. hughJ

    Regular

    Joined:
    Feb 7, 2002
    Messages:
    861
    Likes Received:
    417
    Yeah, Bob Colwell was beating that drum a lot in various EE talks about a decade ago; I assume he's left DARPA by now. He's definitely the polar opposite of Jim Keller's optimism.
     
  6. yuri

    Regular

    Joined:
    Jun 2, 2010
    Messages:
    283
    Likes Received:
    296
    AMD Rome (2019) features 256 MB of L3 in 16 partitions. AMD Milan (2021) has the same amount in 8 partitions. It's not a true unified LLC, but the amount of SRAM is there. Going forward, with the density gains and advanced packaging techniques, the need for less data movement and more locality exploitation should bring even bigger memory pools.
     
  7. xEx

    xEx
    Veteran

    Joined:
    Feb 2, 2012
    Messages:
    1,060
    Likes Received:
    543
    I have no idea why MS is not doing this. The Xbox already uses Windows and DirectX... making it a real PC would only take a little work.
     
  8. Rootax

    Veteran

    Joined:
    Jan 2, 2006
    Messages:
    2,400
    Likes Received:
    1,845
    Location:
    France
    Why would they do that? It's not like they have a huge inventory sitting in stock.
     
    BRiT likes this.
  9. Frenetic Pony

    Regular

    Joined:
    Nov 12, 2011
    Messages:
    807
    Likes Received:
    478
    I'm pretty sure that would violate their non-compete agreement with AMD, which no doubt goes something like: "We'll (AMD) sell you (MS) these APUs at a super low price as long as you agree not to use them to compete in markets we're already in," i.e. using them in a PC.

    I mean, who could compete with a decent 8-core CPU, a higher-end SSD, 16 GB of RAM, and a solid GPU for $500?
     
  10. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,089
    It's not all that interesting compared to what's available today anyway. In a console it's a much more interesting package, though.
     
  11. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
    It runs circles around what most players have and doesn't cost an arm, a leg and your firstborn. People on forums often forget that they're just an enthusiast drop in the vast ocean of players.
     
  12. JoeJ

    Veteran

    Joined:
    Apr 1, 2018
    Messages:
    1,523
    Likes Received:
    1,772
    Are you joking? Nothing is available. I could get a 6900 XT for the price of a used car. (I'm not going for a 6700, because I either want a 6800 for games or something smaller, which would be enough for dev.)
    You sound like PC gaming == high-end master race, but that's not the case. On average, people's PC specs have always been similar to the current consoles'.
    Actually, I'm more worried the situation becomes the reverse, because couch gamers get more chips than chair gamers.
    Though, who cares. High-end specs no longer seem to make such a big difference anyway.
     
  13. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,211
    Cuthalu, Lightman and PSman1700 like this.
  14. Qesa

    Newcomer

    Joined:
    Feb 23, 2020
    Messages:
    57
    Likes Received:
    107
    Well, it's the lowest-end card they've announced that supports RT, so I wouldn't read a lot into it. If you wanted to take a bigger dig, you could also say the minimum is 2060/6700 XT.

    More curious to me is that the recommended spec for rasterisation is a 1070 or 5700. The latter's normally ahead of a 1080, isn't it?
     
    Lightman and Wesker like this.
  15. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
    Erm, how exactly does the RX 6700 XT being the recommended spec suggest AMD isn't confident in their RT implementation?
    As @Qesa pointed out above, it's the slowest RT-supporting card they've announced, and even disregarding that, I simply can't figure out the logic behind your claim.
     
    Wesker likes this.
  16. This is actually an important point that Scott Herkelman raises in his PC Gamer live conversation.

    He said their market research found that people start to mentally distance themselves from buying PC gaming hardware - and from PC gaming itself - if they go through a long period without being able to buy new stuff. They observed it in 2017 and they're observing it now.
    I.e. mining is taking PC gamers away for good, as apparently there's a tendency not to come back when the market stabilizes.

    I'd guess there's also a cyclical tendency for PC gaming marketshare to decrease for a couple of years whenever a new generation of consoles comes out. However, this time it's a perfect storm:

    1 - the new consoles offer high-end gaming experiences;
    2 - PC CPUs and graphics cards equivalent to said consoles are overpriced. For example, the 6700 XT launched at an MSRP similar to the PS5 and Series X, while back in 2013 the HD 7870 GHz Edition was going for less than $280;
    3 - the CPUs/GPUs that would offer a substantial upgrade over the consoles are nowhere to be found (and/or reaching ridiculous prices).

    So while AMD and Nvidia are making record profits out of their GPU sales these quarters, their serviceable available market is shrinking.


    Speaking on a personal level, I had planned a major upgrade at the beginning of this year (probably a Ryzen 5900 plus the best deal between Navi 21 or GA102 at around $600), but of course I couldn't get any of those.
    Nowadays I'm progressively less inclined to deal with the miner/scalper/availability shitshow, and I certainly don't have the time and patience to follow availability on select Twitter accounts so I can rush to an e-store just to see the "unavailable" red sign on a product that was already way pricier than my initial budget.
    If push comes to shove and they start releasing games I want to play that aren't available on the PS5 (e.g. Elder Scrolls) and don't run well on my old PC hardware, I think I'll just give up on PC hardware and buy an Xbox.


    It usually doesn't at the start of a new console generation (which tends to raise the baseline by ~8x over the previous one), but I do agree the "graphical ROI" has been going down.
    Regardless, it's also a good thing for us consumers that high/top-end graphics cards don't provide a big difference over mid-range offerings, because the price of high-end GPUs has been rising steadily at a pace way above inflation and manufacturing cost.



    Yes, the 5700 is around 25% faster than the 1070, and even the 5600 XT is some 10-15% faster. It's a bit strange they're not mentioning the Vega cards that are contemporaneous with the Pascal models. It's a cross-gen game, so there's probably a very optimized path for GCN GPUs that puts the Vega 56/64 on par with the 1070/1080.

    Though, as we've been seeing, system requirements lists often make these weird, nonsensical comparisons. Cyberpunk 2077's recommended system requirements pair something like a 4-core Skylake from 2015 with a 6-core Zen 2 from 2019.
    Don't read too much into it.


    I bet you can.
     
    JoeJ likes this.
  17. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,395
    System requirements are rarely a sign of how a game will actually perform on different h/w. More often than not, they're just some random bullshit slapped together by an intern at the publisher.
     
  18. JoeJ

    Veteran

    Joined:
    Apr 1, 2018
    Messages:
    1,523
    Likes Received:
    1,772
    Yeah, those are some really good arguments.
    In the long run, or looking at it from a distance, there's more to it than that:
    * Looking at that huge box under my desk, it already feels old school. It's much worse if somebody maintains this expensive box only to play games. Considering the trend should be smaller boxes, not bigger ones, this argument now hits consoles as well. The PS5 really is a ridiculous, ugly something. This can't be the future; it does not feel modern or efficient.
    * Tech-oriented marketing does not help: 'Look! We have RT now! For only $1000!' - 'Meh, can't spot a difference at all. Try harder.' Though I'm not sure the majority of gamers even pays attention, they of course question the prices and ask what they're paying for. The arguments of 4K and RT are too weak.
    * Most important: the PC lacks exclusive games developed for the platform, utilizing its immersion advantage of a close display and mouse controls. I feel PC gamers are dominated by memories of Half-Life or Diablo, now complaining about console ports while not being heard.

    This last point is the one I fail to comprehend. The market is big enough, but only indies target it specifically, which defies the argument that game production is just too expensive, so cross-platform is the only option.
    It feels fragile, and the current chip shortages might indeed be enough to give PC gaming a death sentence after some further years of struggle.

    The obvious solution for now can only be to keep minimum specs low and make good games.
    One option I think is underutilized is platform-dependent tuning and design. I don't mean graphics options but gameplay itself. The term 'scaling' is currently meant only technically. That's not enough, I think.
     
    CeeGee likes this.
  19. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,211
    No, Capcom listed the 2060 and 6700 XT for minimum RT, then upgraded to the 2070 for recommended, implying it's roughly equal to the 6700 XT. They didn't go up to the 2080, the 3060 or anything else. Just the 2070.

    Anyway, Capcom thinks both the RTX 2070 and RX 6700 XT are suitable for 4K/45fps with RT.

    Only the 3070 and 6900 XT can do 4K60 with RT, according to Capcom.

    https://www.resetera.com/threads/resident-evil-village-system-requirements-released.397822/
     
    #2519 DavidGraham, Mar 22, 2021
    Last edited: Mar 22, 2021
  20. techuse

    Veteran

    Joined:
    Feb 19, 2013
    Messages:
    1,424
    Likes Received:
    908
    The increasing cost of PC hardware relative to consoles, combined with the decreasing level of improvement it offers, has to be a concern for PC tech companies.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.