AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

Discussion in 'Architecture and Products' started by BRiT, Oct 28, 2020.

  1. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,708
    Likes Received:
    2,132
    Location:
    London
    I don't have any sales data...

Well @pharma found some deficit data for the RTX 3050. Certainly not all roses, with a worst case of a 9% drop reported at 1080p.
     
  2. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,210
  3. Picao84

    Veteran

    Joined:
    Feb 15, 2010
    Messages:
    2,109
    Likes Received:
    1,195
Are you seriously comparing a 9% drop to a 90% drop? By that assessment the RTX 3050 could be 10x more expensive than the RX 6500 XT...
     
    DavidGraham and pharma like this.
  4. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
?? In the linked video, HUB explicitly says "we see absolutely no change for the RTX 3050" when commenting on PCIe scaling at roughly 14:05.
    Computerbase comes up within the margin of error as well: https://www.computerbase.de/2022-01...3/#abschnitt_benchmark_mit_pcie_40_vs_pcie_30
TPU reports a 1-2% difference: https://www.techpowerup.com/review/gigabyte-geforce-rtx-3050-gaming-oc/40.html

    And when it comes to inverse cherry picking, how about applying that to the 6500 XT as well?
     
  5. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,708
    Likes Received:
    2,132
    Location:
    London
    If you run a game within the memory requirements supported for the card and it gets slower with PCI Express 3.0 versus 4.0, that's cherry picking?
     
    digitalwanderer likes this.
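The PCIe 3.0 vs 4.0 gap being argued over here ultimately comes down to raw link bandwidth, and the 6500 XT's narrow x4 link is what makes it uniquely sensitive. A back-of-the-envelope sketch (the per-lane rates and Navi 24's x4 width are standard spec figures; the script itself is just illustrative arithmetic):

```python
# Rough PCIe link-bandwidth arithmetic for the cards under discussion.
# Per-lane rates are the PCIe spec figures in GB/s after encoding overhead;
# Navi 24 (RX 6500 XT) exposes only a x4 link.

PER_LANE_GBPS = {
    "PCIe 3.0": 8 * 128 / 130 / 8,   # 8 GT/s, 128b/130b encoding ~= 0.985 GB/s
    "PCIe 4.0": 16 * 128 / 130 / 8,  # 16 GT/s ~= 1.969 GB/s
}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Total one-way link bandwidth in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# RX 6500 XT: x4 link, so dropping to a PCIe 3.0 board halves its bandwidth.
for gen in PER_LANE_GBPS:
    print(f"RX 6500 XT ({gen} x4): {link_bandwidth(gen, 4):.1f} GB/s")

# A full-width x16 card on PCIe 3.0 still has ~15.8 GB/s, which is why
# most cards barely notice the generation difference.
print(f"x16 card (PCIe 3.0): {link_bandwidth('PCIe 3.0', 16):.1f} GB/s")
```

So a PCIe 3.0 slot leaves the 6500 XT with roughly 3.9 GB/s each way instead of 7.9 GB/s, which only matters once the 4GB VRAM pool overflows and the card starts spilling over the bus.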
  6. Picao84

    Veteran

    Joined:
    Feb 15, 2010
    Messages:
    2,109
    Likes Received:
    1,195
So reviews should only look at older games at resolutions under 1080p? In a performance review in 2022? No one cherry picked anything; all reviews used the same methods and definitions they always do...
     
  7. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
There are benchmarks showing the 6500 XT doing worse than the 4GB RX 570 and 5500 XT, both of which are PCIe 3.0 cards. There's really no excuse that can be made for the 6500 XT, no matter what settings are used.

     
    T2098, Lightman, sir doris and 4 others like this.
  8. Putas

    Regular

    Joined:
    Nov 7, 2004
    Messages:
    737
    Likes Received:
    354
    This is why I said practical. Most of the difference comes from settings where the frame-rate is unplayable anyway.
     
  9. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
Using a single outlier result out of all the tests done is cherry picking.

    edit:
Look, I've got proof that you need PCIe 3.0 for full performance on the RTX 3050. (j/k of course, but I guess you see what I mean.)
     
    #3489 CarstenS, Jan 28, 2022
    Last edited: Jan 28, 2022
  10. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
I disagree. Flipping through the 1080p results in the link, there are 4 games that run below or just above 30 fps (Valhalla, Cyberpunk, RDR2, Watch Dogs Legion), which I would count as unplayable, but plenty of games at or above 50 fps.
    I don't think comparisons at that perf level are purely academic.

edit: And yes, I would agree that picking data points from the 4K results is pointless in this case. But going from 40.8 fps in Deathloop at 1080p down to 30.4 fps is very relevant, for example. As is 51 vs 28.6 fps in Detroit: Become Human, or 75.1 vs 53.5 fps in Doom Eternal, which already feels laggy at that frame rate.
     
    #3490 CarstenS, Jan 28, 2022
    Last edited: Jan 28, 2022
    pharma, DavidGraham and Picao84 like this.
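For reference, the fps pairs quoted in that post work out to sizable percentage drops; this is just the arithmetic on the numbers above:

```python
# PCIe 4.0 -> 3.0 fps pairs quoted in the post above; the drop is
# straightforward percentage arithmetic on those figures.
pairs = {
    "Deathloop (1080p)": (40.8, 30.4),
    "Detroit: Become Human": (51.0, 28.6),
    "Doom Eternal": (75.1, 53.5),
}

for game, (pcie4_fps, pcie3_fps) in pairs.items():
    drop_pct = (1 - pcie3_fps / pcie4_fps) * 100
    print(f"{game}: {pcie4_fps} -> {pcie3_fps} fps ({drop_pct:.0f}% drop)")
# Roughly 25%, 44% and 29% respectively.
```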
  11. Putas

    Regular

    Joined:
    Nov 7, 2004
    Messages:
    737
    Likes Received:
    354
Too bad they don't have frame pacing data; I expect the dips are too big even on PCIe 4.0.
     
  12. arandomguy

    Regular Newcomer

    Joined:
    Jul 27, 2020
    Messages:
    251
    Likes Received:
    355
The other thing to consider here is that reviewers tend to test at all-max settings (well, except RT, but let's not go there :twisted:) even though individual setting adjustments are available. If memory pressure is causing most of the performance drop, then this particular card would need to sacrifice the more memory-sensitive settings (typically texture quality) more heavily than other settings to bring performance back up, compared to other cards. At the same time, we could also debate whether those max texture settings are warranted at this tier.

I feel, though, that in general this is an example of how the traditional review format can reflect academic performance without necessarily conveying actual usability.

On a semi-related note: has anyone looked into whether PCIe limitations affect games that rely more heavily on streaming data to manage memory load, in terms of things such as increased texture pop-in?
     
    Silent_Buddha likes this.
  13. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,987
    Likes Received:
    3,529
    Location:
    Winfield, IN USA
    Then get a 6600 by all means! A much better card than either the 3050 or the 6500.

    The only reason the 6500 is still available near MSRP is because of what a piece of shit card it is, even non-PC experts have been hearing about what rubbish it is. The 3050, on the other hand, is rising in price because of the value it offers.

    It's not proof the 6500 is better because it's cheaper or available, quite the opposite.

Hey! What are you talking about? Granted I undervolted my 580, but it rarely goes over 120W even in extreme cases. Is ~200W normal for my card?
     
    sir doris and DavidGraham like this.
  14. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,210
Please let's go there. RT is coming, and coming hard. If the Matrix demo released on PC tomorrow, the 6500 XT wouldn't even be able to run it, while an Xbox Series S can run it perfectly. Let that sink in for a moment. An RTX 3050, on the other hand, would run it easily.
     
    PSman1700 and Picao84 like this.
  15. Putas

    Regular

    Joined:
    Nov 7, 2004
    Messages:
    737
    Likes Received:
    354
The 180W of the reference card is close enough.
     
  16. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,451
    Likes Received:
    471
Every motherboard released in the last 2-3 years has PCIe 4.0, so every new system has at least a PCIe 4.0 slot. Current Intel-based systems even have PCIe 5.0. It doesn't make sense to build a new PC on a PCIe 3.0 platform if you care about 3D performance.

As for existing PCIe 3.0 systems, I can't see a reason why any owner (of a system with a dedicated graphics card) would upgrade to the lowest-end part of the current generation. That doesn't make sense to me. Most users upgrade their GPUs because they want more performance, and lowest-end parts were never meant to be a performance upgrade. Have you ever upgraded to a GeForce GT 710? Or to a Radeon HD 6350? Most of these GPUs went into new low-end systems, and some were bought as a back-up GPU in case of an RMA of the current graphics card.

For new systems, PCIe 3.0 is not an issue (they have PCIe 4.0/5.0). On existing PCIe 3.0 systems, something like 99% of gamers have no reason to upgrade, either because they already have a faster GPU or because the performance difference is not significant enough (GTX 1650, 1050 Ti, GTX 1060, RX 470, RX 570…). The "PCIe 3.0 problem" is in fact a niche scenario, a theoretical issue. More than 90% of Navi 24 GPUs manufactured will go into PCIe 4.0/5.0 systems.
     
  17. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,418
    Likes Received:
    10,311
    Man, it's almost like everyone thinks that there aren't a lot of people out there with a hard budget on how much they can spend on things like this. :p

    Is the 6500 XT a great card? Nope. But then neither is the 3050. If you could get both at MSRP the 3050 is obviously the better card. Unfortunately, unless you're a scalper with automated tools, good luck getting a 3050 at MSRP.

    If someone held a gun to my head and said, buy a graphics card now that you'll actually use or I'm going to shoot you in the head? I'm buying the 6500 XT. It is far and away the better deal compared to the RTX 3050.

I can currently buy a 6500 XT on Amazon for 279 USD. Good luck with the 3050. Let's see, if I go to Newegg I can find the 3050 for the "bargain" price of 599.99 USD. Oh goodie.

Worst case, in certain games the 3050 is slightly more than 2x faster than the 6500 XT when the 6500 XT is limited to a PCIe 3.0 interface.

    Hmmm, but I have PCIE 4.0 in my machine. So on average the 3050 will be ~25% faster than the 6500 XT.

    So, I can get the 6500 XT for under 300 USD. Or I can pay over 2x as much for a 3050 which will give me on average 25% more performance. Whoopdie F-ing do.

    Is the 3050 worth twice the price of the 6500 XT? IMO, not even remotely close.

Now, if both were being sold at MSRP and the 3050 was only 25% more (199 USD versus 249 USD)? Hell yeah I'd get a 3050 instead of a 6500 XT. Unfortunately, the reality of current market conditions makes the 6500 XT a significantly better buy.

    However, since nobody is holding a hypothetical gun to my head, I'm not buying either card because both cards are shite for how much you have to pay for them.

    Regards,
    SB
     
    Lightman, tsa1, Jawed and 5 others like this.
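The value comparison in that post can be made explicit with the street prices and the ~25% average performance gap it quotes (all figures from the post; normalising to performance per dollar is just one way to frame it):

```python
# Price/performance arithmetic for the street prices quoted above:
# 279 USD for the 6500 XT, 599.99 USD for the 3050, with the 3050
# ~25% faster on average on a PCIe 4.0 system (all figures from the post).

price_6500xt = 279.00
price_3050 = 599.99
relative_perf_3050 = 1.25  # 3050 ~25% faster on average

price_ratio = price_3050 / price_6500xt

# Performance per dollar, with the 6500 XT's performance normalised to 1.0.
value_6500xt = 1.0 / price_6500xt
value_3050 = relative_perf_3050 / price_3050

print(f"3050 costs {price_ratio:.2f}x as much")                    # ~2.15x
print(f"3050 perf-per-dollar vs 6500 XT: {value_3050 / value_6500xt:.2f}")  # ~0.58
```

In other words, at those street prices the 3050 delivers a bit under 60% of the 6500 XT's performance per dollar, which is the crux of the argument above.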
  18. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,210
That's not true; Intel was late to the PCIe 4.0 party, only migrating to it a year ago.

It's the start of a new console cycle. People with very old cards (Fermi, Kepler, Maxwell, GCN1, GCN2, GCN3) need to upgrade to play the latest DX12/Vulkan exclusive titles even at the lowest quality settings; console ports have upped the requirements significantly during the past two years. And with GPU shortages and high prices, lowest-end cards are the only viable option for many people. Also, many people upgrade from the lowest end of a very old GPU family to the lowest end of a new GPU family, for example from a GTX 750 Ti to a GTX 1050 or GTX 1650. This is a common practice among PC gamers; just look at the Steam hardware survey, where the 1050 was the second most common GPU before being replaced by the 1650.

Yeah, of course, then spend the next 3 years lowering settings to their absolute lowest (in next-gen titles) to avoid the 4GB bottleneck, turn off RT completely, and avoid RT-dependent games entirely! What great LOGIC indeed! It's like the concept of future-proofing your purchase is entirely lost on you. $300 down the drain is certainly worse than $500 that will last you MUCH longer and expand your gaming options. But to each his own, I guess.
     
  19. techuse

    Veteran

    Joined:
    Feb 19, 2013
    Messages:
    1,424
    Likes Received:
    908
HUB uses reasonable settings for the performance level of low-end cards. The 6500 XT is just one of the worst video cards released in a decade or more.
     
  20. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,418
    Likes Received:
    10,311
    So basically the same thing I'd be doing on the 3050. Gotcha. :)

    Only I'd be out less than 300 USD for the 6500 XT so wouldn't feel all that bad. OTOH - I could do the same thing on a 600 USD card and be feeling much worse about it.

    Especially if prices come back down to earth in the next year or so and I end up replacing it within a year or so.

And, of course, you completely missed the point that BOTH OF THEM ARE BAD CARDS at the price you have to pay to get one right now. Which goes back to: would I feel as bad if I only spent 279 USD versus 599 USD for a card that isn't even remotely worth that much?

    The answer to that? Don't buy either one.

    Regards,
    SB
     
    Lightman and Jawed like this.