AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

Only very small SIs (system integrators) buy through the channel. Is that a market large enough to launch a whole SKU for? I don't know, maybe. It also depends on whether AM5 gets an IGP or not.
I don't have any sales data...

Since you brought that up for the 2nd time, it seems you want to talk about it, so: how large is the performance deficit for the 8GB 3050 with its x8 interface going from PCIe 4.0 to 3.0, and how does that compare to the 4GB 6500 with x4?
Well @pharma found some deficit data for the RTX 3050. Certainly not roses, with a worst case reported of 9% at 1080p.
 
And this is the PCIE scaling of the RTX 3080.

[Image: TPU chart, RTX 3080 relative performance at 3840×2160 across PCIe generations]


https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-pci-express-scaling/27.html
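For context on why the x4 card is hit harder by the generation drop, here is a quick back-of-the-envelope comparison of raw link bandwidths. The per-lane transfer rates and 128b/130b encoding are per the PCIe 3.0/4.0 specs; real-world throughput is lower due to protocol overhead, so treat these as upper bounds, not measurements.

```python
# Back-of-the-envelope PCIe link bandwidth (GB/s, one direction).
# Accounts only for 128b/130b line encoding, not packet/protocol overhead.
GT_PER_S = {"3.0": 8.0, "4.0": 16.0}   # transfer rate per lane (GT/s), per spec
ENCODING = 128 / 130                    # 128b/130b encoding efficiency

def link_bandwidth_gbps(gen: str, lanes: int) -> float:
    """Raw one-direction bandwidth in GB/s for a PCIe link."""
    return GT_PER_S[gen] * ENCODING * lanes / 8  # /8 converts bits to bytes

for card, lanes in [("RX 6500 XT (x4)", 4), ("RTX 3050 (x8)", 8)]:
    for gen in ("3.0", "4.0"):
        print(f"{card} @ PCIe {gen}: {link_bandwidth_gbps(gen, lanes):.2f} GB/s")
```

The takeaway: the 6500 XT's x4 link at PCIe 3.0 (~3.9 GB/s) has half the bandwidth of the 3050's x8 link at the same generation (~7.9 GB/s), which is why a 4GB card that spills over the bus feels the downgrade much more.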
 
Well @pharma found some deficit data for the RTX 3050. Certainly not roses, with a worst case reported of 9% at 1080p.
?? In the linked video, HUB explicitly says "we see absolutely no change for the RTX 3050" when commenting on PCIe scaling at roughly 14:05.
Computerbase comes up within the margin of error as well: https://www.computerbase.de/2022-01...3/#abschnitt_benchmark_mit_pcie_40_vs_pcie_30
[Image: Computerbase PCIe 4.0 vs. 3.0 benchmark chart]
TPU reports a 1-2% difference: https://www.techpowerup.com/review/gigabyte-geforce-rtx-3050-gaming-oc/40.html

And when it comes to inverse cherry picking, how about applying that to the 6500 XT as well?
 
If you run a game within the memory requirements supported for the card and it gets slower with PCI Express 3.0 versus 4.0, that's cherry picking?
Using a single outlier result out of all the tests done is.

edit:
Look, I've got proof that you need PCIe 3.0 for full performance on the RTX 3050. (j/k of course, but I guess you see what I mean).
[Image: attached benchmark screenshot]
 
This is why I said practical. Most of the difference comes from settings where the frame-rate is unplayable anyway.
I disagree. Flipping through the 1080p results in the link, there are 4 games that run below or just above 30 fps (Valhalla, Cyberpunk, RDR2, Watch Dogs Legion) which I would count as unplayable. But plenty of games at or above 50 fps.
I don't think comparisons at that perf level are purely academic.

edit: And yes, I would agree that picking data points from 4K results are pointless in this case. But coming from 40.8 fps in Deathloop 1080p and going down to 30.4 fps is very relevant for example. As is the 51 vs 28.6 fps in Detroit Become Human or 75.1 to 53.5 fps in Doom Eternal, which feels laggy with that kind of fps rate already.
 
I disagree. Flipping through the 1080p results in the link, there are 4 games that run below or just above 30 fps (Valhalla, Cyberpunk, RDR2, Watch Dogs Legion) which I would count as unplayable. But plenty of games at or above 50 fps.
I don't think comparisons at that perf level are purely academic.

The other thing to consider here is that reviewers tend to test at all-max settings (well, except RT, but let's not go there :devilish:), while individual setting adjustments are available. If memory pressure is causing most of the performance drop, then this particular card would need to sacrifice memory-sensitive settings (typically texture quality related) relatively more than other settings to bring performance up compared to its competitors. At the same time, we could also debate whether those max texture settings are appropriate in the first place.

I feel, though, that in general this is an example of how the traditional review format can reflect academic performance without necessarily conveying actual usability.

On a semi-related note: has anyone looked into whether PCIe limitations affect games that rely more heavily on streaming data to manage memory load, for example causing more texture pop-in?
 
Where can you get GeForce RTX 3050 for just $50 more than Radeon RX 6500 XT?

Newegg has the cheapest in-stock Radeon RX 6500 XT for $259 and the cheapest in-stock GeForce RTX 3050 for $639.
In Europe it's slightly better for the GeForce (I saw one for €499), but the Radeon starts at €279-289, so the GeForce is still almost twice as expensive.

You can get a Radeon RX 6600 cheaper than a GeForce RTX 3050. It's 30-35% faster, cheaper, and has lower power consumption. You can even get a Radeon RX 6600 XT (50% more performance) for that price.
Then get a 6600 by all means! A much better card than either the 3050 or the 6500.

The only reason the 6500 is still available near MSRP is because of what a piece of shit card it is, even non-PC experts have been hearing about what rubbish it is. The 3050, on the other hand, is rising in price because of the value it offers.

It's not proof the 6500 is better because it's cheaper or available, quite the opposite.

I would not opt for a ~200 Watt card only because of abstract fears of the bus or the memory capacity not being adequate in practical scenarios.
Hey! What are you talking about? Granted, I undervolted my 580, but it rarely goes over 120 W even in extreme cases. Is ~200 W normal for my card?
 
See above: obtaining full performance out of the 6500 XT requires a PCIe 4.0 motherboard (an added cost), which is a laughable situation for a dirt-slow entry-level GPU; not even the RTX 3090 needs that kind of hassle.
Every motherboard released in the last 2-3 years has PCIe 4.0, so every new system has at least a PCIe 4.0 slot. Current Intel-based systems even have PCIe 5.0. It doesn't make sense to build a new PC on a PCIe 3.0 platform if you care about 3D performance.

As for existing PCIe 3.0 systems, I can't see a reason why any owner (of a system with a dedicated graphics card) would upgrade to the lowest-end part of the current generation. That doesn't make sense to me. Most users upgrade their GPUs because they want more performance, and lowest-end parts were never meant to be a performance upgrade. Have you ever upgraded to a GeForce GT 710? Or to a Radeon HD 6350? Most of these GPUs went into new low-end systems, and some were bought as a back-up GPU in case of an RMA of the current graphics card.

For new systems PCIe 3.0 is not an issue (they have PCIe 4.0/5.0). As for existing PCIe 3.0 systems, something like 99% of those gamers have no reason to upgrade, either because they already have a faster GPU or because the performance difference is not significant enough (GTX 1650, 1050 Ti, GTX 1060, RX 470, RX 570…). The "PCIe 3.0 problem" is in fact a niche, largely theoretical issue: >90% of Navi 24 GPUs manufactured will go into PCIe 4.0/5.0 systems.
 
Man, it's almost like everyone thinks that there aren't a lot of people out there with a hard budget on how much they can spend on things like this. :p

Is the 6500 XT a great card? Nope. But then neither is the 3050. If you could get both at MSRP the 3050 is obviously the better card. Unfortunately, unless you're a scalper with automated tools, good luck getting a 3050 at MSRP.

If someone held a gun to my head and said, buy a graphics card now that you'll actually use or I'm going to shoot you in the head? I'm buying the 6500 XT. It is far and away the better deal compared to the RTX 3050.

I can currently buy a 6500 XT on Amazon for 279 USD. Good luck with the 3050. Let's see, if I go to Newegg I can find the 3050 for the "bargain" price of 599.99 USD. Oh goodie.

Worst case scenario, cherry-picking certain games, the 3050 is slightly more than 2x faster than the 6500 XT when the 6500 XT is limited to a PCIe 3.0 interface.

Hmmm, but I have PCIE 4.0 in my machine. So on average the 3050 will be ~25% faster than the 6500 XT.

So, I can get the 6500 XT for under 300 USD. Or I can pay over 2x as much for a 3050 which will give me on average 25% more performance. Whoopdie F-ing do.

Is the 3050 worth twice the price of the 6500 XT? IMO, not even remotely close.

Now, if both were being sold at MSRP and the 3050 was only 25% more (199 USD versus 249 USD)? Hell yeah I'd get a 3050 instead of a 6500 XT. Unfortunately, the reality of current market conditions makes the 6500 XT a significantly better buy.
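Putting the argument above in numbers: using the street prices quoted in this post ($279 for the 6500 XT, $599 for the 3050) and the ~25% average performance advantage claimed for the 3050, a rough perf-per-dollar comparison looks like this. These are the thread's own illustrative figures, not benchmark data.

```python
# Rough perf-per-dollar, using the 6500 XT as the 1.00 performance baseline
# and the ~25% average advantage for the 3050 quoted in the post above.
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    return relative_perf / price_usd

rx6500xt       = perf_per_dollar(1.00, 279)  # street price
rtx3050_street = perf_per_dollar(1.25, 599)  # street price
rtx3050_msrp   = perf_per_dollar(1.25, 249)  # hypothetical MSRP scenario

print(f"6500 XT @ $279: {rx6500xt:.4f} perf/$")
print(f"3050   @ $599: {rtx3050_street:.4f} perf/$")
print(f"3050   @ $249: {rtx3050_msrp:.4f} perf/$")
```

At street prices the 6500 XT comes out well ahead on perf-per-dollar; at MSRP the ranking flips, which is exactly the conclusion drawn above.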

However, since nobody is holding a hypothetical gun to my head, I'm not buying either card because both cards are shite for how much you have to pay for them.

Regards,
SB
 
Every motherboard released in last 2-3 years has PCIe 4.0
That's not true; Intel was late to the PCIe 4.0 party, only migrating to it a year ago.

As for existing PCIe 3.0 systems, I can't see a reason, why should any owner (of a system with dedicated graphics card) upgrade to the lowest-end part of current generation.
It's the start of a new console cycle; people with very old cards (Fermi, Kepler, Maxwell, GCN1, GCN2, GCN3) need to upgrade to play the latest DX12/Vulkan-exclusive titles even at the lowest quality settings, since console ports have upped the requirements significantly over the past two years. And with GPU shortages and high prices, lowest-end cards are the only viable option for many people. Also, many people upgrade from the lowest end of a very old GPU family to the lowest end of a new one, for example from a GTX 750 Ti to a GTX 1050 or GTX 1650. This is a widely common practice among PC gamers; just look at the Steam hardware survey, where the 1050 was the second most common GPU before being replaced by the 1650.

So, I can get the 6500 XT for under 300 USD. Or I can pay over 2x as much for a 3050 which will give me on average 25% more performance. Whoopdie F-ing do.
Yeah, of course, then spend the next 3 years lowering settings to their absolute lowest (in next-gen titles) to avoid the 4GB bottleneck, turn off RT completely, and avoid RT-dependent games entirely! What great LOGIC indeed! It's like the concept of future-proofing your purchase is entirely lost on you. $300 down the drain is certainly worse than $500 that will last you MUCH longer and expand your gaming options. But to each his own, I guess.
 
Yeah, of course, then spend the next 3 years lowering settings to their absolute lowest (in next-gen titles) to avoid the 4GB bottleneck, turn off RT completely, and avoid RT-dependent games entirely! What great LOGIC indeed! It's like the concept of future-proofing your purchase is entirely lost on you. $300 down the drain is certainly worse than $500 that will last you MUCH longer and expand your gaming options. But to each his own, I guess.

So basically the same thing I'd be doing on the 3050. Gotcha. :)

Only I'd be out less than 300 USD for the 6500 XT so wouldn't feel all that bad. OTOH - I could do the same thing on a 600 USD card and be feeling much worse about it.

Especially if prices come back down to earth and I end up replacing it within a year or so.

And, of course, you completely missed the point that BOTH OF THEM ARE BAD CARDS at the price you have to pay to get one right now. Which goes back to: would I feel as bad if I only spent 279 USD versus 599 USD for a card that isn't even remotely worth that much?

The answer to that? Don't buy either one.

Regards,
SB
 