AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

Intel said 4 million+ this year IIRC
AFAIR they did not specify whether those were discrete Arc cards still to come, or whether their mobile (and maybe even DG1) models count towards that number (which I hope they do not).

AMD really needs to improve there, all the more so now that Intel is also joining in and will probably take a piece of the GPU market. They will fall further and further behind technology-wise because of this; NV can do what they do because they have the resources (thanks to sales).
AMD is doing (very well) what every (public) company does: focusing on revenue, margin and their shareholders. And unfortunately there's much more money to be made by trying to meet the insatiable data center demand for (Epyc) CPUs than by making a few GPCMR members happy.
 
I think AMD would need to have a performance halo product to turn that around and they don't seem willing to go that far.
This doesn't really work. DIY buyers just buy nVidia:
  1. AMD/ATi releases a new fast'n'shiny gen first => let's wait for nVidia => buy nVidia
  2. nVidia releases a new fast'n'shiny gen first => buy nVidia

Besides, objectively AMD doesn't have the resources (R&D) to design top GPU products. They have to cut corners - in software (horrible drivers, amateurish compute stack), in technology (AI upscaling, RT implementation), etc.

AMD is a tiny GPU shop compared to nVidia. AMD's R&D / team size / budget gets split among many projects: mobile SoC, desktop/server SoC, embedded SoC, console SoC, and also gaming/compute GPUs. Yet they have fewer employees, less revenue, etc.

Just be glad they finally got the funds to ditch that GCN compute abomination and keep trying to innovate in a meaningful way.
 
AMD Radeon RX 6400 Tested on PCI-Express 3.0 | TechPowerUp
May 6, 2022
Recently, GPU vendors have been lowering the PCI-Express lane count of their entry-mainstream GPUs in a bid to lower the pin-count and reduce traces on the PCB.
...
By itself, PCI-Express 4.0 x4 seems like plenty of bandwidth, 8 GB/s per direction, until you realize that you also need a PCIe Gen 4.0 capable processor and motherboard to use it in Gen 4 mode—that's Ryzen 3000 Matisse, Ryzen 5000 Vermeer, Core Rocket Lake, and Core Alder Lake. Entry-level parts from both brands are relegated to PCI-Express Gen 3.0.

If you're experiencing déjà vu, it's because we recently tested PCI-Express scaling of the RX 6500 XT, the slightly bigger sibling of the RX 6400, and drew some interesting conclusions, mainly that PCI-Express Gen 3 bites off a big chunk of performance.

In my RX 6400 review, I suggested that the performance hit would be roughly similar, but several readers requested that I actually test this theory, so here we are. We are hence testing the RX 6400 in PCI-Express Gen 3 mode and evaluating its performance loss versus Gen 4, to see just how much it affects the target audience of the RX 6400: those with entry-level platforms that have PCI-Express Gen 3.
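
As a quick sanity check on the "8 GB/s per direction" figure quoted above, here's a minimal back-of-the-envelope sketch of raw PCIe link bandwidth, assuming the 128b/130b line encoding that Gen 3 and Gen 4 both use and ignoring packet/protocol overhead:

```python
# Per-direction PCIe link bandwidth, ignoring packet/protocol overhead.
# Gen 3 and Gen 4 both use 128b/130b line encoding; rates are GT/s per lane.
TRANSFER_RATE_GT_S = {"Gen3": 8.0, "Gen4": 16.0}
ENCODING_EFFICIENCY = 128 / 130  # payload bits per bit on the wire

def link_bandwidth_gb_s(gen: str, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s per direction."""
    return TRANSFER_RATE_GT_S[gen] * ENCODING_EFFICIENCY * lanes / 8  # bits -> bytes

for gen in ("Gen3", "Gen4"):
    print(f"{gen} x4: {link_bandwidth_gb_s(gen, 4):.2f} GB/s per direction")
# Gen3 x4: ~3.94 GB/s, Gen4 x4: ~7.88 GB/s -- roughly the "8 GB/s" quoted above,
# which is why running an x4 card on a Gen 3 platform halves its bus bandwidth.
```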
 
Once upon a time, ATI was much bigger than NVidia.

A fun market share chart over here. I think it was R300 that got them ahead of NVidia for a while, ages ago. Unfortunately NVidia hasn't completely dropped the ball since then.
https://forum.beyond3d.com/posts/1916435/
 
I think there's been a substantial shift in the "culture" and ecosystem of PC gaming, which Nvidia helped cultivate -- and profit from -- which will ensure that AMD/ATI will remain with a minority share of the GPU market, no matter how good of a GPU they release. At least for the foreseeable future, IMO.

Back when ATI was around, the PC gaming and hardware tinkering community was much more exclusive. Sure you had NV and ATI fanboys; but they were on the fringes, and most enthusiasts had no brand loyalty and just purchased whatever piece of hardware was best reviewed. If someone purchased a Radeon 9700, then there was a high chance they would purchase a GeForce 6800 GT or 7600 GT the next generation. Likewise, if someone purchased a GeForce 6600 GT, then they would probably purchase a Radeon X1900 XT or X1950 Pro the next time they went hardware shopping. There was very little persistence in a particular hardware vendor -- like with CPUs.

But I think a lot of this changed with the broadening of PC gaming/hardware, and with initiatives like the Nvidia TWIMTBP program. You had a lot of first-time buyers and builders of PCs with very little knowledge of the PC hardware landscape, and so they would rely on advertising campaigns and endorsements to drive their purchasing decisions. In a way, PC hardware -- particularly GPUs -- has become very Apple iPhone-like: a consumer buying their first smartphone will likely purchase an iPhone based on clever advertising and word of mouth. Once they purchase their iPhone, they are then likely to purchase another iPhone when they upgrade because of the perceived value of remaining an Apple iOS user. Nvidia does a lot of this with things like DLSS, RT, NVENC, AI-accelerated voice/camera features, GameWorks, and so on. That's not to say that these things don't have value -- Apple puts effort into ensuring that iOS does in fact have a lot of "creature comforts" over Android -- but most enthusiasts would probably be willing to bypass these features in order to get a more general set of improvements by jumping to another platform. That is, enthusiasts are comfortable switching back and forth between, say, Nvidia and AMD, or Intel and AMD, or iOS and Android, depending on whatever hardware/software cycle is available when they're looking to make a purchase.

Tl;dr: Nvidia have built up an Apple-like reputation amongst PC gamers.
 
The market is much bigger now, but brand loyalty has always been a thing. I remember plenty of people blindly buying GeForce FX cards based on that, even when they were slower than their previous GeForce 4 (i.e. "upgrading" from a Ti 4xxx to an FX 5600). Buy a Radeon? No way! They'd had a Rage or a previous Radeon, and it had unbelievably, verifiably terrible drivers, and the company didn't seem to give a shit (and really didn't seem to until well into the R300 years). Or green is just prettier. Who can say.

Nvidia has simply been better at building a brand and coming up with initiatives with perceived and actual value that cultivate loyalty. Probably should single out devrel as super extra notable there. I am looking forward to seeing what Intel does to warp minds. So far not very impressive though.

So yeah I see it the same way for the most part. ;)
 

I dunno man. That doesn't quite line up with historical data. IIRC, in the mid-2000s, ATI was beating (or tied with) Nvidia in GPU market share. I noted that both sides had their loyal followers; the broader market was willing to switch sides depending on who had the better part, though.

https://www.gamesindustry.biz/articles/ati-edges-past-nvidia-in-graphics-market-share

Nvidia has simply been better at building a brand and coming up with initiatives with perceived and actual value that cultivate loyalty. Probably should single out devrel as super extra notable there. I am looking forward to seeing what Intel does to warp minds. So far not very impressive though.

So yeah I see it the same way for the most part. ;)

Yeah, this is the “Apple hypothesis” I suggested.

I am interested to see how Intel fares though, yes.
 
Those were just some fun first-hand interactions with people that I recall. :D

Check out this link to an old chart I posted earlier. That early D3D9 era was the last time ATI/AMD was indeed ahead. It seems like that was because it became clear to the public that GeForce FX was not a good product, which hurt the brand a bit.
https://forum.beyond3d.com/posts/1916435/
 
If somehow demand were to drop precipitously, maybe they would fight for market share with a price war like 2008-2010. Those were good times.

But when they can easily sell everything they make yeah we are going to pay.
 
Wow, TPU clocked 2600 MHz on the MSI 6950 XT. That's a huge clock advantage over the 3090 Ti, yet performance is very similar. Is RDNA 2 bandwidth-bound at those clocks?
 
Isn't it just 35 watts more compared to the 6900 XT?
That's the reference design, which adds only very modest improvements over the regular 6900 XT. Most reviews are done with the Sapphire Nitro+ or the MSI Trio, and those have power figures very close to the 3090 Ti's.

[Chart: power consumption, gaming]


https://www.techpowerup.com/review/msi-radeon-rx-6950-xt-gaming-x-trio/36.html
 
AMD follows Nvidia in releasing refresh GPUs solely to increase prices.

I don't think prices are ever going back down in upcoming GPUs.

Perf (or at least function) per price is still going to improve. Unless they feel attrition losses and newcomers are enough to drive the market, they need to convince existing customers to actually upgrade and those on the sidelines to buy in. Not to mention that at some point in the next GPU generation cycle, console availability will likely completely stabilize. I do feel there's a good chance that, at least in the short if not medium term, GPU prices have stabilized until the next market "shock" occurs.

Also, a lot of this seems to be based on perspective in terms of consumer psychology. Since this is an AMD thread, let's say Navi 33 ships as the 7600 XT with 8 GB at $500 but has the same performance as the 6900 XT, with better RT performance. Did prices go up or down? We can even throw the power consumption angle into that as well and say it'll be 250 W TGP; did power consumption go up or down? The price (and power) for AMD's smallest die and x600 XT class product did go up. I know some people are purely fixated on this, but you are getting more performance for the same price (or the same performance for a cheaper price). Granted, the stall in cost reduction for DRAM over recent years throws a wrench into the comparison.
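
To make the "did prices go up or down" question concrete, here's a minimal sketch using the numbers above. The $500 / 250 W Navi 33 figures are pure speculation from the paragraph, the 6900 XT is taken at its $999 launch MSRP and 300 W reference TBP, and performance is set equal for both per the premise:

```python
# Hypothetical perf-per-dollar / perf-per-watt comparison; the "7600 XT"
# numbers are speculative, the 6900 XT figures are its launch MSRP and TBP.
cards = {
    "6900 XT (launch)":       {"perf": 1.0, "price_usd": 999, "power_w": 300},
    "7600 XT (hypothetical)": {"perf": 1.0, "price_usd": 500, "power_w": 250},
}

for name, c in cards.items():
    perf_per_dollar = c["perf"] / c["price_usd"]
    perf_per_watt = c["perf"] / c["power_w"]
    print(f"{name}: {perf_per_dollar * 1000:.2f} perf per $1000, "
          f"{perf_per_watt * 100:.2f} perf per 100 W")
# Same performance for roughly half the price and less power: perf/$ and perf/W
# both improve, even though the sticker price of the "x600 XT" tier went up.
```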
 
That's the reference design, which adds only very modest improvements over the regular 6900 XT. Most reviews are done with the Sapphire Nitro+ or the MSI Trio, and those have power figures very close to the 3090 Ti's.

Yeah ok, that's pretty wild.

I can imagine that in the near future graphics cards will need their own energy rating system in the EU, like we have for TVs, refrigerators, washing machines, etc.
 