AMD Execution Thread [2023]

It seems patently obvious to me that if AMD could have created a profitable product competitive with the 4090, it would have done so. There's clearly a market for such products (Nvidia is selling a bunch of 4090s). Why would AMD not sell to the ultra high-end if AMD could compete there? It makes no sense. None of us should assume that AMD's executives and managers, who have proven very competent over the past several years, are incompetent. Giving them the benefit of the doubt, we must assume that AMD assessed their technology and the relevant metrics - costs, power consumption, performance, features, market size and share, etc. - and concluded they could not make a profitable product in the 4090's product segment.

It is a disservice to AMD's engineers and business people to accuse the company of "not trying" and it is nonsensical to argue that AMD arbitrarily decided "it wasn't worth the hassle" to compete in the ultra high-end. Indeed, if AMD was "not trying" or did not make an educated decision to stay out of the ultra high-end, then its executives are negligent. But, I have a higher opinion of AMD than to conclude the company capriciously abandoned the ultra-high end. Rather, AMD is competing where it reasonably believes it can compete, and it is not competing where it reasonably believes it cannot compete, like in the market serviced by the 4090.
 
What does that even mean? Something isn't "worth the hassle" when it won't recoup the investment made into it. Why would a product not recoup that investment? Because it wouldn't sell. Why would something not sell? Because it wouldn't be good enough to compete with other products in the same market. It's pretty easy to decipher.

Good enough means more than just silicon that pumps out similar fps. The entire product, including features and ecosystem, matters. This is where Nvidia has been kicking their butt for years on end now. Even if they could build a $1500 GPU, there's no guarantee people would buy it in enough volume to recoup the investment.
 
it is nonsensical to argue that AMD arbitrarily decided "it wasn't worth the hassle" to compete in the ultra high-end.

Who said it was arbitrary? In order to determine it’s not worth the hassle they would obviously have done some basic ROI analysis of the market opportunity.
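As a purely illustrative aside, here's a minimal sketch of what that kind of back-of-envelope ROI math looks like - every figure below (program cost, unit volume, ASP, margin) is an invented assumption for illustration, not an actual AMD number:

```python
# Toy ROI model for a hypothetical halo-GPU program.
# All inputs are invented assumptions for illustration only.

def roi(dev_cost: float, units: int, asp: float, gross_margin: float) -> float:
    """ROI = (gross profit - development cost) / development cost."""
    gross_profit = units * asp * gross_margin
    return (gross_profit - dev_cost) / dev_cost

# e.g. an assumed $500M program at a $1,500 ASP and 33% gross margin
# needs roughly a million units just to break even:
print(f"{roi(dev_cost=500e6, units=1_000_000, asp=1500.0, gross_margin=0.33):+.2f}")
# -> -0.01 (slightly under break-even at these made-up numbers)
```

The point being: at halo-product volumes, even small misses on volume or margin flip the whole program negative.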
 
It seems patently obvious to me that if AMD could have created a profitable product competitive with the 4090, it would have done so. There's clearly a market for such products (Nvidia is selling a bunch of 4090s). Why would AMD not sell to the ultra high-end if AMD could compete there? It makes no sense. None of us should assume that AMD's executives and managers, who have proven very competent over the past several years, are incompetent. Giving them the benefit of the doubt, we must assume that AMD assessed their technology and the relevant metrics - costs, power consumption, performance, features, market size and share, etc. - and concluded they could not make a profitable product in the 4090's product segment.

It is a disservice to AMD's engineers and business people to accuse the company of "not trying" and it is nonsensical to argue that AMD arbitrarily decided "it wasn't worth the hassle" to compete in the ultra high-end. Indeed, if AMD was "not trying" or did not make an educated decision to stay out of the ultra high-end, then its executives are negligent. But, I have a higher opinion of AMD than to conclude the company capriciously abandoned the ultra-high end. Rather, AMD is competing where it reasonably believes it can compete, and it is not competing where it reasonably believes it cannot compete, like in the market serviced by the 4090.

Technically, it would have been possible to develop a GPU with specs that compete with theirs (NVIDIA's). However, a GPU developed that way would have come to market as a graphics card with a TDP (thermal design power) of 600W and a reference price of $1,600 (about 219,000 yen). After considering whether general PC gaming fans would accept that, we chose not to adopt such a strategy.

The RDNA 3-based GPU "Radeon RX 7900XTX" released this time was developed to target $999 (about 136,000 yen), which we consider the upper price that high-end users among general PC gaming fans will accept. The "Radeon RX 7900XT" below it is priced at $699 (about 95,000 yen).

The pricing strategy is the same as with the previous RDNA 2 generation (Radeon RX 6000 series), where the top-end "Radeon RX 6900XT" and "Radeon RX 6800XT" targeted $999 and $699, respectively. However, the target price is set anew for each GPU generation.

We take this strategy to fit into the mainstream infrastructure (hardware environment) used by today's PC gaming enthusiasts. While demanding high performance, the card should run on an existing, sensible power supply unit, be coolable inside a sensible case, and be installable without requiring an extremely large one. The Radeon RX high-end product line was designed with these points in mind.

AMD EVP, Rick Bergman (Machine Translated via ITMedia)

i.e. they're doing the same thing they've always done, which hasn't worked for them. AMD does not understand PC gaming enthusiasts at all, IMO.
 
The phrase "wasn't worth the hassle" implied a degree of arbitrariness to me (giving me the image of AMD shrugging its collective shoulders and moving on to something else without much thought). But, I see that you and I agree on this point, namely, AMD made a calculated decision to abstain from the ultra high-end because it lacked the ability to compete on all metrics.

@Remij I recall that quote. Bergman suggests it was "technically" possible to put out a GPU with specs that compete with the 4090. Even if we assume that to be true (i.e. a discrete GPU card with comparable performance and power as a 4090), his statement says nothing about said product's ability to compete on features, software, price, and profitability. What's the point of making a 4090 competitor if you won't make any money on it?
 
Good enough means more than just silicon that pumps out similar fps. The entire product, including features and ecosystem, matters. This is where Nvidia has been kicking their butt for years on end now. Even if they could build a $1500 GPU, there's no guarantee people would buy it in enough volume to recoup the investment.
"Good enough" can mean many things. It could be just a product which would be faster at the same price for example (everywhere, not "just in rasterization"). Or it could be a product which would have some unique features ("ecosystem") while being on par in performance. These features don't have to be s/w based either.

The point is that there is no such product from AMD this generation. Which means that they could've beaten Nv in performance or price or features (doubtful), but not in all of these at once, meaning that such a product would still lose the competition to Nv's - and this is why there is no such product.
 
The phrase "wasn't worth the hassle" implied a degree of arbitrariness to me (giving me the image of AMD shrugging its collective shoulders and moving on to something else without much thought). But, I see that you and I agree on this point, namely, AMD made a calculated decision to abstain from the ultra high-end because it lacked the ability to compete on all metrics.

@Remij I recall that quote. Bergman suggests it was "technically" possible to put out a GPU with specs that compete with the 4090. Even if we assume that to be true (i.e. a discrete GPU card with comparable performance and power as a 4090), his statement says nothing about said product's ability to compete on features, software, price, and profitability. What's the point of making a 4090 competitor if you won't make any money on it?
The point is to have the best product.. which is the only thing that will chip away at the mindshare of consumers. By not making the best product.. they're actively ignoring a sizable chunk of the market (The 4090 alone has 0.25% of Steam, which translates to millions of GPUs) and people remember who has the best product.. and naturally the companies who can claim the fastest product will have an easier time convincing people that their other products are better as well.

There's a reason why AMD is not selling. Matching performance and slightly undercutting on price isn't working for them. In those cases people look to all the other advantages Nvidia has. People are willing to spend money to have the best, and they're willing to spend a little more for similar performing products.. when it comes with the guarantee that Nvidia will do anything to differentiate and put their cards out ahead. Technologies like DLSS are a perfect example.. If AMD comes with the same performing product, around the same price, a consumer will still buy the Nvidia GPU because Nvidia pushes support for their technologies, and they are generally better than the competition.

AMD has to stop trying over and over again what hasn't worked for them in ages. We need the 9700 Pro ATI days back...
 
AMD has to stop trying over and over again what hasn't worked for them in ages. We need the 9700 Pro ATI days back...
If they can build equivalent performance at a significantly lower price, AMD will surge. The 7900XTX imo is getting close to that point. It reminds me of Zen 1. It was almost there, but not quite yet. Zen 2 was when AMD made a significant splash. I feel like AMD is closing in here. I have high hopes for their next 2 generations to continue to match pace but keep that price point down.
 
The point is to have the best product.. which is the only thing that will chip away at the mindshare of consumers. By not making the best product.. they're actively ignoring a sizable chunk of the market (The 4090 alone has 0.25% of Steam, which translates to millions of GPUs) and people remember who has the best product.. and naturally the companies who can claim the fastest product will have an easier time convincing people that their other products are better as well.

There's a reason why AMD is not selling. Matching performance and slightly undercutting on price isn't working for them. In those cases people look to all the other advantages Nvidia has. People are willing to spend money to have the best, and they're willing to spend a little more for similar performing products.. when it comes with the guarantee that Nvidia will do anything to differentiate and put their cards out ahead. Technologies like DLSS are a perfect example.. If AMD comes with the same performing product, around the same price, a consumer will still buy the Nvidia GPU because Nvidia pushes support for their technologies, and they are generally better than the competition.

AMD has to stop trying over and over again what hasn't worked for them in ages. We need the 9700 Pro ATI days back...
To us GeForce and Radeon may appear to be direct competitors, but the business calculus is very different for these 2 companies.

For Nvidia, GeForce is half their business. They need to win or else the company dies.

For AMD, Radeon PC is noise, and in supply-constrained times this noise actively steals margins away from Zen.

This difference in priority obviously informs their long-term (R&D) and short-term (product planning) investment decisions.
 
To us GeForce and Radeon may appear to be direct competitors, but the business calculus is very different for these 2 companies.

For Nvidia, GeForce is half their business. They need to win or else the company dies.

For AMD, Radeon PC is noise, and in supply-constrained times this noise actively steals margins away from Zen.

This difference in priority obviously informs their long-term (R&D) and short-term (product planning) investment decisions.
There's a reason why it is this way though. I'm not saying AMD is necessarily wrong in what they're doing... rather I'm attempting to explain why they aren't able to gain PC dGPU market share... and it's going to remain that way, as they've essentially given up. (IMO)
 
There's a reason why it is this way though. I'm not saying AMD is necessarily wrong in what they're doing... rather I'm attempting to explain why they aren't able to gain PC dGPU market share... and it's going to remain that way, as they've essentially given up. (IMO)
While PC gfx is a small piece of the puzzle, the architecture is essential for several other product categories and they can't skimp on it - and compared to the architecture work, building the chips is cheap (even if it's getting more and more expensive). Consoles are probably around 25-33% of their business (Xbox numbers aren't clear, but Sony is supposedly 20%, so I'd expect the range to fit the bill) and they know that if they can't be competitive enough, NVIDIA will push their way back to Sony and Microsoft. x86 is nice but not essential for consoles.
So no, I don't agree that they've given up even in just the PC space, regardless of whether they'll settle for a minority role compared to NVIDIA indefinitely.
 
While PC gfx is a small piece of the puzzle, the architecture is essential for several other product categories and they can't skimp on it - and compared to the architecture work, building the chips is cheap (even if it's getting more and more expensive). Consoles are probably around 25-33% of their business (Xbox numbers aren't clear, but Sony is supposedly 20%, so I'd expect the range to fit the bill) and they know that if they can't be competitive enough, NVIDIA will push their way back to Sony and Microsoft. x86 is nice but not essential for consoles.
So no, I don't agree that they've given up even in just the PC space, regardless of whether they'll settle for a minority role compared to NVIDIA indefinitely.
That's exactly proving the point though.. Nobody is saying anything about consoles.. because regardless of what AMD's business is, it doesn't change the realities of their PC market presence. We just have to be honest about it... AMD has given up on making the highest end chips. They have specifically stated they aren't interested in, and don't want to compete with, Nvidia at the >$1000 price range. They can give all the noble reasons they want for why that is.. but it doesn't change the fact. And the fact that Nvidia is selling more 4090 GPUs alone than AMD has sold across the entirety of their 7000 series lineup... shows you exactly where they stand in the dGPU market right now. It's dire.

All I'm saying is that nothing about anything they're doing or have done for the past decade is doing anything to change that.. and if anything they're getting further and further away from what they need to do to gain back that market. I'm not saying it's easy, I'm not saying they aren't making money in other ways.. but this is where they are right now.

If AdoredTV, one of the staunchest AMD defenders on the internet, can admit to that.. then others should be able to as well.
 
There's a reason why it is this way though. I'm not saying AMD is necessarily wrong in what they're doing... rather I'm attempting to explain why they aren't able to gain PC dGPU market share... and it's going to remain that way, as they've essentially given up. (IMO)

If they'd given up they'd stop releasing PC GPUs (they haven't) and they'd stop R&D on PC graphics IP (again, they haven't). PC graphics IP is still an extremely important segment for AMD even if dGPUs currently isn't.

At the moment semi-custom and integrated GPU designs are far more important to AMD than dGPU shipments and all of that relies on the same graphics R&D. They maintain a presence in dGPU so that they remain at least somewhat relevant in that space and it gives more name recognition to their mobile and even semi-custom parts.

For AMD, the business WRT GPUs is far larger than just dGPUs, which seem to be the only thing people want to focus on. And just because dGPUs are currently the smaller share of their GPU strategy doesn't mean they won't be a larger share in the future, when AMD feels they can afford the wafer starts (the ROI on those wafer starts will need to be projected to be at least somewhat equivalent to what they get for CPUs, SOCs, mobile and semi-custom wins).

Sure, they could "gamble" on the super high end, but that's a very expensive gamble with a less than 50/50 chance of success due to NV's entrenched position at the top and mindshare among gamers. Even if they released something that was 20% faster than the fastest NV card, it's unlikely that it would significantly move the needle WRT which cards consumers purchase, because it is ingrained at the moment that NV is best.

I mean, hell, I just switched from a GTX 1070 to a Radeon 6800 and holy shit, the drivers are infinitely better for AMD GPUs than NV GPUs, but most people still make the false claim (IMO) that NV drivers are significantly better. :p The 6800 certainly isn't perfect (video decode speed still isn't great, IMO) but for its performance in modern games it's just massively better than the more expensive 3070s that I was considering.

Regards,
SB
 
If they'd given up they'd stop releasing PC GPUs (they haven't) and they'd stop R&D on PC graphics IP (again, they haven't). PC graphics IP is still an extremely important segment for AMD even if dGPUs currently isn't.

At the moment semi-custom and integrated GPU designs are far more important to AMD than dGPU shipments and all of that relies on the same graphics R&D. They maintain a presence in dGPU so that they remain at least somewhat relevant in that space and it gives more name recognition to their mobile and even semi-custom parts.

For AMD, the business WRT GPUs is far larger than just dGPUs, which seem to be the only thing people want to focus on. And just because dGPUs are currently the smaller share of their GPU strategy doesn't mean they won't be a larger share in the future, when AMD feels they can afford the wafer starts (the ROI on those wafer starts will need to be projected to be at least somewhat equivalent to what they get for CPUs, SOCs, mobile and semi-custom wins).

Sure, they could "gamble" on the super high end, but that's a very expensive gamble with a less than 50/50 chance of success due to NV's entrenched position at the top and mindshare among gamers. Even if they released something that was 20% faster than the fastest NV card, it's unlikely that it would significantly move the needle WRT which cards consumers purchase, because it is ingrained at the moment that NV is best.

I mean, hell, I just switched from a GTX 1070 to a Radeon 6800 and holy shit, the drivers are infinitely better for AMD GPUs than NV GPUs, but most people still make the false claim (IMO) that NV drivers are significantly better. :p The 6800 certainly isn't perfect (video decode speed still isn't great, IMO) but for its performance in modern games it's just massively better than the more expensive 3070s that I was considering.

Regards,
SB
I'm saying they're giving up competing at the high end.. which they already have admitted to.

Maybe you guys didn't watch the video? Or aren't listening to what I'm saying for whatever reason? Because I'm not saying they're giving up on PC GPUs.. I'm saying they've given up the highest end to Nvidia (it's hard to compete when your competitors can charge $1600 for their GPUs and sell far more than you can your cheaper $1000 GPUs) and they're also giving up charting their own course on pricing. They wait for Nvidia to set prices, and slightly undercut.

It doesn't work..
 
I'm saying they're giving up competing at the high end.. which they already have admitted to.

Maybe you guys didn't watch the video? Or aren't listening to what I'm saying for whatever reason? Because I'm not saying they're giving up on PC GPUs.. I'm saying they've given up the highest end to Nvidia (it's hard to compete when your competitors can charge $1600 for their GPUs and sell far more than you can your cheaper $1000 GPUs) and they're also giving up charting their own course on pricing. They wait for Nvidia to set prices, and slightly undercut.

It doesn't work..
To be fair... look at what they tried to do for over a decade. (I'm going off the AIB GPU marketshare charts)
AMD GPUs since 2008-ish competed extremely well with Nvidia and typically offered better price/performance and sometimes better power efficiency.
Their launches would take back some ground, but within a quarter Nvidia would pick up market share again.
Since Maxwell 2 launched in Q3 2014, AMD has basically had to redouble their efforts at the mid/low end to claw back any marketshare.

Zen and the console wins have allowed them to fight back with RDNA.
Renewed R&D since 2019/2020 should hopefully yield some results over the next couple of generations.

It seems like AMD finally realized they don't need to play Nvidia's game, they just need to change the rules again.
They most definitely will keep competing at the "highend."

Edit- Rereading the thread. We are stating the same AMD woes while looking at history but coming to different conclusions.
 
The National Energy Research Scientific Computing Center, one of the key facilities of the US Department of Energy (DoE), has opened up the bidding on its future NERSC-10 exascale-class supercomputer.

IBM is out of the business (they said it is not profitable for them), and Intel is also not interested after its major problems and delays with Aurora (they delayed their future data center GPUs further anyway), which leaves AMD and NVIDIA. However, The Next Platform seems to think that both are reluctant, as these exascale machines are very hard to make money on: AMD didn't seem to be making much money from El Capitan and Frontier, Intel outright lost hundreds of millions on Aurora, and IBM frankly said the endeavor is not profitable for them.

These are fixed cost deals that are more like research and development cost coverage than a profitable, commercial deal at even cost plus, much less at a reasonable profit.
Building exascale machines is important for national security and for driving information technology innovation, but it is not an easy way to make money. And it has never been. And it almost certainly never will be. Jennifer Granholm, who is the Secretary of Energy, should send flowers, chocolates, and Thank You notes to HPE and AMD for the “Frontier” machine running for the past year at Oak Ridge and for the “El Capitan” machine going into Lawrence Livermore National Laboratory later this year.

The Next Platform seems to think it could go to AMD again if they make a good deal on price (like they did for El Capitan and Frontier), or could go to NVIDIA for the sake of diversity and not relying on a single source. It's practically a toss-up at this point, but I think AMD might be the more likely contractor here again, especially if they want to land a solid footing for the MI300 and the future MI400.

 

The usual large caveat that this is coming from Moore's Law Is Dead, but it's always been the obvious path for AMD, just a question of when they would finally take the plunge. A 256-bit LPDDR5 bus would finally alleviate (mostly) the APU bandwidth bottleneck.
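For a sense of why that bus width matters, here's a quick back-of-envelope comparison of peak theoretical bandwidth (GB/s = bus width in bytes × transfer rate); the LPDDR5X speed for the rumored part is my assumption, not anything confirmed:

```python
# Peak theoretical memory bandwidth: (bus_bits / 8) bytes per transfer * MT/s.
def bandwidth_gbs(bus_bits: int, mts: int) -> float:
    return bus_bits / 8 * mts / 1000  # GB/s

configs = {
    "128-bit DDR5-5600 (typical PC, dual channel)": (128, 5600),
    "256-bit LPDDR5X-8533 (rumored APU, assumed speed)": (256, 8533),
    "192-bit GDDR6 @ 16 Gbps (RX 6700 XT)": (192, 16000),
}

for name, (bus, rate) in configs.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
# -> roughly 90, 273, and 384 GB/s respectively
```

On top of that, the rumored IF cache would cut effective bandwidth demand further - the same trick RDNA 2 used to get by on narrower buses.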

A 40CU, 256-bit bus APU with IF cache is far more exciting to me than the next $1500+ GPU frankly, albeit of course it depends on price - but powerful APUs could go a long way towards making PCs more price-competitive with consoles. The question is just whether there's a market for them, but starting mobile first is definitely where there's a far greater chance for adoption, rather than a 250+ watt desktop chip.

If this comes, I'd also love to see these sold like NUC kits outside of notebooks. You could have a system smaller than an Xbox Series S with 6700 XT+ performance - a perfect secondary PC for the TV.
 