AMD Execution Thread [2023]

Just watched Jim's new video at Adored and OUCH! Rattled me good, I can't really disagree with him. :(

Considering the source, I'd rather not click it. Is it anything but the usual doom'n'gloom that's come around every now and then for over a decade?
 
Considering the source, I'd rather not click it. Is it anything but the usual doom'n'gloom that's come around every now and then for over a decade?
I'm not much of a fan of his either, but I watched it and he makes some good points.

-The portion of AMD's gaming revenue that comes from PC graphics card sales is incredibly tiny
-Despite the architecture having the capability to do so (scaling up the die size of RDNA3), AMD don't seem willing to compete with Nvidia at the high end
-AMD have also proven that they aren't out to win back market share with pricing, as the 4080 fiasco showed: they waited for Nvidia to set the price of the 4080 and just undercut it slightly, instead of pricing it in line with their own increases
-AMD essentially lied to their fanbase about the performance of the RDNA3 GPUs.. losing trust

He goes over how he came to those conclusions and, like I stated previously.. it's hard to argue with it.


My question is, what happens if AMD gets out of the discrete PC gaming market..?
 
He is basing his analysis on the fact that Sony accounted for 20% of AMD's non-Xilinx business in 2022, which means AMD's entire dGPU sales for the whole of 2022 were only around $1 billion, with profits in the range of $100 million to $200 million.

Which coincides with their low market share numbers in both JPR's reports and the Steam Hardware Survey.

Their presence in PC gaming is so low that Sony alone accounted for 20% of their (non-Xilinx) business in 2022.
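
For what it's worth, the back-of-envelope math checks out. A minimal sketch, assuming AMD's approximate FY2022 segment figures and a guessed Xbox semi-custom number (the Xbox figure in particular is a pure assumption, not something AMD breaks out):

Code:
total_revenue  = 23.6e9  # AMD FY2022 total revenue, approximate
xilinx_revenue = 4.6e9   # Embedded segment (mostly Xilinx), approximate
gaming_revenue = 6.8e9   # Gaming segment: dGPUs + console semi-custom, approximate

non_xilinx   = total_revenue - xilinx_revenue  # ~$19B
sony_revenue = 0.20 * non_xilinx               # the video's 20% claim -> ~$3.8B

microsoft_revenue = 2.0e9  # pure assumption: Xbox semi-custom share of Gaming

dgpu_revenue = gaming_revenue - sony_revenue - microsoft_revenue
print(f"Implied dGPU revenue: ${dgpu_revenue / 1e9:.1f}B")  # prints ~$1.0B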

Despite the architecture having the capability to do so (scaling up the die size of RDNA3)
That I don't agree with. Power consumption of the small ~300mm² RDNA3 compute die is high, very high in fact; doubling that size would incur a power penalty so large it would easily cross the 500W threshold.
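
As a rough sanity check, naive linear scaling bears that out. A sketch only; every number below is an assumption for illustration, not a measurement:

Code:
board_power  = 355  # W, 7900 XTX reference TBP, approximate spec
mem_io_power = 100  # W, assumed GDDR6 + MCDs + board overhead
gcd_power    = board_power - mem_io_power  # ~255W left for the ~300mm² GCD

doubled_gcd     = 2 * gcd_power       # naive linear scaling at the same clocks
wider_mem_io    = 1.5 * mem_io_power  # assume a doubled die also gets a wider memory system
estimated_total = doubled_gcd + wider_mem_io
print(f"Estimated TBP: ~{estimated_total:.0f}W")  # prints ~660W, well past 500W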
 
That I don't agree with. Power consumption of the small ~300mm² RDNA3 compute die is high, very high in fact; doubling that size would incur a power penalty so large it would easily cross the 500W threshold.
And? AMD could have built a cooler to support it.
 
If AMD wasn't trying, they wouldn't have bothered with Infinity Cache or chiplets. They are most certainly trying to innovate. They also know that price cuts won't help them, as that would just cement their reputation as the bargain option and do nothing for revenue or market share. Their main problem is that Nvidia is relentless and has a bigger war chest. RDNA is a good enough architecture that AMD could have easily taken the crown if Nvidia had faltered. But Nvidia didn't falter.
 
The main problem would be cost, which would be dictated by the competition. Ending up 10% slower with 50% more power at the same price wouldn't help anything.
AMD stated that they could have beaten the 4090. Nvidia has proven that you can cool very large chips at 600W. AMD's unwillingness to do that is locking them out of a segment of the market, and it also affects their ability to market their brand as the best to the market as a whole.

And that's essentially the point... AMD need to not just have a better price/performance ratio than the competition... that hasn't worked for them for well over a decade.. They need to have the BEST.. and it's clear that they aren't striving to provide that to consumers. They are allowing Nvidia to dictate the market and pricing themselves out of a segment their own customers are willing to pay for.

AMD really have to change the perception of their Radeon brand.. and they're not doing anything to suggest that they're going to do that.
 
And? AMD could have built a cooler to support it.
It's not about the cooler, it's about being so power hungry while also being behind in ray tracing and other features; nobody would have purchased it. Intel could also design a 700W GPU that challenges the 4090, but it would also be behind in performance, features and compatibility. It's not about who "could", but who can, at what performance level, power and cost. Needing a 500W, 600mm² 7900 XTX to beat a cut-down big Ada (4090) with actual power consumption often in the range of 350W to 400W, while also staying behind in ray tracing and upscaling, means a dead-on-arrival product.

Regarding coolers, it's not the first time AMD has ignored cooler design altogether (probably not financially feasible given their low profits). When they need to push products to the limit, they just slap a liquid cooler on top and call it a day (Vega 64 LC, 6900 XT LC), or fit a cheap one when they really want to cut costs (R9 290X, RX 480, Vega 64).
 
It's not about the cooler, it's about being so power hungry while also being behind in ray tracing and other features; nobody would have purchased it. Intel could also design a 700W GPU that challenges the 4090, but it would also be behind in performance, features and compatibility. It's not about who "could", but who can, at what performance level, power and cost. Needing a 500W, 600mm² 7900 XTX to beat a cut-down big Ada (4090) with actual power consumption often in the range of 350W to 400W, while also staying behind in ray tracing and upscaling, means a dead-on-arrival product.

Regarding coolers, it's not the first time AMD has ignored cooler design altogether (probably not financially feasible given their low profits). When they need to push products to the limit, they just slap a liquid cooler on top and call it a day (Vega 64 LC, 6900 XT LC), or fit a cheap one when they really want to cut costs (R9 290X, RX 480, Vega 64).
More people are purchasing the 4090 at $1600 than the 7900 XTX at $1000... let that sink in.

And that's because there's a group of people who want THE BEST, regardless of price. Being capable of that is something AMD claims they could have achieved on their current platform.. but they decided not to. That tells customers that AMD isn't trying to be the best.. and whether you agree with me or not, that really affects the market. People don't give AMD a chance because of stuff like this. Whereas you know with Nvidia they will do absolutely anything to be the best.

From there... AMD aren't dictating their own prices; they're waiting for Nvidia to set theirs and then trying to undercut them. When has this tactic ever worked for them? It's clear they aren't doing anything to convert people.. and therein lies the problem. AMD need to gain mindshare back with regards to Radeon. The processor side was the same thing for how long... getting beaten by Intel, who had the clear best CPU.. then attempting to undercut them in the mid and low ranges going for price/performance... And that didn't work until they proved they had an architecture which could be the best.. and then scaled that at prices that allowed them to actually steal market share from Intel. Of course Intel was floundering at the time and it worked out for AMD.. but the same thing applies. They can't just accept a world where Nvidia has the fastest GPUs out there. They have to make the best GPUs they can each and every generation, and then catch Nvidia slipping and steal back market share.
 
Needing a 500W, 600mm² 7900 XTX to beat a cut-down big Ada (4090) with actual power consumption often in the range of 350W to 400W, while also staying behind in ray tracing and upscaling, means a dead-on-arrival product.
Both would have lower actual consumption in CPU-limited scenarios. I am not sure they would be behind in ray tracing either. But we can be sure they would come out on top in many other ways. Also, the upscaling situation isn't set in stone. Still, I am glad they don't waste resources on silly big chips.
They can't just accept a world where Nvidia has the fastest GPUs out there.
Is that a fact or your wish?
 
AMD stated that they could have beaten the 4090.
At what price?
I don't believe for a second that they could've beaten the 4090 at the same price and decided not to.
Such decisions are always, always rooted in the ability to compete. Saying "we could've beaten that" means nothing when it doesn't continue with "at the same price and power draw". Because otherwise - no, you could not have.
 
Is that a fact or your wish?
It's a fact... considering nothing else they've been doing has gained them any meaningful ground in the PC discrete GPU market.

Do you want them to continue to do the thing that hasn't worked for them for the past decade plus.. or can you admit it's time for them to start doing their best? When they come out and say they could make a GPU that beats Nvidia's best, but don't.. then people are going to understand that they don't want to compete at that level, and enthusiasts won't even consider them.

At what price?
I don't believe for a second that they could've beaten the 4090 at the same price and decided not to.
Such decisions are always, always rooted in the ability to compete. Saying "we could've beaten that" means nothing when it doesn't continue with "at the same price and power draw". Because otherwise - no, you could not have.
At a similar price to Nvidia's 4090.. and 600W. The entire premise of AMD saying that was "well, if we wanted to release a $1600, 600W GPU like Nvidia did, we could have made something better than them".

Do you believe they couldn't have made a GPU that beats the 4090 at $1600 with a 600W TBP? Because that's the issue. Nvidia will do what it takes to be the best.. AMD won't. People respond to stuff like that whether you believe it or not.
 
I believe that if they could have, they would have. The sole fact that they didn't means that they couldn't.

Well AMD arguably tried with the 6900 XT and came reasonably close to the 3090, and it didn't help at all. Maybe they really decided it wasn't worth the hassle. If the Scottish YouTuber is right, the Radeon group is barely justifying its existence as is, and there isn't a lot of margin to play with.
 
It's a fact... considering nothing else they've been doing has gained them any meaningful ground in the PC discrete GPU market.

Do you want them to continue to do the thing that hasn't worked for them for the past decade plus.. or can you admit it's time for them to start doing their best? When they come out and say they could make a GPU that beats Nvidia's best, but don't.. then people are going to understand that they don't want to compete at that level, and enthusiasts won't even consider them.

It is not a fact. Just because some other approach exists does not mean it would lead to greater success.

I don't want them to continue the last decade of offering products with no clear advantage over a better-known competitor. Symmetrical strategies won't win this uphill battle.
 
I believe that if they could have, they would have. The sole fact that they didn't means that they couldn't.
They literally stated they could have.

It is not a fact. Just because some other approach exists does not mean it would lead to greater success.

I don't want them to continue the last decade of offering products with no clear advantage over a better-known competitor. Symmetrical strategies won't win this uphill battle.
It's a fact.. whether you admit it or not. AMD are not going to gain any market share without putting out the best products their architectures are capable of, "going big" with die size and power draw like their competition does. Not to mention by only reacting to/following Nvidia's insane pricing in the mid and low range instead of setting their own course to gain market share (and mindshare).
 
I guess you have some unique definition of the term "fact".
Also, going big does not imply better products.
 
I'd have to disagree that they aren't even trying. Their semi-custom business model relies greatly on how performant their GPU hardware is and, more importantly, how energy efficient it is when not pushed to the bleeding edge. I.e., the 7900 XT/XTX isn't what you should look at here, because it's pushed too hard on the voltage curve in order to look better in benchmarks; it's their mobile solutions based on RDNA3 that are the important bit.

Of course, they have also de-emphasized the discrete GPU market because its profit potential for them is currently far lower than that of consumer and enterprise CPUs and APUs. But their consumer CPUs also now rely on a performant, modern GPU architecture. Thus, they can't give up on GPU design even if they aren't releasing a lot of discrete GPUs. Investment in their GPU arch doesn't have to lead to more discrete GPU shipments.

When considering how best to spend their wafer allocations, discrete GPUs are likely last on the list because they currently have the least ability to generate high-margin profits. The only thing that might rank lower would be semi-custom console SoCs, but those have the benefit of long-term contractual manufacturing; in other words, they don't have the volatility associated with the discrete GPU market, making them a preferable revenue source even if the profit margins might be lower.

It's easy for NV to sink everything they have into GPUs, because that's by far the vast majority of their revenue and profit stream. AMD, as much as we'd like them to dump more into discrete GPUs, have to think about what allocation of resources would let them not only survive but better position themselves to compete in various markets in the future. And while discrete GPUs are rightly on the back burner right now, GPU tech isn't, because multiple product lines other than dGPUs rely on it.

Regards,
SB
 
Well AMD arguably tried with the 6900 XT and came reasonably close to the 3090, and it didn't help at all.
That was when Nvidia used a production node ~1-2 generations behind what AMD was using, and AMD still didn't manage to beat them. Not hard to extrapolate from that to the current generation on the same (more or less) process.

Maybe they really decided it wasn't worth the hassle. If the Scottish YouTuber is right, the Radeon group is barely justifying its existence as is, and there isn't a lot of margin to play with.
What does that even mean? Something isn't "worth the hassle" when it won't recoup the investment made in it. Why would a product not recoup that investment? Because it wouldn't sell. Why would something not sell? Because it wouldn't be good enough to compete with other products in the same market. It's pretty easy to decipher.
 