> Just watched Jim's new video at Adored and OUCH! Rattled me good, I can't really disagree with him.

Considering the source I'd rather not click it. Is it anything but the usual doom'n'gloom that comes around every now and then, as it has for over a decade?
> Considering the source I'd rather not click it. Is it anything but the usual doom'n'gloom that comes around every now and then, as it has for over a decade?

I'm not much of a fan of his either, but I watched it and he makes some good points.
Their presence in the PC gaming market is so low that Sony accounted for 20% of their (non-Xilinx) business in 2022.
> Despite the architecture having the capability to do so (scaling up the die size of RDNA3)

That I don't agree with. Power consumption of the small ~300 mm² RDNA3 compute die is high, very high in fact; doubling that size would incur a power penalty so large it would easily cross the 500 W threshold.
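A rough back-of-envelope sketch of that scaling claim (the ~355 W board power and ~300 mm² GCD are the 7900 XTX's public specs; the GCD/overhead power split and the linear area-to-power scaling are assumptions for illustration, not measured figures):

```python
# Back-of-envelope: board power if the RDNA3 compute die (GCD) were doubled.
# Assumptions (illustrative, not measured):
#   - 7900 XTX total board power: ~355 W; GCD area: ~300 mm^2 (public specs)
#   - of that, the GCD itself draws ~250 W (assumed split; the rest goes to
#     GDDR6, the six MCDs, VRM losses, fans, etc.)
#   - GCD power scales roughly linearly with area at the same clocks

BOARD_POWER_W = 355
GCD_POWER_W = 250                          # assumed GCD share of board power
OVERHEAD_W = BOARD_POWER_W - GCD_POWER_W   # memory, MCDs, VRM, fans

def scaled_board_power(area_scale: float) -> float:
    """Estimate total board power when the GCD area is scaled by area_scale."""
    return GCD_POWER_W * area_scale + OVERHEAD_W

for scale in (1.0, 1.5, 2.0):
    print(f"{scale:.1f}x GCD area -> ~{scaled_board_power(scale):.0f} W")
# 1.0x -> ~355 W, 1.5x -> ~480 W, 2.0x -> ~605 W: under these assumptions a
# doubled die blows well past the 500 W threshold mentioned above.
```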
> AMD essentially lied to their fanbase about the performance of the RDNA3 GPUs.

They did what now?
> That I don't agree with. Power consumption of the small ~300 mm² RDNA3 compute die is high, very high in fact; doubling that size would incur a power penalty so large it would easily cross the 500 W threshold.

And? AMD could have built a cooler to support it.
> And? AMD could have built a cooler to support it.

The main problem would be cost, which would be dictated by competition. Ending up 10% slower with 50% more power at the same price wouldn't help anything.
> The main problem would be cost, which would be dictated by competition. Ending up 10% slower with 50% more power at the same price wouldn't help anything.

AMD stated that they could have beaten the 4090. Nvidia has proven that you can cool very large chips at 600 W. AMD's unwillingness to do that is locking them out of a segment of the market, and it also affects their ability to market their brand as the best to the market as a whole.
> And? AMD could have built a cooler to support it.

It's not about the cooler, it's about being so power hungry while also being behind in ray tracing and other features; nobody would have purchased it. Intel could also design a 700 W GPU that challenges the 4090, but it would also be behind in performance, features and compatibility. It's not about who "could", but who can, at what performance level, power and cost. Needing a 500 W, 600 mm² 7900 XTX to beat a cut-down big Ada (4090), whose actual power consumption is often in the range of 350 W to 400 W, while also staying behind in ray tracing and upscaling, means a dead-on-arrival product.
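Putting rough numbers on that efficiency gap (all wattages are the figures from the post above; the equal-performance assumption is hypothetical):

```python
# Perf-per-watt of the hypothetical 500 W RDNA3 card vs a 4090 drawing
# 350-400 W in practice, assuming (per the post) equal performance.
PERF_INDEX = 100.0  # arbitrary shared performance score

cards = {
    "hypothetical 500 W RDNA3": 500,
    "4090 at 350 W": 350,
    "4090 at 400 W": 400,
}
for name, watts in cards.items():
    print(f"{name}: {PERF_INDEX / watts:.3f} perf/W")
# 0.200 vs 0.286/0.250 perf/W: the 500 W card would need roughly 25-43% more
# power for the same frames, before counting the ray tracing/upscaling gap.
```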
> Needing a 500 W, 600 mm² 7900 XTX to beat a cut-down big Ada (4090), whose actual power consumption is often in the range of 350 W to 400 W, while also staying behind in ray tracing and upscaling, means a dead-on-arrival product.

More people are purchasing the 4090 at $1600 than the 7900 XTX at $1000... let that sink in.
Regarding coolers, it's not the first time AMD has ignored cooling design altogether (probably not financially feasible given their low profits). When they need to push products to the limits, they just slap liquid cooling on top and call it a day (Vega 64 LC, 6900 XT LC), or put on a cheap one when they really want to cut costs (R9 290X, RX 480, Vega 64).
> Needing a 500 W, 600 mm² 7900 XTX to beat a cut-down big Ada (4090), whose actual power consumption is often in the range of 350 W to 400 W, while also staying behind in ray tracing and upscaling, means a dead-on-arrival product.

Both would have lower actual consumption in CPU-limited scenarios. I am not sure they would be behind in ray tracing either. But we can be sure they would come out on top in many other ways. Also, the upscaling situation isn't set in stone. Still, I am glad they don't waste resources on silly big chips.
> They can't just accept a world where Nvidia has the fastest GPUs out there.

Is that a fact or your wish?
> AMD stated that they could have beaten the 4090.

At what price?
> Is that a fact or your wish?

It's a fact... considering nothing else they've been doing has gained them any meaningful ground in the PC discrete GPU market.
> At what price?

At a similar price to Nvidia's 4090... and 600 W. The entire premise of AMD saying that was "well, if we wanted to release at $1600 with a 600 W power draw like Nvidia, we could have made something better than them".
I don't believe for a second that they could've beaten the 4090 at the same price and decided not to.
Such decisions are always, always rooted in the ability to compete. Saying "we could've beaten that" means nothing when it doesn't continue with "at the same price and power draw". Because otherwise - no, you could not have.
> Do you believe they couldn't have made a GPU that beats the 4090 at $1600 with a 600 W TBP?

I believe that if they could have, they would have. The sole fact that they didn't means that they couldn't.
Do you want them to continue doing the thing that hasn't worked for them for the past decade plus... or can you admit it's time for them to start doing their best? When they come out and say they could make a GPU that beats Nvidia's best, but don't... then people are going to understand that they don't want to compete at that level, and enthusiasts won't even consider them.
> I believe that if they could have, they would have. The sole fact that they didn't means that they couldn't.

They literally stated they could have.
> It is not a fact. Just because some other approach exists does not mean it would lead to a bigger success.

It's a fact... whether you admit it or not. AMD are not going to gain any market share if they don't put out the best products their architectures are capable of by "going big" with die size and power draw like their competition does. Not to mention by only reacting to and following Nvidia's insane pricing at the mid and low range, instead of setting their own course to gain market share (and mind share).
I don't want them to continue the last decade's pattern of products with no clear advantage over a better-known competitor. Symmetrical strategies won't win this uphill battle.
> Well AMD arguably tried with the 6900 XT and came reasonably close to the 3090, and it didn't help at all.

That was when Nvidia used a production node ~1-2 generations behind what AMD was using, and AMD still didn't manage to beat them. It's not hard to extrapolate from that to the current generation, where both are on (more or less) the same process.
> Maybe they really decided it wasn't worth the hassle. If the Scottish YouTuber is right, the Radeon group is barely justifying its existence as is, and there isn't a lot of margin to play with.

What does that even mean? Something isn't "worth the hassle" when it won't recoup the investment made into it. Why would a product not recoup that investment? Because it wouldn't sell. Why would something not sell? Because it wouldn't be good enough to compete with the other products on the same market. It's pretty easy to decipher.