Was AMD a bad choice for the consoles? *spawn

Digital Foundry already shares this view: having both consoles share the same hardware has led to boring stagnation. The two consoles are identical except for the brand on the boxes. Worse yet, both are also weak compared to modern PCs, even older ones based on Turing hardware, especially since those can use DLSS to compensate for any gap in performance. As a result, both consoles face competition from PCs as well.

Sony's PS5 Pro is an answer to that: a more powerful SoC will be used to differentiate PlayStation, with new features that set it apart from Xbox and let it compete with those old and new PCs.

DF are just megafans regularly talking way over their heads. The world is full of convergence, GPUs included: someone makes an improvement, everyone else copies it. Nvidia doubled up FP units relative to INT, then AMD did the same. AMD introduced giant caches to cut bandwidth to main memory, then Nvidia did that too.

There's not actually much difference between AMD and Nvidia GPU architectures overall. The "stagnation" between PS4 and PS5 is the hardcore audience not wanting to admit that the ROI on ever-bigger GPUs is diminishing. Better graphics must always scale linearly, and it's obviously someone else's fault if I don't notice much of a difference!

The only way "bigger GPU" continues is if AMD realizes who actually buys big dedicated GPUs. My parents use a spare room to host wealthy exchange students for a nearby high school. Their current kid bought a 4080, a dual-monitor setup, RGB everything, and I'm pretty sure he exclusively plays League. For the majority of this market, selling dedicated GPUs has nothing to do with price-for-performance; it's dudes wanting the impression that they're buying a bigger tool. So AMD needs to get into the marketing game of "selling the biggest tool": claiming better price-for-performance only makes their tools seem like the weak "second choice" option in comparison, if you see what I mean.
 
How much of it is due to stagnation in the desirability of the games, and to the backwards-compatibility requirement from devs/publishers to reach a majority of the market?
That, along with the uncertain economic environment when these consoles launched, really seems to indicate that the majority of the market may be moving towards an every-other-generation adoption rate.

Much like how most price/performance-conscious PC gamers tend to look for a ~50%-100% GPU performance increase before upgrading, which seems to have pushed the regular GPU upgrade cycle out from 3-4 years to 5-7 years.
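To put rough numbers on that upgrade threshold, here is a sketch assuming a ~30% raster uplift per GPU generation on a roughly 2-year cadence (both illustrative ballpark figures, not measured data):

```python
# Illustrative assumption: ~30% performance uplift per GPU generation,
# with one generation arriving roughly every 2 years.
per_gen_gain = 1.30

cumulative = 1.0
for gen in range(1, 4):
    cumulative *= per_gen_gain
    print(f"after {gen} generation(s) (~{gen * 2} years): {cumulative:.2f}x")
```

Under those assumptions, a +50% jump only arrives after two generations (~4 years) and a +100% jump after three (~6 years), which lines up with the 5-7 year cycle described above.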

IMO, AMD picked a good time to skip the high end, since a large part of the market upgraded between late 2020 and early 2022.
2025/2026 will be a prime time to target that particular market, and it also aligns with the Windows 12 launch.
 
The PS5 arguably has its best exclusive games lineup yet this year (Stellar Blade is setting records in user reviews at the moment), so I don't really buy that being the reason.
Best exclusive lineup compared to when? Certainly not of all time. Maybe compared to the rest of the PS5's lifetime, but that says more about the PS5 exclusive situation than about the quality of this year's lineup.

What records is Stellar Blade setting? It looks like another pretty standard action game, this certainly isn't going to have mass appeal and 100% isn't driving console sales (at least not in the west).
 
Do you think that Sony and MS would say "no" to a GPU with DLSS-like capabilities or RT performance on par with what Nvidia offers?
Yes, absolutely, because gaming consoles are sold in a certain price range in a cost-sensitive market, and NV hardware would only make the situation worse, not better. Huang doesn't sell cheap, and everyone knows that. Perhaps if Alienware made some $3,000+ console, it could be the RT/AI/Tensor/PhysX/DLSS king, ideal for a niche market and Nvidia zealots. But hey, perhaps that kind of console would sell better than the Xbox and PS5 Pro, since it seems everyone wants Nvidia.

PS: I do agree with one point: it would be nice to see a chance for an Intel GPU in the console market. It would stir up the stagnant waters of console hardware a bit and increase competition.
 
In US and Asia, PS5 sales are better than PS4 for the same amount of time. The problem is Europe with economic crisis and inflation.

Xbox sales take a dive…
 
What records is Stellar Blade setting? It looks like another pretty standard action game, this certainly isn't going to have mass appeal and 100% isn't driving console sales (at least not in the west).

Over past day:
...
 
Yes, absolutely, because gaming consoles are sold in a certain price range in a cost-sensitive market, and NV hardware would only make the situation worse, not better.
One wonders who sets the target price, which in turn directly caps the BOM cost of the whole hardware platform.

I do not know if that commenter has ever worked in the industry. But in this kind of B2B setting, customers set the core requirements. You then come up with the best you can offer within their requirements. Maybe you can try to convince them to do more based on the tricks in your pocket. But in the end, the ball is generally not in your court, especially when (1) bottom lines like unit cost are involved, and (2) your customers have — surprise, surprise — their own marketing functions and thus their own judgement. Oops.

So arguing that consoles sold at £399 failing to be >£1000-PC-equivalent strong-arse platforms is solely down to the incompetence of the SoC vendor not doing more magic is logically absurd.

The more ridiculous point is the rhetorical question of "would they say no to DLSS and RT performance". Asked in a vacuum, the obvious answer is "give me, give me". But if you project it onto the ground reality of a £399 MSRP target, can you achieve anything reasonably useful with the resulting silicon budget? Feels like the goalposts have been silently moved, haven't they?

Look at the RTX 3060. What can one realistically do with ML upscaling and ray tracing at 4K 60Hz that brings enough IQ gain across all genres for a platform vendor to say "bloody hell, let's get everyone onto this"? Moreover, has anyone thought about the opportunity cost of _not_ spending on this stuff at all? Say, better perf-per-area (PPA) — as a controversial figure here touted — for things other than RT and ML, within the customer's MSRP target.

P.S. Your anti-recency-bias reminder that the PS5 and XSX launched in late 2020, roughly 3.5 years ago.
 
No one is stopping Sony or MS from using Nvidia HW; perhaps someone clever should enlighten them on how to be smart and rich, since they've apparently failed so miserably ;)

Recently I read that Samsung is going to end its cooperation with the "incompetent" AMD and ditch their mobile GPU, so only one task remains: kick them out of both gaming consoles too, and there will be paradise on earth ....

“would they say no to DLSS and RT performance”

@pTmdfx: you are talking like Nvidia is the only company in the world who can do upscaling tech and RT ...
 

Over past day:
...
You know the generation is going poorly when people are hyping up a primarily-Japanese-market game with a high Metacritic user score (as if those mean anything) as evidence of a great year for PlayStation.

The critic score is an 82, which is good but not exactly groundbreaking.
 
You know the generation is going poorly when people are hyping up a primarily-Japanese-market game with a high Metacritic user score (as if those mean anything) as evidence of a great year for PlayStation.

The critic score is an 82, which is good but not exactly groundbreaking.
korean-market game, actually. not that it will sell all that well in either country
 

Attachments

  • Screenshot_20240506_123854_Chrome.jpg
Nothing beats the area efficiency of HW RT and TensorCores.
I agree with you. But it seems you ran past the point of the excerpt.

You can win on ray tracing and inference PPA and PPW. But these are relative metrics. How are the absolute metrics doing when you turn all of this on?

Say even with the best RT & inference IPs you have, the SoC can do at best 24Hz 4K cinematic slideshows — because your platform MSRP of £399 caps how much you can spend on the SoC silicon and memory.

Then your 24Hz 4K slideshow-ability would have to stack up against “incumbent” non-RT, non-ML techniques like checkerboard rendering, as well as goals like 4K 60Hz & 1080p 120Hz for most genres.
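For concreteness, the frame-time budgets behind those refresh targets are plain arithmetic (nothing here is vendor-specific):

```python
# Frame-time budget at each refresh target: budget_ms = 1000 / fps.
for fps in (24, 30, 60, 120):
    print(f"{fps} Hz -> {1000 / fps:.1f} ms per frame")
```

Going from a 24Hz "slideshow" to 4K 60Hz means the entire pipeline, ray tracing and inference included, has to fit into 16.7 ms instead of 41.7 ms: roughly 2.5x the throughput, at the same silicon budget.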

Would you rather keep your PPA-PPW-winning-but-unusable silicon filler, or ditch it to make what worked for you better (or not spend it at all for lower cost)?

If your answer would be “I would beef it up so it can do 4K 60Hz smoothly while delivering PPA and PPW winning ray tracing and inference AA” then I… ehm… appreciate your reply.

you are talking like Nvidia is the only in the world who can do upscaling tech and RT ...
No. But with hindsight, Nvidia does currently set the bar for RT & inference AA. So it is a reasonable assumption that any vendor with time-travelling capability would, at best, meet that bar.

Now, say you were back in 2017-18, finalizing the initial high-level spec for your console customers:

1. These customers still dominate your revenue.
2. Your console customers are not interested in moving their £399 MSRP target just to make the slideshows less slideshowy.
3. You estimate your hardware ray-tracing IP plus your ML super-resolution technique can probably manage a 4K slideshow within the silicon budget your customers indirectly dictated.
4. You have also "heard" that your competitor is about to launch these new, unestablished things into the market (on both the IHV and ISV side).
5. You have just launched your first generation of critically acclaimed (non-graphics) products in a long while, and the road to recovery has barely started.

Would you still invest in this then?

Or would you kick the can down the road to a future cycle, while trying to narrow the gap a bit with something cheap, moderately useful, and still borderline "ray tracing"?

Sometimes we just have to pick our battles, don’t we?
 
Now, say, when you were back in 2017-18 ....

Sometimes we just have to pick our battles, don’t we?

yes, that's kind of the point. Back then someone made the decision they thought was best for them; hindsight is always twenty-twenty .....

 
In US and Asia, PS5 sales are better than PS4 for the same amount of time. The problem is Europe with economic crisis and inflation.

Xbox sales take a dive…
The only reason the PS5 sold less in Europe compared to the PS4 is availability during the first two years. You couldn't get a console without joining Telegram groups and hoping you were fast enough when a stock notification came in. People were starved of consoles for two years, and even with the better availability of the past year, some sales are simply lost; not everyone who wanted to buy one then will buy one immediately now.
 
AMD were simply the best choice for the consoles considering their power budget and strict BOM.

With the consoles using a single SoC containing both the CPU and GPU, that would not have been possible if the design wasn't all-AMD; it would have meant two dies, which means more heat, more power draw and more expense.

I can understand the argument for going with Nvidia for DLSS and more RT performance, but that's balanced against the fact that AMD offers better raster performance than the Nvidia equivalent, and I think in a closed box like a console, more raster performance is worth more than more RT performance.
 
The huge problem with this thread is a complete lack of any attempt at a concrete counter. It's just wishy-washy 'what if'. I guess just as we have a 'predict the next gen' thread where people can post their theories with die areas and clocks and prices, we should engage in a retrospective similarly.

If you think there was a better alternative, spell it out. List suppliers, parts, and costs. Without that people are just trading dreams.
 
The only reason the PS5 sold less in Europe compared to the PS4 is availability during the first two years. You couldn't get a console without joining Telegram groups and hoping you were fast enough when a stock notification came in. People were starved of consoles for two years, and even with the better availability of the past year, some sales are simply lost; not everyone who wanted to buy one then will buy one immediately now.
Also, I forgot to mention: price. At this point in the lifecycle, the PS4 was up to 300€ cheaper. I think people like to attribute PS5 sales to exclusive games or graphics, but it's probably much simpler: changes in consumer spending are rarely driven by ideological reasons.
 
AMD was and still is the best choice for console gaming. They create cheap APUs (very important here) with good x86 CPUs and GPUs that are decent enough for such a closed box.
 
It's difficult to find the die area for each component now, but I've managed to find these:

  • Xbox Series X APU = 360.4 mm² (total) − 80 mm² (Zen 2) = ~280 mm² − the rest of the APU = GPU size (slightly above 200 mm²)
  • PlayStation 5 APU = ~305 mm² (total) − 80 mm² (Zen 2) = ~225 mm² − the rest of the APU = GPU size (below 200 mm²)
That's not a lot to play with. An RTX 2060 (TU106) is 445 mm² on the larger 12nm process; could that have been shrunk down enough on 7nm to work in the PS5/XSX?

And this is the thing: even if it could, it would offer noticeably less raster performance than the PS5/XSX actually have, at the cost of gaining DLSS, and honestly the RT wouldn't really be all that much better either.

At 5nm, an RTX 4060 is 159 mm², an RTX 4060 Ti is 188 mm², and an RTX 4070 is 294 mm².

I think getting an Nvidia GPU (with all the extra RT/tensor cores) into the die size they could have used was impossible, leaving AMD as the only real option for the GPU.
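A quick back-of-the-envelope version of the arithmetic above (the 80 mm² Zen 2 block and the ~1.8x 12nm-to-7nm density gain are rough assumptions for illustration, not official figures):

```python
# Rough console GPU area estimates (mm^2), using the figures quoted above.
ZEN2_BLOCK = 80.0          # assumed Zen 2 CPU block area inside each APU

xsx_apu = 360.4            # Xbox Series X APU total die area
ps5_apu = 305.0            # PS5 APU total die area (approximate)

xsx_gpu_budget = xsx_apu - ZEN2_BLOCK   # GPU + uncore budget, ~280 mm^2
ps5_gpu_budget = ps5_apu - ZEN2_BLOCK   # GPU + uncore budget, ~225 mm^2

# RTX 2060 uses the 445 mm^2 TU106 die on 12nm. Assume a very rough
# ~1.8x logic-density gain moving to 7nm (illustrative, not a foundry figure).
tu106_12nm = 445.0
tu106_7nm_est = tu106_12nm / 1.8        # ~247 mm^2, for the GPU alone

print(f"XSX GPU+uncore budget: ~{xsx_gpu_budget:.0f} mm^2")
print(f"PS5 GPU+uncore budget: ~{ps5_gpu_budget:.0f} mm^2")
print(f"TU106 shrunk to 7nm (est.): ~{tu106_7nm_est:.0f} mm^2")
```

Even with a generous shrink, the Turing die alone would eat most of what each console has for the GPU plus all the uncore, which supports the conclusion above.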

Intel Coffee Lake CPUs from 2018 with 8 cores were 180 mm², much larger than what AMD needed. I'm sure Intel could have stripped it down for use in an SoC, but you're talking about removing 100 mm² worth of transistors.

I can't see how not using AMD was a better option.
 