Forbes: AMD Created Navi for Sony PS5, Vega Suffered [2018-06] *spawn*

So why do it? Sticking with an architecture that means losing billions in sales in the PC space, for the sake of a semi-custom market that's not as big and could use another architecture anyway, makes little sense.
AMD's graphics R&D investments were kinda low between GCN1 and RDNA1.
 
So why do it? Sticking with an architecture that means losing billions in sales in the PC space, for the sake of a semi-custom market that's not as big and could use another architecture anyway, makes little sense.
One quick look at their financials will tell you they didn't lose billions with the graphics department. They made a lot less money than nvidia, but they made hundreds of millions nonetheless.

As to why, their R&D expenditure was anemic since they released Southern Islands. In these conditions, until 2017 they were probably 99% focused on Zen. There's a good chance the console semicustom deals paid for most of AMD's GPU development.
 
One quick look at their financials will tell you they didn't lose billions with the graphics department. They made a lot less money than nvidia, but they made hundreds of millions nonetheless.
I said they lost billions in sales, not lost billions of dollars. ;) By choosing semi-custom over the broader industry, they lost making billions from the wider industry.

AMD faced two choices:
1) Remain competitive in the PC and HPC spaces where they could make loads of money.
2) Focus more on servicing the semi-custom market, where they could make some money

Why choose option 2? Why not choose option 1, stay competitive, make more money, have better R&D, and service the semi-custom clients with whatever you're creating in your primary GPU business? The argument that AMD chose to sit on an old tech instead of competing for the more lucrative market makes no sense to me
 
Why choose option 2? Why not choose option 1, stay competitive, make more money, have better R&D, and service the semi-custom clients with whatever you're creating in your primary GPU business? The argument that AMD chose to sit on an old tech instead of competing for the more lucrative market makes no sense to me
With option 2, AMD gets money up front and pretty stable royalties (if that's how the contracts are structured).
They were financially struggling at that point, and although option 1 would have worked out better for them, they weren't in a position to do it and Zen together back then.
Basically, they couldn't fight on multiple fronts.
 
I said they lost billions in sales, not lost billions of dollars. ;) By choosing semi-custom over the broader industry, they lost making billions from the wider industry.

AMD faced two choices:
1) Remain competitive in the PC and HPC spaces where they could make loads of money.
2) Focus more on servicing the semi-custom market, where they could make some money

Why choose option 2? Why not choose option 1, stay competitive, make more money, have better R&D, and service the semi-custom clients with whatever you're creating in your primary GPU business? The argument that AMD chose to sit on an old tech instead of competing for the more lucrative market makes no sense to me

Risk.
 
With option 2, AMD gets money up front and pretty stable royalties (if that's how the contracts are structured).
They were financially struggling at that point, and although option 1 would have worked out better for them, they weren't in a position to do it and Zen together back then.
Basically, they couldn't fight on multiple fronts.

I think that was more rhetorical than anything. AMD didn't have a choice. They were cash strapped and limited in spending.

Their CPU designs were much further behind Intel than their GPU designs were behind NV. So the smart play was to focus on bringing the CPU division up to par if possible (they exceeded that) while keeping their GPUs at least minimally competitive (accomplished albeit at great cost - using larger dies to compete with smaller NV dies).

Also note that they've obviously suffered setbacks in GPU development as well as RDNA was scheduled to come out in 2018 a full year ahead of any rumored PS5 launch (2019).

Considering Forbes is the sole source for them focusing on semi-custom at the expense of their mainline GPU business...um...yeah.

The reality was that they didn't have many options. Converting their mainline GPU IP into semi-custom blocks for semi-custom solutions (like consoles) combined with minimal investment in GPU R&D (mainline PC) was likely an affordable way to continue bringing in GPU revenue while they worked to get a competitive CPU design out.

The fallout, of course, is that many of their GPU engineers were poached or left on their own to seek better opportunities while this was happening.

IMO, IF (HUGE IF) the Forbes article was true about AMD diverting most of their GPU resources towards consoles, or even more unlikely a single console, that makes losing so much engineering talent to focus on a very low-margin (console semi-custom) market extremely questionable WRT upper management.

I don't believe Lisa Su is that stupid or incompetent.

Although it would certainly be amusing if Sony paid for MS to have a better performing GPU. :D

Regards,
SB
 
I said they lost billions in sales, not lost billions of dollars. ;) By choosing semi-custom over the broader industry, they lost making billions from the wider industry.

AMD faced two choices:
1) Remain competitive in the PC and HPC spaces where they could make loads of money.
2) Focus more on servicing the semi-custom market, where they could make some money

Why choose option 2? Why not choose option 1, stay competitive, make more money, have better R&D, and service the semi-custom clients with whatever you're creating in your primary GPU business? The argument that AMD chose to sit on an old tech instead of competing for the more lucrative market makes no sense to me

Nope. It's not as simple as that.

With semi-custom, clients commission chips, which allows AMD far greater latitude than how the PC market works. You ask for a chip with certain features and performance levels, and AMD then bases pricing on R&D, manufacturing costs and volume. With MS and Sony that means AMD has two clients locked in for 7-8 years with anywhere from 40 to 120 million chip sales for each client. Profit margins aren't super fat, but they are there, guaranteed for the most part, and those clients are locked in for the long term.
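The pricing logic described above (R&D amortized over a guaranteed volume) can be sketched with a few lines of arithmetic. All figures here are hypothetical, chosen only to show why a locked-in volume makes the per-chip economics predictable:

```python
# Illustrative sketch (all figures hypothetical): a guaranteed semi-custom
# volume lets R&D be amortized into a predictable per-chip price, while a
# PC part has to amortize the same R&D over a volume that is only a guess.

def per_chip_price(rd_cost, unit_cost, volume, margin=0.15):
    """Price needed to recover amortized R&D plus manufacturing cost,
    with a modest margin on top."""
    amortized_rd = rd_cost / volume
    return (amortized_rd + unit_cost) * (1 + margin)

# Hypothetical console deal: R&D recovered over a locked-in lifetime volume.
console_price = per_chip_price(rd_cost=300e6, unit_cost=90.0, volume=80e6)

# Hypothetical PC part: same R&D, but the volume is hoped for, not guaranteed.
pc_price = per_chip_price(rd_cost=300e6, unit_cost=90.0, volume=10e6)

print(f"console: ${console_price:.2f}   pc (if 10M actually sell): ${pc_price:.2f}")
```

The point of the toy numbers: at guaranteed console volumes the R&D adds only a few dollars per chip, whereas the PC part carries a much larger R&D burden per unit, and that burden balloons further if the hoped-for volume doesn't materialize.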

In the PC market, AMD goes through R&D investment, places mass production orders and then hopes that its products can compete well enough to sell at profitable margins. AMD competes with Nvidia in the discrete market while competing with Intel in markets where IGPs are applicable. AMD is continually working to win contracts for each new year's iterations of products (laptops, desktops, etc.). Furthermore, AMD has to iterate every 2-3 years and revamp parts of its lineup with no guarantee that the ASP will cover all the effort.

If you have the clients, the semi-custom business is like a leisurely jog through the park, while the PC market is more like a sprint through an obstacle course.
 
If you have the clients, the semi-custom business is like a leisurely jog through the park, while the PC market is more like a sprint through an obstacle course.
If you want to stay a small player, sure. Is that AMD's plan? To deliberately play second place because it's easier and safer, and make much less money? Do they really doubt their ability to be competitive with nVidia and have to cross their fingers and hope they can compete, rather than engineer a good platform as they used to?
 
If you want to stay a small player, sure. Is that AMD's plan? To deliberately play second place because it's easier and safer, and make much less money? Do they really doubt their ability to be competitive with nVidia and have to cross their fingers and hope they can compete, rather than engineer a good platform as they used to?

You don't have to be a small player by using the strategy AMD uses.

You couldn't guarantee that AMD would be more competitive in the PC space if it didn't maintain a semi-custom division. AMD would be footing the bill for all of RDNA, as it would be investing in a new arch and newer, costly nodes without any help. AMD is going to pull in 4-6 million chip sales off the new consoles this year. Could Navi make up that volume if MS and Sony didn't use AMD tech? Probably not.

The semi-custom business allows AMD to justify greater investment in its GPU business, because console GPUs actually produce revenue and profit that AMD sees, not some unrealized revenue and profit if only AMD could cut into Nvidia's and Intel's IGP market share.
 
I'm unconvinced. GCN has evolved but just hasn't had a name change. GCN stopped at GCN 5, which includes things like scheduling and tessellation changes. The core GCN architecture is the SIMD CUs and wavefronts, so we're counting two core architecture changes: VLIW and GCN. In that time, hasn't nVidia effectively had one arch, the CUDA core? So nVidia introduced CUDA with Tesla and has stuck with it, and AMD has used GCN. nVidia has named its different CUDA-based generations with different family names, whereas AMD has just named theirs GCN x.

Is there really a difference in behaviour? Both have a long-term architectural DNA as the basis for their GPUs, with refinements happening in scheduling and features across the evolution of that core DNA.

From the standpoint of the ISA, Nvidia's had one major transition when going from Fermi to Kepler. The former was described as a more CISC-like architecture with reg/mem ALU operations, and the microarchitecture had an operand collector, register scoreboarding, and hot-clocked ALUs. Kepler's ISA became load/store and exposed ALU register dependences to static scheduling by the compiler.
The hardware lost the operand collector, scoreboarding for ALU ops, and separate clock domains for the SIMD hardware.
Maxwell introduced the register operand reuse cache, which was a software-visible control for result forwarding.
Volta made a change to the threading model, and also introduced a broader encoding change where each instruction is 128 bits and tracks its own dependence and wait cycle information, versus the way prior generations had control words that governed small packets of instructions. The tensor units introduced a different set of memory and scheduling concerns, and the RT cores are new functionality. There's a little-discussed addition to the architecture for uniform operations, which seem to indicate dedicated data paths for some of the warp-wide value calculations that might align with one of the uses of the scalar unit in GCN.
Nvidia has also gone back and forth in its implementation on the L1 vs shared memory location, the use(s) of the texture cache, how the L1's cache policy works, and other architectural changes outside the SMs.
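The Fermi-to-Kepler shift described above (hardware scoreboarding replaced by compiler-encoded wait cycles) can be caricatured in a few lines. This is a toy model, not real SASS semantics; latencies and the instruction format are made up for illustration:

```python
# Toy contrast (not real SASS): dynamic scoreboarding vs. compiler-inserted
# stall counts, for a dependent ALU chain with a fixed 3-cycle result latency.

LATENCY = 3  # hypothetical cycles before a result is usable

# A "Fermi-style" core consults a scoreboard each cycle and stalls dynamically.
def run_dynamic(instrs):
    ready = {}   # register -> cycle its value becomes available
    cycle = 0
    for dst, srcs in instrs:
        start = max([cycle] + [ready.get(r, 0) for r in srcs])
        ready[dst] = start + LATENCY
        cycle = start + 1
    return cycle

# A "Kepler-style" core trusts stall counts the compiler encoded per instruction;
# the hardware just waits the stated number of cycles, no scoreboard needed.
def run_static(instrs_with_stalls):
    cycle = 0
    for (dst, srcs), stall in instrs_with_stalls:
        cycle += stall + 1
    return cycle

chain = [("r1", []), ("r2", ["r1"]), ("r3", ["r2"])]
# The compiler, knowing LATENCY, pre-computes the same stalls the scoreboard
# would have discovered at runtime, so both models finish at the same cycle.
stalls = [0, LATENCY - 1, LATENCY - 1]
assert run_dynamic(chain) == run_static(list(zip(chain, stalls)))
```

The trade-off the toy illustrates: the static scheme removes scoreboarding hardware, but only works because ALU latencies are fixed and known to the compiler, which is part of why the change was feasible for Kepler's ALU pipeline.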

GCN has added subsets of instructions, and moved things around a bit. The overall encoding has stuck to similar themes, and RDNA does a fair amount to align with it--although a point of churn with opcodes is that it reversed the unexplained remapping Vega did to the vector ISA back to something more similar to earlier GCN.
I'd say the pipeline execution loop, register file, caches, LDS, threading model, and ISA philosophy had been pretty consistent until RDNA changed some of them.

Perhaps some of the debate with ISA changes is how much an alteration should be considered a "change" in terms of semantics. Is altering a few bits for all instructions while they behave similarly a big change versus a small addition of new ops? Both vendors have had this sort of ISA shift, which Nvidia abstracts more with its CUDA layer.


Nope. It's not as simple as that.

With semi-custom, clients commission chips, which allows AMD far greater latitude than how the PC market works. You ask for a chip with certain features and performance levels, and AMD then bases pricing on R&D, manufacturing costs and volume. With MS and Sony that means AMD has two clients locked in for 7-8 years with anywhere from 40 to 120 million chip sales for each client. Profit margins aren't super fat, but they are there, guaranteed for the most part, and those clients are locked in for the long term.
The predictable volume also avoids AMD's historically poor handling of inventory, channel vs OEM balance, and product mix, although recent times have shown AMD isn't alone. The most recent crypto glut affected Nvidia as well, though I recall Nvidia hinted that part of its difficulty clearing the channel was related to some unnamed competitor's significant glut.

Semi-custom's more consistent volume does carry risk for AMD, if its cost improvements cannot keep up with the guaranteed reduction in payment schedules over time.
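That risk can be made concrete with a quick sketch. The numbers below are hypothetical (a contract price that steps down a fixed percentage each year, against a per-chip cost AMD must keep shrinking); the point is just what happens when the cost decline is slower than the contracted price decline:

```python
# Hypothetical illustration of the semi-custom risk above: if the contracted
# per-chip price drops faster than AMD can cut its own per-chip cost, the
# margin erodes year over year even though volume stays guaranteed.

def margin_by_year(start_price, price_decline, start_cost, cost_decline, years=7):
    """Per-chip margin for each year of a multi-year contract."""
    price, cost = start_price, start_cost
    margins = []
    for _ in range(years):
        margins.append(price - cost)
        price *= (1 - price_decline)   # contracted annual price step-down
        cost *= (1 - cost_decline)     # what AMD manages to shave off costs
    return margins

# Price falls 10%/yr by contract, but costs only improve 8%/yr.
m = margin_by_year(start_price=100.0, price_decline=0.10,
                   start_cost=80.0, cost_decline=0.08)
print([round(x, 2) for x in m])   # margin shrinks every year of the contract
```

With these made-up rates the margin erodes steadily across the contract; flip the two decline rates and it would grow instead, which is exactly the bet a semi-custom supplier is making.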

In the PC market, AMD goes through R&D investment, places mass production orders and then hopes that its products can compete well enough to sell at profitable margins. AMD competes with Nvidia in the discrete market while competing with Intel in markets where IGPs are applicable. AMD is continually working to win contracts for each new year's iterations of products (laptops, desktops, etc.). Furthermore, AMD has to iterate every 2-3 years and revamp parts of its lineup with no guarantee that the ASP will cover all the effort.

If you have the clients, the semi-custom business is like a leisurely jog through the park, while the PC market is more like a sprint through an obstacle course.

Having a development pipeline sized for a stroll also means not having the robustness to take advantage of new opportunities or recover from missteps. Nvidia was able to leverage its leadership into professional, HPC, AI, and automotive. While some of this is still speculative or now facing more competition, several of them would be massive revenue streams for AMD if it weren't strolling so far behind.
The software gap in particular is large enough that AMD has no choice but to acknowledge its subordinate position to Nvidia's initiatives in compute, and this anemic support model flows through all its products. Which, I suppose, is in large part the other weakness semi-custom can offload onto the customer.
 
I said they lost billions in sales, not lost billions of dollars. ;) By choosing semi-custom over the broader industry, they lost making billions from the wider industry.

AMD faced two choices:
1) Remain competitive in the PC and HPC spaces where they could make loads of money.
2) Focus more on servicing the semi-custom market, where they could make some money

Why choose option 2? Why not choose option 1, stay competitive, make more money, have better R&D, and service the semi-custom clients with whatever you're creating in your primary GPU business? The argument that AMD chose to sit on an old tech instead of competing for the more lucrative market makes no sense to me
As mentioned above, the semi-custom markets are a much more predictable source of income. Especially after having landed both consoles, no matter which one "wins" AMD was always going to sell well over 100 million chips over the course of ~7 years at least. After AMD gets the design win for the 2013 Xbox, no one is competing with AMD for the 2014, 2015 and 2016 Xbox.

Remaining competitive in the PC and HPC GPU spaces isn't remotely as linear. AMD could increase its R&D budget by 10x and nvidia could still one-up them. Even assuming more R&D expenditure would automatically mean a linear increase in GPU performance/watt and performance/die-area, they could even "win" the best PC GPUs of 2015 and still be unable to gain marketshare due to less-embedded developer and other B2B relations, and in 2016 they'd need to have another fight with refreshes, and in 2017 another fight with new nodes, etc.

Sure, there might be much more money to be made in PC and HPC, but the stakes are higher too. And a company like 2013-2018 AMD might not have been in a position to absorb further failures.
 
You couldn't guarantee that AMD would be more competitive in the PC space if it didn't maintain a semi-custom division.
No-one's suggesting the semi-custom market isn't good for AMD. The argument is that AMD is deliberately not advancing its GPUs so it can service that semi-custom market, and it is being led by that semi-custom market, servicing the other GPU industries with console-part designs, instead of creating the best possible GPUs and providing flavours of those for its semi-custom clients.

I understand the logic, but I just don't believe AMD, with their history and being the only other player in the GPU space, would choose not to compete. It goes completely against standard international business practice. Investor calls saying, "this year, we'll be avoiding risk and playing it safe again, so you won't see much growth on your investment." AGMs where the party ends with everyone holding aloft their glasses and cheering, "we're number two! We're number two!"

All the problems talked about AMD having - software stack and wider reach - are solved by having more money and competing better.
 
Ultimately, competition in a capitalist way of thinking is a displacement race (destruction derby?) toward monopoly, much like the arms race during the Cold War was an attempt to drain the USSR of resources.

One possible argument is that AMD surrendered their arms, realizing that it cost them too many resources to keep their tech within striking distance of the competitor's high-end. I'm just not sure right now if I'd follow that line of thought to the end.
 
She’s not stupid. The console market isn't even a consideration for NV; it's a side project for AMD.

But an important one. They get a cash injection to develop new GPU IP with, they get a steady revenue stream for years, and they get games developed for their hardware.

The last part is important; look how Nvidia influenced game development from 2002 onward with their "The Way It's Meant to be Played" program, where Nvidia offered great support to game developers in exchange for promoting features Nvidia did particularly well, like tessellation, small polys etc. Instead, games have been developed and optimized on AMD platforms first since late 2013. On the software side of things it allowed AMD to push Mantle, which was transformed/standardized into Vulkan, rendering Nvidia's huge advantage in DX11 moot.

If AMD hadn't been in consoles, they would have shut their GPU division by now.

Cheers
 
It's almost like Cerny wanted to comment on this thread.
From yesterday's presentation, at 25m08s:

"
First, we had an AMD custom GPU based on their RDNA2 technology. What does that mean?
AMD is continuously improving and revising their tech. For RDNA2 their goals were, roughly speaking, to reduce power consumption by rearchitecting the GPU to put data close to where it's needed, to optimize the GPU for performance, and to add a new, more advanced featureset.
But that featureset is malleable, which is to say that we have our own needs for Playstation, and that can factor into what the AMD roadmap becomes.
So collaboration is born. If we bring concepts to AMD they're felt to be widely useful, then they can be adopted into RDNA2, and used broadly, including in PC GPUs. If the ideas are sufficiently specific to what we are trying to accomplish, like the GPU cache scrubbers I was talking about, then they end up being just for us.
If you see a similar discrete GPU available as a PC card at roughly the same time as we release our console, that means our collaboration with AMD succeeded on introducing technology useful in both worlds. It doesn't mean that we at Sony simply incorporated the PC parts into our console.
"

To summarize:
- Sony doesn't just adopt whatever AMD had been working on in a vacuum. The technical collaboration goes deeper than that;
- Ideas coming from Sony do result in features that end up in PC GPUs from AMD;
- He's kind of hinting at a RDNA2 GPU with the same layout as the PS5's GPU coming out this year..? Future midrange card?
 
To summarize:
- Sony doesn't just adopt whatever AMD had been working on in a vacuum. The technical collaboration goes deeper than that;
- Ideas coming from Sony do result in features that end up in PC GPUs from AMD;
- He's kind of hinting at a RDNA2 GPU with the same layout as the PS5's GPU coming out this year..? Future midrange card?

All those points are how it always has been for consoles, i think/hope?
 
To summarize:
- Sony doesn't just adopt whatever AMD had been working on in a vacuum. The technical collaboration goes deeper than that;
- Ideas coming from Sony do result in features that end up in PC GPUs from AMD;
- He's kind of hinting at a RDNA2 GPU with the same layout as the PS5's GPU coming out this year..? Future midrange card?
Of course AMD adopts good ideas their customers come up with, if they see them as fitting the architecture and doable at all; but that's it: ideas, not IP blocks or whatever. Just like they adopt ideas anyone else in the industry comes up with if they're suitable and doable; DICE no doubt influenced some design choices, even on architecture development, when they were doing Mantle with AMD.
There won't be. If AMD releases a 36 CU RDNA2 part, they'll have built it from the same blocks they used to build the PS5's GPU (and the XSX's GPU and every other RDNA2 GPU), but they can't just copy/paste the layout (from either one); you start from scratch.
 