Intel Kaby Lake + AMD Radeon product *spin-off*

  • Thread starter Deleted member 13524
I wonder whether this project started before Ryzen CPU cores were final. Nowadays you have to ask whether it would be simply easier for AMD to integrate Ryzen cores with Vega GPU... Which they did with Raven Ridge. They also now have HBM2 GPU products in Vega 56/64. Why didn't AMD simply do a Raven Ridge APU with scaled up GPU + HBM2?
I think there's a number of factors that might go into this.
AMD's architectures, HBM2, market position, and current 2.5D integration tech leave it little room to make a practical client APU at this scale.
AMD's situation has it trying to stretch a limited set of silicon designs over as broad a range of segments/volumes as it can. Raven Ridge is already the size of a Ryzen die, which is as far as AMD is willing to go for a die that ranges from the R3 to EPYC price bands.

Raven Ridge would continue existing for volume, cost, and likely power reasons. That would leave AMD having to create a big APU with non-shared engineering, masks, package/socket, interposer, and HBM2.
The cost and complexity adders would not be acceptable at Raven Ridge's physical footprint. And since Raven Ridge exists, AMD would be left using its big chip and HBM2 for limited volume, with more complicated platform and product-mix management and probably an inferior return on investment for a big, niche die.

Intel thinks it can make a somewhat equivalent package product, but this looks like it won't be at a volume AMD needs or price it can command--and that's with integration technology that seems more practical for this space, barely.


A custom deal likely means more of the up-front cost AMD could not afford was paid, in exchange for less money per chip in the long run. A non-standard socket, the risk of a product flop, AMD's history of flubbing product mixes, package integration, HBM2 procurement (maybe?), and other risk factors become mostly Intel's problem.

Long run, a somewhat less integrated solution should have been possible with Raven Ridge and a Vega discrete, so long as xGMI or a variant for the Infinity control fabric were available to help share power management. While not quite on the level of this solution, it could have overlap without too much specific investment in the niche.
It's not clear that AMD made any provision, which may mean those methods are not ready yet.
Long-run, AMD may need to figure out how it plans to make 2.5D or 3D integration more practical. I'm not sure Intel needed to give much insight into EMIB, and AMD isn't vertically integrated in a way that would let it implement such a packaging method on its own.

The only thing that would make sense is that Apple ordered this chip for the next MacBooks. They already use Intel CPUs and AMD GPUs. Integration saves cost, space, and power. Intel still has slightly better single-thread performance and a process advantage, so it makes sense that this is built on top of their tech. This would also explain the secrecy around the project and the lack of news about launch products. Apple doesn't like talking about its products before launch.
If it's customized for Apple, it's also possible there are other tweaks specific to that product that Apple wouldn't want discussed, maybe.
 
Can you really not see the benefit for AMD CPU division working closely with Nvidia in the HPC/AI/scientific/analytics market?
I guess it was pure luck that IBM, together with Nvidia, holds some of the largest supercomputer contracts for 2017/2018.
There's an alternate universe in which AMD merged with Nvidia instead of ATI and put Jensen Huang as the new CEO after the merger.
In that alternate universe AMD is currently a much larger and influential company than it is in our universe.
 
Why do you think that?

Cheers
Other factors slipped my mind and one has just come back.
I doubt Intel would have been so aggressive and negative in the briefings attacking AMD's CPU products if it were worried about AMD/Radeon squeezing them on pricing for this semi-custom part; the negotiations were probably in a similar timeframe.
From what I understand it did raise some eyebrows just how far Intel went with those briefings against AMD's CPU launch and products.
 
Kyle Bennett was right, so what about the rest of his thoughts?

https://www.hardocp.com/article/2016/05/27/from_ati_to_amd_back_journey_in_futility

Where the plot thickens is when you look at the Koduri’s unwavering ambition. Koduri’s ultimate goal is to separate the Radeon Technologies Group from its corporate parent at all costs with the delusion that RTG will be more competitive with NVIDIA and become a possible acquisition target for Koduri and his band of mutineers to cash in when it's sold. While Koduri is known to have a strong desire to do this by forging a new relationship with Apple on custom parts (no surprise there) for Macbooks, the real focus is on trying to become the GPU technology supplier of choice to none other than Intel. While this was speculated some time ago I can tell you with certainty that a deal is in the works with Intel and Koduri and his team of marauders working overtime to get the deal pulled into port ASAP. The Polaris 10/11 launch, and all of its problems, are set to become a future problem of Intel’s in what RTG believes will be a lucrative agreement that will allow Koduri and his men to slash the lines from Lisa Su and the rest of AMD.

With Intel in the midst of a shake-up under their new chief product guy, Murthy Renduchintala, the consequences of this agreement with AMD and subsequently the Radeon Technologies Group are significant. In the midst of Intel’s most significant layoffs in recent memory, looking to be around 12,000 positions, it has let go a significant number of graphics engineers and related functions (I am told well over 1,000) in anticipation that Intel will hand over many of these functions to AMD. This is a hell of a bet for new executive "Murthy" Renduchintala while under fire to take on an unproven team with a track record of false starts and missed engineering milestones. Whether Murthy will come to his senses before it’s too late remains to be seen. This is not ATI circa 2006, when Murthy (formerly at Qualcomm) bought a talented group of engineers formerly of ATI’s Imageon business. Murthy must know something we don’t about this team and for Intel’s sake I hope he’s right, because to us, it seems to be a bad bet.
 
I think a lot of it depends on whether Intel is willing to keep shoveling cash into AMD's coffers by ordering a new semi-custom part every year or so. What works for consoles with their 5-year refresh cycles will have nowhere near that shelf life in the PC realm.
 
Why would it be all that expensive? One single stack of HBM without the interposer, is that really going to be so costly? And Intel seems to think its silicon bridge tech is pretty cost-effective as well, as they're essentially promoting it as the way of the future (after slagging off EPYC as a "glued-together" processor... :LOL:)

The appeal would be power-efficient, space-efficient applications, such as thin performance-class notebooks, where a traditional solution would occupy significantly more board space and slurp more juice as well.
Yeah, it really depends on how Intel prices it. I figure it's the elite of the elite for them and they are going to make you pay but good for it. I suppose I am just not very excited by mediocre mini gaming for big bucks anymore. I need a bigger screen, and am good with saving money while also getting much more speed from a traditional discrete configuration.

I don't think it would work for Surface Book. That would seem like a place for it but you wouldn't want a 35-45W configuration all in the tablet.
 
I don't think it would work for Surface Book. That would seem like a place for it but you wouldn't want a 35-45W configuration all in the tablet.
35-45W is the power consumption of Intel's H CPUs alone, so this definitely has to be more.
Even with the GT2 GPU disabled, the CPU cores will still probably push some 25W, and this Radeon is likely to push some 35W more.
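For what it's worth, that back-of-the-envelope budget can be written out explicitly. This is just a sketch of the reasoning above; all wattages are this post's rough estimates, not measured figures, and the helper function is purely illustrative:

```python
# Rough power-budget sketch for the Kaby Lake + Radeon package.
# All wattages are the thread's estimates, not measured figures.

def package_power(cpu_cores_w: float, gpu_w: float, other_w: float = 0.0) -> float:
    """Sum per-component estimates into a package-level figure."""
    return cpu_cores_w + gpu_w + other_w

cpu_est = 25.0     # H-class CPU cores with the GT2 iGPU disabled (estimate)
radeon_est = 35.0  # on-package Radeon GPU + HBM2 (estimate)

total = package_power(cpu_est, radeon_est)
print(f"Estimated package power: {total:.0f} W")         # prints 60 W
print("Fits a 35-45 W H-series budget:", total <= 45.0)  # False
```

With those numbers the package lands around 60 W, which is why a 35-45 W H-series TDP alone cannot cover it.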

I think this is probably going to perform between notebook GTX1050 Ti and GTX 1060 Max-Q, while the integration bonus (shared power rails) + HBM2 might put it with a power consumption similar to a Core H + GTX 1050 -> 1050 Ti.

We have yet to see exactly how many CUs there will be, but if there are 24 NCUs at 1.2GHz in the higher-end model as seen in some of the leaked examples, I think this is what to expect.
Though the size of the GPU in Intel's promo screenshot seems to point towards >200mm^2, which makes me question those 24 NCUs.

Has anyone tried to calculate the GPU die size using the HBM2 stack at the far left as comparison?
Also with the Kaby Lake H at the right which is 123mm^2. The GPU seems to be twice as big.
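For what it's worth, that scaling estimate is easy to sketch. The reference areas below are assumptions (the ~7.75 mm x 11.87 mm footprint commonly cited for HBM2 stacks, plus the 123 mm^2 Kaby Lake H figure from this thread), and all pixel measurements are hypothetical placeholders to be replaced with values measured off the actual promo image:

```python
# Estimate an unknown die's area from a photo by scaling against a
# neighbouring part of known size. Assumes a square-on shot with no
# perspective distortion.

HBM2_STACK_AREA = 7.75 * 11.87  # mm^2; commonly cited HBM2 package footprint
KABY_LAKE_H_AREA = 123.0        # mm^2; die area quoted in this thread

def die_area_from_reference(die_px_w, die_px_h, ref_px_w, ref_px_h, ref_area_mm2):
    """Scale a die's pixel footprint by a reference object of known area."""
    mm2_per_px2 = ref_area_mm2 / (ref_px_w * ref_px_h)
    return die_px_w * die_px_h * mm2_per_px2

# Hypothetical pixel measurements (placeholders, not taken from the image):
gpu_mm2 = die_area_from_reference(310, 240, 150, 220, HBM2_STACK_AREA)
print(f"GPU die estimate: {gpu_mm2:.0f} mm^2")
print(f"Ratio to Kaby Lake H die: {gpu_mm2 / KABY_LAKE_H_AREA:.2f}x")
```

Running it with real measurements from the promo shot would show whether the GPU really lands above 200 mm^2, i.e. roughly twice the 123 mm^2 CPU die.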
 

Well, I was going to comment that corporate games are an unfamiliar field to me, and wonder what such a scenario would even look like. Koduri's sabbatical was ambiguous as to what it could mean, although right now RTG is directly controlled by Su.
I wasn't under the impression that RTG was separate enough to itself be cashed out in order to make the claimed payday. I could see getting paid to come aboard Intel, and perhaps make conditions conducive for some of the key teams to come along.
This recent deal does not seem to be freeing up the IP to follow, and Intel has restructured further since last year.

However, per https://forum.beyond3d.com/posts/2009501/ , the literal claim that Koduri could take RTG with him doesn't seem like it will happen.

https://videocardz.com/73921/raja-koduri-leaves-amd

With that narrative, does this mean the gamble was lost?
 
Whoah.

Well, the writing was kind of on the wall (when a high-ranking guy suddenly takes a long timeout it's usually a sign they're being handed their hat), but still... Whoah!

Koduri seemed to me (who doesn't really have enough of a clue to truly judge him ;)) to be a competent and dedicated guy. I hope RTG can find someone at least equally well equipped to replace him.

With that narrative, does this mean the gamble was lost?
Doesn't seem to me there ever was a gamble, from reading his farewell letter. Or if there was, he's the most humble loser of all time (or most hypocritical :p) after power-struggling with his boss and failing...
 
Whoah.

Well, the writing was kind of on the wall (when a high-ranking guy suddenly takes a long timeout it's usually a sign they're being handed their hat), but still... Whoah!

Koduri seemed to me (who doesn't really have enough of a clue to truly judge him ;)) to be a competent and dedicated guy. I hope RTG can find someone at least equally well equipped to replace him.
And perhaps see if there's further attrition/poaching, after a polite delay most likely.
At least some of the narrative about a rogue RTG may have changed or wasn't accurate. As separate as the CPU and GPU initiatives were, there were documented cases of IP cross-pollination in Vega, as indicated by the slides on the GPU's register files.

Doesn't seem to me there ever was a gamble, from reading his farewell letter. Or if there was, he's the most humble loser of all time (or most hypocritical :p) after power-struggling with his boss and failing...
In that regard, I think that in executive circles even acrimonious departures don't appear that way in official announcements. There's just a boilerplate statement about exploring new opportunities or spending time with family, etc. Something to that effect is in the letter, which could be true. It could also be viewed in light of Koduri's earlier letter, where it was intimated that when he came back it would be with reduced operational/administrative duties, refocusing on technical matters.

An executive that flagrantly burns an employer and hurts the company valuation is going to have a harder time getting a similar position after demonstrating such intemperance.
 
Even acrimonious departures are often painted differently, yes, but those tend to be more formal and reserved, and not quite so gushingly positive about the company and its leadership. This guy practically stumbles over his feet to praise AMD/Su... :)

I dunno, I can't know for sure, but I doubt he really tried to rebel, like OCP claimed. Surely he would have been fired outright in that case?
 
I would think it's usually within the CEO's power. The whole scenario does involve giving the group and its leader a lot of leverage or self-determination, seemingly over things I wouldn't expect a subordinate division to have free rein over.

Clashes can be tolerated if the disruption or loss of some potential upside is significant, but there's little to go on.
 
Even acrimonious departures are often painted differently, yes, but those tend to be more formal and reserved, and not quite so gushingly positive about the company and its leadership. This guy practically stumbles over his feet to praise AMD/Su... :)

I dunno, I can't know for sure, but I doubt he really tried to rebel, like OCP claimed. Surely he would have been fired outright in that case?
VPs can be quite contentious when it comes to their business, and like I said earlier, even before this announcement one would hope the CEO manages the VPs' strategies and business decisions/contracts for the company as a whole, rather than letting one division fight another in ways that may end up being detrimental.
I still feel his strategic focus was the Radeon division over AMD as a whole, possibly to the detriment of anything outside the Radeon group. TBH, if he did go that route I have a bit of sympathy when thinking only of the GPU division, as one could say the Radeon division never received its fair share of resources and budget. But he also needed to be a realist: AMD cannot fully finance both divisions at the same time.

Going back a while, there have always been rumours or murmurs that all was not happy between the divisions. The more recent and more public example was when Radeon teamed up with Intel on a special CPU+GPU bundle just before Ryzen launched; in response, the VP of the CPU division went out of his way in his official announcement to showcase Ryzen's performance in a certain game, repeatedly, with an Nvidia Titan X :)
Normally a VP would not repeatedly mention a competitor (albeit indirectly, as it reflects more on the GPU division) in such an announcement, but then normally another division would not be doing bundles with the dominant CPU competitor just before a critical launch of your own CPU products.
There have been other things, but I would need to dig through some of what others have said and some public decisions.
 
If he has joined Intel then I have the nagging feeling AMD is going to lose some influential engineers now, as he will ask them to follow him; our company lost around 60% of an engineering and R&D team working on a specific technology to Microsoft, who we work with in this way.
Another possible domino effect (a very big IF, but it would offset a little of the negative impact) is that more Nvidia engineers could move to AMD afterwards to fill the space. Ironically, the Radeon gaming VP who did the Intel announcement (I assume because Raja is not there) came from Nvidia himself, as did, around 16 months back, an influential engineer from Nvidia's driver team; it is interesting that AMD has improved in this area since.
It would be interesting to know how many engineers have now worked at both companies. Logically there is staff movement between them in both directions, with some also going from AMD to Nvidia, but that is a different story from losing a specifically targeted set of engineers to a senior engineer and important team leader/VP who joins another tech company one is working closely with on a project (that is what happened to us).
 
35-45W is the power consumption of Intel's H CPUs alone, so this definitely has to be more.
Even with the GT2 GPU disabled, the CPU cores will still probably push some 25W, and this Radeon is likely to push some 35W more.

The information so far correlates very closely with the earlier leaks.
https://tech4gamers.com/wp-content/uploads/2017/04/intel-kaby-lake-g.jpg

It has GT2 graphics on die, as the table indicates. The leaks also said it uses PCIe x8 for connecting the CPU to the GPU, which makes sense considering they aren't using a custom die tailored for this product. The package size is 58.5mm x 31mm, which seems to fit with the promo shots they are showing.

65W for one version and 100W for the other. The TDP is exactly what the NUC roadmap said.
http://nucblog.net/wp-content/uploads/2017/09/Intel-NUC-Roadmap-2018-2019.png

35-45W wouldn't make sense for a product that is rumored to score 13-14K in 3DMark11, comfortably faster than a 1050 Ti. The GPU portion would need the extra TDP to do that. They even call the 100W version Hades Canyon VR; the performance has to be there for it to be aimed at VR. The 65/100W NUCs even support 6 displays, more than the other models without the "discrete" graphics, probably because these models have two GPUs to power them: one integrated with the CPU die and another on the same package.

One thing that bothers me is that the leaks are showing 694 for the GPU model. That is Polaris, isn't it?

However, it is fair to say AMD would have greater margins if it actually competed in this high-margin space. AMD has destroyed the MSRP of Ryzen (went too far IMO, as it is probably their most viable product), impacting margins, and yet is still pretty profitable from it.

When decisions are made that seem questionable to outsiders, it's probably due to internal politics. I would assume HardOCP's article, even if it doesn't pan out exactly as described, was correct at some point.

el etro said:
I'm interested to see how Vega performs electrically/thermally in a true HP process, unlike SS 14LPP. Results must be good.

The GPU is not on an Intel process. It's whatever AMD GPUs are using.
 
So not a mobile SKU, but meant for all-in-ones (iMac)

How disappointing.

Cheers
Too early to say what the scope of the design is; the announcement is loose enough to cover a diverse range of options, and importantly it doesn't say how long this ties Intel and Radeon together for the project, or whether it applies at all to revisions.
 
The Great Leap Forward of AMD’s stock price started about 2 years ago. RSUs vest over 4 years. It’s unlikely that many engineers would leave unvested pots of gold for Intel right now.
This is even more true for Nvidia, of course.

(Also: it’s still Intel that you’ll be working for.)
 
The Great Leap Forward of AMD’s stock price started about 2 years ago. RSUs vest over 4 years. It’s unlikely that many engineers would leave unvested pots of gold for Intel right now.
This is even more true for Nvidia, of course.

(Also: it’s still Intel that you’ll be working for.)
Well, they left our company while the stock price was strong and the outlook for stock options looked great (it has made quite a few millionaires from stock options). Also, by that logic, why would one leave Nvidia when it was going from strength to strength 1-2.5 years ago to join an AMD seen as being in a weaker position? Some influential Nvidia engineers did just that.
Hindsight is one thing, but no one back then could have said AMD's stock price would strengthen as it did while engineers were still joining.
Stock options are not the primary factor in whether such engineers stay or go, as they receive decent options at whichever top tech company they join; the project opportunities, challenges, R&D, and who they'd be working with are what can drive many of them beyond the comfortable salary they would receive anyway at any of these major tech companies.
Context being that we are talking about a select set of engineers here, which Raja IMO will be looking to target to strengthen his new team/R&D, IF it is true he is joining Intel.
 