Intel Kaby Lake + AMD Radeon product *spin-off*

  • Thread starter Deleted member 13524
I would not call it tempting without hard data to evaluate.

Well, speculatively tempting. It's not hard to roughly estimate what 4 Zen cores and 11 Vega CUs can do. But yeah, I'd definitely wait for reviews before buying anything.
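For a rough sense of scale, here is a back-of-the-envelope peak-FP32 sketch; the clock speeds are illustrative assumptions, not confirmed specs:

```python
# Back-of-the-envelope peak-FP32 estimate for a hypothetical
# 4-core Zen + 11-CU Vega APU. Clocks are assumed placeholders,
# not confirmed product specs.

def gpu_peak_gflops(cus, clock_ghz, shaders_per_cu=64, flops_per_shader=2):
    """Peak FP32 GFLOPS: CUs x 64 shaders x 2 FLOPs/clock (FMA) x clock."""
    return cus * shaders_per_cu * flops_per_shader * clock_ghz

def cpu_peak_gflops(cores, clock_ghz, flops_per_cycle=16):
    """Zen: up to 16 FP32 FLOPs/cycle/core (2x 128-bit FMA pipes)."""
    return cores * flops_per_cycle * clock_ghz

print(f"GPU: ~{gpu_peak_gflops(11, 1.1):.0f} GFLOPS")  # ~1549 at 1.1 GHz
print(f"CPU: ~{cpu_peak_gflops(4, 3.0):.0f} GFLOPS")   # ~192 at 3.0 GHz
```

On paper that lands somewhere between an RX 550 and an RX 560, assuming the clocks hold, which is exactly the caveat reviews need to settle.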
 
Well, speculatively tempting. It's not hard to roughly estimate what 4 Zen cores and 11 Vega CUs can do. But yeah, I'd definitely wait for reviews before buying anything.

I'm not worried about what it can do, but about battery life. That will also be dependent on the battery it comes with, so without reviews of specific models, no purchase.

EDIT - In any case we seem to be using the wrong thread :/
 
Furthermore, Intel will now be doing their own discrete GPUs, meaning this semi-custom deal with AMD won't be a long one.
If Intel comes up with their own GPU family in 3-4 years, that's how much time AMD has to develop a high-performance APU within the 50-100W range.
Is there any hard evidence of Intel making high performance parts? I've seen lots of news outlets running with it, but upon inspection of Intel's statements it only reads as integrating discrete graphics chips. Not necessarily their own and Raja appears to have taken up edge computing, not graphics.

The last time Intel made their own graphics solution they got sued and lost. I don't see that changing. They may be much better off sourcing semi-custom chips for integration along with their own FPGAs or other coprocessors as required.

One fundamental difference between Radeon and the CPU division is that information control is tighter within the CPU division; in other words, we really do not know much about the CPU division's unannounced product/R&D strategy.
That may just be an inherent difference in products as GPUs cycle much more quickly. CPUs adopt features much more slowly. Just look at all recent Intel CPUs and their gains over the past years/generations.
 
Is there any hard evidence of Intel making high performance parts? I've seen lots of news outlets running with it, but upon inspection of Intel's statements it only reads as integrating discrete graphics chips. Not necessarily their own and Raja appears to have taken up edge computing, not graphics.

The last time Intel made their own graphics solution they got sued and lost. I don't see that changing. They may be much better off sourcing semi-custom chips for integration along with their own FPGAs or other coprocessors as required.


That may just be an inherent difference in products as GPUs cycle much more quickly. CPUs adopt features much more slowly. Just look at all recent Intel CPUs and their gains over the past years/generations.
I highlighted a bit of the announcement directly from Intel earlier, in which Dr. Murthy Renduchintala speaks in the context of AI/deep learning/compute/etc. along with visual graphics and data.
Sounds more like Tesla and Quadro, probably filtering down to performance consumer GPUs.
“Raja is one of the most experienced, innovative and respected graphics and system architecture visionaries in the industry and the latest example of top technical talent to join Intel,” said Dr. Murthy Renduchintala, Intel’s chief engineering officer and group president of the Client and Internet of Things Businesses and System Architecture. “We have exciting plans to aggressively expand our computing and graphics capabilities and build on our very strong and broad differentiated IP foundation. With Raja at the helm of our Core and Visual Computing Group, we will add to our portfolio of unmatched capabilities, advance our strategy to lead in computing and graphics, and ultimately be the driving force of the data revolution.”
The whole context of that speech is with regards to Raja joining.
Also remember Intel EOL'd the PCIe coprocessor Xeon Phi last quarter as it was not gaining as much interest as hybrid GPU-CPU solutions, so it makes sense to build something that competes better. IMO it looks like they have decided to go with discrete GPUs, which, like Nvidia's, can provide benefits top-down.
In a couple more years I really do think Nvidia's GRID technology/solution will have matured enough to also gain traction, something I am sure Intel is also aware of and interested in, albeit playing catch-up like they did with deep learning against Nvidia.
 
It's not strange, because HP, Lenovo, etc. send review samples to reviewers, not AMD. Some bigger online magazines buy laptops and review them.
And neither HP, Lenovo, nor Acer have sent review samples to reviewers, despite these laptops having been in distribution channels for weeks.

Isn't it strange that a new product that should be so important for both OEMs and customers gets zero units sent to reviewers prior to launch?

Well, speculatively tempting. It's not hard to roughly estimate what 4 Zen cores and 11 Vega CUs can do. But yeah, I'd definitely wait for reviews before buying anything.
Clock fluctuations dependent on TDP and temperature are a huge part of the final performance. All we have is a couple of numbers from AMD that could have been taken under ideal conditions.
Knowing how the actual products found on shelves will perform is very important.
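To illustrate why the TDP envelope matters so much, here is a toy throttling model; all wattages and clocks are made-up assumptions for illustration, not measured data:

```python
# Toy throttling model: dynamic power scales roughly with f * V^2,
# and V scales roughly with f, so power ~ f^3. Given a TDP cap we
# can estimate the sustainable clock. All wattages and clocks here
# are made-up assumptions, not measurements of any real part.

def sustained_clock_ghz(boost_ghz, boost_power_w, tdp_w):
    """Clock sustainable under tdp_w if power scales as clock cubed."""
    if boost_power_w <= tdp_w:
        return boost_ghz
    return boost_ghz * (tdp_w / boost_power_w) ** (1 / 3)

# Hypothetical GPU that needs 35 W to hold a 1.3 GHz boost clock,
# squeezed into shared CPU+GPU envelopes of 25 W and 15 W:
for tdp in (35, 25, 15):
    f = sustained_clock_ghz(1.3, 35, tdp)
    print(f"{tdp:>2} W envelope -> ~{f:.2f} GHz ({f / 1.3:.0%} of boost)")
```

Even in this crude model, the same silicon loses a quarter of its clock between a 35 W and a 15 W design, which is why AMD's headline numbers tell us little about shipping laptops.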

Is there any hard evidence of Intel making high performance parts?
Some have claimed the original statement is ambiguous, but for me it's really clear:

In this position, Koduri will expand Intel’s leading position in integrated graphics for the PC market with high-end discrete graphics solutions for a broad range of computing segments.
 
LOL, I missed out the important bit of the Intel announcement in my earlier post; the two go together :)
Going forward under Koduri’s leadership, the company will unify and expand differentiated IP across computing, graphics, media, imaging and machine intelligence capabilities for the client and data center segments, artificial intelligence, and emerging opportunities like edge computing.
 
Koduri will expand Intel’s leading position in integrated graphics for the PC market with high-end discrete graphics solutions for a broad range of computing segments.
Does that mean Intel will be making graphics cards?
 
Also remember Intel EOL'd the PCIe coprocessor Xeon Phi last quarter as it was not gaining as much interest as hybrid GPU-CPU solutions, so it makes sense to build something that competes better. IMO it looks like they have decided to go with discrete GPUs, which, like Nvidia's, can provide benefits top-down.
Doesn't that description also cover Kaby-G? As a semi-custom chip, it's an MCM leveraging their IP with discrete GPUs. An Intel+Radeon chip atop open source software would still be in line with those statements. Most of Raja's career was spent doing just that.

Some have claimed the original statement is ambiguous, but for me it's really clear:
I've seen that, but again, it doesn't specifically say they are making their own discrete graphics cards. Just making use of said devices.
 
Doesn't that description also cover Kaby-G? As a semi-custom chip, it's an MCM leveraging their IP with discrete GPUs. An Intel+Radeon chip atop open source software would still be in line with those statements. Most of Raja's career was spent doing just that.


I've seen that, but again, it doesn't specifically say they are making their own discrete graphics cards. Just making use of said devices.
How does the Kaby-G EMIB manage to do:
"under Koduri’s leadership, the company will unify and expand differentiated IP across computing, graphics, media, imaging and machine intelligence capabilities for the client and data center segments, artificial intelligence, and emerging opportunities like edge computing."

This is a classic description of Tesla and Jetson, along with GRID, and also of how it feeds into Quadro and performance consumer GPUs.
That statement also has to be taken together with the initial comment by Dr. Murthy Renduchintala about compute, data, and visual graphics that my earlier post mentioned.
Not sure how you see Kaby-G as having context to Xeon Phi, given my point that they stopped the coprocessor because it had little traction against the CPU-GPU hybrids; which comes back again to building a high performance discrete GPU to compete in the areas where the Phi coprocessor failed - they only stopped the coprocessor arch, not the Xeon Phi as a more "traditional" CPU HPC installation.
Seems very far-fetched to expect a Kaby-G EMIB solution to do all that the statement I quoted says, and also possibly replace the failing Phi coprocessor or be involved with evolving Xeon Phi.
 
Even a GPU can be a coprocessor in the sense of GPGPU work. In the case of Phi there will inevitably be different accelerators for different tasks. Intel has their acquisitions from Altera to facilitate that: pairing a Kaby with a GPU, FPGA, DSP, etc. to accommodate different markets. They can avoid the whole GPU patent situation simply by integrating low-margin parts from others, be it AMD, Nvidia, etc. Three competitors making high end discrete GPUs would destroy everyone's respective margins with competition.

I'm not discounting that Intel could make dGPUs; I just currently remain unconvinced when presented with KabyG.
 
Even a GPU can be a coprocessor in the sense of GPGPU work. In the case of Phi there will inevitably be different accelerators for different tasks. Intel has their acquisitions from Altera to facilitate that: pairing a Kaby with a GPU, FPGA, DSP, etc. to accommodate different markets. They can avoid the whole GPU patent situation simply by integrating low-margin parts from others, be it AMD, Nvidia, etc. Three competitors making high end discrete GPUs would destroy everyone's respective margins with competition.

I'm not discounting that Intel could make dGPUs; I just currently remain unconvinced when presented with KabyG.
That is my point.
The GPU is an accelerator (or, very loosely, similar to a coprocessor); the Xeon Phi they EOL'd is a coprocessor up against hybrid installations, meaning CPU+GPU, and the Phi coprocessor never gained traction as it does not have the same appeal.
Which low-margin parts are they going to use to replace the Xeon Phi PCIe coprocessor to compete against Nvidia Tesla GPUs and GRID-type solutions in the future?
As I quoted, they described everything Tesla/Jetson/GRID competes in as being under Raja.
I see a place for KabyG, but IMO it really does not fit under much of this, nor does an EMIB traditional CPU+GPU "glued" solution fit in HPC, where scaling up/out matters.
Especially when Intel boss says:
"under Koduri’s leadership, the company will unify and expand differentiated IP across computing, graphics, media, imaging and machine intelligence capabilities for the client and data center segments, artificial intelligence, and emerging opportunities like edge computing."

If a segment is worth billions, then it makes sense to compete head-on and take share and revenue away from the others while also weakening them; it seems only AMD, sadly, goes out of their way to destroy their margins, which Nvidia and Intel maintain while still competing aggressively.

Regarding FPGAs as a reference:
Intel even compared the Stratix 10 FPGA only to the Nvidia Titan X Pascal, rather than to a whole Tesla setup in a node/scale-up/out real-world architecture solution.
But that ignores the complexity of the diverse architecture solutions being offered by Intel; it takes a lot of work optimising just going from a traditional Xeon CPU to the Xeon Phi. Separately, I still cannot see how an EMIB traditional CPU+GPU link is an effective replacement for the many coprocessors/accelerators in HPC builds.
Stratix/Arria still has some way to go IMO outside of edge computing use, which is also more relevant than putting a KabyG EMIB-type solution there.
 
Well, with Xeon Phi now fully announced as EOL'd quite a bit later, it puts more weight IMO behind the idea that they are looking to go the more traditional hybrid dGPU route, along with the dev work on Altera and Nervana.
Which, if you look back to the Intel statement I quoted, makes sense considering what they define as being under Raja:
"under Koduri’s leadership, the company will unify and expand differentiated IP across computing, graphics, media, imaging and machine intelligence capabilities for the client and data center segments, artificial intelligence, and emerging opportunities like edge computing."
So Intel is under an accelerated/aggressive programme to come up with a suitable node/scale-out solution that is not a nightmare for the scientists/mathematicians/coders to use well in terms of optimising and porting existing algorithms/apps/models; not sure the Altera/Nervana products can fully help in this instance apart from more at the edge computing side, although said tech may find its way into other R&D products under Raja.

Edit:

Speaking to The Register's sister publication The Next Platform, Intel enterprise and HPC group boss Barry Davis said the Xeon Phi remake would involve changes within the chips, and would not radically change the surrounding hardware and software running on top. The redesigned silicon is expected to be x86 compatible.

"Since we are on a CPU path here, this is not going to be a strategy that completely disrupts the ecosystem," Davis said
So that rules out Nervana/Altera as current products as it stands now, though their IP could be incorporated into a new R&D roadmap/product/tech.
And personally I just cannot see the EMIB Kaby-GPU type solution working well in these instances when compared to a hybrid, scaled up/out CPU-dGPU accelerator like the Nvidia Tesla ecosystem.
Anyway, the crux is what Intel says falls under Koduri's leadership (quoted near the top of this post), which IMO goes well beyond the EMIB "glue" solution.

Either way, it will be interesting to see just what Intel does.
They need replacement products/tech for some of their contractual HPC projects by 2021.
 
So AMD was always planning on launching a higher performance mobile APU; the deal with Intel is baffling unless it was driven by the Radeon GPU division and the VPs not being fully aligned.
Yeah, I know some will say "ah, but the APU Vega will be notably more powerful than the Intel solution...", but the problem is Intel is the dominant competitor, and it is bloody stupid to provide any support that helps solidify the perception of Intel having a reasonable gaming/enthusiast mobile laptop when you are going to enter the field a little bit afterwards.
They really should have ignored any deal with Intel so they could hammer them in the gaming/enthusiast mobile segment; now they have made their life that much harder.
The only logic I can think of is the competition between the split divisions and the politics and ambition of each VP in AMD, which has been ongoing (it happens in most very large tech companies, but it really hurts the synergy the two divisions need).
https://videocardz.com/74464/amd-preparing-mobile-ryzen-5-apu-with-vega-11-graphics

Ah well, should be interesting to compare the two product solutions head to head next year.
 
They really should have ignored any deal with Intel so they could hammer them in the gaming/enthusiast mobile segment; now they have made their life that much harder.

If AMD hadn't made this deal with Intel, Nvidia would have stepped up. Now Intel foots some of the development cost for mobile GPUs in a next generation package, as well as ensuring a steady, if modest, stream of revenue for AMD. I really don't see how this is bad for AMD.

It is, however, a serious admission of failed strategy on Intel's part; having to source a GPU from your biggest (albeit small) competitor, because your own IP doesn't cut the mustard, is an embarrassment.

Cheers
 
If AMD hadn't made this deal with Intel, Nvidia would have stepped up. Now Intel foots some of the development cost for mobile GPUs in a next generation package, as well as ensuring a steady, if modest, stream of revenue for AMD. I really don't see how this is bad for AMD.

It is, however, a serious admission of failed strategy on Intel's part; having to source a GPU from your biggest (albeit small) competitor, because your own IP doesn't cut the mustard, is an embarrassment.

Cheers
Intel these days will never do a deal with Nvidia, just like Apple will do anything to avoid Nvidia, even if it is suicide for some of their products.

The only one who loses in this deal is AMD; they could have put Intel under serious pressure but instead screw over their own high end, high margin offering, possibly because IMO it comes back to the Radeon GPU division of AMD.
 
The only one who loses in this deal is AMD

These are high end, high margin products going to Intel's most important customer, Apple. I expect AMD to have very healthy margins on these.

they could have put Intel under serious pressure but instead screw over their own high end, high margin offering
AMD doesn't have anything close to this anytime soon, so it would have won them exactly zero design wins.

Cheers
 
These are high end, high margin products going to Intel's most important customer, Apple. I expect AMD to have very healthy margins on these.


AMD doesn't have anything close to this anytime soon, so it would have won them exactly zero design wins.

Cheers
The original argument for the deal being positive, in the eyes of some, was that AMD never intended to have a mobile performance-enthusiast product (APU), or that one was years away.
Now we have quite clear evidence AMD was always looking to launch a high performance mobile APU, and closer than many thought, considering the leaked product models/TDPs shown by VideoCardz.
Also, logically it would make sense that they always had APUs planned beyond just the low-level entry models that recently launched.

The Intel deal will cannibalise sales of AMD's own APU solution and, more critically, reinforce consumer perception of Intel as the dominant gaming mobile CPU with an OK enthusiast model.
The rest of the argument we could both mention has been covered pretty extensively, such as the potential margins of the Intel deal, etc., so I will leave that to previous posts.

I guess it is perspective; I cannot see how anyone can think this was a good deal, while I am sure you and some others are thinking I am being overly pessimistic about the deal and its repercussions.
I am just frustrated AMD missed a huge opportunity to put Intel under pressure, something Intel would have no qualms about doing to AMD if it could.
And the backdrop to this, as I mentioned before, was how Intel went out of its way to be overly negative in its Ryzen and Epyc briefings (the press mentioned it was OTT and eyebrow-raising) while at the same time doing this deal for the GPU semi-custom.
 
I guess it is perspective; I cannot see how anyone can think this was a good deal, while I am sure you and some others are thinking I am being overly pessimistic about the deal and its repercussions.
I am just frustrated AMD missed a huge opportunity to put Intel under pressure, something Intel would have no qualms about doing to AMD if it could.

I don't think Intel's system overlaps with anything AMD has currently planned. Intel's solution is just very dense repackaging of discrete graphics. AMD is pushing integrated graphics to the memory-bandwidth-bound limit.
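A quick sketch of that bandwidth wall, using rough public figures as assumptions (shader counts, clocks, and bandwidths are approximate, not official specs):

```python
# Why integrated graphics run into a memory-bandwidth wall:
# compare bytes of DRAM bandwidth available per peak FP32 FLOP.
# Clock speeds and bandwidth figures are rough, assumed values.

def peak_gflops(cus, clock_ghz):
    # 64 shaders per CU, 2 FLOPs per shader per clock (FMA)
    return cus * 64 * 2 * clock_ghz

configs = {
    # name: (CUs, GPU clock in GHz, usable DRAM bandwidth in GB/s)
    "11-CU APU on dual-channel DDR4-2666 (shared with CPU)": (11, 1.1, 42.7),
    "RX 560, 16 CUs on 128-bit GDDR5 (dedicated)": (16, 1.18, 112.0),
}

for name, (cus, clk, bw) in configs.items():
    ratio = bw / peak_gflops(cus, clk)  # bytes per FLOP
    print(f"{name}: {ratio:.3f} B/FLOP")

# APU:    ~0.028 B/FLOP, and the CPU contends for the same bus.
# RX 560: ~0.046 B/FLOP with the VRAM all to itself.
```

Under these assumptions the APU gets well under two-thirds of the bytes-per-FLOP of even a small discrete card, before subtracting the CPU's share of the same DDR4 bus.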

Intel's system is much higher performance, but also much higher cost, selling into a much smaller market segment. High cost and a smaller market imply higher risk; risk AMD cannot afford to take right now.

I'd love to see a 35/65W AMD APU with a couple of HBM2 stacks. Heck, ditch the DDR4 interface completely and just swap directly to NVMe attached SSDs (using M/TLC flash in SLC mode for the swap partition). Use the savings from the missing DDR4 interface to add more PCIe lanes to storage (16x). With current DRAM prices it might even be cost competitive. It sure as hell would be risky though.
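As a sanity check on that idea, here is the rough bandwidth gap between the tiers involved; all figures are ballpark public numbers used purely as assumptions:

```python
# Ballpark bandwidth of each tier in the "HBM2-only APU with NVMe
# swap" idea. All figures are rough public numbers, assumed here
# for illustration only.

bandwidth_gbs = {
    "2x HBM2 stacks (1024-bit @ ~1.6 Gbps each)": 2 * (1024 / 8) * 1.6,
    "Dual-channel DDR4-2666 (the interface dropped)": 2 * (64 / 8) * 2.666,
    "NVMe swap tier, PCIe 3.0 x4": 4 * 0.985,
}

for tier, gbs in bandwidth_gbs.items():
    print(f"{tier}: ~{gbs:.1f} GB/s")

# HBM2 dwarfs the DDR4 it replaces (~410 vs ~43 GB/s), but the NVMe
# swap tier is ~10x slower than DDR4 -- hence the risk: fine while
# the working set fits in HBM2, painful the moment it spills.
```

Which is really the trade-off in a nutshell: spectacular as long as everything fits in the HBM2 capacity, and a hard cliff when it doesn't.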

Cheers
 
The only one who loses in this deal is AMD, they could had put Intel under serious pressure but instead screw over their own high end high margin offering, possibly because IMO it comes back to Radeon GPU Division of AMD..
It perhaps marginalizes their high end offering, but consider the current market that will be affected: it's the current 470/480 and 1060 discrete market that's likely to take a hit. That's a huge chunk of the overall gaming market, and with AMD effectively controlling their own and Intel's shares, it focuses development on their features and platforms. Nvidia is the real loser in this scenario as the mid range market heads towards APUs and tightly integrated designs, the same way the low end has; only now performance has caught up, largely thanks to HBM solving pin and board complexity issues.

These are high end, high margin products going to Intel's most important customer, Apple. I expect AMD to have very healthy margins on these.
I'm still not sure that's the case. AMD engineers have said they are working on drivers for many platforms, so while Apple is likely in there, a Google Chromebox/book offering seems likely as well. Apple hasn't really discussed the design as far as I'm aware; they just have the iMac Pro with Vega embedded. It appears a more generic offering to the open market, aimed at Nvidia's mid range market share. Plenty of companies are up for a piece of that pie.

The Intel deal will cannibalise sales of AMD's own APU solution and, more critically, reinforce consumer perception of Intel as the dominant gaming mobile CPU with an OK enthusiast model.
Potential sales, yes, but most of the hit I'd expect to be in mid range discrete. It's too early to tell what kind of advantage AMD retains as well: Intel could be a generation back, or AMD could leverage Infinity to remove DDR4 completely from the design. That would be a huge advantage for designers. It will be interesting to see if AMD's memory controllers are x86-compatible to the point an Intel CPU could leverage them directly. That could very well be the next x86-64-style extension of note.
 