I would not call it tempting without hard data to evaluate.
Well, speculatively tempting. It's not hard to roughly estimate what 4 Zen cores and 11 Vega CUs can do. But yeah, I'd definitely wait for reviews before buying anything.
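Since the post above mentions roughly estimating what 4 Zen cores and 11 Vega CUs can do, here is a quick back-of-envelope sketch in Python. The clock speeds and per-cycle FLOP counts are my own illustrative assumptions (a ~1.1 GHz GPU clock and a ~3.0 GHz all-core CPU clock), not confirmed specs:

```python
# Rough peak-FP32 estimate for a hypothetical 4-core Zen + 11-CU Vega APU.
# All clocks and per-cycle throughput figures below are assumptions for
# illustration, not official specifications.

def gpu_peak_gflops(cus, clock_ghz, shaders_per_cu=64):
    # Each GCN/Vega shader does one FP32 FMA (2 FLOPs) per cycle.
    return cus * shaders_per_cu * 2 * clock_ghz

def cpu_peak_gflops(cores, clock_ghz, flops_per_cycle=16):
    # Assumed Zen figure: two 128-bit FMA pipes per core
    # -> 4 FP32 lanes * 2 ops (FMA) * 2 pipes = 16 FLOPs/cycle.
    return cores * flops_per_cycle * clock_ghz

gpu = gpu_peak_gflops(11, 1.1)  # assumed ~1.1 GHz GPU clock
cpu = cpu_peak_gflops(4, 3.0)   # assumed ~3.0 GHz sustained all-core clock

print(f"GPU ~ {gpu:.0f} GFLOPS, CPU ~ {cpu:.0f} GFLOPS")
```

Under those assumptions the iGPU lands around 1.5 TFLOPS, i.e. in the same ballpark as low-to-mid-range discrete cards of the era, which is exactly why reviews with real sustained clocks matter.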
Is there any hard evidence of Intel making high performance parts? I've seen lots of news outlets running with it, but upon inspection of Intel's statements it only reads as integrating discrete graphics chips, not necessarily their own, and Raja appears to have taken up edge computing, not graphics.

Furthermore, Intel will now be doing their own discrete GPUs, meaning this semi-custom deal with AMD won't be a long one.
If Intel comes up with their own GPU family in 3/4 years, that's how much time AMD has to develop a high performance APU within the 50-100W range.
That may just be an inherent difference in products, as GPUs cycle much more quickly; CPUs adopt features much more slowly. Just look at all recent Intel CPUs and their gains over the past years/generations.

One aspect that is fundamentally different between Radeon and the CPU division is that information control is tighter within the CPU division; in other words, we really do not know much about the CPU division's unannounced product/R&D strategy.
> Is there any hard evidence of Intel making high performance parts? I've seen lots of news outlets running with it, but upon inspection of Intel's statements it only reads as integrating discrete graphics chips.

I highlighted a bit of the announcement directly from Intel earlier, which quotes Dr. Murthy Renduchintala in the context of AI/deep learning/compute/etc. along with visual graphics and data.
The last time Intel made their own graphics solution they got sued and lost. I don't see that changing. They may be much better off sourcing semi-custom chips for integration along with their own FPGAs or other coprocessors as required.
The whole context of that speech is with regard to Raja joining:

“Raja is one of the most experienced, innovative and respected graphics and system architecture visionaries in the industry and the latest example of top technical talent to join Intel,” said Dr. Murthy Renduchintala, Intel’s chief engineering officer and group president of the Client and Internet of Things Businesses and System Architecture. “We have exciting plans to aggressively expand our computing and graphics capabilities and build on our very strong and broad differentiated IP foundation. With Raja at the helm of our Core and Visual Computing Group, we will add to our portfolio of unmatched capabilities, advance our strategy to lead in computing and graphics, and ultimately be the driving force of the data revolution.”
> It's not strange, because HP, Lenovo etc. send review samples to reviewers, not AMD. Some bigger online magazines buy laptops and review them.

And neither HP, Lenovo nor Acer have sent review samples to reviewers, despite these laptops having been spread within distribution channels for weeks.
> Well, speculatively tempting. It's not hard to roughly estimate what 4 Zen cores and 11 Vega CUs can do. But yeah, I'd definitely wait for reviews before buying anything.

Clock fluctuations dependent on TDP and temperature are a huge part of the final performance. All we have is a couple of numbers from AMD that could have been taken in ideal conditions.
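To illustrate how much the TDP-limited sustained clock can matter, here is a purely illustrative toy model (the base clock, base power, and cubic power-vs-clock scaling are all assumptions, not measurements of any real laptop):

```python
# Toy model: assume chip power grows roughly with clock^3 (voltage scales
# with clock), and invert that to get the clock a given TDP can sustain.
# All numbers are illustrative assumptions, not measured data.

def sustained_clock_ghz(tdp_w, base_clock_ghz=1.1, base_power_w=9.0, exponent=3.0):
    # Clock that fits in the power budget, relative to an assumed
    # 1.1 GHz @ 9 W baseline.
    return base_clock_ghz * (tdp_w / base_power_w) ** (1.0 / exponent)

for tdp in (9, 15, 25):
    f = sustained_clock_ghz(tdp)
    print(f"{tdp:>2} W TDP -> ~{f:.2f} GHz sustained "
          f"({f / sustained_clock_ghz(15):.0%} of the 15 W clock)")
```

Even this crude sketch shows why a 9 W and a 25 W configuration of the "same" APU can land a double-digit percentage apart, and why a vendor-supplied number taken in ideal conditions tells you little about a thin-and-light chassis.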
> Is there any hard evidence of Intel making high performance parts?

Some have claimed the original statement is ambiguous, but for me it's really clear:
> In this position, Koduri will expand Intel’s leading position in integrated graphics for the PC market with high-end discrete graphics solutions for a broad range of computing segments.
>
> Going forward under Koduri’s leadership, the company will unify and expand differentiated IP across computing, graphics, media, imaging and machine intelligence capabilities for the client and data center segments, artificial intelligence, and emerging opportunities like edge computing.
> No unboxing video...

Not that I'm a fan of these, but it would have been pretty epic with a Lisa Su unboxing video tho...
> Koduri will expand Intel’s leading position in integrated graphics for the PC market with high-end discrete graphics solutions for a broad range of computing segments.

Does that mean Intel will be making graphics cards?
Doesn't that description also cover Kaby-G? As a semi-custom chip, it's an MCM leveraging their IP with discrete GPUs. An Intel+Radeon chip atop open source software would still be in line with those statements. Most of Raja's career was spent doing just that.

Also remember Intel EOL'd the PCIe coprocessor Xeon Phi last quarter, as it was not gaining as much interest as hybrid GPU-CPU solutions, so it makes sense to build something that competes better. IMO it looks like they have decided to go discrete GPUs, which, like Nvidia's, can provide benefits top-down.
> Some have claimed the original statement is ambiguous, but for me it's really clear:

I've seen that, but again, it doesn't specifically say they are making their own discrete graphics cards. Just making use of said devices.
> Doesn't that description also cover Kaby-G? As a semi-custom chip, it's an MCM leveraging their IP with discrete GPUs. An Intel+Radeon chip atop open source software would still be in line with those statements. Most of Raja's career was spent doing just that.

How does Kaby-G EMIB manage to do:
That is my point. Even a GPU can be a coprocessor in the sense of GPGPU work. In the case of Phi, there will inevitably be different accelerators for different tasks, and Intel has their acquisitions from Altera to facilitate that: pairing a Kaby with a GPU, FPGA, DSP, etc. to accommodate different markets. They can avoid the whole GPU patent situation simply by integrating low-margin parts from others, be it AMD, Nvidia, etc. Three competitors making high-end discrete GPUs would destroy everyone's respective margins with competition.
I'm not discounting that Intel could make dGPUs; I just currently remain unconvinced, given Kaby-G.
> Speaking to The Register's sister publication The Next Platform, Intel enterprise and HPC group boss Barry Davis said the Xeon Phi remake would involve changes within the chips, and would not radically change the surrounding hardware and software running on top. The redesigned silicon is expected to be x86 compatible.
>
> "Since we are on a CPU path here, this is not going to be a strategy that completely disrupts the ecosystem," Davis said.

So that rules out Nervana/Altera as current products, though Intel could incorporate their IP into a new R&D roadmap/product/tech.
They really should have avoided any deal with Intel so they could hammer them in the gaming/enthusiast mobile segment; now they have made their own life that much harder.
> If AMD hadn't made this deal with Intel, Nvidia would have stepped up. Now Intel foots some of the development cost for mobile GPU development in a next generation package as well as ensures a steady, if modest, stream of revenue for AMD. I really don't see how this is bad for AMD.

Intel these days will never do a deal with Nvidia, just like Apple will do anything to avoid Nvidia, even if it is suicide for some of their products.
It is, however, a serious admission of failed strategy on Intel's part; having to source a GPU from your biggest (albeit small) competitor, because your own IP doesn't cut the mustard, is an embarrassment.
Cheers
> AMD doesn't have anything close to this anytime soon, so it would have won them exactly zero design wins.

The only one who loses in this deal is AMD; they could have put Intel under serious pressure but instead screwed over their own high-end, high-margin offering.
> These are high end, high margin products going to Intel's most important customer, Apple. I expect AMD to have very healthy margins on these.

The original argument from those who saw the deal as positive was that AMD never intended to have a mobile performance-enthusiast product (APU), or that such a product was years away anyway.
AMD doesn't have anything close to this anytime soon, so it would have won them exactly zero design wins.
Cheers
I guess it is a matter of perspective; I cannot see how anyone can think this was a good deal, while I am sure you and some others think I am being overly pessimistic about the deal and its repercussions.
I am just frustrated AMD missed a huge opportunity to put Intel under pressure, something Intel would have no qualms about doing to AMD if it could.
> The only one who loses in this deal is AMD; they could have put Intel under serious pressure but instead screwed over their own high-end, high-margin offering, possibly because IMO it comes back to the Radeon GPU division of AMD.

It perhaps marginalizes their high end offering, but consider the current market that will be affected. It's the current 470/480 and 1060 discrete market that is likely to take a hit. That's a huge chunk of the overall gaming market, and with AMD effectively controlling their own and Intel's shares, it focuses development on their features and platforms. Nvidia is the real loser in this scenario as the mid range market heads towards APUs and tightly integrated designs, the same way the low end has; only now performance has caught up, largely thanks to HBM solving pin and board complexity issues.
> These are high end, high margin products going to Intel's most important customer, Apple. I expect AMD to have very healthy margins on these.

I'm still not sure that's the case. AMD engineers have said they are working on drivers for many platforms, so while Apple is likely in there, a Google Chromebox/book offering seems likely as well. Apple hasn't really discussed the design as far as I'm aware; they just have the Mac Pro with Vega embedded. It appears to be a more generic offering to the open market, aimed at Nvidia's mid range market share. Plenty of companies are up for a piece of that pie.
> The Intel deal will cannibalise sales for AMD's own APU solution, and more critically reinforce consumer perception of Intel as the dominant gaming mobile CPU with an OK enthusiast model.

Potential sales, yes, but most of the hit I'd expect to land on mid range discrete. It's also too early to tell what kind of advantage AMD retains; Intel could be back a generation, or AMD could leverage Infinity to remove DDR4 completely from the design. That would be a huge advantage for designers. It will be interesting to see if AMD's memory controllers are x86 compatible to the point that an Intel CPU could leverage them directly. That could very well be the next x86-64 style extension of note.