AMD Radeon finally back into laptops?

I understand Nvidia getting out of the Mac Pro to protect high-margin sales of professional cards. I do not understand why competing in laptops and in new iMacs is not okay. Maybe 1-2 million GPU units a quarter. I guess Nvidia, on its quest for higher margins, decided sacrificing that income is fine.

If nVidia lowers their price to Apple, what do you think their PC partners are going to think about that?
 
"Slightly" OT but in german it's actually the same word for both meanings (diskret).:!:
Sorry for adding to the OT...
Actually, mainly in mathematical/technical contexts - and not the kind of technical as in differentiating GPU circuitry on an add-in card from that on some CPUs. :)
In everyday language, the German „diskret“ means something along the lines of confidential, secretive, keeping a low profile. Tech sites have been using a direct transcription of discrete for some time now, out of laziness I guess.
 
Discreet, discrete, diskret, discret (French), etc., all share a common Latin etymology: discretus, meaning "separated".
My guess is that discreet entered the English language early, through Old French, with the privacy-related meaning that it has in French, whereas discrete was probably introduced later, by mathematicians, with the meaning of "separate, distinct, non-continuous" (which it also has in French) and with a spelling that remained closer to the original French (and Latin), because mathematicians are rigorous people.
 
If nVidia lowers their price to Apple, what do you think their PC partners are going to think about that?

Nothing. Neither Apple nor Nvidia would disclose their negotiated price to anyone else. That's standard business practice. Just like Intel has standard tray prices for their CPUs but none of their large partners (ones able to negotiate price) pay anything close to that. And none of their large partners knows what any of Intel's other large partners paid for their CPUs.

Regards,
SB
 
I'm sorry, but I'm not buying most of that. Standard business practice is to treat your largest customers pretty much equally, and Apple isn't even near the largest computer maker. To suggest that nVidia would slash their prices for Apple while keeping prices up for their bigger customers is silly, which is why they are not doing it. Stuff like preferential treatment doesn't stay secret at that level. Plenty of people switching companies will ensure information travels. I am certain Lenovo has a pretty damn good idea what HP is paying Intel for their chips.
 
Fact remains that Apple is the 5th-largest PC vendor worldwide, according to shipments made in 2015.

And the #1 brand globally as well in 2015.
 
The idea that Apple squeezes the margins out of GPU providers, or that it will do so on the 14/16FF family, is pure conjecture unless someone here is willing to share verifiable information about those deals.

If all other OEMs paid so much more money per GPU than Apple, why would any IHV sell Apple their chips?
As if Apple's laptops didn't have a high enough ASP to comfortably cover the fair price of each component plus a large margin on top.
 
What? You haven't seen reports that Apple does this to all their product suppliers? I think there was one earlier this month on iPhone parts...

https://www.google.com/search?q=apple+sqeezes+suppliers&ie=utf-8&oe=utf-8#q=apple+squeezes+suppliers

http://jameswelch.net/foxconn-pegat...ressure-as-apple-squeezes-margins-9-to-5-mac/

Apple is well known to treat their suppliers like slave labor and reap the benefits for themselves.

http://www.macobserver.com/tmo/arti...ruptcy-shows-its-hard-to-be-an-apple-supplier

http://www.fool.com/investing/gener...onger-buy-apple-suppliers-and-just-buy-m.aspx
 
All I see is news about Apple squeezing its suppliers over the iPhone range, and it seems related to cheaper components which have low margins to begin with, or to manufacturing deals.

Is there any news about Apple squeezing AMD and nvidia for their discrete GPUs?
How about Intel for their CPUs? And Samsung or TSMC for their SoCs, with early access to their limited FinFET production capacity back in 2015?

No? Back to pure conjecture, then.
 
They can't push Intel, because they have no other supplier to go to; it's simple economics..... Apple products are "premium" products. The only other option outside of Intel would be AMD CPUs, and those will not fit Apple's needs, as no one really wants AMD CPUs at the higher end.

Apple can squeeze AMD, and it's easy for them to, because nV is in no rush to get Apple's business, but AMD needs it to keep what they have or lose even more. AMD is in a desperate situation; Apple knows that, everyone knows that, and they will try to get the best they can out of AMD because of that desperation. And this is what Apple does to every mid-size or smaller supplier in their supply chain, because they know those suppliers won't do well without Apple's business.

I want to see some logical posting from you, ToTTenTranz, instead of "prove it", because you don't seem to understand the logistics of what AMD does, what Apple does, what basic business principles are when one side negotiates from a position of higher standing against one of need, and what that does when a company is desperate for cash.
 
None of those mental gymnastics will make any of your aforementioned claims anything more than pure conjecture.
Maybe that's enough for you to go write for e.g. fudzilla, semiaccurate or wccftech, or to write posts on neogaf.

Here on B3D it's still pure conjecture and it will be valued as such.
 
There are no mental gymnastics; I use those tactics every time I know I can take advantage of someone in negotiating a deal. It's common sense. I would never go so far as to insult a client by undervaluing them, but I would get to a point where it's pretty much all in my benefit and go from there. I don't care about them or their needs; it is their job to look after those. I want the best for my company, though.
 
Yes, you will do and say anything to raise your conjecture above its real value, as proven by your last 3 posts in this thread.
They're still conjectures until you're able to present proof that AMD's mobile GPUs are being sold at very low margins to Apple and that nvidia therefore isn't interested in selling GPUs to them.

I could come here and swear on my feet and my granny's ashes that it's only because nvidia felt threatened by OpenCL and halted its support (no OpenCL 2.0+ for their chips), and that therefore only AMD could provide Apple with discrete OpenCL GPUs that could run the same code as integrated Intel GPUs. And that AMD actually took advantage of that and charged Apple way more for their GPUs than any other OEM would pay... and it would still be pure conjecture unless I posted the price per GPU for each model in MacBooks and other x86 laptops.
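
To make the "same code" point concrete, here is a minimal sketch (entirely illustrative, not from any actual Apple or AMD codebase) of the OpenCL host boilerplate that enumerates every platform and GPU in a machine; the same kernel source could then be built for an Intel iGPU and an AMD dGPU alike:

```c
/* Minimal sketch: list every OpenCL platform and its GPU devices.
 * The point: one kernel source can later be built (clBuildProgram)
 * for any of these devices, whether Intel iGPU or AMD dGPU. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS)
        return 1;

    for (cl_uint p = 0; p < num_platforms; ++p) {
        char pname[256];
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                          sizeof pname, pname, NULL);

        cl_device_id devices[8];
        cl_uint num_devices = 0;
        /* Ask this platform for GPU devices only. */
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                           8, devices, &num_devices) != CL_SUCCESS)
            continue;

        for (cl_uint d = 0; d < num_devices; ++d) {
            char dname[256];
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof dname, dname, NULL);
            printf("%s: %s\n", pname, dname);
        }
    }
    return 0;
}
```

Build with e.g. `gcc list_gpus.c -lOpenCL` on Linux, or `-framework OpenCL` on OS X.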

Like that person on the last page who came here to sing the demise and death of OpenCL because of Apple's Metal, even though it's the only compute language available for the most used operating system in the whole world (Android), and it's used by the most used web browser in the world for performance enhancements (Chrome). Conjectures.


Regardless, that particular subject has run its course.
The only confirmed model with a Polaris GPU isn't even from Apple. It's from HP.
I won't contribute to derailing the thread anymore.
 
I'm sorry, but I'm not buying most of that. Standard business practice is to treat your largest customers pretty much equally, and Apple isn't even near the largest computer maker. To suggest that nVidia would slash their prices for Apple while keeping prices up for their bigger customers is silly, which is why they are not doing it. Stuff like preferential treatment doesn't stay secret at that level. Plenty of people switching companies will ensure information travels. I am certain Lenovo has a pretty damn good idea what HP is paying Intel for their chips.

Doesn't matter what any of us on the forums think. That's the reality of the situation. No business deal between 2 or more companies is revealed to anyone not involved in said business deals. Because neither side of the deals wants other companies to know the details. The most important detail of the deals being the negotiated price.

No one is saying Nvidia would or wouldn't slash their price for Apple. But it's undeniable that not all Nvidia partners buy GPUs at the same price. And especially when it comes to large OEM system integrators, not all of them pay the same price for video cards sourced from Nvidia or their partners.

Regards,
SB
 
That isn't the case. The reason why nV doesn't do as well in OpenCL is their lower theoretical FLOPS compared to AMD cards. But they make up for that in CUDA, because there are features in CUDA that their hardware is capable of which overcome that.

nV never "dropped" OpenCL development on their end; they just never saw the need for it, because they have a better compute language that takes advantage of their hardware and is more widely used anyway. They have been doing only minimal support of OpenCL.
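
To put the "compute language" comparison in concrete terms, here is what a trivial kernel looks like in OpenCL C (illustrative only, not taken from anyone's shipping code); the CUDA version is nearly identical, which is why the real argument is about tooling and vendor support rather than syntax:

```c
/* Illustrative only: a trivial vector-add kernel in OpenCL C.
 * The CUDA equivalent swaps __kernel for __global__ and computes the
 * index from blockIdx/blockDim/threadIdx instead of get_global_id(). */
__kernel void vec_add(__global const float *a,
                      __global const float *b,
                      __global float *c,
                      const unsigned int n)
{
    size_t i = get_global_id(0);   /* one work-item per element */
    if (i < n)
        c[i] = a[i] + b[i];
}
```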

OpenCL was announced in 2009 if I remember correctly, and within a few months (2009 to 2010) nV put OpenCL samples in their SDK. Those were later removed from the SDK and put on their website shortly thereafter; I think that was in 2011. Since then they have not focused on OpenCL, so for pretty much the entire lifespan of OpenCL they didn't care for it.

The problem with your statements is that you read what you want to read and don't understand what you read. Like your inference about Pascal's lack of async compute because, what did you say, "there so far hasn't been any programs to show it", yet there was one on this very forum.

Same thing: you read what you want to read and dismissed the rest because it didn't fit your statement.
 