[Beyond3D Article] Intel presentation reveals the future of the CPU-GPU war

Q6600:
Cost: $180-$220 (best etailer pricing)
Size: 286mm^2
Cost/mm^2: ~$0.63 (lowest price)

9800 GTX:
Cost: $260-$300 (best etailer pricing)
Size: 330mm^2
Cost/mm^2: ~$0.79 (lowest price)

Advantage: Q6600

However, this is only if you use the closeout pricing, and not established MSRP/average sale price.

So you see, even in this absolute best-case scenario for Intel, in which all breaks are given to them and none to NV, they still BARELY come out on top. Any *fair* product comparison will show the opposite, and by quite a large margin. Compare G80 to Kentsfield and it's like G92 vs. Yorkfield all over again. Hmm, I'm seeing a trend there...
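
As a sanity check, here's a minimal sketch of the cost-per-mm^2 arithmetic above, using the thread's best-case etailer prices (which, as noted, are closeout figures rather than MSRPs):

Code:
# Cost-per-mm^2 from the best-case etailer prices quoted above.
# Prices and die sizes are this thread's figures, not official numbers.
chips = {
    "Q6600":    {"price_usd": 180.0, "die_mm2": 286.0},  # Kentsfield quad-core CPU
    "9800 GTX": {"price_usd": 260.0, "die_mm2": 330.0},  # G92-based GPU
}

for name, c in chips.items():
    print(f"{name}: ${c['price_usd'] / c['die_mm2']:.2f}/mm^2")

# Q6600:    $0.63/mm^2
# 9800 GTX: $0.79/mm^2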

Umm...

Doesn't that $260-$300 for the 9800GTX also get you 512MB of the fastest GDDR3 currently manufactured, along with a complex board?

Whereas the $180-$220 for the CPU is just for a CPU.

Does all that other stuff on the 9800GTX come for free?
 
"

But now that you mention it... I did mean to say "retail price", as I said before when making the same statement.


Retail price does get affected, but the effect trickles down, so it is limited by the market economy, which is driven by competition and by production capacity or constraints.

The main thing though: Intel won't have an issue making the chip, but will they offset the margins of their high-end CPU production for lower-margin GPUs on their best processes? It doesn't seem that would be logical; even a small amount, let's say 10%, of their fab capacity would be a big hit for them.

The margins are much higher on high-end CPUs than on high-end GPUs (it's easier to look at a similar performance segment); I'd venture a guess of 50% or more.
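
As a rough illustration of that guess, here's a gross-margin sketch; every unit cost below is invented purely to show the shape of the argument, since no actual cost data is public:

Code:
# Hypothetical gross margins; all costs below are made up for illustration.
def gross_margin(sell_price, unit_cost):
    return (sell_price - unit_cost) / sell_price

cpu = gross_margin(sell_price=266.0, unit_cost=70.0)   # high-end CPU (die + package)
gpu = gross_margin(sell_price=299.0, unit_cost=170.0)  # high-end GPU board (die + GDDR3 + PCB)
print(f"CPU: {cpu:.0%}, GPU: {gpu:.0%}")  # CPU: 74%, GPU: 43%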
 
The main thing though: Intel won't have an issue making the chip, but will they offset the margins of their high-end CPU production for lower-margin GPUs on their best processes?

Unlike Nvidia, Intel has a track record of massively slicing into their margins in order to maintain (or bleed less) marketshare when they've deemed it necessary.

I.e., had Intel been in charge of the GeForce FX, its average selling price would have been significantly lower than the Radeon 9700 Pro's, since Intel tends to avoid losing marketshare if at all possible.

So if it came down to a war of attrition, Intel would have a bit of an advantage.

However, I still doubt very much that Intel wants to bury Nvidia, or take Nvidia on in the enthusiast class of graphics performance. And from the looks of it, neither does AMD. That leaves Nvidia to compete with Nvidia in the enthusiast class.

Where "I" personally believe Intel will target is HPC, specialized computing and the mainstream (perhaps even performance mainstream) with Larrabee. A similar market to what AMD is targetting with Rv670 and Rv770. Although I'm not sure how much effort AMD is putting at targetting HPC and specialized computing.

In those areas Intel has a great chance to succeed and gain marketshare from both Nvidia and AMD, provided they don't pull out and abandon it as they did back in the i740 days. Even if they end up making lower margins to do so.

After all, big OEMs love having systems with as many components from one manufacturer as possible. There are cost savings, support is easier and less costly, validation is generally simpler (and thus less costly), etc. There are a lot of benefits.

Nvidia does have one thing going for it, though: even IF the above were to come true (and it might not), many OEMs would still buy product from them just to maintain some bargaining position against Intel.

If this were anyone but Nvidia and Intel coming to a head, I'd say one or the other would win from pure marketing alone. However, both companies generally do a magnificent job marketing their products.

Intel also has a bit of an advantage in being able to put more pressure on OEMs through deals and discounts on other Intel products (CPUs/chipsets) if they were to bundle a "Larrabee GPU".

So even though Intel is at a disadvantage now, you simply cannot discount or ignore them. And you can never, ever say something is "impossible" in the tech world.

Once a company believes it's impossible for their competition to one-up them, they've lost.

Regards,
SB
 
Good point, but did the price war really help AMD? Last quarter they didn't gain any marketshare vs. NV in desktop, even though AMD's margins and average selling prices were lower.

Also, I don't think Intel will try to pressure OEMs the way they did with AMD; it's a bit risky, right now and in the near future, to do something like that.
 
The main thing though: Intel won't have an issue making the chip, but will they offset the margins of their high-end CPU production for lower-margin GPUs on their best processes? It doesn't seem that would be logical; even a small amount, let's say 10%, of their fab capacity would be a big hit for them.
It's worth remembering that in a few years Intel won't be making chipsets like today's, as most of what we call a chipset will be on the CPU die.

Also, Larrabee SKUs, top-to-bottom in all their variations, should produce more revenue than chipsets.

Of course at some distant point Larrabee will also disappear "onto the CPU".

Jawed
 
That is true, but the problem is that there is a lead time for that: if Larrabee doesn't pick up market share or isn't competitive enough in the mid-range and high end, it will be relegated to an IGP replacement. So the first few iterations of Larrabee will probably have to be successful, I would say at least break even (which would still be a loss for Intel, given what I mentioned earlier), to be sustained.
 
Where "I" personally believe Intel will target is HPC, specialized computing and the mainstream (perhaps even performance mainstream) with Larrabee.
Medium term, I don't see how NVidia can keep up in HPC. For HPC, who wants to buy a data-parallel processor with loads of graphics baggage when Larrabee is practically designed for HPC with just a few ounces of fat?

And long term all of HPC is going to be commodity CPU based, which means x86+integrated-DPP.

Jawed
 
ShaidarHaran: I don't think cache vs logic is really the main factor here, and clearly you've got a die size budget, not a transistor budget.
Scali: Once again, do you have ANY idea how high the ASPs are for Montecito? I agree there's no technical challenge for Intel here, but that's not the point. I'm most definitely not the one disregarding basic economics here. In fact...
Unless that was a typo or brainfart of epic proportions, this conversation is now over.

If you are trying to make the point that it is likely that GPUs would have lower margins than CPUs....

Well, shocker: it's been like that since the beginning of time. This isn't shocking or controversial.

But that has little to do with the costs of the wafers and chips, nor with their eventual price to end users. If you want to argue that TSMC sells wafers cheaper than Intel can produce them, then you'll have to come up with some facts.

Aaron Spink
speaking for myself inc.
 
That was one of Carmean's slides, though, as to what was driving them.

[Attached image: Image3-big.jpg (Carmean's slide)]
 
ShaidarHaran: I don't think cache vs logic is really the main factor here, and clearly you've got a die size budget, not a transistor budget.

One must also consider the very nature of the MPUs in question. At this moment in history, GPUs tend to have far more threads (and subsequently instructions) in flight than CPUs, due to the inherent parallelism of their respective workloads. Given the average (expected) workload of each MPU in question, CPUs tend to benefit more from a larger cache because of this: data is far more likely to be contained within the cache of a CPU, which is likely only to have a single "heavyweight" thread (or perhaps a few) to deal with at any given time, compared to the modern GPU, which is likely to have hundreds or even thousands of threads in flight at any given moment.
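
To make that concrete, here's a toy latency-hiding calculation; every number in it is invented purely for illustration:

Code:
# Toy model: how many in-flight threads does it take so the execution
# units never stall on memory? All cycle counts below are illustrative.
def threads_to_hide_latency(mem_latency_cycles, compute_cycles_per_load):
    # While one thread waits on a load, the others must supply enough
    # work to cover the full latency.
    return mem_latency_cycles // compute_cycles_per_load + 1

# CPU-like case: a large cache makes the *effective* latency short.
print(threads_to_hide_latency(mem_latency_cycles=20, compute_cycles_per_load=10))   # 3

# GPU-like case: small caches, most loads go all the way to DRAM.
print(threads_to_hide_latency(mem_latency_cycles=400, compute_cycles_per_load=4))   # 101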

I didn't figure that one out on my own; some brilliant engineer at Intel (and NV) got that one, I'm just an observer :p That being said, I'm no E.E., so I could be seeing things that simply aren't there.
 
Umm...

Doesn't that $260-$300 for the 9800GTX also get you 512MB of the fastest GDDR3 currently manufactured, along with a complex board?

Whereas the $180-$220 for the CPU is just for a CPU.

Does all that other stuff on the 9800GTX come for free?

My point was to make a comparison that would be as favorable to Scali's argument as possible, and even then it's still a close call; as you said, one must ignore the nature of the devices in question to do so, disregarding the PCB, RAM, and other components that make up a graphics card.

If you look at the previous FAIR comparison I made between Yorkfield and G92, you will see the tables are turned, and by a much wider margin. The same can be said for any current-process MPU comparison in which a GPU is compared to a CPU. I suspect this is likely to remain the case ad infinitum.
 
The margins are much higher on high-end CPUs than on high-end GPUs (it's easier to look at a similar performance segment); I'd venture a guess of 50% or more.

I was going to mention productization/SKU segmentation earlier, but lack the proper data to do more than make general claims based on assumptions. All we have to go by are gross margins, which are useful enough, especially in this case given that both companies seem to have very similar margins on their bread-and-butter products.
 
Unlike Nvidia, Intel has a track record of massively slicing into their margins in order to maintain (or bleed less) marketshare when they've deemed it necessary.

I.e., had Intel been in charge of the GeForce FX, its average selling price would have been significantly lower than the Radeon 9700 Pro's, since Intel tends to avoid losing marketshare if at all possible.

I hear what you're saying here, and theoretically you're correct. However, history is not on your side. During the post-K7 Netburst era (which you refer to by implication), Intel's ASPs were NOT lower than AMD's. They did fall compared to the previous era, when there was no real competition (particularly at the high end), so you're right in a sense (just not as you stated).
 
Good point, but did the price war really help AMD? Last quarter they didn't gain any marketshare vs. NV in desktop, even though AMD's margins and average selling prices were lower.

Correct me if I'm wrong here, but didn't the market itself grow by quite a substantial margin? So even though AMD didn't gain marketshare, they still shipped more units than the previous quarter, which of course led to higher revenues (and very close to profitability).

Also, I don't think Intel will try to pressure OEMs the way they did with AMD; it's a bit risky, right now and in the near future, to do something like that.

Pressure? No more so than the usual co-marketing agreements and discounts for exclusivity.
 
I cleaned up the back end of this thread too, to remove a whole bunch of pointless noise and bickering. Please, keep it friendly and knowledgeable.
 
Correct me if I'm wrong here, but didn't the market itself grow by quite a substantial margin? So even though AMD didn't gain marketshare, they still shipped more units than the previous quarter, which of course led to higher revenues (and very close to profitability).


The market did grow, but overall % share stayed the same. Pretty much all the price cuts did was stop the bleeding; to gain share back, there has to be a reason to gain it back: the price differential has to be great enough to overcome the performance difference, or performance has to be higher and not substitutable by price.

Pressure? No more so than the usual co-marketing agreements and discounts for exclusivity.


That sounds all rosy, but it doesn't always look good to the courts when there is only one company eating most of the pie ;)

Edit: forgot about the whole "system builders love to buy from one vendor" point; that's only if the entire system has a performance-to-price advantage, or vice versa. That's why we don't see system builders selling exclusively AMD + ATi parts yet.
 
Arun, sorry for the late reply but I've had a busy week. I'm also tired as hell...

I've tried to think out a long, involved reply to your points, but I just can't get it worded right. I'll just briefly address the main points of the conversation:

Firstly: we're looking at the entire "GPGPU/CUDA vs CPU" argument from totally different angles. You're focused on the chip revenue margins of GPUs and CPUs, whereas I'm focused on the revenue margins of end-user system integrators like Dell/HP/etc.

I know that CPU revenue margins are falling while GPU revenues are rising (in fact, someone posted a slide by Intel saying so).
But how does this affect sales on the consumer side of things? What does this all mean for the average Joe user, looking for a new PC?

My point is that OEMs would be able to make more profit by including "bonus" components such as a faster CPU or a larger LCD display, rather than by replacing an integrated Intel GPU (which can be regarded as a fixed cost) with a discrete NVIDIA/ATI GPU.

If Joe were faced with the decision of choosing either a system with a faster CPU or a system with a discrete GPU (all other components held constant), he would go for the faster CPU each and every time. Further, it costs OEMs less to bump CPU speed than to include a discrete GPU: consumer demand favours faster CPUs, and it's cheaper for OEMs to offer them.
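
To illustrate that cost argument with numbers, here's a hypothetical bill-of-materials sketch; every figure is invented, not sourced from any OEM:

Code:
# Hypothetical OEM upgrade economics; all dollar figures are made up.
cpu_speed_bump_cost   = 25.0  # swap in a faster CPU bin
discrete_gpu_add_cost = 60.0  # low-end discrete card: die + RAM + PCB + cooler
retail_upsell         = 75.0  # what the OEM charges for either upgrade

print(f"CPU bump margin:     ${retail_upsell - cpu_speed_bump_cost:.0f}")    # $50
print(f"Discrete GPU margin: ${retail_upsell - discrete_gpu_add_cost:.0f}")  # $15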

As you said, though, more advertising is needed to make people more aware of decisions such as these, so that we do get a change in consumer preferences toward more balanced systems (to help spread CUDA/GPGPU).


Once again, my apologies if this reply sounds really slack and bland.
 
Arun, sorry for the late reply but I've had a busy week. I'm also tired as hell...
No problem, less frequent replies mean less time required for me to reply too! ;)

My point is that OEM's would be able to make more profits by including "bonus" components such as a faster CPU or a larger LCD display, rather than replacing an integrated Intel GPU (which can be regarded as a fixed cost) with a discrete NVIDIA/ATI GPU.
As I said, I agree about LCDs up to a certain (but rather large) extent, but disagree about CPUs.

If Joe was faced with the decision of either choosing a system with a faster CPU or a system with a GPU (with all other components held constant), he would go for the faster CPU each and every time. Further, it costs OEM's less to upgrade CPU speed rather than include an integrated GPU--Consumer demand favours faster CPU's and it's cheaper for OEM's to offer faster CPU's.
I couldn't disagree more. You cannot expect consumers to blindly continue paying for more expensive CPUs even when they don't get more from it. Same for DRAM. Same for HDDs, to a lesser extent. These kinds of changes are slow, but they do happen, one customer at a time. The key is this claim from Jen-Hsun in yesterday's CC:
Jen-Hsun Huang said:
The VIA platform, we are still in the design and so it’s a little bit too early to make any significant announcements. We are not planning to have production until spring of next year. I think it highlights that the VIA processor is very good and it is -- and it runs Vista Premium. It tells you that if you want a Vista Premium platform, a processor less than $20 certainly and soon, in another year or so, less than $10, with a sufficient GPU will give you a great experience on Vista Premium.
Joe Consumer might not change his habits when the high-end stops adding value; but he will when the low-end becomes overkill and you can do just fine with an ultra-low-end CPU for eight or nine dollars. Does that mean he won't buy a more expensive CPU? No, but he certainly will consider whether he gets 20x more value from buying a $160 CPU; and the answer will clearly be no. So he's likely to settle for something between the two.
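
A trivial sketch of that 20x question; only the prices come from the discussion, and the performance ratio below is a made-up placeholder:

Code:
# Price ratio vs. (hypothetical) performance ratio between an ultra-low-end
# and a high-end CPU; the 3x performance figure is invented for illustration.
cheap_cpu  = {"price_usd": 8.0,   "perf": 1.0}  # "good enough" ultra-low-end
pricey_cpu = {"price_usd": 160.0, "perf": 3.0}  # high-end, assumed ~3x faster

price_ratio = pricey_cpu["price_usd"] / cheap_cpu["price_usd"]  # 20.0
perf_ratio  = pricey_cpu["perf"] / cheap_cpu["perf"]            # 3.0
print(f"{price_ratio:.0f}x the price for ~{perf_ratio:.0f}x the performance")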

It is important to understand that both low-end and high-end commoditization go together. It's a two-way halo effect. If you see that a GPU can add more value than a CPU in the high end, you're likely to go for a mid-range GPU instead of an IGP in the low end, even if it's not as important in that segment of the market. And if you see that an ultra-low-end CPU is good enough, it'll also affect the popularity of mid-range and high-end CPUs, not just low-end CPUs.

As you said, though, more advertising is needed to make people more aware about decisions such as these, so that we do get a change in consumer preferences toward more balanced systems (to help spread CUDA/GPGPU).
http://adage.com/agencynews/article?article_id=126871 - not perfect, but a good start. We'll see how it turns out.

Once again, my apologies if this reply sounds really slack and bland.
No problem! :) It's not always easy if even possible to come up with highly original ideas.
 
No problem, less frequent replies mean less time required for me to reply too! ;)

As I said, I agree about LCDs up to a certain (but rather large) extent, but disagree about CPUs.

I couldn't disagree more. You cannot expect consumers to blindly continue paying for more expensive CPUs even when they don't get more from it. Same for DRAM. Same for HDDs, to a lesser extent. These kinds of changes are slow, but they do happen, one customer at a time. The key is this claim from Jen-Hsun in yesterday's CC:
Joe Consumer might not change his habits when the high-end stops adding value; but he will when the low-end becomes overkill and you can do just fine with an ultra-low-end CPU for eight or nine dollars. Does that mean he won't buy a more expensive CPU? No, but he certainly will consider whether he gets 20x more value from buying a $160 CPU; and the answer will clearly be no. So he's likely to settle for something between the two.

It is important to understand that both low-end and high-end commoditization go together. It's a two-way halo effect. If you see that a GPU can add more value than a CPU in the high end, you're likely to go for a mid-range GPU instead of an IGP in the low end, even if it's not as important in that segment of the market. And if you see that an ultra-low-end CPU is good enough, it'll also affect the popularity of mid-range and high-end CPUs, not just low-end CPUs.

http://adage.com/agencynews/article?article_id=126871 - not perfect, but a good start. We'll see how it turns out.

No problem! :) It's not always easy if even possible to come up with highly original ideas.

I think it is somewhere in the middle. At any rate, how do I get to read that article? I'm stopped by a login page.

Would you summarize it please? Or is it really worth registering on that site?
 
Looks like Nvidia has a soft spot for ray tracing after all.

http://www.tomshardware.com/news/nvidia-intel-larrabee,5458.html

PC Perspective is reporting that Nvidia will be acquiring RayScale, a ray tracing software company. An official announcement is expected in the next few days, if not sooner.

According to RayScale’s website, the company “provides ray tracing photo-realistic solutions” and makes products for Autodesk Maya that utilize a hybrid rendering technique. No other details were revealed about the acquisition or how Nvidia will be utilizing RayScale’s expertise.
 