[Beyond3D Article] Intel presentation reveals the future of the CPU-GPU war

Discussion in 'Architecture and Products' started by Arun, Apr 11, 2007.

  1. andypski

    Regular

    Joined:
    May 20, 2002
    Messages:
    584
    Likes Received:
    28
    Location:
    Santa Clara
    Umm...

    Doesn't that $260-$300 for the 9800GTX also get you 512MB of the fastest GDDR3 currently manufactured, along with a complex board?

    Whereas the $180-$220 for the CPU is just for a CPU.

    Does all that other stuff on the 9800GTX come for free?
     
  2. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    Retail price does get affected, but the effect trickles down, so it's limited by market economics, which come down to competition and production capacity or constraints.

    The main thing, though: Intel won't have an issue making the chip, but will they offset the margins of their high-end CPU production for lower-margin GPUs on their best processes? It doesn't seem that would be logical; even a small amount, let's say 10% of their fab capacity, would be a big hit for them.

    The margins are much higher on high-end CPUs than on high-end GPUs (it's easier to look at a similar performance segment); I'd venture a guess of 50% or more.
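    As a rough illustration of how that kind of margin gap works out arithmetically, here is a minimal sketch; every price and cost below is a made-up placeholder for illustration, not a real figure for any actual product:

        # Illustrative-only gross margin comparison; all numbers are hypothetical placeholders.
        def gross_margin(asp, unit_cost):
            """Gross margin as a fraction of the average selling price (ASP)."""
            return (asp - unit_cost) / asp

        # Hypothetical high-end CPU: modest die built on Intel's own fabs, sold as a bare chip.
        cpu_margin = gross_margin(asp=200.0, unit_cost=60.0)

        # Hypothetical high-end GPU: larger die bought from a foundry, sold on to board partners.
        gpu_margin = gross_margin(asp=120.0, unit_cost=70.0)

        print(f"CPU gross margin: {cpu_margin:.0%}")  # 70% with these placeholder numbers
        print(f"GPU gross margin: {gpu_margin:.0%}")  # 42% with these placeholder numbers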
     
  3. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    16,248
    Likes Received:
    5,201
    Unlike Nvidia, Intel has a track record of massively slicing into their margins in order to maintain market share (or bleed less of it) when they've deemed it necessary.

    IE - Had Intel been in charge of the Geforce FX, its average selling price would have been significantly lower than the Radeon 9700 Pro's, since Intel tends to avoid losing market share if at all possible.

    So if it came down to a war of attrition, Intel would have a bit of an advantage.

    However, I still doubt very much that Intel wants to bury Nvidia or take on Nvidia in the enthusiast class of graphics performance. And from the looks of it, neither does AMD. That leaves Nvidia competing only with itself in the enthusiast class.

    Where "I" personally believe Intel will target is HPC, specialized computing and the mainstream (perhaps even performance mainstream) with Larrabee. A similar market to what AMD is targetting with Rv670 and Rv770. Although I'm not sure how much effort AMD is putting at targetting HPC and specialized computing.

    In those areas Intel has a great chance to succeed and gain market share from both Nvidia and AMD, if they don't pull out and abandon it as they did back in the i740 days. Even if they end up making lower margins to do so.

    After all, big OEMs love having systems with as many components from one manufacturer as possible. There's cost savings, support is easier and less costly, validation is generally simpler (and thus less costly), etc. There's a lot of benefits.

    Although Nvidia does have one thing going for it: even IF the above were to come true (and it might not), many OEMs will still buy product from them just to maintain some bargaining position against Intel.

    If this were anyone but Nvidia and Intel possibly coming to a head, I'd say one or the other would win from pure marketing alone. However, both companies generally do a magnificent job marketing their products.

    Intel also has a bit of an advantage in being able to put more pressure on OEMs through deals and discounts for other Intel Products (CPUs/Chipsets) if they were to bundle a "Larrabee GPU".

    So even though Intel is at a disadvantage now, you simply cannot discount or ignore them. And you can never, ever say something is "impossible" in the tech world.

    Once a company believes it's impossible for their competition to one-up them, they've lost.

    Regards,
    SB
     
  4. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    Good point, but did the price war really help AMD? Last quarter they didn't gain any market share vs. nV in desktop, even though AMD's margins and average selling prices were lower.

    Also, I don't think Intel will try to pressure OEMs the way they did with AMD; it's a bit risky right now and in the near future to do something like that.
     
  5. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,873
    Likes Received:
    767
    Location:
    London
    It's worth remembering that in a few years Intel won't be making chipsets like today as most of what we call a chipset will be on the CPU die.

    Also, Larrabee SKUs, top-to-bottom in all their variations, should produce more revenue than chipsets.

    Of course at some distant point Larrabee will also disappear "onto the CPU".

    Jawed
     
  6. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    That is true, but the problem is there is a lead time for that. If Larrabee doesn't pick up market share or isn't competitive enough in the mid-range and high end, it will be relegated to an IGP replacement. So the first few iterations of Larrabee will probably have to be successful, I would say at least break even (which would still be a loss for Intel, given what I mentioned earlier), to be sustained.
     
  7. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,873
    Likes Received:
    767
    Location:
    London
    Medium term, I don't see how NVidia can keep up in HPC. For HPC, who wants to buy a data-parallel processor with loads of graphics baggage when Larrabee is practically designed for HPC, with just a few ounces of fat?

    And long term all of HPC is going to be commodity CPU based, which means x86+integrated-DPP.

    Jawed
     
  8. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    If you are trying to make the point that it is likely that GPUs would have lower margins than CPUs....

    Well, shocker, it's been like that since the beginning of time. This isn't shocking or controversial.

    But that has little to do with the cost of the wafers and chips, nor with their eventual price to end users. If you want to argue that TSMC sells wafers cheaper than Intel can produce them, then you'll have to come up with some facts.

    Aaron spink
    speaking for myself inc.
     
  9. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    213
    Location:
    Uffda-land
    That was one of Carmean's slides, though, as to what was driving them.

    [slide image]
     
  10. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    3,984
    Likes Received:
    34
    One must also consider the very nature of the MPUs in question. At this moment in history, GPUs tend to have far more threads (and subsequently instructions) in flight than CPUs, due to the inherent parallelism of their respective workloads. Given the average (expected) workload of each MPU in question, CPUs tend to benefit more from a larger cache because of this: data is far more likely to be contained within the cache of a CPU, which likely has only a single "heavyweight" thread (or perhaps a few) to deal with at any given time, compared to the modern GPU, which is likely to have hundreds or even thousands of threads in flight at any given moment.
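    A quick back-of-envelope way to see that trade-off, assuming some rough, illustrative numbers for memory latency and instruction mix (they are not measurements of any particular chip):

        # Rough latency-hiding estimate; every figure here is an illustrative assumption.
        memory_latency_cycles = 400  # assumed DRAM round-trip after a cache miss
        ops_between_accesses = 20    # assumed arithmetic instructions per memory access, per thread

        # A CPU running one or a few threads has to cover that latency with cache hits,
        # which is why it spends its transistor budget on large caches and out-of-order logic.
        # A GPU instead keeps enough threads resident that, while some wait on memory,
        # others still have instructions ready to issue.
        threads_needed = memory_latency_cycles / ops_between_accesses
        print(f"Resident threads per execution unit needed to hide the latency: {threads_needed:.0f}")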

    I didn't figure that one out on my own; some brilliant engineer at Intel (and NV) got that one, I'm just an observer :p That being said, I'm no E.E., so I could be seeing things that simply aren't there.
     
  11. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    3,984
    Likes Received:
    34
    My point was to make a comparison that would be as favorable to scali's argument as possible, and even then it's still a close call; and as you said, to do so one must ignore the nature of the devices in question and disregard the PCB, RAM, and other components that make up a graphics card.

    If you look at the previous FAIR comparison I made between Yorkfield and G92, you will see the tables are turned, and by a much wider margin. The same can be said for any current-process MPU comparison in which a GPU is compared to a CPU. I suspect this is likely to remain the case ad infinitum.
     
  12. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    3,984
    Likes Received:
    34
    I was going to mention productization/SKU segmentation earlier, but lack the proper data to do more than make general claims based on assumptions. All we have to go by are gross margins, which are useful enough, especially in this case given that both companies seem to have very similar margins on their bread-and-butter products.
     
  13. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    3,984
    Likes Received:
    34
    I hear what you're saying here, and theoretically you're correct. However, history is not on your side. During the post-K7 Netburst era (which you refer to by implication), Intel's ASPs were NOT lower than AMD's. They did fall compared to the previous era when there was no real competition though (particularly at the high-end), so you're right in a sense (just not as you stated).
     
  14. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    3,984
    Likes Received:
    34
    Correct me if I'm wrong here, but didn't the market itself grow by quite a substantial margin? So even though AMD didn't gain market share, they still shipped more units than the previous quarter, which of course led to higher revenues (and very close to profitability).

    Pressure? No more so than the usual co-marketing agreements and discounts for exclusivity.
     
  15. Rys

    Rys PowerVR
    Moderator Veteran Alpha

    Joined:
    Oct 9, 2003
    Messages:
    4,158
    Likes Received:
    1,439
    Location:
    Beyond3D HQ
    I cleaned up the back end of this thread too, to remove a whole bunch of pointless noise and bickering. Please, keep it friendly and knowledgeable.
     
  16. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    The market did grow, but overall % share stayed the same. Pretty much all the price cuts did was stop the bleeding. To gain share back there has to be a reason to gain it back: the price differential has to be great enough to overcome the performance difference, or performance has to be higher and not something price can substitute for.


    That sounds all rosy, but it doesn't always look good to the courts when there is only one company eating most of the pie ;)

    Edit: forgot about the whole "system builders love to buy from one vendor" thing; that's only if the entire system has a performance-to-price advantage or vice versa. That's why we don't see system builders selling exclusively AMD + ATi parts yet.
     
    #216 Razor1, May 9, 2008
    Last edited by a moderator: May 9, 2008
  17. Wesker

    Newcomer

    Joined:
    May 3, 2008
    Messages:
    103
    Likes Received:
    0
    Location:
    Australia
    Arun, sorry for the late reply but I've had a busy week. I'm also tired as hell...

    I've tried to think out a long, involved reply to your points, but I just can't get them worded right. I'll just briefly address the main points of the conversation:

    Firstly, we're looking at the entire "GPGPU/CUDA vs CPU" argument from totally different angles. You're focused on the chip revenue margins of GPUs and CPUs, whereas I'm focused on the revenue margins of end-user system integrators like Dell/HP/etc.

    I know that CPU revenue margins are falling, while GPU revenues are rising (in fact someone posted a slide by Intel saying so).
    But how does this affect sales on the consumer side of things? What does this all mean for the average Joe user, looking for a new PC?

    My point is that OEMs would be able to make more profit by including "bonus" components such as a faster CPU or a larger LCD display, rather than replacing an integrated Intel GPU (which can be regarded as a fixed cost) with a discrete NVIDIA/ATI GPU.

    If Joe were faced with the decision of choosing either a system with a faster CPU or a system with a discrete GPU (with all other components held constant), he would go for the faster CPU each and every time. Further, it costs OEMs less to upgrade CPU speed than to include a discrete GPU: consumer demand favours faster CPUs, and it's cheaper for OEMs to offer faster CPUs.

    As you said, though, more advertising is needed to make people more aware of decisions such as these, so that we do get a change in consumer preferences toward more balanced systems (to help spread CUDA/GPGPU).


    Once again, my apologies if this reply sounds really slack and bland.
     
  18. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    No problem, less frequent replies means less time required for me to reply too! ;)

    As I said, I agree about LCDs up to a certain (but rather large) extent, but disagree about CPUs.

    I couldn't disagree more. You cannot expect consumers to blindly continue paying for more expensive CPUs even when they don't get more from it. Same for DRAM. Same for HDDs, to a lesser extent. These kinds of changes are slow, but they do happen, one customer at a time. The key is this claim from Jen-Hsun in yesterday's CC:
    Joe Consumer might not change his habits when the high-end stops adding value; but he will when the low-end becomes overkill and you can do just fine with an ultra-low-end CPU for eight or nine dollars. Does that mean he won't buy a more expensive CPU? No, but he certainly will consider whether he gets 20x more value from buying a $160 CPU; and the answer will clearly be no. So he's likely to settle for something between the two.

    It is important to understand that low-end commoditization and high-end commoditization go together. It's a two-way halo effect. If you see that a GPU can add more value than a CPU at the high end, you're likely to go for a mid-range GPU instead of an IGP at the low end, even if it's not as important in that segment of the market. And if you see that an ultra-low-end CPU is good enough, it'll also affect the popularity of mid-range and high-end CPUs, not just low-end CPUs.

    http://adage.com/agencynews/article?article_id=126871 - not perfect, but a good start. We'll see how it turns out.

    No problem! :) It's not always easy, if even possible, to come up with highly original ideas.
     
  19. apoppin

    Regular

    Joined:
    Feb 12, 2006
    Messages:
    255
    Likes Received:
    0
    Location:
    Hi Desert SoCal
    I think it is somewhere in the middle. At any rate, how do I get to read that article? I am stopped by a login page.

    Would you summarize it please? Or is it really worth registering on that site?
     
  20. Sound_Card

    Regular

    Joined:
    Nov 24, 2006
    Messages:
    936
    Likes Received:
    4
    Location:
    San Antonio, TX
    Looks like Nvidia has a soft spot for ray tracing after all.

    http://www.tomshardware.com/news/nvidia-intel-larrabee,5458.html

     