Intel discrete graphics chips confirmed

Discussion in 'Beyond3D News' started by Tim Murray, Jan 22, 2007.

  1. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Uttar buried my mistake, not me, btw. :lol: Hiya, Charlie.

    Intel's pages say what they say. And they say Larrabee is starting out as Discrete for high-end. Where the convergence comes in could be interesting tho. Are you suggesting that the first Larrabee cards won't be AMD-compatible (AMD cpu, that is)? Or if they are, they will be much faster with Intel CPUs than comparable AMD cpus?
     
  2. Tim Murray

    Tim Murray the Windom Earle of mobile SOCs
    Veteran

    Joined:
    May 25, 2003
    Messages:
    3,278
    Likes Received:
    66
    Location:
    Mountain View, CA
    Well, Intel hasn't touched its Eurasia IP yet, has it? I definitely wouldn't be surprised if Charlie's correct in the long run, but if it's going to try to compete within the next year or two, I wouldn't be surprised if it borrows heavily from that chip.
     
  3. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
    Or maybe ATI wanted to get into the converged market before AMD and Intel did so and decimated their business? Maybe it didn't want to be in the position Nvidia is in, i.e. competing directly against AMD and Intel?

    I think you're missing a couple of things:

    - Intel heavily licensed Imagination tech last year. It was a big deal. It's not a standing start, and even if it were, Intel has the money and expertise to invest for results several years away.

    - Intel is a monster. It has lots of money, and its own fabs. It is always 1-2 process nodes ahead of Nvidia. It has a ton of engineering expertise, and the money to buy more. You'd have to be very foolish to assume that Intel won't be able to compete effectively. AMD buying ATI has just taken away any regulatory fears Intel might have had about being seen as a monopoly in the graphics market.

    Just like a Microsoft or an IBM, Intel isn't just a big fish, it's a honking big shark. Nvidia isn't somehow going to be magically safe from competition just because Intel chose not to swim in the same waters up to this point.
     
  4. wolf2

    Newcomer

    Joined:
    Jan 23, 2007
    Messages:
    29
    Likes Received:
    1
    Well between us, we have outlined the two ends of the argument, and either could be right.

    Some thoughts:
    Intel is undoubtedly a marketing shark. There is no argument on that point. I question their ability to dominate in advanced technologies, especially ones outside their fairly narrow expertise in conventional CISC CPU design.

    Intel's node advantage only comes into play when a core can be produced for a number of years. Nvidia will see to it that Intel cannot use that node advantage in the 3D graphics space, where major core redesigns emerge every 9-18 months. This model can easily go on for another 20 years or more.
     
  5. glw

    glw
    Newcomer

    Joined:
    Aug 29, 2003
    Messages:
    64
    Likes Received:
    0
    To describe Intel's expertise as being in 'conventional CPU CISC design' ignores a history of non-CISC microprocessors such as the i960, i860, Itanium, and XScale; network processors; and a lot of chipset expertise extending into IGPs, ethernet controllers, and storage subsystems. Then there's their work on libraries, compilers, and support for ISVs. And don't forget that Intel has inherited and attracted some of the best people in the industry, who worked on class-leading microprocessors elsewhere.

    If anyone can catch up with ATI and NVIDIA, it's Intel.

    That said I think Groo is right and this isn't about building a new GPU as such.
     
  6. Voltron

    Newcomer

    Joined:
    May 25, 2004
    Messages:
    192
    Likes Received:
    3
    Oh yes. Intel, with its terrifying speed to market, will easily catch up. Let's see, how long did the Itanic take to come out?

    Wolf's point about product cycles is right: they are a huge advantage for NVIDIA. And it's not like they haven't been anticipating this day since they were formed in 1993.

    Then there is the software. From top to bottom, Gelato, CUDA, drivers, compilers, plus all the in-house design tools that they have been building to make such large chips so quickly. This doesn't grow on trees.

    Anyhow, the market for streaming processors looks like it's going to be much larger than most people anticipate, if you buy into NVIDIA's contention that they will be the DSP of the 21st century, so there is probably going to be room for multiple players.
     
  7. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
    The post above mine explains how Intel has a finger in many, many pies. If you think Intel is just x86 CPUs, you don't know much about them. And if you recall, until the P4 debacle, Intel dominated for years, were able to take a massive hit from the A64 (during which time they still dominated), and have come back with the undeniably impressive Core 2 dual and quad cores.
    Intel invest massively in R&D, and can dominate anything they put their mind to.

    IMO, the biggest reason for Intel to stay away from graphics has been the risk of an anti-trust investigation, but now that ATI and AMD have merged, that barrier has gone. Combined with the new graphics requirements of Vista, meaning every PC has to have much more graphical power by default, I think Intel decided now was the time to get a slice of the market.

    Intel is already 1-2 nodes ahead of Nvidia. That means they can leverage those nodes into advanced designs and better margins due to more chips per wafer, all in their own fabs. IIRC, Intel is already making 65nm chips, and working well on 45nm. That's a big process advantage.
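    To put a rough number on the "more chips per wafer" point, here is a back-of-the-envelope sketch. It assumes die area scales with the square of the feature size and ignores yield, reticle limits, and design differences, so it is an idealization, not a real foundry calculation:

    ```python
    # Rough sketch: relative die count per wafer after a process shrink.
    # Assumption: die area scales with the square of the feature size;
    # yield, reticle, and design differences are ignored.

    def relative_dies_per_wafer(old_nm: float, new_nm: float) -> float:
        """How many times more dies of the same design fit per wafer
        after shrinking from old_nm to new_nm (idealized scaling)."""
        return (old_nm / new_nm) ** 2

    # A one-node lead (90nm -> 65nm) already buys roughly 1.9x the dies
    # per wafer under this idealization.
    print(round(relative_dies_per_wafer(90, 65), 2))  # ~1.92
    ```

    Even this crude estimate shows why a one- or two-node process lead translates directly into a cost-per-chip advantage.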
     
  8. Groo The Wanderer

    Regular

    Joined:
    Jan 23, 2007
    Messages:
    334
    Likes Received:
    2
    I'll probably write it up in the near future. I have been sitting on it for months now at the request of certain people, but the word is getting out, so I might have to out it. Let's just say I am not speculating here.

    As for the crediting, not sure why he didn't, but I am not Dean. Write him and bitch. If you do follow the links, we did report much the same thing in August:
    http://www.theinquirer.net/default.aspx?article=33836
    I do agree that if they felt it was enough of a story to write about, you should have been credited, but I don't think the news is all that new.

    -Charlie
     
  9. wolf2

    Newcomer

    Joined:
    Jan 23, 2007
    Messages:
    29
    Likes Received:
    1
    Well, I hate to clue you in on this, but they're all gone. Sold or buried. Every one of them except IGP, where they've protected the business with bus licensing and royalties.


    Yeah. I agree with this. It's really about new graphics paradigms. Here again, new paradigms are not Intel's strength. I say again: they're great at manufacturing, great at marketing, but they're not in the business of identifying and developing for new, unproven markets.

    Now if this were an argument that Intel is actually gearing up to produce a better IGP, then I'd say that makes some sense, because realistically, I think that's all they can pull off at best.


    If chipsets are any indicator of future performance, Intel has historically used their most advanced processes for CPU manufacture. Chipsets have historically been used to fill up older fabs and extend a fab's EOL. By that measure, Intel will be at least half a node and perhaps as much as two nodes behind.
     
  10. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
    SGX545 (which I believe to be "Muse") is supposed to be sampling in silicon in H2 '07. Sounds more like a laptop/PC IGP thingy to me.

    SGX555 (which I'd say is "Athena") is mentioned "as another core on the roadmap", which sounds more like 2008 or later.

    Watch the lines in the right part of the diagram:

    http://www.imgtec.com/Investors/Presentations/Interim06/index.asp?Slide=33

    SGX510 (lowest end mobile phone SGX) and SGX555 are the two cores that are on the roadmap beyond H2 2007 and marked as "longer term".
     
  11. crystall

    Newcomer

    Joined:
    Jul 15, 2004
    Messages:
    149
    Likes Received:
    1
    Location:
    Amsterdam
    Entering the GPU market requires much more than just having the fab capacity to do it. People usually seem to forget that the modern GPU is only one part of the equation, the other part being drivers, and if current Intel drivers are any indication of what they could do in the future, then nVidia shouldn't bother much.
     
  12. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
    Yes, I know; that was in direct reply to a point about fabs and processes. Nvidia would be very foolish not to "bother much" when a 900 lb gorilla like Intel comes into your market. The same thing Nvidia has done to other companies is what Intel could do to them. Pretending they don't matter is the worst thing a company could do.
     
  13. Thorburn

    Regular

    Joined:
    Oct 8, 2006
    Messages:
    323
    Likes Received:
    19
    Location:
    UK
    Itanium certainly hasn't.
     
  14. crystall

    Newcomer

    Joined:
    Jul 15, 2004
    Messages:
    149
    Likes Received:
    1
    Location:
    Amsterdam
    Intel may be a large company, but they are basically starting from scratch in that market, whilst nVidia (and now AMD, though I'd still like to call it ATi) are well established. Historically, Intel has already tried to enter various markets with pitiful results; they are very good at their core business (MPUs and related chipsets) but for whatever reason fail to do well in other ones. The most recent example of this was their selling of their ARM license and entire XScale line of processors to Marvell after their poor results in the embedded arena.
     
  15. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Resources will eventually produce a decent (if not necessarily dominant) result in the long term, so long as stick-to-it-ness is applied. That has usually worked out for MS, for instance ("wait for 3.0" has been the joke about MS for as long as I can remember). Intel has historically lacked the willpower to keep slugging at what they perceive to be "peripheral" parts of their business when they aren't successful in the short to mid term. After a few years some senior exec bangs his hand on the table, mutters about blowing billions on something that doesn't help them with CPUs and isn't making money on its own, and that's that.

    I think what some of us think may be happening that's different this time, is that Intel may have come to the conclusion that this *is* central to staying competitive long-term on the cpu side. And that could make quite a large difference in their willingness to throw sufficient resources and long-term tenacity at things like driver development. . .
     
  16. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
    Intel is not just a load of guys sitting around saying "wouldn't it be nice if we made our own graphics chips".

    Intel can afford to keep going at it for however long it takes, and they will do so now that strong graphics have become part of the core processing necessary in a PC, and with Fusion-style products will soon actually become part of the CPU. The market is big enough, their direct competitors are doing it, and Intel will do it too. GPUs are now going to become part of their core business. Maybe not for a few years, but it will happen, and as I said in an earlier post, they are not starting from scratch. They already have their own IP, engineers and fabs, and they heavily licensed the Imagination technology last year.

    For a long time Intel has been talking about moving graphics back to the CPU, and has been unsuccessful in that the industry has not gone that route. Now they've come at it from the other angle: taking the GPU and making it part of the CPU, just as AMD is doing. It's just another core in the continued expansion of multicore CPUs, and an inevitable course once you realise that you can't just keep making bigger and faster chips, and have to use parallelism and multiple cores to continue forwards.
     
  17. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    Which is why Intel is the driving force in wireless data communications?
    Which is why Intel has the most advanced semiconductor process in the world?
    I could go on and on.

    Intel has dominated in advanced technologies longer than anyone else. No one has been able to compete long term.

    The rate at which new product introductions are made has little to do with a process node advantage. New GPUs could be on a one-week tick rate; it wouldn't change the fact that Intel is 1-2 generations ahead of Nvidia when it comes to semiconductor manufacturing technology. I really don't think you understand how semiconductor manufacturing works. The advantage of a better process is always there; it doesn't matter whether you make a particular product on the line for 9 weeks or 9 months.

    Aaron Spink
    speaking for myself inc.
     
  18. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    I completely agree. Think about all the other non-CPU businesses that Intel has been able to completely dominate after investing untold billions: embedded processors, telecom chips (spanning the full range from backbone SONET/fiber to DSL), WLAN, WiMax (may still work out, who knows?), consumer video, DSPs, etc. It's a long list and I'm sure there are more.

    Call me a bit sceptical: Intel simply hasn't proven able to expand into new fields, despite pumping tons of money into them.
    To be honest, it's a bit of a mystery why that is, and your reasoning is definitely very, well, reasonable. :wink: It's just that somehow all their initiatives fail miserably.

    Well, they're not. They tried for years to make their own WLAN chips, but eventually had to use TI and Atheros for the Centrino platform. And WiMax is still not the success they have long been predicting. Maybe it will play out. Maybe not...
    Because that's what's needed to produce a kick-ass CPU? But not much else...
    Please do.
     
    #58 silent_guy, Jan 26, 2007
    Last edited by a moderator: Jan 27, 2007
  19. zsouthboy

    Regular

    Joined:
    Aug 1, 2003
    Messages:
    563
    Likes Received:
    9
    Location:
    Derry, NH
    Here's hoping that Intel puts something out that puts both nVidia and ATi on their toes.

    And drivers to match.


    I don't actually have anything to add to the discussion. :)
     
  20. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
    Intel reinvents the transistor.

    Looks like 45nm products in 2H '07, several years ahead of previous estimates.

    Some of this could be overhyped in answer to last month's press release on their new ultra-low-k materials.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.