Intel discrete graphics chips confirmed

Discussion in 'Beyond3D News' started by Tim Murray, Jan 22, 2007.

  1. Techno+

    Regular

    Joined:
    Sep 22, 2006
    Messages:
    284
    Likes Received:
    4
Do you guys remember the Terascale chips? I've heard many people speculating that Intel's GPU is in the form of a CPU, and the 'many-core' description fits right in. Or 'many cores' could mean a GPU with several execution units, each execution unit having a number of pipes; if memory serves, this is also how Imagination Technologies describes their architecture. GPUs are becoming more and more general purpose, and the G80 proves it (there are rumours that G80 is as programmable as a CPU and can emulate x86 code well enough). I wouldn't really doubt it if they made an x86-compatible GPU, or just added graphics instructions to the CPU.
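The "several execution units, each with a number of pipes" idea can be sketched as groups of SIMD lanes running in lockstep. A toy model, not any vendor's actual design (the lane count and cycle costs are made up for illustration):

```python
# Toy model of a GPU "execution unit": a group of SIMD lanes that run
# the same instruction in lockstep. When lanes disagree on a branch,
# both paths execute with inactive lanes masked off, so a divergent
# branch costs the sum of both paths rather than just one of them.

def simd_branch_cost(predicates, cost_taken, cost_not_taken):
    """Cycles for one branch across a lane group under lockstep execution."""
    any_taken = any(predicates)
    any_not_taken = not all(predicates)
    cost = 0
    if any_taken:
        cost += cost_taken       # taken path runs; other lanes are masked
    if any_not_taken:
        cost += cost_not_taken   # then the fall-through path runs
    return cost

# All 8 lanes agree: only one path executes.
print(simd_branch_cost([True] * 8, 10, 12))            # 10
# One lane diverges: both paths execute back to back.
print(simd_branch_cost([True] * 7 + [False], 10, 12))  # 22
```

This is why "branching coherence" matters on such designs: the cost per branch depends on whether the lanes in a group agree, not on what any single lane does.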
     
  2. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    Finally, some competition for Nvidia!

Warning for Nvidia: unlike ATI, Intel will not play to lose. Mark my words.
     
  3. Arun

    Arun Unknown.
    Legend

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
    It isn't, and we already discussed that in one of your own threads recently, so stop coming back with that rumour! ;)
Well, if you look at Terascale, it has very nice performance/transistor in terms of GFLOPS for massively parallel workloads. It also has better branching coherence than a GPU. But at the same time, it's likely much worse at latency hiding!

And remember, even that is not x86; it's a custom instruction set. Intel themselves said it would have been bigger and more power-hungry had they made it work with x86 (duh!). But it'll be interesting to see how general-purpose Intel's architecture is, definitely.
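The latency-hiding trade-off can be put in back-of-the-envelope terms: a core's ALUs stay busy only if enough threads are resident that, while one waits on memory, the others have arithmetic to run. A sketch with hypothetical latency and compute figures (not Terascale's or any real GPU's numbers):

```python
# Back-of-the-envelope latency-hiding model: while one thread waits
# mem_latency_cycles on a memory access, the other resident threads must
# supply that many cycles of arithmetic between them, or the ALUs stall.
# All cycle counts below are hypothetical, for illustration only.

def threads_to_hide_latency(mem_latency_cycles, compute_cycles_per_thread):
    """Minimum resident threads so the ALUs never stall on memory."""
    # Ceiling division: how many other threads cover the latency window,
    # plus the one thread that is waiting.
    return 1 + -(-mem_latency_cycles // compute_cycles_per_thread)

print(threads_to_hide_latency(200, 10))   # 21: little work per access, deep threading (GPU-style)
print(threads_to_hide_latency(200, 100))  # 3: lots of work per access, few threads suffice
```

The GPU answer is the first line: keep many cheap thread contexts resident. A design with fewer contexts per core needs caches or much higher arithmetic intensity to stay fed.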


    Uttar
     
  4. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
    This puts Nvidia between a rock and a hard place, with Intel on one side, and AMD/ATI on the other, especially when we start seeing multi-core CPUs with GPU cores. It's not like Nvidia is going to be able to compete by making CPU/GPU hybrids. They don't have the engineers, the fabs, or the patents.

    Nvidia's out in the cold while the two big players head off into a future of multicore GPU/CPU/Physics alongside their discrete graphics and chipset products.
     
    #24 Bouncing Zabaglione Bros., Jan 23, 2007
    Last edited by a moderator: Jan 23, 2007
  5. Tim Murray

    Tim Murray the Windom Earle of mobile SOCs
    Veteran

    Joined:
    May 25, 2003
    Messages:
    3,278
    Likes Received:
    66
    Location:
    Mountain View, CA
    All aboard the NV/Intel merger rumor train then, with JHH as CEO?
     
  6. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
Supposedly this has always been the thing that would kibosh an Intel buyout (and what was rumoured to have killed the rumoured AMD/Nvidia merger): none of the bigger companies will accept Jen-Hsun taking over. Intel has a pretty rocky relationship with Nvidia, despite calling them in to fill the chipset gap when ATI went to the dark side.

    It might all be moot in a few more years if Nvidia gets squeezed as hard as I expect them to be. Before it was Nvidia vs ATI. Now it's Nvidia vs AMD and Intel. Once those Intel and AMD fabs come on line for graphics chips, Nvidia could be hurting on the margins.
     
  7. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Don't forget Jakob's scenario where NVIDIA just buys AMD/ATI outright! :smile:
     
  8. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
    Maybe Nvidia will just become an IP licensing company? :wink: When Nvidia have to close their doors, Intel can just pick over the bones of any interesting patents.

    I wonder if Intel looked at buying Nvidia, but just thought they could compete better and cheaper doing the work themselves and using the heavily licensed Imagination tech. Sorry if I'm restarting a "return of the deferred renderer" rumour.
     
  9. Arun

    Arun Unknown.
    Legend

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
And don't forget my favorite little scenario, where NVIDIA buys Chartered. Its market cap is about $2B, so well within their reach right now. Think of how happy AMD would be to lose their extra capacity there, in addition to NVIDIA getting a process partnership with IBM (if that would stay true through an acquisition...).

    On the negative side of things, Chartered currently doesn't have very nice margins, and it'd increase NVIDIA's risk (and damage their relationship with TSMC...) - still, it's worth pondering upon.


    Uttar
     
  10. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
There's still the issue of what Nvidia could do with a fab. Sure, they can make discrete graphics and chipsets, but they'd still lack the ability to compete on the CPU front. At the same time, other companies will be doing discrete/chipsets and making CPUs while going down the GPU/CPU convergence route.
     
  11. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,716
    Likes Received:
    2,137
    Location:
    London
    What proportion of NVidia's current production of GPUs and chipsets could Chartered satisfy?

    Jawed
     
  12. Arun

    Arun Unknown.
    Legend

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
    ~150% through Fab7 alone. The other Chartered fabs aren't quite as cutting-edge though - so the best model there might be to keep producing customers' chips in those other fabs. It's hard to judge Chartered's relative defect rate etc. though, for example.
BZB: If NVIDIA only loses the super-low-end part of the market, I don't think they'll cry. Remember, AMD (and probably Intel!) is thinking of that as a way to reduce the barrier to entry and increase market share. The idea is to "make more money selling CPUs by bundling in GPUs" - not the other way around!


    Uttar
     
  13. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
Fusion-type convergence is not just about the super low-end though; it's about taking the low-end and midrange markets, leaving the super high-end for discrete. Can Nvidia survive on just the super high-end while also competing on chipsets, with no answer to CPU/GPU convergence?

    Convergence is not just about making more money by bundling GPUs in order to sell CPUs, it's about (a) having something to do with all those extra cores now that the industry's only way to higher performance is through increasing the number of cores, and (b) taking a slice of the market and money that ATI and Nvidia are making.

Intel and AMD will be aiming not just to make money, but to actively eat into Nvidia's markets for their own benefit. Nvidia was relatively comfortable while AMD and Intel were not competing in their core markets. Now those big companies are doing exactly that, with the benefit of their CPU business behind them, a market that Nvidia cannot attack.
     
    #33 Bouncing Zabaglione Bros., Jan 23, 2007
    Last edited by a moderator: Jan 23, 2007
  14. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,716
    Likes Received:
    2,137
    Location:
    London
    AMD says that laptops are going to get Fusion first, because this is where there's most demand on power/heat/space.

    So mid-range desktop is a long way off as far as Fusion/Larrabee are concerned. Perhaps 3 or 4 years at the earliest.

    Jawed
     
  15. Arun

    Arun Unknown.
    Legend

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
AMD's vision of convergence does not include the midrange markets, and there is no indication that Intel's does.
    AMD's vision of convergence is not to use general-purpose cores for graphics processing, and there is no indication that it is Intel's either. On the other hand, there are serious indications that both AMD and Intel want to take GPGPU seriously, and generalize it as stream/parallel computing. NVIDIA has no disadvantage in that area.
Definitely - as I said, it's a way to take market share; the idea is to grow the CPU market and simultaneously increase share in that larger market. At the same time, this has the side-effect of shrinking the low-end GPU market, and arguably impacting the midrange GPU market if the performance is "good enough".
    Clearly so, but not with their integrated CPU-GPU offerings. Intel's job offer page clearly implies they are *also* looking at midrange/high-end markets!
    That's true - it's very unlikely that NVIDIA can attack either AMD or Intel's CPUs. The best thing that could happen for them is if 'GPGPU' gains in importance. Arguably, if it becomes important enough, the actual CPU(s) will become relatively less important, too.

    This whole thing must be getting NVIDIA worried though. Chances are, either they'll get badly burned, or everyone else will. And in terms of bundling, they're already getting a bit screwed I'd imagine. Anyhow, please don't take my above responses too personally :) I do agree with your points, but I thought it was worth insisting that you were basing a good bit of your reasoning on things that contradict AMD's official vision of Fusion. We'll see!


    Uttar
     
  16. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
    I could certainly see Nvidia getting "cosy" again with IBM.
    They do have some of the best tech/patents in the business, and they have an x86 license.

Another point in common: Linux.
     
  17. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
Remember that AMD presentation diagram that explained Fusion/multicore in low-, mid- and high-end configurations? If you're a gamer, you get more graphics cores. Given the amount of money to be made in the mid-range, and the current abilities of what we see integrated into chipsets, I can't see either Intel or AMD ignoring the lucrative midrange market.

They both have to find something to do with those cores, and when they are shipping quad- or oct-core chips, they'll have a hard job feeding them all with CPU-relevant data. An 8-core chip configured as four graphics cores and four CPU cores seems like an obvious route.

    All the more reason to put more powerful GPU-style cores on your CPU package to take advantage of their parallel qualities. Once more treading on a market that Nvidia would like for itself.

As I said, I'm thinking of that slide with the multicore diagrams showing different Fusion products with the cores configured differently depending on what market they want to address. If Fusion-style products are initially successful, I can't see any route for them other than expanding into higher market segments. It's the only thing they can do; otherwise they'll be selling 8-core CPUs with at least four cores idling even in the middle of a demanding game. Intel and AMD will take this route, and Nvidia won't be able to follow.
     
  18. Groo The Wanderer

    Regular

    Joined:
    Jan 23, 2007
    Messages:
    334
    Likes Received:
    2
    Told ya

    http://www.theinquirer.net/default.aspx?article=31618

    I do remember being ridiculed about this one here and several other places. Same with the AMD/ATI merger. And.... well anyways.....

    Getting back to the story, none of the writeups about Larrabee are even close to what it is, and calling it a GPU is, well, a misnomer. Think of it more as a convergence product.

    -Charlie
     
  19. wolf2

    Newcomer

    Joined:
    Jan 23, 2007
    Messages:
    29
    Likes Received:
    1
    Let me add a perspective on Intel vis-a-vis NVDA.

ATI has had advanced 3D capability since about 1997. Sometime in 2000 it bought ArtX in Mountain View, a small 3D technology company composed of some of the best SGI 3D graphics engineers. That was when Dave Orton came onboard at ATI.

    Since 2000 ATI has competed with NVDA. Sometimes well and since 2005, not so well.

    The decision for ATI to sell out to AMD was because it had become apparent that ATI was not going to be able to compete effectively against NVDA. In 2006 this was validated as ATI lost notebook and desktop market share as the AMD acquisition completed.

So with that as a backdrop, I truly question how Intel expects to compete against NVDA starting from scratch as they are. Intel is further hampered by a "quality of recruiting" issue: they can't offer the stock-option quantities or returns of NVDA, which is in a growth stage.

    Any comments or thoughts are welcomed.
     
  20. Arun

    Arun Unknown.
    Legend

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
I can see what you are thinking of there: many-core and ALU-intensive. But keep in mind Intel wouldn't stand a chance in the actual GPU market with just that. They also need fixed-function units for things like texturing if they want to be competitive on performance - unless we are thinking of a 2012+ timeframe, perhaps? That doesn't mean the "guts" of this project wouldn't be the ALU part of it, anyway...
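To illustrate why texturing favours fixed-function hardware: even a single bilinear texture sample is an address calculation plus a weighted blend of four texels - cheap as dedicated silicon, but a pile of ALU instructions when done in software. A minimal sketch, with a made-up 2x2 texture:

```python
# One bilinear texture sample in software: address math, four texel
# fetches, and three linear interpolations. Dedicated texture units do
# all of this (plus format conversion and caching) per sample, per pipe.

def bilinear_sample(tex, u, v):
    """Bilinearly filter a row-major 2D texture at normalized (u, v) in [0, 1]."""
    h, w = len(tex), len(tex[0])
    x = u * (w - 1)                      # map to texel space
    y = v * (h - 1)
    x0, y0 = int(x), int(y)              # top-left texel of the 2x2 footprint
    x1 = min(x0 + 1, w - 1)              # clamp at the texture edge
    y1 = min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0              # fractional blend weights
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bot = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bot * fy

tex = [[0.0, 1.0],
       [1.0, 2.0]]
print(bilinear_sample(tex, 0.5, 0.5))  # 1.0 -- the average of the four texels
```

Multiply that per-sample cost by billions of samples per second and the case for dedicated texturing silicon, rather than burning general-purpose ALU cycles, is clear.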


    Uttar
    P.S.: The Inq is one of the only sites not giving direct credit to us... TBH, I'm more surprised so many sites even did, considering this is mostly public info, but heh, won't complain about that! ;)
     