Larrabee delayed to 2011?

Discussion in 'Architecture and Products' started by rpg.314, Sep 22, 2009.

  1. MfA

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    7,139
    Likes Received:
    577
    They are not going to tell us ... some people who like the architecture will say it's because of the delays, some people who don't like it will say it's because it didn't perform up to snuff.

    As I said, I wonder if the developer kits will come with the rendering stack so we can judge the performance per mm² properly ...
     
  2. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82

    Because that's what's required if you really want to enter the graphics market, and Intel are one of the few companies that can afford it. They have to find something to do with all those cores they are churning out, and more cores are the only direction they can go in order to increase performance.
     
  3. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    4,003
    Likes Received:
    51
    Other than the clock speed/thermal wall consumer CPUs hit in 2003/2004, it seems like every wall gets broken down before it becomes an issue. I guess time will tell, but if history is our guide, I think bandwidth will continue to scale as it always has. It may take some new development, but remember that DDR and GDDR were once new technologies as well.
     
  4. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,582
    Likes Received:
    624
    Location:
    New York
    Having lots of money isn't a good enough reason. How can they work on the evolution of an architecture if the baseline is still half-baked? We don't even know how committed they were to it in the first place - sounds like there was a considerable difference of opinion about the whole thing internally. But it is refreshing to see such abundant optimism after the doom and gloom in every other post in the Fermi thread :)
     
  5. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
    5 Gbps/pin is out on the market, and 7 Gbps/pin (afaik the fastest part out yet) is already pushing the limits of copper, which are about 10 Gbps. Light Peak, with similar bandwidth, is optical, and afaik so is 10GbE.

    Only the walls that exist in our minds can be broken down/scaled. The rest have to be respected.
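
    For an aggregate figure, per-pin rate times bus width is the whole story; a quick back-of-the-envelope sketch in python (the 256-bit bus width is an illustrative assumption, not a quoted spec):

    Code:
    # Aggregate memory bandwidth from a per-pin signalling rate.
    # The 256-bit bus width below is an illustrative assumption.

    def aggregate_bandwidth_gbs(per_pin_gbps, bus_width_bits):
        """GB/s = per-pin rate (Gbit/s) * number of pins / 8 bits per byte."""
        return per_pin_gbps * bus_width_bits / 8.0

    print(aggregate_bandwidth_gbs(5.0, 256))   # GDDR5 at 5 Gbps/pin -> 160.0 GB/s
    print(aggregate_bandwidth_gbs(10.0, 256))  # at the ~10 Gbps copper ceiling -> 320.0 GB/s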
     
  6. darkblu

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,642
    Likes Received:
    22
    just like everything else, TBDR is a trade-off - you give up something somewhere to gain something else, elsewhere. but TBDR is not a more flexible scan converter per se (not versus an IMR, that is).
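
    a minimal sketch of that trade-off in python (toy code over invented data structures, not any real gpu's pipeline - the up-front binning pass is the cost, shading each pixel exactly once is the gain):

    Code:
    # toy tile-based deferred rendering: bin triangles into tiles up front,
    # resolve visibility per pixel within a tile, then shade only the
    # survivor. an IMR instead shades fragments as triangles arrive and
    # relies on the depth test to throw the overdrawn work away.

    TILE = 16  # tile edge in pixels (illustrative)

    def bounds(tri):
        xs = [v[0] for v in tri["verts"]]
        ys = [v[1] for v in tri["verts"]]
        return min(xs), min(ys), max(xs), max(ys)

    def bin_triangles(triangles):
        """pass 1 (the cost): sort every triangle into each tile its
        bounding box may touch."""
        bins = {}
        for tri in triangles:
            x0, y0, x1, y1 = bounds(tri)
            for ty in range(int(y0) // TILE, int(y1) // TILE + 1):
                for tx in range(int(x0) // TILE, int(x1) // TILE + 1):
                    bins.setdefault((tx, ty), []).append(tri)
        return bins

    def shade_tile(tile_tris, origin):
        """pass 2 (the gain): per pixel, find the nearest covering triangle
        first, then shade exactly once. coverage here is a toy bounding-box
        test; a real rasterizer uses edge functions."""
        out = {}
        for py in range(TILE):
            for px in range(TILE):
                x, y = origin[0] + px, origin[1] + py
                nearest = None
                for tri in tile_tris:
                    x0, y0, x1, y1 = bounds(tri)
                    if x0 <= x <= x1 and y0 <= y <= y1:
                        if nearest is None or tri["depth"] < nearest["depth"]:
                            nearest = tri
                if nearest is not None:
                    out[(x, y)] = nearest["color"]  # shaded once, no overdraw
        return out

    # toy scene: constant depth per triangle, invented for illustration
    tris = [
        {"verts": [(0, 0), (20, 0), (0, 20)], "depth": 0.5, "color": "red"},
        {"verts": [(4, 4), (12, 4), (4, 12)], "depth": 0.2, "color": "blue"},
    ]
    frame = {}
    for (tx, ty), tile_tris in bin_triangles(tris).items():
        frame.update(shade_tile(tile_tris, (tx * TILE, ty * TILE)))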

    as regards xenos (which, apropos, is not a TBDR, at least not in hw), the fact that its mountain-top-touted 'free fsaa' was designed for SD resolutions, while devs were expected to deliver at HD, surely helped that gpu gain sympathy. /off-topic sarcasm

    does anybody else feel like there's a certain amount of fear among intel's high echelon of non-x86 tech? i mean, LRB fell (partially) victim to intel's stubbornness with x86 (yes, i do believe that), similarly with atom (relatively poor vertical mobility, but at least running windows is a valid goal). maybe it's because each time intel have tried to break away with something new, they have stumbled and eventually failed (the shot-by-friendly-fire 960 notwithstanding). i'm starting to wonder: could intel ever produce a viable (as in market-viable) isa that is neither x86-derived nor has x86 bolted on?
     
  7. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
    LRB is a matter of survival for Intel's monopoly in cpu space. As GPUs merge with CPUs, Intel needs to have a high-throughput part. The only way to sell it in volume is to make it rock at graphics.

    AMD with its Fusion and nv with its GPGPU push will eat Intel's high volumes and margins from both ends. Winning at 3D is going to be critical for any company that wishes to dominate in CPUs as well.
     
  8. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
    How so? And if it is just as flexible a method as IMR, how is that a problem?

    Well, so far their record is spectacularly craptacular on this front.
     
  9. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82

    Intel, like AMD, are chasing the full platform. They want all that revenue. They are not going to let the graphics cash get away from them so easily.

    And who says v3 has to be an evolution of v2? There have been many instances where a new generation of CPU or GPU has come from a different internal team and been markedly different from its predecessor. You don't need a baseline if by definition what you are doing is new.

    As for the optimism, in general Intel has been executing pretty well in the last few years, whereas Nvidia has not. It's no surprise people expect Intel to bring something to the graphics party as GPUs and CPUs merge.
     
  10. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,582
    Likes Received:
    624
    Location:
    New York
    Agree completely. So I guess we need to agree on what we consider "Larrabee" to be. If it's referring to a specific chip or architecture then it was killed/cancelled. If it represents the idea of a discrete Intel GPU then it can never die.
     
  11. darkblu

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,642
    Likes Received:
    22
    i never said it was a problem ; ) i was just commenting on ShaidarHaran's mentioning it in the context of 'we need something more flexible'. TBDR was not designed to provide higher flexibility, but to exploit a particular trait of the rasterization process.

    it shows, eh?
     
  12. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    4,003
    Likes Received:
    51
    Just to clarify: I didn't mean TBDRs are more flexible than IMRs. Just saying that the traditional rasterizer is here to stay.
     
  13. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    I don't really think it matters a whole lot what we think...;) Intel coined the nomenclature "Larrabee", and Intel has pronounced Larrabee stillborn--if we can even say that much, since Larrabee never made it very far beyond a vague concept. A "discrete Intel gpu" could be anything, and as the next discrete Intel gpu won't be Intel's first discrete gpu, calling something like that "Larrabee" seems fairly bizarre. If Intel ever does get back into the discrete gpu business I'm sure they'll coin another name for the project--and it won't be "Larrabee". I'm thinking that right now Intel would like to distance itself from the name "Larrabee" entirely. I mean, the very fact that Intel waited for the weekend to announce Larrabee's demise indicates to me that the company isn't interested in a lot of further publicity on the subject.
     
  14. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82
    But it's not alive unless it's something that Intel is pursuing, no matter what the internal code name is. Intel wasn't pursuing this until a couple of years back, and now they are. Just because they've reorganised their focus on the thing that looks like the best bet, I don't think we can discount them or claim that they've given up on the Larrabee/Fusion idea.

    As I keep saying, Intel want that graphics/platform/HPC money. They are committed to multiple cores, and they need to find things to do with those cores. Graphics will be one of the major uses.
     
  15. IdaGno

    Newcomer

    Joined:
    Jul 27, 2007
    Messages:
    11
    Likes Received:
    0
  16. MrGaribaldi

    Regular

    Joined:
    Nov 23, 2002
    Messages:
    611
    Likes Received:
    0
    Location:
    In transit
    If they are abandoning Larrabee for graphics, what does this mean for the companies they've bought to showcase it?
    I.e. they bought a company in Lund, Sweden to work on rasterisation techniques that take advantage of the architecture. How will that affect them? (Of course, some of it could possibly be used with the SCC, but still)
     
    #256 MrGaribaldi, Dec 5, 2009
    Last edited by a moderator: Dec 5, 2009
  17. sunscar

    Regular

    Joined:
    Aug 21, 2004
    Messages:
    343
    Likes Received:
    1
    I doubt it'll affect them at all - Intel's goal still hasn't changed. Intel still wishes to become a major player in the GPU market, while leveraging an x86-like design as much as they possibly can. Regardless of when this finally manifests, it will still require software, so these companies are still in the clear, I'd say.
     
  18. V3

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    3,304
    Likes Received:
    5
    Are they abandoning Larrabee version 1 (the 32-core version?), or the Larrabee architecture entirely in favour of something more traditional, like an NV- or AMD-type GPU?
     
  19. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    15,845
    Likes Received:
    3,053
  20. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,231
    Likes Received:
    566
    Location:
    en.gb.uk
    They're not bonkers decisions; they're very sensible decisions if they can pull them off without a meaningful negative impact on performance, however you might want to measure that (FLOPS per programmer dollar might be an interesting metric).
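
    A toy illustration of that metric in python (every number below is an invented placeholder, not a measurement):

    Code:
    # "FLOPS per programmer dollar": sustained throughput per dollar of
    # software effort. all figures below are hypothetical placeholders.

    def flops_per_programmer_dollar(sustained_gflops, programmer_cost_dollars):
        return sustained_gflops * 1e9 / programmer_cost_dollars

    # a chip that needs heavy hand-tuning vs. one that gets decent
    # performance with little effort (numbers invented for illustration):
    hand_tuned  = flops_per_programmer_dollar(800.0, 500_000.0)
    easy_target = flops_per_programmer_dollar(400.0, 50_000.0)
    print(hand_tuned < easy_target)  # True: the slower-on-paper part wins here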
     