Larrabee and Intel's acquisition of Neoptica

Discussion in 'Rendering Technology and APIs' started by B3D News, Nov 28, 2007.

  1. B3D News

    B3D News Beyond3D News
    Regular

    Joined:
    May 18, 2007
    Messages:
    440
    Likes Received:
    1
    On October 19th, Neoptica was acquired by Intel in relation to the Larrabee project, but the news only broke on several websites in the last 2 days. We take a quick look at what Intel bought, and why, in this analysis piece.

    Read the full news item
     
  2. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    213
    Location:
    Uffda-land
    Yeah, I must say my first thought was "well, well. . . 'the graphics adults' have arrived on the scene at Intel. Thank God."

    :lol:
     
  3. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,363
    Likes Received:
    3,944
    Location:
    Well within 3d
    Any indications that there were any other acquisitions/hirings Intel made for the sake of the Larrabee project?

    The graphics acquisition would have generated more news on the net simply because it's graphics; hirings in other fields may not have garnered as much attention.

    If the acquisitions have focused on graphics for Larrabee's sake, there shouldn't be many other such additions. Otherwise, this could be some kind of cyclical thing where the group goes on a graphics kick for a month, then a simulations phase, then a data-mining phase, and so on.

    If the focus is not yet evident, then the project still hasn't decided on a problem for its solution, though pretty much any HPC-related acquisition would serve to head off GPGPU.
     
  4. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Hmm.. Intel saying "well. yeah.. that's nice Microsoft.. but we'd like to pass on your .. direct X thingy there"

    Whatever they're going to do, rafting upstream requires a low-cost product.. not something that Intel is fond of.
     
  5. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Those ads were all over the place last year, right?

    Neoptica's takeover looks more like they're starting with the API design.
     
  6. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,363
    Likes Received:
    3,944
    Location:
    Well within 3d
    I should have been more specific about what I was curious about.

    I should have asked whether or not Intel has been gathering employees for non-graphics applications for Larrabee.
     
  7. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,875
    Likes Received:
    767
    Location:
    London
    A CPU with 16 PCI Express 2.0 lanes has 16GB/s aggregate bandwidth to a GPU.
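
    Back-of-the-envelope, for anyone checking that figure (standard PCIe 2.0 numbers, nothing from the article):

        # Rough PCIe 2.0 x16 bandwidth estimate.
        GT_PER_LANE = 5.0e9           # 5 GT/s raw signalling rate per lane (PCIe 2.0)
        ENCODING_EFFICIENCY = 8 / 10  # 8b/10b line encoding overhead
        LANES = 16

        per_direction = GT_PER_LANE * ENCODING_EFFICIENCY / 8 * LANES
        print(per_direction / 1e9)      # ~8 GB/s in each direction
        print(2 * per_direction / 1e9)  # ~16 GB/s aggregate, both directions combined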

    Jawed
     
  8. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,875
    Likes Received:
    767
    Location:
    London
  9. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,363
    Likes Received:
    3,944
    Location:
    Well within 3d
    Even when we see a PCI-E controller integrated on the CPU, I'm curious about the latency impact of serious back-and-forth traffic between the CPU and GPU, given the physical trip over the bus and PCI-E's serialization/deserialization delay.

    The way current setups deal with the PCI-E bus seems rather coarse compared to the more involved communication being talked about in the article.
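
    As a crude illustration of why that's a concern (a toy Python model; both numbers below are assumptions for the sake of the example, not measurements):

        # Toy model of how round-trip latency throttles fine-grained CPU<->GPU handoffs.
        round_trip_s = 10e-6   # assumed bus + driver round trip, ~10 microseconds
        work_s = 5e-6          # assumed useful GPU work per handoff

        print(f"GPU busy {work_s / (work_s + round_trip_s):.0%}")    # ~33%
        print(f"GPU busy {10 * work_s / (10 * work_s + round_trip_s):.0%} "
              "when ten handoffs' worth of work is batched")         # ~83%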

    I've seen mention of it, though not that specific pdf.

    I'm more curious about the activity of the organization working on Larrabee, such as its hiring patterns and other factors that might hint at how focused or unfocused the development effort really is.

    As a historical example, Merced was a victim of broad design targets that left it hobbled after a forced "die diet", and it took several designs before Itanium started to hit its more modest performance goals.

    What Larrabee will achieve upon release will be determined by what has already happened or is happening now.
     
  10. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,875
    Likes Received:
    767
    Location:
    London
    Intel's plans seem to have a GPU and PCI Express arriving on the CPU at the same time. So PCI Express won't necessarily be relevant, at least once the "GPU" is Larrabee.

    Also, I think Neoptica's concept of a "software rendering pipeline" must have the hiding of CPU-GPU latency implicit within it.
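
    The obvious way to hide it is to keep work queued on both sides, e.g. simple double buffering. A minimal sketch of the idea (purely illustrative; the stage names and timings are made up and have nothing to do with Neoptica's actual design):

        # The CPU prepares frame N+1 while the "GPU" thread consumes frame N,
        # so the handoff delay is overlapped rather than serialised.
        import queue, threading, time

        in_flight = queue.Queue(maxsize=1)      # one frame in flight at a time

        def gpu_side():
            while True:
                frame = in_flight.get()
                if frame is None:
                    break
                time.sleep(0.002)               # pretend GPU work + bus transfer
                print(f"GPU finished frame {frame}")

        t = threading.Thread(target=gpu_side)
        t.start()
        for n in range(5):
            time.sleep(0.002)                   # pretend CPU-side scene prep for frame n
            in_flight.put(n)                    # hand off, then move on to frame n+1
        in_flight.put(None)                     # shut the worker down
        t.join()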

    It might be better to ask how close Neoptica was to getting a ray tracer running jointly on CPU and GPU before being subsumed... Is that a meaningful path?

    Jawed
     
  11. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,363
    Likes Received:
    3,944
    Location:
    Well within 3d
    Intel is on a roll when it comes to buying up any possible consumer-level routes for GPGPU, such as Havok or the possible future consoles Neoptica may have targeted.

    Maybe Intel is going to junk the GPU part, like it's dumping Havok FX.
     
  12. Demirug

    Veteran

    Joined:
    Dec 8, 2002
    Messages:
    1,326
    Likes Received:
    69
    IMHO Intel is trying to buy any knowledge it can get about implementing a modern shader rasterizer on a multicore x86-like processor. There are not many people out there who have worked in this area lately.
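
    For what it's worth, the core of such a rasterizer is conceptually simple; the hard part is making it fast and keeping many x86 cores and SIMD lanes fed. A toy single-triangle version of the inner loop (edge-function test only; no shading, no parallelism):

        # Fills one triangle into an ASCII "framebuffer".  A real multicore
        # rasterizer would tile the screen, bin triangles per tile, and spread
        # the per-pixel work across cores and SIMD lanes.
        W, H = 40, 20
        fb = [["." for _ in range(W)] for _ in range(H)]

        def edge(ax, ay, bx, by, px, py):
            # Signed area of (a, b, p); the sign says which side of edge a->b the point is on.
            return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

        def rasterize(v0, v1, v2):
            for y in range(H):
                for x in range(W):
                    px, py = x + 0.5, y + 0.5          # sample at the pixel centre
                    inside = (edge(*v1, *v2, px, py) >= 0 and
                              edge(*v2, *v0, px, py) >= 0 and
                              edge(*v0, *v1, px, py) >= 0)
                    if inside:
                        fb[y][x] = "#"

        rasterize((5, 2), (35, 8), (12, 18))   # winding chosen so all edge tests come out positive
        print("\n".join("".join(row) for row in fb))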
     
  13. Mintmaster

    Veteran

    Joined:
    Mar 31, 2002
    Messages:
    3,897
    Likes Received:
    87
    It really does seem like Intel feels a real threat from GPGPU.

    I guess they think that by delaying the initial commercial uptake of the GPU as a computation platform, Larrabee will look like a more attractive alternative given its x86 roots. At the same time, the technology from these acquisitions would promote HPC in general.

    Imagine if the Blu-ray camp had a means of delaying HD-DVD's launch and price-reduction timeline by six months. Not only would HD-DVD lose a big chunk of its cost-difference appeal (particularly for this holiday season), but it might not even be able to convince those studios to switch. The long-term outlook would have been bleak.
     
  14. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Anwar Gholoum's blog entry of today is planting seeds for a "new and parallel programming language."
    Is this what you're looking at, 3dilettante? It sounds like a piece written with confidence and hints at current developments.
     
  15. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,363
    Likes Received:
    3,944
    Location:
    Well within 3d
    It does point to a future where Intel can sidestep current APIs, and by extension it is working to cut GPU products off.

    GPGPU has lost Peakstream to Google, and GPU physics in games has lost Havok FX, or at least AMD seems to be rolling over because of it.

    It doesn't offer much evidence that the Larrabee organization's heart is in consumer graphics, but that seems to be how most everyone is reading the situation.
     
  16. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know

    I doubt it though.. getting Larrabee to work under anything we have now would be futile.

    I guess the most logical explanation would be a computational device with an advanced software-based rasterizer, used for medical, engineering, or entertainment purposes.

    The blogs seem to follow on from one another, so take the one from October 18th:
    the word "rasterization" links to another of his blog articles (October 10th).
    It's like following Dave's hints.. why the hell would Intel develop a raytracing tool that just runs on their CPUs without making extra money somewhere?
     
  17. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,363
    Likes Received:
    3,944
    Location:
    Well within 3d
    That's what I said, though it was rather convoluted.

    To reword what I stated:

    It doesn't offer much evidence that the Larrabee organization's heart is in consumer graphics.
    That lack of focus on consumer graphics is how most everyone is reading the situation.


    Intel's PR with regard to actual graphics work through raytracing is rather poor compared to the rigor put behind other applications of Larrabee.

    If we're going by what Intel's blogs say on raytracing, realtime graphics that is up to snuff with dedicated GPUs is still an afterthought.
     
  18. santyhammer

    Newcomer

    Joined:
    Apr 22, 2006
    Messages:
    85
    Likes Received:
    2
    Location:
    Behind you
    I think Intel hired those guys (and the Havok ones http://www.fool.com/investing/value/2007/09/17/intel-wreaks-havok.aspx and the Ct language team) to make Larrabee's API.

    Perhaps it's gonna be DX and OGL compatible, plus use that API for things not yet implemented in DX (for example, raytracing or physics)... although I would prefer just to use Ct to program my own raytracing or physics.

    One thing is clear... Intel is moving chess pieces before launching the serious attack!
     
  19. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    7,140
    Likes Received:
    577
    Compared to the pipeline depth of ye average GPU, how is that relevant?
     
  20. Killer-Kris

    Regular

    Joined:
    May 20, 2003
    Messages:
    540
    Likes Received:
    4
    Of course, you're also assuming that in a company as large as Intel, one hand knows what the other is doing.
     