Larrabee at SIGGRAPH

From Ars
Intel will claim that Larrabee has 20x the performance per watt of a Core 2 Duo and half the single-threaded performance.
20x perf at what? texture sampling? :)
It also has a 4MB coherent L2, and three-operand vector instructions.
Only 128KB of L2 per core? I'd double that...
 
I highly doubt that as well. There has been too much theoretical advancement since the Pentium for basing Larrabee on the original Pentium design to be of any benefit.

It's very likely that Gelsinger was misunderstood and said something along the lines "It's based on the design principles of the P54C".

Atom is not based on the P54C, why should Larrabee be?
 
Man, anytime Intel wants to step up to our interview, it's been out there for them to do so. :LOL:
Hehe, agreed, although the problem is that you'd get different answers depending on who you interview... you already know what I'd say (and you'd get similar answers from most of the group that I work in), and you can probably guess what Daniel Pohl would say... none of whom speak for "Intel as a whole" (it's a ~85k-person company). So whose answers do you want? ;)

The god's honest truth is I want them to succeed because I love high-end graphics and the more serious deep-pocket players there are the happier I am.
Amen to that! Hence why I love that AMD/ATI is back in the game this round (my 4870 is sitting in the box itching to be installed)!

My only point is they need to be psychologically prepared to get their nose bloodied in round one, because they probably will, and it will likely be on the software side no matter how sweet their hardware is on the theoreticals.
Fair enough, although should I take that as an insult? :)
 
Hehe, agreed, although the problem is that you'd get different answers depending on who you interview... you already know what I'd say (and you'd get similar answers from most of the group that I work in), and you can probably guess what Daniel Pohl would say... none of whom speak for "Intel as a whole" (it's a ~85k-person company). So whose answers do you want? ;)

What answer can be most easily twisted into an epic flame-thread that causes the ban-hammer to come down swiftly and rightfully while maximizing ads on the site ?

Ehm... I meant... multiple interviews ?
 
Fair enough, although should I take that as an insult? :)

Did I miss the news that you're at Intel now working on Larrabee software?

And, no, that's not an insult in my mind, that's just the reality of how big the mountain they have to climb is. And the "psychologically prepared" thing is less about the guys actually doing the work (who presumably have a much closer appreciation for how big the mountain is) than for senior management who typically aren't nearly as granular as they need to be for this kind of situation. And by "senior management" I mean north of Doug's paygrade.

Said another way, everything I know and have observed over the years about software in general and graphics software in particular tells me there are absolutely no shortcuts to just a sh*tload of iteration/feedback cycles over many years to get where they need to go.
 
Did I miss the news that you're at Intel now working on Larrabee software?
I casually mentioned it in my LVSM thread, but retrospectively that would have been easy to miss. In any case, yes I'm now at Intel working with the "Advanced Rendering Team" which grew out of the Neoptica acquisition.

And, no, that's not an insult in my mind, that's just the reality of how big the mountain they have to climb is.
[...]
Said another way, everything I know and have observed over the years about software in general and graphics software in particular tells me there are absolutely no shortcuts to just a sh*tload of iteration/feedback cycles over many years to get where they need to go.
I was just teasing :) And yes I agree that there's a ton of iteration required and such, but at least this time around - speaking in a general industry sort of sense - there's some past experience with GPU designs, renderer designs, etc. to be drawn on rather than starting totally from scratch.

In any case I'm sure it'll be a fun ride :) To draw this back on topic, you guys will certainly enjoy the SIGGRAPH paper in a few more weeks!
 
I casually mentioned it in my LVSM thread, but retrospectively that would have been easy to miss. In any case, yes I'm now at Intel working with the "Advanced Rendering Team" which grew out of the Neoptica acquisition.

Ah, well congratulations then! :smile: I will say this: I'm much more optimistic on the software side for Larrabee than I was at the time we posted Carmean's slides... because it's since become much clearer, through hires and acquisitions, that they are taking the software side seriously and staffing up with good people to tackle it. The early word from our ninjas talking to Intel acolytes was along the lines of "oh, Microsoft will take care of that", and that scared the bejesus out of me.

I was just teasing :) And yes I agree that there's a ton of iteration required and such, but at least this time around - speaking in a general industry sort of sense - there's some past experience with GPU designs, renderer designs, etc. to be drawn on rather than starting totally from scratch.

That would be a good 'un to add to the interview list, actually. :smile: "How much applicability/transferability from your IGP gpu history are you finding in your software support for Larrabee, and in what areas?" I'm sure there is some.

In any case I'm sure it'll be a fun ride :) To draw this back on topic, you guys will certainly enjoy the SIGGRAPH paper in a few more weeks!

Looking forward to it.
 
50% of the single-threaded performance of a Core 2 Duo? Vectorize or be slow!

And on the topic of vectorization, hope that paper touches on scatter and gather performance (and functionality). IMO one of the most important aspects of the hardware which we have no accurate info on...

Also, why hasn't Intel picked up Nick (SwiftShader) yet?
 
I still think there will be dedicated rasterization hardware on Larrabee and therefore there is no need for a software rasterizer.
 
That would be a good 'un to add to the interview list, actually. :smile: "How much applicability/transferability from your IGP gpu history are you finding in your software support for Larrabee, and in what areas?" I'm sure there is some.
Well definitely true on that front, but I even meant from more of the point of view of "the industry in general understands GPUs and rendering a lot better than - say - the 3dfx days". The graphics hardware/software industry is not entirely disjoint along company boundaries, if only because employees themselves move around. Rest assured there are people at Intel who know how GPUs work, myself included :)
 
Rest assured there are people at Intel who know how GPUs work, myself included :)

Oh, I understand. And it still took Michelangelo four years to paint the ceiling of the Sistine Chapel. :p And that was a relatively easy undertaking compared to what y'all got on your plate.
 
I wonder if Abrash&Co. had the opportunity to ask for specific hardware optimizations that would speed up software rasterization.

Yeah, they asked for Pentium cores, so they could reuse the 10-year-old rasterizer they had on dusty floppies in the kitchen cupboard.

SCNR
 
Yeah, they asked for Pentium cores, so they could reuse the 10-year-old rasterizer they had on dusty floppies in the kitchen cupboard.
This is a great joke, but I doubt Larrabee's cores are Pentium-based.
 
Oh, I understand. And it still took Michelangelo four years to paint the ceiling of the Sistine Chapel. :p And that was a relatively easy undertaking compared to what y'all got on your plate.
I agree, and that's what makes it so much fun for a software guy. If we end up with the equivalent of the Sistine Chapel I'll be happy :)
 