Larrabee delayed to 2011?

DK has a great write-up:

Reading between the lines, the rationale for canceling graphics cards based on Larrabee 1 is primarily performance, time to market and the competition. Intel will not enter a new market with an uncompetitive product. To be competitive in graphics, the performance of the combined hardware and software stack would need to be in line with contemporary ATI and Nvidia discrete GPUs. Time to market plays a big role by determining what the contemporary GPUs from Nvidia and ATI will be, so delays are quite an issue. Moore's Law gives you roughly twice the transistor budget in the same area every 18 months via a process shrink, and for GPUs that translates directly into performance. Conceptually, every month of delay is equivalent to losing about 3.9% of performance.
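(Quick sanity check on that figure, my own arithmetic rather than DK's: if the transistor budget doubles every 18 months, the implied monthly growth factor is 2^(1/18) ≈ 1.039, i.e. roughly 3.9% per month, so every month of slip leaves the part about 3.9% behind where the process curve would have put its competition.)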
 
Well he would say that. IMO too much (x86) overhead, not enough compute power :p

I wonder if the developer kits will come with the rendering stack, so we can judge performance per mm² properly and put an end to that discussion.
 
Hmm, I thought v1 was already dead and Intel has just effectively killed v2. Pretty disappointed, all the same.

I have to admit I was expecting the first consumer Larrabee to be sold at a loss, i.e. despite being a huge die it would be priced according to its graphics performance (e.g. $200).

Also, how much of this is architectural? e.g. the ring bus is no good or texturing is way too slow/thrashing caches.

Jawed
 
So, at SemiAccurate he is talking about Larrabee "version 3" as still on the roadmap. Versions 1 and, he infers, 2 are canceled.

I guess I'm confused, has Intel canceled all high end graphics parts for consumers?

I mean, if there's still some version of Larrabee coming at some point, I don't see how Larrabee can be said to be canceled. This would then be almost just another delay.
 
OK, let me clear a bit of this up. There was an LRB prior to the one we are calling LRB 1, aka the one that was shown at IDF and SC09. The previous one, LRB 0, was a 65nm, 16-core part that was never intended to be sold, a dev platform only, and it never taped out.

LRB 1 was about 4890/285 level performance, which wouldn't cut it as a GPU next spring or so. It was still VERY competitive as a GPGPU product, so it will be launched as that. Also, if you are a dev, they will likely sell you one to code against.

LRB 2, as David Kanter said in his article linked above, was a 32nm shrink of LRB 1. It is unlikely to come out as anything more than a GPGPU version; it isn't needed for code development, and HPC... well, maybe.

LRB 3 is a very different beast. Very different. If they sink time, money and developers into LRB 2, they are just taking them off 3, and delaying it. 2 does not carry over to 3 nearly as much as you would think, so I am pretty sure it will be killed. I don't KNOW that it will, but continuing on that path makes little sense.

This isn't Intel running away, it is them realizing where they are at, and doing the smart thing for a better long term prospect. They were also really adult about how they did it, and messaged it.

Other companies could learn a lot from that, but I won't hold my breath.

-Charlie
 
I wonder if the developer kits will come with the rendering stack, so we can judge performance per mm² properly and put an end to that discussion.

I don't think they'd give out even the beta version of their rendering pipeline. It'd make it a lot easier for the competition to get better than semi-accurate estimates of performance.
 
LRB 3 is a very different beast. Very different. If they sink time, money and developers into LRB 2, they are just taking them off 3, and delaying it. 2 does not carry over to 3 nearly as much as you would think, so I am pretty sure it will be killed. I don't KNOW that it will, but continuing on that path makes little sense.

Let's hope it means the end of full cache coherency in hw.
 
LRB 2, as David Kanter said in his article linked above, was a 32nm shrink of LRB 1. It is unlikely to come out as anything more than a GPGPU version; it isn't needed for code development, and HPC... well, maybe.

LRB 3 is a very different beast. Very different. If they sink time, money and developers into LRB 2, they are just taking them off 3, and delaying it. 2 does not carry over to 3 nearly as much as you would think, so I am pretty sure it will be killed. I don't KNOW that it will, but continuing on that path makes little sense.

-Charlie

For LRB3, they're learning that staying competitive with NV and ATI going forward is going to take a lot more than shoving a load of tiny x86-like cores onto a die and calling it a GPU, though LRB2 arguably could've been semi(?)-competitive earlier this year.
 
There is a difference between delayed and cancelled, a pretty huge one at that. He got nothing right yet.

He did get one thing right: consumer LRB GPUs will not be out in 2010. And since the delay is such a long one, it is almost the same as a cancellation, as resources are expected to shift to v2/v3, cancelling work on v1.
 
I'm Jack's complete lack of surprise on this one. The current project got delayed far too long to be a viable contender in the high-end GPU race. Even if the part had merit, the inevitable negativity surrounding its underachieving performance in games would have been damaging to the brand and the market Intel is trying to tap into.

Now, do I really believe that Intel is still "dead serious" about competing in the high-end discrete graphics card business after sinking so much time, marketing and, of course, money into LRB? I honestly couldn't answer that question, since it makes as much sense business-wise for Intel to let go and stop that costly venture as it does to try and expand into new markets.
 
Now, do I really believe that Intel is still "dead serious" about competing in the high-end discrete graphics card business after sinking so much time, marketing and, of course, money into LRB? I honestly couldn't answer that question, since it makes as much sense business-wise for Intel to let go and stop that costly venture as it does to try and expand into new markets.

For the foreseeable future, the only way Intel will make money off LRB is if they can sell it as a consumer GPU.
 
I'm not sure I buy the notion that Larrabee consumer version was canceled because of lack of performance. Since when did that stop Intel from selling boatloads of GPUs? I believe that if Larrabee worked and had even pathetic performance compared to its contemporary competition, Intel would be releasing it in at least some form.

I say it just doesn't work at all as a traditional GPU, meaning it can't run anything in the DX and OpenGL library without looking like a failed science experiment.
 
I wonder if a new approach would help with iteration: target mainstream and mid-range (e.g. GPUs in the 100–250 mm² die-size range), small enough to integrate onto a die with CPUs, and move from there. Going for broke at the high end is hard due to die size and the competition.
 
I'm not sure I buy the notion that Larrabee consumer version was canceled because of lack of performance. Since when did that stop Intel from selling boatloads of GPUs?

People don't buy discrete graphics cards to *not* accelerate their games. People also don't choose their IGP.
 