Larrabee delayed to 2011?

Larrabee Announcement

I love the nice die photo of Aubrey Isle, which looks like a Larrabee I part to me (count the 32 cores), as well as the picture of the Aubrey Isle board.

http://www.intel.com/pressroom/archive/releases/2010/20100531comp.htm

Intel says Knights Corner, a 22nm Larrabee with 50+ cores, will come out next year. I'm looking forward to its release, although it remains to be seen if they can hit their release targets.

Certainly all the verbiage about how you can just program one of these things like a Xeon seems wrong to me; I think getting performance out of Larrabee will require programming it more like a GPU than a Xeon. And I still think a 22nm Nehalem-EX derivative with 24-32 Nehalem cores would walk all over a ~50 core Larrabee for any workloads that don't use the vector instructions.
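
To make that concrete, here's a rough sketch in plain C of the kind of restructuring I mean (hypothetical code, not actual Larrabee intrinsics): the first loop is what "program it like a Xeon" gets you, while the second exposes the 16-wide data parallelism the vector units need to deliver their throughput.

#include <stddef.h>

/* Xeon-style scalar loop: one multiply-add per core per iteration,
 * leaving ~15/16 of a 16-wide vector unit idle. */
float dot_scalar(const float *a, const float *b, size_t n) {
    float sum = 0.0f;
    for (size_t i = 0; i < n; i++)
        sum += a[i] * b[i];
    return sum;
}

/* GPU-style restructuring: 16 independent partial sums that a
 * vectorizing compiler (or intrinsics) can map onto one 16-wide
 * vector register. */
float dot_wide(const float *a, const float *b, size_t n) {
    float lane[16] = {0};
    size_t i = 0;
    for (; i + 16 <= n; i += 16)
        for (int l = 0; l < 16; l++)   /* one 16-wide vector FMA */
            lane[l] += a[i + l] * b[i + l];
    float sum = 0.0f;
    for (int l = 0; l < 16; l++)       /* horizontal reduction */
        sum += lane[l];
    for (; i < n; i++)                 /* scalar tail */
        sum += a[i] * b[i];
    return sum;
}

The point being: if your workload can't be reshaped like that, ~50 simple cores buy you very little over a smaller number of fat Nehalem cores.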

But, nonetheless, it's good to hear an update on the project.
 

This thing must be smoking hot. It's only 1.2 GHz. :oops:

Also, where did they say that the real LRB will come out next year?

Also, there seem to be a lot of empty spaces spread all over the place. Clearly, not many folks bothered to make it dense.
 

End of 2011 should be feasible, although it sure won't be in high volume.

I'm impressed they were able to get fab space...22nm will be scarce for 2011, and I would have expected it to all go to volume products.

David
 
It was hopelessly outdated literally a week before it came out... The console graphics cards have definitely aged more poorly than the CPUs IMHO (or equivalently, PC graphics cards have made leaps and bounds since the DX9 generation).

But wasn't that to be expected? Isn't it natural that Moore's Law would benefit parallel architectures more than serial ones? Are RSX's shortcomings to be blamed on NVIDIA, or on Sony for releasing the console so late?
 
It was hopelessly outdated literally a week before it came out... The console graphics cards have definitely aged more poorly than the CPUs IMHO (or equivalently, PC graphics cards have made leaps and bounds since the DX9 generation).
At the same process and die-size, would a Geforce 8 GPU really help performance at all?
Comparing a GF8600 GT with a GF7600 GT, the transistor count nearly doubled (289 million vs. 146 million), whereas the performance improvement in games was more like 5-10%.
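
Quick back-of-the-envelope on those numbers as quoted:

/* Transistor-vs-performance check, using the figures above as given
 * (289M vs 146M transistors, ~5-10% faster in games). */
#include <stdio.h>

int main(void) {
    double t_8600 = 289e6, t_7600 = 146e6;  /* transistor counts as quoted */
    double speedup = 1.075;                 /* midpoint of the 5-10% claim */
    printf("transistor ratio: %.2fx\n", t_8600 / t_7600);     /* ~1.98x */
    printf("perf per transistor: %.2fx\n",
           speedup / (t_8600 / t_7600));    /* ~0.54x, i.e. nearly halved */
    return 0;
}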
 
But wasn't that to be expected? Isn't it natural that Moore's Law would benefit parallel architectures more than serial ones? Are RSX's shortcomings to be blamed on NVIDIA, or on Sony for releasing the console so late?
Sure, I'm not saying that I necessarily expect console hardware to be or remain "top of the line" for very long; it's just that, contrary to what most people say (i.e. that consoles start as high end and are only surpassed by PCs somewhere in their lifetime), the latest generation was surpassed before it even came out.

From that point of view, I guess I'd say that it just came out too late... or conversely, if expected to come out that late, I would have liked to see Geforce 8-class tech.

At the same process and die-size, would a Geforce 8 GPU really help performance at all?
Hard to say, but the 7xxx series has *not* scaled well in terms of newer games. More importantly, DX10-class hardware brought with it a ton of orthogonality (texture filtering, blending, MSAA) and a lot of important features (integers), in addition to much more uniform and predictable performance (an end to stupid quirks like rendering to 3 render targets giving more bandwidth than rendering to 4, etc.).

Anyways I'm not going to question the economics, but as a tech guy the current generation consoles are not impressive (and really never have been to me) in terms of graphics hardware. Just too little too late.
 
Sure, I'm not saying that I necessarily expect console hardware to be or remain "top of the line" for very long; it's just that, contrary to what most people say (i.e. that consoles start as high end and are only surpassed by PCs somewhere in their lifetime), the latest generation was surpassed before it even came out.

From that point of view, I guess I'd say that it just came out too late... or conversely, if expected to come out that late, I would have liked to see Geforce 8-class tech.

Well, that also isn't always the case. Xenos in the X360 was arguably ahead of the PC curve in some aspects at the time it came out in 2005. This was around the time of the R520.

In that light it makes RSX seem even more dated and old.

Regards,
SB
 
Well, that also isn't always the case. Xenos in the X360 was arguably ahead of the PC curve in some aspects at the time it came out in 2005. This was around the time of the R520.
Sure, it had about 1 year lead time on significantly superior tech (GF8). While I agree that in some aspects it was "ahead" of the PC hardware when it came out, it wasn't anything major IMHO. My biggest beef with the current so-called "HD consoles" though is that their graphics chips are underpowered enough that they can barely handle 720p w/o MSAA, while on PCs that's a laughably low resolution that even R520 could handle with no issues. Hell, it's hard to write anything not 100% CPU-limited at 720p on PC! :)
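
Just to put numbers on how low 720p is:

/* Raw pixel math behind the 720p complaint: pixels per frame and
 * shading rate needed at 60 fps for each resolution. */
#include <stdio.h>

int main(void) {
    struct { const char *name; int w, h; } res[] = {
        { "720p",  1280,  720 },
        { "1080p", 1920, 1080 },
    };
    for (int i = 0; i < 2; i++) {
        long pixels = (long)res[i].w * res[i].h;
        printf("%s: %ld pixels/frame, %.1f Mpix/s at 60 fps\n",
               res[i].name, pixels, pixels * 60 / 1e6);
    }
    /* 720p: 921,600 px/frame (~55 Mpix/s); 1080p: 2,073,600 px/frame
     * (~124 Mpix/s), a 2.25x gap before MSAA even enters the picture. */
    return 0;
}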

Again, I'm not arguing the economics... obviously what they can do in that size and price range is limited, but I don't buy the argument that consoles are really significantly "ahead" of PC hardware even at the start of their lifetimes. They are at best on par overall in most cases.
 
Again, I'm not arguing the economics... obviously what they can do in that size and price range is limited, but I don't buy the argument that consoles are really significantly "ahead" of PC hardware even at the start of their lifetimes. They are at best on par overall in most cases.
Well, with Xenos there was a huge leap in vertex processing ability. If you look at console games today and PC games targeting R5xx/G7x, there's a large disparity in detail.

Overall I agree with you about consoles never overshadowing PC tech over the years, but IMO Xenos was an exception. Hell, I felt that it even outshone R600 architecturally when considering cost.
 
Sure, it had about 1 year lead time on significantly superior tech (GF8). While I agree that in some aspects it was "ahead" of the PC hardware when it came out, it wasn't anything major IMHO. My biggest beef with the current so-called "HD consoles" though is that their graphics chips are underpowered enough that they can barely handle 720p w/o MSAA, while on PCs that's a laughably low resolution that even R520 could handle with no issues. Hell, it's hard to write anything not 100% CPU-limited at 720p on PC! :)

Again, I'm not arguing the economics... obviously what they can do in that size and price range is limited, but I don't buy the argument that consoles are really significantly "ahead" of PC hardware even at the start of their lifetimes. They are at best on par overall in most cases.

Sure, I agree to an extent. Even more so going forward: unlike perhaps in the past, consoles won't be able to be ahead of PC hardware when they come out. It can however be argued that when the X360 launched it was perhaps more capable than the "majority" of computers in use at the time.

It does make me wonder: if Xenos had access to as much dedicated graphics memory (512 MB) as R520, wouldn't it have proven more capable? As it is, having to share the limited memory pool with programs is going to severely limit what can be done with it, IMO. Not to mention the small memory pool available to programs itself after sharing with the GPU.

But yes considering cost factors, I don't see consoles one-upping PC hardware going forward. Even with the ability to target only one hardware configuration, nothing on the HD consoles (PS3/X360) is better than what could be accomplished on PC.

Regards,
SB
 
Well, with Xenos there was a huge leap in vertex processing ability. If you look at console games today and PC games targeting R5xx/G7x, there's a large disparity in detail.
I wouldn't say "huge", but sure. Still, vertex processing isn't everything, and I still can't get over how low-resolution the stuff the consoles spit out is. Comparing apples to apples, the same game on the same TV (let alone a real PC monitor), true 1080p output from a PC just looks way better than the consoles.

Overall I agree with you about consoles never overshadowing PC tech over the years, but IMO Xenos was an exception. Hell, I felt that it even outshone R600 architecturally when considering cost.
Fair enough, and I do agree to some extent. I wasn't a huge fan of R600 compared to the GF8's (but more recently I still love the 5870 more than the GTX 480), so I won't disagree there either. Still, I especially feel for console developers when I hear what they have to put up with compared to even GF8/DX10-level hardware.

It can however be argued that when the X360 launched it was perhaps more capable than the "majority" of computers in use at the time.
Yeah sure. Like I said, economically it's often a good deal and the "mainstream" PC market respects that.

It does make me wonder: if Xenos had access to as much dedicated graphics memory (512 MB) as R520, wouldn't it have proven more capable? As it is, having to share the limited memory pool with programs is going to severely limit what can be done with it, IMO.
Yeah, I can't speak from experience as I haven't played much with the 360, but from what I've heard the EDRAM in particular is pretty constraining (Humus had a blog post about this recently). Texture and scratch memory is a problem too. I wonder if, in retrospect, the hardware designers would have done anything differently (or at least added more EDRAM).
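
Rough math on why the EDRAM is constraining, assuming the commonly cited 10 MB on Xenos and 32-bit color plus 32-bit depth/stencil per sample:

/* Framebuffer size at 720p for common MSAA levels vs. the Xbox 360's
 * 10 MB of EDRAM, assuming 4 bytes color + 4 bytes depth/stencil
 * per sample. */
#include <stdio.h>

int main(void) {
    const double EDRAM_MB = 10.0;
    long pixels = 1280L * 720;          /* 720p */
    int bytes_per_sample = 4 + 4;       /* color + depth/stencil */
    for (int msaa = 1; msaa <= 4; msaa *= 2) {
        double mb = (double)pixels * bytes_per_sample * msaa / (1024 * 1024);
        printf("720p %dx MSAA: %5.1f MB -> %s\n", msaa, mb,
               mb <= EDRAM_MB ? "fits" : "needs tiling");
    }
    return 0;
}

Which lines up with what I've heard: 1x barely fits (~7 MB), and anything with MSAA at 720p has to be split into tiles.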
 
Texturing and ROP throughput wasn't so bad on G70-class products, and that's IMHO the main thing Sony needed the RSX for, not shaders.

"not shaders" .. You mean that just don't do any advanced light or surface effects and only do simple DX7 style effects?

Cell cannot work as a pixel shader processor; the pixel shader processors have to be in the graphics chip.
 
Texturing and ROP throughput wasn't so bad on G70-class products, and that's IMHO the main thing Sony needed the RSX for, not shaders.
It's *all* too slow for my liking and still has tons of esoteric limitations and performance cliffs! Obviously I'm jaded, but how am I supposed to be interested in RSX when the 8800 comes out the week before? :p Anyways I think this is getting a bit off topic... I should know better than to insult any part of a "current-gen" console in an unrelated thread.
 
I wouldn't say "huge", but sure.
Well it was a factor of six for VS heavy verts, e.g. skinned characters w/ bump mapping, and a factor of 20+ for vertex texturing unless you could spare the memory for R2VB. I call that pretty damn huge.

(but more recently I still love the 5870 more than the GTX 480)
That's surprising to me. I thought you just liked it for all the little things it did right compared to last gen, and figured that you'd get more of a kick out of the new stuff that Fermi brought to the table.
 
Well it was a factor of six for VS heavy verts, e.g. skinned characters w/ bump mapping, and a factor of 20+ for vertex texturing unless you could spare the memory for R2VB. I call that pretty damn huge.
Sure, theoretically. My response was related to the qualitative difference in games... I don't see a huge difference in geometric complexity in favor of the consoles, even for games of that generation of PC graphics cards. And sure, vertex texturing sucked at the time, so I definitely give the 360 a few points for beating the PC market somewhat, but it was short-lived... at *most* a year until the 8800s came out.

That's surprising to me. I thought you just liked it for all the little things it did right compared to last gen, and figured that you'd get more of a kick out of the new stuff that Fermi brought to the table.
Yeah, it was surprising to me too. Don't get me wrong: both are good, but so far the 5870 is more predictable and I've run into fewer performance glass jaws, let alone just annoying stuff (on NVIDIA, compute workloads lag the desktop and show highly variable performance, over *seconds* even; driver scaling is still broken; HDMI to TV is hit and miss; more hangs and crashes)... mostly stuff they can hopefully fix, but when it's rarely significantly faster and it's bigger, hotter, more expensive and 6 months late, I just can't bring myself to care that much. Gotta play more with the L1$ to be fair, but I've seen no miracles yet from the architecture.

Again, pretty off-topic though I suppose :)
 