You raise some very good and interesting points, Nick! Just wanted to say this right at the start, so you don't think I'm being overly negative towards you.
It's a tech demo. The first demos of pixel shaders also didn't look impressive at all
Hm, I remember the Doom 3 tech demo running on the GF3 looking quite impressive; certainly far more impressive to me back then than this does now...
You really have to look at what has been technically achieved here.
I don't see why I should worship this tech demo just because it's a technical achievement; it's a technical achievement that doesn't have the faintest hope of coming even close to the best rasterization IQ we have today.
Yes, pixel shading was pretty primitive six-ish years ago, but the computing power of today's GPUs is vastly higher than it was then... The GF3 that ran the Doom 3 tech demo had some 60 million-ish transistors as I recall, damn impressive for its time. The 5870 has well over 2.1 billion... that's a factor of roughly 35x. Of course pixel shading pretty much has to look much better today with such a vast increase.
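Just to show the back-of-envelope math behind that ~35x figure (the transistor counts are the commonly cited ballpark numbers, so treat this as a rough sketch rather than anything exact):

```python
# Rough transistor-count comparison, using commonly cited ballpark figures.
gf3_transistors = 60_000_000        # NV20 / GeForce 3: roughly 60 million
hd5870_transistors = 2_100_000_000  # Cypress / Radeon HD 5870: "well over 2.1 billion"

ratio = hd5870_transistors / gf3_transistors
print(f"~{ratio:.0f}x more transistors")  # -> ~35x
```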
What I'm trying to say here (perhaps not obvious, since I tend to be so damn verbose when I write) is that Larrabee raytraces a tech demo at a framerate somewhere in the 20s, with not particularly impressive visuals even if you ignore that it's a brown-textured id Software game; the only remarkable part is that it's raytracing in realtime. So if you bump the quality of the raytracing up to, say, Crysis at enthusiast-level settings, with antialiasing, motion blur, depth of field, tone mapping, the whole enchilada... what's the framerate gonna be THEN? Obviously not in the 20s anymore, but more like... 5. If that much, even.
Admittedly I'm pulling numbers out of thin air here, so I could be completely wrong of course. But it doesn't seem out of the realm of possibility, considering what a strain Crysis is on traditional rasterizers, which render far faster than a raytracing setup does.
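For what it's worth, here's the back-of-envelope version of that guess; both the 25 fps starting point and the 5x complexity factor are my own thin-air assumptions, nothing measured:

```python
# Purely illustrative scaling guess, not a benchmark.
demo_fps = 25          # assumed framerate of the raytraced tech demo
complexity_factor = 5  # assumed extra cost of Crysis-level shading, AA and post-effects

estimated_fps = demo_fps / complexity_factor
print(f"guesstimated framerate: {estimated_fps:.0f} fps")  # -> 5 fps
```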
For a long time achieving higher framerates and higher resolutions was the main goal, but this 'fillrate-race' of GPUs is about to end.
IMO that race has already ended; it ended around when the G80 was released and the real focus shifted to shading power rather than rasterizing power. It's true that rasterizing throughput increased then as well and continues to increase even now, since that's a necessity to drive overall performance, but it's not the main focus at all, as I think most would agree.
You still need more rasterization power to increase framerates at high levels of anti-aliasing. We're not yet at the point where Crysis runs at 200 fps or anything even close to it, especially now that ATI is dabbling in supersampling AA again; that's the first time a major GPU vendor has done that in more than half a decade or so...
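To put a rough number on why supersampling is so demanding compared to running without AA, here's a trivial sketch (the resolution and the 4x sample count are just example figures):

```python
# Illustrative only: supersampling shades and fills every sample,
# so 4x SSAA costs roughly 4x the per-frame work of no-AA at the same resolution.
width, height = 1920, 1200
ssaa_samples = 4

no_aa = width * height
ssaa = no_aa * ssaa_samples
print(no_aa, ssaa)  # -> 2304000 vs 9216000 shaded samples per frame
```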
An architecture like Larrabee is far better at coping with this, and no matter how you feel about the actual graphics that's proven by their raytracing demo. So even if raytracing itself is not the future, Larrabee allows the developers to go in any direction they like, unrestricted by rasterization APIs.
I don't see what actual NEED Larrabee fulfils with its raytracing demo. Are any devs at all truly asking to be liberated from traditional rasterization?
I seriously doubt it. Pretty much every devtool in existence is geared towards polygon-based rendering. Every major console and PC API is exclusively rasterization-based. With all the pressing concerns regarding development time and cost, I truly doubt major new rendering methods are going to be implemented independently by developers ANY TIME SOON. If ever, really.
Especially not on a proprietary platform owned and controlled by a single manufacturer, namely Intel, who has no real prior experience in high-end consumer 3D graphics or developer relations/support, has NO market share there, and won't have any market share to speak of for the next half-decade after Larrabee releases, even if they stick a Larrabee co-processor into every single CPU or chipset they sell (unlikely).
Unless both AMD and Nvidia go out of business entirely, which developer would sink millions into inventing their own completely custom Larrabee-based rendering setup that won't run on any other graphics processor out there?
But Intel certainly has a head start.
They certainly have a head start with something, but whether it's a head start that actually leads them to victory, rather than towards a gigantic left turn into a bog, is a different matter entirely. Lots of question marks remain regarding Larrabee's actual viability (as opposed to its theoretical potential), and Intel's continued bumbling on the matter and repeated deadline slips aren't helping.