No one is talking about *switching* to raytracing, but if you follow the literature it's pretty well accepted that it's a useful tool to have in the box, and thus an architecture that can both raytrace and rasterize "efficiently" is clearly a win.
This is a technical demo that showcases one particular application of the flexibility that Larrabee affords, not an attempt to reinvent the graphics pipeline overnight. NVIDIA has recently given similar demos of raytracing running on their architecture, and they likewise weren't trying to imply that we're going to stop using rasterization any time soon...
(As an aside, it's really ironic that I'm the one who has argued, and will continue to argue, strongly that rasterization isn't going away and that it would be stupid to replace it with raytracing. Conversely, people seem to have swung so far the other way that they think raytracing isn't useful *at all*, which is an equally stupid stance. To dredge up an old example, it's funny to me that people continue to argue about whether apples or oranges are "better"... let's just have both.)
I see this comment thrown around a lot, and superficially it seems to hold water... but in reality, I wonder whether people have the facts/numbers to actually back it up (particularly from the hardware point of view), or whether they're just making a lot of assumptions. I tend to give hardware designers the benefit of the doubt with respect to making good decisions, but hey, I'm no hardware expert and maybe the people making these comments are.
Still, if that's the case, I'd be interested in seeing the facts/logic backing up the assertion rather than more vacuous statements.