B3D is older than that.
I meant in its current incarnation.
More seriously though, why do companies divulge their roadmaps three years ahead? That never made sense to me, unless you're in a situation where you have nothing else to lose. AMD, I can understand. Intel doesn't seem to be in that position though.
Easy answer and one that bears constant repetition. The point of such statements has nothing to do with what the state of things will be like in three years; rather, it has to do with how the company is perceived by investors *today*.
I think you might be putting some misplaced faith in Larrabee--it has to win meaningful market share to get developers to write for it (since I'm guessing it's not just going to be Direct3D), but there have to be games for it in order for anyone to buy it as a graphics board.
They spent a lot of time hyping RT, but they really didn't tell us why we should get hyped about it. I don't think gamers will switch to Larrabee just because they can get a local reflection here and a bit of dynamic ambient occlusion there, given that rasterization can fake both right now.
I think you need to give some practical examples here, because I don't see what RT will enable us to do that we can't do right now (or in 2 years).
OpenMP is a mixed bag in my experience. My personal feeling is that one of its major failings is that it's too easy to use (paradoxical as that may seem). It has some serious shortcomings in certain areas (e.g. memory placement). It's a great way to scale from 1 to 10 threads, but it's not so hot going from 10 to 100. IMO it hides too much from the programmer for it to be viable for extreme scalability on a NUMA architecture.
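To make the "too easy to use" point concrete, here's a rough sketch (my own toy example, hypothetical problem size, nothing from a real codebase): one pragma parallelizes the loop just fine, but nothing in it says anything about where the data lives, which is exactly what bites you on a NUMA box.

```c
/* Toy example: the pragma makes parallelization trivial, but with first-touch
 * placement the serial init below puts every page of a[] and b[] on one node,
 * so at high thread counts everyone is hammering that node's memory bus. */
#include <omp.h>
#include <stdio.h>
#include <stdlib.h>

#define N (1 << 24)   /* hypothetical problem size */

int main(void)
{
    double *a = malloc(N * sizeof *a);
    double *b = malloc(N * sizeof *b);
    if (!a || !b) return 1;

    /* Serial init: first touch decides placement, and OpenMP gives the
     * programmer no portable way to say otherwise. */
    for (long i = 0; i < N; i++) { a[i] = 0.0; b[i] = (double)i; }

    /* The "easy" part: one line, and it scales nicely to ~10 threads. */
    #pragma omp parallel for
    for (long i = 0; i < N; i++)
        a[i] = 2.0 * b[i];

    printf("a[42] = %f, max threads = %d\n", a[42], omp_get_max_threads());
    free(a); free(b);
    return 0;
}
```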
This is OT - I'm just starting to play with OpenMP a bit. There's one area that's really confusing me atm: how are OpenMP runtime errors communicated to a user program? For instance, say the runtime fails to create the desired number of threads, or fails to create a mutex, etc. Are there any standard ways to hook in an error handler (I'm not seeing any), or does the program just fall over and die?
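To show the kind of situation I mean (toy example, deliberately silly thread count, only standard omp_* calls): as far as I can tell the request below is just a request, and I don't see any error path if the runtime can't honor it, other than asking for the actual team size afterwards.

```c
/* Sketch of the question: omp_set_num_threads() is only a request, and I
 * can't find any handler that gets called if the runtime can't create that
 * many threads -- the closest thing to a check seems to be comparing the
 * team size you actually got. */
#include <omp.h>
#include <stdio.h>

int main(void)
{
    omp_set_num_threads(1024);   /* hypothetical, deliberately large request */

    int got = 0;
    #pragma omp parallel
    {
        #pragma omp single
        got = omp_get_num_threads();   /* how many threads we really got */
    }

    /* No callback, no error code -- all we can do is compare after the fact. */
    if (got < 1024)
        fprintf(stderr, "runtime gave us %d threads, not 1024\n", got);
    return 0;
}
```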
The promise of Larrabee as a GPU, it seems to me, is in the real-time ray tracing stuff that Intel is developing, and in how that combines with raster and other techniques to yield the possibility of new types of eye candy (or physics, AI, etc.). At least, Intel spent enough time hyping RTRT at their Research@Intel day that this is clearly the direction they're headed with Larrabee.
So what I envision as the post-Larrabee horserace won't so much be an Intel Larrabee GPU vs. an NVIDIA GXX GPU duking it out for FPS scores on the exact same game engine, as it will be Larrabee (which is "GPU" only in scare quotes) plus whatever "software" engine someone builds with it vs. a bona fide, raster-rendering GPU from NVIDIA plus the state-of-the-art hardware-accelerated raster engine at the time.
My ultimate point is this: I'm betting that when people look at the kinds of games you can build with Larrabee vs. the kinds of games you can build with the NVIDIA GXX, they may actually like the Larrabee option.
The argument used above is that Larrabee will make NVIDIA and AMD obsolete because it's x86 and it can do ray tracing. As soon as you go DX, you lose both of these "advantages". However, to be compelling for gaming, Larrabee must support DX-next.

Dave said:
Why presume that it wouldn't / couldn't use DX?
This is entirely key to any headway this architecture makes in the consumer graphics market. It's also pretty clear that we just don't know enough about the architecture to even guess at its performance at this point. A huge amount of detail needs to be fleshed out before we can find a fit for it in the graphics space.

I'm not presuming it can't use DX at all. I asked the question because I don't see any reason to presume it *can*, or that if it does that the DX support will be the competitive advantage they are relying on. Most pointedly, I want to know how they intend ISVs to integrate the ray-tracing stuff on the software side... thru bog-standard x86, thru DX, thru some proprietary api/sdk/etc that leverages x86, or thru some proprietary software stuff that leverages something proprietary on the hardware side? (Did I miss an option? :smile: )
Do you mean using Larrabee as a gamer's CPU?

Maybe they'll sell first gen as a combo CPU and ray-tracing accelerator. Sort of an integrated PhysX... and you still use your Radeon/GeForce for the DX stuff.
While I don't disagree with this, I think it isn't always about investors. If you need major support elsewhere in the chain you need to rally/interest/educate those other sectors too, and that also takes time. Whether it be ISVs, OS API, tools makers, whatever.