I suppose 225W is being considered as a total CPU+GPU thermal envelope, but dealing with that much heat from one chip strikes me as a Bad Move. Thermal reduction over the life of the platform will be slow given diminishing returns from process shrinks, so the box will run hot across every SKU iteration, even at mainstream prices. That would need an expensive cooling solution, either pushing the cost up or significantly cutting profit margins at the lower price end.
If Larrabee's to be used, I see it more in a Wii-like role: a lower-cost platform that's 'good enough' rather than 'cutting edge' in performance, but very developer friendly. This would fit in better with the New World Order of mainstream, wide gaming appeal. If we're gonna have a box that everyone's playing, keeping it cheap to suit all pockets makes sense. A single CPU/GPU solution would fit very nicely into that system, offering developers supreme flexibility in the processing budget: whether to throw the majority of that power at visuals, or to rein back on the visuals for a title that uses the processing power for other things.
Also, as I understand it, the preferred rendering model will be TBDR. This will reduce RAM BW requirements, which is good for keeping overall system costs down. Will that actually become a preferred rendering model for other architectures, though? I'm not seeing a huge difference between Larrabee and Cell in overall processor architecture; a Cell-based PlayStation could presumably manage much the same. What about other GPUs? Would a honking great nVidia GPGPU require more RAM BW overall, or will we see more flexibility in how the GPU IHVs handle rendering, offering effective low-bandwidth solutions?
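To show why tiling saves external bandwidth, here's a rough sketch of the binning idea behind TBDR-style software renderers. This is my own simplified illustration, not Larrabee's actual pipeline; the tile size, struct names and the flat binning scheme are all assumptions.

```cpp
// Minimal sketch of tile binning: render one screen tile at a time so colour/Z
// stay in a small local buffer, and external RAM only sees one write per pixel.
#include <algorithm>
#include <cstdint>
#include <iterator>
#include <vector>

constexpr int kTileSize = 64;                 // assumed on-chip tile dimension
constexpr int kWidth = 1280, kHeight = 720;
constexpr int kTilesX = (kWidth + kTileSize - 1) / kTileSize;
constexpr int kTilesY = (kHeight + kTileSize - 1) / kTileSize;

struct Tri { float x0, y0, x1, y1, x2, y2; std::uint32_t colour; };

int main() {
    std::vector<Tri> scene = { /* ... triangles submitted by the app ... */ };

    // Pass 1: bin each triangle into every tile its bounding box touches.
    std::vector<std::vector<const Tri*>> bins(kTilesX * kTilesY);
    for (const Tri& t : scene) {
        int minTx = std::max(0, (int)std::min({t.x0, t.x1, t.x2}) / kTileSize);
        int maxTx = std::min(kTilesX - 1, (int)std::max({t.x0, t.x1, t.x2}) / kTileSize);
        int minTy = std::max(0, (int)std::min({t.y0, t.y1, t.y2}) / kTileSize);
        int maxTy = std::min(kTilesY - 1, (int)std::max({t.y0, t.y1, t.y2}) / kTileSize);
        for (int ty = minTy; ty <= maxTy; ++ty)
            for (int tx = minTx; tx <= maxTx; ++tx)
                bins[ty * kTilesX + tx].push_back(&t);
    }

    // Pass 2: per-tile rendering. All the overdraw/read-modify-write traffic
    // hits this 16KB local buffer instead of external RAM.
    std::vector<std::uint32_t> framebuffer(kWidth * kHeight, 0);
    std::uint32_t tileColour[kTileSize * kTileSize];
    for (int ty = 0; ty < kTilesY; ++ty) {
        for (int tx = 0; tx < kTilesX; ++tx) {
            std::fill(std::begin(tileColour), std::end(tileColour), 0u);
            for (const Tri* t : bins[ty * kTilesX + tx]) {
                // ... rasterise *t into tileColour, depth-testing locally ...
                (void)t;
            }
            // Single burst write of the finished tile to external memory.
            for (int y = 0; y < kTileSize && ty * kTileSize + y < kHeight; ++y)
                for (int x = 0; x < kTileSize && tx * kTileSize + x < kWidth; ++x)
                    framebuffer[(ty * kTileSize + y) * kWidth + tx * kTileSize + x] =
                        tileColour[y * kTileSize + x];
        }
    }
}
```

The point is that the depth buffer and all the overdrawn fragments never leave the chip; only finished tiles get flushed out, so external BW scales with resolution rather than with scene complexity.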
And could Larrabee be paired with eDRAM for some very fast framebuffer options?
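For a rough feel of what's at stake, here's a back-of-envelope on framebuffer traffic. Every figure is my own assumption (720p, 4xMSAA, 32-bit colour + 32-bit Z, ~3x overdraw with read-modify-write, 60fps), not anything from a spec sheet:

```cpp
// Back-of-envelope framebuffer traffic, to show why keeping colour+Z on-chip
// (eDRAM or a tile buffer in cache) is attractive. All figures are assumptions.
#include <cstdio>

int main() {
    const double pixels   = 1280.0 * 720.0;  // 720p
    const double msaa     = 4.0;             // 4x multisampling
    const double bytes    = 4.0 + 4.0;       // 32-bit colour + 32-bit Z per sample
    const double overdraw = 3.0;             // average shaded samples per pixel
    const double fps      = 60.0;

    double perFrame = pixels * msaa * bytes * overdraw * 2.0;  // x2: read + write
    double perSec   = perFrame * fps;
    std::printf("Framebuffer traffic: %.1f MB/frame, %.1f GB/s\n",
                perFrame / 1e6, perSec / 1e9);
    return 0;
}
```

Under those assumptions that works out to roughly 177 MB per frame, around 10-11 GB/s sustained, all of which would otherwise be competing with CPU and texture traffic on a shared external bus. That's exactly the load an eDRAM framebuffer, or a TBDR keeping its tiles in cache, takes off the main RAM.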