Well, fixed function is obviously on its way out, to some extent. All GPUs are moving toward unified shaders and programmability. You have CUDA, OpenCL and compute shaders on the PC side. DX11 allows compute shaders to be grouped into thread groups that can share information through on-chip memory.
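To make that "grouped and sharing information" idea concrete, here's a minimal sketch of a block-wide sum reduction in CUDA - the same pattern a DX11 compute shader would express with `groupshared` memory. The kernel name and the 256-thread group size are just assumptions for the example:

```cuda
// Each thread group loads a tile into on-chip shared memory,
// then cooperatively reduces it to one value.
// CUDA's __shared__ is the analogue of HLSL's groupshared in DX11.
__global__ void blockSum(const float* in, float* out) {
    __shared__ float tile[256];            // visible to every thread in this group
    int tid = threadIdx.x;
    tile[tid] = in[blockIdx.x * blockDim.x + tid];
    __syncthreads();                       // barrier, like GroupMemoryBarrierWithGroupSync

    // tree reduction within the thread group
    for (int stride = blockDim.x / 2; stride > 0; stride >>= 1) {
        if (tid < stride) tile[tid] += tile[tid + stride];
        __syncthreads();
    }
    if (tid == 0) out[blockIdx.x] = tile[0];
}
```

You'd launch it with one group per 256 elements, e.g. `blockSum<<<n / 256, 256>>>(d_in, d_out);`. The point is that the threads in a group cooperate through fast local memory instead of round-tripping to VRAM - which is a big part of what makes compute shaders more than just pixel shaders with a new name.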
I'm not really knowledgeable about GPU hardware, but I'd be interested in a bird's-eye comparison of an SPE vs a compute shader or another GPGPU implementation. I'm assuming the GPGPU APIs would lack some of the flexibility of a SIMD processor's instruction set. That might be a good starting point for the conversation.
I guess something like AMD's Fusion is also very similar to a console implementation, where the CPU and GPU are closely tied and the line between the two is blurring. The point of the SPE and GPGPU is mostly the same - fast processing of parallel data - so in theory the solutions end up looking the same from a very high level. The PPU in Cell is more like a traditional CPU, the SPEs are SIMD, and RSX handles the GPU functions. In DirectX 11 you have a CPU, and instead of a general-purpose SIMD array you get similar functionality from the GPU side. So, in my mind, the PS3 and DirectX 11 are heading in the same direction from opposite ends of the spectrum.
Well, I have always been a big fan of Larrabee... even though it was obvious it would never succeed in the PC world, because most of its features would go unused. In that sense I agree with Shifty: closed hardware is genuinely exciting to develop for.
That's what I find appealing about Cell now that Battlefield 3 is coming out. Cell was originally meant to be a CPU and a GPU at the same time, so it's no accident that they are using it for graphics techniques.
Even so, I am a big follower of ATi and their GPUs, and they will always be there to help the main CPU achieve results you can't reach with software-only solutions.
I've also been a fan of software rendering since the voxel days, when programmers created a game running on voxels entirely on the CPU - I can't remember which game it was. Plus, PC games like the memorable and extraordinary Need for Speed 3 let people choose between software rendering and hardware-accelerated 3D rendering.
Of course GPUs are unbeatable beasts - a decent graphics chip can outrun even a heavily optimized CPU path - so the game looked better running on 3D-accelerated PCs, but the software option allowed a few effects not available in the GPU render engine, as far as I remember.
As I said, I love software rendering, and I was hoping Larrabee - or a similar chip by AMD/ATi - would be included in next-gen consoles, alongside a GPU with some necessary fixed functions. It would allow some of the crazy stuff I've longed for this generation, and it's certainly exciting to know that your machine has a chip inside with 50+ cores.
Speaking as someone who once and for all jumped on the software rendering bandwagon - after reading an iD Tech employee, Todd Hollenshead specifically, state that software rendering was the future, and how much better it is compared to hardware rendering alone - I can only say I have become a big fan of DICE and their engines. I loved how they used the eDRAM in Trials HD to achieve some extra effects that greatly improve image quality while still keeping the game's flawless smoothness, for instance.
Crytek, iD and DICE are nowadays my favourite developers when it comes to engines. iD and Carmack have always been a classic on my list.
Battlefield 3 will show that Cell was a great and genuinely differentiating choice by Sony, and that its possibilities weren't fully explored - not even in exclusive titles. "Playing God" in an attempt to help Cell and work around RSX is very much a good thing. Then again, the engine is written by humans, which says a lot about how grounded their work is in what they want to achieve, and the PS3 is not as powerful as some people used to believe.
Cell itself will be seen as a pioneering CPU in the future. Despite this, for a game such as Battlefield 3 I don't see it as a magic bullet that will make the graphics look great compared to the PC version, or even the 360 version. It's an old CPU competing against top-notch, full-fledged PC GPUs: too weak to be a great general-purpose CPU and too meek to act as a GPU. But the potential is there.
I have a feeling that the Xbox 360 version will also be outstanding from a console point of view. The Xbox 360 CPU, while not as versatile and powerful as Cell, is light-fingered and might steal the show in its own way. Performance-wise it has half of Cell's flops, which is still a great amount of power. And Xenos, as always, is of great help.
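The "half of Cell's flops" figure lines up with the commonly quoted single-precision peak numbers - marketing peaks, not sustained throughput, so treat this as ballpark arithmetic:

```latex
% Per SPE: 4-wide single-precision SIMD with fused multiply-add at 3.2 GHz
3.2\,\text{GHz} \times 4\ \text{lanes} \times 2\ \text{(FMA)} = 25.6\ \text{GFLOPS per SPE}
% Cell, as usually marketed (8 SPEs plus the PPU's VMX unit):
(8 + 1) \times 25.6 = 230.4\ \text{GFLOPS}
% Xenon: 3 cores, usually quoted at 12 flops/cycle/core (VMX128 plus its dot-product unit):
3 \times 3.2 \times 12 = 115.2\ \text{GFLOPS} = \tfrac{1}{2} \times 230.4\ \text{GFLOPS}
```

Of course, on the PS3 games only get 6 or 7 SPEs and nobody sustains those peaks, so the real-world gap is messier than a clean 2:1.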
Common sense tells me this, but maybe it's just me. (The same thing happened when the rampant PS3 piracy began. When everything seemed lost, common sense made me think that if it was all just about some numbers, why couldn't Sony simply change those numbers and things would be fine? The solution turned out to be the simpler idea. I didn't say a thing because I am not an expert on software and hardware, and I was unable to properly explain my theory.)
Anyway, DICE developers, welcome to my top-three-best-developers-ever list! Feel free to hang out here - repi, Christina, etc. - and look around the boards. Great stuff. Yay for you!