Do we have any figures or research supporting this view? I know multiplatform issues make a lot of noise on the internet, but are sales really impacted by the knowledge that one platform's version is inferior to another?
Well, you can't really release the same game twice, once with and once without issues. But at the end of the day, this is more of a marketing thing and we're engineers. We try to deliver what marketing wants, and they don't want inferior versions.
So the pressure may not come from the market itself, but it's still there.
On a less hearsay basis, I promise you will see a lot fewer cross-marketing opportunities from the manufacturer of the inferior version. This may or may not impact sales, but it will impact your own marketing budget/clout.
Including the install and persistent data space, or just system cache?
I have to wave the NDA flag here. But think about a limited HDD and an unlimited number of games and you'll get the idea. Sorry.
I believe that, at least on Linux, the PS3 hypervisor reroutes SPE IO calls to the OS on the PPE via soft interrupts. I think standalone performance is pretty similar to PPE IO calls.
Well, inside an SPE, the SPU can only communicate through channels. Those are very minimalistic and basically lead to the MFC. The MFC is mostly an overglorified DMA processor, so it moves data to and from local store. If you want to access OS services, you can of course use a background thread that waits for commands from the SPEs and then processes them. But what do you use the SPU for then? (Disclaimer: I'm not an expert in writing operating systems on Cell. I may be missing something here.)
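To make the channel/MFC split a bit more concrete, here's a minimal SPU-side sketch, assuming the standard spu_mfcio.h intrinsics from the Cell SDK. The effective address, buffer size, function name, and the idea that a PPE-side background thread is watching the outbound mailbox are all hypothetical, just to illustrate the point above.

```c
#include <spu_mfcio.h>   /* MFC DMA and mailbox intrinsics (Cell SDK) */
#include <stdint.h>

#define DMA_TAG 3

/* Local-store buffer; MFC transfers want 128-byte alignment. */
static char buffer[16384] __attribute__((aligned(128)));

void process_and_request_io(uint64_t src_ea, uint32_t request_id)
{
    /* Pull the input from main memory into local store via the MFC. */
    mfc_get(buffer, src_ea, sizeof(buffer), DMA_TAG, 0, 0);
    mfc_write_tag_mask(1 << DMA_TAG);
    mfc_read_tag_status_all();              /* block until the DMA completes */

    /* ... crunch the data in local store ... */

    /* Push the result back to main memory. */
    mfc_put(buffer, src_ea, sizeof(buffer), DMA_TAG, 0, 0);
    mfc_read_tag_status_all();

    /* The SPU itself can't call the OS, so signal a PPE-side background
     * thread through the outbound mailbox and let it do the actual IO. */
    spu_write_out_mbox(request_id);
}
```

The PPE side would just sit on the SPE's mailbox and translate the request into whatever OS call is needed, which is exactly the "background thread" trade-off mentioned above.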
I'm still not convinced that project directors are misallocating resources or making bad technical decisions, which is a fundamental but unstated component of your point of view.
It's not misallocating, it's just sub-optimal. Basically, people who have to decide methodology for large shops *should* be risk-averse. It's a good thing, I guess, most of the time. People need time to switch methodologies, some take longer than others. Now if you have a 30-programmer team and your hot-shot engine programmers go fully job-driven, this will cause trouble for the less technically inclined people. And on the other hand, your hot-shot tech dudes don't usually make these decisions.
You can't make something Cell-friendly for free, and you can't get Xenon benefits for free.
I'm not totally sure if I'm on the same page as nAo (sorry for inserting myself into your discussion, BTW), but I don't see it as Cell vs. Xenon. Yes, writing the SPU code takes time and writing the VMX128 code takes time as well. These are the platform specifics. The rest is architecture you would even want on the PC. Interestingly, one of the first instances of "we have to do it like that" in our project came from the PC tech-god.
For any given multiplatform budget, you will spend more on coding and less elsewhere if you want PS3 parity. This, and this alone, is my point.
Point taken. It all depends on where you want to compete. I can make awesome XBLA titles that run on one core. If I want to compete with the big boys, I'll have to get as much out of the machine as possible. But sure, if I compete on gameplay and not on tech, PS3-like coding is overkill, no doubt. Even on the PS3.
Okay, what you're saying is:
SMP-optimal model = Cell-optimal model, and vice versa.
But non-Cell-optimal model != non-SMP-optimal model. So there is still a difference between these two models.
Of course! There are more models than those two. Hell, computer science is so full of models, you'd think it's fashion week.