...and look how it all turned out. Quality content from day one, tool and dev support from day one, etc., because Xenos did all the work. If a company has the ability to shove a totally customized CPU *and* GPU in there and somehow manage the financials and the heat issues, then more power to them, go for it and get the best of both worlds. If not, then screw the CPU and go nuts on the GPU. Personally I'd still screw the CPU even if it was possible, and instead go with a kicking GPU and use the money saved by ditching an exotic CPU to put more RAM in there.
It depends on what you prefer. Some people prefer to have all the improvements at the beginning and no real improvement later on. That is usually terrible for a long-term lifecycle. Most others like to see continued improvement shown throughout the console's lifecycle. That usually helps sell more consoles toward the end of the lifecycle; people feel like they are getting top quality at bargain-basement prices. Can't you tell how the general mentality of people is shifting all around you at the halfway mark?
This type of comment always comes up and I'll always ask the same thing: what makes people think that in year 6 games are still being made with legacy thinking on the PS3? If they were, they would look and run like utter crap. It's a fundamental issue that comes up repeatedly, most likely because people are still confused as to why the PS3 hasn't pulled ahead of the 360 graphically, so they always fall back on "not using SPUs", "legacy thinking", and similar arguments. This really is nonsense at this point. As I've said in other posts, look back at what a PC game on a 7-series NVIDIA card with 512MB of RAM looked like back in the day, then look at PS3 games today, and it will give you a clue as to how heavily the SPUs are being used. Anyone still running on legacy thinking on the PS3 is basically dead in the water at this point.
There was an article about the three phases of coding on the PS3; I believe Mike Acton was speaking on this. The first phase was having everything, or almost everything, on the PPU with little to nothing on the SPUs. The second phase was moderate use of the SPUs and some offloading of code from the PPU. The third phase was light to no PPU usage for game code and heavy use of the SPUs. Sony's 1st party studios are on the third phase now. If you aren't on phase 3, or close to it, at this point, it's legacy thinking to me. ND said the hardest part of taking full advantage of the Cell was to "keep all the plates spinning" (UC2 interview). Do you think any 3rd party devs have reached that point yet? Until I see a Cell usage chart with the associated jobs, I can't even say DICE is there. However, from their presentation, I applaud what they have done so far.
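To make the phase 3 idea concrete, here's a rough sketch of the kind of job-queue setup that pattern implies. This is plain C with pthreads standing in for the SPU runtime, so every name in it is made up for illustration and it is NOT real Cell SDK code: the main thread (the PPU in Cell terms) only builds and submits small self-contained jobs, while the workers (the SPUs in the real thing) chew through them, which is the "keeping all the plates spinning" part.

/* Sketch of a "phase 3" style job system: the main thread (think PPU)
 * only submits small jobs; worker threads (think SPUs) do the work.
 * Plain C + pthreads as an analogy, not actual Cell SDK code. */
#include <pthread.h>
#include <stdio.h>

#define NUM_WORKERS 6   /* a PS3 game has 6 usable SPUs */
#define QUEUE_SIZE  64

typedef struct {
    void (*run)(void *data);  /* the job kernel */
    void *data;               /* its input/output block */
} job_t;

static job_t queue[QUEUE_SIZE];
static int head = 0, tail = 0, done = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t ready = PTHREAD_COND_INITIALIZER;

/* submit a job; no full-queue check, which is fine for a sketch */
static void submit(void (*run)(void *), void *data) {
    pthread_mutex_lock(&lock);
    queue[tail].run = run;
    queue[tail].data = data;
    tail = (tail + 1) % QUEUE_SIZE;
    pthread_cond_signal(&ready);
    pthread_mutex_unlock(&lock);
}

/* each worker drains jobs until the queue is empty and we're done */
static void *worker(void *arg) {
    (void)arg;
    for (;;) {
        pthread_mutex_lock(&lock);
        while (head == tail && !done)
            pthread_cond_wait(&ready, &lock);
        if (head == tail && done) {
            pthread_mutex_unlock(&lock);
            return NULL;
        }
        job_t job = queue[head];
        head = (head + 1) % QUEUE_SIZE;
        pthread_mutex_unlock(&lock);
        job.run(job.data);  /* the "SPU" runs the kernel */
    }
}

/* example kernel: pretend this is animation blending, audio mixing,
 * culling, or whatever else you've moved off the "PPU" */
static void blend_anim(void *data) {
    printf("blending anim set %d\n", *(int *)data);
}

int main(void) {
    pthread_t workers[NUM_WORKERS];
    int ids[8];
    for (int i = 0; i < NUM_WORKERS; i++)
        pthread_create(&workers[i], NULL, worker, NULL);
    for (int i = 0; i < 8; i++) {
        ids[i] = i;
        submit(blend_anim, &ids[i]);
    }
    /* in a real frame loop the main thread would go build the next
     * batch of jobs here instead of just shutting down */
    pthread_mutex_lock(&lock);
    done = 1;
    pthread_cond_broadcast(&ready);
    pthread_mutex_unlock(&lock);
    for (int i = 0; i < NUM_WORKERS; i++)
        pthread_join(workers[i], NULL);
    return 0;
}

The point of the sketch is the shape, not the details: phase 1 would be all of blend_anim and friends running on the main thread, phase 2 a few of them pushed out, and phase 3 the main thread doing almost nothing but feeding the queue fast enough that no worker ever starves.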
Personally, I don't understand all the "PS3 not pulling ahead of the 360 graphically" talk. I and most others seem to believe this happened some time ago. Most seem to believe the PS3 has pulled ahead in a number of categories (graphics, audio, A.I., and scale). I understand that you and some others don't subscribe to that, though.
I wonder what people will do if the PS3 version turns out, in the end, to still have no overall advantage, even with the Frostbite 2 engine. It will be funny to see if the Frostbite guys get hit with the "lazy dev" tag if, after all is said and done, the 360 version ends up keeping pace with the PS3 version.
Based on the presentation, I don't think anyone would call DICE's effort "lazy". That doesn't mean they don't still have a lot of room to improve their techniques on the PS3, however; that would hold true even if the 360 version ends up not keeping pace with the PS3 version.
But the PS3 would have been better off graphically if they had put the same piece-of-crap CPU the 360 has into their machine, along with a better GPU. They probably would have won the console war that way as a side bonus. If they had additionally ditched Blu-ray and used the money saved to put more RAM into the machine, then they would have handily won this gen really quickly, and had significantly better graphical return than what you see now. Spending money on the CPU is, in my mind, the wrong choice for so many reasons: business reasons, development reasons, and long-run graphical-return reasons. Plus, remember that a pound of art talent is worth ten pounds of tech talent, so if you really just want graphical return, put more RAM in there and let the artists do their thing. I really doubt you will ever again see a gaming machine as imbalanced as the PS3, a machine with a rockstar CPU combined with a geriatric GPU. I'd put my money on that never happening again; the lesson has been learned.
So you don't care for the better physics and audio the Cell affords games like UC2, Killzone 2 and 3? You wouldn't care for the additional space of Blu-ray that makes these games easier and better for the end user to experience (fewer discs, no loading screens, better-quality audio, etc.)? Would you have taken the HDD out as a standard option for more RAM as well? Of course, that also takes away a company's incentive to subsidize the console as much, which probably means far less of a budget to work with for the design.
But that's exactly what I'm talking about. Every dev spends all their resources just keeping up in graphics, and all other applications remain primitive. Where are those truly next-gen games that do more with these consoles compared to a PS2/Xbox? So there are more AI characters running around in Reach, but fundamentally we're stuck playing the same gameplay; it's just prettier, louder, and there's a bit more of everything. That is not what we were promised, but it all went sideways with the graphics race.
Flexibility means you can choose what you wish to put your resources into, just like some games choosing to render at a resolution lower than 720p. The whole "they have to spend everything just to keep up with the graphics" part doesn't add up; it's highly unlikely they would try to improve those other areas either way. The proof is in 360 multiplatform games: most of them have the 360 ahead in the graphics department, but there is zero improvement in any other area. If your theory held water, there would be some advancement in "all other applications" on the 360. After all, those devs are supposed to be twiddling their thumbs while the PS3 version is struggling to reach parity, right?