Yet, pretty much all graphically acclaimed games on PS3 came with MSAA/QAA until GoW3.
What about 360 games? Gears? Halo *? There is Alan Wake if you can count it with that resolution.
edit: Oh yeah, I forgot about Crysis 2 not using MSAA either.
And yes, MLAA can be better than MSAA (definitely better than 2x). Think shader aliasing, post-resolve tone mapping, etc.: MSAA only smooths geometry edges, so it does nothing for aliasing introduced inside shaders, and tone mapping after the resolve distorts the edge gradients MSAA produced.
It's also programmable, and thus improvable, unlike fixed-function MSAA.
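Since MLAA runs as a post-process over the finished image, the core idea is easy to sketch. Here's a toy, purely illustrative version in Python/NumPy (my own simplification, not any shipped implementation): find luminance discontinuities between neighboring pixels, then blend across the detected edges. Real MLAA additionally classifies edge shapes (L/Z/U patterns) and computes coverage-based blend weights, which this omits.

```python
import numpy as np

def luma(img):
    # Perceptual luminance from RGB (Rec. 601 weights)
    return img @ np.array([0.299, 0.587, 0.114])

def mlaa_sketch(img, threshold=0.1):
    """Toy post-process AA: detect luminance discontinuities between
    neighboring pixels, then blend the two pixels straddling each edge."""
    y = luma(img)
    out = img.copy()
    # Vertical edges: large luma step between column x and x+1
    edges_v = np.abs(np.diff(y, axis=1)) > threshold
    # Horizontal edges: large luma step between row r and r+1
    edges_h = np.abs(np.diff(y, axis=0)) > threshold
    for r, c in zip(*np.nonzero(edges_v)):
        avg = 0.5 * (img[r, c] + img[r, c + 1])
        out[r, c] = out[r, c + 1] = avg
    for r, c in zip(*np.nonzero(edges_h)):
        avg = 0.5 * (img[r, c] + img[r + 1, c])
        out[r, c] = out[r + 1, c] = avg
    return out
```

Because it's just image-space code like this, a studio can tune the threshold, the blend weights, or the whole algorithm per game, which is exactly what you can't do with the GPU's fixed MSAA resolve.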
I think once again, you are just trying too hard.
Count how many games can't use MSAA due to render-engine incompatibility or whatever; one hand would probably have enough fingers to count them. Now count how many games can use MSAA; you'd need a lot of hands. All of those dozens of games were unable to use MSAA because of design choices made by Sony, specifically going with old GPU hardware that required lots of memory and delivered slow MSAA performance, and putting the budget into a heavily customized CPU instead. Which was my point all along: their decisions have consequences, and their decision to shortchange the GPU and focus on the SPUs meant years of games on that platform missed out on having any form of anti-aliasing, or had to settle for image-quality-destroying QAA. I don't see how anyone can view this as a positive design choice.
It depends on what you prefer. Some people prefer to have all the improvements in the beginning and see no real improvement later on.
If you buy a new sports car, would you prefer that it did 0 to 60 in 10 seconds for the first five years, and then finally do it in 5 seconds at year five? I mean seriously, who doesn't want their hardware to perform great from the get-go? Certainly not the people buying the product, especially at console launch prices. Certainly not the studios putting out the games, as they don't want to look bad. And certainly not shareholders, as they want to know that a competent product is out in the marketplace now, not hear hollow promises. Plus, both platforms have shown consistent quality improvements over the years; look at standouts like RDR or Castlevania and compare them to launch games. The PS3 appears to have more change in quality over the years, but that's only because its first years of titles looked and performed so badly.
Then, most like to see continued improvements throughout the console lifecycle.
Both have shown improvements, but the fact that the most impressive titles out are multiplatform shows (to me) that the CPU gamble didn't pay off at all.
There was an article about the 3 phases of coding on the PS3. I believe Mike Acton was speaking on this. The first phase was having everything or almost everything on the PPU with nothing to very little on the SPUs. The second phase was moderate usage of SPUs and some offloading of coding from the PPU. The third phase was light to no PPU usage for game code and heavy use of the SPUs. Sony's 1st party are on the 3rd phase, now. If you aren't on phase 3 or close to it at this point, it's legacy thinking to me.
That's totally true...except everyone hit phase three some time ago.
ND said the hardest part of taking full advantage of the Cell was to "keep all the plates spinning" (UC2 interview). Do you think any 3rd party devs have reached that point, yet? Until I see a Cell usage chart with associated jobs, I can't even say DICE is there. However, from their presentation, I applaud what they have done so far.
An SPU usage chart, as odd as it may sound, will give you precious little info as to whether the SPUs are being used effectively at all. I could show you a 50% usage chart that is far more impressive than a 100% usage chart. Additionally, a 100% usage chart doesn't mean 100% utilization of the SPUs, because they are dual-issue. So you could have a chart that shows 40% SPU usage and makes full use of dual issue, and hence is actually doing more work than a 70% usage chart that makes no use of dual issue. Finally, it's pretty darned easy to put out a chart that shows 100% SPU use. Percentage numbers mean almost nothing when comparing different games.
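To put rough numbers on the dual-issue point: each SPU has two execution pipelines (even and odd), so a busy cycle can retire up to two instructions. A back-of-the-envelope model in Python (the `dual_issue_fraction` parameter is my own simplification for this comparison, not a real profiler metric):

```python
def effective_throughput(busy_fraction, dual_issue_fraction):
    """Relative instructions per cycle for one SPU.

    busy_fraction: share of cycles the SPU shows as 'busy' on a chart.
    dual_issue_fraction: share of those busy cycles that issue to both
    the even and odd pipelines (2 instructions) instead of one.
    """
    return busy_fraction * (1.0 + dual_issue_fraction)

# 40% busy with full dual issue: 0.40 * 2.0 = 0.80 instructions/cycle
# 70% busy with no dual issue:   0.70 * 1.0 = 0.70 instructions/cycle
print(effective_throughput(0.40, 1.0))
print(effective_throughput(0.70, 0.0))
```

Under this toy model, the 40% chart really is doing more work than the 70% chart, which is why the raw percentage alone tells you so little.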
Personally, I don't understand all the "PS3 not pulling ahead of the 360 graphically" talk. I and most others seem to believe this happened some time ago. Most seem to believe the PS3 has pulled ahead in a number of categories (graphics, audio, A.I., and scale). I understand that you and some others don't subscribe to that, though.
You and I know this is infinitely debatable. Any "most others believe" statement can be countered with someone elses "most others believe" statement that says the complete opposite, and it never gets anywhere.
So you don't care for the better physics and audio the Cell affords games like UC2, Killzone 2 and 3? You wouldn't care for the additional space on Blu-ray that makes these games easier and better for the end user to experience (fewer discs, no loading screens, better-quality audio, etc.)? Would you have taken the HDD out as a standard option for more RAM as well? Of course, that also takes away a company's incentive to subsidize a console, which probably means far less of a budget to work with for the design.
I definitely would have ditched Blu-ray, but possibly kept the HDD standard. Standard CPU, killer customized GPU, more RAM, awesome dev tools from day one. That formula would have won them this gen easily. Load times would have been faster, since if they went with DVD then full game installs would likely have been supported; games would have looked better thanks to more RAM; parity would have been achieved from day one; etc. It would have been a win all around, and we wouldn't even be having this conversation, because right about now some studios would be dropping 360 support due to it being a distant third, as its lack of memory would have prevented it from visually keeping pace, and day-one parity along with a sub-$599 launch price would have ensured another round of PlayStation dominance.
FYI, Cell doesn't afford better physics, AI, etc.; it hasn't been a standout in that regard. No games have been AI standouts, really; even games like KZ3 and Reach constantly have AI that does dumb things. They mostly use Havok for physics, which runs on every platform on this planet and the next, so there is no physics advantage either. Remember as well that seeing physics, AI, etc. on those SPU charts doesn't mean the PS3 is doing more work. The 360 has three cores and the PS3 has one, so two cores' worth of general CPU load has to be moved to SPU no matter what, just to maintain parity. Seeing those types of loads on an SPU chart doesn't mean it has an edge in that category; every game *has* to put that stuff on SPU just to maintain CPU parity. I.e., they have had to be at phase three for a while now.
Flexibility means you can choose what you wish to put your resources into, just like some games choosing to render at a resolution lower than 720p. The whole "they have to, just to keep up with the graphics" part doesn't add up. It's highly unlikely they would try to improve these other areas. The proof is in the 360 multiplatform games: most of those have the 360 ahead in the graphics department, but there is zero improvement in any other areas. If your theory held water, there would be some advancement in "all other applications" on the 360. After all, those devs are supposed to be twiddling their thumbs while the PS3 version struggles to meet parity, right?
They still aren't at parity just yet; standouts like RDR and Assassin's Creed show that quite well, although it has been getting better in this sixth year of the console gen. Which was my point all along: the CPU gamble failed in that it handicapped them for most of the gen and provided no tangible improvement even years in. Years later they are still figuring out clever ways to shift loads to SPU, and there still is no game that is head and shoulders above multiplatform games like RDR, Castlevania, etc. The data out there couldn't be clearer to me: even if most people choose to ignore multiplatform games, their design choice to focus on the CPU was a failure.