But I admit that the situation is pretty complicated.
I won't quote your whole post, but that is the key part anyway.
However, in an attempt to answer some of your points, I'll outline my situation. I have a 50" Samsung plasma with one HDMI port, so I use a switcher box (which works well, as I use a Logitech remote for everything except the PS3 controls, for obvious reasons). Into the switcher box go my SkyHD, PS3, 360 and DVD surround system. Of course, that means the one HDMI input on my TV is set to levels that best suit my viewing overall.
And it's pretty good. 360 games look great, with good colour, blacks and whites. PS3 Blu-rays have the right balance, with deep blacks and clean whites, and SkyHD looks good, as do DVDs. PS3 games, on the other hand, can often look washed out, as has been seen in many titles since the console launched.
So what to do? Recalibrate the TV every time I play a game? Not really likely, is it?
So it's up to the developers. Or maybe Sony should have set the colour/contrast levels of the GPU to give a more pleasing image from day one. It's long been known (and I say this as a fully paid-up Nvidia fanboy) that Nvidia GPUs, especially in the GeForce 1-7 days, offered a slightly more washed-out image than competing ATI GPUs. The internal colour calibration from the 8800s onwards has been much better.
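To illustrate what I mean by "washed out" (purely my own back-of-the-envelope sketch, not anything from Sony's or anyone else's actual output pipeline): the usual suspect is a video-levels mismatch, where a source sends "limited" range (black at 16, white at 235) but the display or input is set up for "full" range (0-255). The function below is just a hypothetical illustration of that mapping:

```python
# A minimal sketch, assuming the washed-out look comes from a limited-range
# signal (black = 16, white = 235) shown on an input expecting full range
# (0-255). Names and numbers are illustrative only.

def expand_limited_to_full(value: int) -> int:
    """Map a limited-range (16-235) level onto full range (0-255)."""
    scaled = (value - 16) * 255 / (235 - 16)
    return max(0, min(255, round(scaled)))

if __name__ == "__main__":
    for level in (16, 128, 235):
        print(f"limited {level:3d} -> full {expand_limited_to_full(level):3d}")
    # Without this expansion, "black" is displayed at 16/255 (a dark grey)
    # and "white" at 235/255 (never peak white) -- the washed-out look.
```

Run it and 16 maps to 0 and 235 to 255; skip the expansion (or calibrate the input for the wrong range) and blacks sit at a greyish 16/255, which is exactly the complaint.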
Which again brings me back to the "complicated" argument. Every time any comparison comes up, forums fill with "Youve been pade by M$, my PS3 dont look liek that!!!!!" comments. However, with all of the processing done by modern HDTVs, it's not likely to. On top of users calibrating the colour and contrast, most sets have internal scalers along with an assortment of processing features such as sharpening, smoothing and so on.
But it's not the job of the pixel-counter to second-guess how the millions of PS3/360 owners out there have their TVs set up, and any attempt to do so just muddies the waters and would render the whole "like for like" comparison fraudulent.
IMHO, of course.