True, but the proof is in the pudding: the 360 had games whose graphics were as good as the PS3's, and that had to get through to consumers on some level as well.
Consumers aren't dumb. Let's say Battlefield 4 comes out and it looks as good on 360 as on PS3. No matter how much they'd thought PS3 was more powerful, that has to register.
Well, you know that I of all people agree with that.
The thing is, I heard from back-channel sources (at which time I was dubious), and it was later confirmed in the Eurogamer Xbox One architects tech article, that MS was considering enabling the two redundant CUs very late in the game, and decided against it.
This was framed as a choice between the extra CUs and the upclock, but I think it needn't have been; they should have been able to do both, and I'm betting the costs would have been low as well. I've read they'll likely purge the additional redundant CUs once yields get better anyway. If that's true, the yield hit could not have been too great.
14 CUs at 853 MHz would have put them at 1.52 TF; at 900 MHz, had they chosen to bump the clock a little more aggressively, it would have been over 1.6 TF, and it would have enabled some additional TMUs as well. I think it more or less would have ended a lot of these power problems (1.5 vs. 1.8 is negligible-difference territory), and I bet the cost was exceedingly low.
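For anyone who wants to check the napkin math, here's a quick sketch using the standard GCN figures of 64 shaders per CU and 2 FLOPs per shader per clock (the gcn_tflops helper name is just mine):

    # Peak single-precision throughput for a GCN GPU:
    # CUs x 64 shaders/CU x 2 FLOPs per shader per clock x clock speed
    def gcn_tflops(cus, mhz):
        return cus * 64 * 2 * mhz * 1e6 / 1e12

    print(gcn_tflops(12, 853))  # shipped X1: ~1.31 TF
    print(gcn_tflops(14, 853))  # with the redundant CUs: ~1.53 TF
    print(gcn_tflops(14, 900))  # 14 CUs plus a 900 MHz upclock: ~1.61 TF
    print(gcn_tflops(18, 800))  # PS4, for the 1.8 comparison: ~1.84 TF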
BeyondTed, I think, told me that he thought they didn't overclock past 853 because it might have destroyed their ESRAM bandwidth-increase window. I kind of doubt this technically, given the way the architects described how that worked, and he didn't elaborate.
But I noticed the specs of the R7 260X the other day. This is the low end for AMD, basically the Bonaire refresh, and that thing is stock clocked at 1100 MHz! It's over 1.9 TF with 896 shaders! Such a shame MS couldn't have been more aggressive like this.
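(Same formula as above for the 260X: 896 shaders is exactly 14 CUs, and gcn_tflops(14, 1100) works out to ~1.97 TF, which is where the "over 1.9" comes from.)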
That said, the Kaveri APUs are enduring clock decreases, from a 4.0 GHz CPU in Richland to 3.7 in Kaveri. AMD said the reason is that their old process was optimized for CPU, and the new one is more balanced between CPU and GPU.
In other words, I'm guessing an SoC-optimized process (presumably used on Xbox) cannot clock as high as a strictly GPU-optimized one (presumably used on Bonaire). STILL, I think they could have done better than 853 MHz. I'm guessing their only concern on the upclock was heat/noise, with performance unfortunately dead last in priorities.
Now MS will have to spend a whole lot more on marketing and price cuts to sell an underpowered machine. It's such shortsighted management, imo.
Such a little thing, but I think it would have done wonders; perhaps their biggest single mistake was not enabling those two extra CUs for maybe $2 per unit, imo. If I buy a $500 machine, I want it to have some grunt.
Whew, all of that rambling; maybe a bit OT and technical, oh well!
In the end they did give us several performance improvements:
GPU 800>853 MHz
Magical ESRAM bandwidth increase 102>204 GB/s (quick math below)
CPU 1.6>1.75 GHz
And none of the GAF-sponsored rumored downclocks. So I guess I should be happy. I'm just kind of OCD, and it's going to bother me knowing I've essentially got a lowly 7770 in my console for the next 5+ years...especially since I was rocking a 4890 in my PC ages ago, and X1 isn't that much more powerful...
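On that "magical" ESRAM doubling, here's the quick math as I understand it from the architects' interview (assuming the commonly cited 1024-bit interface per direction):

    # ESRAM bandwidth sketch, assuming a 1024-bit (128 bytes/cycle)
    # path in each direction
    bus_bytes = 128                   # 1024 bits / 8
    old = bus_bytes * 800e6 / 1e9     # 102.4 GB/s one way at the old 800 MHz
    new = bus_bytes * 853e6 / 1e9     # ~109.2 GB/s one way after the upclock
    peak = new * 2                    # ~218 GB/s theoretical, read + write together
    print(old, new, peak)

The quoted 204 GB/s sits a bit under that ~218 GB/s theoretical peak because, per the architects, a read and a write can only be paired on roughly seven out of every eight cycles.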
I wonder: after feeling the backlash over 720p-gate, would the MS execs enable those two CUs if they could go back in time?
I bet streamlining the drivers is a priority for them now; Thuway's rumored 10% driver performance improvement may be a hint. Microsoft is probably scrambling to lessen multiplatform differences next time around.