Sony could have gone with a potent OoO PPC/Power core, though. And Cerny was investigating x86, not HSA, and concluded it was okay (in 2007), so the devs' complaints must have been about x86 as they understood it back then. I expect many of Sony's first parties hadn't had enough recent low-level x86 experience to appreciate the architecture's potential, but I'm not really seeing the obvious issues with x86, nor what PPC could do better aside from tools and experience (and considering most folks are coding in higher-level languages, I'm not sure what value x86 experience really brings anyway!).
By late 2007, other non-technical trends were already kicking in that favored x86 over PPC.
Apple had dumped PowerPC by then, a sign that IBM saw no benefit in pushing the design envelope in that product range. There were, and are, very few vendors capable of designing large high-performance OoO cores and the infrastructure needed to feed them, and IBM was phoning it in.
This also followed other developments that may have worried Cerny, such as IBM's wrong turn into the in-order POWER6.
IBM showed a willingness to invest in very large POWER server chips, but it kept the crown jewels far from the mortal chips Sony would have wanted.
The stagnation would have been worrisome. ISA differences, barring exceptionally bad choices, are a second-order effect: eminently survivable in the power and cost envelope of a console, and something money and engineering can overcome. Just knowing that an architecture is going to keep iterating means it can grow into a better position than an ostensibly superior evolutionary dead end.
AMD and Intel were stepping things up, and by that year AMD had announced SSE5, which proposed vector extensions with better permute capability and FMA. This suggested x86 was going somewhere useful.
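As an aside on what FMA buys: SSE5 never shipped in that form (it was reworked into XOP/FMA4 and, eventually, the FMA3 extension everyone uses today), but here's a minimal sketch of a fused multiply-add via modern x86 intrinsics. The intrinsic names below are today's FMA3 forms, not SSE5's, so treat it purely as an illustration of the operation rather than of the 2007 proposal.

```c
// Build with something like: gcc -O2 -mavx -mfma fma_demo.c
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    // Each 256-bit register holds eight packed floats.
    __m256 a = _mm256_set1_ps(2.0f);
    __m256 b = _mm256_set1_ps(3.0f);
    __m256 c = _mm256_set1_ps(1.0f);

    // Fused multiply-add: d = a*b + c in one instruction, with one rounding step.
    __m256 d = _mm256_fmadd_ps(a, b, c);   // each lane: 2*3 + 1 = 7

    float out[8];
    _mm256_storeu_ps(out, d);
    printf("lane 0 = %f\n", out[0]);        // prints 7.000000
    return 0;
}
```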
By the time IBM got back to OoO, it was in server chips it historically would never let out of the vault anyway, and it was receding in mindshare and lacking in GPU tech.
It looks to me that Sony and Microsoft probably called this one right.
On the other hand, those would have been some scary years, since the call was made just as AMD proceeded to flounder architecturally.
The two-horse race Cerny may have hoped would keep driving design evolution fell apart, but Intel did continue to innovate, and AMD's Fusion initiative at least showed real design ambition and progress on the GPU front, where IBM had no presence at all.
It's not clear whether the core philosophy that went into Bobcat, and then into Jaguar, had really come into its own at that point.
If AMD hadn't screwed the pooch so badly, one has to wonder whether the rumored Orbis with a Steamroller APU could have been where Cerny wanted to go. Jaguar isn't a bad single-threaded performer, but I feel its selection is also something of a repudiation of the direction AMD tripped itself into with its big cores.
I wonder if Cerny lost any sleep in the years when Bulldozer was losing to its own predecessors and Llano's delays were dragging down the Fusion effort. Then there was the prospect of losing AMD as a design alternative altogether, and having to face Intel's much stronger bargaining position and its very unimpressive graphics of that time frame.