I think it's more than that; the only things the author listed as advantageous for the new consoles are OoOE and some other minor architectural enhancements.
He's dumbing it down for the audience. Nowhere did he actually estimate what the difference in perf/MHz is like; he just said it's different. OoOE is not at all a dominating factor compared to all those other things I listed. You can find in-order CPUs that have much better perf/MHz than the PPE (Intel's Saltwell; ARM's Cortex-A53 will be an even better example), and they're still only dual-issue. You can't ignore all the serious performance problems these CPUs have when not dealing with very data-regular code.
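To make that concrete, here's a minimal C sketch of the distinction (names and data structures are illustrative, not from any real game codebase):

```c
#include <stddef.h>

typedef struct Node { struct Node *next; int value; } Node;

/* Data-regular: independent, sequential loads. Even a narrow in-order
 * core (PPE, Saltwell, Cortex-A53 class) keeps its pipeline busy here. */
long sum_array(const int *a, size_t n) {
    long s = 0;
    for (size_t i = 0; i < n; i++)
        s += a[i];
    return s;
}

/* Data-irregular: every load depends on the previous one. With no OoOE
 * machinery to run ahead and hide cache misses, an in-order core mostly
 * sits stalled on each p->next that misses the cache. */
long sum_list(const Node *head) {
    long s = 0;
    for (const Node *p = head; p != NULL; p = p->next)
        s += p->value;
    return s;
}
```

Game logic tends to look a lot more like the second loop than the first, which is exactly where the old in-order cores fell apart.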
Well, I guess actual games will be the judge of that. Right now it does seem like most of the next-gen flair in cross-platform games stems from the better GPU (i.e. the rendering side); we'll see down the road whether the simulation side gets catered for as well.
Yes, it takes time to develop these things. I remember people saying similar things when this last gen started.
I also noticed that most points in this discussion focused only on the X360 CPU, ignoring the PS3's Cell completely, which doesn't bode well for the new consoles, unless there are other bits I'm not aware of.
Because a) the XBox 360 usually did as well as or better than the PS3; there aren't really showcase examples where PS3 games had this huge advantage in logic, largely because the SPEs were quite domain-specific/limited in the kind of work they could do, and b) like I said, the stuff the SPEs were good at is mostly stuff GPGPU can do now. In fact, much of what the SPEs were being used for wasn't just GPGPU-friendly code but actual graphics, a lot of it work that was even being done by the GPU on the XBox 360.
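For illustration, here's the kind of streaming kernel the SPEs were good at (a particle integration step; the function and types are hypothetical examples, not from any real engine):

```c
#include <stddef.h>

typedef struct { float x, y, z; } vec3;

/* Straight-line, branch-free math over contiguous arrays: ideal for the
 * SPEs' SIMD + DMA streaming model then, and a natural fit for a GPU
 * compute kernel now. */
void integrate_particles(vec3 *pos, const vec3 *vel, size_t n, float dt) {
    for (size_t i = 0; i < n; i++) {
        pos[i].x += vel[i].x * dt;
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }
}
```

Workloads in this shape (particles, skinning, post-processing, culling) are precisely what moved from SPEs to GPU compute this generation.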
I really think the Wii U is a bad example; games there look and run pathetically. While some of them come close to PS3/X360, they usually lack AA/AF, scale back on textures, shadows, and other features, run at a horrendous 20-ish fps, etc.
Many developers actually gave up making games on the Wii U at all due to its bad performance. If it were really that close to the PS3/X360, that wouldn't have happened.
Look and run pathetically? I don't know what you're reading; most games that DF analyzed look about the same (some with better textures or more post-processing) and run somewhere between the PS3 and XBox 360 versions. A few have particularly notable issues, but a couple of others run consistently better.
For argument's sake, let's say they're not really that bad compared to last-gen. Even so, when the GPU is 10 times better than last-gen while the CPU is only 2 times better, you end up with a CPU that is barely faster than last-gen relative to everything around it.
GPUs have always gotten faster at a more dramatic rate than CPUs. They also hit diminishing returns with those improvements sooner; at this point you need a lot more GPU power to translate into a much nicer-looking game. And they're not really uniformly 10x faster in every way anyway.
pMax said:
Just a question: are CPUs today 10x faster, clock for clock (IPC), than CPUs of 2006?
No, of course not, especially if you're starting with Conroe (a 2006 release) and not Prescott. And it doesn't change much if you look at peak single-threaded performance rather than just IPC. The consoles really did have exceptionally bad perf/MHz, but even then I doubt you'd hit 10x that.
People groan about the consoles not being fast enough, but they couldn't really have done that much better. Intel was never an option. Maybe if they pushed a Piledriver core to the limit they could have gotten a little over 2x the single-threaded performance, with somewhat of a hit to threaded scaling. They couldn't have built some monster APU with 4 PD modules and kept 12-18 GCN CUs; that would have been > 500mm^2 for sure and probably a huge hit to yield. So they'd have needed to do something with 2 modules, but at the clock speeds needed to reach similar multithreaded performance those would have used a ton more power, and they're likely already pushing the power limits right now. This is also a crappy situation if you still want to give up an entire core or even two for the OS.
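As a back-of-the-envelope sketch of that power trade-off: a common first-order approximation is that dynamic power scales with V^2 * f, and since voltage tends to rise roughly linearly with frequency near the top of the clock range, per-core power grows roughly as f^3. The module counts and clocks below are illustrative assumptions, not real SKUs:

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    double base_f = 2.0;  /* GHz: hypothetical 4-module design */
    double fast_f = 4.0;  /* GHz: 2 modules clocked up to match throughput */

    /* relative total power under the crude P ~ f^3 approximation */
    double base_p = 4.0 * pow(base_f / base_f, 3.0); /* 4 modules ->  4.0 */
    double fast_p = 2.0 * pow(fast_f / base_f, 3.0); /* 2 modules -> 16.0 */

    printf("4 modules @ %.1f GHz: relative power %.1f\n", base_f, base_p);
    printf("2 modules @ %.1f GHz: relative power %.1f\n", fast_f, fast_p);
    /* ~4x the power for the same nominal multithreaded throughput */
    return 0;
}
```

Under that (admittedly crude) model, halving the module count and doubling the clock to keep the same multithreaded throughput costs roughly 4x the power, which is why the wide-and-slow layout wins in a fixed console power budget.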
Face reality, there's a reason both companies went with what they did.