The 750GX and FX curb stomp the CXe (Broadway). It's a much, MUCH larger improvement than Bobcat to Jaguar, even on the same process size and at the same clock as the 750CX/CXe.
Could you name some of these big improvements? According to the technical summary (https://www-01.ibm.com/chips/techlib/techlib.nsf/techdocs/BECF98824B9B663287256BCA00587B22/$file/750FX_Technical_Summary_DD2.X_V1.0_prel28May02.pdf) the improvements over the original 750 and its shrinks are:
1) More L2 cache
2) External bus pipelining
3) An extra outstanding L1 miss
4) An extra FPU reservation station and faster reciprocal estimates.
This is at best comparable to the changes made in Jaguar, which off the top of my head include a wider L2 cache interface, a larger/shared L2 cache, a deeper OoO window, a better divider, deeper load/store queues, and 128-bit SIMD (with the integer units widened too, not just the FP SIMD), plus a lot of improved timings (latency, not just throughput) and support for a bunch of new instructions.
The GX then adds exactly two performance improvements (https://www-01.ibm.com/chips/techlib/techlib.nsf/techdocs/5A61BEB893287FF987256D650066CFD5/$file/PPC750GX_diff_v1.0_080604.pdf): more outstanding L2 misses (from 1 to 3-4) and a larger, more associative L2 cache.
These are not exactly earth-shattering changes.
And Broadway is not just a shrunk vanilla 750 itself. It actually has two reservation stations in front of its FPU versus one on FX and GX, and it has paired singles. So FX and GX would have even worse FPU performance, hardly what Nintendo should have gone for.
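(For context on that last point: paired singles means each 64-bit FPR on Gekko/Broadway/Espresso holds two packed single-precision values, and one ps_* instruction operates on both lanes at once; FX and GX simply don't have those instructions. Here's a minimal, purely illustrative sketch, assuming GNU-style inline asm and an assembler that knows the 750CL paired-single mnemonics, such as devkitPPC's; the ps_pair typedef and function name are my own stand-ins, not anything official.)

```c
/* Illustrative sketch only: one paired-single add on Gekko/Broadway/Espresso.
 * Assumes an assembler that accepts the 750CL ps_* mnemonics (e.g. devkitPPC);
 * a plain 750FX/GX has no such instructions and would need two scalar fadds.
 */
typedef double ps_pair;  /* stand-in for a 64-bit FPR holding two packed singles */

static inline ps_pair ps_add_pair(ps_pair a, ps_pair b)
{
    ps_pair r;
    /* ps_add adds both 32-bit lanes of a and b in a single instruction.
     * Real code would fill the pairs with psq_l quantized loads or ps_merge;
     * the double type here is only a bit-carrier for the register contents. */
    __asm__("ps_add %0, %1, %2" : "=f"(r) : "f"(a), "f"(b));
    return r;
}
```

That two-lanes-per-instruction throughput, on top of the extra reservation station, is why dropping in an FX/GX core would be a step backwards for FP code tuned for Gekko/Broadway.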
I'm skeptical of all this hype about huge perf/MHz gains within the 750 family, especially when the people posting it aren't actually referencing the changes. There's a reason these processors are called 750xx instead of being made part of a new processor family.
Espresso does not have 3 CXe (Broadway) cores. Those are 750GXs with Nintendo's custom extensions. Big difference. Huge difference, and no, it's not just cache size.
Do you have a single piece of credible evidence for that claim? Spending the money to update a different old processor like the GX to fit Broadway's specifications, instead of going for a substantially better processor, doesn't make sense. The entire reason Nintendo would be so interested in using processors that are functionally as close as possible is likely a paranoid fear of compatibility problems.
But it's a big enough difference that it's just as silly to think of it as just a tri-core Arthur 750 (the twelve-year-old processor you were talking about; you might as well call Jaguar or an i-core or any x86 platform a 1978 processor since it's based off the 8086. It's a silly practice), or even Broadway.
That comparison is outrageous. The original 750 was released in 1997, the FX in 2002 (5 years later), and the GX in 2004 (7 years later). That's absolutely nothing like the gap between the 8086 and Jaguar (35 years!). And just because FX and GX are substantially newer doesn't mean they're dramatically better. You can't argue that because two processors from one architecture separated by some span of time are very different, any two processors from a different architecture over a similar span must be just as different. There are embedded cores that have stayed almost the same for decades. Alongside FX and GX there were much more heavily modified PPC cores; that's what market segmentation/product differentiation is about.