Why put a POWER7 in there? It's a chip architected for massive data workloads. It has 4 MB of eDRAM per core, so how does that fit in with the eDRAM figures we have? 4 MB for the core and 28 MB for the GPU? If they're going with a custom core based on POWER7 rather than a carbon copy of their existing P7 core, why not just include Broadway in the chip for Wii BC, use three copies of the same custom core, and make life easier for your developers?
I don't think the POWER7 is aimed only at massive data workloads. IBM has only used it in high-end server configurations, but I don't see why the CPU couldn't be used in a lesser setup.
It's a bit like Intel's line-up: their recent server parts come with significantly more cache, whereas a Core i3 only has 3 MB.
I think the POWER7 might compare more favorably to Intel's offerings than AMD's parts do. In its server form it has several memory controllers and quite a few cores running at really high speeds, but I don't think the whole thing, toned down, would be worse than an AMD CPU. Actually, IBM (in their PowerPC A2 presentation) claimed that eDRAM has a positive effect on power consumption.
Others here have discarded the possibility of a revamped POWER7, but since IBM itself (even if through a marketing account) insists on it, I have to say I'm now in doubt, even though it doesn't align at all with many of the things transpiring in the media.
I just re-read some stuff about the POWER7, and it's indeed a monster (here or here).
In the second link I discovered that the POWER7 does indeed have two 4-wide SIMD units. The thing pushes 8 DP FLOPS per cycle, and I'd guess 16 SP FLOPS per cycle, as much as Intel's recent architectures (Haswell pushes FP performance further, though).
Even at a pretty low clock I can't see the thing lacking muscle versus something like Xenon, even at half the speed. I'm not sure it would be wise to touch such a powerhouse for the sake of winning a handful of sq.mm of silicon.
If that's the case, I think the numbers we have for the cache are either incorrect or misinterpreted.
I really have a hard time with that "triple confirmed" information. My logic begs me to reject it, as it doesn't match what I've seen and read so far.
If it's true, there is definitely something wrong somewhere. I mean, it would take terrible dev kits and a terrible development environment for devs to come out of the woodwork and state that the CPU is weak.
Or else it's FUD sponsored by competitors, but then why on earth would Nintendo not react to such claims? I don't know; I fail to understand it.
A POWER7 core must be somewhere between 25 and 30 sq.mm; three of them, plus say 2-3 MB of L3 and a memory controller, would come to around 100 sq.mm at most.
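A quick tally of that estimate (all the area figures here are my own illustrative guesses picked inside the ranges above, not measured numbers):

```python
# Rough die-area tally for a hypothetical 3-core POWER7-derived CPU.
# Every figure is a guess chosen within the ranges discussed above.

core_mm2 = 27.5      # one core, taken from the 25-30 sq.mm range
l3_mm2 = 10.0        # guess for 2-3 MB of eDRAM L3
mem_ctrl_mm2 = 7.5   # guess for a slimmed-down memory controller

total_mm2 = 3 * core_mm2 + l3_mm2 + mem_ctrl_mm2
print(total_mm2)  # 100.0 sq.mm, in line with the "around 100 sq.mm at most" estimate
```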
That's a significant investment in silicon. If it's true, I can't see Nintendo pairing it with a GPU as weak as I think they did; it would not make sense.
If this is true, BgAssassin's source claiming a raw throughput of 500/600 GFLOPS for the GPU would make sense. I would expect Nintendo to spend at least as much silicon on the GPU as on the CPU, and that kind of throughput is reasonable for a part of 100 sq.mm or more.
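For shader throughput, the usual rule of thumb is ALU lanes x 2 ops (multiply-add) x clock. The lane count and clock below are purely hypothetical, chosen only to show that a figure in the 500-600 GFLOPS range is plausible for a mid-size AMD part:

```python
# Rule-of-thumb shader throughput: lanes x 2 (multiply-add per cycle) x clock.
# The lane count and clock are hypothetical examples, not leaked specs.

def gpu_gflops(alu_lanes, clock_mhz):
    return alu_lanes * 2 * clock_mhz / 1000.0

print(gpu_gflops(320, 800))  # 512.0 GFLOPS
```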
Then there is the eDRAM, another significant investment in silicon.
All of this doesn't add up to me: that kind of silicon budget has nothing to do with gimped hardware, yet there are the games I've seen so far. This could indeed be a system that ends up in between this gen and the next one.
I could see such a thing mimicking the 360, with the GPU pretty much acting as a north bridge for the CPU and the eDRAM including the ROPs. AMD would have improved the link between the smart eDRAM and the GPU so that the latter can read from it.
The whole question, again, is: if that's the case, why the FUD in the media? Why no reaction from Nintendo? Why do the games so far look pretty underwhelming (with no AA)?
If this POWER7 is real, it's definitely a hint that the system is not gimped, so to speak. There might be something rotten in the state of Denmark.
It would take a really dreadful and immature environment to justify what we are seeing so far.
Maybe there is something completely off in the way Nintendo presents the system to the devs? A crazy thick layer of I-don't-know-what. I mean, like Nintendo enforcing the use of a Wii emulator as an API, lol.
I say that as a joke (actually some games do look like they're running in a Wii emulator), but on the other hand, imagine poor Nintendo with a CPU that supports 3, 6, or 12 threads and a modern GPU with advanced programmable shaders on its hands.
One could wonder, given all the noise we heard about the project, how it seems to have been finalized pretty late, the trouble they had with the pad, etc., whether Nintendo is ready at all on the software side of things.
All the dissonance we hear could be proof of that:
1 GB of RAM untouched for now?
Support for a second controller, but not for now?
What about the network infrastructure?
I mean, could it be that Nintendo kind of jumped the gun on the hardware (300+ sq.mm of silicon, 2 GB of RAM, lots of processing power), coming from systems with tens of MB of RAM (actually the 3DS has more than that), and is now overwhelmed by the software effort needed to make the most of it? They have to build a lot, and from scratch (they may, for example, not support multithreading for now).
Either way I'm in the dark; there's too much contradictory information for me to swallow right now. What I can tell is that if this is so, Nintendo should give up on withholding information about the hardware; to say that it's counterproductive is an understatement...