I've been under the impression that it has 8 pixel pipes. I think I remember someone getting his hands on leaked dev kit docs (which were extremely similar to the Cube's), and there was some mention of that in there.
That info was also part of a since-deleted Ubisoft interview. That interview may have been fake, though, as the poster caught some serious flak for it.
http://www.rage3d.com/board/showthread.php?t=33857367
Ah, thanks for the info, guys. Still, even doubling the TEV to 8 pixel pipelines wouldn't account for the discrepancy, as that's only one part of the graphics chip.
I remember a rumor about some kind of DRM security function being integrated into the chip, but that would be an awfully large DRM module if that's all that's extra.
I don't think it's entirely unreasonable that there are extra pixel/texel pipelines that have gone unused so far in most games. Sure, we're used to just being able to pop in a graphics card with more pixel pipelines and automatically get a processing improvement in our games, but from what I remember it wasn't always that way. I remember when the original Radeon came out, with its ability to do 3 texels per pixel, it was stated that games wouldn't immediately take advantage of it and would have to be programmed for it. Now, that could have been FUD, or perhaps it's only more recently that DirectX has automatically allocated graphics card resources. Anyone know?
If it's a nontrivial task to reallocate GPU resources on the Wii, then perhaps games that started out on the Cube would have required substantial redesign to take advantage of the extra hardware.
Additionally, many devs are looking to sell their games on the wii/ps2/psp trifecta, which could cause them to design their engines for the least common denominator of all three. Say the fillrate and processing power of the psp, with the lack of features of the ps2.
Another possibility is a large surge in the use of middleware. Gamecube had quite a few games developed from the ground up for it over many years, whereas games are being pumped out on the Wii like there's no tomorrow. Devs may be happy with Gamecube/PS2-level performance but much shorter dev times, especially when the reward for a highly tuned Wii game is something that still graphically looks like something you'd download off of Xbox Live Arcade, arriving 1-2 years later to market than your competition, on a system where games sell because of neat controls and not graphical wow.
...maybe. I'll readily admit that the only game on the Wii that I'd question a GameCube being able to handle is Super Mario Galaxy, but it's too tantalizing that there's unexplained die space on the Wii. Even if the explanation is that the GPU had to be "padded" to that size so there'd be room for the traces or for redundancy, I'd still like to hear it. I'd imagine unused die space would have been removed in a later revision to cut costs, though, and someone would have noticed if that happened.
Maybe Hollywood is one big failed SoC, and the 65nm version of the Wii will only have a single chip.