We probably shouldn't assume that. The DF article is worthless. "We told you so", yet nothing they wrote makes sense. The GPU doesn't look like an HD4670 at all, cut down or otherwise. The shader clusters make no sense, what they've identified as TMUs doesn't really look like TMUs, and we have no...
The game DF was talking about appears to be Most Wanted. The Wii U version is based on the PC version - better textures, improved lighting, draw distance and such. Ported in-house at Criterion: http://www.youtube.com/watch?v=0eH3DmUgokk
Almost everything in this thread is unscientific. It's all conjecture and speculation, with a ton of confirmation bias thrown in for good measure. Some people claim that the GPU can't have 320 ALUs because most ports perform worse than on 360; I claim that in at least one case, the slowdown...
The kits the first demos were running on reportedly had the GPU clocked at 400 MHz. Therefore, both the Zelda and the Japanese Garden demo were running on a 128 GFLOPS GPU - if the system really only has 160 ALUs.
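For anyone wondering where the 128 GFLOPS figure comes from: each ALU in these VLIW-era AMD designs can retire one multiply-add per cycle, which counts as 2 FLOPs. A quick sketch (the 320-ALU/550 MHz line is just the other configuration people in this thread are arguing about, not a confirmed spec):

```python
# Peak shader throughput for a GPU where each ALU does one MADD
# (multiply-add = 2 FLOPs) per cycle.
def peak_gflops(alus: int, clock_mhz: float, flops_per_cycle: int = 2) -> float:
    return alus * flops_per_cycle * clock_mhz / 1000.0

# 160 ALUs at the reported 400 MHz early-kit clock:
print(peak_gflops(160, 400))  # 128.0

# For comparison, the 320-ALU theory at the final 550 MHz clock:
print(peak_gflops(320, 550))  # 352.0
```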
This is completely unscientific, but strictly looking at Razor's Edge, the reported slowdowns seem to have nothing to do with the GPU. They occur only in certain situations, but even there they don't consistently manifest. They're very much random. Digital Foundry mentioned heavy slowdown in a miniboss...
We don't know anything about the performance per clock, but the pretty massive cache alone should bring substantial improvements over Broadway. Feature-wise, even Gekko was not the same as the 1998 model PPC750, and Espresso at the very least adds SMP on top of that, which no prior PPC750 ever had.
RAM bandwidth is unknown, could be several hundred GB/s if used correctly.
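To put "several hundred GB/s" in perspective: for on-die memory, raw bandwidth is just bus width times clock. The bus widths below are pure assumptions for illustration - nothing about the actual interface is known:

```python
# Hypothetical sketch: raw bandwidth from bus width and clock.
# Neither bus width is a known spec; both are assumed for illustration.
def bandwidth_gbs(bus_width_bits: int, clock_mhz: float) -> float:
    return bus_width_bits / 8 * clock_mhz / 1000.0

# A modest 1024-bit on-die bus at 550 MHz:
print(bandwidth_gbs(1024, 550))  # 70.4

# A very wide 4096-bit bus at the same clock - this is the kind of
# configuration that would land in "several hundred GB/s" territory:
print(bandwidth_gbs(4096, 550))  # 281.6
```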
What exactly makes the CPU "crap" in your opinion?
I think we would have seen much, much bigger issues with the games so far if there were only 160 ALUs.
Unless we understand how that thing even works, we won't know if there actually are bottlenecks. It's more likely that most devs simply haven't figured out how to deal with stuff such as the strange memory architecture yet.
Yes, they said they didn't include the old hardware for legacy purposes, but instead built new hardware that is fully compatible. Which means all those unique things like the TEV and EMBM units have to be in there somewhere, and can probably be used in Wii U titles as well.
I really wonder what Nintendo did to ensure backwards compatibility. I guess it's possible that Latte isn't an R700 at all, but a modified Flipper with Radeon parts bolted on top - or vice versa. I don't think Nintendo would waste die space on stuff that isn't used in native mode.