From NeoGAF
Thought I'd chime in with a few of the things we know about the Wii U's hardware from the speculation threads (and by "know" I mean info which has been confirmed by multiple reliable sources).
CPU
The Wii U's CPU is a three-core, dual-threaded, out-of-order IBM Power ISA processor with 3MB of eDRAM L2 cache. Superficially it looks pretty similar to the Xenon CPU in the XBox 360, but it's a completely new CPU, and there are a number of important differences from Xenon:
- Firstly, it supports out-of-order execution. Roughly speaking, this means the processor can alter the order in which it executes instructions to operate more efficiently, for instance by running independent instructions while an earlier one waits on data from memory. The benefit of this depends on the kind of code being run. Physics code, for example, wouldn't see much benefit from an out-of-order processor, whereas AI code should run significantly better. Out-of-order execution also generally improves the processor's ability to run poorly optimized code.
- Secondly, we have the larger cache (3MB vs 1MB). The Xenon's cache was actually pretty small for a processor running 6 threads at 3.2GHz, causing a lot of wasted cycles as threads waited for data to be fetched from main memory. The Wii U CPU's larger cache should mean code runs much more efficiently in comparison, particularly when combined with the out-of-order execution.
- The Xenon processor used the VMX128 AltiVec unit (a SIMD unit), which was a modified version of IBM's then-standard VMX unit with extra gaming-specific instructions. It appears that the Wii U's CPU will itself feature a highly customized AltiVec unit, possibly based on the newer VSX unit. This should substantially increase the efficiency of a lot of gaming-specific code, but the important thing is that, unlike the out-of-order execution and larger cache, developers have to actively make use of the new AltiVec unit, and they have to really get to know how it operates to get the most out of it.
- The Wii U has a dedicated DSP for audio and a dedicated I/O processor. These relieve the CPU of a lot of work; there are XBox 360 games, for instance, which require an entire core just to handle audio.
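To make the out-of-order point above concrete, here's a toy scheduler comparing in-order and out-of-order issue. This is purely illustrative, not a model of the actual CPU: the four-instruction program, its latencies, and the one-instruction-per-cycle issue are all made up.

```python
# Toy illustration of why out-of-order execution helps: with in-order
# issue, a long-latency load stalls everything behind it; an out-of-order
# core can run independent instructions in the meantime.
# Instruction = (name, dest, sources, latency_in_cycles). Values are invented.
PROGRAM = [
    ("load", "a", [],    10),  # e.g. a cache miss fetching from main memory
    ("add1", "b", ["a"],  1),  # depends on the load
    ("mul",  "c", [],     1),  # independent work
    ("add2", "d", ["c"],  1),  # depends only on mul
]

def run(program, out_of_order):
    ready_at = {}   # register -> cycle its value becomes available
    done_at = {}    # instruction name -> completion cycle
    pending = list(program)
    cycle = 0
    while pending:
        issued = None
        for instr in pending:
            name, dest, srcs, lat = instr
            if all(ready_at.get(s, 0) <= cycle for s in srcs):
                issued = instr
                break
            if not out_of_order:
                break  # in-order: can't skip past a stalled instruction
        if issued:
            name, dest, srcs, lat = issued
            ready_at[dest] = cycle + lat
            done_at[name] = cycle + lat
            pending.remove(issued)
        cycle += 1
    return max(done_at.values())

print("in-order total cycles:    ", run(PROGRAM, out_of_order=False))
print("out-of-order total cycles:", run(PROGRAM, out_of_order=True))
```

In this toy trace the in-order core idles behind the load while the out-of-order core fills those stall cycles with the independent mul/add pair, which is the "physics vs AI" distinction in miniature: code full of independent work benefits, code that's one long dependency chain doesn't.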
The CPU should have quite a bit less raw power than the PS3's Cell, although the same will most likely be true for both the PS4 and next XBox. It will, however, be significantly easier to program for, and should be more effective at running many kinds of code, AI for instance.
There aren't any reliable sources on the CPU's clock speed, but it's expected to be around 3.2GHz or so.
GPU
The GPU is likely to be VLIW-based, with a pretty modern feature-set and 32MB of eDRAM. We don't have any reliable numbers on either SPU count or clock speed, but in bullshit multiplier comparisons to the Xenos (the XBox 360's GPU), most indications are that it's closer to 2 or 3 times the raw power of Xenos, as opposed to the 1.5 times quoted in the OP. There are a few things we do know about the GPU though:
- The 32MB of eDRAM is the only hard number we have about the GPU. This is more than three times the size of the eDRAM framebuffer on Xenos, and should allow games to achieve either 720p with 4x AA or 1080p with no AA without having to do tiling (the need to tile AA'd HD images on Xenos's framebuffer made its "free" AA a lot less free). It's also possible (although unconfirmed) that the eDRAM is on-die with the GPU, as opposed to on-package on a separate die, as Xenos's eDRAM was. If true, this means that the eDRAM will have much lower latency and possibly much higher bandwidth than the XBox 360's set-up. Developers will have to actively make use of the eDRAM to get the most out of it, though.
- The GPU features a tessellator. However, we have no idea whether it's a 4000-series tessellator (ie not very good) or perhaps a more modern 6000-series tessellator (a lot better). Again, developers would have to actively make use of this in their game engines.
- The GPU is heavily customized and features some unique functionality. Although we don't have any reliable indications of what sort of functionality Nintendo has focused on, it's been speculated that it's related to lighting. Apparently games which make good use of this functionality should see substantial improvements in performance. More than any other feature of the console, though, developers really need to put in the effort to optimize their engines for the GPU's customizations to get the most out of them.
- The GPU has a customized API, based on OpenGL. Regular OpenGL code should run, but won't run very well and won't make any use of the GPU's custom features. Developers will need a good understanding of the GPU's API to get the most out of it.
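As a sanity check on the eDRAM sizing claim above, here's the back-of-the-envelope buffer arithmetic. The per-pixel byte counts are assumptions (4 bytes of colour plus 4 of depth/stencil, with MSAA scaling linearly with sample count); real renderers vary, but the numbers show why 32MB avoids the tiling that Xenos's 10MB required.

```python
# Rough framebuffer sizes for common render targets, in MB.
MB = 1024 * 1024

def framebuffer_mb(width, height, msaa_samples=1, bytes_per_pixel=8):
    # bytes_per_pixel = 4B colour + 4B depth/stencil (an assumption);
    # a multisampled buffer stores one set of samples per pixel per sample.
    return width * height * bytes_per_pixel * msaa_samples / MB

print(f"720p, 4x AA : {framebuffer_mb(1280, 720, 4):.1f} MB")  # fits in 32MB
print(f"1080p, no AA: {framebuffer_mb(1920, 1080):.1f} MB")    # fits in 32MB
print(f"720p, 2x AA : {framebuffer_mb(1280, 720, 2):.1f} MB")  # exceeds Xenos's 10MB -> tiling
```

Under these assumptions 720p with 4x AA comes to about 28MB and 1080p without AA to about 16MB, both inside 32MB, while even 720p with 2x AA already overflows a 10MB buffer like the one on Xenos.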
RAM
It seems the console will have either 1.5GB or 2GB of unified RAM, with indications that Nintendo were targeting 1.5GB with earlier dev-kits and later increased that to 2GB. We don't know the kind of RAM being used, but most expect DDR3, probably with a 128-bit interface and a clock speed somewhere in the 750MHz to 1GHz range, resulting in a bandwidth somewhat, but not significantly, higher than the XBox 360 and PS3. It's worth noting that the large CPU cache and GPU eDRAM somewhat mitigate the need for very high bandwidths. It's possible, but quite unlikely, that they're using GDDR5, which would mean a much higher bandwidth.
Going by what we know about the console's hardware, it should be able to produce games which noticeably out-perform what's available on XBox 360 and PS3, so long as everything's properly optimized. Of course, performance will still be far behind the PS4 and next XBox. What we're seeing at E3 is unlikely to be well optimized for a number of reasons:
- "Final" dev-kits, with actual production hardware, only started to arrive to developers a few weeks ago. This would be too late for the E3 demos to make any real use of any improvements this final hardware may have brought. We know that these dev-kits brought a slight improvement in performance, but we don't know if there were any changes in functionality (eg to the eDRAM, which could indicate why we're seeing so little AA).
- Nintendo don't seem to have locked down the clock speeds yet, which makes it difficult for developers to properly optimize games for the hardware. As Nintendo now has final production hardware to do thermal testing on, final clock speeds should come pretty soon.
- For third party multi-plats, the XBox 360 and PS3 versions are going to sell the most (due to higher install-bases), so developers are going to put more resources towards those versions, and are likely to put the more talented team-members on XBox 360 and PS3 development as well. Because they can get PS360-grade performance out of the Wii U with a quick, poorly optimized port, most aren't going to bother putting the time and money into substantially improving the Wii U version.
- We've only seen launch-window titles, and launch-window titles that are about five months from completion, at that. I can only think of a single case where a game for new hardware was actually well optimized at this point before the launch of the console (Rogue Leader for Gamecube).
- While third parties are unlikely to make good use of the hardware, Nintendo haven't shown any games from the first party studios most likely to really push the hardware (eg Retro, Monolith, EAD Tokyo, EAD Kyoto Group 3). These studios are the ones to watch for technically impressive games in the first couple of years of the Wii U's life.
Interestingly, the best-looking game that's been shown off thus far is probably Platinum's Project P-100. While people haven't been focusing on it much from a technical perspective because of the art style, it's got great textures, good polygon detail, very nice lighting, good effects and a nice DoF effect; the IQ seems good and the framerate seems smooth. In some parts it also does nice 3D visuals on both the TV and controller screen. I wouldn't go so far as to say it looks substantially better than anything we've seen on PS360 (certainly not without seeing it in person), but it's definitely a nice-looking game.