Wii U 'Has A Horrible, Slow CPU' Says Metro Last Light Dev

I don't get it. I'm reading that the memory bandwidth and CPU are absolutely terrible compared to the 360, but they paired it with a GPU that should be significantly better than Xenos. Makes little to no sense to me.

Maybe there was no cheaper GPU available (for mass production)?
 
Yeah, it's not like 360's CPU is amazing at this point either.

Yeah, what's saddest is he refers to the Wii U CPU as slow junk, and he's comparing it to the pretty-terrible-especially-by-this-point CPUs in the PS3 and 360!

How's it going to look next to the presumably decent, even by modern-day standards, CPUs in Durango and Orbis? Seriously, nobody will even begin to bother.

Anyways, fits with the narrative that Nintendo more or less gimped this machine (regardless of the GPU) with low bandwidth and a weak CPU. I really do consider Nintendo terrible engineers, I did before the Wii U, and it looks like they've reached new lows.
 
Speaking of the CPU, Kevin Femmel at gimmegimmegames.com has floated an idea about the clock speed of this continuing conundrum. The Wii's Broadway CPU had its clock "hidden" in the markings on the CPU itself, and there is a number on the Wii U CPU which can be construed as giving the same information. The last, and potentially relevant, string of text reads "1226LP734"; could that mean 1.226GHz, as Femmel suggests?


See, I asked earlier, if you used Nintendo's multipliers what would that reveal about the speed of the GPU and CPU?

If the RAM is 800MHz, then the CPU would be 1200MHz, the GPU 400MHz, and the DSP 200MHz.

That would make the Wii U a 1.6x jump from the Wii, which was a 1.5x jump from the GameCube. A normal generational jump for a Nintendo console.
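As a quick sketch, the arithmetic behind that guess (the 400MHz base clock and the multipliers are the post's assumptions; the 729MHz Wii and 486MHz GameCube CPU clocks are the known figures):

```python
# Hypothetical clock ladder built from one assumed base clock.
base = 400  # MHz, assumed

clocks = {
    "RAM": base * 2,   # 800 MHz
    "CPU": base * 3,   # 1200 MHz
    "GPU": base * 1,   # 400 MHz
    "DSP": base // 2,  # 200 MHz
}

wii_cpu, gamecube_cpu = 729, 486  # MHz, known Broadway / Gekko clocks

print(clocks["CPU"] / wii_cpu)   # ~1.65x jump over the Wii
print(wii_cpu / gamecube_cpu)    # 1.5x jump of Wii over GameCube
```

Nothing here confirms the 800MHz RAM figure; the sketch only shows the ratios are internally consistent with Nintendo's past generational steps.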
 
Is Nintendo planning to integrate everything into the controller? At 22nm or 20nm, power draw should be low enough to make the Wii U completely portable, lol.
Nintendo seems to have cut a lot of corners to make it really power efficient and compact...
 
It's been rumoured that the Wii U CPU might just be a new version of Broadway, so I turned to Wikipedia for more information on it. According to it, the 750CL on which Broadway is based (or vice versa) reaches up to 1GHz at 90nm. The caveat is that the same wiki tells me the 7xx series has been shelved, and we know from the teardowns that the CPU would have to be at or below a 45nm process node. Was IBM willing to rework a shelved design for a new process node? I guess time will tell.

Those are just my 2 cents.

Wikipedia also says the PowerPC 750 doesn't support SMP, so you can't make a triple-core processor out of it. As for those digits printed on the heatspreader, I find that rumour crazy and shocking; we all expected 1.6GHz.
 
My guess is that the CPU is lacking in raw vector throughput. Let's say it really is a <1.5GHz Broadway. This CPU only has 64-bit floats (2-wide FMA via paired singles). Clock for clock, the integer/scalar performance would be higher than on Xenon, but the SIMD performance would be nowhere close (for fairly data-regular code, anyway). This would put it several times behind Xenon, and even further behind Cell, in SIMD compute.
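A rough sketch of that gap in peak FMA throughput, under the post's assumptions (the 1.25GHz clock and 3 Broadway-like cores with 2-wide paired singles are hypothetical; 3 cores at 3.2GHz with 4-wide VMX is Xenon's known configuration):

```python
# Peak single-precision throughput: cores * clock * SIMD lanes * 2 ops per FMA.
def peak_gflops(cores, clock_ghz, simd_lanes):
    return cores * clock_ghz * simd_lanes * 2  # FMA = multiply + add per lane

wiiu  = peak_gflops(cores=3, clock_ghz=1.25, simd_lanes=2)  # paired singles
xenon = peak_gflops(cores=3, clock_ghz=3.2,  simd_lanes=4)  # VMX128

print(wiiu)          # 15.0 GFLOPS under these assumptions
print(xenon)         # 76.8 GFLOPS
print(xenon / wiiu)  # ~5x
```

Peak numbers overstate real throughput on both sides, but the ratio is the point: the post's "several times behind Xenon" falls straight out of the lane width and clock difference.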

If this is really a triple-core processor, we're looking at ~11mm² per core, with 45nm being the most reasonable guess (I doubt IBM had 32nm ready back when the first dev kits rolled out). If the rumors are true that there's 3MB of L2 cache in total, then that leaves precious little for the actual cores. So I wouldn't be surprised at all if it didn't even have room for 4-way FMA, much less 2x 4-way.
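A back-of-envelope version of that area argument (the ~33mm² die is from the teardown; the cache density figure is an assumption for illustration, and IBM's 45nm eDRAM caches would in reality be denser than plain SRAM):

```python
# How much die is left for core logic if 3 MB of cache sits on a ~33 mm^2 die.
die_mm2    = 33.0  # reported die size
cores      = 3
l2_mb      = 3.0   # rumoured total L2
mm2_per_mb = 3.0   # ASSUMED cache density at 45 nm, for illustration only

logic_mm2 = die_mm2 - l2_mb * mm2_per_mb
per_core  = logic_mm2 / cores
print(per_core)    # ~8 mm^2 of logic per core under these assumptions
```

Even if the density guess is off by a factor of two in either direction, each core ends up with only a few mm² of logic, which is the post's point about there being no room for wide SIMD units.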

It also wouldn't be that unexpected. Nintendo hasn't put a decent programmable vector coprocessor in a console since the N64, and back then they pretty strongly discouraged anyone from using it directly. The 3DS would be another current example: a decent GPU paired with a laughably weak CPU and no vector coprocessing outside the GPU.

If this is the case, then maybe with the Wii U devs will start to use the GPU for compute, which would be a big departure from the Xbox 360 and PS3.
 
I don't get it. I'm reading that the memory bandwidth and CPU are absolutely terrible compared to the 360, but they paired it with a GPU that should be significantly better than Xenos. Makes little to no sense to me.

It reminds me of the SNES. The SNES had a horrendous CPU; nearly everyone on the planet used a 7 or 8MHz 68K (Atari, Amiga, arcade boards, Neo Geo, Genesis), but Nintendo had to go out of its way to find an odd chip built up from the 65C02 (the 65C816), with a perfectly useless compatibility mode.

Well, it was good enough after some slow early games, and you had additional chips in cartridges to save the day (along with the impressive built-in hardware).

For the Wii U, I guess it's an amazing console if you make a high-budget game that only targets it (or an "original and artistic" game if you don't have enough money).
But multiplatform games are hindered. Alternatively, if your game loop doesn't use too much CPU but you have heavy-duty work that you'd otherwise run on SPEs or threads, using GPGPU is a possibility. That's hard and has limitations, and devs have to be willing to do it. It might compare to stumbling onto Sony's SPEs: damn, I'm having to redesign my game around them. The eDRAM would even play a similar role to the SPEs' local store. But with GPGPU you get inefficiencies (wasted work) and you're stealing resources from the graphics.
 
Anyways, fits with the narrative that Nintendo more or less gimped this machine (regardless of the GPU) with low bandwidth and a weak CPU. I really do consider Nintendo terrible engineers, I did before the Wii U, and it looks like they've reached new lows.

Yes, but. I remember the "Iwata Asks" with talk of using the least power while still having something usable; I guess they got what they wanted. If you decide you need a CPU small and low-power enough to put in a cell phone or a netbook, you're going to make the same sacrifices and end up with something similar to, or slower than, a 10-year-old desktop CPU. Lol, this thing is cutting edge next to an Atom or an AMD E-350.

How well the Wii U CPU compares to a putative triple-core E-350 would be an interesting question; I guess they share a similar philosophy, architectural specs (OoO, width, pipeline length), and strengths and weaknesses.
Also, I have a feeling we're looking at roughly two-year-old hardware launching today...
 
How wonderful that Nintendo has graced developers and gamers alike with the system that should've been released in 2006.

I really want to hear from the horse's mouth the exact reasoning behind the hardware choices. Outside of TDP and cost, it's almost as if Nintendo purposely gimped the system to make it just powerful enough for "reduced" multiplatform titles that they knew would be on the console regardless, while leaving (relatively) plenty of graphics horsepower for exclusive titles to make use of. It's as if they think that limiting CPU performance will limit the number of "system pusher" developers, so more thoughtful game designers will consider what they make for the Wii U, and hence better titles. I really hope that wasn't a real consideration. A wide dual-core POWER7 should've been in the system, BC be damned, either in some kind of integrated package on the CPU or on an MCM with the eDRAM.

The obsession over the media and Wii Universe (or whatever) aspects of the system was, I think, a mistake too. Obviously the extra 1GB of RAM is there for that reason, as the first 1GB is plenty for games given the very large eDRAM involved. Even something of the caliber of a full-speed RV740 would be fine with just 1GB of 128-bit GDDR5 plus eDRAM, or 1.5GB of 192-bit GDDR5. 1GB of memory reserved for the system, OS, and background functions is ridiculous for a console. It's not a PC; 512MB would be pushing it too.
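The bandwidth arithmetic behind those hypothetical configs (the 4Gbps/pin GDDR5 data rate is an assumption for illustration; the 12.8GB/s line matches the Wii U's rumoured 64-bit DDR3-1600 main memory):

```python
# GB/s = (bus width in bits / 8 bytes) * per-pin data rate in Gbps.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(128, 4.0))  # 64.0 GB/s -- 128-bit GDDR5 @ 4 Gbps/pin
print(bandwidth_gbs(192, 4.0))  # 96.0 GB/s -- 192-bit GDDR5
print(bandwidth_gbs(64, 1.6))   # 12.8 GB/s -- 64-bit DDR3-1600
```

Either hypothetical GDDR5 setup would give five to seven times the main-memory bandwidth the console reportedly shipped with.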

Even with BC and developer familiarity in consideration, it's pretty damning for developers. In a twisted bit of personal thinking, I hope it's damning for Nintendo; they seem not to have learned their lesson the first time with the Wii. Ninty could've pulled ahead as the system of choice for multiplatform titles for the next year or two (1080p + 60FPS multiplatforms!) while still offering something new and interesting with the tablet controller, and having extra power that Ninty and other developers could harness for exclusive games that blow the 360 and PS3 away.

Good riddance.
 
No way the Wii U CPU can be fast. If it is made on 45nm as IBM says, then with the known die size of 33mm² it is about 1/3 the size of Xenon by my math.
 
No way the Wii U CPU can be fast. If it is made on 45nm as IBM says, then with the known die size of 33mm² it is about 1/3 the size of Xenon by my math.
Could be more like 1/2 (my guess is the memory controller is also on the GPU die, hence those 33mm² hold a bit more than they might seem). In any case, I agree this has to be more like smartphone class rather than (entry-level) PC class.
The GPU isn't all that big either, btw, if you account for the eDRAM (~50mm²?), but it would still compare favorably to the competition.
The memory bandwidth of course is also very low, though it shouldn't really hurt the CPU side. You'd better hope your frame buffers mostly fit into that eDRAM, though...
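For scale, a quick check of how much eDRAM a frame buffer actually needs (the 32MB pool size is from teardown reports; 4 bytes each for color and depth are assumed formats):

```python
# Frame buffer footprint in MiB: width * height * (color bytes + depth bytes).
def fb_mib(width, height, color_bytes=4, depth_bytes=4):
    return width * height * (color_bytes + depth_bytes) / 2**20

print(fb_mib(1280, 720))   # ~7.0 MiB  -> fits comfortably in a 32 MiB pool
print(fb_mib(1920, 1080))  # ~15.8 MiB -> still fits, with less left over
```

The squeeze comes not from a single target but from everything else engines want in fast memory at once: intermediate render targets, shadow maps, and post-processing buffers on top of the back buffer.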
 
Could be more like 1/2 (my guess is the memory controller is also on the GPU die, hence those 33mm² hold a bit more than they might seem). In any case, I agree this has to be more like smartphone class rather than (entry-level) PC class.
The GPU isn't all that big either, btw, if you account for the eDRAM (~50mm²?), but it would still compare favorably to the competition.
The memory bandwidth of course is also very low, though it shouldn't really hurt the CPU side. You'd better hope your frame buffers mostly fit into that eDRAM, though...

The 360 also does not have the memory controller as part of the CPU.
 
Wikipedia also says the PowerPC 750 doesn't support SMP, so you can't make a triple-core processor out of it. As for those digits printed on the heatspreader, I find that rumour crazy and shocking; we all expected 1.6GHz.

Ah, there goes that then. I suspected that it was a long shot to begin with. Back to square one.
 
Unless the Wii U has a reasonably decent GPU with lots of cores, using GPGPU is going to really hinder what can be achieved graphics-wise. Also, the R700 series is quite an old architecture, so it probably wasn't designed for GPGPU calculations either.
 
Reggie says that apparently the Wii U is profitable as soon as someone buys one game. Assuming something like $10 in licence fees they would earn on a "full" title, that gives us a pretty narrow band on the cost of this device (or set of devices ;) ).
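The constraint that claim implies, as trivial arithmetic (both dollar figures are assumptions for illustration, not reported numbers, and retailer margin is ignored):

```python
# "Profitable after one game sale" => the per-unit loss on hardware
# can be at most one licence fee.
retail_price = 299.0  # ASSUMED launch price of the Basic set
licence_fee  = 10.0   # ASSUMED per-title fee earned by Nintendo

max_unit_cost = retail_price + licence_fee
print(max_unit_cost)  # hardware + distribution can cost at most ~$309/unit
```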
 
All I'm going to say is... WOW!

Honestly, I'm shocked. Some peeps around here were saying this thing might have specs like this and be worse than current gen, and I doubted them, purely on the laws of physics.

They even seemingly topped that with another GameCube-derivative CPU clocked LOWER than the rumoured 1.5GHz number I refused to believe.
So, for questioning your Nintendo knowledge, bloggers, I humbly apologise.

Ninty, you have broken new ground with your reverse-Moore's-law philosophy... 12.8GB/s of bandwidth!? A CPU comparable to an Atom N270, with no vector processing! The worst GPU that could humanly possibly be built in late 2012 to go along with it?
A WVGA resistive touchscreen controller straight out of 2006... and 3 hours of battery life to boot?

Any more than one controller and the game's FPS gets halved!! You could give a handful of the world's top engineers £500 million and they couldn't design a worse console... seriously, it's a work of genius... Moore's law in reverse! Ninty's law, lol. Wii U... so bad it's actually good!! :)
 
You know what, despite it being based on Broadway and everything, the Wii U doesn't even have particularly good BC performance.

It runs Wii games at 480p. Compare this to the 360's BC, which ran Xbox titles at 720p with 4xMSAA, and keep in mind that was full software emulation of the Xbox's Celeron-class x86 processor on the PowerPC Xenon.
 
It runs Wii games at 480p. Compare this to the 360's BC, which ran Xbox titles at 720p with 4xMSAA, and keep in mind that was full software emulation of the Xbox's Celeron-class x86 processor on the PowerPC Xenon.
But Xbox emulation on the 360 was restricted to a supported-title list, and probably some binaries were translated to PPC ahead of time.

Hollywood was ~70mm² at 90nm(?), so probably <20mm² at 40nm; they integrated it into the GPU die.
 