Wii U hardware discussion and investigation *rename

From what I hear (and I trust my source on this as reliable), the CPU in the WiiU is almost certainly weaker than Xenon and Cell, which is why some devs have outright refused to port their existing (in-development) PS360 games to it. I believe Nintendo intended the GPU compute support to make up for their weak CPU choice, but the CPU performance is so bad that devs would be required to refactor their entire engines to get stuff working on the GPU instead, and for many devs that would be impossible on their current project schedules.
;)
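
To illustrate why that refactoring is non-trivial (a toy sketch I'm making up, not anything from an actual engine or devkit): CPU gameplay code usually walks scattered objects one at a time and branches freely, while GPU compute wants the same work flattened into arrays and dispatched in bulk, with results read back later.

Code:
# Purely illustrative Python sketch (no real engine code or GPU API):
# why "just move it to the GPU" tends to be a refactor rather than a port.

# Typical CPU-side update: scattered objects, per-object branching.
def cpu_update(entities, dt):
    for e in entities:
        if e["asleep"]:
            continue
        e["vel"] += e["accel"] * dt
        e["pos"] += e["vel"] * dt

# A compute-shader-friendly version needs the same work reorganised into
# flat arrays processed in bulk (what you would upload to a buffer and
# dispatch over), with results typically read back a frame later.
def gpu_style_update(pos, vel, accel, asleep, dt):
    for i in range(len(pos)):   # stands in for one GPU thread per element
        if not asleep[i]:
            vel[i] += accel[i] * dt
            pos[i] += vel[i] * dt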

What if this is for alpha/beta kits?
What about RAM? Do you know something about that? :?:
 
From what I hear (and I trust my source on this as reliable), the CPU in the WiiU is almost certainly weaker than Xenon and Cell, which is why some devs have outright refused to port their existing (in-development) PS360 games to it. I believe Nintendo intended the GPU compute support to make up for their weak CPU choice, but the CPU performance is so bad that devs would be required to refactor their entire engines to get stuff working on the GPU instead, and for many devs that would be impossible on their current project schedules.

If this is true, then how would you explain multiplatform games like Batman: Arkham City, Assassin's Creed III, and Ninja Gaiden 3 running pretty decently on WiiU?
 
Maybe they're all masochists on the Nintendo board and love to have pain and suffering inflicted upon themselves?

I can't say that the prospect of the GameCube CPU making yet another appearance in a console appeals to me. Iwata's an electronics engineer, and I have always respected him for that, but has the man simply gone INSANE? The basic design of that CPU is like fifteen years old now, or close to it anyway. Surely there are ARM cores today with higher performance both per watt and per transistor than that old PPC nag.

Isn't Larrabee based on a bunch of 286 cores?
 
Just looked at what the GC/Wii CPU was and it's indeed nothing fancy.
2-wide SIMD/FP, no FMA or FMAC. Basically the throughput per core and per cycle is a quarter of Xenon's VMX128 units (as far as FP performance is concerned). It's a bit of a joke if true...
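
Back-of-envelope check of that ratio, just plugging in the widths claimed above (not measured numbers):

Code:
# FLOPs per core per cycle, using the post's figures (2-wide paired
# singles with no FMA, vs Xenon's 4-wide VMX128 with fused multiply-add).
gekko_flops_per_cycle = 2 * 1   # 2 lanes x 1 op
xenon_flops_per_cycle = 4 * 2   # 4 lanes x (mul + add)
print(gekko_flops_per_cycle / xenon_flops_per_cycle)  # 0.25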

The up-to-date design in that range at IBM is the PowerPC 476FP or 470S; it's tiny and consumes really little power even @1.6GHz.
If Nintendo wanted to stick to that kind of CPU they should have used more of them.
6 cores at least; 8 might have made Ken Kutaragi happy.
It could have made sense for the design: lots of efficient OoO cores, raw throughput per core and overall lower than the competition, but you'd hope the thing could sustain good enough performance.

The 476FP core is 3.765 sq.mm on IBM's 45nm process. You have to add the L2, the glue, and IO, but I mean it would not have cost an arm and a leg.
For 8 cores with, say, a sane amount of cache (256KB per core, the main one blessed with 512KB), how big would the chip end up? 80 sq.mm?
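
Rough sanity check on that guess; the only number taken from above is the 3.765 sq.mm core area, the SRAM density and glue/IO allowance below are my own assumptions:

Code:
core_mm2 = 3.765                   # per 476FP core on IBM 45nm (figure above)
cores = 8
l2_mb = (7 * 256 + 512) / 1024.0   # 256KB per core, 512KB for the main one
sram_mm2_per_mb = 5.0              # assumed SRAM density at 45nm, incl. tags
glue_io_mm2 = 15.0                 # assumed interconnect + IO
total_mm2 = cores * core_mm2 + l2_mb * sram_mm2_per_mb + glue_io_mm2
print(round(total_mm2, 1))         # ~56 sq.mm, so 80 sq.mm leaves margin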

For the GPU a lower-clocked Redwood would have done the job.

Nintendo would have ended up with two almost off-the-shelf parts, both doable on the cheapest high-performance process around (i.e. TSMC's 40nm processes).

>200 sq.mm of silicon, 1GB of DDR3, 256/512MB of GDDR5 (not even the highest speed).
The system would have been blessed with a lot less bandwidth, but with such specs who would have really cared? / Would that make a difference?

Actually the cheapest solution would have been to go with a SoC, a 192-bit bus, and 1.5GB of fast DDR3. It would still be an under-200 sq.mm chip on an ironed-out process (there were HD 7750 cards with more expensive GDDR5 selling under $100 if memory serves right; the chip would have been in the same ballpark).
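
For reference, the kind of bandwidth such a 192-bit DDR3 setup would give (the DDR3-1866 speed grade is my assumption, not from the post):

Code:
bus_width_bits = 192
transfer_rate_mt_s = 1866          # assumed DDR3-1866
bandwidth_gb_s = bus_width_bits / 8 * transfer_rate_mt_s / 1000
print(round(bandwidth_gb_s, 1))    # ~44.8 GB/s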

Is the WiiU as it is cheaper than that? I really wonder. In fact I would bet it's not.

I'm almost to the point where I suspect that the R&D department feels like it has to come up with something custom to justify its own existence, or because they are asked to with no solid technical reasons as backup.
EDRAM is nice, but did Nintendo look at what Llano is pushing out on DDR3, for example?
Why give that much cache to CPU cores that remain sucky?
Why reinvent the wheel?

I don't get how, looking at their goals, how much they could reasonably invest (R&D and system BOM), the market, and what is readily available, they ended up with 3 sucky CPU cores, 32MB of eDRAM, and most likely more than one chip.
Not to mention a sucky resistive touch screen...

I mean, at this point in history, with the latest lithography allowing silicon budgets that were unthinkable not long ago, achieving what it seems they wanted to achieve would have been nothing but easy.
 
This is just insane, if true.

In an ideal world Nintendo would just give IBM and ATI a set price point and let them use their best technologies to come up with hardware without interfering. Nintendo aren't really at the cutting edge of 3D tech, and their internal hardware dev team arguably don't always make the right choices with tech.

I highly doubt they have to do something so weird to get engines up and running. The WiiU CPU might be weak, but so weak that they have to offload work to the GPU? Sounds too stupid even for Nintendo. I think people are drawing lines too fast after reading that the GPU can use compute shaders; the R700 line has always been able to do that, but it doesn't mean they are using it to help the CPU.

Getting a DSP to handle the sound workload, on the other hand, seems like the kind of tweak they have chosen to get overall performance to a minimum goal.

I'm really disappointed at the overall picture we are getting of the hardware specs: 3x overclocked Broadway + a not-even-midrange, RV730-like GPU? Soooo disappointing. I really understood why they left the tech race with the original Wii, they were in a really bad position after the GC, but after so many years earning lots and lots of money I expected we would get some decent hardware able to run this gen's titles at 1080p, for example.

And what's the deal with the eDRAM? Will the GPU have it, or has Nintendo chosen not to use it and cheapen the hardware even further? No AA in the E3 games was the most disappointing thing I saw from the hardware performance.

And why even bother wasting money on eDRAM instead of getting a somewhat more decent GPU? I really like the advantages eDRAM can give, but pairing it with a weak GPU doesn't sound like a good choice.
 
I highly doubt they have to do something so weird to get engines up and running. The WiiU CPU might be weak, but so weak that they have to offload work to the GPU? Sounds too stupid even for Nintendo. I think people are drawing lines too fast after reading that the GPU can use compute shaders; the R700 line has always been able to do that, but it doesn't mean they are using it to help the CPU.

Getting a DSP to handle the sound workload, on the other hand, seems like the kind of tweak they have chosen to get overall performance to a minimum goal.

I'm really disappointed at the overall picture we are getting of the hardware specs: 3x overclocked Broadway + a not-even-midrange, RV730-like GPU? Soooo disappointing. I really understood why they left the tech race with the original Wii, they were in a really bad position after the GC, but after so many years earning lots and lots of money I expected we would get some decent hardware able to run this gen's titles at 1080p, for example.

And what's the deal with the eDRAM? Will the GPU have it, or has Nintendo chosen not to use it and cheapen the hardware even further? No AA in the E3 games was the most disappointing thing I saw from the hardware performance.

And why even bother wasting money on eDRAM instead of getting a somewhat more decent GPU? I really like the advantages eDRAM can give, but pairing it with a weak GPU doesn't sound like a good choice.

Quoted for stating 1:1 my opinion.
 
Just as a ref, here are a few vids of games running on an AMD A8-3500M.

That's four Stars cores @1.5GHz (turbo at 2.4GHz, which I suspect doesn't happen often).
I don't believe those cores' SIMD units support any operation that counts as two FLOPs.
And an integrated Redwood @444MHz.
The TDP is 35 watts.

I just bought a laptop with this APU and an HD 6850M in a switchable-graphics configuration for $516.
It has 6GB of RAM, a 750GB HDD, a BRD drive, and a 15.6" screen.
It's a refurbished part from Newegg. For my needs a sweet deal.
Brand new it's $799. There are laptops that come without the dedicated graphics and are cheaper.
I'm speaking of laptops, so there's a battery, mobo, lots of RAM (below 4GB doesn't exist these days), a big hard drive, Windows 7, the LCD screen, etc.

What does the WiiU offer for most likely at least $300?

How in hell does Nintendo expect to get any core gamers with what they've shown?
It's a no-show.
Most of the games in those vids are less than perfect (console versions can be superior in certain aspects). That says a lot about optimization in the console realm imho.
Those games still run on that 1.5GHz CPU and 444MHz GPU.
That's not a CPU that has a lot of "horse power" by any modern standard.

How could Nintendo not put something together akin to that, so that most big titles would have made it for this fall?

Instead we have noise about the CPU perfs, sub-optimal devkits, etc., and ultimately about some of the highest-selling franchises that may never make it to the system.

It's a failure imo. Nintendo failed to see what the system needed to keep up.
They seem to have been focused on bandwidth: eDRAM and a big cache for one CPU?

This would have gotten BF3, CoD, etc.:
SoC @40nm (TSMC)
8 PowerPC 470S @1.6GHz, each with 256KB of L2 (way lower power consumption than four Stars cores)
4 SIMD engines (instead of four) @ ~600MHz (HD 69xx based would have been nice)
I use those figures to make sure it is significantly tinier than Llano/Trinity, which might be too expensive (and power hungry, a few selected chips aside).
128-bit bus.
1GB of GDDR5 (60+ GB/s)
TDP under 50 watts.
16GB of flash soldered on the mobo.
BRD drive.
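
Quick bandwidth check for that 128-bit GDDR5 bus; the 4Gbps/pin data rate is my assumption (a conservative speed grade for the time), not part of the spec list:

Code:
bus_width_bits = 128
data_rate_gbps = 4.0               # assumed per-pin rate, conservative GDDR5
bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8
print(bandwidth_gb_s)              # 64.0 GB/s, in line with the 60+ GB/s above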

My take is you need CPU power. I understand Nintendo wants it cheap, both in silicon and in power; high-frequency, in-order cores with big SIMD may not be the right approach for them (or anybody). Still, I find it difficult to understand the choice of only 3 sucky CPU cores.

Bandwidth, in a sane amount => GDDR5 offers that.

Installation and caching seem critical to late games this gen, needed by some critical engines; 16GB of flash + a BRD drive would have been close to the approach used on the PS3. A few games could have been installed on the device.

Llano (in its mobile rendition) gets the job done, with all the fat that comes with Windows gaming and half the bandwidth; I think that cheap single-chip design would have performed significantly better.
Optimizations, dirty tricks, and twice the bandwidth could set it closer to the desktop version of Llano, which in most regards outperforms our old consoles.
 
From what I hear (and I trust my source on this as reliable), the CPU in the WiiU is almost certainly weaker than Xenon and Cell, which is why some devs have outright refused to port their existing (in-development) PS360 games to it. I believe Nintendo intended the GPU compute support to make up for their weak CPU choice, but the CPU performance is so bad that devs would be required to refactor their entire engines to get stuff working on the GPU instead, and for many devs that would be impossible on their current project schedules.
;)

In which case the RV7xx GPU architecture is a bad choice, considering its problems with some forms of GPU compute and the lack of ongoing support for it:
http://techreport.com/discussions.x/18201
 
If the CPU is literally Broadway x 3 @ 45nm + more cache, this would be the case.

If it is, however, something in the 470 line then this wouldn't really be the case. The IBM embedded stuff like the 470S would be deceptive in that regard. Your friend could see a clock rate of "1.6GHz" or "1.8GHz" or something and assume that it's much worse than the 3.2GHz Xenon, but in reality the OOOE 470 line would thrash Xenon in most general-purpose code. However, just looking at it on paper (i.e. "holy crap, it's only 1.6GHz!?!") wouldn't give any non-programmer that idea, and they'd instantly assume that it is a far weaker unit.
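
To put toy numbers on that (the IPC values here are illustrative assumptions, not measurements of either chip):

Code:
# Why a lower-clocked OoO core can still win on general-purpose code:
# effective throughput ~ clock x average IPC, and OoO hides more stalls.
xenon_hz, xenon_ipc = 3.2e9, 0.2     # in-order, assumed IPC on branchy code
ppc470_hz, ppc470_ipc = 1.6e9, 0.6   # OoO, assumed IPC on the same code
print(xenon_hz * xenon_ipc, ppc470_hz * ppc470_ipc)
# 640M vs 960M retired instructions/s per core under these assumptions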

The comments I received weren't from a guy who looked at a simple design doc with a list of console specs. The source from whom this info came is a developer (core tech) who worked on porting his studio's code from their PS360 build to WiiU directly. I believe the expectation was to see a dramatic speedup, but after performance tanked they realised that the CPU performance on the devkit was well under par.

I don't know exactly how long ago this was, but as I understand it it was fairly recent (this year, I believe). Things aren't expected to change.

Also, from what I understand the GPU is custom and from the sounds of it pretty capable (relative to Xenos/RSX; the 1.5x rumour is likely accurate). The RAM has been pretty much set at 1.5GB for the retail unit; don't expect this to change.

I don't know if the CPU is a Broadway.
 
Also, from what I understand the GPU is custom and from the sounds of it pretty capable (relative to Xenos/RSX; the 1.5x rumour is likely accurate). The RAM has been pretty much set at 1.5GB for the retail unit; don't expect this to change.

These actually sound like pretty decent specs for realistic ones. After E3 I had gotten really down on the Wii U.

If it really has 1.5GB of RAM and a 1.5x PS360 GPU, that's about as good as I could have hoped.

The RAM is even enough to theoretically make it that "mid gen" console. Not the GPU though.
 
So there are slow, small, out-of-order IBM CPUs? They feel like AMD Bobcat cores, which are routinely suggested for a PS4 or X720 in the "Predict next generation tech" thread.
My opinion is that they are inane in a next-generation console, but not hugely bad in a WiiU.


I'd say the WiiU is for people who did not get an X360 or PS3. My near family didn't: they got the Wii, the DS, and after that a couple of big Acer laptops.
Not everyone is in the monoculture of big flat TVs, superhero movies, and noisy consoles with expensive and boring AAA games.
 
I'd say the WiiU is for people who did not get an X360 or PS3. My near family didn't: they got the Wii, the DS, and after that a couple of big Acer laptops.
Not everyone is in the monoculture of big flat TVs, superhero movies, and noisy consoles with expensive and boring AAA games.
Current credible rumours suggest it will have a weaker CPU than the Playstation Vita (and even developers for the Vita complained about its CPU inadequacy when adapting games from the PS3).
If those rumours turn out to be true then don't get your hopes up about a wide selection of PS360 games ending up on the Wii U.
 
The comments I received weren't from a guy who looked at a simple design doc with a list of console specs. The source from whom this info came is a developer (core tech) who worked on porting his studio's code from their PS360 build to WiiU directly. I believe the expectation was to see a dramatic speedup, but after performance tanked they realised that the CPU performance on the devkit was well under par.

I don't know exactly how long ago this was, but as I understand it it was fairly recent (this year, I believe). Things aren't expected to change.

Also, from what I understand the GPU is custom and from the sounds of it pretty capable (relative to Xenos/RSX; the 1.5x rumour is likely accurate). The RAM has been pretty much set at 1.5GB for the retail unit; don't expect this to change.

I don't know if the CPU is a Broadway.
Thank you for your detailed reply. AFAIK, the 5th/final dev kits came in only a few weeks ago, so this info is most likely from the 4th-version dev kits.

While its overall power is in question, documents from a sound middleware company implied that the CPU is at least as capable as Xenon in at least some ways. Also, if the CPU were weak in every regard, the ports for the Wii U probably wouldn't be close to parity with the other systems unless the devs had already begun to offload some tasks onto the GPU, DSP, "scarlet 2", etc. Are these thoughts correct?
 
So there are slow, small, out-of-order IBM CPUs? They feel like AMD Bobcat cores, which are routinely suggested for a PS4 or X720 in the "Predict next generation tech" thread.
My opinion is that they are inane in a next-generation console, but not hugely bad in a WiiU.
I'm not sure Gekko/Broadway qualified as out-of-order.
I believe I read that only the first stages of the pipeline were doing things OoO.
The PPC 470x are OoO with in-order completion; I'm not sure how that compares to Gekko/Broadway. At least they are tinier and more power efficient than Bobcat cores.

My pov is that that type of core, tiny, power efficient, with low clock speed, can be a decent candidate for a console.
I see them as the antithesis of something like Xenon/PPU, which are really fast in-order cores with wide SIMD.
Xenon was pretty cheap vs high-performance OoO CPUs of the time. These CPUs are even cheaper (silicon and power), and most likely more efficient.

The conclusion to me is that you need more of them to compete.
Xenon has six hardware threads; Nintendo, to make things convenient, could have gone with 6 of them. More is doable; the PPC 47x seems to go up to 8 cores.

We don't know at all how the Jaguar cores will turn out. They may suck.

For next gen I would hope that they raise the FP throughput (for PPC 47x): keeping the 2-wide SIMD / "paired FP" but maybe introducing support for FMA/FMAC. That would make up for half the deficit in FP throughput per cycle of such a core vs, say, a Xenon core.
Then you use plenty of them; 8 is indeed a minimum.
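
Same per-cycle arithmetic as earlier in the thread, just showing where "half the deficit" comes from:

Code:
ppc47x_now      = 2 * 1   # 2-wide paired FP, multiply OR add
ppc47x_with_fma = 2 * 2   # 2-wide paired FP with fused multiply-add
xenon_vmx128    = 4 * 2   # 4-wide VMX128 with fused multiply-add
print(ppc47x_now / xenon_vmx128, ppc47x_with_fma / xenon_vmx128)  # 0.25 0.5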

I'd say the WiiU is for people who did not get an X360 or PS3. My near family didn't: they got the Wii, the DS, and after that a couple of big Acer laptops.
Not everyone is in the monoculture of big flat TVs, superhero movies, and noisy consoles with expensive and boring AAA games.
I don't know; the WiiU may indeed find its public, but that is different from saying that the hardware design and the decisions made by Nintendo are good.
 
If the Wii U CPU is so "low", why is it receiving a lot of multiplatform games, even with better framerate and resolution than the PS360 versions (NG3)?
 