Wii U hardware discussion and investigation

The big chip looks slightly narrower than the width of the HDMI port, and a bit smaller than that in its shorter dimension.

Perhaps ~12mm x 9.5mm = 114mm2, but that's not much more than a guess.

That second chip is incredibly tiny. Can they even fit the rumoured 2MB of 45nm CPU cache on there, let alone the cores? I guess that would explain why they would use IBM's eDRAM, with its much smaller silicon cost than SRAM.
 
It's confirmed the edram is on GPU, so the idea of it sitting on CPU was bunk.
The edram probably comes from Renesas, as it's quoted as one of three companies working on the MCM, along with AMD and IBM.

This is very surprising (laughs) as we've never seen a mention of Renesas on this thread (laughs).

A google search on Renesas edram leads here among other things (first comment) ;)
http://forum.beyond3d.com/showthread.php?t=31379&page=500
 
Could there be a small amount of edram on the CPU as well?
It would fit with the prior rumours of 2MB of cache in a tri-core Gecko-based CPU, since I doubt you could fit 2MB of regular 45nm sram on such a small chip (although I haven't done the calculations).

Edit: Googling tells me 512kB of L2 cache on the 45nm process Intel used for Atom takes up 4.4mm2.
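Scaling that Atom figure linearly gives a rough ceiling. A back-of-envelope sketch only: actual cell sizes vary by foundry and cell type, and the ~3x eDRAM density advantage used below is a rule of thumb, not a measured IBM/Renesas value.

```python
# Back-of-envelope cache-area estimate. The 4.4 mm2 per 512 kB figure is
# the 45nm Atom number quoted above; the ~3x SRAM-to-eDRAM density
# advantage is a rough rule of thumb, not a measured IBM/Renesas value.
ATOM_L2_KB = 512
ATOM_L2_MM2 = 4.4

def sram_area_mm2(cache_kb):
    """Linearly scale the Atom 45nm SRAM L2 area."""
    return cache_kb / ATOM_L2_KB * ATOM_L2_MM2

def edram_area_mm2(cache_kb, density_advantage=3.0):
    """Assume eDRAM packs roughly 3x denser than 6T SRAM."""
    return sram_area_mm2(cache_kb) / density_advantage

print(f"2MB as SRAM:  ~{sram_area_mm2(2048):.1f} mm2")   # ~17.6 mm2
print(f"2MB as eDRAM: ~{edram_area_mm2(2048):.1f} mm2")  # ~5.9 mm2
```

Some 17-18 mm2 of pure SRAM on a die guessed at ~25-30 mm2 would leave almost nothing for the cores, which is the intuition behind the eDRAM guess.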
 
I agree, the CPU most probably has 2MB edram L2.

What could be misleading is an early comment from IBM about eDRAM and how there was a very big amount of it.
 
SRAM is hugely bigger than eDRAM though, so those 4.4mm2 would be more like 1.5mm2 as equivalent eDRAM. Not a very big amount, for sure.

Btw, MCM for CPU and GPU is a little surprising, to me at least. I was expecting either a proper SoC, or two discrete packages. Then again, these chips are so tiny that it really makes more sense to pack 'em both on the same substrate...

No separate starlet though? :p Maybe baked into the CPU this time, or sitting somewhere on the main PCB.

In all a very "nintendo" design. Simple, clean, straightforward. The PCB is rather large though considering the (lack of) components on it. Just like with the Wii. Not that it really matters.

Edit:
I just noticed there might actually be a third die on that substrate, just like with the Wii. There's a brighter rectangular feature on the substrate in the lowermost corner that seems to be covered by a blob of clear epoxy, just like with the original starlet. It's hard to make out for sure since the images are so low resolution.

I also wonder where the wireless transceiver for the pad might be located on that PCB, and what its antenna(s) might look like. There aren't many chips on that PCB other than the MCM, unless there's a bunch more hiding on the reverse side of course, but for cost reasons I doubt Nintendo stuck a lot of stuff on both sides. We know Wuu features bluetooth and wireless-N standards, and there's gotta be a controller for the optical drive and USB ports as well somewhere, unless AMD integrated some of that stuff into the GPU itself.

Also, regarding memory, there's this slightly curious bit in the discussion (which is really lame and doesn't go into any detail AT ALL):

Takeda said:
I would draw attention to how efficient it is. For a computer to function efficiently, memory hierarchy structure is very important, and this time the basic memory hierarchy is tightly designed. Although that is an orthodox solution, it makes the foremost feature of this machine's high efficiency.
So what kind of hierarchy might we have in the Wuu? We know of the integrated DRAM, but what main memory Nintendo is using this time is an unknown unknown. At best we infer that it must be DDR3 since it is cheap (now, at least), and if it had been something like Mosys 1T-SRAM there should have been a press release blurb about it sometime around the Wuu's unveiling (publicly traded companies love that sort of thing, since PR about big contract wins makes their stock go up).
 
Nice find Butta!

About those four mem chips around the MCM. The ones on top look a bit bigger and more square to me?

edit: Actually zooming in they look the same.

The GPU die looks to be at least 5 times bigger than the CPU imo. Perhaps the CPU is around 25-30mm2 and the GPU around 150mm2?
 
^ I suspect that to be 2GB of DDR3.

This is very surprising (laughs) as we've never seen a mention of Renesas on this thread (laughs).

I kept the mention on GAF. :p

http://www.neogaf.com/forum/showpost.php?p=33222455&postcount=14045

http://www.neogaf.com/forum/showpost.php?p=33227894&postcount=14071

http://www.neogaf.com/forum/showpost.php?p=33233098&postcount=14122

Could there be a small amount of edram on the CPU as well? It would fit with the prior rumours of 2MB of cache in a tri-core Gecko-based CPU...

3MB. And I've always felt that the L2 cache would be eDRAM like in the PowerPC A2. Wonder if/when we'll learn what they decided.

Btw, MCM for CPU and GPU is a little surprising, to me at least. I was expecting either a proper SoC, or two discrete packages.

Same for me on both accounts.

No separate starlet though? :p Maybe baked into the CPU this time, or sitting somewhere on the main PCB.

Should be with the GPU LSI like last time.

I just noticed there might actually be a third die on that substrate, just like with the Wii...

I had to look it up again to remember, but that was serial EEPROM in Wii so I would assume something similar.

I also wonder where the wireless transceiver for the pad might be located on that PCB...

Wii had a surprising amount (to me) of chips on the underside. I could see them doing the same thing again.

...there's gotta be a controller for the optical drive and USB ports as well somewhere, unless AMD integrated some of that stuff into the GPU itself.

Starlet 2.0 will handle it and as mentioned before should be in the GPU LSI.

So what kind of hierarchy might we have in the Wuu? We know of the integrated DRAM, but what main memory Nintendo is using this time is an unknown unknown... if it had been something like Mosys 1T-SRAM there should have been a press release blurb about it sometime around the Wuu's unveiling...

A poster discovered a while back, from one of Mosys' financial releases, that there will not be any 1T-SRAM in the Wii U. So for some of us that pretty much confirmed that IBM's eDRAM would be the replacement. But what I would assume he means is in regard to the 32MB of eDRAM being "Mem 1" and the DDR3, which I suspect also, being "Mem 2", with the CPU having access to both the eDRAM and the DDR3.
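If the four chips really are DDR3 on a combined 64-bit bus (both are assumptions at this point, four x16 chips being the usual way to get there), peak bandwidth is easy to estimate for a few plausible speed grades:

```python
# Peak DDR3 bandwidth = bus width in bytes x transfer rate.
# Four x16 chips forming a 64-bit bus is an assumption, as is the speed grade.
def ddr3_peak_gbs(bus_bits, mt_per_s):
    """Theoretical peak in GB/s (using 1 GB/s = 1000 MB/s)."""
    return bus_bits / 8 * mt_per_s / 1000

for speed in (800, 1066, 1333, 1600):
    print(f"64-bit DDR3-{speed}: {ddr3_peak_gbs(64, speed):.1f} GB/s")
```

Even the top grade lands around 12.8 GB/s, which is one reason a fast eDRAM "Mem 1" would matter so much in this layout.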
 
So BG thinks it's a 28nm GPU integrated with 28nm edram?

quite surprising to see nintendo themselves provide the teardown, early. shame the pics are so small though.

also def did not expect the edram integrated into the gpu given microsoft still hasn't done it.

so if it's a 115-150mm2 gpu (taking the different b3d guesstimates) how much area would 32mb of edram take? and thus how much remains for the gpu proper? i have no idea.
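A very rough stab at that question, assuming a 40nm-class eDRAM cell somewhere in the 0.06-0.1 um2/bit range plus ~50% array overhead. All of these numbers are guesses; nothing has been published for this part.

```python
# 32 MB of eDRAM as raw cell area plus overhead for sense amps, decoders
# and redundancy. Cell size and the 1.5x overhead factor are guesses for
# a 40nm-class process, not published Renesas numbers.
bits = 32 * 8 * 1024 * 1024  # 32 MB in bits

for cell_um2 in (0.06, 0.10):
    raw_mm2 = bits * cell_um2 / 1e6   # um2 -> mm2
    total_mm2 = raw_mm2 * 1.5         # guessed array overhead
    print(f"{cell_um2:.2f} um2/bit: ~{total_mm2:.0f} mm2")  # ~24 and ~40 mm2
```

That would put the eDRAM macro somewhere around 25-40 mm2, leaving the bulk of a 115-150 mm2 die for the GPU proper, if these guesses are anywhere near right.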

that cpu is so tiny, i knew that'd be the case lol.

but hey, i'd much rather have a tinyass cpu and bigger gpu as far as that goes.
 
That POWER7 CPU as used in Watson sure is a lot smaller than I was imagining...

(Yes, PR quotes are worthless and should never be used as a technical basis when the other evidence is in the other direction).

Regarding CPU performance, we know it's tri-core, which makes each core tiny. Can we infer the internal makeup relative to known PPC cores/Xenon?
 

That CPU is even smaller than 4 ARM Cortex-A9 cores (the 32nm Exynos 4 Quad that drives the latest Samsung Galaxy, for example)... This is even worse if it is 45nm and not 32 or 28...

And if you consider that the GPU package also contains the eDRAM, DSP, Wii compatibility hardware and other processing units...

This thing is not more powerful than current MS and Sony consoles...
 
This thing is not more powerful than current MS and Sony consoles...

not this again
 
To be more precise about my prior statement, we should take into account that a Galaxy S3 has the GPU included in the SoC. However, if we consider that an ARM Cortex-A9 at 40nm is 6.7mm2, 4 of these would be similar in size to the Wii U CPU.
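That comparison is easy to sanity-check against the die-size guesses from earlier in the thread (both the per-core figure and the Wii U CPU range are eyeballed estimates, not measurements):

```python
# Compare four Cortex-A9 cores (40nm, ~6.7 mm2/core per the post above)
# against the eyeballed 25-30 mm2 Wii U CPU die estimate from this thread.
a9_core_mm2 = 6.7               # poster's figure for a 40nm Cortex-A9
quad_a9_mm2 = 4 * a9_core_mm2
cpu_low, cpu_high = 25.0, 30.0  # eyeballed estimate, not a measurement

print(f"4x Cortex-A9: {quad_a9_mm2:.1f} mm2")
print(cpu_low <= quad_a9_mm2 <= cpu_high)  # lands inside the guessed range
```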
 
not this again

Why? I just want to be better informed about Wii U power because I am considering buying it. Back in the day I bought a Dreamcast that became obsolete as soon as the PS2 and Xbox launched. I don't want the same thing to happen again. There is nothing wrong with consumers being well informed.
 
not this again

He's not entirely wrong. It will be difficult for developers to port Xbox 360/PS3 games which make good use of those consoles' SIMD capabilities to the Wii U, let alone next-generation console games. We're looking at a Wii U CPU which is likely to offer less performance than the PS Vita's CPU in some cases.

If Nintendo expects developers to make heavy use of GPGPU, an RV730-based GPU (if our assumptions are correct) is a really poor choice.
 
If Nintendo expects developers to make heavy use of GPGPU, an RV730-based GPU (if our assumptions are correct) is a really poor choice.

Those are the key words here.
 
Why? I just want to be better informed about Wii U power because I am considering buying it. Back in the day I bought a Dreamcast that became obsolete as soon as the PS2 and Xbox launched...

Nintendo is not going out of business any time soon, and the PS3 or 360 did not make the Wii obsolete. The major problem for the Wii was the lack of FPS titles early in its lifespan. I don't see this being a problem for the Wii U.

Another point is, the tablet controller is the primary selling point for this console. If you and developers believe in its worth, then I don't see a problem. So far pre-sales are strong, and that's not even based on seeing AAA Nintendo titles.

I foresee more of a PS2 situation, until I see what MS or Sony have planned. We might see, I believe for the first time, three consoles selling close to the same number of units over their lifespan. Something like 60-75 mil for each one.

But the Wii U becoming Dreamcasted is highly doubtful.
 
Those are the key words here.
The measurements of the CPU might not be accurate, but they give a very clear ballpark, which is definitely a useful datapoint. We can't know the architecture, but we can guess (at least those well versed in CPU architectures and manufacturing can) at the probable internal design, giving an idea of execution units and relative power. Clearly this CPU hasn't got the many varied units of the POWER7. ;) I'd guess integer and FP paths, and some SIMD unit, but possibly not VMX128 as in Xenon. I don't really know what can be gleaned from die size.
 
It'll be awkward when, in 3-4 years, high-end smartphones and mid-range tablets running Android and WP8 become more powerful than the Wii U with about the same battery life.


Unlike their claims, one can definitely tell that Nintendo's main concern wasn't low power consumption and a small console. It was being greedy and saving some $30/unit on sub-par processing hardware.

The idea of the screened controller is spectacular, but it's going to be a dead end as soon as Android/Win8 tablets/smartphones get decent gaming peripherals with Splashtop/Project Glass alternatives. WVGA on a 6" resistive panel against 720p 7" capacitive screens which actually double as tablet computers on the go, for about the same price?
Nintendo cheaped out way too much this time. Kicking sand into the air and saying generic stuff like "it's HD" isn't enough.
 
Nintendo cheaped out way too much this time. Kicking sand into the air and saying generic stuff like "it's HD" isn't enough.

The problem is that I don't know whether they have a clear marketing target. Grandmas and Wii Party players won't buy it this time. All my friends who bought a Wii had never played videogames and in fact never liked them. I'd bet they never touched it again after playing some party games at Saturday-evening get-togethers. I remember when I was a child everybody bought an Atari machine to play Pong. It was the shit! The first time people saw a videogame. All my neighbours had one. Do you think these people bought an Apple II or a NES after that? And traditional "hardcore" gamers are lost to them too, as they won't be able to get multiplatform games on it. They only have the traditional "Nintendo franchises" players, who were with them with the GameCube too, but I suppose even those are fewer now...
 