Wii U hardware discussion and investigation *rename

I said silent 150W graphics cards fitting in a dual-slot PCI-Express form factor, and those exist today; just check AnandTech's link.
Don't call it silent. It's noisy, just tolerably so for a decent GPU. :p

The exhaust fan is the only thing you can see in the photos. That doesn't mean it's the only active cooling component inside the console.
But it has to provide adequate airflow for whatever components are inside. That is, if a GPU is pumping out heat that's being removed from the GPU via a GPU fan and heatsink, that heat then fills the case and needs to be expelled. Would your PC example still work if the PC's case fan were a small, slow (quiet) one? I imagine it'd overheat, because the internal case temperature would prevent adequate heat dissipation via the GPU cooler.
 
You and I have no idea about either one, so for now it's not really an argument but more of an excuse.

No, you can see the size of any of the available PC GPU heatsinks (including any you've been trying to refer to) on the internet, including disassembly of any stock heatsink. Disassembly of just about every console is well documented on the net, including the Wii, so we can reasonably guess at parts of the layout of the WiiU. We have the dimensions of the Wii U and can see where its fan is located and what intakes it has, at least on the current shell.

As for cost, you can make some educated guesses. MS gave a dollar figure for how much the revised GPU heatsink cost, and it's also clear that when they replaced the smaller, heatpipe-equipped CPU cooler with a LARGER and HEAVIER, less efficient aluminium cooler on the 65nm CPUs, they did it because of cost. Cost is also the reason they didn't want to use a heatpipe on the GPU, then had to add one, and then removed it as soon as they could safely do so.

The problem is that you're just daydreaming over high end specs and don't want to put any thought into this.

I never suggested a silent 250W GPU for the Wii U or any other console; that's ridiculous even by my standards.
I said silent 150W graphics cards fitting in a dual-slot PCI-Express form factor, and those exist today; just check AnandTech's link.

I have a "silent" 180W overclocked 560TI. It's very quiet, with its full length length heatsink and multiple heatpipes and two fans, but it's not silent under full load (definitely not with the side of the case off) and I wouldn't expect Nintendo to gut the Wii U just to fit one in the case.

You look at a 150W HD6850 (40nm) with its cooler, and you could fit 2 of those inside Wii U's case, even with the optical drive included.

But this is where you go from the strange to the absurd!
 
No, you can see the size of any of the available PC GPU heatsinks (including any you've been trying to refer to) on the internet, including disassembly of any stock heatsink. Disassembly of just about every console is well documented on the net, including the Wii, so we can reasonably guess at parts of the layout of the WiiU. We have the dimensions of the Wii U and can see where its fan is located and what intakes it has, at least on the current shell.

How about making any "reasonable guesses" yourself, instead of saying "it won't do because my empirical notion tells me it doesn't"?


As for cost, you can make some educated guesses. MS gave a dollar figure for how much the revised GPU heatsink cost, and it's also clear that when they replaced the smaller, heatpipe-equipped CPU cooler with a LARGER and HEAVIER, less efficient aluminium cooler on the 65nm CPUs, they did it because of cost.
They did? And how much was that?




The problem is that you're just daydreaming over high end specs and don't want to put any thought into this.

What high-end specs exactly? Try reading my posts again.
I just said the "there's no chance the console will consume more than ~40W because OMG that thingie is so small!!" posts were as valid to me as a random post claiming there'd be a 100W GPU inside the console.



I have a "silent" 180W overclocked 560TI. It's very quiet, with its full length length heatsink and multiple heatpipes and two fans, but it's not silent under full load (definitely not with the side of the case off) and I wouldn't expect Nintendo to gut the Wii U just to fit one in the case.

I never said it'd have a 120W GPU (despite your continuous assumptions of such). I said it'd be possible for the Wii U to have a max system TDP of 120W with a ~65W GPU and a ~30W CPU, given the heatsink size of 150W graphics cards.







Referring to being able to fit two HD6850s inside the Wii U's case
But this is where you go from the strange to the absurd!


Wii U's volume: 172 x 45 x 266 mm = 2059 cm^3
HD6850's volume: 112 x 41 x 226 mm = 1038 cm^3

2059/1038 = 1.98 ≈ 2.

Take out the extra space taken by the plastic cover + front plate and you could probably fit a slim optical drive (~400 cm^3).

Or could math be too "absurd" or "strange" to you?
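
For anyone who wants to check it, the arithmetic above works out as claimed; here's a quick Python sketch using the dimensions straight from the post:

Code:
# Volumes from the dimensions above (mm -> cm^3)
wiiu   = 172 * 45 * 266 / 1000    # ~2058.8 cm^3
hd6850 = 112 * 41 * 226 / 1000    # ~1037.8 cm^3

print(round(wiiu / hd6850, 2))    # 1.98, i.e. roughly 2x by volume
print(round(wiiu - 2 * hd6850))   # -17: two cards actually overshoot the case by ~17 cm^3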
 
Wii U's volume: 172 x 45 x 266 mm = 2059 cm^3
HD6850's volume: 112 x 41 x 226 mm = 1038 cm^3

2059/1038 = 1.98 ≈ 2.

Take out the extra space taken by the plastic cover + front plate and you could probably fit a slim optical drive (~400 cm^3).

Or could math be too "absurd" or "strange" to you?
You can't go by volume unless you crush the components into dust! ;) The dimensions are limiting. With your dimensions given above, you could fit one HD6850 into the Wii box leaving a slim border around it of 60 x 4 x 40 mm, into which nothing will fit - certainly not a DVD drive! Hence some could consider the idea of fitting two boxes of about the same size as Wuu into Wuu's case absurd.
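
To illustrate the per-axis point, a minimal sketch with the same dimensions: volume division says "two fit", but each axis has to accommodate the card independently.

Code:
# Same dimensions as above, in mm: (width, height, depth)
wiiu   = (172, 45, 266)
hd6850 = (112, 41, 226)

# Clearance remaining along each axis after placing one card in the case
border = tuple(w - c for w, c in zip(wiiu, hd6850))
print(border)   # (60, 4, 40): a 4mm height clearance leaves no room for a second card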
 
You can't go by volume unless you crush the components into dust! ;)

Actually yes, you can.
If Nintendo is designing the console's innards (PCB + I/O/power connections + optical drive + cooling elements), then what matters is the case's volume.
At least it's certainly not how many graphics cards you can stuff in it without breaking them.

The dimensions are limiting. With your dimensions given above, you could fit one HD6850 into the Wii box leaving a slim border around it of 60 x 4 x 40 mm, into which nothing will fit - certainly not a DVD drive! Hence some could consider the idea of fitting two boxes of about the same size as Wuu into Wuu's case absurd.


They're certainly not going to design everything into a rectangle that fits in the center of the case with a border of air around it...

And an HD6850 isn't "about the same size" as a Wii U. It's half its size. That's like saying an 11" subnotebook is about the same size as a 17" DTR.
 
I never said it'd have a 120W GPU (despite your continuous assumptions of such). I said it'd be possible for the Wii U to have a max system TDP of 120W with a ~65W GPU and a ~30W CPU, given the heatsink size of 150W graphics cards.

Actually I just think the idea of having a 120W TDP anything or everything in the WiiU is rather crazy.

According to the figures you provided (repeated below), the WiiU is only 4.5cm high. That means, going by what we've seen so far, it will probably have a 4cm fan (I'd thought it might be 5cm, but apparently not). I think you're expecting rather a lot from that fan, and that's an understatement.

Wii U's volume: 172 x 45 x 266 mm = 2059 cm^3
HD6850's volume: 112 x 41 x 226 mm = 1038 cm^3

2059/1038 = 1.98 ≈ 2.

"You look at a 150W HD6850 (40nm) with its cooler, and you could fit 2 of those inside Wii U's case, even with the optical drive included."

Yeah that optical drive will fit nicely into the -17 cm^3 left after you've crushed your two plasticine 6850s into the case.

Oh, you forgot to subtract the case thickness from the available volume in the WiiU case.

Take out the extra space taken by the plastic cover + front plate and you could probably fit a slim optical drive (~400 cm^3).

Nice save! Your claim was starting to look absurd there for a minute!

Or could math be too "absurd" or "strange" to you?

Now you've explained it all makes perfect sense.
 
I hadn't been thinking about battery life for the Wuu screen. Man, I hope you won't be spending a lot of time using the charge and play cable ...

I'll take a guess at about half of that! :p

The way the GC and Wii are designed, you get air effectively ducted over the heatsink, which is the same thing as having a fan on the heatsink (barring any push/pull differences). In something like the Wii, adding a fan directly to the heatsink would mean a shorter heatsink, would interrupt airflow through the case, and might actually reduce overall cooling effectiveness. Unlike PCs, the cooling has to be considered for the system as a whole rather than on a per-component basis. Even "passive" GPUs in the PC space require good "active" case cooling for anything other than the most scrubby of GPUs!

Thermal issues place a cap on performance even in the PC space (look at CPU TDP ratings for example) and that's despite decades of experience, the most advanced fabbing processes on earth (Intel are giants) and the capacity for much larger and much more expensive cooling solutions. There's no getting away from the heat issue unfortunately!

I guess we'll just have to agree to disagree. I think by the time the Wii U comes out it would be able to handle a GPU with a min. 640 ALUs clocked at 607.5MHz (my current speculated clock) with little to no problem.
 
I guess we'll just have to agree to disagree. I think by the time the Wii U comes out it would be able to handle a GPU with a min. 640 ALUs clocked at 607.5MHz (my current speculated clock) with little to no problem.

Well my speculated clock is 592.8MHz or less.
There's just no way the Green Leprechauns will ever let the GPU be clocked higher than that!


Anyhow, I've grown tired of trying to explain in various and easy ways that if an HD6850 (the smaller one, on the right) can handle a 150W TDP, then naturally the Wii U could handle a 120W full-system TDP (the larger one, on the left).

[Image: Wii U (left) next to a Radeon HD 6850 (right) for size comparison]


Oh noes, but it should be impossible due to "size and cost".. :rolleyes:
 
Going by noise, I have the Sapphire 6870, which is specced at 151W (but can actually draw a bit more). As they don't use the standard cooler setup by AMD, their cooling solution is quite a bit less noisy, but probably more expensive, as it has more (and fatter) heatpipes. It also uses an axial fan, which is probably a bit more efficient (about 80 or 92mm diameter, I didn't measure it).

Under load, this thing can get quite hot... the heatpipes (some of which you can touch when your case is open) get so hot they could probably burn your fingers. I didn't try it properly^^ but it did hurt. And that's with 5 case fans, a CPU fan which blows out of the case, and a PSU which pulls air out of the case too.

NONE of these things are present in a Wii U. It's a small 40mm fan... my Pentium 1 had one of those^^ There's just no way in hell that this fan can produce enough airflow to cool anything beyond what current midrange laptops dissipate. And usually that's below 90W (the PSUs usually never go above that...). They do use radial fans, though, and they also use heatpipes in many cases. And that's for the whole device (including a screen, though).
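
To put a rough number on that, here's a back-of-the-envelope sketch of how much heat an exhaust fan can carry out of a case at steady state. The ~5 CFM airflow and 20°C exhaust temperature rise are my assumptions for a typical 40mm fan, not measured figures:

Code:
# Heat removed by exhaust air: Q = rho * V_dot * cp * dT
rho = 1.2       # air density, kg/m^3
cp  = 1005.0    # specific heat of air, J/(kg*K)
cfm = 5.0       # ASSUMED airflow of a typical 40mm fan
dT  = 20.0      # ASSUMED temperature rise of the exhaust air, K

v_dot = cfm * 0.000472           # CFM -> m^3/s
q = rho * v_dot * cp * dT        # watts carried out by the airflow
print(round(q, 1))               # ~56.9 W - nowhere near a 120W system TDP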
 
No need to worry about cooling a CPU if the WiiU GPU turns out to be the totally bombastic GPGPU amirite.

>_>
 
I guess we'll just have to agree to disagree. I think by the time the Wii U comes out it would be able to handle a GPU with a min. 640 ALUs clocked at 607.5MHz (my current speculated clock) with little to no problem.

Well I can't say it won't be able to, because I don't know! I'm pretty confident it'll have relatively low heat output from the CPU and GPU combined though (say, 35W Llano levels or lower), based on size and the small exhaust fan.

The talk of heat issues makes me think they're already up and running on the final manufacturing process(es?), but possibly without final silicon. That would probably rule out 28nm for the GPU, because even AMD and Nvidia seem to be having trouble delivering anything on 28nm from TSMC, and I've not heard a peep about Global Foundries' high-performance 28nm yet. This makes me think Nintendo will go with a GPU on 40nm or 45nm (from NEC or IBM, possibly with an SoC), which would mean nothing new and miraculous in terms of perf/watt between now and early/mid next year.

But this is speculation based on rumours and guesses so I could (obviously) be way off mark. But not about the WiiU not throwing out 120W+ of heat from inside that little white case, I think.

I have a rough idea of what I think the cooling in the WiiU will look like btw, but it really needs a paint masterpiece diagram to describe and you can't upload stuff to B3D.

Anyhow, I've grown tired of trying to explain in various and easy ways that if an HD6850 (the smaller one, on the right) can handle a 150W TDP, then naturally the Wii U could handle a 120W full-system TDP (the larger one, on the left).

[Image: Wii U (left) next to a Radeon HD 6850 (right) for size comparison]


Oh noes, but it should be impossible due to "size and cost".. :rolleyes:

Looks like there's room for a Blu-ray drive in that 6850! Take off the eject button and you could probably fit in another 6850. :p

No need to worry about cooling a CPU if the WiiU GPU turns out to be the totally bombastic GPGPU amirite.

>_>

Not to worry, if you can cool a 120W GPU on a 4cm case fan you can probably cool a 65W CPU passively!
 
^ LOL @ AlStrong.

Well my speculated clock is 592.8MHz or less.
There's just no way the Green Leprechauns will ever let the GPU be clocked higher than that!

I'm curious to know how you arrived at that number. Mine is based on Nintendo's use of multiples over the last two gens: the CPU, GPU, and memory clocks were multiples of each other, and the Wii's clocks were multiples of the GC's. So what I did was make Wii U a multiple of Wii and keep the CPU, GPU, and memory as multiples of each other like in the past. What I came up with was:

CPU - 3645MHz
GPU - 607.5MHz
Memory - 1822.5MHz

CPU is 6x the GPU, 2x the memory, and 5x Broadway. The GPU is 2.5x Hollywood. The memory is 3x the GPU. Since the numbers are so exact I know it won't be correct, but it gives an idea of what I expect Nintendo to do with Wii U.
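
As a quick sketch, all of those multiples derive from the Wii's known clocks (Broadway 729MHz, Hollywood 243MHz):

Code:
# Deriving the speculated Wii U clocks from the Wii's (MHz)
broadway  = 729.0    # Wii CPU
hollywood = 243.0    # Wii GPU

cpu = broadway * 5       # 3645.0
gpu = hollywood * 2.5    # 607.5
mem = gpu * 3            # 1822.5

assert cpu == gpu * 6 == mem * 2   # the stated ratios all hold
print(cpu, gpu, mem)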


Anyhow, I've grown tired of trying to explain in various and easy ways that if an HD6850 (the smaller one, on the right) can handle a 150W TDP, then naturally the Wii U could handle a 120W full-system TDP (the larger one, on the left).

[Image: Wii U (left) next to a Radeon HD 6850 (right) for size comparison]



Oh noes, but it should be impossible due to "size and cost".. :rolleyes:

Apparently they are forgetting the capabilities of Nintendium. ;)

Well I can't say it won't be able to, because I don't know! I'm pretty confident it'll have relatively low heat output from the CPU and GPU combined though (say, 35W Llano levels or lower), based on size and the small exhaust fan.

The talk of heat issues makes me think they're already up and running on the final manufacturing process(es?), but possibly without final silicon. That would probably rule out 28nm for the GPU, because even AMD and Nvidia seem to be having trouble delivering anything on 28nm from TSMC, and I've not heard a peep about Global Foundries' high-performance 28nm yet. This makes me think Nintendo will go with a GPU on 40nm or 45nm (from NEC or IBM, possibly with an SoC), which would mean nothing new and miraculous in terms of perf/watt between now and early/mid next year.

But this is speculation based on rumours and guesses so I could (obviously) be way off mark. But not about the WiiU not throwing out 120W+ of heat from inside that little white case, I think.

I have a rough idea of what I think the cooling in the WiiU will look like btw, but it really needs a paint masterpiece diagram to describe and you can't upload stuff to B3D.

I know I want confirmation before I believe Wii U will use a 28nm process, but don't forget that NEC and IBM are both members of the 28nm alliance as well. NEC fabbed Flipper and Hollywood so it's not too far out there to believe what that investor said since it would come from NEC and not TSMC or GF. Especially with the release still a ways away. Also I'm on the "expecting an SoP" bandwagon over the "expecting an SoC" bandwagon right now. And I can see the Llano comparison as well.
 
The talk of heat issues makes me think they're already up and running on the final manufacturing process(es?), but possibly without final silicon. That would probably rule out 28nm for the GPU, because even AMD and Nvidia seem to be having trouble delivering anything on 28nm from TSMC, and I've not heard a peep about Global Foundries' high-performance 28nm yet. This makes me think Nintendo will go with a GPU on 40nm or 45nm (from NEC or IBM, possibly with an SoC), which would mean nothing new and miraculous in terms of perf/watt between now and early/mid next year.

It could point to a number of things, like using 40nm off-the-shelf parts to mimic the performance of a 28nm custom part. Or using an early WiiU GPU on 40nm which is creating heat problems because it's aimed to be 28nm in the final design.

I don't know if the 28nm info is true at all, but I don't see how heat issues in early WiiU dev kits would rule it out.
 
^ LOL @ AlStrong.

I'm curious to know how you arrived at that number. Mine is based on Nintendo's use of multiples over the last two gens: the CPU, GPU, and memory clocks were multiples of each other, and the Wii's clocks were multiples of the GC's. So what I did was make Wii U a multiple of Wii and keep the CPU, GPU, and memory as multiples of each other like in the past. What I came up with was:

CPU - 3645MHz
GPU - 607.5MHz
Memory - 1822.5MHz

CPU is 6x the GPU, 2x the memory, and 5x Broadway. The GPU is 2.5x Hollywood. The memory is 3x the GPU. Since the numbers are so exact I know it won't be correct, but it gives an idea of what I expect Nintendo to do with Wii U.


I'm curious, how did you go from the rule of 3/2 to... a rule of 6/2?
Why not keep with the rule of 3/2?

CUBE, Wii
GPU to CPU 162 x 3 = 486, 243 x 3 = 729
GPU to Mem 162 x 2 = 324, 243 x 2 = 486


So why would they change that with the WiiU?
GPU to CPU 800 x 3 = 2400 (Power7 clock rate 2.4 GHz to 4.25 GHz)
GPU to Mem 800 x 2 = 1600


edit to add:
Hell, if rumors are true regarding 28nm fabbed GPU, and Nintendo doesn't want to appear inferior to the 360 by using the 2.4GHz number,
Then:

GPU to CPU 1000 x 3 = 3000
GPU to Mem 1000 x 2 = 2000
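
For what it's worth, the "rule of 3/2" scenarios above tabulate with a couple of lines; note that the 800MHz and 1000MHz GPU clocks are hypothetical starting points from this post, not leaks:

Code:
# Rule of 3/2: CPU = 3x GPU clock, memory = 2x GPU clock (MHz)
def rule_3_2(gpu):
    return {"gpu": gpu, "cpu": gpu * 3, "mem": gpu * 2}

# GC (162), Wii (243), and the two hypothetical Wii U scenarios above
for gpu in (162, 243, 800, 1000):
    print(rule_3_2(gpu))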
 
Such high frequencies definitely won't happen IMHO; that's just not the way to go if you want power efficiency.
 
^I agree about power efficiency, but some sacrifices may have to be made.

I'm curious, how did you go from the rule of 3/2 to... a rule of 6/2?
Why not keep with the rule of 3/2?

CUBE, Wii
GPU to CPU 162 x 3 = 486, 243 x 3 = 729
GPU to Mem 162 x 2 = 324, 243 x 2 = 486


So why would they change that with the WiiU?
GPU to CPU 800 x 3 = 2400 (Power7 clock rate 2.4 GHz to 4.25 GHz)
GPU to Mem 800 x 2 = 1600


edit to add:
Hell, if rumors are true regarding 28nm fabbed GPU, and Nintendo doesn't want to appear inferior to the 360 by using the 2.4GHz number,
Then:

GPU to CPU 1000 x 3 = 3000
GPU to Mem 1000 x 2 = 2000

Mine were influenced by the tidbits of info from the first dev kit, which obviously means they aren't guaranteed, and by being a multiple of Wii. I wouldn't call that a concrete rule, since Wii wasn't a significant change in hardware or clocks. However, if we're looking at it from that perspective, then mine is 6/3, not 6/2.

But after looking closer at the patent, I don't know how they would treat the memory clocks right now.
 
This is my idea of the Wii U:

[Image: mock-up diagram of the proposed Wii U internal layout (wiidesign7.png)]


The yellow squares are the miniPCI cards where the Bluetooth and WiFi interfaces can be found, the red square is the electrical control part of the mainboard, the semi-transparent grey area is the Blu-ray drive, the black one is the CPU die, and the strong blue is the NAND flash chip. I have put the main RAM and the System LSI (a processor that unifies memory control, I/O control, GPU and eDRAM in a single die) inside a Type A MXM V3.0 module (35W); the light blue area is the fan at the back of the console box.

The power consumption legend is this:

MXM Module (GPU+RAM+I/O+NB): 35W.
CPU: 10W.
NAND Flash, MiniPCI Cards: 5W.
Blu-ray: 5W.
USB Ports: 10W (2.5W each).

65W in total, the typical power consumption of a netbook.
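
Summing that speculative budget as a sanity check (the "4 ports" count is implied by 10W at 2.5W each):

Code:
# Speculated Wii U power budget from the breakdown above (watts)
budget = {
    "MXM module (GPU + RAM + I/O + NB)": 35,
    "CPU": 10,
    "NAND flash + miniPCI cards": 5,
    "Blu-ray drive": 5,
    "USB ports (4 x 2.5W)": 10,
}
print(sum(budget.values()))   # 65 W total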
 
I like that breakdown. That said, you might need to change it a little if the patent is to be believed. I was reading some of it this weekend for the controller, and it says there will be external memory for the CPU. Then there is VRAM and internal memory on the LSI for the GPU.
 
I like that breakdown. That said, you might need to change it a little if the patent is to be believed. I was reading some of it this weekend for the controller, and it says there will be external memory for the CPU. Then there is VRAM and internal memory on the LSI for the GPU.

External memory?

AMD Sideport, perhaps?
 