Wii U hardware discussion and investigation *rename

Discussion in 'Console Technology' started by TheAlSpark, Jul 29, 2011.

Thread Status:
Not open for further replies.
  1. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    Don't call it silent. It's noisy, just tolerably so for a decent GPU. :p

But it has to provide whatever airflow the components inside need. That is, if a GPU is pumping out heat that's being removed from the GPU via a GPU fan and heatsink, that heat then fills the case and needs to be expelled. Would your PC example still work if the PC's case fan were a small, slow (quiet) one? I imagine it'd overheat, because the internal case temperature would prevent adequate heat dissipation via the GPU cooler.
     
  2. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,406
    Location:
    Wrong thread
No, you can see the size of any available PC GPU heatsink (including whichever you've been trying to refer to) on the internet, including disassemblies of the stock coolers. Disassembly of any and every console is well documented on the net, including the Wii, so we can reasonably guess at parts of the layout of the WiiU. We have the dimensions of the Wii U and can see where its fan is located and what intakes it has, at least on the current shell.

As for cost, you can make some educated guesses. MS gave a dollar figure for how much the revised GPU heatsink cost, and it's also clear that when they replaced the smaller, heatpipe-equipped CPU cooler with a LARGER and HEAVIER, less efficient aluminium cooler on the 65nm CPUs, they did it because of cost. Cost is also the reason they didn't want to use a heatpipe on the GPU, but then had to, but then removed it as soon as they could safely do so.

    The problem is that you're just daydreaming over high end specs and don't want to put any thought into this.

I have a "silent" 180W overclocked 560 Ti. It's very quiet, with its full-length heatsink, multiple heatpipes and two fans, but it's not silent under full load (definitely not with the side of the case off), and I wouldn't expect Nintendo to gut the Wii U just to fit one in the case.

    But this is where you go from the strange to the absurd!
     
3. How about making some "reasonable guesses" yourself, instead of saying "it won't do because my empirical hunch tells me it won't"?


    They did? And how much was that?




    What high-end specs exactly? Try reading my posts again.
    I just said the "there's no chance the console will consume more than ~40W because OMG that thingie is so small!!" posts were as valid to me as a random post claiming there'd be a 100W GPU inside the console.



I never said it'd have a 120W GPU (despite your continuous assumptions of such). I said it'd be possible for the Wii U to have a max system TDP of 120W with a ~65W GPU and a ~30W CPU, given the heatsink size of 150W graphics cards.








    Wii U's volume: 172 x 45 x 266 mm = 2059 cm^3
    HD6850's volume: 112 x 41 x 226 mm = 1038 cm^3

2059 / 1038 = 1.98 ≈ 2.

Take out the extra space taken by the plastic cover + front plate and you could probably fit a slim optical drive (~400 cm^3).

Or could math be too "absurd" or "strange" for you?
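For anyone who wants to check the arithmetic in the post above, here's a quick Python sketch (the helper name is mine; dimensions are the ones quoted in the post, in mm):

```python
# Volume comparison from the post above: Wii U case vs. an HD6850 card.
def volume_cm3(w_mm, h_mm, d_mm):
    """Return the volume of a w x h x d mm box in cm^3."""
    return w_mm * h_mm * d_mm / 1000

wii_u = volume_cm3(172, 45, 266)    # ~2059 cm^3
hd6850 = volume_cm3(112, 41, 226)   # ~1038 cm^3

print(round(wii_u), round(hd6850), round(wii_u / hd6850, 2))
```

The ratio does come out to roughly 2, which is the whole basis of the "two cards' worth of volume" claim being argued over.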
     
  4. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
You can't go by volume unless you crush the components into dust! ;) The dimensions are limiting. With the dimensions given above, you could fit one HD6850 into the Wii box, leaving a slim border around it of 60 x 4 x 40 mm, into which nothing will fit - certainly not a DVD drive! Hence some could consider the idea of fitting two boxes of about the same size as Wuu into Wuu's case absurd.
     
  5. Actually yes, you can.
If Nintendo is designing the console's innards (PCB + I/O/power connections + optical drive + cooling elements), then what matters is the case's volume.
    At least it's certainly not how many graphics cards you can stuff in it without breaking them.


    They're certainly not going to design everything into a rectangle that fits in the center of the case with a border of air around it...

And an HD6850 isn't "about the same size" as a Wii U. It's half its size. That's like saying an 11" subnotebook is about the same size as a 17" DTR.
     
  6. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,406
    Location:
    Wrong thread
Actually, I just think the idea of having a 120W TDP anything or everything in the WiiU is rather crazy.

According to the figures you provided (repeated below), the WiiU is only 4.5 cm high. That means, going by what we've seen so far, it will probably have a 4cm fan (I'd thought it might be 5cm, but apparently not). I think you're expecting rather a lot from that fan, and that's an understatement.

    "You look at a 150W HD6850 (40nm) with its cooler, and you could fit 2 of those inside Wii U's case, even with the optical drive included."

    Yeah that optical drive will fit nicely into the -17 cm^3 left after you've crushed your two plasticine 6850s into the case.

    Oh, you forgot to subtract the case thickness from the available volume in the WiiU case.

    Nice save! Your claim was starting to look absurd there for a minute!

    Now you've explained it all makes perfect sense.
     
  7. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
I guess we'll just have to agree to disagree. I think by the time Wii U comes out it would be able to handle a GPU with a minimum of 640 ALUs clocked at 607.5 MHz (my current speculated clock) with little to no problem.
     
8. Well my speculated clock is 592.8 MHz or less.
    There's just no way the Green Leprechauns will ever let the GPU be clocked higher than that!


Anyhow, I've grown tired of trying to explain, in various easy ways, that if an HD6850 (the smaller one, on the right) can handle a 150W TDP, then naturally the Wii U could handle a 120W full-system TDP (the larger one, on the left).

[image: Wii U (left) next to an HD6850 (right)]

    Oh noes, but it should be impossible due to "size and cost".. :roll:
     
  9. TheWretched

    Regular

    Joined:
    Oct 7, 2008
    Messages:
    830
    Likes Received:
    23
Going by noise, I have the Sapphire 6870, which is specced at 151W (but can actually draw a bit more). As they don't use AMD's standard cooler setup, their cooling solution is quite a bit less noisy, but probably more expensive, as it has more (and fatter) heatpipes. It also uses an axial fan, which is probably a bit more efficient (about 80 or 92mm diameter, I didn't measure it).

Under load, this thing can get quite hot... the heatpipes (some of which you can touch when your case is open) get hot enough that they could probably burn your fingers. I didn't try it^^ but even touching them hurt. And that's with 5 case fans, a CPU fan that blows out of the case and a PSU that pulls air out of the case too.

NONE of these things are present in a Wii U. It's a small 40mm fan... my Pentium 1 had one of those^^ There's just no way in hell that fan can produce enough airflow to cool anything beyond what current midrange laptops dissipate. And usually that's below 90W (their PSUs usually never go above that...). They do use radial fans, though, and they also use heatpipes in many cases. And that's for the whole device (including a screen, though).
     
  10. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,146
    Likes Received:
    8,533
    Location:
    ಠ_ಠ
    No need to worry about cooling a CPU if the WiiU GPU turns out to be the totally bombastic GPGPU amirite.

    >_>
     
  11. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,406
    Location:
    Wrong thread
Well, I can't say it won't be able to, because I don't know! I'm pretty confident it'll have relatively low heat output from the CPU and GPU combined though (say, 35W Llano levels or lower), based on the size and the small exhaust fan.

The talk of heat issues makes me think they're already up and running on the final manufacturing process(es?), but possibly without final silicon. That would probably rule out 28nm for the GPU, because even AMD and Nvidia seem to be having trouble delivering anything on 28nm from TSMC, and I've not heard a peep about GlobalFoundries' high-performance 28nm yet. This makes me think Nintendo will go with a GPU on 40nm or 45nm (from NEC or IBM, possibly as an SoC), which would mean nothing new and miraculous in terms of perf/watt between now and early/mid next year.

But this is speculation based on rumours and guesses, so I could (obviously) be way off the mark. But not, I think, about the WiiU not throwing out 120W+ of heat from inside that little white case.

I have a rough idea of what I think the cooling in the WiiU will look like, btw, but it really needs a Paint masterpiece of a diagram to describe, and you can't upload stuff to B3D.

    Looks like there's room for a Bluray drive in that 6850! Take off the eject button and you could probably fit in another 6850. :p

    Not to worry, if you can cool a 120W GPU on a 4cm case fan you can probably cool a 65W CPU passively!
     
  12. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    ^ LOL @ AlStrong.

I'm curious to know how you arrived at that number. Mine is based on Nintendo's use of multiples over the last two gens, since the CPU, GPU, and memory clocks were multiples of each other and Wii was a multiple of GC. So what I did was make Wii U a multiple of Wii and have the CPU, GPU, and memory be multiples of each other like in the past. What I came up with was:

CPU - 3645 MHz
    GPU - 607.5 MHz
    Memory - 1822.5 MHz

CPU is 6x the GPU, 2x the memory, and 5x Broadway. The GPU is 2.5x Hollywood. The memory is 3x the GPU. Since the numbers are so exact, I know it won't be correct, but it gives an idea of what I expect Nintendo to do with Wii U.
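The multiples in the speculation above are at least internally consistent, which is easy to verify in a few lines (a quick sketch; the Wii clocks of 729/243 MHz are from the post itself):

```python
# Checking the speculated Wii U clock multiples above (all values in MHz).
broadway, hollywood = 729, 243   # Wii CPU / GPU clocks cited in the post

gpu = hollywood * 2.5            # GPU is 2.5x Hollywood -> 607.5
cpu = gpu * 6                    # CPU is 6x the GPU     -> 3645.0
mem = gpu * 3                    # memory is 3x the GPU  -> 1822.5

assert cpu == broadway * 5       # CPU is also 5x Broadway
assert cpu == mem * 2            # and 2x the memory clock
print(cpu, gpu, mem)
```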


    Apparently they are forgetting the capabilities of Nintendium. :wink:

I'd want confirmation before I believe Wii U will use a 28nm process, but don't forget that NEC and IBM are both members of the 28nm alliance as well. NEC fabbed Flipper and Hollywood, so it's not too far out there to believe what that investor said, since it would come from NEC and not TSMC or GF. Especially with the release still a ways away. Also, I'm on the "expecting an SoP" bandwagon over the "expecting an SoC" bandwagon right now. And I can see the Llano comparison as well.
     
  13. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,044
    Likes Received:
    1,117
    Location:
    WI, USA
It would surely go untapped, as Wii's has. :( :( :) :(
     
  14. Teasy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,563
    Likes Received:
    14
    Location:
    Newcastle
It could point to a number of things, like using 40nm off-the-shelf parts to mimic the performance of a 28nm custom part. Or using an early WiiU GPU on 40nm which is creating heat problems because it's aimed at 28nm for the final design.

    I don't know if the 28nm info is true at all, but I don't see how heat issues in early WiiU dev kits would rule it out.
     
    #314 Teasy, Oct 7, 2011
    Last edited by a moderator: Oct 7, 2011
  15. MDX

    MDX
    Newcomer

    Joined:
    Nov 28, 2006
    Messages:
    206
    Likes Received:
    0

I'm curious, how did you go from a rule of 3/2 to... a rule of 6/2?
    Why not stick with the rule of 3/2?

    CUBE, Wii
GPU to CPU 162 x 3 ≈ 485, 243 x 3 = 729
    GPU to Mem 162 x 2 = 324, 243 x 2 = 486


    So why would they change that with the WiiU?
    GPU to CPU 800 x 3 = 2400 (Power7 clock rate 2.4 GHz to 4.25 GHz)
    GPU to Mem 800 x 2 = 1600


    edit to add:
Hell, if the rumours regarding a 28nm-fabbed GPU are true, and Nintendo doesn't want to appear inferior to the 360 by using the 2.4GHz number,
    then:

    GPU to CPU 1000 x 3 = 3000
    GPU to Mem 1000 x 2 = 2000
     
    #315 MDX, Oct 8, 2011
    Last edited by a moderator: Oct 8, 2011
  16. stiftl

    Newcomer

    Joined:
    Jun 24, 2006
    Messages:
    118
    Likes Received:
    11
    Such high frequencies definitely won't happen IMHO, just isn't the way to go if you want power efficiency.
     
  17. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    ^I agree about power efficiency, but some sacrifices may have to be made.

    Mine were influenced by the tidbits of info from the first dev kit, which obviously means they aren't guaranteed, and from being a multiple of Wii. I wouldn't call that a concrete rule since Wii wasn't a significant change in hardware or clocks. However if we're looking at it from that perspective then mine is 6/3, not 6/2.

    But after looking closer at the patent, I don't know how they would treat the memory clocks right now.
     
  18. Urian

    Regular

    Joined:
    Aug 23, 2003
    Messages:
    622
    Likes Received:
    55
    This is my idea of the Wii U:

[image: proposed Wii U internal layout diagram]

The yellow squares are the miniPCI cards where the Bluetooth and WiFi interfaces can be found, the red square is the electrical control part of the mainboard, the semi-transparent grey area is the BluRay drive, the black one is the CPU die, and the strong blue is the NAND flash chip. I have put the main RAM and the System LSI (a processor that unifies memory control, I/O control, GPU and eDRAM in a single die) inside a Type A MXM 3.0 module (35W), and the light blue area is the fan at the back of the console box.

    The power consumption legend is this:

    MXM Module (GPU+RAM+I/O+NB): 35W.
    CPU: 10W.
    NAND Flash, MiniPCI Cards: 5W
    BluRay: 5W.
    USB Ports: 10W (2.5W each).

    65W in total, the typical power consumption of a netbook.
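The budget above does add up to the stated total; a quick sketch (the four-port USB count is implied by "2.5W each"):

```python
# Summing the speculated Wii U power budget from the post above (watts).
budget = {
    "MXM module (GPU+RAM+I/O+NB)": 35,
    "CPU": 10,
    "NAND flash + miniPCI cards": 5,
    "BluRay drive": 5,
    "USB ports (4 x 2.5W)": 10,
}
total = sum(budget.values())
print(total)  # 65
```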
     
  19. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
I like that breakdown. That said, you might need to change it a little if the patent is to be believed. I was reading some of it this weekend for the controller, and it says there will be external memory for the CPU. Then there is VRAM and internal memory on the LSI for the GPU.
     
  20. Urian

    Regular

    Joined:
    Aug 23, 2003
    Messages:
    622
    Likes Received:
    55
External memory?

    AMD SidePort, perhaps?
     


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.