Wii U hardware discussion and investigation *rename

Discussion in 'Console Technology' started by TheAlSpark, Jul 29, 2011.

Thread Status:
Not open for further replies.
  1. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,535
    Likes Received:
    144
    That would be...no. Unless they rebooted the project.
     
    #361 AlexV, Nov 3, 2011
    Last edited by a moderator: Nov 3, 2011
  2. stiftl

    Newcomer

    Joined:
    Jun 24, 2006
    Messages:
    118
    Likes Received:
    11
    What makes you think so? These would be 768 ALUs with VLIW4; according to rumours, such a chip (HD 7570) has a TDP of 50 W at a clock of 750 MHz with 1 GB of GDDR5. With further optimization and slightly lower clocks this could come down to around 30-40 W, and with a good amount of eDRAM/1T-SRAM it would be really competitive (I really doubt XboxNext/PS4 will be more than twice as powerful).

    EDIT: 768 ALUs × 729 MHz (3× the Wii GPU clock) × 2 ops per cycle ≈ 1120 GFLOPS, in line with the rumour suggesting around 1 TFLOP
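    stiftl's back-of-the-envelope figure can be checked with a short script (a sketch, not anything official; the factor of 2 assumes one multiply-add per ALU per cycle, and 243 MHz is the Wii GPU clock being tripled):

```python
# Rough peak-FLOPS estimate for the rumoured 768-ALU configuration.
# Assumes each ALU retires one multiply-add (2 FLOPs) per cycle.

def peak_gflops(alus: int, clock_mhz: float, flops_per_cycle: int = 2) -> float:
    """Peak single-precision throughput in GFLOPS."""
    return alus * clock_mhz * flops_per_cycle / 1000.0

# 768 ALUs at 729 MHz (3x the Wii GPU's 243 MHz)
print(peak_gflops(768, 729))  # -> 1119.744, i.e. roughly 1.12 TFLOPS
```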
     
    #362 stiftl, Nov 3, 2011
    Last edited by a moderator: Nov 3, 2011
  3. RudeCurve

    Banned

    Joined:
    Jun 1, 2008
    Messages:
    2,831
    Likes Received:
    0
    Revolution VR Megaton revisited. :lol:
     
  4. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,406
    Location:
    Wrong thread
    From what I understand, the architecture is locked down a couple of years before release. It takes months to years to finalise a chip design, then many more months to turn that into a production chip.

    I'm not sure you can just assume that power consumption for a Nintendo chip can be drastically lower than for a PC part, and 30-40 W for the GPU is almost certainly way over what the current WiiU case/cooling will be happy with.

    Best you can hope for at this point is, IMO, Nintendo pushing for higher clocks on what they've already got. Maybe they could partially slide the motherboard out from under the Blu-ray drive and go for a larger heatsink and a different fan arrangement. Probably nothing will change though.
     
  5. stiftl

    Newcomer

    Joined:
    Jun 24, 2006
    Messages:
    118
    Likes Received:
    11
    But so is the architecture for the PC parts; AMD definitely knew at least a year ago what they would bring in 2012. VLIW4 chips were released nearly one year ago. I don't know Nintendo's time frame, but they definitely know better than me what will be available from AMD in the future, and so I think they surely knew about the VLIW4 architecture by the middle of 2010 already.

    Well, this is the TDP for the desktop part; mobile and/or embedded chips will probably use even less. And I don't think even 40 W would be too much, because the IBM CPUs aren't normally very power hungry (I don't think it will be clocked higher than 2.2 GHz), maybe another 20 W TDP.

    I think Nintendo proved that they know how to design power efficient packages.
     
    #365 stiftl, Nov 3, 2011
    Last edited by a moderator: Nov 3, 2011
  6. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,406
    Location:
    Wrong thread
    At this point they can't change what they're planning to put into the machine though. So far, no rumours point to VLIW4 hardware.

    60W just for the CPU and GPU would probably be above what the 360S dissipates from its processors; Nintendo have the disadvantage of less room for a heatsink and a much smaller fan. I don't think that it's going to happen without an outrageously noisy little fan on the back, which is to say that I don't think it's going to happen.

    Nintendo have done nothing to prove they are sorcerers, but lots to prove that they are ruthless at keeping costs under control.
     
  7. stiftl

    Newcomer

    Joined:
    Jun 24, 2006
    Messages:
    118
    Likes Received:
    11
    That's what I was trying to say, though: it still doesn't mean it wasn't planned from the beginning. Using VLIW5 (or whatever) parts in early devkits doesn't really tell us anything.

    Probably not: the original Xbox 360 used around 180 W in games (Source) with a 203 W power supply (List of Revisions). The current Xbox 360 uses a 115 W power supply, so I would say it draws around 90 W. I think a 60-70 W TDP would be bearable for the WiiU's size.

    EDIT: the Xbox 360 draws around 80-90 W in games according to AnandTech

    Nobody said they are sorcerers, but IMO they did a good job with the Gamecube and the Wii regarding performance/watt.
     
    #367 stiftl, Nov 3, 2011
    Last edited by a moderator: Nov 3, 2011
  8. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,406
    Location:
    Wrong thread
    Well, if they're still using the old shaders in current dev kits (a year after the VLIW4 stuff hit the mass market) then I'd guess that's a sign they're using the old stuff in their new machine. That'd be my interpretation at any rate, though I don't know how long it takes to make revisions to a dev kit.

    That's measured at the wall. On the other side of the power supply you're probably getting about 80% of that, and at least a few watts will go to the DVD drive, HDD, fan, wifi, wireless pads and other processors. With 60 watts of heat coming from the CPU and GPU you're at least in the same ballpark as the 360S, with its big copper-core cooler with a big fan plonked straight on top and an abundance of vents.
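    The wall-versus-DC reasoning above can be sketched numerically (all figures here are illustrative assumptions drawn from the discussion, not measurements):

```python
# Rough power budget: wall draw -> DC output -> heat budget for CPU + GPU.

def dc_power(wall_watts: float, psu_efficiency: float = 0.80) -> float:
    """Approximate DC power on the secondary side of the supply."""
    return wall_watts * psu_efficiency

wall = 90.0    # assumed in-game wall draw, in the range quoted above
other = 15.0   # assumed: optical drive, HDD, fan, wifi, pads, misc logic
budget = dc_power(wall) - other
print(budget)  # -> 57.0 W left for the CPU and GPU together
```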
     
  9. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    And NEC would be the one most likely making it so they wouldn't have to deal with TSMC or GF on possible supply issues.

    Was there something you guys were told that you didn't publish? I'm not doubting you, just asking for confirmation. Part of my basis for this idea came from your article, when you said that it hadn't been taped out yet. From a heat perspective it seems like a logical choice.

    The only thing available at the time was Cayman. Putting that in the dev kit would have been a huge misrepresentation of its power. stiftl said pretty much everything I would have said. Looking at Cayman's release date, VLIW4 would have been developed concurrently with Nintendo starting their plans for a GPU. AMD could easily have said, "This is what we are planning to do with our future GPUs. It will give the same amount of processing power while reducing some of the transistors. It will be readily available by the time you launch. Based on your target, use [insert GPU used in the dev kit] for now."

    As we've discussed before, we saw the 360 go from a 9800, to an X800, then finally Xenos. I'm just not ready to rule out the idea until I have enough confirmation to write it off.
     
  10. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,406
    Location:
    Wrong thread
    There were also some kits that used SLI 6800 GTs to give a better representation of what the GPU could do. The issue with 360 kits would appear to be that there simply wasn't any closer hardware available from ATI, but that wouldn't seem to be the case with the WiiU.
     
  11. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    But that's essentially what I just pointed out: there aren't any VLIW4 GPUs with a lower ALU count. At least not yet.
     
  12. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,406
    Location:
    Wrong thread
    I'd have thought AMD could disable shaders and alter clocks enough to make something of roughly the same capability as the GPU Nintendo intend to use?
     
  13. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    But VLIW4 vs VLIW5 is mostly a shader compilation thing; they could use a Radeon 6570 and it would not be too far from the final hardware.
     
  14. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,406
    Location:
    Wrong thread
    I just assumed it'd affect how you write shaders in order to get optimal performance. Perhaps it doesn't, though; I don't actually know.
     
  15. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,044
    Likes Received:
    1,117
    Location:
    WI, USA
    I read that VLIW4 was essentially a concession to AMD's so-far-worthless GPGPU initiatives and isn't actually a benefit for games. Tunafish posted above that VLIW5 is a better fit for the 360's GPU, btw, and I do think it's clear that Nintendo wants access to other companies' game libraries. I think I'm expecting the WiiU to be essentially a modernized 360.
     
  16. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,406
    Location:
    Wrong thread
    I'd missed Tunafish's post - he makes some interesting points there.

    I guess my expectations for the WiiU are similar to yours: better than parity from the most cost-effective hardware they can put together, in a family-living-room-friendly box. A bit more than that would be nice though, if it comes.
     
  17. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    Maybe so, as I've read something very similar to that (nothing about gaming benefits though, but wouldn't that help physics if used?). I also read that the switch to DX10 started to cause poor shader utilization in VLIW5, which in turn led to the transistor reduction in VLIW4, an attempt to improve utilization by "trimming the fat", so to speak. I've also read that AMD had more plans for VLIW4, but because the fab was still at 40nm they passed, to avoid an even bigger die than it already was.

    I can definitely agree with your view about it ending up as a modern 360 (though our views on that might differ). I'm not presenting the other direction as fact, just as one I believe is very plausible.
     
  18. Butta

    Regular

    Joined:
    Jan 18, 2007
    Messages:
    361
    Likes Received:
    2
  19. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,146
    Likes Received:
    8,533
    Location:
    ಠ_ಠ
    They're gonna need it to control heat and power in that chassis.
     
  20. I'd advise you to be more careful when trying to put words into other people's mouths.

    I dismissed (and still do) your speculation that the case size stays the same, because Nintendo itself has stated that the current form isn't the final one.
    Never have I "speculated" that the case will be "way bigger"; in fact, I've made no speculation whatsoever about the case's size.

    Why?
     
