Wii U hardware discussion and investigation *rename

Discussion in 'Console Technology' started by TheAlSpark, Jul 29, 2011.

Thread Status:
Not open for further replies.
  1. Megadrive1988

    Veteran

    Joined:
    May 30, 2002
    Messages:
    4,723
    Likes Received:
    242
    Of course it'll be very interesting to see Nintendo's first-party 2013 games, and how they take advantage of the GPU and the overall strengths of the Wii U.
     
  2. MDX

    MDX
    Newcomer

    Joined:
    Nov 28, 2006
    Messages:
    206
    Likes Received:
    0
    With the number of rumors and speculations floating around, can we start from a clean slate based on what we know? Speculate from the given facts.

    Let's start with the memory.

    What did Iwata say? He said that the system will include 2 GB of total system memory: 1 GB for games, while the other 1 GB will be available for the Wii U operating system and background applications.

    Why would he even have to say that? I suppose he could have been hinting that the Wii U will be more than just a game console by supplying the OS with 1 GB of RAM. Many are assuming that, down the road, the 1 GB for the OS might be opened up for developers. Perhaps, but I kind of doubt it, because by making this information public he is basically locking down the amounts. In other words, if Nintendo had plans to open up the OS memory for games, Iwata would have simply said that the Wii U has 2 GB of memory. But no: in the face of competition rumored to have up to 8 GB of memory in their consoles, Nintendo is stating that only 1 GB is for games. Iwata also said:

    The large amount of memory allocated to the system allows for switching to an Internet browser, utilizing Miiverse, or other system without ending the game, providing for smooth transitions from the game to functionality to be used in your living room, and back again. Also, excess system memory is reserved for future feature expansion.

    So my question is…

    Who here believes it's one pool of memory? And if it is, what type?
    How do you explain Nintendo making a distinction between system memory and memory for games?


    For those who believe it's two pools of memory, are they of different types?
    What types, and why?

    Is there anybody here thinking there are more than two types of memory within the system? For example, the memory for games is broken down further into two types, eDRAM and GDDR5, while the system memory is GDDR3.

    Speaking of eDRAM.

    Who here thinks both the CPU and GPU make use of eDRAM? If so, is it a shared pool?

    We had a rumor that Nintendo was working with two models, with 768 MB and 1 GB of embedded DRAM respectively. Based upon Iwata's statement, does anybody here think that Nintendo would go so far as putting in 1 GB worth of eDRAM for developers? Remember these other statements:

    and:

    IBM:
    I ask: would 1 GB of GDDR5 or DDR3 raise eyebrows?
    Shouldn't developers expect, for a next-gen console, a doubling or tripling of the memory? The current gen already boasts half of that. Or did they really think Nintendo was coming out with a console no better than the current HD twins?



    Power Consumption:
    Iwata said that the Wii U would draw about 40 watts in average use. People have made a lot of hoopla about it.


    OG PS3 ............ 189 W
    OG Xbox 360 ....... 172 W
    Xbox 360 S ......... 88 W
    PS3 Slim ........... 85 W
    Nintendo Wii U ..... 40 W
    Nintendo Wii ....... 16 W


    I don't know how power consumption is supposed to indicate performance, but if we go by that, the Wii U is half as powerful as the PS3 Slim, which makes no sense. Nintendo has presented 'energy efficiency' as a console feature:

    Questions: what technology offered by AMD and IBM would help keep the TDP this low? And what chips do they have that currently make use of this technology? Basically, are there modern power-saving features that would rule out many of the chips people are assuming the Wii U is based on?



    Power7?
    IBM hinted at the Wii U using a Power7; at some point they said it was one, then that was retracted. Why? Because it was not true, or because they weren't allowed to say it?

    Question: when a CPU or GPU is customized for a third party, do AMD and IBM rename those chips by default? Scenario: IBM takes a Power7 as the basis for the Wii U CPU. However, because the chip is customized for use in a game console, IBM no longer refers to it as a Power7, which is made for servers. So, if somebody asks what type of chip is in the Wii U, how would IBM respond? Would they give hints such as:

    IBM tells us that within the Wii U there's a 45nm custom chip with "a lot" of embedded DRAM (shown above). It's silicon on insulator design and packs the same processor technology found in Watson.

    If the chip was based on anything but a Power7, how could IBM describe it?
     
  3. BobbleHead

    Newcomer

    Joined:
    Sep 24, 2002
    Messages:
    58
    Likes Received:
    2
    The tech support team at AMD doesn't have any information about console products. AMD does not provide end-user support for those, so there is no reason to provide that team with any info. Especially not for currently unreleased products. When was the last time you emailed AMD about a problem with your Xbox360 or Wii? You are not our customer. Our customer for these chips is MS or Nintendo, who pay many millions of dollars. When they have support issues/questions, they do not go through the public-facing tech support team.

    They (and their lawyers) also expect us to keep our mouths shut.

    This is either fake, or a support guy trying to sound like he knows something when he does not. Either way, it tells you nothing.

    The vast majority of people inside AMD have no idea about the details of the WiiU. It was done by a relatively small team and any information outside that team was "need-to-know". Even if you surveyed the GPU IP team which originally designed the base GPU family, >95% of them could not tell you what the configuration is. Only a few needed to be involved to get the specific configuration correct and working, and they know to keep their mouths shut. All additional modifications were done by the "need-to-know" team.

    Before you ask.. Yes, I know all the details. No, I will not tell you any of them.
     
  4. Megadrive1988

    Veteran

    Joined:
    May 30, 2002
    Messages:
    4,723
    Likes Received:
    242
    I believe you. Great post.
     
  5. DaSorcerer

    Newcomer

    Joined:
    Jun 14, 2012
    Messages:
    22
    Likes Received:
    0
  6. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0
    New mail from AMD Support:

    [image: screenshot of the email]

    Same thing.
     
  7. Megadrive1988

    Veteran

    Joined:
    May 30, 2002
    Messages:
    4,723
    Likes Received:
    242
    Over on GAF, many have been falling for the bogus email.

    However, now that the above post by BobbleHead has appeared, I am glad.
     
  8. BobbleHead

    Newcomer

    Joined:
    Sep 24, 2002
    Messages:
    58
    Likes Received:
    2
    It won't matter. This is the internet. People believe whatever source reinforces what they already (want to) believe, or makes them think they have inside information that no one else has (making them "special"). Just look at the people who still wanted to believe there was a ~600mm2 Power7 plus dual ~300mm2 GPUs hiding in the svelte case Nintendo showed more than a year ago. That would turn into a pile of melted plastic from all that heat. Nintendo does some pretty cool stuff, but they can't violate the laws of physics. At least not yet.

    If you want real information, you'll have to ask Nintendo. You won't get any from AMD unless they specifically tell us to release something.
     
  9. Megadrive1988

    Veteran

    Joined:
    May 30, 2002
    Messages:
    4,723
    Likes Received:
    242
    Thank you for your reply. I won't bother asking AMD or Nintendo. I know I won't get any answers. As an end-user, all I can do is buy the product (the console and games) to see with my own eyes what is being pushed to the screen. That will be the best proof of all, regarding what the GPU is capable of. I look forward to doing this in less than two months.

    Again, thanks for your response.
     
  10. LXFBN

    Newcomer

    Joined:
    Sep 12, 2012
    Messages:
    109
    Likes Received:
    0
  11. Vennt

    Newcomer

    Joined:
    Aug 19, 2004
    Messages:
    48
    Likes Received:
    0
  12. LXFBN

    Newcomer

    Joined:
    Sep 12, 2012
    Messages:
    109
    Likes Received:
    0
    That's what I was referring to.
     
  13. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha

    Joined:
    May 14, 2005
    Messages:
    1,386
    Likes Received:
    299
    Location:
    NY
    Okay folks, B3D != gaf. You don't need to post every stupid rumor from gaf in this thread. If there is *compelling* information (e.g. confirmed facts) from gaf, then by all means post away. But this thread has gotten ridiculous. Do you really think someone from AMD's support team knows what GPU is in the WiiU? Look, I love Mario 64 too, but you guys can't be this blind (I hope)!

    I'll tell you guys the same thing I told ChiefO (or whoever it was that was SURE the new xbox would be out in 2012): if what you want the WiiU to be is the same as what you think the WiiU will be, then you've probably set yourself up to be disappointed. You need to separate what you want and what's realistic (something I have seen MANY fans fail to do).
     
  14. Li Mu Bai

    Regular

    Joined:
    Oct 18, 2003
    Messages:
    540
    Likes Received:
    7
    Location:
    AZ
    This is somewhat problematic, Megadrive. Nintendo will only allocate larger budgets & resources to IPs they deem "viable," so to speak: titles such as SSB, 3D Mario, Zelda, etc., as well as Monolith Soft & Retro's projects (specific collaborations & particular new IPs included). Kirby, Mario Kart, Warioware, Mario Party, 2D Mario, etc. do not require massive HD assets, and therefore will receive none (as already demonstrated). This helps control the cost of HD development for a company as large as Nintendo with so many IPs. Many will say they initially look like uprezzed Wii games, & they would be right.
     
  15. Megadrive1988

    Veteran

    Joined:
    May 30, 2002
    Messages:
    4,723
    Likes Received:
    242
    I should have clarified with, Nintendo's major franchises, such as 3D Mario, Zelda, and SSB.
     
  16. McHuj

    Veteran Subscriber

    Joined:
    Jul 1, 2005
    Messages:
    1,613
    Likes Received:
    869
    Location:
    Texas
    Well, power consumption is indirectly indicative of performance. Given a process tech (45, 32, or 28 nm), there's only so much compute you can do per watt. A lot of companies (at least internally) use performance per watt as a metric for the efficiency of their designs. If semiconductor scaling were 100% efficient and so was the architecture, you should be able to double the performance at the same power consumption, or keep the same performance at half the power.

    Given that the Wii U is at 45nm like IBM indicated (although we don't know what the GPU is), same as the other consoles (the 360's SoC is at 45nm; not sure about the PS3), but consumes half the power, that to me indicates it's a lot more efficient than the other two. I'm not expecting much beyond that, though; there's only so much you can do at a given node, and I'm not expecting miracles. Tell me there's a 28nm GPU in there, and I'll believe it's a lot more powerful than the other two.
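    The ideal-scaling rule above (double the performance at the same power, or the same performance at half the power, per full node shrink) can be sketched numerically. This is a toy model under the assumption of perfect scaling, not real silicon behavior, and the 240 GFLOPs / 40 W starting point is purely hypothetical:

```python
def ideal_shrink(gflops: float, watts: float, shrinks: int, mode: str = "perf"):
    """Idealized full-node scaling: each shrink either doubles performance
    at constant power ('perf') or halves power at constant performance ('power')."""
    for _ in range(shrinks):
        if mode == "perf":
            gflops *= 2
        else:
            watts /= 2
    return gflops, watts

# Hypothetical 45nm part at 240 GFLOPs / 40 W, shrunk one full node:
print(ideal_shrink(240, 40, 1, "perf"))   # (480, 40)
print(ideal_shrink(240, 40, 1, "power"))  # (240, 20.0)
```

    Real designs land well short of this bound, which is why a same-node console at half the power implies an efficiency win rather than a raw performance win.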


    I hadn't thought that the Wii U could have two different types of RAM. It's possible. You could have two slow DDR3 chips on a 32-bit bus for the OS/apps and four GDDR5 chips on a 64-bit bus. In theory that could provide anywhere from 25-50 GB/s. Eventually that could be reduced down to one 8Gb DDR3 chip and two 4Gb GDDR5 chips. Long term, that would probably be fairly cheap.
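    The 25-50 GB/s range quoted above is easy to sanity-check: peak bandwidth is just bus width times per-pin data rate. A minimal sketch, with the 3.2 and 6.0 Gbps figures being typical 2012-era GDDR5 bins used for illustration, not confirmed Wii U specs:

```python
def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s: one pin per bus bit, divide by 8 for bits -> bytes."""
    return bus_width_bits * gbps_per_pin / 8

print(peak_bandwidth_gbs(64, 3.2))  # 25.6  (slow GDDR5 bin, 64-bit bus)
print(peak_bandwidth_gbs(64, 6.0))  # 48.0  (fast GDDR5 bin, 64-bit bus)
print(peak_bandwidth_gbs(32, 1.6))  # 6.4   (DDR3-1600 on a 32-bit OS bus)
```

    The two GDDR5 cases bracket exactly the 25-50 GB/s range McHuj mentions.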
     
  17. babybumb

    Regular

    Joined:
    Dec 9, 2011
    Messages:
    609
    Likes Received:
    24
    So AMD tech support in India reads GAF. Good to know.
     
  18. Megadrive1988

    Veteran

    Joined:
    May 30, 2002
    Messages:
    4,723
    Likes Received:
    242
    What interests me most about the Wii U GPU is not the raw spec, but rather the collective experience AMD has gained from all of its acquisitions, such as ArtX, Real3D and ATI. Probably others too that I'm just not aware of.
     
  19. MystWalker

    Newcomer

    Joined:
    Sep 19, 2012
    Messages:
    21
    Likes Received:
    0

    (Referencing the bolded part) Not necessarily... I would highly doubt that Nintendo would open all or even most of that reserved 1GB of RAM to games. However, it may be possible that in the future, depending on what OS features they decide to implement, they may release some of that RAM, i.e. 256-320MB, to games. Similar to how they later allowed developers to use up to 25%, IIRC, of the second ARM11 core for games on the 3DS.

    I have also thought about the possibility that the OS RAM is not just a sectioned-off part of the main memory but a separate, and likely slower, pool. In which case I would doubt that they would ever open it up for games. Such a case would remind me of the GCN, where the separate 16MB of DRAM was almost useless to devs because it was only accessible through an 8-bit bus and had very low bandwidth.
     
  20. Ruskie

    Veteran

    Joined:
    Mar 7, 2010
    Messages:
    1,291
    Likes Received:
    1
    Wii U memory can't be GDDR5, because that would mean Nintendo put 32 MB of eDRAM in there just for the sake of it and wasted a good amount of money and transistor budget on something they don't need.

    GDDR5 = high GPU bandwidth; no need for eDRAM for the frame buffer.
    DDR3 = low GPU bandwidth; eDRAM is needed for the frame buffer to work around the low bandwidth of main memory.

    Wii U doesn't have GDDR5, and if it did, that would definitely bring the TDP and costs up, and both look very low as it is now.
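    As a rough check on the framebuffer argument, a full 720p color-plus-depth render target fits comfortably in 32 MB of eDRAM. The 32-bit color and 32-bit depth/stencil formats below are common choices used for illustration, not known Wii U formats:

```python
# Size of one 720p render target: 4 bytes/pixel color + 4 bytes/pixel depth/stencil.
W, H = 1280, 720
color_bytes = W * H * 4      # RGBA8 color buffer
depth_bytes = W * H * 4      # D24S8 depth/stencil buffer
total_mb = (color_bytes + depth_bytes) / 2**20
print(round(total_mb, 2))  # 7.03 -> several such targets fit in 32 MB of eDRAM
```

    So with DDR3 main memory, keeping the bandwidth-hungry render targets in eDRAM is exactly the workaround described above.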

    I would say that if Nintendo took an existing R7xx design and worked from that, it would be the RV730. It's a 128-bit card with DDR3 at 480 GFLOPs, so that would be the ballpark they would go for. Its highest TDP is 60W and it's manufactured on a 55nm process. Shrinking and customizing would bring that power draw down to ~30-40W, and that's the max for the Wii U.
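    The 480 GFLOPs figure checks out against RV730's public configuration (320 shader ALUs at 750 MHz in the Radeon HD 4670), since each ALU can issue one multiply-add, i.e. two flops, per cycle:

```python
def peak_gflops(alus: int, clock_ghz: float) -> float:
    """Peak single-precision GFLOPs: ALUs x 2 flops (multiply-add) x clock in GHz."""
    return alus * 2 * clock_ghz

print(peak_gflops(320, 0.750))  # 480.0 (RV730 / Radeon HD 4670)
```

    A downclocked or shrunk derivative would scale this number linearly with clock.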
     
    #2600 Ruskie, Sep 21, 2012
    Last edited by a moderator: Sep 21, 2012