Wii U hardware discussion and investigation

Status
Not open for further replies.
With so many rumors and so much speculation floating around, can we start from a clean slate based on what we know, and speculate from the given facts?

Let's start with the memory.

What did Iwata say? He said that the system will include 2 GB of total system memory: 1 GB for games, while the other 1 GB will be available for the Wii U operating system and background applications.

Why would he even have to say that? I suppose he could have been hinting at the fact that the WiiU will be more than just a game console by supplying the OS with 1 GB of RAM. Many are assuming that, down the road, the OS gigabyte might be opened up for developers. Perhaps, but I kind of doubt it, because by coming out with this information publicly, he is basically locking down the amounts. In other words, if Nintendo had plans to open up the OS memory for games, Iwata would simply have informed the public that the WiiU has 2 GB of memory. But no, in the face of competition boasting rumors of up to 8 GB of memory for their consoles, Nintendo is stating: no, we are only providing 1 GB for games. Iwata also said:

The large amount of memory allocated to the system allows for switching to an Internet browser, utilizing Miiverse, or other system without ending the game, providing for smooth transitions from the game to functionality to be used in your living room, and back again. Also, excess system memory is reserved for future feature expansion.

So my questions are…

Who here believes it's one pool of memory? And if it is, what type? How do you explain Nintendo making a distinction between system memory and memory for games?


For those who believe it's two pools of memory, are they of different types? What types, and why?

Is there anybody here who thinks there are more than two types of memory within the system? For example, the memory for games is broken down further into two types, eDRAM and GDDR5, while the system memory is GDDR3.
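To ground the question, here's a minimal sketch of the peak bandwidth those candidate memory types would offer. The bus widths and data rates below are illustrative assumptions, not confirmed Wii U specs.

```python
# Peak theoretical bandwidth for some hypothetical memory configurations.
# None of these figures are confirmed Wii U specs.

def peak_bandwidth_gbs(bus_width_bits, data_rate_mts):
    """Peak bandwidth in GB/s: bytes per transfer times mega-transfers/s."""
    return (bus_width_bits / 8) * data_rate_mts / 1000

# Assumed example pools (bus width in bits, data rate in MT/s):
pools = {
    "DDR3-1600, 64-bit":   (64, 1600),
    "GDDR3-1400, 128-bit": (128, 1400),
    "GDDR5-3200, 64-bit":  (64, 3200),
}

for name, (width, rate) in pools.items():
    print(f"{name}: {peak_bandwidth_gbs(width, rate):.1f} GB/s")
```

Even a narrow GDDR5 bus can match a wider GDDR3 one, which is why the type question matters as much as the pool question.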

Speaking of eDRAM.

Who here thinks both the CPU and GPU make use of eDRAM? If so, is that a shared pool?

We had a rumor that Nintendo was working with two models that included 768 MB and 1 GB of embedded DRAM. Based upon Iwata's statement, does anybody here think that Nintendo would go so far as putting 1 GB worth of eDRAM in for developers? Remember these other statements:

Michel Ancel very high up on Wii U, talks 'enormous' memory, new Rayman Legends info, BG&E 2 future

and:

The executive producer of Sonic And All-Stars Racing Transformed has told Official Nintendo Magazine that he was surprised by the Wii U specs, specifically the GPU and memory which are better than the Sumo developers expected.

IBM:
The new memory technology, a key element of the new Power microprocessor that IBM is building for the Nintendo Wii U console,

I ask: would 1 GB of GDDR5 or DDR3 raise eyebrows?
Shouldn't developers expect, for a next-gen console, a doubling or tripling of the memory? The current gen already boasts half of that. Or did they really think Nintendo was coming out with a console no better than the current HD twins?
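For scale, the arithmetic behind that expectation; the current-gen figure is the well-known 512 MB total, and the game allocation is from Iwata's statement above.

```python
# How the Wii U's game allocation compares to current-gen totals.
current_gen_mb = 512    # Xbox 360 / PS3 total system RAM
wiiu_game_mb = 1024     # the 1 GB Iwata said is reserved for games

multiple = wiiu_game_mb / current_gen_mb
print(f"{multiple:.0f}x the current-gen total")  # 2x: a doubling for games alone
```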



Power Consumption:
Iwata said that in average use the WiiU would draw about 40 watts. People have made a lot of hoopla about it.


OG PS3 ------------ 189 W
OG Xbox 360 ------- 172 W
Xbox 360 S --------- 88 W
PS3 Slim ----------- 85 W
Nintendo WiiU ------ 40 W
Nintendo Wii ------- 16 W


I don’t know how power consumption is supposed to give an indication of performance, but if we go by that alone, the WiiU is half as powerful as the PS3 Slim. That makes no sense. Nintendo has presented ‘energy efficiency’ as a console feature:

Energy Efficiency:

Wii U utilizes specially designed power-saving features to lower its energy consumption.

Questions: what technology offered by AMD and IBM would help keep the TDP footprint this low? And which of their current chips make use of this technology? Basically, are there modern power-saving features that would rule out many of the chips people are assuming the WiiU is based on?



Power7?
IBM hinted at the WiiU using a Power7; at some point they said it was one, and then that was retracted. Why? Because it was not true, or because they weren't allowed to say it?

Question: when a CPU or GPU is customized for a third party, do AMD and IBM rename those chips by default? Scenario: IBM takes a Power7 as the basis for the WiiU CPU. However, because the chip is customized for use in a game console, IBM no longer refers to the chip as a Power7, which is made for servers. So, if somebody asks what type of chip is in the WiiU, how would IBM respond? Would they give hints such as:

IBM tells us that within the Wii U there's a 45nm custom chip with "a lot" of embedded DRAM (shown above). It's silicon on insulator design and packs the same processor technology found in Watson.

If the chip was based on anything but a Power7, how could IBM describe it?
 
It looks very fake to me. I do not believe technical support will give out processor information on console products. I do not believe any support guy would have access to that info. I do not believe AMD have permission from Nintendo to divulge any such information. Anyone who's sent a support request to AMD would have a template reply they could modify.

If it is a legitimate reply, heads will be rolling somewhere.

The tech support team at AMD doesn't have any information about console products. AMD does not provide end-user support for those, so there is no reason to provide that team with any info. Especially not for currently unreleased products. When was the last time you emailed AMD about a problem with your Xbox360 or Wii? You are not our customer. Our customer for these chips is MS or Nintendo, who pay many millions of dollars. When they have support issues/questions, they do not go through the public-facing tech support team.

They (and their lawyers) also expect us to keep our mouths shut.

This is either fake, or a support guy trying to sound like he knows something when he does not. Either way, it tells you nothing.

The vast majority of people inside AMD have no idea about the details of the WiiU. It was done by a relatively small team, and any information outside that team was "need-to-know". Even if you surveyed the GPU IP team which originally designed the base GPU family, >95% of them could not tell you what the configuration is. Only a few needed to be involved to get the specific configuration correct and working, and they know to keep their mouths shut. All additional modifications were done by the "need-to-know" team.

Before you ask.. Yes, I know all the details. No, I will not tell you any of them.
 

I believe you. Great post.
 
Apparently someone else emailed AMD



http://m.neogaf.com/showthread.php?t=490844&page=100000


Guys, that email was REAL. I just decided to email AMD on a whim and they sent me the EXACT same thing.

 
Over on GAF, many have been falling for the bogus email.

However, now that the above post by BobbleHead has appeared, I am glad.

It won't matter. This is the internet. People believe whatever source reinforces what they already (want to) believe, or makes them think they have inside information that no one else has (making them "special"). Just look at the people who still wanted to believe there was a ~600mm2 Power7 plus dual ~300mm2 GPUs hiding in the svelte case Nintendo showed more than a year ago. That would turn into a pile of melted plastic from all that heat. Nintendo does some pretty cool stuff, but they can't violate the laws of physics. At least not yet.

If you want real information, you'll have to ask Nintendo. You won't get any from AMD unless they specifically tell us to release something.
 

Thank you for your reply. I won't bother asking AMD or Nintendo. I know I won't get any answers. As an end-user, all I can do is buy the product (the console and games) to see with my own eyes what is being pushed to the screen. That will be the best proof of all, regarding what the GPU is capable of. I look forward to doing this in less than two months.

Again, thanks for your response.
 
Okay folks, B3D != gaf. You don't need to post every stupid rumor from gaf in this thread. If there is *compelling* information (e.g. confirmed facts) from gaf, then by all means post away. But this thread has gotten ridiculous. Do you really think someone from AMD's support team knows what GPU is in the WiiU? Look I love Mario 64 too, but you guys can't be this blind (I hope)!

I'll tell you guys the same thing I told ChiefO (or whoever it was that was SURE the new xbox would be out in 2012): if what you want the WiiU to be is the same as what you think the WiiU will be, then you've probably set yourself up to be disappointed. You need to separate what you want and what's realistic (something I have seen MANY fans fail to do).
 
Of course it'll be very interesting to see Nintendo's 1st party 2013 games, how they take advantage of the GPU and the overall strengths of WiiU.

This is somewhat problematic, Megadrive. Nintendo will only allocate larger budgets & resources to IPs they deem "viable", so to speak: titles such as SSB, 3D Mario, Zelda, etc., plus Monolith Soft & Retro's projects (specific collaborations & particular new IPs included). Kirby, Mario Kart, Warioware, Mario Party, 2D Mario, etc. do not require massive HD assets, and therefore will receive none (as already demonstrated). This helps control the cost of HD development for a company as large as Nintendo with so many IPs. Many will say they initially look like uprezzed Wii games, & they would be right.
 

I should have clarified: I meant Nintendo's major franchises, such as 3D Mario, Zelda, and SSB.
 

Well, power consumption is indicative of performance, indirectly. Given a process tech (45, 32, or 28nm), there's only so much compute you can do per watt. A lot of companies (at least internally) use performance per watt as a metric for the efficiency of their designs. If semiconductor scaling were 100% efficient and so was the architecture, you should be able to double the performance at the same power consumption, or keep the same performance at half the power.

Given that the WiiU is at 45nm like IBM indicated (although we don't know what the GPU is), the same as the other consoles (the 360's SoC is at 45nm; not sure about the PS3), but consumes half the power, that to me indicates that it's a lot more efficient than the other two. But I'm not expecting much beyond that. There's only so much you can do at a given node; I'm not expecting miracles. Tell me there's a 28nm GPU in there, and I'll believe it's a lot more powerful than the other two.
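The scaling rule of thumb above can be sketched numerically. The GFLOPS and wattage here are placeholders to illustrate the idealized rule, not measured console figures.

```python
# Idealized perf/W scaling as described above: each fully efficient process
# shrink doubles performance at the same power (or halves power at the same
# performance). Input numbers are placeholders, not real console specs.

def perf_per_watt(gflops, watts):
    return gflops / watts

def ideal_shrink(gflops, watts, shrinks=1):
    """Project performance after `shrinks` ideal node transitions,
    holding power constant."""
    return gflops * (2 ** shrinks), watts

base_gflops, base_watts = 240, 172          # placeholder baseline
new_gflops, new_watts = ideal_shrink(base_gflops, base_watts)

print(f"baseline: {perf_per_watt(base_gflops, base_watts):.2f} GFLOPS/W")
print(f"one ideal shrink: {perf_per_watt(new_gflops, new_watts):.2f} GFLOPS/W")
```

Real designs fall short of this ideal, which is the poster's point: staying at 45nm caps how far half the power draw can stretch.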


I hadn't thought that the WiiU could have two different types of RAM. It's possible. You could have 2 slow DDR3 chips on a 32-bit bus for the OS/apps and 4 GDDR5 chips on a 64-bit bus for games. In theory that could provide anywhere from 25-50 GB/s. Eventually that could be reduced down to one 8Gb DDR3 chip and two 4Gb GDDR5 chips. Long term, that would probably be fairly cheap.
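A quick check of that 25-50 GB/s estimate. Only the bus widths come from the post above; the data rates are assumptions picked for illustration.

```python
# Sanity-checking the split-pool bandwidth estimate. Bus widths are from
# the post; the data rates are assumptions for the sake of arithmetic.

def bw_gbs(bus_width_bits, data_rate_mts):
    return (bus_width_bits / 8) * data_rate_mts / 1000  # GB/s

os_pool = bw_gbs(32, 1600)     # DDR3-1600 on a 32-bit bus (assumed rate)
game_pool = bw_gbs(64, 4000)   # 4 GT/s GDDR5 on a 64-bit bus (assumed rate)

print(f"OS pool: {os_pool} GB/s, game pool: {game_pool} GB/s")
# The assumed GDDR5 pool lands at 32 GB/s, inside the quoted 25-50 GB/s range.
```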
 
What interests me about the WiiU GPU the most is, not the raw spec, but rather, the collective experience AMD has gained from all of its acquisitions--such as ArtX, Real3D and ATI. Probably others too, that I'm just not aware of.
 

(Referencing the suggestion above that the OS's 1 GB might later be opened up to developers) Not necessarily... I would highly doubt that Nintendo would open all or even most of that reserved 1GB of RAM to games. However, it may be possible that in the future, depending on what OS features they decide to implement, they may release some of that RAM, e.g. 256-320MB, to games. Similar to how they later allowed developers to use up to 25%, IIRC, of the second ARM11 core for games on the 3DS.

I have also thought about the possibility that the OS RAM is not just a sectioned-off part of the main memory but a separate, and likely slower, pool. In which case I would doubt that they would ever open it up for games. Such a case would remind me of the GCN, where the separate 16MB of DRAM was almost useless to devs because it was only accessible through an 8-bit bus and had very low bandwidth.
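For context on why that GCN pool was so hamstrung: the 8-bit width is from the post above, and the clock here is an assumed round figure for illustration only.

```python
# Rough bandwidth of a very narrow bus, as in the GCN auxiliary-RAM example.
# The 8-bit width comes from the post; the clock is an assumed placeholder.
bus_bytes = 8 // 8           # an 8-bit bus moves 1 byte per transfer
clock_mhz = 81               # assumed clock for illustration
bandwidth_mbs = bus_bytes * clock_mhz
print(f"~{bandwidth_mbs} MB/s")   # tiny next to a multi-GB/s main bus
```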
 
Wii U memory can't be GDDR5, because that would mean Nintendo put 32 MB of eDRAM in there just for the sake of it and wasted a good amount of money and transistor budget on something they don't need.

GDDR5 = high GPU bandwidth, no need for eDRAM for the frame buffer.
DDR3 = low GPU bandwidth, need eDRAM for the frame buffer to work around the low bandwidth of main memory.

Wii U doesn't have GDDR5, and if it did, that would definitely bring TDP and costs up, and both look very low as it is now.

I would say that if Nintendo took an existing R7xx design and worked on that, it would be the RV730. It's a 128-bit card with DDR3 and 480 GFLOPs, so that would be the ballpark they would go for. Its highest TDP is 60W, manufactured on a 55nm process. Shrinking and customizing would bring that power draw down to ~30-40W, and that's the max for the Wii U.
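The frame-buffer argument above is easy to sanity-check: does a typical render target fit in 32 MB of eDRAM? The formats below (32-bit color, 32-bit depth, double-buffered) are common examples, not confirmed Wii U render targets.

```python
# Frame-buffer footprint check for the eDRAM argument. Formats are common
# examples (32-bit color, 32-bit depth), not confirmed Wii U targets.

def framebuffer_mib(width, height, color_bytes=4, depth_bytes=4, color_buffers=2):
    """Double-buffered color plus a single depth buffer, in MiB."""
    color = width * height * color_bytes * color_buffers
    depth = width * height * depth_bytes
    return (color + depth) / (1024 ** 2)

print(f"720p:  {framebuffer_mib(1280, 720):.1f} MiB")   # ~10.5 MiB
print(f"1080p: {framebuffer_mib(1920, 1080):.1f} MiB")  # ~23.7 MiB
# Both fit comfortably in 32 MB, which is the point of pairing eDRAM with DDR3.
```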
 