Wii U hardware discussion and investigation *rename

The gamepad IS much more complex than a $50 Chinese Android tablet
The build quality is definitely better than those Chinese tablets, but Nintendo tend to use cheap internal components for their units. So I have no doubt things like the front-facing camera, accelerometers, and gyros will be in the ballpark of cheap devices. Even the LCD screen isn't exactly cutting edge compared to the average Android tablet.
 
The PS3 sells for $100 less and includes an HDD, more elaborate cooling, and a larger PSU. That leaves the wuublet: does it really cost $100 more than a DualShock? I am skeptical.
 
I expected the Wii U would be able to render current-generation-level graphics at 1080p without much problem, at least.
There is no excuse. If a so-called next-generation console only manages to keep up with a 7-year-old console, it is pathetic no matter how you put it.
 
The PS3 sells for $100 less and includes an HDD, more elaborate cooling, and a larger PSU. That leaves the wuublet: does it really cost $100 more than a DualShock? I am skeptical.
I am skeptical too, though I don't think that when Nintendo speak of losing money they speak of the BOM alone.
And I agree with SargonHamurabi: the performance is kind of unacceptable, IMO, for a design that comes 7 years later than the 360 (even though both the 360 and the PS3 were getting close to the power wall). At least the Wii was more powerful than the GameCube, which was in the same ballpark as the Xbox and PS2, the consoles of its generation.

I read the DF article about the Wii U; the comparison they make with their £300 PC is not flattering, to say the least. Nothing unexpected, sadly.

I expected the Wii U would be able to render current-generation-level graphics at 1080p without much problem, at least.
There is no excuse. If a so-called next-generation console only manages to keep up with a 7-year-old console, it is pathetic no matter how you put it.
Well, I would not push it that far, but even sticking to 720p there should be neat and visible improvements in IQ across the board (from texture quality to texture filtering, shadows, lighting, some form of anti-aliasing, etc.).
 
So it has a dual-core ARM in there, as well as a separate DSP? Anyone want to have a stab at the reasoning behind that?
The Wii had an ARM CPU called "Starlet" that worked as a sort of middleman for security, I/O, and standby-mode operation. It was actually one of the major differentiators between the Wii and the Cube.

The DSP may be related to backwards compatibility, since the Wii/GameCube used one for audio processing.
 
So it has a dual-core ARM in there, as well as a separate DSP? Anyone want to have a stab at the reasoning behind that?
Where have you seen dual-core?
An ARM CPU in the northbridge/GPU would be kind of predictable because there was one already in the Wii, but I haven't seen anything about it being dual-core.
That would mean an even smaller/weaker GPU.


The gamepad IS much more complex than a $50 Chinese Android tablet

How is the gamepad "much" more complex? See the iFixit teardown for yourself. There's no SoC, no RAM, and a measly 32 MB of cheap local storage.
The only "advanced" part that may go in there is the dual-band Broadcom Wireless-N module that *may* be working at 5 GHz to ensure a better connection with the console. The rest is just a bunch of hardware decoders, and call it a day.
How is that "much" more complex? How is it more complex at all?
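For scale, here's a rough sizing of the video stream those decoders have to handle; the panel resolution is documented, but the bits-per-pixel and compression figures below are my assumptions, not published specs:

Code:
# Rough sizing of the GamePad video stream (assumed figures, not official specs).
width, height, fps = 854, 480, 60        # the GamePad's documented panel resolution
raw_bps = width * height * fps * 12      # ~12 bits/pixel for 4:2:0 YUV, pre-compression
compression = 50                         # assumed H.264-class ratio for low-latency encoding
stream_mbps = raw_bps / compression / 1e6
print(round(raw_bps / 1e6), round(stream_mbps, 1))   # ~295 Mb/s raw, ~5.9 Mb/s compressed

A sustained stream on that order, with tight latency requirements, is exactly the kind of job you hand to fixed-function decoders and a clean 5 GHz link rather than a general-purpose SoC.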
 
The Wii had an ARM CPU called "Starlet" that worked as a sort of middleman for security, I/O, and standby-mode operation. It was actually one of the major differentiators between the Wii and the Cube.

The DSP may be related to backwards compatibility, since the Wii/GameCube used one for audio processing.

Oh, ok. Good points.


Where have you seen dual-core?
An ARM CPU in the northbridge/GPU would be kind of predictable because there was one already in the Wii, but I haven't seen anything about it being dual-core.
That would mean an even smaller/weaker GPU.

Wow, that's a big conclusion to jump to, isn't it? We don't know enough about the ARM chip to make a call about its power draw, impact on other hardware, etc. We also know nothing about the GPU to start with, so what are we basing this assumption on?

Dual-core, I must admit, was just from blu's tech thread on GAF. The way everyone was chatting about it, I figured it was confirmed...
 
Wait wait wait: in the previously posted teardown there are 4 separate DDR3-1600 chips (512 MB each) on the board. Would it be that each of them has a bandwidth of 12.8 GB/s individually, and so there is a total bandwidth of 51.2 GB/s for the RAM, or would it just flatly be 12.8 GB/s?

If it's the first case, then I can see why the ports still came out the way they did. They were not developed to access the RAM in such a fashion, so the de facto bandwidth was very low.
 
Wait wait wait: in the previously posted teardown there are 4 separate DDR3-1600 chips (512 MB each) on the board. Would it be that each of them has a bandwidth of 12.8 GB/s individually, and so there is a total bandwidth of 51.2 GB/s for the RAM, or would it just flatly be 12.8 GB/s?

If it's the first case, then I can see why the ports still came out the way they did. They were not developed to access the RAM in such a fashion, so the de facto bandwidth was very low.
Are you suggesting each RAM chip has its own data bus?
 
Wait wait wait: in the previously posted teardown there are 4 separate DDR3-1600 chips (512 MB each) on the board. Would it be that each of them has a bandwidth of 12.8 GB/s individually, and so there is a total bandwidth of 51.2 GB/s for the RAM, or would it just flatly be 12.8 GB/s?

If it's the first case, then I can see why the ports still came out the way they did. They were not developed to access the RAM in such a fashion, so the de facto bandwidth was very low.

No offense, but don't you think that if there were even a slight chance of what you are describing being true, it would have already been pointed out?

12.8 GB/s is the total.
 
How is the gamepad "much" more complex? See the iFixit teardown for yourself. There's no SoC, no RAM, and a measly 32 MB of cheap local storage.
The only "advanced" part that may go in there is the dual-band Broadcom Wireless-N module that *may* be working at 5 GHz to ensure a better connection with the console. The rest is just a bunch of hardware decoders, and call it a day.
How is that "much" more complex? How is it more complex at all?

NFC
Camera
Gyro
Accelerometer
Magnetometer
Microphone
Bunch of DSPs
IR send/receive
Dual analogs
Nintendo QA
 
*ahem* This thread is about the speculative look at the Wii U GPU. Discussion about other topics will continue to be removed. Continuing to post off-topic will result in bans.
 
What are the benefits of the 12.8 GB/s on the MEM2 pool of RAM?

Not a whole lot, and it will be difficult to fully gauge until we know the bandwidth figures for the eDRAM in the GPU. If we're talking staggeringly high bandwidth there, it will mitigate the low main-memory bandwidth much better than eDRAM with low bandwidth would. 12.8 GB/s is really slow for a new system coming out, and it really will make it hard on developers at first (as is evident) to get games up to parity with the PS360. There are practically zero advantages to having 12.8 GB/s with 2 GB of memory when your direct competition has a higher level of bandwidth. The eDRAM here is basically the wildcard in determining just how capable the Wii U is memory-wise. It could be that devs who did magic with the PS2 can repeat that with the Wii U.
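As a rough illustration of why the eDRAM is the wildcard, here's a toy harmonic-mean model of effective bandwidth; the 70 GB/s eDRAM figure is a pure placeholder assumption, since no real number has been published:

Code:
# Toy model: effective bandwidth when a fraction of memory traffic is served
# by fast eDRAM and the rest spills to DDR3. The eDRAM figure is a placeholder.
def effective_bandwidth(edram_fraction, edram_bw=70.0, ddr3_bw=12.8):
    """Harmonic-mean mix (GB/s): time spent per pool is proportional to
    bytes moved divided by that pool's bandwidth."""
    return 1.0 / (edram_fraction / edram_bw + (1.0 - edram_fraction) / ddr3_bw)

for f in (0.0, 0.5, 0.8, 0.95):
    print(f, round(effective_bandwidth(f), 1))
# 0.0 -> 12.8, 0.5 -> 21.6, 0.8 -> 37.0, 0.95 -> 57.2 (GB/s)

The point: the main-memory figure only stops hurting if developers keep the overwhelming majority of bandwidth-heavy traffic (framebuffer, frequently reused textures) inside the eDRAM.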

I do hope we learn more about the eDRAM and how it all fits in with the GPU in the near future.
 
Wait wait wait: in the previously posted teardown there are 4 separate DDR3-1600 chips (512 MB each) on the board. Would it be that each of them has a bandwidth of 12.8 GB/s individually, and so there is a total bandwidth of 51.2 GB/s for the RAM, or would it just flatly be 12.8 GB/s?

Each of the DRAMs is specified as having 16-bit I/O, hence a 64-bit bus in total.
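To make that concrete, here's a quick sanity check assuming stock DDR3-1600 rates: even counting the chips separately, a 16-bit part can only move 3.2 GB/s, so four of them land on the same 12.8 GB/s total as one shared 64-bit bus.

Code:
# Sanity check of the bus maths (stock DDR3-1600 figures assumed).
mt_per_s = 1600e6            # DDR3-1600: 1600 MT/s per data pin
chip_io_bits = 16            # each DRAM is a x16 part, per the chip markings
num_chips = 4

per_chip_gbs = mt_per_s * chip_io_bits / 8 / 1e9   # 3.2 GB/s per chip
total_gbs = per_chip_gbs * num_chips               # 12.8 GB/s over the 64-bit bus
print(per_chip_gbs, total_gbs)                     # 3.2 12.8

The hypothetical 51.2 GB/s would only hold if each chip somehow had its own 64-bit interface, which x16 parts physically don't.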
 
Well, I would guess that if they take into account all the launch expenses, it is pretty easy to understand.
From the company's POV, financially, the launch expenses are part of the costs; there is no reason to discard them.
Say the hardware itself is barely in the black: add the marketing campaign, the new online infrastructure, etc., and you get there fast. You have to tell the investors that you might lose some money.

Actually, it really seems like they're incompetent, since the 'selling below cost' claim only takes into account manufacturing costs:

In addition to the yen’s continuous appreciation, the Wii U hardware will have a negative impact on Nintendo’s profits early after the launch because rather than determining a price based on its manufacturing cost, we selected one that consumers would consider to be reasonable. In this first half of the term before the launch of the Wii U, we were not able to make a profit on software for the system while we had to book a loss on the hardware, which is currently in production and will be sold below cost.

http://www.nintendo.co.jp/ir/en/library/events/121025/04.html

Even DF finds it hard to believe that Nintendo is managing to make a loss on the hardware at $300.
 
I have no doubt it took some creative accounting to make such a self-serving statement in the face of so much criticism about the price.
 