Wii U hardware discussion and investigation *rename

Status
Not open for further replies.
I wonder if someone will make it possible to use the tablet controller through a 5 GHz Wi-Fi router.
Then maybe set it up as a controller in Windows along with an 800×480 secondary screen. Or better yet, a downscaled 1080p cloned screen with touch support for Win8 in TV scale mode.

Now that would rock in my media center.


EDIT:
Turns out this is going to be much better. A $150 tablet with an embedded gamepad.

Get Splashtop on it and it's a go.
 
Last edited by a moderator:
It's custom at the protocol level of the network connection (instead of standard TCP/IP you have something else; "Nintendovision" or whatever), but not at the lower link level, as you want to remain compatible with other wireless networking gear to avoid problems, and maybe regulatory issues as well.

I think it's the other way around.

The protocol level would probably be based on TCP/IP, or something like it.
I don't see how changing anything above Layer 2 of the OSI model would make sense. Collisions would occur at Layer 1 with other devices, and customizing Layer 2 or above wouldn't solve anything. Customizing Layer 1 to prevent frame collisions would be much more efficient and less problematic than customizing the other layers.

To remain compatible with other devices, I see the Wii U as having two different wireless interfaces:
one for standard Wi-Fi that works with existing 802.11 b/g/n devices, and another using a custom spectrum to deal with multiple Wii Us and GamePads in the same environment.
 
I wonder if someone will make it possible to use the tablet controller through a 5 GHz Wi-Fi router.
Then maybe set it up as a controller in Windows along with an 800×480 secondary screen. Or better yet, a downscaled 1080p cloned screen with touch support for Win8 in TV scale mode.

Now that would rock in my media center.

Probably not, I think. Same reason the DS and 3DS on their special "DS networks" didn't work with our Wi-Fi USB cards, leaving many games inaccessible through tools like XLINK.
 
A colleague of mine installed something on his PC that can basically stream anything to almost any Android or iOS device. So I'm not sure why being able to do it to the GamePad would be such a big win...
 
I think it's the other way around.

The protocol level would probably be based on TCP/IP, or something like it.
I don't see how changing anything above Layer 2 of the OSI model would make sense. Collisions would occur at Layer 1 with other devices, and customizing Layer 2 or above wouldn't solve anything. Customizing Layer 1 to prevent frame collisions would be much more efficient and less problematic than customizing the other layers.

To remain compatible with other devices, I see the Wii U as having two different wireless interfaces:
one for standard Wi-Fi that works with existing 802.11 b/g/n devices, and another using a custom spectrum to deal with multiple Wii Us and GamePads in the same environment.

Well, let's say for the sake of simplicity that Layer 1 is the physical aspects of 5 GHz Wi-Fi and Layer 2 is Ethernet.
I don't see why you would change any of that; it seems to work fine: multiple Wi-Fi devices do work without shutting each other out, unless there is a lot of congestion. It all works already; 802.11n even allows devices to use multiple channels, for better or worse, or at least to switch between them seamlessly to get the packets across, if I'm not mistaken.

If you suffer so much congestion that your Wii U becomes unusable, well, bad luck. It shouldn't happen: 5 GHz waves have a harder time crossing walls and obstacles, and 5 GHz Wi-Fi has a pretty wide spectrum.

Then using something other than TCP/IP is absolutely no big deal, I think. I remember ~10 years ago when I had TCP/IP, IPX (for games such as Doom 2), and NetBIOS all on the same 10 Mb Ethernet network (we had two 98SE PCs, one XP and one 3.11).
Using a custom, non-IP protocol is worth it: there would be absolutely no downside to it, and whatever gains you get in latency, better-sized packets, and simplicity are free.
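The overhead argument is easy to put numbers on. A rough sketch (header sizes are the standard minimums; the 4-byte custom header is purely hypothetical):

```python
# Back-of-envelope per-packet overhead: the standard stack vs. a minimal
# custom protocol. Header sizes are the standard minimums; the 4-byte
# custom header is purely hypothetical.
ETH_HEADER = 14      # Ethernet II header, bytes
IP_HEADER = 20       # IPv4 header without options
TCP_HEADER = 20      # TCP header without options
CUSTOM_HEADER = 4    # hypothetical: 2-byte sequence number + 2-byte length

PAYLOAD = 1400       # bytes of video data per packet

def overhead(headers):
    h = sum(headers)
    return h, h / (h + PAYLOAD)

for name, hdrs in [("TCP/IP", [ETH_HEADER, IP_HEADER, TCP_HEADER]),
                   ("custom", [ETH_HEADER, CUSTOM_HEADER])]:
    h, frac = overhead(hdrs)
    print(f"{name}: {h} header bytes per packet ({frac:.1%} of the frame)")
```

The byte savings are small; the real wins of a custom protocol would be skipping TCP's retransmission and ordering machinery, which is exactly what you don't want for low-latency video.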
 
A colleague of mine installed something on his PC that can basically stream anything to almost any Android or iOS device. So I'm not sure why being able to do it to the GamePad would be such a big win...

I've seen that app, I think; it can transcode movies on the fly, and he can view them wherever there's a fast enough internet connection. But watching a movie on a portable device is one thing; when you also want to use it as a controller, latency suddenly becomes very important.
 
If you suffer so much congestion that your Wii U becomes unusable, well, bad luck. It shouldn't happen: 5 GHz waves have a harder time crossing walls and obstacles, and 5 GHz Wi-Fi has a pretty wide spectrum.
In the range they have chosen, there are only four channels available (36, 40, 44, 48). I don't have a Wii U yet; I'm curious whether they let you change the channel to avoid interference, whether it's automatic and/or spread spectrum (Bluetooth style), or whether they need all four of them.
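For reference, 802.11 numbering maps those channels to center frequencies with a simple rule:

```python
# Center frequencies of the four channels named above, from the 802.11
# numbering rule: center_MHz = 5000 + 5 * channel_number.
def center_mhz(channel):
    return 5000 + 5 * channel

for ch in (36, 40, 44, 48):
    print(f"channel {ch}: {center_mhz(ch)} MHz")
# The centers sit 20 MHz apart, so all four channels fit side by side
# without overlapping at the standard 20 MHz channel width.
```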
 
Given the resolution and likely compression ratios, there would be more than enough bandwidth on a single Wi-Fi channel, and they can easily increase compression ratios and reduce bandwidth dynamically if they start having issues.
Again, streaming video over a Wi-Fi connection with relatively little latency isn't rocket science.
The only reason people think the low latency to the Wii U tablet is impressive is that most modern TVs suck and have hideous latency.
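A quick sanity check of that bandwidth claim, using the 800×480 resolution mentioned earlier in the thread; the compression ratios are assumptions, not known figures:

```python
# Rough bandwidth check for the GamePad stream, using the 800x480
# resolution mentioned earlier in the thread; the compression ratios
# are assumptions, not known figures.
width, height, fps, bits_per_pixel = 800, 480, 60, 24

raw_mbps = width * height * bits_per_pixel * fps / 1e6   # uncompressed
print(f"raw: {raw_mbps:.0f} Mbit/s")
for ratio in (50, 100, 200):
    print(f"{ratio}:1 compression -> {raw_mbps / ratio:.1f} Mbit/s")
# A single 20 MHz 802.11n channel tops out around 72 Mbit/s at the PHY
# layer, so even 50:1 compression leaves plenty of headroom.
```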
 
Does it really matter whether it's H.264 or MJPEG?
Only in trying to understand:
1) Whether the Wuublet screen imposes a need for v-sync on the main TV. Apparently not, as the source needn't be a complete image but can be sent piecemeal.

2) What the decoding hardware in the Wuublet is, and how it will affect power consumption.
 
Only in trying to understand:
1) Whether the Wuublet screen imposes a need for v-sync on the main TV. Apparently not, as the source needn't be a complete image but can be sent piecemeal.

Wouldn't matter; the only requirement is that after the frame finishes, the buffer is available for long enough to do the compression. It doesn't have to be on screen, and it doesn't matter if it's v-synced. They probably insert a fence at the point the swap is inserted into the command buffer, and then use some sort of CPU-side sync primitive to hold up primitive submission if it gets too far ahead.
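That throttling scheme can be pictured as a toy producer/consumer model; the semaphore stands in for the GPU fence plus CPU-side sync primitive, and everything here is illustrative rather than actual Wii U code:

```python
import threading

# Toy model of the throttling described above: the renderer may run at
# most MAX_IN_FLIGHT frames ahead of the encoder. The semaphore stands in
# for the GPU fence / CPU-side sync primitive; this is illustrative only.
MAX_IN_FLIGHT = 2
slots = threading.Semaphore(MAX_IN_FLIGHT)
encode_queue = []   # finished frames waiting for compression
encoded = []        # frames already compressed and "sent"

def render_frames(n):
    for frame in range(n):
        slots.acquire()             # blocks if we are too far ahead
        # ... draw calls, then swap + fence would be inserted here ...
        encode_queue.append(frame)

def encoder_loop(n):
    while len(encoded) < n:
        if encode_queue:
            frame = encode_queue.pop(0)
            encoded.append(frame)   # compress + stream to the GamePad
            slots.release()         # "fence signaled": renderer may go on

t = threading.Thread(target=encoder_loop, args=(10,))
t.start()
render_frames(10)
t.join()
print(encoded)   # all 10 frames, in order
```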

2) What the decoding hardware in the Wuublet is, and how it will affect power consumption.

If you exclude B and P frames, they are very close to each other (macroblocks are different sizes, and quantization is slightly different). There are hardware solutions available for both, and the only solution I've ever seen the actual hardware documentation for was programmable enough to do either.
 
Well, let's say for the sake of simplicity that Layer 1 is the physical aspects of 5 GHz Wi-Fi and Layer 2 is Ethernet.
I don't see why you would change any of that; it seems to work fine: multiple Wi-Fi devices do work without shutting each other out, unless there is a lot of congestion. It all works already; 802.11n even allows devices to use multiple channels, for better or worse, or at least to switch between them seamlessly to get the packets across, if I'm not mistaken.

If you suffer so much congestion that your Wii U becomes unusable, well, bad luck. It shouldn't happen: 5 GHz waves have a harder time crossing walls and obstacles, and 5 GHz Wi-Fi has a pretty wide spectrum.

Then using something other than TCP/IP is absolutely no big deal, I think. I remember ~10 years ago when I had TCP/IP, IPX (for games such as Doom 2), and NetBIOS all on the same 10 Mb Ethernet network (we had two 98SE PCs, one XP and one 3.11).
Using a custom, non-IP protocol is worth it: there would be absolutely no downside to it, and whatever gains you get in latency, better-sized packets, and simplicity are free.

According to the wiki, they are using the 5 GHz spectrum, which is different from the usual 802.11n spectrum (the 2.4 GHz range), and some type of proprietary transfer protocol. The technology seems to be the same, just using a spectrum that currently no off-the-shelf Wi-Fi device uses.

This tells me that they took 802.11 and adapted it to the 5 GHz spectrum to take care of the first two layers, and customized the third and fourth layers to their requirements, as they would not need the complexity and functionality that the TCP/IP stack provides.

So yes, they customized the whole damn thing, for better or worse.

Isolating the Wii U's Wi-Fi network by using 802.11n technology on a completely unused spectrum would solve pretty much all the congestion issues it might have, in contrast to using the standard spectrum.

Of course today's Wi-Fi has plenty of bandwidth to stream compressed video; there's no doubt about that.
However, the problem with sharing a wireless spectrum is that wireless uses collision avoidance.
Two devices cannot transmit on the same channel at the same time, or else you end up with destroyed frames (which is also the reason it's half-duplex). Yes, you could use different channels, but the number of channels in the 2.4 GHz range is limited, and you cannot control how many channels are being taken and "abused" by other devices.
The problem occurs when some devices in the existing Wi-Fi environment are transmitting a shitload of stuff (think conferences), so you would want a completely separate, unused spectrum to maintain reliability and limit interference.


I'm kind of getting lost on the direction of this discussion, but anyway, based on the current information we probably won't see people hacking some router or USB Wi-Fi stick to communicate with the GamePad, as the devices are already incompatible from the first layer.
 
http://www.pcper.com/reviews/General-Tech/Nintendo-Wii-U-Teardown-Photos-and-Video
UPDATE: We have confirmation from Polygon.com that this controller does in fact power the gamepad, utilizing a version of Miracast:
"The technology, co-developed by Nintendo and wireless and broadband communications giant Broadcom, marries run-of-the-mill Wi-Fi with a powerful bit of proprietary software to create a two-way stream of low-latency, high-definition video and controls between the Wii U and its innovative GamePad.
That complex suite of software is designed to mitigate interference and deliver a smooth video signal and communication speeds, said Dino Bekis, senior director of wireless connectivity at Broadcom.
The technology is built on top of something called Wi-Fi Miracast, which Broadcom first developed last summer. It's a system that is specifically designed to deal with interference issues while maintaining liquid fast two-way communication.
Broadcom and Nintendo then teamed up to create a more solid system for the Wii U-to-GamePad connection."
I can't find any public spec for Miracast.

Edit: I don't think this makes any difference to the 3d performance... and it's not particularly interesting.

The only obvious gotcha is whether the API provides "wiiu_screen_is_rendered_please_send_this_data_to_wublet()" or "wiiu_miracast_enable_hardware_screen_mirror_at_vsync(1)".
 
Seems like we could have a breakthrough on Wii U clock speeds from some hacker guy on Twitter

https://twitter.com/marcan42


@0xabad1dea we're calling the WiiU security processor the Starbuck (vs. Starlet on Wii). And it seems to be about equally vulnerable, too.

Hector Martin @marcan42

Wii U codenames worth knowing: system Cafe, CPU Espresso, GPU/SoC/etc. Latte, ARM secure processor Starbuck (we made that one up).

Hector Martin @marcan42

@digitalfoundry 1.243125GHz, exactly. 3 PowerPC 750 type cores (similar to Wii's Broadway, but more cache). GPU core at 549.999755MHz.
 
Seems like we could have a breakthrough on Wii U clock speeds from some hacker guy on Twitter

https://twitter.com/marcan42
1.24 GHz and no changes (besides L2 cache sizes) to the cores is somewhat disappointing. If that is true, Nintendo probably wanted to minimize the development costs they had to pay IBM. I guess slapping on VMX support and using a slightly more modern implementation of the ISA (with one or two more pipeline stages, resulting in a ~2 GHz clock speed) would have busted the budget N allowed for the CPU.

550 MHz for the GPU sounds reasonable considering the power consumption. Bets are still open for the architecture (R700 or R800/Evergreen) and the exact size. But given the CPU, I'm inclined to guess just four RV770-style (full-size) SIMDs (320 SPs in total) with 4 TMUs each (16 total). Compared to the XBox 360's Xenos, that would mean +65% theoretical flops (it's more flexible, so it may equate to roughly twice the arithmetic throughput on average in the real world) but just 10% faster texturing. And fitting that low level, there could even be just 8 ROPs. But let's wait and see.
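The flops and texturing comparisons are easy to recompute; all inputs below are this post's guesses (320 SPs at 550 MHz) plus commonly quoted Xenos figures, not confirmed specs. The quoted +65% roughly matches the 216 GFLOPS Xenos figure:

```python
# Checking the back-of-envelope GPU numbers above; all inputs are the
# poster's guesses (320 SPs at 550 MHz), not confirmed specs.
wiiu_gflops = 320 * 2 * 0.550      # MADD = 2 flops per SP -> 352 GFLOPS

# Xenos (XBox 360): 48 ALUs at 500 MHz; 216 or 240 GFLOPS depending on
# how the co-issued scalar op is counted.
for xenos_gflops in (216.0, 240.0):
    print(f"vs {xenos_gflops:.0f} GFLOPS Xenos: "
          f"{wiiu_gflops / xenos_gflops - 1:+.0%}")

# Texturing: 16 TMUs on both parts, so the gain is just the clock ratio.
tex_gain = 550 / 500 - 1
print(f"texturing: +{tex_gain:.0%}")   # +10%
```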
 
More shading power but not much more texturing capacity would fit fine with most of Nintendo's own games using some type of NPR rendering. They could do more complex illumination models, more polygons per character, lots of animated scenery elements and so on, but still use a single color texture per surface, like the previous Zelda or Mario games. Okay, the levels themselves would probably still require 2-3 layers at different scales to hide tiling repetition, but that's still no normal, spec, and gloss maps.

This NPR art style would also help keep asset production costs low: no need for normal maps, so no need to build or sculpt high-poly meshes, less need for troubleshooting the assets, and so on. So lower development costs overall.


But why make the system capable of receiving 3rd-party ports in theory, and then cripple it in practice? It still doesn't make much sense...
 
Surely three texture layers can't be prohibitively expensive, rendering-wise; even the original Unreal Engine did that at playable framerates* on the hardware available back in the late 1990s.

*Depends on your definition of playable, if not running on a Voodoo Graphics board, of course... :LOL:
 
http://www.pcper.com/reviews/General-Tech/Nintendo-Wii-U-Teardown-Photos-and-Video
I can't find any public spec for Miracast.

Edit: I don't think this makes any difference to the 3d performance... and it's not particularly interesting.

The only obvious gotcha is whether the API provides "wiiu_screen_is_rendered_please_send_this_data_to_wublet()" or "wiiu_miracast_enable_hardware_screen_mirror_at_vsync(1)".

Turns out that the wireless technology is awfully depressing :rolleyes:
I totally thought that they should, and would, steer clear of Wi-Fi channels, but instead they didn't. Oh well.

I eat my words.
 
Seems like we could have a breakthrough in Wii U clockspeeds from some hacker guy on twitter
https://twitter.com/marcan42

Even with these low clocks, some high-profile PS360 games like Assassin's Creed 3 run as well on the Wii U as on the PS360. There must really be some magic in the Wii U :)

Marcan got this in the Wii Mode of the Wii U. Could the specs be different in Wii Mode versus normal Wii U mode? The Wii U CPU could also be clocked lower when it is idle.

Many questions, but Marcan is silent now. It is a pity that marcan doesn't offer more, and more reliable, information. I don't really know if he is credible anyway.
 