Fact: Nintendo to release HD console + controllers with built-in screen late 2012

I was just trying to play some Wii games in 3D on the Dolphin Emulator when it occurred to me that if "Project Cafe" is intended to support 3D at all, they may have to come up with a different solution for the pointer interface. Most 3D displays have an IR LED to sync with the shutter glasses, which could interfere with the Wiimote's tracking of the IR Sensor Bar. I'm not having any problems with my emulation setup, but my 3D Vision IR emitter may operate differently from the emitters on TVs.
 
I bet there will be embedded RAM in Cafe. It's got to at least match Xbox 360's Xenos, which has 256GB/s of bandwidth on-chip.

A modern GPU doesn't need edram to match Xenos. An RV740-based GPU would wipe the floor with Xenos, with or without edram. It's simply not needed to meet that requirement.
 
Yeah, the embedded memory in the GC/Wii didn't have crazy huge bandwidth like the PS2's GS. IIRC, it was something like 13 GB/s compared to the PS2's 48 GB/s.
 
a 1500mAh battery to power a 6" screen? That would be good for what? 2-3 hours tops?

Two, three hours might be sufficient if the screen only comes on when it's required. Personally, I doubt the screen will function as the primary display in any situation. The benefit of using it to avoid split-screen in multiplayer games or to substitute for an unavailable television set is marginal at best. It makes little sense for Nintendo to spec the controller for such purposes. A controller is an input device first and foremost. The screen will mainly provide visual feedback in game situations involving touch or motion controls, I think.
 
RV740 is more than capable of emulating Wii/GC without edram. It's already doing it on the PC via Dolphin, and Nintendo's implementation would be far better optimised.


But you can't emulate a physical advantage like memory bandwidth in real time.
The Wii's eDRAM is supposedly in the ~27GB/s range (GC's ~18GB/s x 1.5); add the ~4GB/s from the 1T-SRAM and probably another ~3GB/s (?) from the GDDR3 and you get a peak of ~34GB/s of combined memory bandwidth.

If you're using a single memory pool, you have these options (quick math for the bandwidth figures is sketched right after the list):

- Have no BC at all
- An APU with >3MB of L3 cache shared between GPU and CPU, letting the GPU use the L3 cache for textures and the frame buffer.
- Limit the BC to games that won't reach a certain bandwidth threshold (I don't even know how doable that would be).
- Get some eDRAM in the GPU to compensate for the Wii's 27GB/s eDRAM.
- UMA 128-bit GDDR3 @2200MHz+ (35GB/s)
- UMA 192-bit DDR3 @1600MHz+ (38.4GB/s)
- UMA 256-bit DDR3 @1066MHz+ (34.1GB/s)
- UMA 128-bit GDDR5 @3.2GHz+ (51.2GB/s)
- UMA of some MoSys 1T-SRAM combination, like 128-bit 2GHz? I wouldn't know..
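
For what it's worth, the UMA figures above are just bus width times effective transfer rate; here's a rough Python sketch of that math (my own sanity check, nothing official), including the ~34GB/s combined Wii figure:

def bandwidth_gbs(bus_bits, effective_mts):
    # peak bandwidth in GB/s = (bus width in bytes) * (effective transfer rate in GT/s)
    return (bus_bits / 8) * (effective_mts / 1000)

# UMA options from the list
print(bandwidth_gbs(128, 2200))  # GDDR3 @ 2200MT/s effective -> 35.2
print(bandwidth_gbs(192, 1600))  # DDR3  @ 1600MT/s           -> 38.4
print(bandwidth_gbs(256, 1066))  # DDR3  @ 1066MT/s           -> ~34.1
print(bandwidth_gbs(128, 3200))  # GDDR5 @ 3.2GT/s            -> 51.2

# combined Wii peak quoted earlier: ~27 (eDRAM) + ~4 (1T-SRAM) + ~3 (GDDR3)
print(27 + 4 + 3)                # ~34GB/s, the BC target a UMA design would have to cover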



The benefit of using it to avoid split-screen in multiplayer games or to substitute for an unavailable television set is marginal at best. It makes little sense for Nintendo to spec the controller for such purposes. A controller is an input device first and foremost. The screen will mainly provide visual feedback in game situations involving touch or motion controls, I think.

If that was true, I doubt Nintendo would invest in a 6" screen, as a ~3" would suffice for that purpose.

Being able to play the games away from the TV+console will most definitely be one of the main selling points. To play the "full-fledged" games without having to isolate yourself or the whole living room is definitely interesting from a social point of view.

But this brings another factor in the discussion: sound.
Will the controller have stereo speakers (if so, what quality? how loud?) and/or a headphone jack? Or maybe even a special digital output for custom 5.1 headphones?
Or an analog jack output capable of doing either stereo or Dolby Pro Logic IIx, with 3rd parties being able to launch custom 7.1/5.1 headphones with a built-in DPLIIx decoder?
 
The Wiimote already has a speaker, even though it's not exactly great. With all the rumors about the controller, they could probably easily afford putting in a pair of speakers alongside all the big touch screens lol. Though I don't really see a point in 5.1/7.1 options. Really, apart from PR talk, 99.9% of the people do not even own headphones/earbuds that can be called pretty decent. Let alone high end headphones. Seems like a waste of money to me.
 
The Wiimote already has a speaker, even though it's not exactly great. With all the rumors about the controller, they could probably easily afford putting in a pair of speakers alongside all the big touch screens lol.

But loud-enough speakers would draw a lot of power. 2*1W speakers would consume more than 2W, and that's a sizeable draw if you're concerned about battery life in a "handheld".
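
For scale, assuming the rumoured 1500mAh pack is a typical single-cell lithium battery at a nominal 3.7V (my assumption, nothing confirmed):

battery_wh = 1.5 * 3.7           # 1500mAh at 3.7V -> ~5.6Wh of stored energy
speaker_w = 2.0                  # two 1W speakers driven hard (plus amp losses on top)
print(battery_wh / speaker_w)    # ~2.8 hours from the speakers alone, before the screen or radio draw anything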

The speaker in the Wiimote is puny, and you only hear it from time to time for a few effects (I wouldn't be surprised if Nintendo actually limits the speaker's usage for developers somehow).
Stereo speakers in the Stream/Café controller would be used 100% of the time, so there'd be a big difference between the two situations.


Though I don't really see a point in 5.1/7.1 options. Really, apart from PR talk, 99.9% of the people do not even own headphones/earbuds that can be called pretty decent. Let alone high end headphones. Seems like a waste of money to me.


If the console is sending a digital stereo signal to a DAC in the controller, and given that the console will most certainly support at least Dolby Pro Logic IIx, then supporting 7.1/5.1 DPLIIx through the analog jack output would come at no cost at all.
At least for a single-controller/single-player situation, it wouldn't require more processing power.
I could see single- and two-player modes supporting DPLIIx for 5.1, and 3-4 player modes going down to a "virtual 3D" stereo mode, depending on whether there's a dedicated sound DSP or how beefy the CPU is.
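
To illustrate why it's basically free: matrix surround just rides inside an ordinary two-channel signal. Here's a very rough Dolby-Surround-style Lt/Rt fold-down sketch (not the actual Pro Logic IIx encoder; the real coefficients and phase network differ, and scipy's Hilbert transform only stands in for the 90-degree phase shift):

import numpy as np
from scipy.signal import hilbert

def lt_rt_encode(L, R, C, Ls, Rs):
    # fold 5 channels into 2: centre at -3dB into both sides, surrounds at -3dB
    # with an approximate +/-90 degree phase shift so a decoder can steer them back out
    ls90 = np.imag(hilbert(Ls))    # ~90-degree phase-shifted surround copies
    rs90 = np.imag(hilbert(Rs))
    Lt = L + 0.707 * C - 0.707 * ls90
    Rt = R + 0.707 * C + 0.707 * rs90
    return Lt, Rt                  # still just a stereo pair for the controller's jack

The console's mix ends up as the same two analog channels either way; the decoding cost sits in the headphones, not in the console or the controller.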
 
a 1500mAh battery to power a 6" screen? That would be good for what? 2-3 hours tops?
1500mAh, and even less, keeps phones with ~4"+ screens functional for days; with the screen on all the time, probably still over a day. Sure, a 6" screen eats more than a 4" one, but the controller doesn't need to keep the rest of a phone's hardware alive too.
 
1500mAh, and even less, keeps phones with ~4"+ screens functional for days; with the screen on all the time, probably still over a day. Sure, a 6" screen eats more than a 4" one, but the controller doesn't need to keep the rest of a phone's hardware alive too.

1. No, you're not getting anywhere near a day with the screen on from a phone (8-10 hours probably). a 6" screen is near triple the size.
2. The rumored controller isn't just powering a screen either. At least I'd hope it's not.
 
Being able to play the games away from the TV+console will most definitely be one of the main selling points. To play the "full-fledged" games without having to isolate yourself or the whole living room is definitely interesting from a social point of view.

Either the built-in screen brings enhanced game-play or it doesn't. If it's used for enhanced game-play, then obviously that precludes you from using it as the main display. You can mandate that all games must be playable without the additional screen, but that would mean you can't use it for any signature features. In any event, an interface designed for a modern television set probably won't be all that legible when squashed down to 6".
 
Either the built-in screen brings enhanced game-play or it doesn't. If it's used for enhanced game-play, then obviously that precludes you from using it as the main display. You can mandate that all games must be playable without the additional screen, but that would mean you can't use it for any signature features. In any event, an interface designed for a modern television set probably won't be all that legible when squashed down to 6".
Don't interfaces still need to be readable on SD resolutions?
 
1. No, you're not getting anywhere near a day with the screen on from a phone (8-10 hours probably). a 6" screen is near triple the size.
2. The rumored controller isn't just powering a screen either. At least I'd hope it's not.

Ah yeah, you're right, I just re-checked the numbers: AOSP Android ROMs can run at around 100-110mA with the screen on, but that still includes everything else in the phone being on.
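
For reference, what those currents would mean for the rumoured 1500mAh pack (rough sketch; the controller-side numbers are pure guesses on my part):

battery_mah = 1500
phone_ma = 110                     # the AOSP screen-on figure above, whole phone included
print(battery_mah / phone_ma)      # ~13.6h as a theoretical ceiling; real use lands well below

# guessed controller budget: ~2.25x the screen area of a 4" panel, plus a wireless
# video receiver and a small audio amp (all assumptions, no rumoured figures here)
controller_ma = 110 * 2.25 + 150 + 80
print(battery_mah / controller_ma) # roughly 3 hours, which is why "2-3 hours" sounds plausible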
 
Ah yeah, you're right, I just re-checked the numbers: AOSP Android ROMs can run at around 100-110mA with the screen on, but that still includes everything else in the phone being on.

What is everything else in the phone? The controller is going to need some computational ability in there if it's going to be running this display with streaming support, and it's going to need WiFi or Bluetooth and probably the ability to power a headset, so it's not like there's a big saving vs a phone.
 
Either the built-in screen brings enhanced game-play or it doesn't. If it's used for enhanced game-play, then obviously that precludes you from using it as the main display.

Not at all.

If you play with TV+controller screen, you get a HUD-less image on the TV and game info on the controller; the main screen keeps rendering the HUD-less image while you do "mini-games" on the controller.
If you play with the controller screen only, you get a HUD in front of the game, just like every other game so far; the main screen shuts off while you play the mini-games, or they simply do it with some kind of transparency.

And tactile feedback will still be there regardless of using the big or the small screen, so there's still that differentiation.

Besides, with a 960*540 resolution in the controller, you actually get 35% more pixels than the current 800*480 for Wii games, and those don't seem to suffer from "ergonomy" problems in their HUDs, and neither did games on older consoles.
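
Quick check on that figure, for anyone counting:

controller_px = 960 * 540          # 518,400 pixels
wii_px = 800 * 480                 # 384,000 pixels, the figure quoted above
print(controller_px / wii_px - 1)  # ~0.35, i.e. 35% more pixels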



What is everything else in the phone?
Baseband processor continuously running, application processor in low-power mode running Android's OS services, RAM for all that, random accesses to mass storage?
It's certainly a substantial chunk of the device's power consumption.



The controller is going to need some computational ability in there if it's going to be running this display with streaming support, and it's going to need WiFi or Bluetooth and probably the ability to power a headset, so it's not like there's a big saving vs a phone.

If all it does is receive video+audio+rumble+tactile feedback and send touchscreen+buttons(+camera?) input, I don't see why they'd need anything above very power-efficient, fixed-function hardware.
For the Wii, it was an application processor (ARM9 Starlet) that handled I/O, but that's mainly because it also handled system updates during standby mode. That's not something the (dumb) controllers are going to need.


Apart from the wireless communication (which I don't really think can be either BT or WiFi), it's nothing like the power demands of the CPU+GPU+Cache+RAM+Storage you'd find in a handheld.
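
Rough numbers on the video link, just to put the "fixed function" point in context (all assumptions on my part, using the 960*540 @ 60Hz figure floated earlier):

width, height, fps, bpp = 960, 540, 60, 24
raw_mbit_s = width * height * fps * bpp / 1e6
print(raw_mbit_s)                  # ~746 Mbit/s uncompressed, far beyond BT and ordinary WiFi

# a low-latency hardware codec squeezing that into, say, a ~15 Mbit/s link (a guess)
print(raw_mbit_s / 15)             # ~50:1 compression; the controller only needs a small
                                   # fixed-function decoder plus the radio, not a CPU/GPU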
 
If you play with TV+controller screen, you get a HUD-less image on the TV and game info on the controller; the main screen keeps rendering the HUD-less image while you do "mini-games" on the controller.
If you play with the controller screen only, you get a HUD in front of the game, just like every other game so far; the main screen shuts off while you play the mini-games, or they simply do it with some kind of transparency.

It's not a very logical setup. You get the HUD-less experience when there's plenty of screen real estate. When screen real estate is limited, you have to squeeze in more visual elements. It's more trouble for game developers, more constraints on game design, more demand on battery power for ultimately an out-of-the-norm usage scenario.
 
It's not a very logical setup. You get the HUD-less experience when there's plenty of screen real estate. When screen real estate is limited, you have to squeeze in more visual elements. It's more trouble for game developers, more constraints on game design, more demand on battery power for ultimately an out-of-the-norm usage scenario.

How on Earth is putting a HUD on top of the game more demanding on the battery?!
The rendering is all done on the console, not the controller!

Trouble for developers? So they need to implement vectorized images for the HUD instead of fixed-size images; what's the big deal?
 
Trouble for developers? So they need to implement vectorized images for the HUD instead of fixed-size images; what's the big deal?
With Scaleform being the standard in UE3 now, and used in plenty of other games, scaling HUD and menu elements is trivial. But if you were to take that stuff off the main screen and put it on a supplemental screen, you'd probably want to use a different layout to make the most of all the space available. When they rip the HUD off a game and put it on the DS's bottom screen, they rearrange everything.
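
"Trivial" really is trivial if the HUD is authored in normalised screen coordinates rather than pixels. A toy sketch of what that means (my own example, nothing to do with Scaleform's actual API):

# HUD elements stored as fractions of the screen, so any output resolution works
hud = {
    "health_bar": {"x": 0.05, "y": 0.05, "w": 0.25, "h": 0.04},
    "minimap":    {"x": 0.78, "y": 0.72, "w": 0.20, "h": 0.26},
}

def layout(hud, screen_w, screen_h):
    # convert normalised coordinates into pixels for a given display
    return {name: {k: round(v * (screen_w if k in ("x", "w") else screen_h))
                   for k, v in box.items()}
            for name, box in hud.items()}

print(layout(hud, 1920, 1080))   # main TV
print(layout(hud, 960, 540))     # controller screen, same proportions

The real work, as the DS example above shows, is deciding whether the supplemental screen deserves a different layout altogether, not the scaling itself.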
 