How come you quoted the first part and not the following sentences where I explained that power saving features make loads of difference even when playing games?
I addressed what you said even if I didn't quote it. You're assuming that load power consumption will go down, but you have no empirical data to back that up. High-end PCs have a much broader dynamic range of power consumption because their CPUs can work a lot harder than most games require, and their GPUs have huge peak FLOP capabilities that are difficult to fully extract in real-world software. We don't know exactly what this GPU's capabilities are, but I doubt its extremes are anything like a high-end discrete card's.
You think that 33W is the Wii U's FurMark scenario and that it'd use a lot less if it were better regulated. But what does this actually mean? That a significant percentage of the GPU blocks should be clock-gated or even power-gated, but aren't? That it could achieve the same performance while running at a lower clock speed and voltage? This is all baseless.
Even if power consumption under real-world heavy-load scenarios is much higher than it could be with the same fundamental architecture (which is some serious wishful thinking), it raises the serious question of why Nintendo didn't do it that way in the first place. It's not hard to see why they'd neglect power efficiency in the menu: they don't expect the user to sit in the menu that long, and the impact on their power bill isn't that great anyway. What they do care about, and what they stressed repeatedly, is the cooling requirements for the system, which are determined by load conditions and which would absolutely be the limiter for a handheld version. They at least say the cooling on Wii U was a design challenge for them (probably due to inexperience, but that doesn't change anything), so wouldn't they have taken reasonable steps to make the load power requirements as low as they could?
It could just be that they couldn't do any better, but then how are they going to be able to for a handheld?
I'm sorry but I think you're very confused about what happens beyond the power plug.
The power brick outputs just a single DC voltage, and so do batteries. The "power regulators needed to convert the battery voltage to various rails on the board" need to be there whether you have a power brick or a battery.
But before you get to those power MOSFETs, the power brick has to convert an AC signal to a DC one, at an efficiency of ~80%. This is a physical limit of how a rectifier circuit works: there's always a loss in the diodes and smoothing capacitor(s).
Lithium polymer batteries output a DC signal directly. There's no AC->DC conversion; electrochemical potential is turned into electric potential at an estimated 99% efficiency.
So yes, using batteries instead of a power brick would consume substantially less power than whatever is being measured at the wall with a Wii U.
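A quick back-of-envelope sketch of what that claim implies, using the ~80% figure above and the 33W wall measurement being thrown around in this thread (both are this thread's numbers, not measurements of real hardware):

```python
# If the brick really is ~80% efficient, how much of the measured
# wall power actually reaches the board?
wall_power_w = 33.0        # Wii U figure measured at the wall, per this thread
brick_efficiency = 0.80    # claimed AC->DC rectifier efficiency

board_power_w = wall_power_w * brick_efficiency
print(f"Delivered to the board: {board_power_w:.1f} W")            # 26.4 W
print(f"Lost in the brick: {wall_power_w - board_power_w:.1f} W")  # 6.6 W
```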
First of all, a quick look at efficiency ratings for 75W AC/DC converters shows that it's easy to get ones that are >90% efficient (down to 12V DC). I don't know how good Wii U's is, but unless you have an actual rating or measurement you can't just assume it's 80%.
The other major issue is that you're acting like DC is DC when it comes to efficiency loss, when the voltage the AC-to-DC brick hands off and whatever the main PMIC or other regulator circuits on a handheld start from are probably not going to be the same. Without knowing what these circuits feed, we can't really do a comparison at all. But I do know a handheld has more diverse voltage requirements because of the LCD backlighting, unless the disc drive needs comparably high voltage rails (and we're talking about ditching optical media on the handheld).
??!?? What are you talking about?! The Wii U is designed around DC voltages, like pretty much every electronic circuit on the planet.
Whatever comes before the voltage regulators only has an impact if you're coming from an AC signal because of the 50Hz/60Hz noise. Using a battery would only make the circuit work better.
This is what I mean by you acting like "DC" vs "AC" is all there is to it. There's a huge difference between regulating a 3.7V or 7.2V battery up to 5V and regulating it down to 3.3V, 1.5V, 1V, etc.
As already mentioned, you can convert 120V AC to 12V DC at about 90% efficiency. A switching regulator over a >50% voltage drop can easily take you down to the low 90s for DC-to-DC conversion, and AFAIK a boost regulator is worse (if, for instance, you have to go up to 12V, which your handheld battery will probably not supply). So it's not like AC-to-DC is a huge cost in efficiency while DC-to-DC is a negligible one. Without knowing what voltages are actually needed by the system, we can't do a great analysis.
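To put rough numbers on it, here's a sketch multiplying out the two conversion chains using the figures quoted in this exchange (all illustrative, none measured from real hardware):

```python
# Efficiency figures as quoted in this thread, not spec-sheet values
AC_TO_12V_DC = 0.90    # decent 75W-class AC/DC brick, per the ratings above
BUCK_REGULATOR = 0.92  # switching regulator over a >50% drop ("low 90s")
BATTERY_OUT = 0.99     # electrochemical-to-electrical, per the earlier post

console_chain = AC_TO_12V_DC * BUCK_REGULATOR   # wall -> 12V -> board rails
handheld_chain = BATTERY_OUT * BUCK_REGULATOR   # battery -> board rails

print(f"Console chain efficiency:  {console_chain:.1%}")   # ~82.8%
print(f"Handheld chain efficiency: {handheld_chain:.1%}")  # ~91.1%
# The gap is real, but it's nowhere near "AC/DC loses 20%, DC/DC loses nothing".
```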
The difference in aspect ratios would give the 3DS XL the edge in screen area: a 16:10 screen (3DS XL top) plus a 4:3 screen (3DS XL bottom) have a larger combined area than two 16:9 screens with the same diagonals.
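A quick check of that claim, computing area per unit of diagonal from the aspect ratio alone:

```python
import math

def screen_area(diagonal, w_ratio, h_ratio):
    """Area of a screen given its diagonal and aspect ratio a:b.
    width = d*a/sqrt(a^2+b^2), height = d*b/sqrt(a^2+b^2)."""
    scale = diagonal / math.hypot(w_ratio, h_ratio)
    return (w_ratio * scale) * (h_ratio * scale)

# With d = 1, only the aspect ratio matters:
for name, a, b in [("16:9", 16, 9), ("16:10", 16, 10), ("4:3", 4, 3)]:
    print(f"{name}: {screen_area(1, a, b):.4f} area per unit diagonal^2")
# 16:9  -> 0.4273
# 16:10 -> 0.4494 (~5% more area at the same diagonal)
# 4:3   -> 0.4800 (~12% more area at the same diagonal)
```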
"They're not really smaller" means they're about the same, I'm not splitting hairs over this like you want to.
And the 3DS XL has to compensate in brightness for a parallax barrier, so you can't really assume that one would consume more than the other.
I'm trying to find figures for how much turning 3D off saves battery life, but I'm struggling...
Suffice it to say many would agree 3DS's battery life is too damn low, but that hasn't stopped it from selling :/
You said yourself 1W for the screens. What are you even disagreeing with when I said "at least 1W"?
If there's a 45% power consumption improvement in a single node transition, how much would there be in two? Your numbers present an even more optimistic situation than mine.
... you said a THREE TIMES improvement. It'd be about a 2.1x improvement (1.45 x 1.45 ≈ 2.1). Not everything is up for a shrink, so that wouldn't apply to the entire power budget. Nonetheless, the 15W figure I gave is already less than half of what it consumes.
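For the record, here's the arithmetic behind both possible readings of "45% improvement" per node, compounded over two transitions (which reading the other post intended is a guess):

```python
# Reading 1: each node divides power by 1.45 (the interpretation used above)
reading_1 = 1.45 ** 2
print(f"1.45x per node, twice: {reading_1:.2f}x total")    # ~2.10x

# Reading 2: each node cuts power by 45% (power drops to 55% of before)
reading_2 = 1 / (0.55 ** 2)
print(f"45% cut per node, twice: {reading_2:.2f}x total")  # ~3.31x

# "About 2.1x" is Reading 1; "three times" only falls out of Reading 2.
```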
The GPU is 156mm^2. At 20nm it would be about 40-45mm^2, working at 550MHz. I bet the Adreno 330 is larger than that at 28nm.
Have you ever looked at actual scaling numbers? I've never seen anything shrink that much between two nodes. Realistic expectations are for it to be around half the size.
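Rough numbers on both positions, assuming the 156mm^2 die is on a 40nm-class process (the thread doesn't say, so the starting node is an assumption):

```python
die_area_mm2 = 156.0
old_node_nm, new_node_nm = 40.0, 20.0   # assumed starting node; two shrinks

# Ideal: area scales with the square of the linear feature size
ideal = die_area_mm2 * (new_node_nm / old_node_nm) ** 2
print(f"Ideal scaling:     {ideal:.0f} mm^2")      # 39 mm^2 -- the "40-45" claim

# Realistic: analog, I/O, and SRAM shrink poorly; ~half the size is typical
realistic = die_area_mm2 * 0.5
print(f"Realistic (~0.5x): {realistic:.0f} mm^2")  # ~78 mm^2
```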
And I think you're wrong about Adreno 330.
Again, I just disagree. Sub-5W using a single 20nm SoC, plus a ~35Wh battery and 5" + 4" screens, would be feasible IMO in the same form factor as a 3DS XL.
So a 6.6x reduction in power consumption, coming from: savings in power regulation, which aren't nearly as high as you think; node shrinks, which will give you ~2x optimistically (power savings have been shrinking with each node, barring major technology changes, though you'd probably get something from HKMG here); dropping the disc drive; and improvements in power management, which you're hand-waving by looking at what the system doesn't save under idle loads and comparing it with FurMark (which you give a 50% reduction). Even if the latter really is a comparable data point, there's no way that's going to bring you down to sub-5W. I can't see this as anything more than a flight of fancy.
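Purely illustrative: even granting the full 50% load-management claim and stacking placeholder factors for everything else, the numbers don't reach 6.6x:

```python
wall_power_w = 33.0  # the wall figure used throughout this thread

# The shrink and load-management factors come from this exchange; the other
# two are deliberately generous placeholder guesses.
factors = {
    "power regulation savings": 1.10,  # generous; see the chain sketch above
    "two node shrinks":         2.00,  # the ~2x optimistic figure above
    "dropping the disc drive":  1.15,  # placeholder guess
    "FurMark-style load savings": 2.00,  # granting the full 50% reduction claimed
}

total_reduction = 1.0
for factor in factors.values():
    total_reduction *= factor

print(f"Combined reduction: {total_reduction:.1f}x")                   # ~5.1x
print(f"Resulting draw: {wall_power_w / total_reduction:.1f} W")       # ~6.5 W
# Still not sub-5W, even with every factor granted.
```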
And a 35Wh battery in a 3DS XL form factor is also pretty outrageous. This isn't sporting the form factor of Project Shield. I can't think of a single device that crams such a huge battery into such a small form factor. The Open Pandora has a much larger battery than its contemporary devices, and it still isn't even half the capacity you spec, despite being about the same size and form factor you want but significantly thicker (in no small part due to the battery).
And Nintendo is king of undersized batteries, this is just nuts..
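Some rough volume math on the 35Wh spec (the energy density is a ballpark lithium-polymer figure, not a spec-sheet value):

```python
target_wh = 35.0
lipo_wh_per_liter = 450.0   # ballpark volumetric density for consumer LiPo packs

battery_volume_cm3 = target_wh / lipo_wh_per_liter * 1000
print(f"Battery volume needed: {battery_volume_cm3:.0f} cm^3")   # ~78 cm^3

# 3DS XL closed is roughly 156 x 93 x 22 mm
device_volume_cm3 = 15.6 * 9.3 * 2.2
print(f"3DS XL total volume: {device_volume_cm3:.0f} cm^3")      # ~319 cm^3
print(f"Battery share: {battery_volume_cm3 / device_volume_cm3:.0%}")
# ~24% of the entire closed clamshell, before boards, screens, or casing
```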
The 3DS XL is about the same form factor as the Project Shield, which has batteries with the same capacity as a 10.1" tablet, along with a SoC that actually needs active cooling. And its supposed battery life is quite good for gaming too.
Not true at all, Project Shield is much, much, much thicker than 3DS XL (and a fairly different shape). Or is thickness immaterial to what you refer to as form factor?
Not as a handheld but as a handheld+home console hybrid.
It seems you failed to read the part where I said that the console would have to be bundled with a wireless HDMI transmitter (mirroring the top screen) + a wireless sensor bar. That way it would keep all the original functionality.
The only difference is that the games would be running on the handheld instead of in a separate box, retaining full software compatibility.
You're right, I did fail to read that
Is wireless HDMI something a lot of TVs support, or do you need a receiver for it? How much power does that take? That's a pretty high-bandwidth signal to pump out over the air; I can't believe the power cost is negligible...
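For a sense of scale, here's the raw bandwidth of an uncompressed mirror stream, assuming 1080p60 at 24 bits per pixel (an assumption; the thread never pins down the output format):

```python
# Assumed output format for the mirrored top screen
width, height = 1920, 1080
bits_per_pixel = 24
fps = 60

raw_bps = width * height * bits_per_pixel * fps
print(f"Uncompressed video stream: {raw_bps / 1e9:.2f} Gbps")  # ~2.99 Gbps
# Radios that move multi-Gbps streams (WirelessHD-class 60GHz links) are not
# the sort of thing you run for free on a handheld power budget.
```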
Nonetheless, if this needs to be used as a home console to hit its main selling points, then is turning it into a handheld hybrid really the thing that would make it market viable?
2014 will probably be crawling with >256GB SD cards, so physical media shouldn't be a problem.
They're $400 now. There will probably be more of them available in 2014, but they're not going to be cheap enough to render media "not a problem." I don't have any graphs of SD card pricing, but I don't see any realistic chance of 256GB SD cards dropping below $100 at any point in 2014. For an SD card to be cheap enough that storage becomes a moot point, it's going to have to be under $50, I would say.