Wii U hardware discussion and investigation

What's the newest 360 brick's rating? (Guess I should waddle over and look, lulz, will do in a sec.)

The first batch of Slims (with HANA still separate) had a 135W DC output rating. The latest slims have a 115W DC output rating (HANA combined with the southbridge). Measured load consumption is around 90W.
 
50W doesn't leave much room for much performance once you take out the optical drive. That's what? 30-35W split between the GPU and CPU after memory and accessories and optical drive are considered? It doesn't look like it'd be much more powerful than a current generation console outside of perhaps more RAM.

Personally I'm waiting for a major 'gotcha' like the Wii U OS and tablet using up 512MB of RAM which leaves only 1GB for games.
 
It's worse than that. As others have posted, it's likely 45-50 watts used while running.
Yeah, that's what I meant by "max" power. Sorry if I was being unclear (I guess I was in shock from the news :)), I just wanted to point out that there wasn't much room to move around in with such a constraint on power.

50W doesn't leave much room for much performance once you take out the optical drive.
I doubt that drive's going to consume very much. Likely, it's a 2x BR drive like in PS3 (for cost reasons), so the disc rotation speed should be quite low. As the spindle motor is normally the biggest consumer, that leaves the electronics, but BR hardware should be quite mature by now and I doubt the chipset will gulp down very much either...
 
50W doesn't leave much room for much performance once you take out the optical drive. That's what? 30-35W split between the GPU and CPU after memory and accessories and optical drive are considered? It doesn't look like it'd be much more powerful than a current generation console outside of perhaps more RAM.

The console has up to 75W at its disposal.
The fact that no console (or any electronic appliance that I know of) constantly uses all the power it has available doesn't mean the sum of the TDPs of the individual components can't reach a value close to that.

So yes, typical consumption of the Wii U may be around 50W, but the peak power consumption for the GPU+CPU could be well up to 60W.
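
To make the peak-versus-typical distinction concrete, here's a toy budget in Python; every number in it is invented purely for illustration, not a claim about the actual hardware:

# Toy figures only: invented to show how the sum of worst-case TDPs
# can approach a 75W rating while typical in-game draw sits near 50W
tdp     = {"CPU": 15, "GPU": 35, "RAM": 8, "drive": 5, "misc": 7}   # worst case, W
typical = {"CPU": 10, "GPU": 25, "RAM": 5, "drive": 3, "misc": 4}   # in-game, W
print("sum of TDPs: ", sum(tdp.values()), "W")      # 70W, near the rating
print("typical draw:", sum(typical.values()), "W")  # 47W, near the estimate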




Personally I'm waiting for a major 'gotcha' like the Wii U OS and tablet using up 512MB of RAM which leaves only 1GB for games.
Why? You don't want the console to succeed?
 
If it did reach 75 Watts, it would be a badly designed piece of hardware. You usually have a security margin of 1.5 to 2.0. And since consoles usually run "optimized" software, I can't see the peak being very much different from the average case, either. Some noise is there, sure, but most of the time, it won't be.

Also, keep in mind that the PSU most likely isn't 100% efficient, either.
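
If that 1.5-2.0 figure held for the brick as a whole, a 75W rating would imply sustained draws roughly as follows; this is just a sketch of the reasoning above, not a measurement:

rating = 75.0                # W, the Wii U brick's rated DC output
for margin in (1.5, 2.0):    # the claimed design margins
    print(f"margin {margin}x -> sustained draw around {rating / margin:.1f}W")
# i.e. roughly 37-50W, which happens to line up with the ~45-50W estimates above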
 
The console has up to 75W at its disposal.
The fact that no console (or any electronic appliance that I know of) constantly uses all the power it has available doesn't mean the sum of the TDPs of the individual components can't reach a value close to that.

So yes, typical consumption of the Wii U may be around 50W, but the peak power consumption for the GPU+CPU could be well up to 60W.
Consoles do run close to max consumption. Of course they will not be using 75W: the power supply couldn't handle that, first off, and it wouldn't be reliable at anywhere near what you are talking about...


60 watts would push it to 80%. That is the very upper limit, and I don't see it even that high. I know when building a PC you try to keep within 40-70%; that is the rule I have always seen.
 
If it did reach 75 Watts, it would be a badly designed piece of hardware. You usually have a security margin of 1.5 to 2.0. And since consoles usually run "optimized" software, I can't see the peak being very much different from the average case, either. Some noise is there, sure, but most of the time, it won't be.

And where is this "security margin of 1.5 to 2.0" coming from?


Also, keep in mind that the PSU most likely isn't 100% efficient, either.

PSU efficiency has nothing to do with rated output.


Consoles do run close to max consumption. Of course they will not be using 75W: the power supply couldn't handle that, first off, and it wouldn't be reliable at anywhere near what you are talking about...

What do you mean, "it can't handle"? It can handle up to 75W because its internal components are rated as such.
It's obviously not designed to sustain an average of 75W, because that would mean the peak values were going above that number, but the peak is 75W and not any other number.

60 watts would push it to 80%. That is the very upper limit, and I don't see it even that high. I know when building a PC you try to keep within 40-70%; that is the rule I have always seen.

I see, so you're comparing it to bad PC PSUs.

But no, 80% isn't the "upper limit". Even in PCs, only "bad" PSUs won't be able to handle the rated wattage.

Some PSUs (Corsair, Seasonic, BeQuiet) can even handle sustained wattages well above their rated output, and all the good ones will handle some 5W above their rated specification just fine.


Yes, many uninformed people use the "up to 70%" rule for buying PSUs without even knowing why. That's mainly because there's practically no quality regulation for PSUs and their rated maximum output, so they'd risk buying an "800W PSU" from "LCPower" or something that won't even do 600W and will damage some components due to unstable voltages.
 
50W doesn't leave much room for much performance once you take out the optical drive. That's what? 30-35W split between the GPU and CPU after memory and accessories and optical drive are considered? It doesn't look like it'd be much more powerful than a current generation console outside of perhaps more RAM.

Personally I'm waiting for a major 'gotcha' like the Wii U OS and tablet using up 512MB of RAM which leaves only 1GB for games.

Your typical slot-load BD-ROM drive eats about 5W of power.
Your typical configuration of the PowerPC 476FP takes up 1.6W per core.

Here's what I posted on GAF:
Slot-load BD-ROM: 5W (1A at 5V)
Flash memory: 0.25W
WiFi modules: approx. 2W per chip
2GB DDR3: approx. 5-10W, depending on configuration
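
Summing those figures gives a feel for what's left for the CPU and GPU under the ~45W in-game estimate from earlier in the thread. A rough sketch; the DDR3 midpoint and the two-chip WiFi count are my assumptions:

total_load = 45.0                # W, estimated draw while running (earlier posts)
components = {
    "slot-load BD-ROM": 5.0,     # 1A at 5V
    "flash memory": 0.25,
    "WiFi modules": 2 * 2.0,     # ~2W per chip; assuming two chips
    "2GB DDR3": 7.5,             # midpoint of the 5-10W estimate
}
peripherals = sum(components.values())
print(f"peripherals: ~{peripherals:.2f}W")                               # ~16.75W
print(f"left for CPU+GPU and losses: ~{total_load - peripherals:.2f}W")  # ~28.25W

That lands close to the 30-35W CPU+GPU guess made earlier in the thread.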
 
So said my prof in constructional engineering as well as electrical engineering and "experience"...

The "Security Margin" you speak of is applied in the components themselves (capacitors, rectifiers, resistors, etc.). It's not in an end-product like a console PSU.
A well-built PSU will output its rated wattage perfectly well for as long as you want, because even its least-capable component can drive that voltage/current with a security margin of 1.5/2.0.

I'm pretty sure your prof will say the same.

Say, look at PS3, with its 380 Watt PSU, but only using about 200 Watts under full load.
That's the phat model, right?
Why don't you check the power usage in an AAA game from 2012 and see if it's still 200W?
 
The "Security Margin" you speak of is applied in the components themselves (capacitors, rectifiers, resistors, etc.). It's not in an end-product like a console PSU.
A well-built PSU will output its rated wattage perfectly well for as long as you want, because even its least-capable component can drive that voltage/current with a security margin of 1.5/2.0.

I'm pretty sure your prof will say the same.

Are you really going to argue this? It will lead to nothing but disappointment. All CE devices run closer to 50% than 100% of their rating at maximum usage.
 
Are you really going to argue this? It will lead to nothing but disappointment. All CE devices run closer to 50% than 100% of their rating at maximum usage.

Funny.
So when you turn on your 600W microwave at maximum power, you think the output is actually 300W?
 
Are you really going to argue this? It will lead to nothing but disappointment. All CE devices run closer to 50% than 100% of their rating at maximum usage.
ToTTenTranz has issued an easily verifiable challenge. Someone measure the power draw running a known AAA title (Uncharted 2 or 3 should be accepted as taxing the system) and it can be proven one way or the other, and someone fed crow.
 
ToTTenTranz has issued an easily verifiable challenge. Someone measure the power draw running a known AAA title (Uncharted 2 or 3 should be accepted as taxing the system) and it can be proven one way or the other, and someone fed crow.



I'm more confident on the microwave test, lol.
 
ToTTenTranz has issued an easily verifiable challenge. Someone measure the power draw running a known AAA title (Uncharted 2 or 3 should be accepted as taxing the system) and it can be proven one way or the other, and someone fed crow.

http://www.anandtech.com/show/3774/welcome-to-valhalla-inside-the-new-250gb-xbox-360-slim/3

The Slim used around 90W to run games on a 135W supply. (And that's 90W on the outlet side. The brick is at best 94% efficient, and probably less in practice, so it's supplying a good deal less than 90W to the console.)

Those power supplies have to be rated assuming you've got every single peripheral under the sun plugged in: playing a demanding Kinect game (is there one?) while charging multiple controllers over USB, using wireless on a weak signal with the antenna at maximum strength, downloading a game at full rate to the HDD in the background, or whatever other power-consuming tasks you can think of. Even for that absolute worst case, there has to be some headroom (10% at the minimum, just to make sure you don't exceed the brick).
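
Putting numbers on that outlet-versus-DC distinction (94% is the best-case figure above; 87% is an assumed, more typical brick efficiency):

wall_draw = 90.0    # W, measured at the outlet while gaming
rating = 135.0      # W, the brick's rated DC output
for eff in (0.94, 0.87):       # assumed AC-to-DC conversion efficiency
    dc = wall_draw * eff       # power actually delivered to the console
    print(f"eff {eff:.0%}: ~{dc:.0f}W DC, ~{rating - dc:.0f}W of rated headroom left")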
 
There are also the power spikes from spinning up the optical drive & random accesses. The rate at which the drive reaches nominal speed will have an effect, of course (usually modest for notebook solutions to keep the power down).
 
The "Security Margin" you speak of is applied in the components themselves (capacitors, rectifiers, resistors, etc.). It's not in an end-product like a console PSU.
A well-built PSU will output its rated wattage perfectly well for as long as you want, because even its least-capable component can drive that voltage/current with a security margin of 1.5/2.0.

I'm pretty sure your prof will say the same.

Dunno, I switched classes a while ago and graduated in a different course...

Though... using your logic, if all parts have this margin, then the rated output should effectively mirror this margin too, no?

Also... running your PSU at full load for a long period of time WILL lead to it dying pretty fast. Just look at PC PSUs. There's a very good reason why 150-Watt GPUs recommend 24A on the 12V rail (i.e. 288 Watts). I know, I know, PCs are a different case, but basic "engineering" logic should still apply.
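
Spelling out that rail arithmetic with the numbers from the paragraph above:

amps, volts = 24.0, 12.0     # recommended 12V-rail capability
gpu_tdp = 150.0              # W
rail_watts = amps * volts    # 288W
print(f"recommended rail: {rail_watts:.0f}W, i.e. {rail_watts / gpu_tdp:.1f}x the GPU's TDP")

Which, incidentally, lands right in the 1.5-2.0 range mentioned earlier, though on a PC the 12V rail also feeds the CPU, so it isn't a pure safety margin.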

That's the phat model, right?
Why don't you check the power usage in an AAA game from 2012 and see if it's still 200W?

Dunno, I sold mine... but I highly doubt that current games use more power than the tested ones, like Final Fantasy XIII and GTA4. Current games surely aren't "FurMark" for our systems compared to the older ones... at least not by more than a negligible difference.
 