Windows 10 [2014 - 2017]

Well, faster than I can press the on button. ;) But still, why does it reset the monitor? Windows 7 handles the monitor being off just fine.

Also, whenever an app requests permissions to access the internet, Windows 10 offers two tick boxes, one for home networks and one, discouraged, for public networks. On my PC, the public network box is ticked and the private one isn't.
 

That behavior has been there since 7 or 8, and it's also still buggy.

At rare, random times the home network becomes public and apps can't connect.

In 7 it's easy to change using the Network and Sharing Center control panel menu.

In 10, I need to go to the Network and Sharing Center control panel, then go to HomeGroup.
 
Uses less power. And at night or going out I switch off at the wall.
Unless you're running an ancient monitor, the power draw at the wall during standby is probably so low as to not be measurable on a Kill-a-Watt device.

Regardless, is the standby power going to consume more than the extra power you have to burn while fighting resolution settings with your PC every time you boot "out of sequence"? You might have saved 5whr over the span of a year, but you spent that same amount in three days with the extra power-on time, and perhaps a reboot or three, trying to solve the problem you caused by saving the 5whr of power.

Just sayin' ;)
 
5 watts in a year?! From googling, it seems standby uses a lot more than you think.

From Lawrence Berkeley National Laboratory and their (continuously updated) report on standby power usage: http://standby.lbl.gov/summary-table.html:

Code:
Product/Model  Average (W)  Min (W)  Max (W)  Count
Computer Display, LCD
Off            1.13         0.31     3.5      32
On             27.61        1.9      55.48    31
Sleep          1.38         0.37     7.8      30

Compared to the "OFF" state, sleep is a rounding error. Also, you'll notice I used 5whr, which is not the same as watts; it's watt-hours. Google suggests the average price per kilowatt-hour is 12 cents, so that extra ~0.2 watts (the difference between monitor OFF and monitor sleep, given by the averages above) over the span of a year is about twenty-one cents.

Let's say you never turn the socket off at the wall, as was mentioned earlier, so you incur the wrath of FULL SLEEP POWER at all times! ZOMG NO TEH WHORE-ERS! ;) That'll cost ya $1.45, again, for the year.

Now, let's talk about how fast a 450 W power supply chews through that kWh rate while you're fighting with the resolution settings...
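
If anyone wants to sanity-check those figures, here's a quick sketch of the arithmetic, using the LBNL averages above, the rounded 0.2 W off-vs-sleep difference, and the assumed 12 cents per kWh:

Code:
# Sketch only: reproduces the figures quoted above from the stated assumptions.
RATE_USD_PER_KWH = 0.12          # Google-suggested average electricity price
HOURS_PER_YEAR   = 24 * 365      # 8760 h

# Off vs. sleep difference, rounded to 0.2 W as above
diff_kwh = 0.2 * HOURS_PER_YEAR / 1000           # ~1.75 kWh per year
print(f"Off vs. sleep: ${diff_kwh * RATE_USD_PER_KWH:.2f} per year")   # ~$0.21

# Worst case: monitor left in sleep (1.38 W average) all year long
sleep_kwh = 1.38 * HOURS_PER_YEAR / 1000         # ~12.1 kWh per year
print(f"Sleep 24/7:    ${sleep_kwh * RATE_USD_PER_KWH:.2f} per year")  # ~$1.45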
 
Switch off at the wall. That's all connected devices not on standby. Multiply that by hundreds of millions of households who operate the same way, and you're saving real amounts of energy*. Yes, it's a rounding error when you consider it in relation to the amount of energy burnt in people's daytime operations, but it's still a measure of improvement against waste. And the inconvenience is zero when 1) I can just ensure I switch the monitor on before the PC, and 2) I've gone back to Windows 7 for now anyway! :p But the real issue here is Windows 10's seemingly odd behaviour.

*4 devices on standby consuming 1 watt each in sleep is 4 W; over 8 hours that's 32 Wh; times 100 million US households is 3.2 billion Wh, or 3.2 million kWh, per night. Possibly of the order of 3 million kg CO2? The annual consumption of 35 US citizens, wasted every day doing nothing. (Numbers not double checked!)
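
Spelled out as a quick sketch (all inputs are just the footnote's assumptions, so adjust to taste):

Code:
# Sketch of the footnote's back-of-the-envelope estimate.
devices_per_home = 4
standby_watts    = 1.0            # assumed draw per device in sleep/standby
hours_overnight  = 8
households       = 100_000_000    # rough US household count used above

wh_per_home  = devices_per_home * standby_watts * hours_overnight   # 32 Wh per night
kwh_national = wh_per_home * households / 1000                      # 3.2 million kWh per night

print(f"{wh_per_home:.0f} Wh per household per night")
print(f"{kwh_national:,.0f} kWh nationwide per night")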

If you think like an individual, or in terms of money only, sure, what's the point in saving a few watts?
 

It comes down to Windows 10 relying more upon display EDID than Windows 7 did (Windows 8/8.1 was when Microsoft started to place more emphasis on display EDID rather than display INF files). So in Windows 7, if a monitor had an incorrectly populated EDID, it didn't affect things much (Microsoft used generic monitor INFs). That made behavior consistent but didn't allow Windows to use displays to their fullest capabilities.

There are some byproducts that occur when you properly support EDID, however. In the case of DP-connected monitors, when the monitor goes into standby it disappears from Windows' connected devices list (the expected behavior per the DP standard) and everything gets stuffed onto any non-DP connected monitors. So manufacturers have to put in workarounds to bypass that intended behavior. Again, in Windows 7 this doesn't happen, due to Windows 7 not parsing/using all of the EDID information.

TL;DR - A lot of odd things can happen in Windows 8/8.1/10 due to Windows now correctly parsing and using the EDID information from displays. This is exacerbated if the display's EDID isn't correctly populated or wasn't created properly.

That was a needed change going forward, however, as it makes it easier to properly support more complex display standards (HDR, etc.), which would be problematic with the old method of using a generic monitor INF file whenever a display didn't come with a manufacturer-provided INF file (which users typically neglected to install anyway).
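
To give a rough feel for what "correctly populated" means, here's a minimal sketch (purely illustrative, not any particular tool) of the most basic sanity checks on a 128-byte EDID base block, the fixed header and the block checksum, which is the kind of data Windows now trusts far more than it used to:

Code:
# Minimal EDID base-block sanity check (illustration only).
# A base EDID block is 128 bytes, starts with a fixed 8-byte header,
# and all 128 bytes must sum to 0 modulo 256 (the last byte is the checksum).
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_looks_valid(block: bytes) -> bool:
    if len(block) != 128:
        return False                     # not a base block
    if block[:8] != EDID_HEADER:
        return False                     # mangled header
    return sum(block) % 256 == 0         # checksum must make the sum wrap to zero

# A display whose EDID fails checks like these (or carries bogus data beyond them)
# is where Windows 8/8.1/10 can start behaving oddly, whereas Windows 7 mostly
# just fell back to a generic monitor INF.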

Regards,
SB
 
Also, you'll notice I used 5whr, which is not the same as watts; it's watt-hours
Yes, that's true, but what you posted, even based on your numbers, isn't correct.
A monitor uses ~1 watt when it's in sleep, agreed.
A person turns it off at night = 8 hours = 8 watt-hours for one night, but you're claiming less than that for a whole year, i.e. your numbers are orders of magnitude off.
I do agree with you that overall the amount of power is bugger all (esp. since I've just woken up & turned the heat pump on = 2000 W in usage), thus personally I don't turn off monitors or PCs (well, anything).
 

It's not 1 watt.

The device in his example is consuming on average

Off = 1.13 watts consumed
Sleep = 1.38 watts consumed.

A difference of 0.25 watts. It'd still add up to more than 5 WHr per year, of course, but it's still pretty insignificant. Using his Google-supplied cost per kWh, that'd still only work out to a savings of 2.19 kWh, or 0.26 USD per year, if the monitor was on standby 24 hours every day versus plugged in but turned off. While he may have gotten the 5 WHr incorrect, he's pretty close to what it actually costs with his 0.21 USD per year, as the monitor is likely going to be used at least part of each day on average.

Regards,
SB
 
Yes it is; Shifty said turned off at the wall, which uses 0 power.

If it is still physically connected to the wall plug it is still drawing current and thus is still using electricity. Most modern electronics don't physically cut the circuit like you would with a mechanical switch.

Regards,
SB
 
I'm having a problem with icon placement on the desktop. First let me say that I don't have any sort option set for desktop icons, nor do I have the grid thing option set.

I right-click and create, say, an Excel doc. The icon for it appears in whatever random place on the desktop I picked. If I edit the doc and save it, the icon moves to what looks like the first empty position in a phantom grid order. I can move the icon back to where it was. It'll stay there after a restart etc., until the file is again modified and saved, when it once again moves to the same place it moved to the last time.

I've scanned this thread, but didn't see this mentioned. Some reports on the net suggest it's an Nvidia driver thing, but that sounds far too generic to have not been reported here.
 
If it is still physically connected to the wall plug it is still drawing current and thus is still using electricity
Incorrect
http://www.theenergycommunity.com/myth-busters-1/
In order to draw and use electricity, you need a completed circuit (i.e. there cannot be a gap in it, because electricity will not be able to flow across it). In order to complete the circuit you need to do two things:

  1. Plug something in
  2. Switch the plug socket on

If either of these is not done, then the socket will use zero power. If there is something plugged into the socket but the switch is off, then no electricity will be used either. Therefore, both of the sockets below are not using any electricity at all.

[Attached image: IMAG0427_1.jpg, a switched wall socket (sorry about the large image)]
 
If it is still physically connected to the wall plug it is still drawing current and thus is still using electricity. Most modern electronics don't physically cut the circuit like you would with a mechanical switch.
The wall plug has a mechanical switch that cuts the power to the socket. This is a rather weird conversation where 'off' doesn't count as off! The wall plug either has 250 V or 0 V depending on the switch position. I switch it to 'off' at night, so there's 0 volts and zero electricity being used (by all connected devices on a multisocket adapter). How can it be any other way?!

Also, Windows 7 is doing that same thing, so it's decidedly moot! Maybe something to do with the BIOS update I did? I retract that complaint against Windows 10.
 
btw any idea why Windows Defender in Windows 10 still uses an ugly-ass, dated look?

It is an integral part of OS security, and it looks as if someone just left it in a corner while everything around it got overhauled.
 
The wall plug has a mechanical switch that cuts the power to the socket. This is a rather weird conversation where 'off' doesn't count as off! The wall plug either has 250 V or 0 V depending on the switch position. I switch it to 'off' at night, so there's 0 volts and zero electricity being used (by all connected devices on a multisocket adapter). How can it be any other way?!

Also, Windows 7 is doing that same thing, so it's decidedly moot! Maybe something to do with the BIOS update I did? I retract that complaint against Windows 10.

As I was mentioning, most electronic devices nowadays don't have a mechanical switch, meaning a switch that actually connects and disconnects a physical contact. Only devices that actually do that (almost unseen in modern electronics) will draw 0 power when turned off.

So something like a modern TV doesn't do that. When turned off there is still a physically complete circuit at all times, even in the off position because it doesn't physically break the connection as you would with a mechanical switch.

Even if you have a physical button or switch that you hit on the device itself, it doesn't necessarily mean it cuts the physical circuit. For example, anything you can turn on remotely can't physically break the circuit. The device just goes into the "off" state instead of the "sleep" state. Both states still draw power due to the completed circuit.

Something like a power strip is supposed to physically cut the circuit, but I've seen ones that don't.


See above. In the case of that image there are actual physical switches that physically cut the circuit. I can't speak for Australia or New Zealand, but most households in the US, Japan, and Taiwan as examples, don't have outlets like those.

Regards,
SB
 
btw any idea why Windows Defender in Windows 10 still uses an ugly-ass, dated look?
Defender is still in W10? News to me! I haven't seen hide nor hair of it in the year I've been running W10... :LOL:

Ok, I've seen definitions updates for it in win update, but I assumed they removed the GUI and just made it like a system service that runs silently in the background...
 
See above. In the case of that image there are actual physical switches that physically cut the circuit. I can't speak for Australia or New Zealand, but most households in the US, Japan, and Taiwan as examples, don't have outlets like those.
Do Americans say 'off at the wall' to mean switch the device off? To me, 'off at the wall' means exactly that: using the physical switch on the socket attached to the wall. What you're suggesting here is that US power outlets don't even have power switches?! So you guys have to unplug to actually power anything off? :oops:
 