I have a whole bunch of them plugged in constantly for various synth nonsense. Since I'm running off solar, I thought it would be worth seeing how much of a difference turning everything off at night makes, and basically it wasn't worth the effort. It's like a percentage point on top of things like my fridge that run constantly, and way less than using my toaster once a week.
That said, if you're on mains it's probably worth considering, at least if a lot of people were to do it, but it's also probably comparable to using ChatGPT once a day or something.
The article is very light on details. There are better articles with some real numbers.
Chargers for a phone draw roughly 0.1 W. That's about 0.9 kWh per year, and at a price of €0.35/kWh that works out to roughly €0.32 per year per charger you leave plugged in. That's not even a rounding error compared to what my heat pump uses.
Devices with an indicator light barely use anything more. The ones with a display or clock do use more power, usually a few watts, which comes down to maybe €10 per device per year using napkin math.
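If you want to redo that napkin math with your own tariff, here's a quick Python sketch; the 0.1 W and ~3 W draws and the €0.35/kWh price are just the assumptions from the comment above, not measured values.

```python
# Yearly cost of a constant standby draw, using the figures quoted above.
HOURS_PER_YEAR = 24 * 365   # 8760 h
PRICE_PER_KWH = 0.35        # EUR, assumed tariff

def yearly_cost(watts: float) -> float:
    """Yearly cost in EUR of a device drawing `watts` continuously."""
    kwh_per_year = watts * HOURS_PER_YEAR / 1000
    return kwh_per_year * PRICE_PER_KWH

print(f"0.1 W bare charger:  {yearly_cost(0.1):.2f} EUR/year")  # ~0.31
print(f"3 W display/clock:   {yearly_cost(3.0):.2f} EUR/year")  # ~9.20
```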
I mean, transistors and ICs do degrade over time; however, out of all the power supplies I've repaired, the vast majority had dead caps, and those tend to dry out with time regardless of whether they're in use. So it's pretty much negligible, just like the power consumption in standby.
Nobody considers the risk of them going up in flames at night. They have a thermal trip for safety, but there is still some risk left, especially for cheap Chinese power blocks.
You can leave out the "cheap Chinese" part. I once tried to find some high-quality ones, and that was a super hard task. They are all the same as the cheap Chinese ones, whatever is written on them.
In the EU they are not allowed to consume more than 0.5 W when idle, and this regulation has been in force since 2008.
Since almost everybody designs for that, I expect this norm also benefits other countries. So this is not really an issue, unless you are in a country without such regulation and you buy some cheap off-brand charger.
Since the standby power is so low, the wear is most likely insignificant too.
An idle unit that draws 0.5 W constantly for a month consumes about 1/3 kWh, but since this regulation has been in force since 2008, I suspect idle consumption has improved further for most devices. 0.5 W is the maximum allowed value, and most manufacturers would prefer to stay below that to not get into trouble.
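For anyone checking the 1/3 kWh figure, a quick sketch of the arithmetic, assuming a device sitting right at the 0.5 W limit for a 30-day month:

```python
# Monthly energy of a device idling at the EU 0.5 W standby cap.
watts = 0.5
hours_per_month = 24 * 30
kwh_per_month = watts * hours_per_month / 1000
print(kwh_per_month)  # 0.36 kWh, roughly the 1/3 kWh stated above
```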
Old power supplies used transformers, which behave like inductors when not loaded, and this causes reactive current to flow. Not a problem for the last 15+ years, because everything uses switch-mode power supplies now.
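As a rough illustration of the real vs. reactive distinction for an unloaded transformer: the 230 V, 30 mA, and 0.2 power factor figures below are purely made up for the example, not measurements of any particular device, and residential meters typically bill only the real power.

```python
import math

# Hypothetical unloaded transformer: 230 V mains, 30 mA magnetizing current,
# power factor ~0.2 (mostly reactive). Illustrative numbers only.
voltage = 230.0        # V rms
current = 0.030        # A rms
power_factor = 0.2

apparent_power = voltage * current                                    # VA
real_power = apparent_power * power_factor                            # W, dissipated as heat
reactive_power = apparent_power * math.sin(math.acos(power_factor))   # var

print(f"apparent: {apparent_power:.2f} VA")   # ~6.9 VA flows in the wiring
print(f"real:     {real_power:.2f} W")        # ~1.4 W actually consumed
print(f"reactive: {reactive_power:.2f} var")  # current that sloshes back and forth
```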
The switch on my power strip broke before any of my USB chargers did, from turning it on and off every day. So I stopped switching it off and on, and it won't break again. I prefer wasting a few cents of idle power to having to buy a new power strip every 6 months. Cheaper and easier.
But it is particularly concerning for cheap, uncertified chargers. These often lack appropriate levels of protection and can be a fire hazard.
I mean, you could hypothetically have an unsafe charger that plugs into wall power, but I don't think that that's specific to chargers. Any electrical device that plugs into wall power could hypothetically be unsafe.
In the case of chargers, the power supply is external to the device being powered and uses a standard interface, so it's easy to examine and replace. I think the only things that come close are external, semi-standardized power supplies with barrel plugs. So if you want to make sure that you have, say, all UL-marked chargers (in the US; a CE mark isn't really the same thing in the EU but is the closest analog that I'm aware of), you can do that fairly easily compared to ripping an internal power supply out of a device. But I'm not convinced that USB chargers in particular are especially problematic relative to other forms of power supply or wall-power-connected devices.
Has any study been done on how efficient they are as heaters? The electricity they use when idle doesn't vanish; it's given off as heat. In the winter it might be worthwhile not to bother unplugging them, because what they give off can offset what other, more conventional heat sources would otherwise provide. I.e. you leave a charger plugged in, your house heating goes off half a second sooner, and that saves you the pennies that the charger otherwise costs.
Admittedly, this doesn't apply to summer and hotter climates, so most people, most of the time, probably ought to be unplugging them, but there's a small percentage of cases where the reverse might actually be beneficial.
100% is the typical claimed number, as it is easy to measure watts of electricity in and find that it exactly equals watts of heat out, or if not, the difference is easily explained by measurement error. There is no hypothesis (much less theory!) of where the energy could go if it isn't heat, and conservation of energy is enough to conclude 100% efficiency.
The above isn't the whole story though. If you could somehow measure watts from the power plant output, you would discover that 4-12% (depending on a bunch of factors) of the energy is lost before it even gets to your house, and so your efficiency goes down. If you measure fuel into the power plant, those range from 10% (old systems from the 1920s, only run in emergencies) to 60% (combined-cycle power plants); I'm not sure how you would measure wind and solar. Eventually the universe will die a heat death, so 0% long term.
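Chaining those ranges together gives a rough fuel-to-room-heat figure; a quick sketch using only the numbers quoted above, which are ballpark ranges rather than precise values:

```python
# Rough end-to-end efficiency from fuel burned at the plant to heat in the room,
# combining the plant-efficiency and transmission-loss ranges quoted above.
plant_efficiency = {"old 1920s plant": 0.10, "combined cycle": 0.60}
transmission_loss = {"best case": 0.04, "worst case": 0.12}

for plant, eta in plant_efficiency.items():
    for case, loss in transmission_loss.items():
        # Resistive heating at the outlet is ~100% efficient, so only
        # generation and transmission losses matter here.
        overall = eta * (1 - loss)
        print(f"{plant}, {case} grid: {overall:.0%} of fuel energy becomes room heat")
```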