Is it more energy efficient to charge a phone/tablet using a desktop/laptop while your computer is being used vs using the charger?
I was just thinking: if you're on your computer anyway, would charging the phone from it just use some of the electricity the computer would have wasted anyway, or would it be worse than charging the phone from its own charger while using the laptop separately?
The other answers have touched upon the relative efficiencies between a phone charger and a desktop computer's PSU. But I want to also mention that the comparison may be apples-to-oranges if we're considering modern smartphones that are capable of USB Power Delivery (USB PD).
Without any version of USB PD -- or its competitors like Quick Charge -- the original USB specification only guaranteed 5 V and up to 500 mA. That's 2.5 W, which was enough for USB keyboards and mice, but is pretty awful to charge a phone with. But even an early 2000s motherboard would provide this amount, required by the spec.
The USB Battery Charging (USB BC) spec brought the limit up to 1500 mA, but that's still only 7.5 W. And even in 2024, there are still (exceedingly) cheap battery banks that don't even support USB BC rates. Motherboards are also a mixed bag, unless they specifically say what they support.
So take, for example, the charger included with a Samsung S20 (the last smartphone generation that shipped a charger in the box): it's capable of 25 W charging, and so is the phone. Unless you bought the S20 Ultra, which comes with the same charger but whose phone supports 45 W charging.
Charging the S20 Ultra from a 2004-era computer will definitely be slower than using the stock charger. But charging it with a 2024-era phone charger would be faster than the included 25 W charger. And then your latest-gen laptop might support 60 W charging, but because the phone maxes out at 45 W, it makes no difference.
You might think that faster and faster charging should always be less and less efficient, but it's more complex since all charging beyond ~15 Watts will use higher voltages on the USB cable. This is allowable because even the thinnest wire insulation in a USB cable can still tolerate 9 volts or even 20 volts just fine. Higher voltage reduces current, which reduces resistive losses.
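To put rough numbers on that (a quick Python sketch; the 0.3 Ω round-trip cable resistance and the 27 W target are made-up illustrative values, not measurements of any particular cable):

```python
# Why higher USB voltage cuts cable losses for the same delivered power.
# Assumed values: 0.3 ohm round-trip cable resistance, 27 W target (a 9 V / 3 A PD profile).

CABLE_RESISTANCE_OHMS = 0.3
POWER_W = 27.0

for voltage in (5.0, 9.0, 20.0):
    current = POWER_W / voltage                   # I = P / V
    loss = current ** 2 * CABLE_RESISTANCE_OHMS   # P_loss = I^2 * R
    print(f"{voltage:>4.0f} V -> {current:4.2f} A, ~{loss:4.2f} W lost in the cable")
```

(The 5 V row would actually exceed the 3 A the spec allows over a standard cable; it's only there to show how quickly the I²R term grows.)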
The gist is: charging is a patchwork of compatibility, so blanket statements on efficiency are few and far between.
There is no such thing as 'excess electricity' in a modern (switching) power supply unit. They use as much power as is needed. There is a few percent of loss in the device, no big deal.
Some desktop computers are less efficient because they have an oversized PSU (lots of reserve for your future "gaming" graphics card) built in.
This would only affect the 12V rail though, no? It's not like they beef up the 5V rail that supplies your USB ports in excessive amounts. Picking a random pair from PCPartPicker, the Corsair RM650e and RM1200e (650 W vs 1200 W) both have a +5V@20A rail. There would be no need for a larger 5V rail to support gaming cards.
Also, correct me if I'm wrong, but most PSUs are most efficient at 20-50% utilization, not 100%. I'm basing this off the higher ratings for 80 Plus.
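For what it's worth, the published 80 Plus Gold minimums at 115 V are 87% at 20% load, 90% at 50%, and 87% at 100%. Here's a quick sketch of what that means in absolute watts for a hypothetical 650 W Gold unit:

```python
# 80 Plus Gold minimum efficiencies at 115 V, applied to an assumed 650 W unit.
PSU_RATED_W = 650
gold_115v = {0.20: 0.87, 0.50: 0.90, 1.00: 0.87}  # load fraction -> minimum efficiency

for load_fraction, efficiency in gold_115v.items():
    dc_out = PSU_RATED_W * load_fraction
    ac_in = dc_out / efficiency
    print(f"{load_fraction:>4.0%} load: {dc_out:5.0f} W out, "
          f"{ac_in - dc_out:5.1f} W lost as heat ({efficiency:.0%} efficient)")
```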
I've previously spoken with PSU engineers for enterprise power supplies -- specifically for 48-54v PoE equipment -- who described to me that today's switch mode power supplies (SMPS) tend to get more efficient with increasing load. The exception would be when the efficiency gains from higher loading start to become offset by the heating losses from higher input currents.
It'll depend on how efficient your phone charger is vs your PC PSU.
Looking at some charts, it's a very close battle but generally the phone charger seems to win out. Probably because it's more optimized for its max power output, whereas the PSU needs to support a wider range of loads.
I remember seeing an experiment saying that the difference is negligible. Even if it isn't, it's far more important to keep your battery between 20 and 80 percent at all times.
Charging at higher voltage is more efficient (at least for electric vehicles), but your phone's battery is so small, relatively speaking, that it barely matters. Avoid wireless charging, as it is extremely inefficient.
USB PD (Power Delivery) actually does use higher voltages for more wattage. Standard USB is limited to 5 V at 0.5 A, sometimes up to 5 V at 2 A on quick chargers. But PD chargers can supply 20 V at 3 A for 60 W, or even 5 A (100 W) with a properly rated cable. The newer PD 3.1 spec (Extended Power Range) goes even further, up to 48 V at 5 A for 240 W. This is all determined by a negotiation between the charger, the cable (which has a small chip for this purpose), and the device, which is why PD chargers must support multiple voltages.
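For reference, a rough sketch of the common fixed profiles and the wattage each works out to (exactly which ones a given charger advertises varies by model):

```python
# Common USB PD fixed voltage/current profiles and the resulting wattage.
profiles = [
    (5, 3),    # 15 W  - baseline
    (9, 3),    # 27 W
    (15, 3),   # 45 W
    (20, 3),   # 60 W
    (20, 5),   # 100 W - needs a 5 A e-marked cable
    (48, 5),   # 240 W - PD 3.1 Extended Power Range
]

for volts, amps in profiles:
    print(f"{volts:>2} V x {amps} A = {volts * amps:>3} W")
```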
Modern gallium nitride (GaN) based phone chargers can be around 95% efficient.
The very best, most expensive PC power supplies on 115 V AC will only reach 94% efficiency, and only at a very specific load: 50% of the power supply's rated wattage. So if you have a 500 watt power supply and aren't using almost exactly 250 watts, you aren't getting that 94%. Regular power supplies under normal variable load conditions are going to be somewhere in the 80% efficient range. If the PC is idle, that efficiency can drop to 20% (but it's fine, because it's only a few watts).
So using a modern gallium nitride standalone charger will be more efficient. It will be far more efficient if you use that standalone charger instead of charging off your PC while the PC is idle.
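Rough arithmetic behind that, assuming the 95% GaN figure above and a PSU sitting around 80% under a typical mixed load (both assumptions, not measurements):

```python
# Wall power drawn to deliver 25 W to the phone at two assumed conversion efficiencies.
PHONE_LOAD_W = 25.0

for name, efficiency in (("GaN wall charger", 0.95), ("PC PSU (typical load)", 0.80)):
    wall_draw = PHONE_LOAD_W / efficiency
    print(f"{name:22s}: {wall_draw:5.1f} W from the wall, "
          f"{wall_draw - PHONE_LOAD_W:4.1f} W wasted as heat")
```

So the gap is on the order of a few watts while the phone is actually charging.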
Counterpoint: most computer power supplies have an efficiency curve that peaks somewhere north of 50% load. If your PC is substantially below the peak of that curve, then adding load (the phone) could raise the PSU's efficiency, say from 80% to 85% (I'm making up numbers), which would affect the overall efficiency of the entire PC's load.
I think your answer is still probably correct, but it's an interesting nuance to think about.
Side notes: some PSUs use gallium nitride, e.g. the Corsair AX1600i, though by and large most do not. Also, if you're in the EU, then you're working with 220/240 V, which adds a bit more efficiency, but that applies to the phone charger as well.
Adding a 30 watt phone is going to be maybe 5% of the PC's load, so it could raise efficiency by a couple of percentage points. But that is insignificant compared to the normal swings of 200+ watts between idle and full load.
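A toy version of that arithmetic (every number here is invented just to show the shape of the argument, as in the comment above):

```python
# Toy comparison: does pushing the PSU up its efficiency curve offset using a separate charger?
PC_DC_LOAD_W = 150.0   # assumed DC load of the PC itself
PHONE_W = 30.0         # assumed phone charging load

# Scenario A: phone on its own 95%-efficient wall charger, PSU stays at 82%
wall_a = PC_DC_LOAD_W / 0.82 + PHONE_W / 0.95

# Scenario B: phone charged from the PC, nudging the PSU up to 84%
wall_b = (PC_DC_LOAD_W + PHONE_W) / 0.84

print(f"Separate charger: {wall_a:6.1f} W from the wall")
print(f"Charge from PC:   {wall_b:6.1f} W from the wall")
```

With these made-up numbers the two scenarios land within a watt of each other, which is exactly the nuance: the marginal efficiency gain can eat most of the difference.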
Your computer doesn't "waste" electricity; power usage is on demand. A PSU generally has three "rails": a 12 V rail (this powers most of the devices), a 5 V rail (for peripherals/USB), and a 3.3 V rail (IIRC memory modules use this). Modern PSUs are switched-mode power supplies, which use a switching voltage regulator that is more efficient than a traditional linear regulator.
The efficiency of the PSU/charger is what determines which one is more wasteful. Most PSUs (I would argue any PSU of quality) will have an 80 Plus rating that defines how efficiently they convert power. I'm not familiar enough with modern wall chargers to know what they're testing at. I could see low-end wall chargers using more wasteful designs, but a high-quality rapid wall charger is probably close to, if not on par with, a PC PSU. Hopefully someone with more knowledge of these can weigh in.
Not to discourage such thoughts in the future, but your single post asking here probably used up more electricity than what you would save over the course of the next ten years.
In that case, you can make it a point to charge when the grid is “cleaner” - usually overnight. Your electricity costs may be cheaper then anyway.
The Apple Home app shows a grid forecast for your location, with cleaner times highlighted in green. I’m sure they pull this info from the utility company, so the info should be available in other smart home apps or maybe even your utility’s website.
But like others said, phone charging is very minimal. We’re talking about a 20W charger vs. say, a 1500W air fryer. Running larger appliances off-hours is a bigger deal - dishwasher, laundry, etc.
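Back-of-the-envelope version of that comparison (the 20 Wh battery, 90% charge efficiency, and $0.15/kWh price are all assumptions):

```python
# Energy and cost of one full phone charge vs. an hour of a 1500 W appliance.
PHONE_BATTERY_WH = 20.0
CHARGE_EFFICIENCY = 0.90
PRICE_PER_KWH = 0.15

phone_kwh = PHONE_BATTERY_WH / CHARGE_EFFICIENCY / 1000
air_fryer_kwh = 1500 / 1000 * 1.0  # 1500 W for one hour

print(f"Full phone charge: {phone_kwh:.3f} kWh (~${phone_kwh * PRICE_PER_KWH:.4f})")
print(f"Air fryer, 1 hour: {air_fryer_kwh:.3f} kWh (~${air_fryer_kwh * PRICE_PER_KWH:.2f})")
```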
I'd like to remind everyone of the "vampire effect" of wall-wart chargers: if you just leave them plugged into the wall waiting for you to connect a device, you're constantly wasting a bit of electricity. That should also factor into the efficiency decision versus using the already plugged-in computer or laptop.
The vampire effect was a real problem with older transformer-based wall warts. If you still have one that feels heavy and solid, or has a fixed energy rating on the label, this applies to you.
If it feels light, almost empty, or the label lists a wide range of frequencies and voltages, it uses an order of magnitude less power and really doesn't add up anymore. I believe it's something like a couple of cents per year.
Edit: found something from 14 years ago calculating the worst-case scenario of leaving a wall wart plugged in as 7¢/month, but I don't believe it's anywhere near that expensive for a modern one. Although it's probably more telling that I didn't find anything more recent with a price estimate.
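If you want to put numbers on it (the 0.05 W and 1 W standby figures and the $0.15/kWh price are assumptions for illustration):

```python
# Yearly cost of leaving a charger plugged in with nothing attached.
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.15

for label, standby_w in (("modern switch-mode charger", 0.05), ("old transformer wall wart", 1.0)):
    kwh = standby_w * HOURS_PER_YEAR / 1000
    print(f"{label:27s}: {kwh:5.2f} kWh/year (~${kwh * PRICE_PER_KWH:.2f})")
```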
Unless your chargers are generating a noticeable amount of heat (and they shouldn't be), the amount of electricity they are using is simply negligible. Electricity cannot simply evaporate into thin air.
It is a little bit more efficient, because your PC's power supply is not very efficient when you don't use much power. By loading it more, it becomes more efficient, peaking somewhere around 50% load.
But: we are talking about very small differences here, and only for desktop PCs. When you use a laptop, your power supply is much less powerful, so you're already using more of its capacity just by running the laptop. So in that case, I would rather use the charger. To be perfectly honest with you: all of this is not really worth thinking about. It's like opening your fridge for only 6 seconds instead of 8. Yes, it saves power, but there are better ways to do so. Driving just 1 kilometer less in your electric car is about the same as charging your phone 10 times, as the rough numbers below show.
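(The 0.17 kWh/km EV consumption and 20 Wh phone battery below are assumed typical values, just to sanity-check that comparison.)

```python
# Sanity check: 1 km of EV driving vs. 10 full phone charges.
EV_KWH_PER_KM = 0.17
PHONE_CHARGE_KWH = 0.020  # ~20 Wh battery, ignoring charging losses

print(f"1 km of EV driving: {EV_KWH_PER_KM:.3f} kWh")
print(f"10 phone charges:   {10 * PHONE_CHARGE_KWH:.3f} kWh")
```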
If you've got a high-efficiency charger, then I'd say it's probably more efficient to use that charger.
The warmer you run your computer, the less efficient it becomes, and the shorter the lifespan of the hottest chips in it (though this effect shouldn't be significant).
E.g. increasing a CPU's temperature by 10 °C should roughly cut its lifespan in half.
By having more heat-generating stuff going on in your computer, you impair the cooling of your CPU and GPU (slightly, probably), and that may affect your computer's time to failure.
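The 10 °C rule of thumb written out as a formula (the 50 °C reference temperature and 100,000-hour base lifespan are made-up illustrative numbers, and the rule itself is only a rough heuristic):

```python
# lifespan(T) ~= lifespan(T_ref) * 2 ** ((T_ref - T) / 10)
BASE_TEMP_C = 50.0
BASE_LIFESPAN_HOURS = 100_000.0

def estimated_lifespan_hours(temp_c: float) -> float:
    return BASE_LIFESPAN_HOURS * 2 ** ((BASE_TEMP_C - temp_c) / 10)

for temp in (50, 60, 70):
    print(f"{temp} C -> ~{estimated_lifespan_hours(temp):,.0f} hours")
```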
The amount of heat this will add to your case is negligible. We're talking 15% waste on a 20W load, so 3W worth of extra heat. And that heat is produced in the PSU.
If the heat is negligible, I would assume it shouldn't matter, as long as you don't charge while your PC is doing a task that uses up too much of its resources?