The vast majority of cell phones use a single-cell Li-ion battery, so their capacities can be directly compared using mAh. Laptops almost always contain multi-cell Li-ion batteries, so their capacities cannot be directly compared using mAh (e.g. a 4S battery rated for 2500 mAh holds more energy than a 3S battery rated for 3000 mAh).
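A quick back-of-the-envelope sketch of that comparison (assuming a 3.7 V nominal cell voltage, which is only an approximation):

    # Rough sketch: why mAh alone can't compare packs with different cell counts.
    # Assumes a 3.7 V nominal Li-ion cell; real nominal voltages vary a bit.
    NOMINAL_CELL_V = 3.7

    def pack_wh(series_cells: int, mah: float) -> float:
        """Approximate pack energy in watt-hours."""
        return series_cells * NOMINAL_CELL_V * mah / 1000

    print(pack_wh(4, 2500))  # 4S, 2500 mAh -> 37.0 Wh
    print(pack_wh(3, 3000))  # 3S, 3000 mAh -> 33.3 Wh (fewer Wh, more mAh)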
So why don't we use Wh for phones too? Simply because manufacturers would rather advertise a battery size of five thousand mAh (wow, so much capacity!) instead of 19 Wh.
The same issue happens with portable USB battery packs - they're all advertised in mAh even though they use a wide variety of chemistries and cell configurations internally. What manufacturers do is take the total Wh of the pack and convert it back to the equivalent mAh of a single-cell Li-Ion. It's annoying, and I really wish they would just use Wh directly.
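For what it's worth, the conversion they're doing is trivial; a sketch, assuming the usual 3.7 V single-cell nominal and a made-up pack size:

    # Sketch of the power-bank convention: take the pack's real energy in Wh
    # and quote it as the mAh of a hypothetical single 3.7 V Li-ion cell.
    NOMINAL_CELL_V = 3.7

    def advertised_mah(pack_wh: float) -> float:
        return pack_wh / NOMINAL_CELL_V * 1000

    print(round(advertised_mah(74)))  # a 74 Wh pack -> "20000 mAh" on the box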
I don't think they know about metric prefixes, Pip.
Imagine if the marketing people discovered that they could advertise that it has 19 million µWh (in Doctor Evil voice). Don't say it too loudly though, someone at Apple might hear.
That would be ideal, but I think at this point there's just too much marketing momentum using mAh, and switching to mWh would be too confusing to consumers. But yeah, I agree, mWh is definitely the most appropriate unit to use.
And 19000 mWh. I'd rather have 19000 of something than 5000. I feel cheated, and no amount of telling me it's exactly the same will change my mind.
Generally, phones used single Li-ion (3.7V nominal) batteries, so ratings could just be based on current rather than power, and you could still get a decent comparison between smartphones.
Laptop batteries tend to use an operating voltage several times that (2 cells run at 7.4V-ish, 3 cells at 10.8 to 11.4V nominal, 4 cells at 14.8V, and so on), but the number of cells can vary wildly per model, so Wh makes it easier to compare numbers between laptops.
Can I add a follow-up question: why don't normal batteries have any useful measurements on them? At least in the UK anyway, not sure about elsewhere. Rechargeable batteries will have an Ah rating, but normal AA or AAA etc. will just say "Ultimate" or "Advance" etc. Why can't we just have an Ah or Wh figure, or even a standardised rating based on a fixed discharge current or something? It's infuriating that in 2023 I'm buying something with no way of quantifying its contents other than the inference of the product name.
The reason phone vendors can advertise capacity is because the load (the phone) is a known quantity. They made the phone, so they can reliably estimate the battery's capacity based on average use by that phone.
Similarly, power bank manufacturers can do the same, because the load is controlled by them. The USB port might only provide 5V at 1.5A or 3A - whatever the power bank manufacturer put in - so they can reliably estimate how much current over time the battery can provide.
But makers of alkaline batteries don't have that knowledge. They have no way of knowing if you're going to put them into a kid's toy that pulls only 20mA, or a DC motor for a rotisserie that pulls 1A. So they can't possibly provide you any measure of Ah that is going to satisfy all consumers. If they did, they'd open themselves up to legal problems for making misleading claims about their product.
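To put a rough number on that, a toy Peukert-style model (Peukert's law is really a lead-acid model, and all the figures here are invented, so treat it as illustration only):

    # Toy model of rate-dependent capacity: effective Ah shrinks as drain rises.
    # Peukert-style formula; the exponent and AA-cell numbers are invented.
    rated_ah, rated_a, k = 2.8, 0.025, 1.3

    def effective_ah(current_a: float) -> float:
        return rated_ah * (rated_a / current_a) ** (k - 1)

    print(round(effective_ah(0.02), 2))  # 20 mA toy load -> ~2.99 Ah
    print(round(effective_ah(1.0), 2))   # 1 A motor load -> ~0.93 Ah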
I don't find that to be a particularly compelling argument, though. If you go to buy a lead-acid battery for solar usage, for example, they give you the capacity based on a 20-hour discharge (i.e. a C/20 rate). The same could absolutely be done for primary batteries.
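And the C/20 convention is dead simple (the 100 Ah figure is just an example):

    # The 20-hour (C/20) rating: capacity is measured at a discharge current
    # of capacity/20, held for about 20 hours. Example numbers only.
    rated_ah = 100
    c20_current_a = rated_ah / 20
    print(c20_current_a)  # 5.0 A sustained for ~20 hours defines the rating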
Laptops predate cell phones in mainstream use. When laptops started, there were a variety of battery types in use with no standard charging voltage so Wh was the fair way to compare.
Cell phones have pretty much always been 3.7V lithium, so mAh is a fair comparison and gives a bigger number than Wh.
3000 mAh × 3.7V = 11,100 mWh
Much bigger. Much better.
I hate mAh… it gives absolutely no information about how much energy is inside without taking the voltage into account. If you use (m)Wh directly, you immediately have the amount of energy the battery can contain.
The thing is, batteries are measured in Ah, not Wh. That's because their voltage changes all the time (and is mostly the same for a given chemistry), and also because for most of their uses, the current is the actually useful information.
Phones are just using the standard metric. It's laptops that are weird.
Frankly I'd be fine with watt-hours too, as long as it's consistent. It isn't like converting one into the other is hard (1kWh = 3.6MJ).
My point for joules is twofold:
Due to the name, plenty of people confuse power (watts) with energy (watt-hours). Joules avoid this.
You can also convert other "esoteric" units of energy to joules, for better comparison across fields. Such as "food calories" (i.e. kilocalories; 1kcal ≈ 4.2kJ)
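A quick conversion sketch tying those together (standard constants, nothing exotic):

    # Handy conversions: Wh <-> J <-> kcal.
    J_PER_WH = 3600        # 1 Wh = 3600 J, hence 1 kWh = 3.6 MJ
    KJ_PER_KCAL = 4.184    # 1 food calorie (kcal) ~= 4.184 kJ

    print(1000 * J_PER_WH / 1e6)              # 1 kWh in MJ -> 3.6
    print(19 * J_PER_WH / 1000)               # the 19 Wh phone battery -> 68.4 kJ
    print(round(19 * 3.6 / KJ_PER_KCAL, 1))   # ... or about 16.3 kcal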
Wh/kWh are defined as energy per hour, while a joule is energy (in watts) per second.
It's power times hours and power times seconds, respectively ("per" is usually understood as division). That said, you're right that they're similar. The only difference is whether you're using hours or seconds to measure time.
Wh is a unit of energy, Ah is a unit of electric charge, basically how many physical electrons passed by.
The voltage of a battery goes down gradually as it is discharged, so getting an accurate value for total energy dissipated is very complicated, as this varies greatly with the discharge profile and other physical factors like the age/health of the battery.
The one thing that stays constant is the amount of electric charge a battery can provide.
If it's old, the voltage of that charge will be lower and go down quicker, but it will be the same total charge.
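A tiny numerical sketch of the difference (the discharge-curve values are made up): charge needs only current and time, while energy has to integrate the sagging voltage.

    # Charge vs energy for a constant 1 A discharge over 3 hours.
    # Voltage samples are invented to mimic a sagging Li-ion curve.
    current_a = 1.0
    hours = [0, 1, 2, 3]                 # uniform 1-hour spacing
    voltage_v = [4.1, 3.8, 3.6, 3.2]

    charge_ah = current_a * hours[-1]    # 3.0 Ah, regardless of the curve

    # Trapezoidal integration of V * I over time (each step is 1 h) -> Wh.
    energy_wh = sum((voltage_v[i] + voltage_v[i + 1]) / 2 * current_a
                    for i in range(len(hours) - 1))

    print(charge_ah)  # 3.0 Ah -- the "constant" quantity
    print(energy_wh)  # 11.05 Wh -- shrinks if the voltage curve sags with age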
I agree that from a consumer point of view joules would be a friendlier unit; however, it is also a lot easier to game. Electric charge is a much more definite unit in an electrical-engineering sense.
If any of what I said is confusing, please ask me to clarify. I'm assuming a basic level of electronic literacy, but it's hard to know what knowledge I'm taking for granted as an ex-electrical engineer.
I can't imagine it's hard to establish a standard environment for battery capacity testing, or that such a standard doesn't exist already. Charge might be the more definite unit, but it is not the useful unit. I think the closer they get to actually measuring the battery's performance, the better.
Charge is meaningless if it's not provided at a useful voltage though. What people truly care about is usable energy, which is what watt-hours or joules tell us. For example, I don't care if my portable battery pack is 1000 milliamp-hours; it's meaningless unless I also know the battery chemistry used (nominal voltage) and the number of cells, so I can figure out the actual potential energy.
Also, as a phone's battery ages, if I'm not mistaken it truly does hold less "charge", but I still believe the more useful metric is actual energy stored. That's how it's done in the EV scene: you use kWh to see how much energy is left in your battery. As the battery ages, "100%" represents slightly less energy (kWh).
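To the point above about needing chemistry and cell count: a sketch of turning an mAh label into energy (the nominal voltages are typical figures; the configurations are made up):

    # Same mAh label, very different energy, depending on chemistry and cells.
    NOMINAL_V = {"li-ion": 3.7, "lifepo4": 3.2}

    def energy_wh(mah: float, chemistry: str, series_cells: int = 1) -> float:
        return mah / 1000 * NOMINAL_V[chemistry] * series_cells

    print(energy_wh(1000, "li-ion"))      # 1S Li-ion:   3.7 Wh
    print(energy_wh(1000, "lifepo4", 4))  # 4S LiFePO4: 12.8 Wh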
For the same reason Audi didn't sell the Audi "5", Pontiac never sold a "6LE", and Saab didn't try to sell the "9 turbo".
It sounds more impressive with the zeros added.
And I think it gets more obvious once you compare two phones and see which one to buy. I can tell you immediately that between a battery with 3450mAh and one with 4200mAh there's a 750mAh difference.
But I have to look twice at the numbers if it's 3.45Ah and 4.2Ah and I want to see if it's worth the extra $70.