Ahhhh, should have known that it wouldn't be as straightforward as I thought, I might get one of them plug in things at some point, cheers
Why bother, when it is something you cannot affect? The charging process is even more complex than indicated in the last post. The charger is a switch-mode power supply converting the mains voltage to a lower voltage, typically 36 to 42 V for a normal Li-ion pack. It is not 100% efficient and creates some heat in use, so a watt meter on its input will measure only the total power consumed by the charger, not what reaches the battery.

The output from the charger charges the battery pack, initially at a fixed current with a gradually rising voltage, the current limit having been chosen to balance charge time against the chemistry of the battery. If the current is too low it takes ages to charge; if the current is too high, the conversion of ions is inefficient and energy is wasted as heat, on top of the normal heating effect of a current flowing through a resistance. Exact values depend on the battery electrochemistry used and on temperature, and this is why vendors expect you to use their matched chargers and batteries.
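If anyone wants to see the arithmetic, here is a rough back-of-envelope sketch in Python of why the wall reading overstates what the pack receives. The efficiency, current and voltage figures are illustrative assumptions picked for the example, not the specs of any particular charger:

```python
# Rough illustration of why a watt meter at the wall overstates what the
# battery receives. All numbers are illustrative assumptions, not specs.

CHARGER_EFFICIENCY = 0.88   # assumed switch-mode efficiency (often ~85-92%)
CHARGE_CURRENT_A = 2.0      # assumed constant-current limit
PACK_VOLTAGE_V = 38.5       # pack voltage part-way through the CC phase
HOURS = 1.0

energy_into_pack_wh = PACK_VOLTAGE_V * CHARGE_CURRENT_A * HOURS
energy_at_wall_wh = energy_into_pack_wh / CHARGER_EFFICIENCY

print(f"Energy into pack: {energy_into_pack_wh:.0f} Wh")
print(f"Energy at wall:   {energy_at_wall_wh:.0f} Wh "
      f"({energy_at_wall_wh - energy_into_pack_wh:.0f} Wh lost as heat in the charger)")
```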
When the battery is nearly charged, the current falls below the current limit and the voltage rises to approximately 42 V (for a typical 36 V-nominal, 10-cell pack). When the voltage across any cell in the battery gets close to 4.2 V, the charger switches off. Depending on the sophistication of the battery management system (BMS), a process of cell balancing then takes place. A wattmeter connected between the charger and the battery will measure only the energy fed into the battery, including the part that is converted into heat inside it, so it is also not a very useful measurement.
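Here is a much-simplified sketch of that hand-over and cut-off logic for a 10-cell pack. The 4.2 V per-cell limit is the commonly quoted Li-ion figure; the current limit and taper cut-off are assumptions for illustration, and real charger/BMS firmware also handles temperature, timeouts and the balancing itself:

```python
# Simplified CC-CV charge logic for a 10-cell (10S) Li-ion pack.
# Thresholds below are illustrative; a real charger/BMS is far more involved.

CELL_FULL_V = 4.2        # common per-cell cut-off voltage
CURRENT_LIMIT_A = 2.0    # constant-current phase limit (assumed)
TAPER_CUTOFF_A = 0.1     # charger switches off once current falls this low

def charge_step(cell_voltages, current_a):
    """Return the charger's action for one control step."""
    if max(cell_voltages) >= CELL_FULL_V and current_a <= TAPER_CUTOFF_A:
        return "switch off; BMS may now balance cells"
    if max(cell_voltages) >= CELL_FULL_V:
        return "constant-voltage phase: hold voltage, let current taper"
    return f"constant-current phase: push {CURRENT_LIMIT_A} A, voltage rises"

print(charge_step([3.9] * 10, 2.0))          # mid-charge
print(charge_step([4.2] + [4.15] * 9, 0.5))  # one cell full, current tapering
print(charge_step([4.2] * 10, 0.05))         # done
```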
Similarly, when the battery is being discharged in use, the total energy available depends on the rate of consumption as well as the temperature.
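That rate dependence is sometimes approximated with Peukert's law. Strictly that model comes from lead-acid batteries, so treat this as a rough sketch only; the capacity rating and the exponent below are assumptions (Li-ion exponents are close to 1.0):

```python
# Rough illustration (Peukert's law) of why delivered capacity depends on
# discharge rate. Exponent and rating are assumptions; temperature would
# shift these figures further.

RATED_CAPACITY_AH = 10.0   # assumed pack rating at the 5-hour rate
RATED_HOURS = 5.0
PEUKERT_K = 1.05           # assumed exponent for a Li-ion pack

def delivered_capacity_ah(current_a):
    """Effective capacity at a given discharge current, per Peukert."""
    t = RATED_HOURS * (RATED_CAPACITY_AH / (current_a * RATED_HOURS)) ** PEUKERT_K
    return current_a * t

for amps in (2.0, 5.0, 10.0):
    print(f"{amps:4.1f} A draw -> {delivered_capacity_ah(amps):.2f} Ah delivered")
```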