# How do you charge lithium batteries?

6456 Views 13 Replies 9 Participants Last post by ElectriCar
I was hoping to find out a little more about how the charger actually charges the batteries.

Is it as simple as setting a maximum voltage and then applying that voltage to the battery until the battery comes up to that level?

For example, if I have a lithium cell that I want to charge to 4 V, could I just apply 4 V to the cell no matter what the cell's voltage is, even if the cell is discharged to 3 V?

Or does the charge voltage need to be set slightly higher than the battery voltage and tracked upward as it charges?
For example:
cell voltage = 3.0 V, charge voltage = 3.1 V;
cell voltage = 3.2 V, charge voltage = 3.3 V;
etc.

There are two reasons I ask. First, I've seen a couple of people use DC/DC converters to charge cells individually, and these can't adjust their output voltage. Second, I've been pondering the idea of installing a permanent petrol generator, whose output I'll rectify to DC and then use to charge the batteries on the go.

So will a constant charge voltage damage the batteries?

Thank you for all your help.
1 - 3 of 14 Posts
Most 'smart chargers' start with a constant-current (CC) phase, jamming in as much juice as the cells can handle until the voltage comes up to some trigger point (typically around 3.65 V × number of cells), then switch to constant voltage (CV) and hold until the amps drop to 'nothing'.

In a perfectly matched pack, all cells hit the trigger at the same time and the pack voltage reflects the intended CC-to-CV switch. If the cells are NOT balanced, then high cells may 'take off' before the pack voltage triggers the charger into CV, unless they're controlled by a BMS.

The key is that good balance lets every cell hit that trigger voltage at the same time, and precise control of that trigger voltage is required.
There is only one way to charge lithium batteries, IMHO and per the manufacturers.

It starts with a constant current of no more than 1C, or whatever C rate the manufacturer specifies.

Once the cell voltage reaches 4.2 volts per cell (or slightly lower, 4.1 V, for increased cycle life), the charger goes constant voltage until the charge current tapers to 3 to 5% of the battery's rated Ah capacity.
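That CC-then-CV sequence can be sketched with a toy simulation. To be clear, this is my own illustration, not code from any real charger: the linear open-circuit-voltage model, the internal resistance, and all the numbers are made-up placeholders just to show the two phases and the current taper.

```python
def ccv_charge(capacity_ah, cc_amps, cv_volts, cutoff_frac=0.05,
               r_internal=0.01, dt_h=0.01):
    """Toy CC/CV charge simulation (illustrative only).

    Assumes a crude cell model: open-circuit voltage rises linearly
    with state of charge, plus an internal-resistance voltage drop.
    Holds cc_amps until the terminal voltage hits cv_volts, then
    holds cv_volts while the current tapers toward the cutoff
    (3-5% of rated Ah capacity, per the post above).
    """
    soc = 0.5                       # start half charged
    amps = cc_amps
    phases = []
    while amps > cutoff_frac * capacity_ah:
        ocv = 3.0 + 1.2 * soc       # made-up linear OCV model (volts)
        if ocv + amps * r_internal < cv_volts:
            amps = cc_amps          # CC phase: full charge current
            phases.append("CC")
        else:
            # CV phase: hold cv_volts; current is whatever flows
            # through the internal resistance, tapering as OCV rises.
            amps = max((cv_volts - ocv) / r_internal, 0.0)
            phases.append("CV")
        soc = min(soc + amps * dt_h / capacity_ah, 1.0)
    return soc, phases

soc, phases = ccv_charge(capacity_ah=100.0, cc_amps=30.0, cv_volts=4.1)
```

With these placeholder numbers the run starts in CC, switches to CV once the terminal voltage reaches 4.1 V, and stops when the current tapers below 5 A (5% of 100 Ah).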

Whoa, Nelly... I don't think you want to go above an average cell voltage of 3.8 V, to leave some margin for error! It's probably safer around 3.7 V, where the cells start climbing the 'knee' but are still well before meltdown.
While we're in a voltage-cutting mood, I've quit going over 3.6 volts per cell on a regular basis. There doesn't seem to be much capacity up there.

I would agree... and most of the conservative preset charger curves seem to use 3.65 V × cell count as the CC->CV trigger voltage. This leaves a little room for a couple of cells to be 'out of balance' and still have the series pack voltage hit the mark before the high ones melt down...
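The risk being described can be made concrete with some toy arithmetic (my own numbers, not from the thread): a charger that only sees pack voltage switches to CV at 3.65 V × cell count, so one runaway cell can sit far above 3.65 V while the pack total still looks fine.

```python
# Hypothetical 12s pack with eleven low cells and one high runner.
cells = [3.60] * 11 + [4.20]

trigger = 3.65 * len(cells)   # pack-level CC->CV trigger: 43.8 V
pack_v = sum(cells)           # 11 * 3.60 + 4.20 = 43.8 V

# The pack voltage exactly matches the trigger, so a pack-only
# charger is satisfied -- yet the high cell is already at 4.20 V,
# well past the 3.65 V per-cell target. Only a cell-level BMS (or
# good top-balancing) catches this.
print(round(pack_v, 2), round(trigger, 2), max(cells))
```

This is exactly why the posts above stress tight top-balancing (or a BMS) when relying on a pack-voltage trigger.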

Having faith in a decent charger to accurately sense pack voltage and catch that trigger point to switch to CV is key to considering running without a cell-level BMS. It also means that the better job you do top-balancing, the more likely all the cells will hit the mark at nearly the same time.

By the same token, you have to accept that a top-balanced pack will NOT be bottom-balanced, and if you run the pack down too low, the cell(s) with less capacity may be damaged.