
How do you charge lithium batteries?

I was hoping to find out a little more about how the charger actually charges the batteries.

Is it as simple as setting the max voltage and then pumping that many volts into the battery until the battery reaches that level?

For example, if I have a lithium battery that I want to charge to 4V, could I just pump 4V into the cell no matter what the voltage of the cell is, even if the cell is discharged to 3V?

Or does the charge voltage need to be set at a slightly higher level than the battery voltage?
For example:
cell voltage = 3.0V, charge voltage = 3.1V
cell voltage = 3.2V, charge voltage = 3.3V
etc.

There are two reasons I ask. Firstly, I've seen a couple of people use DC/DC converters to charge cells individually, and these can't adjust the voltage. Secondly, I have been pondering the idea of putting in a permanent petrol generator, which I'll rectify to DC and then use to charge the batteries on the go.

So will a constant charge voltage damage the batteries?

Thank you for all your help :D
My CALB SE 130AHA batteries (38 pcs.) have a charge voltage of about 3.6V.
I charge them (with a Zivan NG3) to 3.5V max (133V), so I don't "pump" them up to the max.
This should increase lifetime and safety when charging cells in series.

I use an initial constant current (129V / 15A); toward the end of the charge the current steps slowly down (between 9A and zero) and the voltage rises to 133V.

IMHO a constant charge voltage would not damage the cells, but it might be difficult to keep the voltage from running high at the end of the charge. The less current you draw, the more voltage an unregulated power supply puts out, I think. A good charger will compensate for this.

But let's read what others write to your post ... :)
For example, if I have a lithium battery that I want to charge to 4V, could I just pump 4V into the cell no matter what the voltage of the cell is, even if the cell is discharged to 3V?
Usually, yes, but only because a charger has a current limit built in. So with a cell at 3.0 V, you apply, say, a 20 A charger set to 4.0 V, and the charger immediately adjusts its output voltage down to about 3.02 V so that the current stays under 20 A. As the voltage of the cell rises, the output voltage of the charger rises with it, until it hits the 4.0 V limit you set. At that point, the charger will be putting out little or no current, maintaining the cell at 4.0 V.
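
To make that concrete, here's a toy sketch of how a current-limited charger settles its output. The cell is modelled as an ideal voltage source behind an assumed fixed internal resistance, which is a simplification; the resistance value here is invented:

```python
# Toy model: a CC/CV charger feeding a cell modelled as an ideal
# voltage source behind a fixed internal resistance. Invented values.

CURRENT_LIMIT = 20.0    # A: the charger's built-in current limit
VOLTAGE_LIMIT = 4.0     # V: the maximum voltage you set
R_INTERNAL = 0.001      # ohm: assumed cell internal resistance

def charger_output(cell_ocv):
    """Terminal voltage and current the charger settles at for a cell
    with open-circuit voltage cell_ocv."""
    current = (VOLTAGE_LIMIT - cell_ocv) / R_INTERNAL
    if current > CURRENT_LIMIT:
        # Constant-current phase: back the voltage off to hold 20 A.
        current = CURRENT_LIMIT
        volts = cell_ocv + current * R_INTERNAL
    else:
        # Constant-voltage phase: hold 4.0 V while the current tapers.
        volts = VOLTAGE_LIMIT
    return volts, current

print(charger_output(3.00))   # (3.02, 20.0): the 3.02 V from the text
print(charger_output(3.99))   # (4.0, ~10.0): near full, current tapering
```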

Or does the charge voltage need to be set at a slightly higher level than the battery voltage?
It does, but the current limit on the charger does that for you automatically.

Firstly, I've seen a couple of people use DC/DC converters to charge cells individually, and these can't adjust the voltage.
If these don't current-limit, then they're not suitable for battery charging. All EV DC/DC converters have a current limit, and will reduce their output voltage as needed to respect it.

Secondly, I have been pondering the idea of putting in a permanent petrol generator, which I'll rectify to DC and then use to charge the batteries on the go.
For that, you'll need some sort of current limit. Fortunately, it should be pretty easy to set up a circuit that adjusts the field voltage of the alternator to maintain about the right current into the batteries.
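
As a sketch of what that circuit's logic has to do (illustrative only; the target current and step size are invented, and in practice this would be an analog regulator or a small microcontroller reading a current shunt):

```python
# Toy sketch of a current-limiting field controller for an alternator.
# measured_amps would come from a shunt or Hall sensor; the returned
# field voltage would drive the alternator's field winding.

TARGET_AMPS = 30.0                 # desired charge current (example value)
FIELD_STEP = 0.05                  # volts per tick; small steps keep it stable
FIELD_MIN, FIELD_MAX = 0.0, 12.0   # field winding voltage range

def regulate(field_volts, measured_amps):
    """One control tick: nudge the field voltage up or down so the
    alternator's output current converges on TARGET_AMPS."""
    if measured_amps > TARGET_AMPS:
        field_volts -= FIELD_STEP   # too much current: weaken the field
    else:
        field_volts += FIELD_STEP   # headroom left: strengthen the field
    return min(FIELD_MAX, max(FIELD_MIN, field_volts))
```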

So will a constant charge voltage damage the batteries?
If you connect a power source with no current limit and there is a significant voltage difference, then uncontrolled current will flow. Lithium iron (LiFePO4) cells do not like to be charged too fast, and will lose life if charged at too high a current.

The other issue is that no EV pack is a single cell; there are always cells (or groups of paralleled cells) in series. Just because the average cell voltage is say 3.8 V, does not mean that no cell is exceeding 4.0 V, and that's bad for the cell. That's what battery management systems are about.
Normally you'd set your charger to the maximum you want to charge the battery pack to as a whole to fill it (so for a 144V nominal LiFePO4 pack, say at 3.8V per cell when 'full', that would be 171V).

In this scenario you're not really trying to fill the battery, just keep it topped up. So surely you'd set the charger's max voltage close to the NOMINAL pack voltage; that way there is very little risk of overcharging any single cell (your pack would need to be WAY out of balance for that to happen).
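
For reference, here's the arithmetic behind those two set points (a quick sketch assuming 3.2V nominal LiFePO4 cells, which is what gives 45 cells for a 144V pack):

```python
# Pack voltage targets for a 144 V nominal LiFePO4 pack.
NOMINAL_PER_CELL = 3.2                    # V: typical LiFePO4 nominal
FULL_PER_CELL = 3.8                       # V per cell when 'full'

cells = round(144 / NOMINAL_PER_CELL)     # 45 cells in series
print(cells * FULL_PER_CELL)              # 171.0 V: 'fill it up' set point
print(cells * NOMINAL_PER_CELL)           # 144.0 V: 'keep it topped up' set point
```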
Usually, yes, but only because a charger has a current limit built in. So with a cell at 3.0 V, you apply, say, a 20 A charger set to 4.0 V, and the charger immediately adjusts its output voltage down to about 3.02 V so that the current stays under 20 A. As the voltage of the cell rises, the output voltage of the charger rises with it, until it hits the 4.0 V limit you set. At that point, the charger will be putting out little or no current, maintaining the cell at 4.0 V.
So really I should be talking in kW, not amps. So if I try to charge a 3V cell with 4V (and 20-ish amps), the charge voltage will drop to (around) 3.2V and work its way up until the battery voltage reaches the charge voltage. So in a way the voltage will adjust itself to what I need it to be to charge the batteries?

For that, you'll need some sort of current limit. Fortunately, it should be pretty easy to set up a circuit that adjusts the field voltage of the alternator to maintain about the right current into the batteries.
I'm not very technically minded, so whatever I come up with has to be fairly simple. The generator can only put out so much current, and I plan to only use it while I drive, when I can watch what is happening. As long as the battery voltage doesn't go too high, the generator shouldn't produce enough current to do any damage, right?

I could put a 30A fuse between the generator and batteries, or find a way to shut off the generator automatically if the voltage or current went too high. Would these options be a suitable failsafe?


If you connect a power source with no current limit and there is a significant voltage difference, then uncontrolled current will flow. Lithium iron (LiFePO4) cells do not like to be charged too fast, and will lose life if charged at too high a current.
I'm betting on the generator not being able to put out many kW, so it probably can't do much damage anyway; however, I'll find a way to limit it regardless.

What happens if I floor it and the battery voltage sags while I'm charging? I'm guessing the current will just go into the motor and not into the batteries, but what do you think?

The other issue is that no EV pack is a single cell; there are always cells (or groups of paralleled cells) in series. Just because the average cell voltage is say 3.8 V, does not mean that no cell is exceeding 4.0 V, and that's bad for the cell. That's what battery management systems are about.
Like dc braveheart said, I'll stay in the middle of the cell range to increase safety, and I'll also have a decent BMS.


Sorry to bombard you with questions, but I guess that's the curse of knowing what you're talking about.
So if I try to charge a 3V cell with 4V (and 20-ish amps), the charge voltage will drop to (around) 3.2V and work its way up until the battery voltage reaches the charge voltage. So in a way the voltage will adjust itself to what I need it to be to charge the batteries?
Yes, but only if the charger or other power source is inherently current-limiting. If you just connect a 144 V pack across a 120 V pack with a contactor, nothing limits the current; both packs would be damaged, and possibly the contactor as well.

If you connect a generator that doesn't have current limiting designed in, it could attempt to charge at too high a current, stalling the engine as well as delivering a pulse of high current.

I'm not very technically minded, so whatever I come up with has to be fairly simple. The generator can only put out so much current, and I plan to only use it while I drive, when I can watch what is happening. As long as the battery voltage doesn't go too high, the generator shouldn't produce enough current to do any damage, right?
I don't know the characteristics of your generator. It might be OK to connect it straight to a battery and it may be self-limiting, but my suspicion is that without suitable current-limiting circuitry it won't work and will just stall.

I could put a 30A fuse between the generator and batteries, or find a way to shut off the generator automatically if the voltage or current went too high. Would these options be a suitable failsafe?
That should save the batteries, yes; one or two pulses of high current won't do much damage. But if the generator's engine stalls every time you try to charge when the pack is low, that's no good either. Batteries aren't like light bulb loads.

What happens if I floor it and the battery voltage sags while I'm charging? I'm guessing the current will just go into the motor and not into the batteries.
Yes, that's pretty much how it will work.

Like dc braveheart said, I'll stay in the middle of the cell range to increase safety, and I'll also have a decent BMS.
I'm not a fan of limiting the charge voltage as a proxy for cell safety. Sure, if you only have 6 cells in a module, like with a car starter battery, then you usually don't have to worry about one cell going too high in voltage (though lead acid car starter batteries have other equalisation methods, involving gassing). But a typical pack is at least 30 cells in a series string. Even if you limit the volts per cell to say 3.4 V, which is pretty conservative, it doesn't take much imbalance (say ten cells down 0.1 V from the average) for one weak cell to suddenly have an extra volt across it, and 4.4 V will ruin a cell quickly.
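
Worked through in code with the same numbers (30 cells, a 3.4 V per cell average limit, ten cells 0.1 V low; a sketch of the worst case where one cell absorbs the whole surplus):

```python
# The charger regulates pack voltage, not cell voltage, so it will
# happily hold this pack at 30 * 3.4 = 102 V.
cells = 30
pack_target = cells * 3.4        # 102.0 V: looks conservative per cell

low_cells, shortfall = 10, 0.1
surplus = low_cells * shortfall  # 1.0 V that must appear across other cells

# Worst case, one weak (already full) cell absorbs the whole surplus:
weak_cell = 3.4 + surplus        # 4.4 V: enough to ruin a LiFePO4 cell
print(pack_target, weak_cell)    # 102.0 4.4
```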

You recognise the need for a BMS; that's great. I say don't limit the average volts per cell as a sort of "extra insurance"; give the cells what they need (and a little more, if the BMS has bypass capability), so you don't needlessly prolong charge times. I work on 3.65 VPC, since our cells (Sky Energy/CALB) are listed as 3.60 V maximum. Our BMS can bypass 1 A, and communicates with the charger to limit charge current to 0.9 A when a cell goes over-voltage.


Most 'smart chargers' start with a CC (constant current) phase to jam in as much juice as they can handle until the voltage comes up to some trigger... typically around 3.65V × the number of cells, then switch to constant voltage and hold until the amps drop to 'nothing'.

In a perfectly matched pack, all cells hit the trigger at the same time and the pack voltage reflects the intended CC-to-CV switch. If the cells are NOT balanced, then high cells may 'take off' before the pack voltage triggers the charger into CV, unless controlled by a BMS.

The key is that good balance makes all the cells hit that trigger voltage at the same time, and precise control of that trigger voltage is required.
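
To put numbers on that, here's a sketch with an invented 40-cell pack; the charger only ever sees the sum:

```python
# Why an unbalanced pack defeats a pack-level CC->CV trigger.
# The cell voltages below are invented for illustration.

CELLS = 40
TRIGGER = CELLS * 3.65            # 146.0 V: pack voltage for the CC->CV switch

# One cell has 'taken off' up the knee while the rest lag slightly:
cell_volts = [3.61] * 39 + [4.20]

pack_volts = sum(cell_volts)      # ~144.99 V: still below the trigger,
                                  # so the charger keeps pushing full
                                  # current into the 4.20 V cell
print(pack_volts < TRIGGER, max(cell_volts))   # True 4.2
```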
There is only one way to charge lithium batteries, IMHO and per the manufacturers.

It starts with a constant current of no more than 1C, or the manufacturer's specified C rate.

Once the cell voltage reaches 4.2 volts per cell, or slightly lower at 4.1 for increased cycle life, the charger goes constant voltage until the charge current tapers to 3 to 5% of the rated battery Ah capacity.
There is only one way to charge lithium batteries, IMHO and per the manufacturers.

It starts with a constant current of no more than 1C, or the manufacturer's specified C rate.

Once the cell voltage reaches 4.2 volts per cell, or slightly lower at 4.1 for increased cycle life, the charger goes constant voltage until the charge current tapers to 3 to 5% of the rated battery Ah capacity.

Whoa Nelly.... I don't think you want to go above an average cell voltage of 3.8, to leave some margin for error! Probably safer around 3.7, when the cells start climbing the 'knee' but are still well before meltdown.
While we are in a voltage cutting mood, I've quit going over 3.6 volts per cell on a regular basis. There doesn't seem to be much up there.
While we are in a voltage cutting mood, I've quit going over 3.6 volts per cell on a regular basis. There doesn't seem to be much up there.

I would agree.... and most of the conservative preset charger curves seem to use 3.65V × cells as the CC->CV voltage. This leaves a little room for a couple of cells to be 'out of balance' and still have the series pack voltage hit the mark before the high ones melt down....

Having faith in a decent charger to accurately sense pack voltage, and catch that trigger point to switch to CV, is key to considering running without a cell-level BMS. It also means that the better job you do top-balancing, the more likely the pack will hit the mark all at nearly the same time.

By the same token, you have to accept that a top-balanced pack will NOT be bottom-balanced, and if you run the pack down too low, the cell(s) with less capacity may be damaged.
Whoa Nelly.... I don't think you want to go above an average cell voltage of 3.8, to leave some margin for error! Probably safer around 3.7, when the cells start climbing the 'knee' but are still well before meltdown.
He is talking about a different li-ion chemistry.
So I'm getting the impression that this could work, as long as I take appropriate precautions:


  • good BMS
  • limit the current
  • fuse
  • find a way to automatically turn off the generator when the volts reach a certain level
  • only charge to 3.65V per cell
The plan so far:

  • generator
  • current limiter, probably around 100A; conservative, but 100A wouldn't hurt my 100Ah batteries
  • fuse, 110A
  • high-voltage cutoff; a bit above me, but I'll look into it (see the sketch below)
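
For the cutoff item, the decision logic itself is trivial; the work is in the hardware (a sensor driving a relay or the generator's kill switch). A sketch with made-up thresholds:

```python
# Generator cutoff check with illustrative, made-up thresholds.
# In hardware this would be a comparator or microcontroller driving
# a relay on the generator output (or its kill switch).

MAX_PACK_VOLTS = 45 * 3.65    # hypothetical 45-cell pack at 3.65 V/cell
MAX_AMPS = 100.0              # matches the 100 A limiter in the plan

def should_cut_generator(pack_volts, charge_amps):
    """True if either safety limit is exceeded."""
    return pack_volts > MAX_PACK_VOLTS or charge_amps > MAX_AMPS

print(should_cut_generator(150.0, 80.0))   # False: within limits
print(should_cut_generator(170.0, 80.0))   # True: pack voltage too high
```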
Here's something I posted on the BMS thread. I thought it may be useful and shed some light on this subject, so here it is. I have CALB cells coming with a max voltage of 3.6V. I'm not going to go that high, to extend the cell life, and since I'm not using a BMS, a charge voltage of less than 3.6V will make the possibility of overcharging the lowest-capacity cell much less likely. When these batteries are full, the voltage spikes rapidly, and you don't want to go there or you'll overheat the electrolyte and reduce the Ah capacity. Nobody wants that!


I'm not experienced with the new batteries yet, but my "plan" is to charge them at 30 amps, with a voltage that will charge them sufficiently but not overcharge them. I'm expecting the batteries to be completely charged when the voltage averages 3.45V or so per cell.

Here's something that comes to mind that I teach my employees about electricity: for current to flow in a circuit, there must be a difference in potential. That's the definition of voltage. In this case, the charger creates a voltage higher than the pack voltage. Because the pack resistance is extremely low, it will accept a charge as long as the charger voltage is only slightly above the cells' terminal voltage.

Here's an example. The batteries I ordered measured around 0.00025 ohms each. Multiply by the 50 batteries I'm running and that's a total of 0.0125 ohms for the pack. So for 30 amps to flow into it, I only need 0.375V more than the pack's resting voltage, using the formula E = I × R, where E is voltage, I is current, and R is resistance in ohms. So you can see it doesn't take much voltage to charge one of these batteries.
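
Plugging those figures in (a sketch of the same arithmetic, using only the values quoted above):

```python
# E = I * R: voltage overhead needed to push 30 A into the pack.
R_CELL = 0.00025            # ohm: measured resistance per cell
CELLS = 50
TARGET_AMPS = 30.0

r_pack = R_CELL * CELLS             # 0.0125 ohm total series resistance
overhead = TARGET_AMPS * r_pack     # 0.375 V above resting voltage
print(r_pack, overhead)             # 0.0125 0.375
```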

That's how it's possible to bottom balance the cells all at once by simply tying them in parallel before you install them in the vehicle.