
Registered · 199 Posts · Discussion Starter #1
The Elithion Lithiumate BMS manual describes the phases of CCCV charging here, and in the attached figure. I understand the CC phase; I'm interested in understanding the CV part of the charge, where the voltage is held constant and the current tapers off exponentially. Is anyone allowing their Elithion to control their charger?

I'm puzzled by their statement at the bottom of their web page,

If the charger is capable of current control, you may also set up a way for the BMS to tell the charger the maximum allowed current, either through the CCL line or through the CCL data in the standard CAN messages.
That is not absolutely necessary, though.
How can it not be necessary? How else is the current going to exponentially decay unless the charger is told explicitly to do so? Throughout their page on charging they advocate a dumb charger and letting the BMS do the thinking, and for the BMS to get it right, it must know what the charger is capable of.

Presumably the BMS must know the maximum current of each charger make/model that it is communicating with. I guess I'll have to conduct an experiment: tell the BMS that I have a different charger, and watch the CAN messages to see if the output current message changes.
 

Attachments

Registered · 1,470 Posts
In the CV stage, when the cell voltage reaches, or nearly reaches, the set point CV voltage, there is very little to no voltage difference left between the charger output terminal and the cell terminal to force current into the cells.

So the current level drops naturally in a somewhat exponential fashion--as electrons are absorbed the cell voltage increases, which reduces the forcing function, so fewer electrons flow.
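To see why no current programming is needed, here's a toy simulation sketch (Python; every number is an illustrative assumption, not from any datasheet). The cell is modeled as an open-circuit voltage that rises with absorbed charge behind a fixed internal resistance, and the charger does nothing but hold the terminal at the CV setpoint; the roughly exponential taper falls out on its own:

```python
# Toy model of the CV taper: the charger holds the terminal at V_CV, and the
# current is simply whatever (V_CV - Voc) / R works out to at each instant.
# All values are illustrative assumptions, not from a datasheet.
V_CV  = 3.60                 # charger CV setpoint, volts per cell
R     = 0.005                # assumed cell internal resistance, ohms
K     = 0.010                # assumed OCV slope near full charge, volts per Ah
CAP   = 90.0                 # cell capacity, Ah
I_END = 0.02 * CAP           # .02C endAmps cutoff = 1.8 A

v_oc = 3.45                  # open-circuit voltage entering the CV phase
for sec in range(8 * 3600):  # simulate up to 8 hours in 1-second steps
    i = (V_CV - v_oc) / R    # current needed to hold the terminal at V_CV
    if i <= I_END:
        print(f"endAmps reached after {sec / 3600:.2f} h")
        break
    v_oc += K * i / 3600.0   # absorbed charge raises the OCV, so i decays
    if sec % 1800 == 0:
        print(f"t={sec / 3600:4.1f} h  I={i:5.1f} A")
```

Run it and the current falls from 30A toward the 1.8A cutoff along a clean exponential, purely because the shrinking voltage difference is the only thing forcing current in.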

The cell vendors have prescribed a charging procedure whereby, if you follow it, you will fill but (hopefully) not overcharge or damage your cells, and you can expect to get the advertised number of cycles. To extend life some folks use a more conservative approach with lower limits than the advertised procedure--this is the current control which he says can be done, but is not absolutely necessary.

You, or your BMS, must be diligent to ensure that the limits are never exceeded. So if you hold a CV point then the current should be monitored to shut off the charger before or at the current limit.

There is little to no benefit in trying to absolutely fill the cells to the max. Leave yourself some margin at the top and the bottom, enjoy peace of mind and longer lasting cell life.
 

Registered · 740 Posts
Is anyone allowing their Elithion to control their charger?
I have not yet seen a BMS I would use for charge regulation, so I'll answer from a "generic" POV.


How can it not be necessary?

The point at which charging transitions from CC (Bulk) to CV (Absorb) is not controlled by any outside regulation, but by battery chemistry: SoC vs. internal resistance, and the rate of charge.

At some point these factors also cause the current **accepted** by the bank to drop, regardless of how many amps are "on offer" by the source.

All the charge regulation is doing is keeping voltage from climbing above the Absorb setpoint.

Now, the other critical requirement of charge regulation is "knowing" when to stop.

Ideally this is based on trailing amps accepted by the bank hitting a 100% Full current setpoint, adjustable by the user to match the batt mfg spec, or their preference based on desire for longevity, as opposed to squeezing in the last possible Ah / range.

Whether or not it is the BMS controlling this end-charge point (often dropping voltage from Absorb to Float), no "knowledge" of any values is required wrt the source of current.

It's just volts and amps, and in fact the decision tree is exactly the same for any power input.
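To make "the decision tree is exactly the same" concrete, here is a minimal sketch of one regulation tick in Python; the setpoints are illustrative LFP-flavored numbers, not from any particular BMS or charger:

```python
# Hypothetical charge-regulation tick: identical logic for any current source.
# Setpoints are illustrative assumptions, not from any particular product.
V_ABSORB = 3.45   # Absorb / CV voltage setpoint, volts per cell
I_MAX    = 45.0   # max charge current "on offer", amps
I_END    = 1.8    # endAmps: 100%-Full trailing-current setpoint, amps

def regulate(v_cell: float, i_bank: float) -> tuple[float, bool]:
    """Return (current_limit, keep_charging) for one control tick."""
    if v_cell < V_ABSORB:
        return I_MAX, True   # Bulk / CC: offer max current; the battery
                             # accepts what its SoC / resistance allows
    if i_bank > I_END:
        return I_MAX, True   # Absorb / CV: hold V_ABSORB; accepted current
                             # tapers on its own as the bank fills
    return 0.0, False        # trailing amps hit endAmps: terminate (or Float)
```

Note the only *controlled* quantity is voltage; current is merely capped, and the termination decision is just volts and amps.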



Now of course, there are more complex algorithms: a current-limiting controller **could** reduce current in line with the Bulk stage's rising, "striving to hit Absorb" voltage, in effect never quite hitting a true CV / Absorb stage.

But this would be a slower process, and I am not yet convinced of the advantage of doing so, other than that such "gentler" treatment may allow for fairly fast charging while reducing internal heat production and perhaps extending longevity.
 

Registered · 740 Posts
To extend life some folks use a more conservative approach with lower limits than the advertised procedure--this is the current control which he says can be done, but is not absolutely necessary.

You, or your BMS, must be diligent to ensure that the limits are never exceeded. So if you hold a CV point then the current should be monitored to shut off the charger before or at the current limit.

There is little to no benefit in trying to absolutely fill the cells to the max. Leave yourself some margin at the top and the bottom, enjoy peace of mind and longer lasting cell life.
Well put, and agree completely.

In fact, where longevity and robust simplicity are the priority goals and fast charging is not, a lower-current charge into a high-CAR (charge acceptance rate) chemistry like LI will get to a **very** high SoC in the Bulk / CC stage before hitting Absorb V, so it becomes possible to simply terminate charging at that point; no Absorb / CV stage required at all.

In fact, at low enough current rates, it is possible to **overcharge** before ever hitting the mfg spec'd charge voltage, so dropping the setpoint by as much as .1V may be indicated.

For example with LFP at a .2C charge rate, my preference in normal cycling is to stop at 3.45Vpc.

When charging at say .1C, I'd stop at 3.40V or even 3.35V.

If you determine actual SoC with CC load testing, rather than Ah counting of the charger output, it becomes apparent that there is no more than 1-2% between this stop point and holding 3.6V until .02C endAmps.
 

Registered · 199 Posts · Discussion Starter #5
Okay, thanks guys, that's kind of what I wondered about but wasn't sure. So no one is "programming" that exponential curve, but the BMS does program a current limit. If that current limit is "wrong" (say it's set very high while the charger is programmed for 20A, at a time when the pack might be at the tail of that curve), the charger is not necessarily going to output 20A.

To extend life some folks use a more conservative approach with lower limits than the advertised procedure--this is the current control which he says can be done, but is not absolutely necessary.

You, or your BMS, must be diligent to ensure that the limits are never exceeded. So if you hold a CV point then the current should be monitored to shut off the charger before or at the current limit.
How do I ensure I get this current limit correct?

I can enter two current-related parameters into the Elithion BMS: the max continuous and max peak charging currents (where I think "peak" is defined as < 10 sec). Beyond that, the BMS calculates a charge current limit. Exactly how it calculates this number will forever be a mystery.

For my nominally 90Ah pack, I have the max set to 90A, and the peak set to 135A. I picked max as 1C and peak as 1.5C (which it will only ever see for some seconds on regen). If I ask the BMS right now, sitting in the garage on a fully charged pack, what is the charge limit, it tells me 53A. That's nowhere near the tail of any exponential curve! If I plugged the charger in right now however, the BMS wouldn't let it charge because the cell voltage is too high.
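Since Elithion doesn't publish the calculation, anything beyond this is guesswork, but one plausible shape for a computed charge current limit is the configured maximum, derated as the highest cell approaches the voltage ceiling. A purely hypothetical sketch (the function name, taper shape, and thresholds are all invented for illustration):

```python
def guessed_ccl(v_highest_cell: float, i_max_cont: float = 90.0,
                v_taper_start: float = 3.80, v_cell_max: float = 4.00) -> float:
    """Hypothetical CCL curve, NOT Elithion's actual algorithm: full current
    below v_taper_start, linear taper to zero amps at v_cell_max."""
    if v_highest_cell <= v_taper_start:
        return i_max_cont
    if v_highest_cell >= v_cell_max:
        return 0.0
    return i_max_cont * (v_cell_max - v_highest_cell) / (v_cell_max - v_taper_start)

# guessed_ccl(3.88) -> 54.0, in the ballpark of the 53A reading, though the
# real calculation could just as well factor in temperature or SoC.
```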
 

Registered · 1,470 Posts
... If I ask the BMS right now, sitting in the garage on a fully charged pack, what is the charge limit, it tells me 53A. That's nowhere near the tail of any exponential curve! If I plugged the charger in right now however, the BMS wouldn't let it charge because the cell voltage is too high.
That's THE GOOD THING™. Your pack is already full and you shouldn't try to charge it--better to shut off too early than too late. Better safe than sorry, right?

What is the cell voltage to which you are charging?

I have seen datasheets that indicate the absorption phase (CV) should continue until the current drops to 1/20 or 1/50 or even 1/100 C. But I know folks who have lost cells because that limit was never hit and the charger never cut off.

From cell testing and reviewing literature from vendors, I consider the full voltage of LiFePO4 cells to be 3.333 volts. If you charge to 3.6 vpc until 1/50C, then disconnect the charger and come back the next day, generally the cell voltage will have dropped down to ~3.3 to 3.4 vpc. There is just so very little energy to be gained by holding 3.6 versus just shutting off during the bulk CC phase when it hits 3.4.

With 90 Ah cells, someone has set your charge current limit to a conservative 53A--probably Davide did this, since he knows the dangers of overcharging. He just cut the long tail off the exponential to save wear and tear on your pack. So this leads back to the question: what's your CV setting?

Actually what kind of charger are you using and what is the max current that it can do?
 

Registered · 199 Posts · Discussion Starter #7
What is the cell voltage to which you are charging?
I have LG Chem Chevy Volt batteries (LiMn2O4). I have programmed the Elithion for Vcell-max=4.00 and Vcell-high=3.95.

According to Elithion, Vcell-max is the maximum cell voltage reached at the end of charge. If the voltage of any cell goes above that value, the BMS disables charging. Vcell-high is the cell voltage at the bottom of the steep rise that occurs when the cell is fully charged. If the voltage of any cell drops below this value (typically because of balancing) the BMS may re-enable charging.

I've read somewhere that the max for these cells is 4.15v. I'm not 100% sure about the sources where I've read this, and I don't want to charge to the max anyway, so I somewhat arbitrarily set the high to 3.95v and the max to 4.0v.

With 90AHr cells, someone has set your charger cutoff limit to a conservative 53A, probably Davide did this since he knows the dangers of overcharging. He just cut the long tail off the exponential to save wear and tear on your pack. So this leads back to the question, what's your CV setting?
First I have to tell you more about the pack. I have two "batteries" in parallel, each consisting of 3p54s cells. 54 cells in series at the max 4.15v/cell is 224v.

Elithion says somewhere (I can't find the reference right now) that the CV should be set higher than the pack voltage. Somewhere along the line this got set to 226v, though now with my new charger I have complete flexibility to adjust this (more on that in a moment). I gave my pack info to Elcon when they supplied my previous charger (a PFC2500) which they programmed for 226v output.

That PFC2500 just died and I have now purchased Elcon's new UHF charger. This charger requires a CAN message at 1Hz to stay on, a message which contains the voltage and current. So I have to tell it the max voltage and current. The max voltage I figured I would just keep at 226v. The max current I suppose I should get from the BMS, but I think this would have to be scaled proportionally to what the EVSE can provide.

Actually what kind of charger are you using and what is the max current that it can do?
I have the new Elcon UHF IP67 Sealed CAN Bus Charger, 6.6kW, which is capable of 20A output. I will probably never use the full output. I can only get 15.5A from my home EVSE, and the EVSEs at work are throttled based on demand and, I think, give only 8A. Other than the EVSEs at work I have never had to use a public station, say on the street, so I don't know how much current they supply (but given that I've never used one, I'm not too concerned -- all of my charging has been at home or work).
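For what it's worth, the keep-alive can be sketched like this (Python with the python-can library). The 29-bit ID 0x1806E5F4 and the big-endian 0.1V / 0.1A scaling are what's commonly reported for TC/Elcon CAN chargers, but treat every value here as an assumption to be verified against the documentation for your specific unit:

```python
# Sketch of a 1 Hz keep-alive/command frame for a TC/Elcon-style CAN charger.
# ID and byte layout are as commonly reported for these units -- verify them
# against your charger's own documentation before relying on this.
import time

import can  # pip install python-can

bus = can.interface.Bus(channel="can0", interface="socketcan")

def send_charge_command(volts: float, amps: float, enable: bool = True) -> None:
    v, a = int(volts * 10), int(amps * 10)       # 0.1 V / 0.1 A units
    data = [v >> 8, v & 0xFF, a >> 8, a & 0xFF,  # big-endian V, then A
            0 if enable else 1, 0, 0, 0]         # byte 4: 0 = charge, 1 = stop
    bus.send(can.Message(arbitration_id=0x1806E5F4,
                         is_extended_id=True, data=data))

while True:                       # the charger turns off if this ever stops
    send_charge_command(226.0, 15.5)
    time.sleep(1.0)
```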

That brings up another point. It seems that no one must be using J1772 because even though the Elithion BMS is capable of talking to the charger directly, you certainly couldn't have your BMS telling your charger to put out 53A (it can't anyway) nor even the 20A that it is capable of, because the EVSE may put out substantially less. And that is all part of the communications in J1772; the EVSE gets to tell the car how much current it can supply. I ran into this problem at work. I used to have the AVC2 from Modular EV Power, which was fine at home, but it completely ignores the message from the EVSE. Once they added charging stations at work the AVC2 didn't work for me. The EVSE would announce it was supplying 8A but since the AVC2 ignores that, my Elcon tried to run flat out at 12A, which the EVSE didn't like so it shut down. In order to charge at work I ended up having to build my own AVC2-equivalent, which listens to the EVSE's message and then drives the (optional) 2-5v analog input signal to throttle back the PFC2500's output.

With my new CAN bus Elcon, I've replaced the analog driver with a CAN driver, so it can tell the charger directly what the EVSE can supply. But I suppose I should also be listening to the BMS, too.
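Putting those pieces together, the arbitration is basically a min() over everything that can impose a limit. A minimal sketch: the duty-cycle-to-amps conversion follows the J1772 pilot convention, while the 240V mains, ~92% charger efficiency, and all parameter names are my own illustrative assumptions:

```python
def pilot_duty_to_ac_amps(duty_pct: float) -> float:
    """J1772 pilot: the EVSE advertises available AC current via PWM duty."""
    if 10.0 <= duty_pct <= 85.0:
        return duty_pct * 0.6            # e.g. ~25.8% duty -> 15.5 A
    if 85.0 < duty_pct <= 96.0:
        return (duty_pct - 64.0) * 2.5
    return 0.0                           # outside the valid range: no charging

def dc_current_command(duty_pct: float, v_mains: float = 240.0,
                       v_pack: float = 220.0, eff: float = 0.92,
                       i_charger_max: float = 20.0,
                       i_bms_ccl: float = 53.0) -> float:
    """Hypothetical arbitration: the DC amps sent to the charger over CAN is
    the lowest of what the EVSE, the charger, and the BMS will allow."""
    i_ac = pilot_duty_to_ac_amps(duty_pct)
    i_dc_from_evse = (v_mains * i_ac * eff) / v_pack  # AC power -> DC amps
    return min(i_dc_from_evse, i_charger_max, i_bms_ccl)

# At a 15.5 A EVSE this yields ~15.6 A DC; at an 8 A EVSE, ~8 A DC -- the
# 53 A BMS limit and 20 A charger limit never bind in either case.
```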

Now you see the origin of this post! :) I've got to tell my new charger SOMETHING, and I have to listen to the EVSE and I have to listen to the BMS, and I want to make sure I'm telling the charger the right thing!
 

Registered · 740 Posts
So no one is "programming" that exponential curve, but the BMS does program a current limit. If that current limit is "wrong" (say it's set very high while the charger is programmed for 20A, at a time when the pack might be at the tail of that curve), the charger is not necessarily going to output 20A.
OK, so charging from depleted: the voltage rises (CC / Bulk stage) until it hits the CV setpoint; the charger's regulator is now in the Absorb stage, keeping V from rising.

At a .5C charge rate, say, the bank is now at 93% SoC.

At a lower rate the SoC would be higher; at a "low enough" rate there would be no need for any further CV charging at all--better for longevity to just stop at that transition point.

At a higher rate SoC would be lower, perhaps unacceptably so wrt your need for range.

What is your desired stop-charging SoC point?

Maybe just 10 more minutes will be enough.

Personally, for longevity with LI chemistries, I would not let trailing amps pass below .02C. If range (max capacity) isn't critical, .05C would be better.

And that with the CV setpoint being lower than mfg spec.

Some people just blindly keep pushing current until the batt is accepting only .005C, or even zero, at which point charging stops completely.

And at the too-high mfg spec voltage!

The difference in actual stored energy available might be 2%, but lots of lifetime cycles are being sacrificed.
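For a nominally 90 Ah pack like the OP's, those trailing-amps setpoints work out as below (plain arithmetic, no assumptions beyond the 90 Ah figure):

```python
# endAmps cutoffs for a 90 Ah pack at the C-rates discussed above
CAPACITY_AH = 90.0
for c_rate in (0.05, 0.02, 0.01, 0.005):
    print(f"{c_rate:.3f}C -> {c_rate * CAPACITY_AH:5.2f} A")
# 0.050C -> 4.50 A, 0.020C -> 1.80 A, 0.010C -> 0.90 A, 0.005C -> 0.45 A
```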
 

Registered · 740 Posts
The max current I suppose I should get from the BMS, but I think this would have to be scaled proportionally to what the EVSE can provide.
No, the max charge rate is a judgment call; you're balancing longevity vs. speed of charging.

That .5C rate you mentioned above is oriented toward longevity, possibly hanging around for 1.8 hours to refill the pack.

If the charge source can only put out less, then that will take longer; nothing can be done.

If the bank is trying to pull more from a higher current charger, there should be regulation in place keeping to the limit you set, in this case .5C.

Ideally that is part of the charger, thus the BMS acts as a failsafe fallback in case the primary current regulator fails.

For example if your CAN-enabled Elcon is set to max current of 53A, your BMS could be set to 60A as a fallback.

There is no dynamic current regulation going on (usually, with standard CC-CV gear).

After the CC-to-CV transition, the falling amps are determined by the **battery**: the chemistry's SoC / resistance characteristics.

The charge regulation to determine the charge-stop point is ideally based on measuring that falling amps acceptance rate.

But sometimes it's a dumb egg timer. And sometimes it's the human watching an ammeter.

Which IMO should be regularly tested and calibrated, or a known good independent meter used - lots of BMSs and chargers get out of whack over time.
 

Registered · 1,470 Posts
Until you get a datasheet for those cells and that chemistry to know what the real story is, I would suggest not letting the Elcon charger supply the higher voltage "226V". Start lower and work your way up as you gain confidence that the voltage and current values are "calibrated" correctly.

Use some good meters that you trust and check the charger voltage and current--trust but verify, since any little errors or offsets in their box can wreak havoc on your cell$ and pack. E.g., the advertised "226" may be the 20A-loaded value, which could be higher with only an 8A load.

Your BMS current values are obviously way too high since the charger can only do 20A max, but even that is current limited by the mains supply thru the EVSE. So the EVSE is your limiting supply factor.

The BMS voltage limits seem reasonable assuming the chemistry values are what you said.
 

Registered · 374 Posts
I didn’t read every post in this thread but some of the information I read seemed incorrect. Perhaps I misunderstood.

These chargers do limit the current as the battery fills. The software limits the current to 0.2C in stage three (the CV stage), and then in stage four it drops to 0.02C until the endpoint voltage is reached.
 

Registered · 199 Posts · Discussion Starter #12
These chargers do limit the current as the battery fills. The software limits the current to 0.2C in stage three (the CV stage), and then in stage four it drops to 0.02C until the endpoint voltage is reached.
What stages? Maybe you're thinking of a different charger? The charger doesn't do anything unless I tell it; it requires a CAN message once per second or it will turn off.
 

Registered · 740 Posts
How can it not be necessary? How else is the current going to exponentially decay unless the charger is told explicitly to do so?
First off, usual CC (Bulk) / CV (Absorb) charging involves no current control at all, other than a max limit being made available.

The battery determines any current changes; the tapering after the CC-to-CV transition is a function of rising resistance as SoC rises, not controlled by the charge source.

With LI chemistries, demand amps trailing off only happens at a very high SoC, depending on the C-rate made available by the charger.

At a low enough C-rate, you can get all the way to 99% SoC with the CC stage only. At higher C-rates, the CV transition happens at a much lower SoC, with a greater bounce back down to resting V.

____
So, no "stage" intelligence is required so far; a dumb power supply is fine, keeping V capped to the max Absorb / CV setpoint.

The trick now is how to terminate charging.

If absolutely Full is required and you don't mind losing lots of life cycles, hold Absorb V until the current drops to zero amps.

Lower SoC termination increases longevity, to a point, but sacrifices range.

To prevent too many lost cycles, stop when the current taper reaches an endAmps setpoint of .01C, likely ~99% SoC.

.03C or even .05C is better for longevity, maybe ~97% SoC.

If the current is low enough and the Absorb voltage high, no CV is needed at all; just stop when the V setpoint is reached, which gets you to 92-96% SoC.

Trying that at high current, or with a voltage lower by .1V or so, gets maybe 88-92%.

All the above is true, regardless of equipment used.

Whether the sensing of V and A and decision to terminate charge (or maybe drop to Float V) is handled by a BMS or charger really doesn't matter.

But chargers with a shunt at the bank to measure trailing current are very rare. At best they may have an intelligent Absorb Hold Time algorithm with user adjustability.

Most just have a dumb eggtimer approach.

All the BMS needs to know is what it can measure; stopping charge input can be as simple as a relay cutting off the charger's upstream supply.

But a comms protocol is more elegant.
 

Registered · 374 Posts
What stages? Maybe you're thinking of a different charger? The charger doesn't do anything unless I tell it; it requires a CAN message once per second or it will turn off.
My bad then, I thought you were speaking about an Elcon charger.
 

Registered · 374 Posts
It is an Elcon. I have the new Elcon UHF IP67 Sealed CAN Bus Charger, 6.6kW but like I said, it requires a CAN message containing voltage and current @ 1Hz to stay on.
I have never worked on one of the new ones, but the old ones had a CAN bus option and worked the same way. Once it reaches MAX voltage it ramps the current down to maintain that voltage until you reach the cutoff current.

The supply has to reduce current in order to maintain the voltage at the level you want. Nothing naturally drops off. If you don't lower the current, the voltage will rise. I found all these explanations in this thread confusing.
 

Registered · 740 Posts
Yes, I'm not talking about the internals of **how** the regulator controls voltage.

Many people think that the controller is "in charge of" the CC-to-CV transition point, whereas in effect the battery chemistry is: charge rate vs. resistance.

The point is that the intentional control is Voltage only, stopping it from rising over the Absorb setpoint.

There are chargers that allow for de-rating current to protect the upstream source, or for charging small batteries, but that's usually a once-off "sticky" setpoint.

Dynamically dropping the current rate ahead of the battery's own resistance/acceptance limiting it--although IMO a good idea for exotic profile ideas like "pulse timer" CC-only charging targeting a precise **resting** voltage--is not found in normal CC/CV chargers.

Of course with live constant CAN messaging, and sophisticated custom programmable microcontrollers on the scene, anything becomes possible.
 

Registered · 740 Posts
Just a nitpick
In the CV stage, when the cell voltage reaches, or nearly reaches, the set point CV voltage,
Bulk / CC stage is voltage "striving" to hit the setpoint.

Absorb / CV stage is **after** the charger is holding voltage at the setpoint.

With lead chemistries, Absorb needs to be held for many hours before trailing amps taper off to the terminate-charge (batt is 100% Full) endAmps spec, usually then dropping to Float V.

With the super-high CAR of LI chemistries, the Absorb / CV stage at low rates may be only a few minutes, and even at high rates maybe 10-20.

Everything else there is spot on, 100% agree about gentle charging for longevity.

The standard spec'd charge profile is IMO way too aggressive--you won't know how much lifespan you're losing, the damage is not obvious--and the **only** upside is maybe 5% higher usable Ah.

And the vendors staying in business selling replacement banks more frequently.
 