1 - 15 of 15 Posts

·
Registered
Joined
·
3,141 Posts
Discussion Starter · #1 ·
In a current thread on battery pack problems I noted that Lithium cells may have a temperature coefficient for the charging and open-circuit voltage, but there may also be effects from internal and external resistance. So I found a thread where a chart was supplied with internal resistance as a function of temperature, and found that the IR becomes much higher at low temperatures, which is the opposite of what would be expected for conductors:


Thus it would be expected that the terminal voltage would drop much more when discharged at a lower temperature than at normal and elevated temperatures. For a 100A discharge, a 100 Ah cell with a nominal voltage of 3.3 volts would drop by 50-100 mV to 3.2 volts at 25-50C, but at 10C it would drop by about 300 mV to 3.0V, and by 600 mV at 0C to 2.7V. If the Ah charge remains the same, this represents a drop of about 10% to 20% in Wh energy capacity.
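For a quick sanity check, here is how those sag numbers work out; the per-cell resistances are my approximate readings off the chart, not datasheet values:

```python
# Terminal voltage sag of a 100 Ah, 3.3 V nominal cell at 100 A (1C).
# Per-cell internal resistances are approximate values read off the chart.
R_INTERNAL = {50: 0.0005, 25: 0.001, 10: 0.003, 0: 0.006}  # ohms vs. temp (C)
I = 100.0        # discharge current in amps
V_NOMINAL = 3.3  # volts

for temp_c in sorted(R_INTERNAL, reverse=True):
    sag = I * R_INTERNAL[temp_c]
    print(f"{temp_c:3d} C: sag {sag*1000:4.0f} mV -> {V_NOMINAL - sag:.2f} V")
```

This reproduces the 3.2 V at 25 C, 3.0 V at 10 C and 2.7 V at 0 C figures above.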

If this is really an ohmic internal resistance, there should be proportional power dissipation in the cell, which should create heat and soon raise the operating temperature to more efficient levels. 600 mV at 100A is 60 watts, which would probably raise the temperature in several minutes' time. But if not, and the cell is depleted after 100 Ah of current drain, where has the extra energy gone? And why should cooling be required if the power dissipation is normally only 5-10 watts per cell at 1C? Of course at higher rates like 4C this would be 80-160 watts, so that may be the reason if that rate is sustained.
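The "several minutes" guess for the 60 W case can be checked with Q = c*m*dT; the cell mass and specific heat here are assumed round numbers, and heat lost to the surroundings is ignored:

```python
# Rough adiabatic self-heating rate for the 60 W case above.
# Mass and specific heat are assumed, not from a datasheet.
P = 60.0      # watts dissipated at 100 A with 6 mOhm internal resistance
mass = 3.2    # kg, assumed for a 100 Ah prismatic cell
c = 1000.0    # J/(kg*K), assumed specific heat

rate = P / (c * mass)  # kelvin per second
print(f"{rate*60:.1f} C rise per minute")  # about 1.1 C per minute
```

So a cold cell dissipating 60 W would indeed need several minutes to warm itself by a few degrees, even with no heat escaping.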

This also shows what the efficiency of these cells might be at various rates of discharge. 10 watts at 1C for a 100 Ah cell is about 10/330 lost, or 97% efficiency, while 160W at 4C is 160/1320, or 88%. At 10 degrees C the losses would be three times as high, and efficiency would be about 91% at 100A and 64% at 400A.
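The efficiency figures follow from the loss model eta = 1 - I^2*R / (V_nom*I); again the R values are my assumed chart readings:

```python
# Efficiency estimate from I^2*R loss relative to nominal output power.
# R values per cell are assumed readings from the chart.
def efficiency(current_a, r_ohm, v_nominal=3.3):
    """Fraction of power delivered to the load, loss model: I^2 * R."""
    loss = current_a ** 2 * r_ohm
    return 1.0 - loss / (v_nominal * current_a)

print(f"25 C, 1C: {efficiency(100, 0.001):.0%}")  # ~97%
print(f"25 C, 4C: {efficiency(400, 0.001):.0%}")  # ~88%
print(f"10 C, 1C: {efficiency(100, 0.003):.0%}")  # ~91%
print(f"10 C, 4C: {efficiency(400, 0.003):.0%}")  # ~64%
```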
 

·
Registered
Joined
·
571 Posts
If this is really an ohmic internal resistance, there should be proportional power dissipation in the cell, which should create heat and soon raise the operating temperature to more efficient levels.
Yes, it works exactly like this. The efficiency is poor at the start, but it is restored as the losses heat up the pack. At large currents this can happen pretty quickly, since the heat is internal to the cells. The process is self-regulating: the cells with the highest R heat up more, which decreases their R.

Some energy is of course lost in the process (Q = c*m*dT), which is why it makes sense to externally heat up the battery pack while it is still connected to the grid, to maximize battery range in cold weather. Good thermal insulation is even more important.
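To put a number on the grid-preheating idea, Q = c*m*dT for a whole pack; the pack mass and specific heat below are assumed round numbers for illustration:

```python
# Energy to preheat a pack from 0 C to 20 C, using Q = c * m * dT.
# Pack mass and specific heat are assumed, not measured values.
c = 1000.0  # J/(kg*K), assumed average specific heat of the pack
m = 150.0   # kg, assumed pack mass
dT = 20.0   # K, warming from 0 C to 20 C

q_joules = c * m * dT
print(f"{q_joules/3.6e6:.2f} kWh")  # 3,000,000 J, about 0.83 kWh
```

Less than a kWh from the wall versus the 10-20% range loss discussed above, so preheating pays for itself.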
 

·
Registered
Joined
·
514 Posts
Are you sure the resistance is electrical, and not just a matter of the transport speed of the ions?

I doubt that significantly more heat is generated by the cells at low/freezing temperatures (0 C) than at regular comfortable temperatures (20 C).
From what I understand, the reactions slow down and ions just are not released, or not released quickly enough. The 1 kHz AC impedance of a cell is generally lower than the resistance you get from a charge/discharge test at elevated rates.
 

·
Registered
Joined
·
571 Posts
I doubt that significantly more heat is generated by the cells at low/freezing temperatures (0 C) than at regular comfortable temperatures (20 C).
You can doubt it as much as you want to, but the test data is everywhere.

I tested it just a few days ago myself.

I'm amused that this comes as a surprise to anyone.

All "real" measured discharge curves at cold temperatures show a bump at the beginning while the cells heat up. So you can distinguish "theoretical" discharge curves from really measured ones easily. "Theoretical" would assume that the cell would be kept cool by a powerful cooling system that would be internal to the cell.
 

·
Registered
Joined
·
1,296 Posts
This also shows what the efficiency of these cells might be at various rates of discharge.
I have my doubts about this correlation.

Batteries are chemical reactions ... Ohm's Law is not designed or intended for testing the efficiency of chemical reactions.
 

·
Registered
Joined
·
571 Posts
"Internal resistance" is of course only a model.

An "ideal" battery can be modeled as a voltage source and a series resistance.

The funny thing is that, unlike any other battery chemistry, li-ion is very close to that ideal model.

The resistance is of course not a constant, but varies with temperature and SoC.

But within similar operating conditions, R is constant, so the voltage drop U = RI holds for different values of I; it follows Ohm's law for some time.

It is just that a large current I may affect the operating condition by heating up the cell, in which case the R drops. OTOH, when the SoC drops near zero, R goes up.
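The "voltage source plus series resistance" model described above can be sketched in a few lines; the R(temperature, SoC) lookup here is illustrative, with invented breakpoints, not datasheet behavior:

```python
# Minimal "ideal voltage source + series resistance" li-ion cell model.
# The resistance lookup is illustrative only (invented numbers).
def cell_r(temp_c, soc):
    """Assumed internal resistance in ohms: higher when cold or nearly empty."""
    r = 0.001          # baseline at room temperature, mid SoC
    if temp_c < 10:
        r *= 3.0       # cold cell: R roughly triples
    if soc < 0.1:
        r *= 2.0       # nearly empty: R rises again
    return r

def terminal_voltage(ocv, current_a, temp_c, soc):
    """U_terminal = OCV - R*I (discharge current positive)."""
    return ocv - cell_r(temp_c, soc) * current_a

print(f"{terminal_voltage(3.3, 100, 25, 0.5):.1f}")  # 3.2
print(f"{terminal_voltage(3.3, 100, 0, 0.5):.1f}")   # 3.0
```

Within one operating condition R is fixed, so the model is linear in I, which is the sense in which the cell "follows Ohm's law."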
 

·
Registered
Joined
·
1,526 Posts
I view the concept of internal resistance as an indicator of the chemical reaction rate, and not as an ohmic resistor which generates heat proportional to the current squared.

In Paul's graph it appears that this rate is proportional and sensitive to temperature. And as the rate of the chemical reaction decreases, the "internal series resistance" temporarily increases until the temperature rises.

In another application of this concept, any damage to the electrodes due to over charge or discharge will also slow down the chemical reaction and cause the ISR to increase, but it will be an irreversible increase and result in a permanent energy capacity loss.
 

·
Registered
Joined
·
3,141 Posts
Discussion Starter · #9 · (Edited)
Chemical reactions are either exothermic or endothermic, but I think the energy normally released or absorbed as heat could instead be electrical power that would be transferred out through the electrodes and used for mechanical power in an EV or dissipated as waste heat due to inevitable inefficiencies. Since energy cannot be created or destroyed, the energy lost due to the "virtual" ESR must be converted to another form, and if it is not heat, then it must be a separate endothermic chemical reaction that absorbs the energy. If it later becomes available, then it must be a reversible reaction, and the full energy that was put into the battery during charging must become available and usable eventually. Otherwise it is an irreversible reaction, which should result in a loss of capacity.

I think the losses are essentially thermal, although the ISR may not be resistive and perhaps more like a diode, where the apparent resistance drops at higher currents because it has a voltage drop which is largely independent of current. Actually, the curve looks much like a thermistor, and self-heating should lower the resistance as the temperature increases. There is probably also a resistive portion which has a positive temperature coefficient, but it may be a small part of the measured apparent ESR.

Thermistor curve:

http://www.cantherm.com/products/thermistors/choosing_ntc.html
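To make the diode-like idea concrete: if part of the drop is a fixed offset rather than ohmic, the apparent resistance V/I falls as current rises. The offset and ohmic values below are invented for illustration:

```python
# Apparent resistance of a cell whose voltage drop has a fixed
# (diode-like) offset plus a true ohmic part. Numbers are invented.
V_OFFSET = 0.05   # volts, current-independent part of the drop (assumed)
R_OHMIC = 0.0005  # ohms, true resistive part (assumed)

for i in (50, 100, 200, 400):
    v_drop = V_OFFSET + R_OHMIC * i
    print(f"{i:3d} A: apparent R = {v_drop / i * 1000:.2f} mOhm")
```

The printed apparent resistance drops with current even though both underlying parameters are constant, which is what a diode-like component would look like in a dV/dI measurement.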
 

·
Registered
Joined
·
2,047 Posts
I doubt that significantly more heat is generated by the cells at low/freezing temperatures (0 C) than at regular comfortable temperatures (20 C).
From what I understand, the reactions slow down and ions just are not released, or not released quickly enough. The 1 kHz AC impedance of a cell is generally lower than the resistance you get from a charge/discharge test at elevated rates.
I drove my car all last winter as long as the temp was above about 20F (-7C), and I found that it is terribly difficult to get the batteries to self-heat. I have the controller set to limit at 1000A but you never can reach that when it is that cold. I set the controller low voltage limit to 1/2 nominal, which for my then 33-cell pack is 53 volts. Now I have 51 cells and I will be setting the low voltage limit to 110 volts because that is the minimum for the DC-DC converter. I can drive my 4.4 miles to work with the cells at these temps and the cells do not warm enough to notice. At temps above 40F (4C) the car feels pretty normal to drive. My plan is to add external heat so that the batteries stay at about 40F or above.

I think the internal heating is about the same when it is warm or cold. It is probably due to the I^2*R losses in the materials in the cell. This resistance is a very small part of the so called internal resistance of the cell.

In summary, the voltage sag is so terrible that you really can't get them to self-heat when they are really cold. And actually, with 100AH cells they don't self-heat even when they are warm. I have driven the car when the batteries were starting at 95F (35C), and even driving hard with the batteries in an insulated box there is not much additional heating. By hard driving I am talking several 1000-amp starts and extended driving at highway speeds. This results in a 10-15 degree F (6 to 8 degree C) rise in temp at about the time the cells are exhausted.
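The small observed rise is consistent with a back-of-envelope check; the per-cell loss, mass and specific heat below are assumed round numbers, and real rise will be lower because heat escapes:

```python
# Upper bound on self-heating over a full ~1C discharge, assuming ~10 W
# of I^2*R loss per 100 Ah cell, 3.2 kg cell mass and c = 1000 J/(kg*K)
# (all assumed), with no heat lost to the surroundings.
p_loss = 10.0  # watts per cell at ~1C and room temperature
t = 3600.0     # seconds, roughly one hour to empty at 1C
mass = 3.2     # kg per cell, assumed
c = 1000.0     # J/(kg*K), assumed

dT = p_loss * t / (mass * c)
print(f"{dT:.0f} C adiabatic rise")  # ~11 C; an insulated box sees less
```

So even a perfectly insulated 100 Ah cell only gains on the order of 10 C over a whole 1C discharge, matching the 6-8 C observed.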
 

·
Registered
Joined
·
1,296 Posts
Since energy cannot be created or destroyed, the energy lost due to the "virtual" ESR must be converted to another form, and if it is not heat, then it must be a separate endothermic chemical reaction that absorbs the energy.
Like the chemical reaction of the battery ... endothermic or exothermic ... that converts the electrical energy being applied into chemical energy ... charging the battery ... or discharging the battery.

You will see a dV/dI ... but ... by itself ... that doesn't tell you how much of that electrical power, or energy over a period of time, is converted to heat, converted to chemical energy, converted to a magnetic field, etc.
 

·
Administrator
Joined
·
6,435 Posts
In a current thread on battery pack problems I noted that Lithium cells may have a temperature coefficient for the charging and open-circuit voltage, but there may also be effects from internal and external resistance. So I found a thread where a chart was supplied with internal resistance as a function of temperature, and found that the IR becomes much higher at low temperatures, which is the opposite of what would be expected for conductors:


Thus it would be expected that the terminal voltage would drop much more when discharged at a lower temperature than at normal and elevated temperatures. For a 100A discharge, a 100 Ah cell with a nominal voltage of 3.3 volts would drop by 50-100 mV to 3.2 volts at 25-50C, but at 10C it would drop by about 300 mV to 3.0V, and by 600 mV at 0C to 2.7V. If the Ah charge remains the same, this represents a drop of about 10% to 20% in Wh energy capacity.

If this is really an ohmic internal resistance, there should be proportional power dissipation in the cell, which should create heat and soon raise the operating temperature to more efficient levels. 600 mV at 100A is 60 watts, which would probably raise the temperature in several minutes' time. But if not, and the cell is depleted after 100 Ah of current drain, where has the extra energy gone? And why should cooling be required if the power dissipation is normally only 5-10 watts per cell at 1C? Of course at higher rates like 4C this would be 80-160 watts, so that may be the reason if that rate is sustained.

This also shows what the efficiency of these cells might be at various rates of discharge. 10 watts at 1C for a 100 Ah cell is about 10/330 lost, or 97% efficiency, while 160W at 4C is 160/1320, or 88%. At 10 degrees C the losses would be three times as high, and efficiency would be about 91% at 100A and 64% at 400A.
Hi Paul
Is this resistance as measured by voltage drop and current, or the 1 kHz AC resistance that cell makers use?
Does your source data actually tell you?
 

·
Registered
Joined
·
571 Posts
I view the concept of internal resistance as an indicator of the chemical reaction rate, and not as an ohmic resistor which generates heat proportional to the current squared.
But in reality - you know, the place we live in - it does generate heat just like an "ohmic resistor", I^2R, regardless of how you are "viewing the concept". This is the most basic experiment and you can repeat it too if you don't believe it. And this is important to understand when designing battery systems.
 

·
Registered
Joined
·
3,141 Posts
Discussion Starter · #14 · (Edited)
I found this chart in a post by Davide Andrea of http://elithion.com/, and the website is:

http://liionbms.com/_misc/

This lists several charts that do not appear to be directly accessible from the parent website. I don't see where the measurement technique is explained. Here is an interesting plot:



It looks like the 125 mOhm resistance is for a 356V nominal pack which would be 108 cells at 3.3V each at zero current, and the resistance at 100 amps discharge is determined by the 12.5V drop in voltage to 343.5V, which is 125 mOhms, or 1.16 mOhms per cell. This matches the other graph at 25 degrees C, so I think the resistance is calculated the same way. It appears to be fairly linear over the current range of up to 150 amps discharge and 100 amps charge.
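The dV/dI reading described above works out as follows, using the numbers from the plot:

```python
# Pack resistance read off the plot as dV/dI at 100 A discharge.
v_open = 356.0   # volts, pack voltage at zero current (108 cells * 3.3 V)
v_loaded = 343.5 # volts at 100 A discharge
i = 100.0        # amps
n_cells = 108

r_pack = (v_open - v_loaded) / i
print(f"pack: {r_pack*1000:.0f} mOhm, per cell: {r_pack/n_cells*1000:.2f} mOhm")
# pack: 125 mOhm, per cell: 1.16 mOhm
```

The 1.16 mOhm per cell agrees with the ~1 mOhm at 25 C on the temperature chart, so both plots appear to use the same DC voltage-drop method.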
 

·
Registered
Joined
·
287 Posts
If I'm not mistaken, this is exactly why battery safety is an issue, and why chemistries like LiFePO4 are used in DIY builds instead of lithium cobalt oxide.

Conductors of a single element (like copper) usually have a positive temperature coefficient (resistance rises with rising temperature), but alloys and electrochemical devices almost always have a negative temperature coefficient (resistance falls with rising temperature).

Some semiconductors, like silicon, also have a negative temperature coefficient.

Regardless of whether it is true electrical ESR or chemical impedance, the result is the same: thermal runaway. The greater the temperature, the higher the peak current will be. In turn, higher peak currents can mean higher temperatures. Most seem to think the fire on the Boeing 787 was due to thermal runaway after a battery electrode short-circuited.
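The feedback loop is easy to illustrate with a toy model: hold a constant voltage across a cell whose resistance falls exponentially as it heats (all numbers below are invented, and heat loss to the surroundings is ignored):

```python
import math

# Toy thermal-runaway loop: constant voltage across a fault, NTC resistance.
# All numbers are invented for illustration only.
v = 3.3                # volts held across the shorted cell
r0, k = 0.05, 0.03     # ohms at 25 C, and an assumed NTC coefficient
mass, c = 3.2, 1000.0  # kg and J/(kg*K), assumed cell thermal mass
temp = 25.0

for minute in range(1, 6):
    r = r0 * math.exp(-k * (temp - 25.0))  # resistance falls as cell heats
    p = v * v / r                          # so dissipation rises
    temp += p * 60.0 / (mass * c)          # adiabatic rise over one minute
    print(f"min {minute}: R={r*1000:.1f} mOhm, P={p:.0f} W, T={temp:.0f} C")
```

Each minute the resistance drops, the power climbs, and the temperature rise accelerates: the positive feedback that makes NTC-behaving cells a safety concern.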
 