In a current thread on battery pack problems I noted that lithium cells may have a temperature coefficient for the charging and open-circuit voltage, but that there may also be effects from internal and external resistance. I then found a thread with a chart of internal resistance as a function of temperature, which shows that the internal resistance (IR) becomes much higher at low temperatures, the opposite of what would be expected for metallic conductors.
Thus the terminal voltage would be expected to drop much more when discharging at low temperature than at normal and elevated temperatures. For a 100 A discharge, a 100 Ah cell with a nominal voltage of 3.3 V would drop by 50-100 mV to about 3.2 V at 25-50°C, but by about 300 mV to 3.0 V at 10°C, and by 600 mV to 2.7 V at 0°C. If the Ah capacity remains the same, this represents a loss of roughly 10% to 20% of the Wh energy capacity.
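Just to check my arithmetic, here is a quick Python sketch of those numbers. The 1, 3, and 6 milliohm values are simply back-calculated from the voltage drops quoted above at 100 A, not taken from the chart:

# Rough check of the voltage-drop and energy figures above.
# IR values are back-calculated from the quoted drops at 100 A (assumed).
NOMINAL_V = 3.3      # nominal / open-circuit cell voltage (V)
CAPACITY_AH = 100    # cell capacity (Ah)
CURRENT_A = 100      # 1C discharge current (A)

IR_BY_TEMP = {"25-50C": 0.001, "10C": 0.003, "0C": 0.006}  # ohms, assumed

for temp, r in IR_BY_TEMP.items():
    drop = CURRENT_A * r                  # I*R sag under load (V)
    v_term = NOMINAL_V - drop             # terminal voltage under load (V)
    wh_nominal = NOMINAL_V * CAPACITY_AH  # 330 Wh if there were no sag
    wh_delivered = v_term * CAPACITY_AH   # same Ah, lower voltage
    loss_pct = 100 * (1 - wh_delivered / wh_nominal)
    print(f"{temp}: drop {drop*1000:.0f} mV, terminal {v_term:.1f} V, "
          f"delivered {wh_delivered:.0f} Wh ({loss_pct:.0f}% down)")

It prints losses of roughly 3%, 9%, and 18% of the Wh capacity, which is where the 10% to 20% figure comes from.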
If this is really an ohmic internal resistance, there should be proportional power dissipation in the cell, which should create heat and soon raise the operating temperature to more efficient levels. 600 mV at 100 A is 60 watts, which would probably raise the temperature in several minutes' time. But if not, and the cell is depleted after a 100 Ah current drain, where has the extra energy gone? And why should cooling be required if the power dissipation is normally only 5-10 watts per cell at 1C? Of course, at higher rates like 4C this would be 80-160 watts, so that may be the reason if such a rate is sustained.
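A back-of-envelope estimate of how fast that 60 watts would warm the cell, assuming a typical 100 Ah prismatic cell of about 3 kg with a specific heat around 1000 J/(kg·K) and no heat lost to the surroundings (both figures are assumptions on my part):

# Adiabatic self-heating estimate; cell mass and specific heat are assumed.
P_LOSS_W = 60                   # I*R dissipation at 100 A with 6 mOhm (0°C case)
CELL_MASS_KG = 3.0              # assumed mass of a 100 Ah prismatic cell
SPECIFIC_HEAT_J_PER_KG_K = 1000 # assumed specific heat of the cell

rate_c_per_min = P_LOSS_W / (CELL_MASS_KG * SPECIFIC_HEAT_J_PER_KG_K) * 60
print(f"adiabatic self-heating: about {rate_c_per_min:.1f} C per minute")

That comes out to roughly 1°C per minute, so the cell would warm itself by a few degrees over several minutes, and the IR (and therefore the dissipation) should fall as it warms.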
This also shows what the efficiency of these cells might be at various rates of discharge. A 10 watt loss at 1C for a 100 Ah cell is about 10/330, so roughly 97% efficiency, while 160 W at 4C is 160/1320, or about 88%. At 10°C the losses would be about three times as high, and efficiency would be roughly 91% at 100 A and 64% at 400 A.
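Using the same assumed IR figures as above (1 mΩ at 25°C, 3 mΩ at 10°C) and treating the loss as purely ohmic (I²R), the efficiencies work out like this:

# Efficiency = 1 - (I^2 * R) / (nominal voltage * I), with assumed IR values.
NOMINAL_V = 3.3
CAPACITY_AH = 100

def efficiency(c_rate, r_internal_ohm):
    current = c_rate * CAPACITY_AH          # discharge current (A)
    p_nominal = NOMINAL_V * current         # 330 W at 1C, 1320 W at 4C
    p_loss = current ** 2 * r_internal_ohm  # ohmic dissipation (W)
    return 1 - p_loss / p_nominal

for temp, r in (("25C", 0.001), ("10C", 0.003)):
    for c_rate in (1, 4):
        print(f"{temp}, {c_rate}C: {efficiency(c_rate, r):.0%}")

This reproduces the 97% and 88% figures at 25°C and gives about 91% and 64% at 10°C.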
