
Registered · 30 Posts · Discussion Starter · #1
Hi.
Most EV conversions have space and weight-distribution constraints that force the battery pack to be split into sections. These sections end up mounted at different locations, where cooling and temperature conditions may differ significantly.
Additionally, the end cells in a row receive more cooling from ventilation than the cells near the centre.
This will obviously lead to some difference in the temperature of individual cells, since they heat up from internal losses during use.

Has anyone on this forum studied this temperature-difference issue: how serious it is, and how it affects balance and cell life in a battery pack?

Any comments or references to reliable data regarding this issue?

Agust
 

Registered · 4,326 Posts
"Has anyone on this forum studied this temperature-difference issue: how serious it is, and how it affects balance and cell life in a battery pack?"

I don't think anyone has done a scientifically rigorous study comparing cell temperature differences and their effect on balance through numerous cycles. The only data point I can contribute is this:

I have one rack with 15 ThunderSky 100 Ah cells in a 'block' one cell wide that has been without a cover all summer; the rear rack is completely enclosed. Using matching outdoor temperature probes taped to the copper bus bars (to get at least a clue of internal temperature compared to ambient), I only saw a small difference, about 5 degrees, after 20 minutes of extended highway driving at 200 amps. That being said, I have not seen any significant relative cell drift in the first 3000 miles, either within each rack or between the two racks. I have developed a 'method' of cell-level checking that *should* show drift if it occurs, but so far nothing has drifted anywhere.
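
Just to illustrate the sort of comparison I mean (the numbers below are placeholders, not my actual readings), the arithmetic is nothing more than rise-over-ambient per probe and the spread across cells:

```python
# Illustrative only: summarizing bus-bar probe readings against an ambient
# probe, as described above. All values are made-up placeholders.

def summarize_temps(ambient_c, cell_temps_c):
    """Return each probe's rise over ambient and the end-to-end spread across cells."""
    rises = [t - ambient_c for t in cell_temps_c]
    spread = max(cell_temps_c) - min(cell_temps_c)
    return rises, spread

if __name__ == "__main__":
    ambient = 25.0                            # ambient probe, degrees C (placeholder)
    cells = [29.5, 31.0, 32.0, 31.5, 30.0]    # bus-bar probes, degrees C (placeholders)
    rises, spread = summarize_temps(ambient, cells)
    print("Rise over ambient per probe:", rises)
    print("Spread across cells: %.1f C" % spread)
```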

I run my cells BMS-less, charging with a low-end Elcon 1500 with the finish voltage set to 3.65 V per cell. Typical daily discharge is around 30% DOD, and the vehicle is in the garage at night.

I have checked cell voltages twice: once at 1000 miles and once at 3000 miles. The method is to catch the system just after charging, while it has partially settled from a peak of 139.0 V down to 136.0 V, because at that point the voltage is changing slowly enough for me to measure all the cells before they settle much further, and I want to catch them as high as possible to see any variation. The cells were initially top-balanced to within +/- 0.02 V at that post-charge voltage, and they settle even closer than that longer after the charge.
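
For anyone curious, the arithmetic behind the drift check is trivial; this is just a sketch with invented placeholder voltages, not my logged values:

```python
# Illustrative sketch of the drift check described above: compare two sets of
# per-cell voltages, both taken just after charge while the pack sits between
# roughly 139.0 V and 136.0 V. The readings here are invented placeholders.

def cell_drift(readings_then, readings_now):
    """Per-cell change between two check-ups, plus the pack-wide spread at each check."""
    drift = [now - then for then, now in zip(readings_then, readings_now)]
    spread_then = max(readings_then) - min(readings_then)
    spread_now = max(readings_now) - min(readings_now)
    return drift, spread_then, spread_now

if __name__ == "__main__":
    at_1000_miles = [3.47, 3.46, 3.48, 3.47]   # placeholder volts per cell
    at_3000_miles = [3.47, 3.47, 3.48, 3.46]   # placeholder volts per cell
    drift, s1, s2 = cell_drift(at_1000_miles, at_3000_miles)
    print("Per-cell drift (V):", drift)
    print("Spread at 1000 mi: %.2f V, at 3000 mi: %.2f V" % (s1, s2))
    # A spread that stays within about +/- 0.02 V matches the initial
    # top-balance tolerance mentioned above.
```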

So my initial conclusion, on limited data, is that there is not that large a temperature differential, and it does not seem to have any effect on cell balance over time.
 

Registered · 590 Posts
I don't have any hard data unfortunately, but I was talking to someone recently who is developing a BMS for OEM use. I mentioned the debate going on in the DIY world for and against battery management systems, and his response was that temperature variations within the pack were bound to cause long-term issues without active management of some form, even with cells that are perfectly matched in capacity and resistance. This seems pretty obvious when you look at the effects of temperature on lithium cell discharge rates and capacity, but it would be good to have some hard data.
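
As a purely back-of-the-envelope illustration (not hard data), if you assume the common Arrhenius-style rule of thumb that fade rate roughly doubles for every 10 °C rise, a persistent temperature offset between cells slowly pulls their capacities apart. All numbers here are invented for the example:

```python
# Toy illustration only: how a constant temperature offset between two cells
# might separate their capacities over time, assuming a rule-of-thumb
# Arrhenius-style aging model (fade rate doubles per 10 C rise). The base
# fade rate and temperatures are invented, not measured data.

def remaining_capacity(years, temp_c, base_fade_per_year=0.02, ref_temp_c=25.0):
    """Fraction of original capacity left after `years` at a constant temperature."""
    fade_rate = base_fade_per_year * 2 ** ((temp_c - ref_temp_c) / 10.0)
    return max(0.0, 1.0 - fade_rate * years)

if __name__ == "__main__":
    for year in (1, 3, 5):
        cool = remaining_capacity(year, 25.0)   # cell in a well-ventilated spot
        warm = remaining_capacity(year, 35.0)   # cell running 10 C warmer
        print("Year %d: cool cell %.1f%%, warm cell %.1f%%" % (year, cool * 100, warm * 100))
```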

This is something I've been thinking about for the pack I'm building now, especially as I'm in the UK and the car will be parked outside. Ideally I'd like temperature monitoring on each parallel cell group, but building this pack is already complex enough. I'll settle instead for insulating the exterior of the pack and using cooling fans to spread the waste heat from charging around the pack.
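
The fan idea doesn't need anything clever; a minimal sketch of the control logic, with made-up sensor and fan functions standing in for whatever hardware I end up using, would look something like this:

```python
# Minimal sketch of the "fans to spread charging heat" idea, assuming one
# temperature sensor per pack section. read_section_temps() and set_fans()
# are hypothetical stand-ins for a real hardware interface; here they just
# simulate readings so the logic can run as-is.

import random
import time

SPREAD_THRESHOLD_C = 3.0   # run fans if sections differ by more than this (assumed value)

def read_section_temps():
    """Stand-in for real sensor reads: one temperature per pack section."""
    return [20.0 + random.uniform(0.0, 5.0) for _ in range(4)]

def set_fans(on):
    """Stand-in for driving a fan relay or PWM output."""
    print("Fans", "ON" if on else "OFF")

def control_step():
    temps = read_section_temps()
    spread = max(temps) - min(temps)
    set_fans(spread > SPREAD_THRESHOLD_C)
    return temps, spread

if __name__ == "__main__":
    for _ in range(3):
        temps, spread = control_step()
        print("Section temps:", [round(t, 1) for t in temps], "spread %.1f C" % spread)
        time.sleep(1)
```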
 