Just ignore them. Stop assuming stuff! I am still going to use a BMS in a feedback-control way as always, and yes, it will go turtle when the batteries are hot.
I don't want to do feedforward control, I never wrote that.
I am just trying to get a rough idea of how much energy is lost to the batteries heating up.
My goal is to get an idea of what it would be like if I don't utilize any active cooling on the pack.
It will be a rough idea; in reality there are a lot of parameters that change the outcome. However, it will still give me a basic sense of things. For example, if the pack would spend a lot of time in turtle mode when driven in summer, I might need to go for a different strategy.
IMO, in the context of what you're trying to accomplish, I think you're on the right track with your approach.
Some things I would take into consideration:
- Joule heating (P=I^2R) is the predominant heating factor.
- There is also reaction (reversible entropic) heating, which in some cases can be endothermic. This is a smaller contributor than Joule heating, and at higher discharge rates you may be able to ignore this term in your calculations. Either way, look into it and make an informed judgement.
- I think the cooling rate will be the greatest source of error.
- One option is to go to the extreme and assume no heat transfer out (adiabatic), which gives you a conservative result. That's great if the resulting temperature rise is within the cell limits, because it can only get better with real-world cooling. But if the calculated temperature rise is too high, it doesn't really tell you much, except that you do need cooling.
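To make the adiabatic worst case concrete, here's a minimal back-of-the-envelope sketch: Joule heating only (ignoring the entropic term), with the whole pack as one lumped thermal mass. All the numbers (current, DC resistance, pack mass, specific heat) are placeholders I made up for illustration; substitute values from your own cells' datasheet.

```python
def adiabatic_temp_rise(current_a, dc_resistance_ohm, duration_s,
                        pack_mass_kg, specific_heat_j_per_kg_k=900.0):
    """Worst-case (no cooling) rise from Joule heating only:
    dT = I^2 * R * t / (m * cp).

    ~900 J/(kg*K) is a common ballpark specific heat for Li-ion cells;
    check your cell's datasheet for a better number.
    """
    heat_j = current_a ** 2 * dc_resistance_ohm * duration_s
    return heat_j / (pack_mass_kg * specific_heat_j_per_kg_k)

# Hypothetical example: 100 A through 50 mOhm of total pack DC
# resistance for 10 minutes, with 10 kg of cells.
heat_kj = 100 ** 2 * 0.050 * 600 / 1000
dT = adiabatic_temp_rise(100, 0.050, 600, 10.0)
print(f"Heat generated: {heat_kj:.0f} kJ, temperature rise: {dT:.1f} K")
```

If a number like that already exceeds your cell temperature limits under a realistic summer duty cycle, that's the signal that the no-active-cooling strategy needs rethinking.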