So I believe I've seen some threads around here about using capacitors to help ease the peak load stresses on a pack, as well as potentially smoothing out the 'pain' of the pulses from PWM controllers.
That was before it really applied to me... now that it does, I wonder how it would work. All I really remember is that it seemed it probably wasn't worth it, because for the money you could simply get more cells and be better off.
I just want to verify this is true -- because if I can reduce the loads on my cells a bit, I'd be very happy.
Reading this page: http://electronics.howstuffworks.com/capacitor2.htm
it's clear that using capacitors for any significant energy storage is simply not going to happen. But it does mention:

"Capacitors can also eliminate ripples. If a line carrying DC voltage has ripples or spikes in it, a big capacitor can even out the voltage by absorbing the peaks and filling in the valleys."
That sounds great.
With the goal of taking some of the burden out of high-discharge events (i.e., quick acceleration), I did a little mathing. I figured if the capacitors could take some of the load for 3 seconds, that would cut out almost all of the high-level loads on my batteries. My pack is 150V, and I have it limited via the Soliton Jr to pull 400A max.
3 seconds at 400A is 0.33 amp-hours (1,200 coulombs).
Delivering those 1,200 coulombs across the full 150V works out to roughly 8 farads (C = Q/V).
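Here's that math in Python, in case anyone wants to poke at it. Note the 8F figure assumes the caps swing all the way from 150V down to zero, which my third question below suggests is wildly optimistic:

```python
# Quick sanity check of the sizing numbers above (just arithmetic).
amps = 400.0       # Soliton Jr current limit (A)
seconds = 3.0      # length of the high-discharge event (s)
pack_v = 150.0     # nominal pack voltage (V)

coulombs = amps * seconds          # 1,200 C
amp_hours = coulombs / 3600.0      # ~0.33 Ah
farads = coulombs / pack_v         # C = Q / V -> 8 F for a full 150V swing

print(f"{amp_hours:.2f} Ah, {farads:.0f} F (if the caps swing from 150V to 0)")
```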
I found these 9-farad, 2.5V capacitors, which can be purchased for $8.62 apiece. At 2.5V each, I'd need a string of 63 or so in series to match my pack voltage. That's about $543. Not bad, really.
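Here's the string math too, with one caveat I'm fuzzy on (flagged in the comments):

```python
import math

# Sizing and cost of the series string. One thing I'm not sure about:
# as I understand it, capacitance divides in a series string
# (C_total = C_single / n), so 63 of these would NOT act like 9F at
# pack voltage. Someone please correct me if that's wrong.
cap_f = 9.0        # per-capacitor capacitance (F)
cap_v = 2.5        # per-capacitor voltage rating (V)
cost_each = 8.62   # dollars per capacitor
pack_v = 157.0     # fully charged pack voltage (V)

n_series = math.ceil(pack_v / cap_v)   # 63 caps to reach pack voltage
string_cost = n_series * cost_each     # ~$543
c_string = cap_f / n_series            # ~0.14 F, if series caps really divide

print(f"{n_series} caps in series, ${string_cost:.2f}, {c_string:.2f} F effective")
```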
Now my first question: Is the above right? I think I did the calculations correctly, but I'm not 100% positive.
And the second question: How exactly would this work? I would imagine that if you made a 'pack' of these in parallel with your main pack, they would effectively charge each other and always sit at the same voltage. Then when you gave it lots of throttle, both would drop in voltage (the lithium due to sag, the capacitors by discharging an amount matching the voltage sag). Then when the discharge was over, the lithium cells would recharge the capacitors back up to the cells' resting voltage. Is that correct?
If the above is correct -- how quickly would the capacitors pull energy from the lithium cells after a quick discharge? Would the lithium cells continue to discharge at a high rate into the capacitors, to the point that I would have effectively just 'time-shifted' the load and not really evened it out at all? You know... it doesn't pull as hard from the pack when you hit the throttle... but it continues to pull hard after you let off. In the end, gaining you nothing...?
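I tried to answer that last part for myself with a toy simulation. Every value in it is a guess (ideal battery with one fixed internal resistance, ideal cap with no ESR, constant-current load), so treat it as a sketch of the behavior and not an answer:

```python
# Toy simulation of a capacitor bank directly in parallel with the pack.
# Assumed model: ideal source (v_oc) behind a fixed internal resistance,
# ideal capacitor on the bus, constant-current load while on the throttle.
dt = 0.001                # time step (s)
v_oc = 157.0              # pack open-circuit voltage (V)
r_bat = 27.0 / 400.0      # resistance implied by a 27V sag at 400A (~0.068 ohm)
c = 8.0                   # capacitor bank size (F)
i_load = 400.0            # load current during the 3s event (A)

v_cap = v_oc              # caps start charged up to pack voltage
peak_after = 0.0
for step in range(int(10.0 / dt)):        # simulate 10 seconds
    t = step * dt
    load = i_load if t < 3.0 else 0.0     # throttle for 3s, then let off
    i_bat = (v_oc - v_cap) / r_bat        # battery current set by the sag
    v_cap += (i_bat - load) / c * dt      # cap absorbs/supplies the difference
    if t >= 3.0:
        peak_after = max(peak_after, i_bat)

print(f"battery current just after letting off: {peak_after:.0f} A")
print(f"recharge time constant R*C: {r_bat * c:.2f} s")
```

With those made-up numbers the time constant is only about half a second, so the caps only soften the very first instant of the pull -- and right after I let off, the pack dumps nearly the same current back into the caps. That looks a lot like the time-shifting I was worried about, but I could easily be modeling it wrong.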
And the third question, which may be moot based on the answers to the second. If capacitors drop in voltage in proportion to their charge level, then we only really get the capacity between the 'charged pack' voltage and the 'sagged' voltage. So let's say I'm normally at 157V and I sag to 130V... I only really get that 27-volt delta in capacitive capacity... right? That's only 17% of the capacitors' capacity... meaning one would really need 5-6x (or so) their desired Ah 'oomph' in capacitor farads. Is this correct?
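The math behind that 17% / 5-6x guess, assuming a capacitor's charge scales linearly with its voltage (Q = CV):

```python
# Usable fraction of a cap bank that only swings between the pack's
# full and sagged voltages.
v_full = 157.0
v_sag = 130.0

usable_fraction = (v_full - v_sag) / v_full   # charge is proportional to V
oversize = 1.0 / usable_fraction              # ~5.8x

print(f"usable charge: {usable_fraction:.0%}, so size the bank ~{oversize:.1f}x")
```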
You can find 70F ones that aren't physically much larger than the 9-farad ones above for only $11 (so, still affordable) -- that'd be a ~3Ah capacitor pack (of which you could use up to ~0.5Ah during a massive voltage sag... right?).
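Same math for the 70F caps, with the same series-string caveat as above:

```python
# The 70F version of the numbers. If a 63-cap string really acts like
# 70/63 F rather than 70 F, divide both results by 63.
cap_f = 70.0
pack_v = 157.0
usable_fraction = (157.0 - 130.0) / 157.0   # ~17% from question three

amp_hours = cap_f * pack_v / 3600.0         # ~3.05 Ah if it acts like 70F at 157V
usable_ah = amp_hours * usable_fraction     # ~0.52 Ah during a big sag

print(f"{amp_hours:.2f} Ah total, {usable_ah:.2f} Ah usable")
```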
Anyone have any answers, thoughts, corrections, or comments?