

·
Registered
Joined
·
23 Posts
Discussion Starter #1
I saw an interesting video on YouTube from an eBike guy who was using a DROK numerically controlled CC/CV boost (up-converting) circuit ($30 on Amazon) to charge a small eBike battery. It looked like a really neat product. He fed it from a modified HP server power supply that put out 24V, IIRC.

I'll need a 48V feed to charge my battery, and it looks like I can dial in a very specific constant voltage for the end of the charging cycle. I understand that the DROK unit has some throughput limits, and I think I can live with them, since I may literally have all week to charge the battery; it's for a riding lawnmower. 48V power supplies and chargers look pretty expensive, so I was thinking of driving the DROK unit with an HP server power supply. There seem to be quite a few to choose from that deliver 48V and roughly 500-1100 watts; I'll pursue the low end of that range, I think.

Any cautions or warnings, or positive experiences to share? Is there a much better way?
 

·
Registered
Joined
·
23 Posts
Discussion Starter #3
If I buy one, it will be a model that can output 48V DC without needing to put a second unit in series. I do understand the isolation issue, and would try to avoid that complication by buying a different model.
 

·
Registered
Joined
·
33 Posts
Battery charging is high up on my current project list. I've been working with a collection of Volt battery modules (which are 12S, BTW, not 13S) and tool batteries in 18V, 24V, 36V, and 48V configurations.

It's a challenging arena. The two mortal sins of lithium battery management are overcharging and overdischarging a cell. I already killed one cell of my Volt module by overdischarging it during testing.

The bottom lines in terms of charging are that charging needs to follow a strict constant current/constant voltage (CC/CV) regimen and that cells need to be monitored during charging. So unfortunately, just throwing a regulated power supply at the charging problem isn't really going to be sufficient.

The reason for constant-current charging is that the terminal voltage of the cells does not remain constant: as more energy is charged into the battery, the terminal voltage rises. If a constant-voltage supply is used, the smaller the difference between the regulated voltage and the battery terminal voltage, the less current (and power) is transferred. A constant-current supply instead pulls its output voltage up so that the current stays the same even as the battery's terminal voltage rises.

This is done until the battery reaches its maximum terminal voltage. For the Volt module that's 4.15V per cell, or 49.8V for the battery. However, capping a bit under the maximum gives some safety margin and can extend the module's life, so targets of 4.05 to 4.1V/cell are appropriate. At that point the charger goes into the constant-voltage stage, which supplies just enough current to hold the terminal voltage at the set point. As the battery continues to charge, it draws less and less current until it reaches a cutoff current. At that point the charger is supposed to turn off, since lithium chemistries are not fond of trickle charging.

The point is that charging is really supposed to be a carefully controlled and monitored energy-delivery process. To get maximum charging efficiency, the CC phase is important. To make sure not to overcharge, the CV phase is important. And because trickle charging is not allowed, the charger needs to be cut off at the end.
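In rough pseudocode, the whole profile boils down to a three-state loop. Here's a Python sketch (sketch only, not tested hardware code; read_voltage(), read_current(), and the set_*/charger_off() calls are placeholders for whatever measurement and regulation hardware is actually attached):

Code:
# CC/CV charge-control loop, sketch only -- not tested hardware code.
# The function arguments are placeholders for the real regulator interface.

CELLS = 12                 # Volt module
V_TARGET = 4.10 * CELLS    # conservative CV set point, 49.2 V
I_MAX = 2.0                # constant-current limit, amps
I_CUTOFF = 0.2             # stop charging once current tapers below this

def charge_step(read_voltage, read_current,
                set_current_limit, set_voltage_limit, charger_off):
    v = read_voltage()
    i = read_current()
    if v < V_TARGET:
        set_current_limit(I_MAX)      # CC phase: hold current, voltage rises
        return "CC"
    elif i > I_CUTOFF:
        set_voltage_limit(V_TARGET)   # CV phase: hold voltage, current tapers
        return "CV"
    else:
        charger_off()                 # cutoff reached: no trickle charging
        return "DONE"

Cell-level monitoring would sit alongside this loop and abort the charge if any single cell strays out of bounds, which is the point of the next paragraph.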

But the monitoring needs to extend down to the cell level. The problem with only monitoring the battery terminals is that the overall voltage may not reflect the state of the individual cells. In theory, if the voltage at the terminals is 43.2V, then each cell should be at 3.6V. But if the cells are imbalanced, with one at 3.8V and another at 3.4V, you'd get exactly the same 43.2V at the battery terminals. The problem is that once the terminal voltage reaches the CV stage, that higher cell will be over the limit and will be overcharged.

So I'm currently trying to manage all that complexity, with the added challenge of having a wide voltage range of batteries to charge. As I see it, there is a three-pronged approach to dealing with the issue:

1. A current/voltage regulator circuit that can deliver CC/CV and monitor both output voltage and current.

2. Cell level monitoring.

3. A control system that can manage the first two and deliver the appropriate CC/CV charging profile for a configurable terminal voltage and maximum current.

I hope there are no objections to discussing these issues here. I'll detail each in future posts.

ga2500ev
 

·
Registered
Joined
·
33 Posts
First up is CC/CV regulation and monitoring. As with all regulators there are two broad classes: linear and switching.

Linear regulators have the advantage of being simple, but the energy wasted using them can be horrendous. Almost all the high-power ones use a bank of pass transistors as a variable resistance to lock the voltage or current to the set point. Feedback control comes from measuring the output voltage, or the current across a shunt in the load path. It's the easiest way to go if you're willing to heat your charging area.
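To put some illustrative numbers on that waste (made-up but representative values, not from any particular charger):

Code:
# Power a linear regulator burns as heat -- illustrative numbers only.
v_in = 60.0    # supply voltage, V
v_out = 41.0   # battery terminal voltage, V
i_chg = 2.0    # charge current, A

p_delivered = v_out * i_chg           # 82 W into the battery
p_wasted = (v_in - v_out) * i_chg     # 38 W dissipated in the pass transistors
efficiency = p_delivered / (v_in * i_chg)
print(f"delivered {p_delivered:.0f} W, wasted {p_wasted:.0f} W, "
      f"efficiency {efficiency:.0%}")  # about 68%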

Switchers are much more efficient, and honestly, if they were as simple as the block diagrams all the tutorials show, they would be the hands-down winner. Unfortunately, there seem to be very few resources that meet my expectations, so I've had a frustrating time with them.

The basic concept of a buck regulator is that a switch between the power input and an inductor is used to charge the inductor. Because an inductor's current can't change instantaneously, when the switch is turned off the energy stored in the inductor is delivered to the load. In simple buck circuits a diode is typically used to complete the circuit when the switch is off; for better control and efficiency, a second switch can be used instead. That type is called a synchronous buck converter.
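The steady-state math, at least, is simple: ignoring losses, the output voltage is just the input scaled by the fraction of each cycle the switch is on. A quick illustration:

Code:
# Ideal (lossless) buck converter steady-state relation -- illustrative only.
v_in = 60.0    # input voltage, V
v_out = 41.0   # desired output voltage, V
duty = v_out / v_in
print(f"high-side switch on-time per cycle: {duty:.0%}")  # about 68%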

And that's where the simplicity ends. Almost every implementation uses boutique chips and calculus-level design software that spits out components seemingly made of unobtainium. And inductors, unlike any other passive component, seem to be expected to be hand-wound to specification. Then there's a bunch of selection criteria on inductance, core material, saturation, and coil heating that must be accounted for.

I find that little of it helps me, because none of the parameters I'm interested in are variable. I posted a thread here 6 years ago with exactly the same discussion: http://www.diyelectriccar.com/forums/showthread.php?t=70211. I ended up talking to myself, unfortunately. It would be great if anyone decided to join in.

Ultimately, I'd like to take an essentially fixed, off-the-shelf inductor, such as a handful of this one I bought at B.G. Micro: http://www.bgmicro.com/512uh-coil.aspx
and be able to parameterize it so I can figure out what I can do with it. One of the equations I generated in the thread above long ago:

L = Vout / (f * Iripple)

seems to be the most direct relationship between inductance (L), output voltage (Vout), switching frequency (f), and current (via the ripple current, Iripple). The rule of thumb is that Iripple should run about 25-33% of the load current.

So here's a quick real example of what I need with the 512uH coil above. We want to charge a Ryobi 40V battery, made of Samsung INR18650-13Q cells in a 10S2P configuration, with a slight undercharge to 4.1V/cell, so the CV set point is 41V. The max current for standard charging in a 2P configuration is a shade under 2A, so let's set the max current to 2 amps. That gives L = 0.000512H, Vout = 41V, and Iripple = 600mA (30% of 2 amps). So the switching frequency should be:

f = 41/(0.000512*0.6) = 133.46 kHz

Now, that's likely a bit fast for the iron-powder core of this toroid, so interestingly, the best thing to do is to lower the switching frequency. However, with everything else held equal, doing that raises the ripple current. That means we could support a heavier load, but lowering it too far will take the inductor out of continuous conduction, because if the ripple exceeds the average current the inductor runs out of energy before the cycle ends. Doubling the ripple current to 1.2A lowers the switching frequency to roughly 67 kHz, which is more reasonable. To compensate, though, I may consider charging at 3-4A instead of 2A. The max charging rate for the battery is 8A, so it's likely the little bump won't do much damage.
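As a sanity check on those numbers, here's the same arithmetic in a few lines of Python, using the simplified relation above (a fuller treatment would also fold in duty cycle, core loss, and saturation current, which this ignores):

Code:
# Switching-frequency estimate from the simplified relation L = Vout/(f*Iripple),
# rearranged to f = Vout/(L*Iripple). Ignores duty cycle, core loss, saturation.
L = 512e-6       # H, the B.G. Micro toroid
v_out = 41.0     # V, 10S pack at 4.1 V/cell

for i_ripple in (0.6, 1.2):   # 30% of 2 A, then doubled
    f = v_out / (L * i_ripple)
    print(f"Iripple = {i_ripple:.1f} A -> f = {f/1e3:.1f} kHz")

# Iripple = 0.6 A -> f = 133.5 kHz
# Iripple = 1.2 A -> f = 66.7 kHz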

Like I said: fun times.

ga2500ev
 

·
Registered
Joined
·
1,551 Posts
A million years ago, back in the tube days, we smoothed ripple with capacitors. What does a resonant circuit running at odd multiples of your switching frequency do to the ripple current?

OTOH, it's a battery. Or double your inductance with two in parallel.
 

·
Registered
Joined
·
33 Posts
A million years ago, back in the tube days, we smoothed ripple with capacitors. What does a resonant circuit running at odd multiples of your switching frequency do to the ripple current?

OTOH, it's a battery. Or double your inductance with two in parallel.
These circuits need both input and output capacitors. The PWMing of the input will slam the power source, so some capacitance is needed to buffer that switching. The output capacitor does smooth out the ripple, and it needs to be low ESR (equivalent series resistance) so that the ripple current doesn't heat the capacitor up.
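For a ballpark of what that ripple current turns into at the output, the usual approximation adds the ESR drop to the capacitor's charge/discharge ripple (the component values here are illustrative, not from a specific design):

Code:
# Rough buck output-ripple estimate: ESR term plus capacitor charge term.
# Component values are illustrative only.
i_ripple = 1.2      # A, inductor ripple current
f_sw = 67e3         # Hz, switching frequency
esr = 0.05          # ohms, output capacitor ESR
c_out = 470e-6      # F, output capacitance

dv_esr = i_ripple * esr                  # ripple from the ESR drop
dv_cap = i_ripple / (8 * f_sw * c_out)   # ripple from charging/discharging C
print(f"~{(dv_esr + dv_cap) * 1000:.0f} mV peak-to-peak")  # ~65 mV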

BTW, inductance combines just like resistance: series increases the total inductance, while parallel reduces it.

ga2500ev
 

·
Registered
Joined
·
33 Posts
So next on my list is control systems. The current trend is to do systems integration with specialized "boutique" chips: if you want a battery charger, buy a battery-charger chip; if you want a buck converter, use a simple switcher chip. I find there are numerous problems with this approach:

1. Often made of unobtanium.
2. Often costs as much as unobtanium.
3. Functions within rigid, narrow limits: pack size, voltage limits, current limits, switching frequency.
4. Either not adjustable, or adjustable in painstakingly painful ways: potentiometers or voltage dividers, for example.
5. Even when #1 isn't true, rarely do you have it on hand when you need it.

So, bucking the trend, I've settled on Arduino Pro Minis as my firmware-level control system and the Raspberry Pi Zero W for higher-level UI/network-connectivity systems integration (a rough sketch of the split follows the list below). Reasons include:

1. Prices of $5 or less for either board.
2. The Pro Mini's DIP-style layout makes for easy breadboard work.
3. Flexible and useful PWM and ADC for analog detection and control. This is specifically on the Pro Mini; ADC is one of the items lacking on a base RasPi.
4. Available programming environments, large user communities, and significant library and tutorial resources available.
5. Available via online ordering or for pickup any day of the week.
6. Programmable adaptability to different situations.
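
For what it's worth, here is the kind of split I have in mind. This is only a sketch: the Pro Mini would run the tight measure-and-regulate loop, and the Pi would supervise it over serial. The port name and the "V=..,I=.." message format below are hypothetical, not a real protocol from any hardware I have running.

Code:
# Hypothetical Pi-side supervisor. The serial message format and port name
# are made up for illustration; the Pro Mini firmware is assumed to stream
# "V=<volts>,I=<amps>" lines and accept SET/OFF commands.
import serial  # pyserial

V_TARGET = 49.2    # 12S at 4.1 V/cell
I_MAX = 2.0        # CC limit, amps
I_CUTOFF = 0.2     # stop once taper current drops below this

with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as link:
    link.write(f"SET CV {V_TARGET} CC {I_MAX}\n".encode())
    while True:
        line = link.readline().decode().strip()   # e.g. "V=48.7,I=0.45"
        if not line:
            continue
        fields = dict(pair.split("=") for pair in line.split(","))
        if float(fields.get("I", I_MAX)) < I_CUTOFF:
            link.write(b"OFF\n")   # taper current reached: stop charging
            break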

ga2500ev
 

·
Registered
Joined
·
23 Posts
Discussion Starter #9
As far as off-the-shelf components go, have you seen this item? It's what I referenced in the original post...

Search Amazon for "DROK Numerical Control Regulator DC 8-60V to 10-120V 15A Boost Converter, Constant Step Up Module Adjustable Output 48V 24V 12V DC Power Supply with LED Display"



It's what I am planning to use to manage the constant-current and constant-voltage charging of my battery. Incidentally, my battery is a 12S2P that I bought on eBay, and the seller recommended charging only to 4.0 to 4.1 volts per cell. I will likely keep it below that, as I think I will have more battery than I need.

I plan to use the DROK device to charge with a very low constant current, maybe 2 amps, and then have it switch to constant voltage limited to, say, 48.0 volts. I have not yet charged my battery at all, but I want to play it safe. For me, the mower will run for half an hour, once a week. I would literally allow a week to charge back up if that were the safest way to do it. I plan to estimate the charge time and then use a physical timer to turn off the charger at that set time, as an additional "dead man's switch" to prevent overcharging my battery. My current plan is to try to use a big laptop charger from an Alienware laptop. It puts out 19.5 volts at 12.3 amps. With the DROK device, I plan to use only about 100 watts of its 240-watt capacity. I have a simple calc at work that I'll post tomorrow to see if this is at all realistic.

And apologies: as a mechanical engineer, I am very much stretching my educational boundaries on this, and will need to use an off-the-shelf solution, certainly one that is unobtainium-free.
 

·
Registered
Joined
·
23 Posts
Discussion Starter #10
So now that I am back at my spreadsheet where I did my calculation, perhaps someone could check my rudimentary math and see if I have a giant error somewhere.



The tan area is the laptop power supply. It could do 239.85 watts, but I'll only ask 2.75 amps of it, meaning it will be outputting 114.5 watts, which I figure is much easier on it. (I can select the amps during the CC phase, and the device tells me if the power supply cannot keep up, by quickly flashing its LEDs or by shutting down.) So I plan to present 49V DC to my battery at no more than 2.75 amps, which is 144.5 watts (this is the brown area).

I am estimating that the mower will consume 40 amps just moving, plus an additional 90 amps while cutting, and it's cutting for 25 of the 30 minutes it takes to mow the yard. 48V x 40 amps x 30 mins + 48V x 90 amps x 25 mins = 165,600 watt-minutes.:eek:

Then 165,600 watt-minutes divided by 114.5 watts is 1,446 minutes, or about one full day. Does this strategy work? I am guessing at a 15% loss through the DROK, so I'm actually asking for about 131 watts but delivering 114.5.
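Here's the same arithmetic spelled out in a few lines of Python, using my estimates above (the amp figures are guesses on my part, not measurements):

Code:
# Charge-time estimate from my rough mower-load guesses (not measurements).
v_pack = 48.0                     # V, nominal pack voltage for the estimate
i_drive, t_drive = 40.0, 30.0     # amps and minutes just moving
i_cut, t_cut = 90.0, 25.0         # additional amps and minutes while cutting

energy_wmin = v_pack * i_drive * t_drive + v_pack * i_cut * t_cut
print(f"{energy_wmin:,.0f} watt-minutes")   # 165,600

p_charge = 114.5                  # W delivered to the pack through the DROK
minutes = energy_wmin / p_charge
print(f"~{minutes:,.0f} minutes, ~{minutes / 60:.0f} hours")  # ~1,446 min, ~24 h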

If this is the result, I am OK with it. I don't need to fast charge, maybe ever. ;)
 