The 18650 - 13s10p Project - 48V x 34Ah

15947 Views 95 Replies 8 Participants Last post by  PStechPaul
This is to chronicle a project to construct a 1.6 kWh module from 18650 cells. The configuration is 13s10p (edit: upgraded to 14s10p) using 130 (140) "Grade A" Panasonic NCR18650B (edit: unprotected) li-ion cells. These cells are nominally rated at 3.4 Ah and guaranteed to 3.25 Ah.

The first order of 100 cells (at US$3.28 each, $272/kWh) arrived from Shanghai today, and I have started testing individual cells; I plan to test every cell. Shipment was by air and arrived in 4 days. Shipping adds another $1 to the price, and with transaction costs the overhead comes to $1.24 a cell. I was told the cells cannot be shipped by sea and must go by air.

It has been observed that cells purchased from some suppliers in China have included mislabeled recycled cells or low-quality, low-capacity counterfeit Chinese cells. I plan to test for capacity, weight, impedance, and thermal behaviour during charging.

Photo of shipment: each cell comes in an individual white paper box with a safe-handling warning, and a pair of these boxes sits inside a green box with the same warning. These boxes are not from the original manufacturer (Panasonic/Sanyo). Each cell has a sticker covering the manufacturer's label, which reads NCR18650B without stating the capacity; the sticker itself reads "18650 3400 mAh 3.7V". There are also lot numbers on the cell's wrapping and probably on the steel casing, which may give a clue to the origins of the cell.

61 - 80 of 96 Posts
Sorry for my ignorance, but I could not find the schematic for the github/nseidle/BMS project. There are .sch files, but I cannot open them. What do I need?

Assume there is one ATmega MC per cell. The controller can tell the top (highest voltage) MC via an optoisolator to disable balancing or go to standby. That MC would then send a message to the next MC to do the same, and then go to standby itself. This trickles down the chain; only one optoisolator is needed.

If the controller needs to send a message to a particular MC, it sends it to the top MC and the message eventually reaches that particular MC.

If an MC detects an abnormal condition, it could send a message down the ladder which eventually gets to the controller. And the controller could periodically send a ping message to circulate and make sure all MCs are healthy.
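The trickle-down scheme described above can be sketched as a toy simulation (class and method names here are illustrative, not firmware from the thread):

```python
# Sketch of the trickle-down standby scheme: the controller signals only the
# top MC (via the single optoisolator); each MC forwards the command to its
# neighbor below before acting on it itself.

class CellMC:
    def __init__(self, index):
        self.index = index
        self.below = None      # next MC down the ladder (None for the bottom)
        self.standby = False

    def receive(self, command):
        if command == "STANDBY":
            if self.below is not None:
                self.below.receive(command)   # pass it down first
            self.standby = True               # then act on it

# Build a 13s ladder: mcs[12] is the top cell, mcs[0] the bottom.
mcs = [CellMC(i) for i in range(13)]
for upper, lower in zip(mcs[1:], mcs[:-1]):
    upper.below = lower

mcs[-1].receive("STANDBY")             # controller talks only to the top MC
print(all(mc.standby for mc in mcs))   # every MC ends up in standby
```

The same forwarding pattern covers the upward direction (fault reports and pings) by linking each MC to its neighbor above instead.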

What is it that a BQ7694 can do that 8 or 15 MCs cannot do with proper distributed firmware?
Looks like an Eagle project; you can download Eagle from Autodesk, and there's a free version.

Sure, if you're up to the task then a microcontroller will do just fine, though you may still want to pair it with a good ADC; I was going to use a MAX11629 8-channel 12-bit ADC. I don't think the built-in 10-bit ADC has enough resolution, especially when dealing with voltage dividers, but that's up to you. Here's some data taken with such a setup and that ADC: http://www.diypowerwalls.com/t-Pack-Cycle-Testing
I've looked into various design ideas for a BMS and cell balancer. I think this could be accomplished with one microcontroller for each set of 8 cells and an 8-channel DG408 analog multiplexer reading the string voltage at 8 tap points through an 8:1 voltage divider. The DG408 can handle up to 44 volts, so eight cells at 4.2 volts each (33.6 volts) are within its rating, and it draws only 10 uA. A 100k voltage divider draws at most 420 uA while reading, but a reading takes less than 1 ms perhaps once every few seconds, so the average current is well under a microamp.

Each set of 8 cells can perform balancing by putting loads on high cells, using transistors for level-shifting. Each 8-cell module can communicate with a central processor directly, or in daisy-chain fashion with other modules, using opto-isolators or digital isolators. The microcontroller can sleep when not sampling cell voltages or communicating. Twelve modules of 8 cells each would work for 403 volts with 4.2V Li-Ion cells.
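A quick back-of-envelope check of the figures above (the 1 ms read every 2 s duty cycle is an assumed example, not a number from the post):

```python
# Sanity-check the DG408/divider figures quoted above.
V_CELL_MAX = 4.2
N_CELLS = 8
R_DIVIDER = 100e3            # 100k total divider resistance

v_top_tap = V_CELL_MAX * N_CELLS    # 33.6 V, under the DG408's 44 V limit
i_worst = 42.0 / R_DIVIDER          # 420 uA with ~42 V across the divider

# Duty-cycled sampling: a 1 ms read every 2 s (assumed numbers).
t_read, t_period = 1e-3, 2.0
i_avg = i_worst * t_read / t_period  # average divider current

print(f"top tap: {v_top_tap:.1f} V")
print(f"worst-case divider current: {i_worst * 1e6:.0f} uA")
print(f"duty-cycled average: {i_avg * 1e6:.2f} uA")
```

With those assumptions the divider's average draw is about 0.2 uA, comparable to the multiplexer's own quiescent current.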
The DG408 idea is interesting, PStechPaul. Would a digital isolator be different from an optoisolator?

With 10-bit resolution, that would be about 0.005 V, assuming a 0 to 5 V range? My feeling is that this may be sufficient resolution. If so, then all optoisolators except one could be eliminated if we use one MC per cell. I think an ATmega can be as cheap as $2.00. And there would be no need for precision voltage division. My gut reaction is that this is superior to using a specialized BQ7693 chip, but a more nuanced analysis would be required to decide between these several designs.
FYI, folks here design BMSs for fun, so don't expect much production value.

caveats:

Balancing: don't bother. It is more likely to hurt than help (except maybe in OEM units). If you can install a BMS, you can balance manually, and it is rarely if ever needed; really you just need an idiot light to tell you something is out of balance. Letting an imbalance correct itself automatically throws away cell-specific aging information anyway.

Units that cause imbalance: this is just dumb.

DIY packs occasionally get opened: deliberately or accidentally. If you break the circuit between two cells (or a cell fails open), whatever is bridging the gap is exposed to a high reverse voltage, and if the monitoring is level-shifted or spans multiple cells rather than being isolated, the failure can cascade.

Power draw: keep it very tiny or it will kill more batteries than you can count; zero draw when not charging or running is ideal. A lot of batteries have died just sitting there. Powering the BMS via isolated DC converters is one, albeit expensive, method. Op amps can reduce the voltage-sensing load, or an opto-controlled MOSFET on the divider/CPU power, etc.

Accuracy: you probably still need a calibration method/rig after assembly. It could be software (EEPROM), some solder pads on unused logic pins, or (gag) trim pots.

Network speed, reliability, and noise immunity vs. ease of installation and cost: depending on what you hope to accomplish, having all nodes measure their cells at exactly the same time (combined with a concurrent overall current measurement) might be important, or it might not. It is often acceptable to trigger a concurrent measurement with a broadcast, then go back and collect the results individually.

Bus-type networks have addressing headaches, though: each node needs to be addressable, and often that means more solder pads for the end user, or firmware/EEPROM updates to assign an address. And for messages to be meaningful to the user, the address has to relate to the battery cell in question.

Plus you should consider things like twisted pairs (power, signal/ground or differential), constant-current signaling (i.e. digital 4-20 mA), possible clock lines, etc.

For simple installation, I tend toward a UART token-ring style (preferably with hardware majority vote and a crystal), since it needs only a twisted pair between BMS nodes and handles automatic addressing easily at initialization. If you are brave you can level-shift between nodes, but otherwise it is just one opto per interconnection. It doesn't do concurrent readings, but it works. And you can add low-power/sleep modes that wake on signal (just keep pinging the first node from the BMS controller until it hears back from the last node); lots of protocol fun.

And having said and learned all that, this is what I use on 48 V systems :) (monitor it like you would a gas gauge; it also gives you the pack voltage if you multiply the reading by 2, and if the numbers drift too far apart it's time to investigate).
Yes, if you did one microcontroller per cell you could get away with it; there are some very small/cheap ATtinys: the ATtiny102 is $0.36 each (lower in bulk) and has a 10-bit ADC. You need to consider the voltage reference, though: the on-board 1.1 V bandgap reference is only accurate to a few percent and varies with supply voltage and temperature. Very few reference ICs have supply current at the uA level, so you may want to just use a TL431 or similar and switch it on only when needed.

As for balancing and whether it's needed: yes, it's very much needed for an 18650-type pack; please see the conclusion of my experiment here: http://www.diypowerwalls.com/t-Pack-Cycle-Testing

After around 40 charge/discharge cycles at room temperature in a lab environment, even charging only to 4.125 V/cell, one cell drifted far enough out of balance to exceed 4.2 V during the charge cycle and I had to end the test. I don't know about you, but I don't want to have to crack open my pack once a month to manually balance it. Note this was with a pack initially balanced to within 30 mAh or so. You can't expect a DIY pack to be as well matched as a proper EV battery with all cells made on the same factory line, like a Nissan Leaf pack.
Thanks DCB, very interesting. My first pack, a 14s10p, has the same BMS as yours! Except there are four 4PDT selectors and a switch that let me inspect each cell manually, plus one 4-digit voltmeter.

Another option is to use 4 DG408s, two voltmeters, two counters, and pushbuttons, but that is just too much work.

I reckon that if one uses new 18650s, doesn't cycle too often, and only charges to 4.15 V, it will take a long time for things to go out of balance.

Nevertheless I will use a cheap brainless 14s balancer-protector board.
Here's what I've got for my board: a dedicated 12-bit ADC, 8 dividers (switchable using P-FETs), and 8 balancers. Board-to-board communication is still through an isolator, but I can make a version with just a divider/level shifter for the packet-forwarding approach:

Board is 3.05x1.25", haven't finished routing, just initial placement.


Schematic for the balancers:

Schematic for the Vsense section:

Solar, if you'd be interested in collaborating on the firmware side I'd be happy to handle hardware design/test/build. I hate writing code but I can do it if I have to ;)
Imbalance-inducing BMS alert:

If this is the type of monitoring you used in your cell test, then it is no wonder it went out of balance. Also, stop charging them to 4.2 V.
A 10-bit ADC spanning 0 to 5 V resolves about 0.005 volts per step at the ADC pin, but referred through an 8:1 divider that becomes roughly 0.04 volts; a 12-bit ADC would bring it back to about 0.01 volts input-referred. Except for the bottom cell, readings would need to be calculated as the difference between successive taps, which roughly doubles the worst-case error. A differential ADC might work, but requires two multiplexers.
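The differencing point can be illustrated with a small sketch (the tap arrangement is assumed: one 10-bit reading per string tap through the 8:1 divider, with cell k recovered as tap[k] - tap[k-1]):

```python
# Cell voltages recovered by differencing successive quantized tap readings.
# Each tap carries up to half an input-referred LSB of rounding error, so a
# differenced cell voltage can be off by up to one full LSB.

LSB = 5.0 / 1024 * 8      # 10-bit, 0-5 V ADC behind an 8:1 divider (~39 mV)

def quantize(v):
    return round(v / LSB) * LSB

true_cells = [4.20, 4.18, 4.21, 4.19, 4.20, 4.17, 4.22, 4.20]
taps = [sum(true_cells[:k + 1]) for k in range(8)]   # cumulative string voltages

meas_taps = [quantize(t) for t in taps]
meas_cells = [meas_taps[0]] + [meas_taps[k] - meas_taps[k - 1]
                               for k in range(1, 8)]

for t, m in zip(true_cells, meas_cells):
    print(f"true {t:.3f} V  measured {m:.3f} V  error {abs(m - t) * 1000:4.1f} mV")
```

With ~39 mV steps the per-cell error can approach that full step, which is why a finer ADC (or the flying-capacitor approach below, which avoids the divider entirely) looks attractive.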

It might be possible to design a flying-capacitor circuit where MOSFETs on adjacent cell taps take a sample of the desired cell voltage, and then another pair of MOSFETs applies that capacitor to the ADC. This would eliminate the voltage divider and allow full 10-bit resolution. The current draw for the first sample would be significant, but for subsequent samples the only current would be that required to stabilize the reading. It would even provide some charge balancing, since a capacitor charged from a high cell imparts some of that energy into a lower cell.

It has been considered previously:

http://americansolarchallenge.org/ASC/wp-content/uploads/2013/01/SAE_2001-01-0959.pdf

http://www.google.com/patents/US7362588

https://patents.google.com/patent/US8786248B2/en

http://www.utdallas.edu/essl/projects/charge-balancing-problem/

https://www.edn.com/design/analog/4334442/Analog-multiplexer-uses-flying-capacitors
Re: The 18650 - 14s10p Project - 48V x 34Ah

Wow - thanks. What a productive set of replies from everyone here. Can't wait to find the time to study these in depth, and complete this project and thread.

As for my other 100 kWh 14s or 27s project, I am leaning toward used Tesla banks. But the Tesla BMS would be problematic, as it is designed for 96s.

That does not mean I wish people to have crashes in their T3 - but to get my hands on some 2170s, I may have to cause some totals. :D:D Just kidding.
I'm now looking at trying to make the simplest possible single cell stackable BMS. It's an interesting challenge but some things get easier/more elegant. So far it's looking like 3 FETs (1 for level shifting for communication between boards, 1 for enabling the voltage divider and voltage reference, and 1 for balancing), 1 shunt voltage reference, and an ATtiny102. I think communication could be as low as 2 bytes per board (shunt bit + 7-bit board ID, 8-bit cell voltage) or a command byte. Cells would be read out in groups of as many as can fit in the RAM space of the ATtiny102.
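The two-byte frame could pack like this (the field layout follows the description above, but the 2.0 V floor and 10 mV/step voltage coding window are my illustrative assumptions, not a spec from the post):

```python
# Hypothetical 2-byte status frame: byte 0 = [shunt bit | 7-bit board ID],
# byte 1 = 8-bit cell voltage. Coding window (2.0 V + 10 mV/step) is assumed.

V_MIN, V_STEP = 2.0, 0.01

def pack_frame(board_id, cell_v, shunting):
    assert 0 <= board_id < 128
    code = max(0, min(255, round((cell_v - V_MIN) / V_STEP)))
    return bytes([(int(shunting) << 7) | board_id, code])

def unpack_frame(frame):
    shunting = bool(frame[0] & 0x80)
    board_id = frame[0] & 0x7F
    return board_id, V_MIN + frame[1] * V_STEP, shunting

frame = pack_frame(board_id=13, cell_v=4.125, shunting=True)
board, volts, shunt = unpack_frame(frame)
print(board, round(volts, 2), shunt)
```

An 8-bit code over a 2.55 V window gives 10 mV resolution, which is coarse but arguably enough for balancing decisions.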
I'm now looking at trying to make the simplest possible single cell stackable BMS.
"attiny24"
why do you need a divider though, and a voltage reference in that case?

fwiw, this is a pretty deep rabbit hole, and you should know how to program to work with microcontrollers effectively. I did sort out an assembly routine for the tiny44 once (same as the 24); having no hardware UART made assembly the only choice for timing (literally counting instructions). It had majority vote, and I reduced the frame size (below what a typical hardware UART would allow) so it would be reliable without a crystal oscillator, in a daisy chain (TX of cell 0 goes to RX of cell 1, etc.). Auto-addressing (the command packet had a counter that each node stored, incremented, and forwarded). Chain functions, like just give me the max voltage in the string, or the min, or the sum, plus individually addressable voltages, with room for a temperature reading as well (thermistor, with max and min chain commands). Plus an LED to help troubleshooting (i.e. follow the chain until the LEDs stop lighting for network problems, or look for the blinking LED to identify the battery in question) and sleep/wake modes.
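The auto-addressing and chain-function ideas can be modeled in a few lines (a toy simulation of the scheme described above, not DCB's actual tiny44 firmware):

```python
# Toy daisy-chain model: an "assign address" packet carries a counter that
# each node stores and increments before forwarding, and chain commands
# fold a value (max/sum) down the string as the packet passes through.

class Node:
    def __init__(self, cell_v):
        self.cell_v = cell_v
        self.addr = None

def run_chain(nodes, command, value=None):
    for node in nodes:                    # TX of node k feeds RX of node k+1
        if command == "ASSIGN_ADDR":
            node.addr = value             # store the counter as my address
            value += 1                    # increment and forward
        elif command == "MAX_V":
            value = node.cell_v if value is None else max(value, node.cell_v)
        elif command == "SUM_V":
            value = (value or 0.0) + node.cell_v
    return value

chain = [Node(v) for v in (3.71, 3.69, 3.74, 3.70)]
run_chain(chain, "ASSIGN_ADDR", 0)
print([n.addr for n in chain])               # [0, 1, 2, 3]
print(run_chain(chain, "MAX_V"))             # 3.74
print(round(run_chain(chain, "SUM_V"), 2))   # 14.84
```

The appeal of the chain commands is that the controller gets a pack-level answer (max, min, sum) in a single pass without polling every node individually.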

It wasn't one CPU per cell, though it's easily modifiable for that; at that point it is literally just the CPU and an opto as the bare minimum, since the voltage range is acceptable. You still may need a calibration step after assembly, though (and it might need to measure VCC and internal temperature to be accurate).

So while "simplest" can be optimized on the hardware side, that doesn't make the software (the part you are avoiding :) ) simpler. Siloing hardware from software in microcontroller work while trying to optimize is a bad mix. If you wash your hands of the programming side, then by the same token the programmer can wash their hands of the hardware side and just make whatever you give them work, more or less, suboptimally.

Though I'm concerned that you are ignoring very relevant comments regarding methodology, as if you see a fortune in this direction; that is what kills batteries and floods the market with junk. I think you missed that ship by about 8 years, though.
The internal bandgap reference is only good for +/-10% in ideal conditions, so clearly a better reference is needed. A divider is needed to get the cell voltage below VCC, which would itself be the 2.5 V voltage reference (a common reference value below 3 V).

Trust me, I hear you about the challenges of making a custom board and programming it. I still don't think it's the way to go for my project; this is more a fun academic exercise for me. If others find this work useful I'll continue; otherwise not much has been lost except time.
The internal bandgap reference is only good for +/-10% in ideal conditions, clearly a better reference is needed.
Not if you know that there is also an internal temperature sensor and you calibrate it (in flash/EEPROM), and also account for VCC changes (with that calibrated too). A lot of things can be made into software problems; that is why knowing programming (and datasheets) is important for "simplest" hardware with microcontrollers. And even with an external reference you need to calibrate the ADC as part of the assembly/programming step, and probably account for changes in VCC and temperature too if you are trying for accuracy; only now the temperature of the reference isn't as tightly coupled to the internal temp sensor...

Did you know the attiny24 also has differential adc and an op amp? (which, sigh, also needs calibrating, and very careful hardware considerations)
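The "make it a software problem" approach might look like this (a sketch only; the two-point model and every coefficient value are made-up assumptions, not ATtiny datasheet numbers):

```python
# Software-calibrated ADC reading: per-board gain/offset from a two-point
# calibration rig (stored in flash/EEPROM) plus a first-order temperature
# correction from the internal sensor. All coefficients are illustrative.

CAL = {
    "gain": 1.0132,          # from the calibration rig at assembly (assumed)
    "offset": -0.018,        # volts (assumed)
    "temp_coeff": -0.0004,   # volts per degree C of drift (assumed)
    "cal_temp_c": 25.0,      # temperature at which calibration was done
}

def corrected_voltage(raw_code, vref, temp_c, cal=CAL, bits=10):
    """Raw ADC code -> calibrated voltage with temperature compensation."""
    v = raw_code / (2 ** bits - 1) * vref               # nominal conversion
    v = v * cal["gain"] + cal["offset"]                 # two-point correction
    return v + cal["temp_coeff"] * (temp_c - cal["cal_temp_c"])

print(round(corrected_voltage(raw_code=760, vref=5.0, temp_c=35.0), 3))
```

The catch, as noted above, is that the rig to measure those per-board coefficients becomes part of the production process.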
Yes, it probably would be better to start with a more capable microcontroller and work down from there (the ATtiny102 is a factor of 2 cheaper and correspondingly less capable: no temp sensor, no EEPROM, less RAM and flash). Just thought I'd share that I do have significant experience programming microcontrollers at a low level (C, and some inline assembly for precision delays when needed): http://rev0.net/index.php/Main_Page To prove I'm not an armchair philosopher :)

I would be interested to review the BMS you mentioned with a similar approach, if you ever completed or documented it.
ah, RC enthusiast! (I like the sailplanes)

Just FYI, a lot of research shows significant degradation in lithium batteries charged fully to 4.2 V; e.g. in my Leaf I used the timers to set it to 80% charge and keep it out of the red, as has been discussed on the mynissanleaf forum. Sitting at full charge is also bad, and heat makes it worse, so the low voltage you reported on receiving the cells is actually a reasonable long-term storage voltage (30% SOC, IIRC). I charge my Leaf cells to ~4.1 V. Overcharging and overdischarging reduce cell life exponentially, so you get less total energy out of the cell over its lifespan. I know the RC guys tend to max it out and just replace the pack more often, and cell phones want to maximize their battery claims, but at car scale that gets expensive.

This is part of the battery management domain that you are entering.

I think you missed my point on the CPU, though: the ATtiny24/44 can get reproducible results with minimal external hardware, and is cost-effective even at one CPU per cell, but there is a large initial software cost, plus building a test/calibration rig to validate each unit produced. And you are likely to run into the same problems with any microcontroller unless you pay a lot for it. The devil is in the details, as they say.

So for a "cheap" personal pack, it doesn't make a lot of sense to get fancy with the BMS hardware either. If the BMS costs 3x the cost of the battery, just buy 4x the battery instead and simplify the monitoring.

Also, there are concerns with the stated need for balancing: I think there may be errors in your testing procedure, and you are comparing 1p to 10p. You are claiming, without really backing it up, that there is a need and that you have the solution. It is rather circular, like inventing a BMS with built-in imbalance and then using that as the reason for adding balancing. Then saying you don't want to "crack open" the pack when you know each cell will have a lead routed externally anyway...

There are a lot of holes in the reasoning here, and in the implementation, and this is all for an extra 5 miles of range on a Leaf? Do you have any idea how to interface with the Leaf battery?

I guess the problem is you are stating things as absolutes, but you haven't done all your homework yet or properly backed up those assertions, despite hearing otherwise from numerous experienced folks.
Your skepticism about my project is understandable; I never really explained the big picture anywhere. I bought a Leaf with 100k+ miles on it and a battery at death's door (only 5 capacity bars), which I replaced with cells from a 2013 just a couple of weeks back, so I'm well aware how expensive it is (the replacement pack cost me twice what the car did). I originally intended to buy a Leaf with a few bars missing and supplement the lost range with a modular system, such that I could add more 1.2 kWh packs whenever I had the time and money. I was inspired to take on an EV project by the likes of Jehu and his video about a DIY-built car getting 700+ miles of range using recycled batteries. I'd like my Leaf to be able to scale someday to a couple hundred miles or more if I come across a good deal on batteries, and as battery tech and energy density improve over time.

I'm new to the EV world, true, but very familiar with Li-ion batteries, how they degrade, and how to take care of them to prevent that. The Leaf only charges to a max of 4.125 V and discharges to a min of 3.4 V, which is already conservative even on a "100%" charge cycle. I've seen most of the range-extender implementations on Leafs as well; my favored route is to tap in right after the battery connector (after the battery's internal contactors) and monitor the contactor signals to know when to switch the extender pack in and when to disconnect it. I've already got the HV connector (both sides) to make an in-line tap, and I'm trying to get hold of the circular connector which carries the CAN bus signals and contactor relay controls.

So for my first 96s2p pack, cell balancing will be relatively important; there's not much in parallel to offset the effects of cell imbalance. As I add more packs in parallel (every cell will be paralleled with the original pack by connecting the balance leads together), the system should get progressively easier to balance. I'm also skeptical about how necessary balancing is; I've done the reading from both sides of the argument, and there just wasn't enough data to convince me one way or the other. My preference is to monitor all cells and judge on my own whether balancing is needed, but the extra hardware costs only a few extra cents on a custom board, so why not add it. And as a disclaimer again: I will be using a Nissan Leaf BMS for my project, which already has automotive-grade hardware to monitor and balance all 96 cells with proper isolation, and a nice Android graphic interface on top ;) I'm just looking into my own BMS for the fun of it, maybe to use in other projects.
I am trying to get my head around this. In one of the other designs linked, there was no need to disable the voltage reference for balancing: only one FET for the balancer, and the voltage reference appeared to draw current only when the voltage exceeded the reference?

Is a FET needed for level-shifting the communication? Communication is one-way, from upper board to lower board only, and serial. Could you not use another zener/resistor to drop the voltage so that when the output is low there is no current drain? Assuming daisy-chain serial communication, it can be as many bytes as you want, and I think you want it synchronous (token ring) and periodic. If the controller detects no pack charge or discharge, it would minimize polling and eventually put all nodes on standby. A data space of 128 bytes would be sufficient with logarithmic coding of the voltage. No board ID is needed in the packet, as the position of the data is the ID.

I think the stackable BMS architecture makes a lot of sense: more elegant, replicable, configurable, and extensible; a minimalist design that can be ported to a variety of power packs. All the parts are off-the-shelf, and the complexity migrates into the software, which is a lot easier to hack than circuit boards and specialized ICs. No CAN bus or proprietary decoding.
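The logarithmic coding and position-as-ID ideas could be sketched like this (the 2.0-4.5 V window and the encoding scheme are my assumptions for illustration; a linear 10 mV code over a narrower window may well be simpler in practice):

```python
import math

# One byte per cell; the byte's position in the packet is the cell ID, so no
# address field is needed. Voltage is coded logarithmically over an assumed
# 2.0-4.5 V window.

V_LO, V_HI = 2.0, 4.5

def encode(v):
    v = min(max(v, V_LO), V_HI)                           # clamp to window
    frac = math.log(v / V_LO) / math.log(V_HI / V_LO)     # 0..1 log position
    return round(frac * 255)                              # fit in one byte

def decode(code):
    return V_LO * (V_HI / V_LO) ** (code / 255)

# Controller-side view: one data byte per node, collected in chain order.
packet = bytes(encode(v) for v in (3.65, 3.70, 3.68, 3.72))
print(round(decode(packet[2]), 2))   # position 2 is cell 2's voltage
```

Round-tripping through the byte loses at most about half a code step, roughly 6 mV near the top of this window, which is in the same ballpark as a plain linear 8-bit code.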