DIY Electric Car Forums

EV Display version 2 - feedback wanted

13533 Views 85 Replies 17 Participants Last post by MalcolmB
I'm working on a prototype of the next version of EV Display, trying to improve as much as possible to compete with the big boys :)

A solid zero point across a wide temperature range is the most challenging part. Since I have no hands-on experience with other E-meters, I can't compare how well they've done it. So, a question to those with experience: how well does your E-meter hold its zero point while the car is sleeping in a cold/hot garage, driveway, parking lot, etc.? Do you have to recalibrate it when the temperature swings a lot?

With my new 16-bit AD converter I am able to read currents in the range of +/-0.1A to 600A using a hall effect sensor. The problem is that when I freeze it to 0F or heat it to 100F, the zero point fluctuates within +/-0.4A. This is within the specs of the parts I am using, so there is no way to improve it. To address this I am thinking of ignoring readings of less than 0.5A, to stabilize the zero-point range. Currently EV Display ignores currents under 4A for the same reason, which works OK for larger EVs but isn't good for smaller packs/chargers. I was limited by the 10-bit AD converter, hence the larger error margin.

I think for a lower-cost device a 0.5A minimum readout in the +/-600A range should be acceptable, right?
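As a rough illustration of the deadband idea above, here is a minimal sketch; the function name and structure are assumptions for illustration, not the actual EV Display firmware:

```python
# Deadband sketch: treat tiny readings as 0 A so that +/-0.4 A of
# temperature-induced zero-point drift can't slowly corrupt the Ah count.
DEADBAND_A = 0.5  # assumed cutoff from the post

def filtered_current(raw_amps: float) -> float:
    """Return 0.0 for readings inside the deadband, else the raw reading."""
    return 0.0 if abs(raw_amps) < DEADBAND_A else raw_amps
```

The trade-off is exactly the one described above: anything drawing less than 0.5A (small chargers, standby loads) becomes invisible to the Ah counter.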

I am also adding voltage scaling for packs up to 500V, since some people insist on knowing their pack voltage :rolleyes:. This will also allow precise watt and watt-hour tracking, if that is something you fancy ;)
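For what it's worth, watt-hour tracking amounts to integrating volts times amps over time. A hedged sketch (the function name and sample interval are made up):

```python
# Accumulate watt-hours by integrating instantaneous power over each sample.
def update_energy(wh_total: float, volts: float, amps: float, dt_s: float) -> float:
    """Add volts * amps over a dt_s-second interval, converted to Wh."""
    return wh_total + volts * amps * dt_s / 3600.0

# Example: 150 V at 120 A for one second adds 18000 W * (1/3600) h = 5 Wh.
```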

I'm adding a PWM output to drive a mechanical fuel gauge, in addition to the low-fuel relay I already have.

It will be powered from a 12V source, but will be DC-DC isolated from the pack voltage.

Can't think of anything else right now, it's been a long week :)
I have to recalibrate my EV Display whenever the zero point gets out of whack, like when the temperature changes from warm (69F) to cold (42F) or back. One day when it was warm outside, the meter showed 3A out, so I recalibrated the zero point. When it got cold in the evening, the meter showed 6A in. I recalibrated it again. I messed around with the temperature compensation and set it at 50%; a few days in, the meter seems to be okay.

I’m using a voltage scaler to power the EV Display and it works great.


It would be great if you could make the backlight come on when you press the throttle and a certain amount of amps flows out, so we don't have to push the button for the backlight to see in the dark.


Another feature: if you could integrate an output like the low-fuel relay, but this one a full-fuel relay that cuts off the charger when the fuel gauge reaches 100%. Use it to drive an external relay for the charger cut-off.

You are talking about the current readout, not Ah, with no load? My TBS Expert Pro always reads 0.0 A with the ignition key off in temperatures from -5 F to 110 F. With key on, no throttle, it displays a small current I assume is due to the DC/DC converter. Haven't paid attention to whether this varies with temperature.

I like this idea:
Another feature: if you could integrate an output like the low-fuel relay, but this one a full-fuel relay that cuts off the charger when the fuel gauge reaches 100%. Use it to drive an external relay for the charger cut-off.
I'm always looking for more redundancy to avoid overcharging.
What about a current sensor around the charger lead only? When the current in the main wire drops below 0.5A (or whatever) AND the current reading from the charger lead is near zero, ignore the sub-0.5A reading; otherwise keep it. The sensor around the charging lead could be of a higher sensitivity and be used to track Ah into the pack if desired.

Another possibility, which could be done in addition to the above idea, is to assume 0A once some amount of time has elapsed since the current was, say, >1A. That way, when sitting at a stop light with no accessories draining power, the low draw of the other components still gets counted.
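One way this timeout idea might be sketched; the threshold, hold time, and names here are assumptions for illustration:

```python
# If no sample has exceeded THRESHOLD_A for HOLD_S seconds, assume the true
# current is 0 A (it's probably just sensor drift); otherwise keep integrating
# even tiny currents, so idling at a stop light is still tracked.
THRESHOLD_A = 1.0   # assumed "significant draw" level from the post
HOLD_S = 300.0      # assumed quiet period before forcing zero

def effective_current(amps: float, now_s: float, last_active_s: float) -> tuple[float, float]:
    """Return (current to integrate, updated last-active timestamp)."""
    if abs(amps) > THRESHOLD_A:
        last_active_s = now_s            # significant draw: reset the timer
    if now_s - last_active_s > HOLD_S:
        return 0.0, last_active_s        # long quiet spell: force zero
    return amps, last_active_s
```

The downside to watch for is exactly the one hinted at: a genuine small load left on for hours (a contactor coil, say) would stop being counted once the timer expires.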

For your 12V isolated DC-DC: Is it connected across the full pack? I like the CycleAnalyst because it uses full pack voltage so it doesn't rely on another DC-DC and doesn't imbalance the pack.

You could also add a feature where, when the pack voltage reaches a set level, the Ah counter resets to zero (100% SOC). This would need to be user adjustable.
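A minimal sketch of that voltage-triggered reset, assuming the user stores a reset voltage; the 170.4V figure below is just an example for a hypothetical 48-cell LiFePO4 pack, and the names are made up:

```python
# Reset the Ah-used counter to 0 (i.e. 100% SOC) once the pack climbs past
# a user-set voltage; otherwise leave the accumulated Ah alone.
RESET_VOLTS = 170.4  # hypothetical user setting

def maybe_reset_soc(pack_volts: float, ah_used: float) -> float:
    """Return the Ah-used counter, zeroed if the reset voltage was reached."""
    return 0.0 if pack_volts >= RESET_VOLTS else ah_used
```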
You are talking about the current readout, not Ah, with no load? My TBS Expert Pro always reads 0.0 A with the ignition key off in temperatures from -5 F to 110 F. With key on, no throttle, it displays a small current I assume is due to the DC/DC converter. Haven't paid attention to whether this varies with temperature.

It's the current read-out, and of course the Ah, with no load while the car is parked. For example, if the read-out shows 3A for one hour, it subtracts 3Ah from the total, say from 180Ah to 177Ah. My EV Display doesn't read current in or out if it's less than plus or minus 3A. I guess your TBS Expert Pro doesn't vary with temperature like the EV Display does. I think increasing the temperature compensation percentage could reduce the effect.
One possibility is dual current sensors: one for the main traction wire that can read up to 1000 amps, and another for charging and small HV loads like the DC-DC converter, designed to read +/-50 amps. The larger one can have a much larger offset to ignore, and the one on the smaller loads can be more sensitive.
Thanks for the feedback!

Automatic backlight is easy, will add it to the feature list.

A full-charge relay might be doable, but it might interfere with the charger in some cases. For example, if you reset the device for whatever reason while the pack is not fully charged, you should then do a full charge to match real SOC with displayed SOC, but you can't, since the device inhibited your charger at the reset point. It's a catch-22. I'm on the fence about this one, especially since I am very tight on room inside the device and adding another optocoupler is a challenge.

The dual-sensor idea is logical, but not practical for a low-cost universal E-meter. I'd rather give up some sensitivity on the low end than go with dual sensors.

I will try to programmatically stabilize the zero point at 0.1A across the temp range, but will also add a user-adjustable zero-range setting to avoid issues in extreme conditions.
Another possibility, which could be done in addition to the above idea, is to assume 0A once some amount of time has elapsed since the current was, say, >1A. That way, when sitting at a stop light with no accessories draining power, the low draw of the other components still gets counted.
This is an interesting idea; I need to think about it some more to make sure there is no downside.

For your 12V isolated DC-DC: Is it connected across the full pack? I like the CycleAnalyst because it uses full pack voltage so it doesn't rely on another DC-DC and doesn't imbalance the pack.
Voltage sensing will be across the pack, but device power needs 12V nominal. The internal DC-DC will isolate pack negative from 12V ground, but I can't power the device from the pack using the internal DC-DC, since my planned range is too wide. I will support packs up to 512V, and I can't include a DC-DC for such a wide range. Most EVs already have a pack-voltage DC-DC, so it's not an issue to power the display from 12V nominal as long as it's isolated from the pack. I think this is the most logical and cost-effective approach.

You could also add a feature where, when the pack voltage reaches a set level, the Ah counter resets to zero (100% SOC). This would need to be user adjustable.
I already have this feature in the current version, but it's useless unless you power the device from the pack, which needs an external voltage scaler. The new version will have everything integrated, no need for external scalers: just feed it 12V for power and attach voltage sensing wires to a pack anywhere from 12V to 512V. Jumpers on the sensing board will provide selection of popular voltage ranges for best accuracy.
Thanks for the feedback!

Automatic backlight is easy, will add it to the feature list.

A full-charge relay might be doable, but it might interfere with the charger in some cases. For example, if you reset the device for whatever reason while the pack is not fully charged, you should then do a full charge to match real SOC with displayed SOC, but you can't, since the device inhibited your charger at the reset point. It's a catch-22. I'm on the fence about this one, especially since I am very tight on room inside the device and adding another optocoupler is a challenge.

The dual-sensor idea is logical, but not practical for a low-cost universal E-meter. I'd rather give up some sensitivity on the low end than go with dual sensors.

I will try to programmatically stabilize the zero point at 0.1A across the temp range, but will also add a user-adjustable zero-range setting to avoid issues in extreme conditions.
It's true that the full-charge relay might interfere with the charger if you reset or recalibrate the EV Display when your pack is not fully charged. Well, if you can manage to add another optocoupler in the meter, put in a user ON/OFF setting for the full-charge relay. Then you can turn off the full-charge function and allow the charger to fully charge your pack. After the pack is fully charged, turn the full-charge function back on.

Adding a user-adjustable zero-range setting is a lot better than a temperature compensation setting. Overall the next EV Display gen will be much better. Great job Dimitri... now you can compete with the big boys.
Another idea would be to add a half-pack voltage comparison feature with a center-zero adjustment so that it can work with an odd number of cells. It would be nice to have a 2-3V swing each side, maybe with a more sensitive center region. This would be useful whether you have a cell-level BMS or not, since you could see if you have a low cell or a high-resistance connection in one side of the pack.
I agree it would be good to build in a system for half-pack voltage differential comparison, to see if something is out of whack, if it isn't too difficult or expensive to integrate.
I think for a lower-cost device a 0.5A minimum readout in the +/-600A range should be acceptable, right?
Hi Dimitri
I think you were referring to resolution here... but thought I would jump in on the max amp range issue. I think you should take it to 1000 amps. I accelerate using over 600 amps often, granted not for all that long, but I wouldn't want it not measuring the energy above 600 amps. Thanks.
Hi Dimitri
I think you were referring to resolution here... but thought I would jump in on the max amp range issue. I think you should take it to 1000 amps. I accelerate using over 600 amps often, granted not for all that long, but I wouldn't want it not measuring the energy above 600 amps. Thanks.
Are you sure you are measuring on the battery side of the controller? For how many seconds at a time do you go over 600A? How much over 600A? I'm pretty sure that Ah accuracy will remain acceptable even if we dismiss some short spikes over the 600A limit. I am using the largest sensor available.
+1 here!

A 1000A readout would be very useful for me too. Conversions that use 1000A (controller, batteries, motor) are readily available these days.

Is it that much more complex to use a 1000A-rated ammeter instead of a 600A one to do the same work?
Are you sure you are measuring on the battery side of the controller? For how many seconds at a time do you go over 600A? How much over 600A? I'm pretty sure that Ah accuracy will remain acceptable even if we dismiss some short spikes over the 600A limit. I am using the largest sensor available.
Yes, I'm sure. Not only do I pull more than 600 amps, I have pulled 1000 amps on many occasions. Again, it's not for long periods of time; however, if I am going to capture consumption, I want to capture it all. :) The CA I have does capture this now.
The CA I have does capture this now.
Looking at the CA site I don't see any reference to 1000A measuring capability. I suppose if you selected your own shunt, not the ones CA sells, then it might scale to 1000A, but then you will also lose fine resolution at the low end, which is advertised as 0.1A for their shunts.

According to the datasheet of the hall effect sensor I use, it will generate data past 600A, up to 1000A, but it won't be as accurate. It looks like my device would work past 600A, if only I had a way to test this. I don't have any way to generate 1000A of current.
Looking at the CA site I don't see any reference to 1000A measuring capability. I suppose if you selected your own shunt, not the ones CA sells, then it might scale to 1000A, but then you will also lose fine resolution at the low end, which is advertised as 0.1A for their shunts.
Yes, I used a shunt out of an industrial DC welder.

According to the datasheet of the hall effect sensor I use, it will generate data past 600A, up to 1000A, but it won't be as accurate. It looks like my device would work past 600A, if only I had a way to test this. I don't have any way to generate 1000A of current.
I could test it for you... except I just started pulling things apart for my winter upgrades, unless you want to wait a bit. What about Tesseract? You tested his beta Soliton... seems fair to me. He has a dyno and batteries set up to draw 1000 amps... at least he used to.
I am using the largest sensor available.
What about the LEM HTFS 800-P? It can read up to 1200A.
Is its 1% accuracy the problem?
Looking at the CA site I don't see any reference to 1000A measuring capability. I suppose if you selected your own shunt, not the ones CA sells, then it might scale to 1000A, but then you will also lose fine resolution at the low end, which is advertised as 0.1A for their shunts.

According to the datasheet of the hall effect sensor I use, it will generate data past 600A, up to 1000A, but it won't be as accurate. It looks like my device would work past 600A, if only I had a way to test this. I don't have any way to generate 1000A of current.
For temporary testing you should be able to get away with a smaller gauge wire passing through the sensor. You could run 2 passes through the sensor by passing through then looping around to pass through again. The hall effect sensor should "see" twice the actual amps for testing purposes. Two pieces of bus bar with heat shrink passing through the sensor may work out easier. Either way, the goal is to read twice actual amps for setup purposes above 600 amps.
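The arithmetic behind the multi-pass trick is simply dividing the reading by the number of turns through the core. A small sketch (the function name is made up):

```python
# A conductor passed N times through a hall sensor's core multiplies the
# magnetic field by N, so the sensor reports N * the actual current.
def actual_amps(sensor_reading_a: float, turns: int) -> float:
    """Recover the real current from a multi-turn bench-test reading."""
    return sensor_reading_a / turns
```

With two passes, driving 350A through the loop makes the sensor report 700A, so the >600A region of the sensor can be exercised with only half the current.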
Update: major milestones reached with the V2 prototype.

Able to read current with 0.1A resolution and hold the zero point across a wide temp range.

Able to read pack voltages up to 512V, with pack voltage isolated from 12V power.

Three output pins, isolated open-collector type: one for the Full gauge, one for the Empty gauge (selectable level) and one PWM to drive a mechanical fuel gauge.

The full relay circuit only trips when climbing from under 100% to 100%, so no false tripping upon initial power-up.

Auto backlight upon non-zero current or button press.

Hardware is all done, remaining stuff is all software:

1. Charge efficiency. 100% for LiFePO4, but less for lead acid etc. Similar to a Peukert setting. Do we really need it? If so, how best to implement it? I'm thinking of scaling down the Ah count during charge, so partial charges will have a meaningful impact on the fuel gauge value. Feedback from lead acid users wanted...

2. How to reference watt-hours? Amp-hours are referenced against the pack size Ah setting to know when the pack is "full"; current is integrated over time to count Ah. I'm not sure how to reference Wh data. Introduce a nominal pack voltage in the setup menu? What would be the reference point, nominal Wh? Not sure how to make this feature useful. Feedback from users of other meters wanted...

3. Current is tracked with 0.1A resolution, but display size is limited, so I need to scale data output into ranges so it makes sense at a quick glance while driving. For example, only show decimals when the value is less than 10, and only show kW and kWh when the value is over 10000. Any ideas in this regard?
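One possible scaling scheme for item 3, sketched with assumed thresholds (not a decided design):

```python
# Display-scaling sketch: one decimal below 10, whole numbers up to 9999,
# then switch to k-units (kW / kWh / kA) at 10000 and above.
def format_reading(value: float, unit: str) -> str:
    if abs(value) >= 10000:
        return f"{value / 1000:.1f}k{unit}"   # e.g. 12500 W -> "12.5kW"
    if abs(value) < 10:
        return f"{value:.1f}{unit}"           # e.g. 9.87 A  -> "9.9A"
    return f"{value:.0f}{unit}"               # e.g. 123.4 A -> "123A"
```

This keeps the string width roughly constant (5-7 characters), which matters on a small fixed-width LCD.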