OK, this is just a drill to make sure I have my noodle wrapped around series wound motors and choosing the right controller. I thought I had it figured out, but another thread has me double checking. My experience with motors is all industrial AC induction motors on a fixed 60 Hz operating frequency, with some VFDs thrown in once in a while.
So here goes: someone throws a motor on the bench and asks what it can do. All we have is nameplate data that reads:
Operating Voltage = 48 VDC
Load Torque = 5.237 ft-lbs
Load Speed = 4150 RPM
Load Current = 88 amps
Peak HP = 10
Continuous HP = 4.14
First thing it tells me: I would be using 48 volts x 88 amps = 4224 watts. From that, if I divide 4224 watts / 4.14 HP, I get about 1020 watts per HP. Here is where I get stuck and might be making an error. This is a real motor with real numbers, and it is recommended with a 400 amp controller. That does not jibe with a peak 10 HP motor. If I assume 1020 watts per HP, 10 HP would be roughly 10,200 watts. Realistically I know efficiency will not be that high, so say 70% efficiency: 10 HP out is 7,460 watts, which needs about 10,700 watts in. My protein computer says 10,700 watts / 48 volts = 222 amps. At 400 amps, input would be 19,200 watts. Efficiency??
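The nameplate arithmetic above can be sanity-checked in a few lines. This is just the same math in Python, using the standard 746 W per mechanical horsepower:

```python
# Nameplate figures quoted above
W_PER_HP = 746        # watts per mechanical horsepower

volts = 48.0          # operating voltage
amps = 88.0           # load current
hp_cont = 4.14        # continuous HP

p_in = volts * amps                      # electrical input power, watts
w_per_hp = p_in / hp_cont                # input watts per output HP
efficiency = hp_cont * W_PER_HP / p_in   # mechanical out / electrical in

print(p_in)                   # 4224.0 W in
print(round(w_per_hp))        # ~1020 W per HP
print(round(efficiency, 2))   # ~0.73, i.e. about 73% at the rated load point
```

So the continuous rating actually implies roughly 73% efficiency at the 88 amp load point, which is where the "1020 watts per HP" figure comes from.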
Having gone through all that, and from a little research, both methods are all wet, at least for selecting controller amperage. Everything I have been taught says you want as much current as connecting the battery directly to the terminals would produce, called stall current, at 0 RPM. If stall current is not listed in the specs, one can use a DLRO (digital low resistance ohmmeter), taking several measurements while rotating the shaft and averaging the resistance, to get Stall Current = Battery Voltage / Motor Resistance.
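A minimal sketch of that stall-current estimate. The resistance readings here are made-up illustration values, not measurements from any real motor:

```python
# Hypothetical DLRO readings (ohms) taken at several shaft positions;
# rotating the shaft between readings averages out brush/commutator contact.
readings = [0.014, 0.015, 0.013, 0.014]

r_avg = sum(readings) / len(readings)    # average winding resistance, ohms
battery_volts = 48.0
stall_current = battery_volts / r_avg    # locked-rotor (0 RPM) current, amps

print(round(stall_current))              # thousands of amps for a motor this size
```

For a big series motor the winding resistance is tens of milliohms, so the raw stall figure comes out in the thousands of amps, far beyond anything the battery or cabling could actually deliver.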
So which is right, if any?
The engineer inside me says: go just a little less than stall current, and let the motor take care of the rest when it spins up. Allow for 25% voltage sag, with current limiting to protect the motor.
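That rule of thumb can be sketched the same way. The 0.014 ohm motor resistance and the 10% margin below sagged stall are assumptions for illustration only:

```python
# Rule of thumb: size the current limit just under stall current,
# after accounting for 25% battery voltage sag under load.
nominal_volts = 48.0
motor_resistance = 0.014                  # ohms, assumed example value

stall_nominal = nominal_volts / motor_resistance
sagged_volts = nominal_volts * (1 - 0.25)          # 25% sag -> 36 V
stall_sagged = sagged_volts / motor_resistance

# Assumed margin: set the controller limit 10% below sagged stall
current_limit = 0.9 * stall_sagged

print(round(stall_nominal), round(stall_sagged), round(current_limit))
```

Even with the sag allowance, the limit this gives is far above a 400 amp controller, which suggests the manufacturer's controller recommendation is set by thermal and battery limits rather than by stall current.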