I wondered if someone here could answer a question I have about the advantages (or not) of higher-voltage controllers for series DC motors. As I understand it, controlling a series DC motor with PWM involves rapidly switching the supply (battery pack) and varying the pulse width (up to 100% on) to control output power. The biggest advantage of a higher-voltage controller/battery pack (other than reduced I²R losses) would be greater power at high RPM, because you could push current against the greater back EMF the motor generates at higher speeds.

In my very rudimentary understanding of series DC motors, torque is proportional to current, so what happens at lower RPMs, where a lower-voltage controller could supply the same current? Say a 120 V controller supplies an average of 1000 amps from a 120 V pack at 500 rpm. How would that compare, in torque production, to the same motor receiving an average of 1000 amps from a 300 V controller coupled to a 300 V pack at 500 rpm? Would the torque be the same (making the higher-voltage setup less efficient)? Would it be greater, reflecting the increased wattage supplied (in which case torque would not be proportional to average current)? Or am I missing something?
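To make the PWM averaging concrete, here is a quick sketch of the arithmetic as I picture it. All the numbers are made up for illustration: a lossless buck-style controller, 0.01 ohm of motor-plus-wiring resistance, 50 V of back EMF at 500 rpm, and torque taken as proportional to motor current (my premise above).

```python
# Rough sketch of ideal-PWM operating points for the two packs in question.
# All constants below are assumptions for illustration, not measured values.

R_M = 0.01        # motor + wiring resistance, ohms (assumed)
BACK_EMF = 50.0   # back EMF at 500 rpm, volts (assumed)
I_MOTOR = 1000.0  # average motor current, amps

def pwm_operating_point(v_pack: float) -> dict:
    """Ideal lossless PWM: motor voltage = duty * pack voltage,
    pack current = duty * motor current (power balance)."""
    v_motor = BACK_EMF + I_MOTOR * R_M  # voltage needed to push I_MOTOR against back EMF
    duty = v_motor / v_pack             # required duty cycle (must be <= 1.0)
    i_pack = duty * I_MOTOR             # average current drawn from the battery
    return {
        "duty": duty,
        "pack_A": i_pack,
        "motor_kW": v_motor * I_MOTOR / 1000,
    }

for pack_v in (120.0, 300.0):
    op = pwm_operating_point(pack_v)
    print(f"{pack_v:.0f} V pack: duty={op['duty']:.2f}, "
          f"pack current={op['pack_A']:.0f} A, motor power={op['motor_kW']:.1f} kW")
```

With those made-up numbers, both packs deliver 1000 A and 60 kW to the motor at 500 rpm; the 300 V pack just runs at 20% duty and draws 200 A from the battery instead of 500 A. If that arithmetic is right, the torque would be the same and the extra voltage only buys headroom at higher RPM, but that is exactly the part I want checked.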
Thanks Phil