1 - 4 of 51 Posts

#### Duncan

Joined
·
6,643 Posts
Hi 24seven

My motor cost me \$100NZ
The controller was about \$800NZ
Batteries \$3300NZ all up (Volt pack)

No gearbox
1200 amps and 340 v

I built it a bit too heavy - 805 kg - 900 kg with me in it

55% of the weight on the rear wheels, Subaru diff with LSD

It will smoke the rear tyres! - I have ordered some super sticky only just road legal tyres for next year

I normally drive it at 45% power - 100% is a bit too exciting

#### Duncan

Joined
·
6,643 Posts
Max voltage for a DC motor

You can't actually just "overvolt" a DC motor
Your controller adjusts the motor voltage to achieve the commanded current

So it may need 10 v to get 1000 amps when stationary
But as the motor starts to spin it develops a back EMF, and that adds to the voltage the controller has to supply

When I had a 130 v battery and a controller set to 1000 amps it took off like a scalded rat - and then as the revs rose the required voltage increased

With 130 volts it topped out at 200 amps and 100 kph (3500 rpm)
Which means that the controller was at 100% but the voltage was only driving 200 amps through the motor

You only need a high voltage when you have both high current and high rpm

My current battery is 300v empty and 340v full - I'm still accelerating at 150 kph and 5300 rpm but I have passed 100% on the controller and the current is dropping as the rpm increases

Using more sensible numbers I would say that 150v would be enough
Or you could use 400v and simply let the controller adjust it to what the motor wants
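What the controller is doing here can be sketched in a few lines of Python. This is a simplified model assuming a series-wound DC motor, where back EMF is roughly proportional to current × rpm; the constant `K` is back-calculated from the 130 volt / 200 amp / 3500 rpm top-out figures above, and the resistive drop `R` and the sagged pack voltage are illustrative assumptions, not measured motor data.

```python
# Simplified series-DC-motor model (illustrative constants, not measurements):
# the controller applies whatever voltage drives the commanded current,
# until the back EMF pushes the required voltage past the pack voltage.

K = 1 / 7000   # back-EMF constant, volts per (amp * rpm) - assumed
R = 0.025      # total resistive drop, ohms (~5 V at 200 A) - assumed

def voltage_needed(amps, rpm):
    """Voltage the controller must apply for this current at this rpm."""
    return K * amps * rpm + R * amps

def current_at_full_duty(pack_volts, rpm):
    """At 100% duty the pack sets the current:
    pack_volts = K * I * rpm + R * I  ->  solve for I."""
    return pack_volts / (K * rpm + R)

# Stationary: only a handful of volts needed, even for 1000 amps
print(voltage_needed(1000, 0))          # ~25 V
# At 3500 rpm the same 1000 amps would need over 500 V
print(voltage_needed(1000, 3500))       # ~525 V
# A 130 V pack sagged to ~105 V tops out around 200 amps at 3500 rpm
print(current_at_full_duty(105, 3500))  # ~200 A
```

So "overvolting" never happens at low rpm - the controller only ever applies what the current command requires, and a bigger pack just raises the rpm at which it runs out of headroom.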

#### Duncan

Joined
·
6,643 Posts
Hi Duncan

Good to talk to you again.

Thanks for sharing your experiences. So just to make sure I have this straight in my head, current = torque and voltage = rpm. If you set your controller to your max current you have max acceleration. Then you will keep accelerating until you run out of rpm (voltage), which is like hitting the rev limiter in an ICE car.

So my next big question (and hopefully it's not a stupid one): how do you decide on the best voltage for your car? I know it will depend on what you want your max rpm to be, but is there a way to calculate what voltage is needed?

Just to be clear, I know how to use tyre size and diff ratios to calculate speed and rpm; I'm more interested in voltage.

Thanks
Brennan
The back EMF is proportional to current and to rpm (in a series motor the field flux follows the armature current)

So you then need the RATIO! - and that is down to your motor!

Which is the rub

If you have a Warp9 or Warp11 you can use the graphs they give to make a very good estimate

I am using a Hitachi 11 inch - no graphs!

But when I used 130 v the effect was noticeable

With 130 v - my Device topped out at 100 kph and 200 amps

So 130 v (probably sagged to 105 V) - allow 5 volts for the resistive load

100 volts of EMF = 3500 rpm and 200 amps

So to drive 1000 amps at 3500 rpm - would take 500 volts

At half of that rpm (1750 rpm) 1000 amps would take 250 volts

At the end of the 1/8th when I'm doing about 5250 rpm it would take 150 volts for 200 amps
My actual voltage of 300 volts would give me 400 amps

That is all approximations - but it gives you an idea of what is happening
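The arithmetic above can be checked in a few lines. This is the same approximation (back EMF proportional to current × rpm, series motor), with the ratio fixed by the single estimated operating point from the post: 100 volts of EMF at 200 amps and 3500 rpm.

```python
# Rough back-EMF estimate for a series-wound DC motor.
# One estimated operating point fixes the ratio: ~100 V of EMF
# at 200 A and 3500 rpm (from the 130 V pack, sagged, minus ~5 V resistive).

K = 100 / (200 * 3500)  # volts per (amp * rpm)

def emf_volts(amps, rpm):
    """Back EMF the controller must overcome at this current and rpm."""
    return K * amps * rpm

print(emf_volts(1000, 3500))  # ~500 V to drive 1000 A at 3500 rpm
print(emf_volts(1000, 1750))  # ~250 V at half the rpm
print(emf_volts(200, 5250))   # ~150 V for 200 A at the end of the 1/8th
print(300 / (K * 5250))       # ~400 A from a 300 V pack at 5250 rpm
```

One measured (or even roughly estimated) top-out point is all it takes to scale the rest - which is why the missing graphs for the Hitachi motor are not fatal.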
