Hi Duncan
Good to talk to you again.
Thanks for sharing your experiences. So just to make sure I have this straight in my head: current = torque and voltage = rpm. If you set your controller to your max current, you get max acceleration. Then you will keep accelerating until you run out of rpm (voltage), which is like hitting the rev limiter in an ICE car.
So my next big question (and hopefully it's not a stupid one): how do you decide on the best voltage for your car? I know it will depend on what you want your max rpm to be, but is there a way to calculate what voltage is needed?
Just to be clear, I know how to use tyre size and diff ratios to calculate speed and rpm; I'm more interested in voltage.
Thanks
Brennan
The back EMF is proportional to current and to rpm - on a series-wound motor the field flux scales with the current (below saturation), so both matter
So you then need the RATIO! - and that is down to your motor!
Which is the rub
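A minimal sketch of that relationship, assuming a series-wound DC motor where field flux tracks current below saturation (k here is the motor-specific ratio you still need to find):

```python
# Series DC motor approximation: back EMF grows with both current and rpm,
# because the field flux is roughly proportional to current below saturation.
def back_emf(k, amps, rpm):
    # k is the motor-specific constant, in volts per (amp * rpm)
    return k * amps * rpm
```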
If you have a Warp9 or Warp11 you can use the graphs they give to make a very good estimate
I am using a Hitachi 11 inch - no graphs!
But when I used 130 V the effect was noticeable
With 130 V - my Device topped out at 100 kph and 200 amps
So 130 V probably sagged to 105 V under load - allow 5 volts for the resistive drop
That leaves 100 volts of back EMF at 3500 rpm and 200 amps
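Turning that one measured point into the ratio - a sketch using the figures above (only valid below field saturation):

```python
# One known operating point: ~100 V of back EMF at 3500 rpm and 200 A
k = 100 / (3500 * 200)   # ~1.43e-4 volts per (amp * rpm)
```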
So to drive 1000 amps at 3500 rpm would take 500 volts
At half of that rpm (1750 rpm), 1000 amps would take 250 volts
At the end of the 1/8th, when I'm doing about 5250 rpm, it would take 150 volts for 200 amps
My actual 300 volts would give me 400 amps at that rpm
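Putting those same approximations together - a sketch that just reproduces the numbers above, nothing more:

```python
# Ratio from the measured point: ~100 V of back EMF at 3500 rpm and 200 A
k = 100 / (3500 * 200)

def volts_needed(amps, rpm):
    """Back EMF the pack has to overcome at this operating point."""
    return k * amps * rpm

def amps_available(volts, rpm):
    """Current a given pack voltage can push at this rpm."""
    return volts / (k * rpm)

print(volts_needed(1000, 3500))   # ~500 V
print(volts_needed(1000, 1750))   # ~250 V
print(volts_needed(200, 5250))    # ~150 V
print(amps_available(300, 5250))  # ~400 A
```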
Those are all approximations - but they give you an idea of what is happening