I'm new to this, so someone should check my logic, but here's how I see it:
For BLDC motors (that's what this is, right?), if the controller can supply enough current, they make constant torque all the way up to max rpm, so that's where max power will occur.
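To see why: power is just torque times angular speed (P = T * w), so with constant torque it climbs linearly with rpm. A minimal Python check, borrowing the 96 lb-ft (~130 N*m) max torque figure from my sanity check below:

    import math

    torque_nm = 130  # ~96 lb-ft, the 0 rpm max I use below
    for rpm in (5000, 10752):
        omega = rpm * 2 * math.pi / 60        # angular speed in rad/s
        print(rpm, torque_nm * omega / 1000)  # power in kW, rising linearly with rpm

(The ~146 kW it prints at the top end is hypothetical, of course; the pack limits below are exactly why you'd never see it.)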
If one were to use the Tritium controller and the motor revs to 10752 rpm at 650 V, then at 450 V it would rev to 450/650 * 10752 ≈ 7444 rpm. With the 6.86:1 diff and typical tire size, that's around 75-80 mph, so more than enough.
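Worked out in Python, with a 25 in tire diameter as my stand-in for "typical" (swap in the real size):

    import math

    rpm_450 = 10752 * 450 / 650        # rpm scales with voltage (constant kV): ~7444
    wheel_rpm = rpm_450 / 6.86         # after the 6.86:1 diff
    tire_dia_in = 25.0                 # assumed typical tire diameter
    mph = wheel_rpm * (tire_dia_in * math.pi / 12) * 60 / 5280
    print(round(rpm_450), round(mph))  # ~7444 rpm, ~81 mph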
Ignoring controller/motor losses, 50 kW @ 4610 rpm (~279 V) requires 179 A, because 50,000 W / (650 V * 4610 rpm / 10752 rpm) ≈ 179 A. My guess is this is the peak power point because the battery pack is only 288 V, so above this, full current can't be supplied... or something. Also, the pack is only rated for 45 kW, so it makes sense that it can't keep supplying full current much beyond this voltage.
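Same numbers spelled out:

    v_at_4610 = 650 * 4610 / 10752       # motor voltage at 4610 rpm: ~279 V
    amps = 50_000 / v_at_4610            # current for 50 kW at that voltage: ~179 A
    print(round(v_at_4610), round(amps))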
For a sanity check, 50,000 W / (4610 rpm * 2*pi/60) ≈ 104 N*m (76 lb-ft), so that's pretty close to constant torque all the way from the max of 96 lb-ft at 0 rpm.
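And the torque check:

    import math

    omega = 4610 * 2 * math.pi / 60   # ~483 rad/s
    torque_nm = 50_000 / omega        # T = P / omega: ~104 N*m
    print(round(torque_nm), round(torque_nm * 0.73756))  # ~104 N*m, ~76 lb-ft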
With 180 A up to 450 V, the motor's power should increase to about 80 kW without much more stress than in its OEM application, right? Also, the Tritium controller only supplies 300 A max, so it's not a complete waste, but not a perfect match either. Plus I imagine the motor could be pushed beyond 180 A for short periods.
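The arithmetic, for what it's worth:

    print(450 * 180 / 1000)  # 81 kW with 180 A held all the way to 450 V
    print(450 * 300 / 1000)  # 135 kW: the controller's 300 A ceiling at 450 V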
Even if this is all correct, this motor/controller/diff combo would only work for a very small car. Max wheel torque would be 96 lb-ft * 6.86 = 659 lb-ft. A Subaru Justy 2WD in 1st gear, with the smallest engine available, puts up to 59 lb-ft * 3.071 * 4.437 = 804 lb-ft to the wheels, ignoring losses. A typical compact car, say a mid-90s Impreza, sends 110 lb-ft * 3.785 * 4.11 = 1711 lb-ft to the wheels.
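All three side by side (ratios as quoted, losses ignored):

    print(96 * 6.86)           # this combo: ~659 lb-ft at the wheels
    print(59 * 3.071 * 4.437)  # Justy, 1st gear * final drive: ~804 lb-ft
    print(110 * 3.785 * 4.11)  # mid-90s Impreza: ~1711 lb-ft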
I don't know what I've concluded from all this; maybe just use the diff with a better motor, since the size and ratio seem well suited to electric motors.
I seem to be on a different page than a few others here, so feedback would be appreciated.