I recall seeing something like this concept previously, and it may even have attempted to match the bus voltage roughly to the sine wave voltage being created by PWM. But let me try to analyze the specifics of a VFD PWM.

Assume a 325 VDC bus for a 230 VAC 60 Hz motor. If the carrier frequency is 21.6 kHz, there will be 360 PWM periods per 60 Hz cycle (180 per half-cycle), which is one PWM cycle per electrical degree. At very low output voltage, the duty cycle stays close to 50%, and the effective voltage is determined by the small deviations of the duty cycle from 50%, which can be expressed as a time difference.
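As a quick sanity check on the cycles-per-degree figure (a rough sketch in Python, using the 21.6 kHz and 60 Hz values above):

```python
# PWM carrier periods per fundamental cycle for the example VFD above.
f_carrier = 21_600  # Hz, PWM carrier frequency
f_motor = 60        # Hz, motor fundamental frequency

steps_per_cycle = f_carrier / f_motor     # 360 PWM periods per 60 Hz cycle
steps_per_degree = steps_per_cycle / 360  # one PWM period per electrical degree

print(steps_per_cycle, steps_per_degree)  # 360.0 1.0
```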

At 230 VAC, the PWM duty cycle varies from 100% to 0%, which can equivalently be expressed as +50% to -50% about the midpoint. Some PWM values for various phase angles are:

Code:

```
deg  sin(deg) / f(kHz) = pulse width
  1   0.0174 / 21.6   =  0.81 uSec
  5   0.0870 / 21.6   =  4.03 uSec
 30   0.5000 / 21.6   = 23.1  uSec
 90   1.0000 / 21.6   = 46.3  uSec
```
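Those entries can be reproduced as sin(θ) times the carrier period (a quick check, treating the tabulated time as the deviation from the 50% midpoint):

```python
import math

f_carrier = 21.6e3           # Hz
period_us = 1e6 / f_carrier  # carrier period, about 46.3 us

for deg in (1, 5, 30, 90):
    dt_us = math.sin(math.radians(deg)) * period_us
    print(f"{deg:3d}  {math.sin(math.radians(deg)):.4f} / 21.6 = {dt_us:.2f} uSec")
```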

A microcontroller like the PIC18F4431 has a PWM resolution of 11 bits at 20 kHz with a maximum clock frequency of 40 MHz, which is about 50 uSec / 2048 = 24 nSec per step. Thus the precision of the PWM at 1 degree is 24/810, or about 3%.
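The resolution numbers work out as follows (a sketch; the 11-bit and 50 uSec figures are the values cited above):

```python
period_us = 50.0   # ~50 us PWM period at 20 kHz
steps = 2**11      # 11-bit duty-cycle resolution

resolution_ns = period_us * 1000 / steps   # ~24.4 ns per step
one_degree_ns = 810                        # 0.81 us pulse width at 1 degree, from the table
precision = resolution_ns / one_degree_ns  # ~0.03, i.e. about 3%

print(resolution_ns, precision)
```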

The lowest drive frequency for an ACIM is the slip frequency, which for a 4-pole motor is typically about 1800 - 1750 = 50 RPM, or about 3% of synchronous speed. Thus with a constant V/F ratio, the 1 degree pulse width would be 0.81 uSec * 0.03 = 24.3 nSec, and the quantization error will be about 100% of the value.
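Putting the slip and resolution numbers together (using the rounded 3% slip figure from above):

```python
slip = (1800 - 1750) / 1800    # ~0.028, rounded to 3% in the text
one_degree_ns = 0.81e3 * 0.03  # = 24.3 ns pulse width at slip frequency
resolution_ns = 24.4           # PWM timer resolution from above

# The timer step is roughly equal to the pulse width itself,
# so the quantization error is on the order of 100%.
print(slip, one_degree_ns, resolution_ns / one_degree_ns)
```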

If my calculations are correct, then the microcontroller itself does not seem to pose a huge problem. The real constraint may be the gate driver circuitry and the IGBTs, which I think have rise and fall times on the order of 10 to 50 nSec. So to keep switching losses to 10% or less, the minimum PWM pulse width must be on the order of 100 to 500 nSec, which means that the 1 degree pulse may be just barely achievable at full motor voltage. But it could pose a major problem at low speeds near the slip frequency, where maximum torque is often required for start-up (locked rotor).
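The 100 to 500 nSec figure follows directly from the 10% loss target (a rough rule-of-thumb check, not a real switching-loss model):

```python
# IGBT rise/fall time estimates from the paragraph above (10-50 ns).
for t_switch_ns in (10, 50):
    # If the switching transitions are to occupy no more than 10% of the
    # pulse, the minimum pulse width must be at least 10x the transition time.
    min_pulse_ns = t_switch_ns * 10
    print(f"{t_switch_ns} ns transition -> {min_pulse_ns} ns minimum pulse")
```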

Since the PWM waveform at very low speeds stays very close to 50% duty cycle, the actual switching losses will be much the same as at the peaks, where the duty cycle approaches 100% and 0%. But as I understand it, the full DC link voltage is still being applied as a 20 kHz square wave when the motor is stopped or rotating slowly. And it seems to me that the magnetizing current will be higher at this full voltage than it would be at a much lower bus voltage. Since there are resistive and magnetic losses in the motor, this will lower the efficiency and cause more heating under these low speed, high torque conditions.

I would like to propose a design where the three phase motor controller operates on a continuously variable DC bus voltage, which could be supplied by a standard PWM DC motor controller or a variable output buck or boost DC-DC converter. The VFD can sense the DC bus voltage and adjust the frequency of the PWM according to the voltage, maintaining a predetermined V/F ratio, and the torque can be determined by the DC current consumed by the VFD. In its simplest form, the sine conversion can be determined by a fixed table, or even by using an EPROM driven by a variable frequency clock based on the voltage. Thus the PWM waveform will always be identical in terms of pulse width, and will vary only in height as determined by the drive voltage.
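A minimal sketch of the proposed scheme, assuming the nominal 325 V bus at 60 Hz from above and a fixed sine lookup table (the function and variable names here are hypothetical, just to illustrate the idea):

```python
import math

V_NOMINAL = 325.0  # VDC bus at full speed
F_NOMINAL = 60.0   # Hz output at nominal bus voltage
TABLE_SIZE = 360   # one entry per electrical degree

# Fixed sine table: duty cycle per degree, centered on 50%.
# The pulse widths never change; only the bus voltage (pulse height) does.
sine_table = [0.5 + 0.5 * math.sin(math.radians(d)) for d in range(TABLE_SIZE)]

def output_frequency(v_bus: float) -> float:
    """Clock the sine table in proportion to the measured bus voltage,
    maintaining a constant V/F ratio."""
    return F_NOMINAL * v_bus / V_NOMINAL

# Example: at half bus voltage the table is stepped at half speed.
print(output_frequency(162.5))  # 30.0 Hz
```

The same idea could be implemented with an EPROM sine table clocked by a voltage-controlled oscillator, as suggested above.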

I think this may have several advantages beyond those already discussed. It may offer a drop-in replacement for a series-wound or PM DC motor where the existing DC controller can be reused. It also provides a level of safety, because the AC motor terminal voltages will be very low when the motor is stopped or rotating slowly. And in case of insulation breakdown that happens at higher RPMs, it may be possible to "limp home" at low speed.

Anyway, that's my idea, and I think it should be fairly easy to implement for a test.