I found that, for a 500 ohm load in place of the 100, the OUT2 voltage reached 400V and then throttled back to about 385, whereas with the 100 ohm load it rose to about 300. This would be OK for my purposes of generating about 300 VDC from a nominal 24 VDC battery pack. I made some other changes for this: the primary of the transformer is now 5 uH and the secondary is 200 uH, so I can get the boost with less stress on the components.

http://enginuitysystems.com/pix/PFC_1249_DCDC_24-240.asc
I am still exploring various approaches to these designs, and there will probably be quite a few differences between a PFC front end for a high power charger and what I need for a DC-DC voltage booster. In general I feel more comfortable with a transformer design that simply transfers energy from a low voltage DC source to a higher voltage output, using the turns ratio of the transformer rather than the flyback effect.

My method for design and analysis is more intuitive than mathematical, and I am still trying to wrap my brain around some of the concepts. I'll try to explain my understanding of the difference between a transformer (energy transfer) design and one that uses an inductor for energy storage and release, like a flyback switcher.

For a transformer, I would select a core with rather high permeability and no gap, so that at the frequency being considered, relatively few turns would provide an inductance high enough to draw minimal current when unloaded. Thus, for a 24 volt input at 50 kHz, a 10 uH primary would present an impedance of 3.14 ohms, and with the applied voltage at 50% duty cycle being about 12 VRMS, the magnetizing current would be about 4 amps. For a 1000 watt converter, this 48 VA is reasonable. As the load on the secondary increases, the reflected load current appears on the primary in phase with the applied voltage, at quadrature (90 degree phase angle) to the magnetizing current, so the magnetizing current becomes a small part of the total. The transformer will work until the ampere-turns create enough flux to saturate the core material, at which point no more power can be transferred.
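The magnetizing numbers above can be checked with a quick back-of-envelope calculation. This is just a sketch using the figures from the paragraph (24 V in, 50 kHz, 10 uH primary, roughly 12 VRMS applied at 50% duty):

```python
import math

# Figures assumed from the discussion above
V_RMS = 12.0    # approximate RMS voltage across the primary at 50% duty
F_SW = 50e3     # switching frequency, Hz
L_PRI = 10e-6   # primary inductance, H

# Magnetizing impedance and current of the unloaded transformer
x_l = 2 * math.pi * F_SW * L_PRI   # inductive reactance, ohms
i_mag = V_RMS / x_l                # magnetizing current, A RMS
va_mag = V_RMS * i_mag             # magnetizing volt-amps

print(f"X_L = {x_l:.2f} ohms")             # ~3.14 ohms
print(f"I_mag = {i_mag:.2f} A RMS")        # ~3.8 A
print(f"Magnetizing VA = {va_mag:.1f}")    # ~46 VA
```

This comes out slightly under the rounded 4 A / 48 VA quoted above, but the conclusion is the same: the magnetizing burden is small compared to a 1000 watt throughput.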

For an inductor-based topology, a material with low permeability may be used, or the effective permeability may be reduced by adding an air gap. In this case, you want to apply voltage and store energy in the magnetic material of the core, and then switch off the applied voltage and allow the stored energy to be released. In buck mode, the current increases when voltage is applied, and decreases when removed. This is ideal for a current source, and the output voltage will be lower than the input. For a boost topology, the input voltage is applied until a certain current (energy storage) is attained, and then the drive is removed. In this case the energy in the inductor will be applied to the load, and the voltage will be higher than the source.
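The ideal steady-state relations behind "buck steps down, boost steps up" can be written in a couple of lines. These are textbook continuous-mode formulas, not anything specific to my circuit, and the numbers are just illustrative:

```python
def buck_vout(v_in, duty):
    """Ideal buck converter: Vout = Vin * D (output below input)."""
    return v_in * duty

def boost_vout(v_in, duty):
    """Ideal boost converter in continuous mode: Vout = Vin / (1 - D)."""
    return v_in / (1.0 - duty)

print(buck_vout(24.0, 0.5))    # 12.0 V: buck steps down
print(boost_vout(24.0, 0.9))   # 240.0 V: boost steps up
```

Note that reaching 240 V from 24 V takes a 90% duty cycle, which hints at the "working harder" problem described next.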

The limitation of a single-inductor buck or boost is that the inductor must "work harder" under conditions of large ratios of input to output voltage. To boost 24 VDC to 240 VDC at 5 amps, the inductor needs to have about 50 amps built up, and then the energy will be dumped into the load at a higher voltage and lower current. In continuous mode, the inductor is "recharged" before all of its energy has been transferred, so the ripple is less, but higher levels of DC current are maintained. Using coupled inductors in a buck or boost can reduce the wide swings of current and voltage, and can provide isolation. But an isolated converter requires twice as much wire for the windings, so the size and weight are increased.
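The 50 amp figure follows from conservation of power. Here is a rough check of the 24 V to 240 V example, ignoring losses; the 50 uH inductance used for the ripple estimate is an assumed value for illustration, not from the circuit above:

```python
# Values from the 24 V -> 240 V, 5 A example (losses ignored)
V_IN, V_OUT, I_OUT = 24.0, 240.0, 5.0

p_out = V_OUT * I_OUT        # 1200 W delivered to the load
i_in_avg = p_out / V_IN      # ~50 A average input/inductor current

# Continuous-mode ripple around that average: dI = V_IN * D / (L * f)
# L is an assumed value here, chosen just to show the calculation.
L, F_SW = 50e-6, 50e3
duty = 1.0 - V_IN / V_OUT    # 0.9 for an ideal boost
ripple = V_IN * duty / (L * F_SW)   # peak-to-peak ripple current, A

print(f"Average inductor current: {i_in_avg:.0f} A")   # 50 A
print(f"Duty cycle: {duty:.2f}")                       # 0.90
print(f"Ripple current: {ripple:.2f} A p-p")           # 8.64 A
```

So the inductor carries a large DC component with a comparatively small ripple riding on it, which is exactly the "higher levels of DC" trade-off of continuous mode.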

Feel free to critique and correct anything I have said here. This is just my understanding of the principles and there is obviously much more to it than my brief explanation.