Originally Posted by Zappo
The article didn't mention one other problem with the original-design Edison batteries: they weren't very efficient. From everything I've read, you only got out between 50% and 60% of the energy you put into the battery. I have some that were made in the 70s. They still work, but I've never checked whether the (lack of) efficiency claims were true.
"Efficient" compared to what? If the electricity you use comes from a plant that is 3 times as efficient as burning the same fuel in your vehicle for propulsion, then a battery 50% efficient is better than fueling up with gasoline.
Nickel-iron batteries were in fact a tad inefficient. The biggest impact on efficiency seemed to be the rate of charge and discharge, and the second biggest was that they tend to lose some charge over time. I've seen claims of up to 80% efficiency if they were charged slowly and used shortly afterwards - the type of use you'd normally expect from EVs.
Energy-density-wise, NiFe is better than lead-acid but not better than NiMH. The bonus was that you could run them flat dead and recharge them; the disadvantages were that they couldn't handle high-C discharge rates, they didn't work well in the cold, and high C rates (either charging or discharging) made them less efficient. The approach the article describes essentially causes an explosion of surface area - and since chemical reaction rates scale with surface area, I suspect it may mitigate those disadvantages quite a bit, if they can overcome the declining-capacity issues.