A guy who goes by the handle “Biff” on elmoto just put together a lengthy and very informative explanation of the pros, cons, and factors to weigh in making your voltage decision. For the record, you can take his advice and information to the bank. I can’t tell you who he is, but he knows his shit. He graciously allowed me to repost it here:
Power lines are high voltage for good reason. Power = Volts x Amps, so to transfer lots of power you need either high voltage or high current (or both). High current means thicker conductors, and power line towers couldn't support the weight of the thick conductors that would be required if the voltage were lower. I believe power lines are actually aluminum rather than copper to reduce weight. Reduced weight also means less material, and therefore less cost; since the cables themselves are a significant part of the total cost of the power grid, that matters.
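To put numbers on that, here is a quick sketch (with made-up figures, purely illustrative) of how raising transmission voltage cuts both the current and the I^2 x R conductor loss for the same delivered power:

```python
# Illustrative only: moving 100 MW over a line with 10 ohms of total
# conductor resistance, at two different transmission voltages.
P = 100e6          # power to deliver, watts
R_line = 10.0      # total line resistance, ohms (hypothetical figure)

for volts in (50e3, 500e3):
    amps = P / volts               # P = V x I  ->  I = P / V
    loss = amps**2 * R_line        # conductor loss = I^2 x R
    print(f"{volts/1e3:>4.0f} kV: {amps:,.0f} A, loss = {loss/1e6:.1f} MW")
```

Ten times the voltage means one tenth the current, and because loss goes as current squared, one hundredth the resistive loss in the same conductor.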
Partially for the same reason, though it is less evident, car manufacturers are working at high voltage too. Conductor size isn't as critical on a car or motorcycle because the distances are so short; even a cable that can conduct 4000 or 6000A is not an insurmountable problem. But because of this power-transmission issue, subways, trolleys, and other non-battery high-power traction systems have run at high voltage for a long time, usually 600VDC. So even though lighter conductors aren't a big deal in cars, the inverters and motors are already designed and well understood, so manufacturers continue to use the known technology. Developing a 100V 4000A controller (something that would be suitable for the Tesla, for example) is certainly possible, but it would require some new development to get the FETs working well, and there are other considerations as well, like packaging a 6000A fuse, maybe contactors and such.
Another consideration is charging: plug and charge-station conductor size. If you have an 85kWh battery that you want to charge in 15 minutes, a 400V battery requires 850 Amps, while a 100V battery would need 3400 Amps. An 850 Amp cable and connector is already getting pretty large; 3/0 gauge I think would do it, with a connector that can still fit in your hand. To have the same conductor losses at 3400A, you would need 16x the cable cross-section (because conductor losses = Current x Current x Resistance (I^2 x R), and 4x the current means 16x the loss for a given resistance), which would end with a pretty heavy cable from the charger and a pretty large connector. Something that weighs against using a low-voltage system.
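The arithmetic behind those figures can be sketched directly from Power = Volts x Amps:

```python
# Charging an 85 kWh pack in 15 minutes: required power, current at two
# pack voltages, and the conductor-area penalty for equal I^2 * R losses.
energy_kwh = 85.0
minutes = 15.0
power_w = energy_kwh * 1000 * 60 / minutes   # 340,000 W (340 kW)

i_400 = power_w / 400    # 850 A at a 400 V pack
i_100 = power_w / 100    # 3400 A at a 100 V pack

# Equal I^2 * R loss requires R to shrink by the square of the current
# ratio, i.e. a conductor with (I_ratio)^2 times the cross-section.
area_factor = (i_100 / i_400) ** 2
print(power_w, i_400, i_100, area_factor)   # 340000.0 850.0 3400.0 16.0
```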
For a car, 85kWh in 15 minutes (340kW) could be a reasonable goal; even if battery technology gets better, we might not need to charge faster than that. This would translate to stopping for 15 minutes every 200 miles or something like that (I am not too familiar with the highway range of electric cars), which doesn't seem too bad.
If you could charge an electric motorcycle at 340kW, you could charge a 15kWh battery (about as large as is in an electric motorcycle today) in about 3 minutes, or if you had an 85kWh pack you would need to stop for 15 minutes every 550 miles. That would be amazing, but probably not something everybody needs. Also, current battery technology limits charging rates to around 1-2C, i.e. you can only charge a battery safely in 30 minutes to 1 hour. So a 15kWh battery on a motorcycle is likely to charge at only 15kW, and at 100V that is 150A, which is a reasonable cable size; even charging a 15kWh battery in 15 minutes (60kW) can be done with reasonable cable sizes at 100V.
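The C-rate arithmetic above can be written as a small helper (the function name is mine, just for illustration):

```python
# Charge current for a pack at a given C rate and pack voltage.
# 1C means the pack's full capacity delivered in one hour, so a
# capacity_kwh pack at c_rate charges at capacity_kwh * c_rate kilowatts.
def charge_current_amps(capacity_kwh, c_rate, pack_volts):
    power_w = capacity_kwh * 1000 * c_rate
    return power_w / pack_volts

# 15 kWh motorcycle pack at 1C (one-hour charge) and 100 V:
print(charge_current_amps(15, 1, 100))   # 150.0 A
# Same pack in 15 minutes (4C, i.e. 60 kW) at 100 V:
print(charge_current_amps(15, 4, 100))   # 600.0 A
```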
Motors can be wound to run at any voltage, so that isn't really a big factor in system voltage. Controllers are different based on the voltage: with 100V systems you can use FETs; at 300V and higher you need to use IGBTs.
FETs have an internal resistance; the more you put in parallel, the lower the combined resistance, and as FETs get better the resistance gets lower as well. So if you want to minimize your conductive losses in the controller, you can use better, or more, FETs. The conductive losses in a FET-based controller are I^2 x R, and you can reduce R based on your needs and the price you are willing to pay.
IGBTs have a diode drop, so regardless of how much current you pass through the IGBT (to some extent) you have a 0.7V drop, and you can't put more IGBTs in parallel to reduce that voltage. So conductive losses in an IGBT-based controller are Vd x I.
So if you are a math whiz, you can see that because the two controller technologies have different formulas for conductive losses, there is an optimization problem, and under some conditions one technology will be better than the other. But really this efficiency difference isn't that big a deal; it all comes down to what can be built at what cost. The powertrain requires a controller that can deliver the peak and continuous power needed for the platform. The controller designer needs to look at that system and determine what is feasible, or a manufacturer that wants to assemble a powertrain from available components needs to look at what is out there and the complexities of integrating it.
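For the curious, the crossover implied by those two loss formulas falls out of setting them equal: I^2 x R = Vd x I gives I = Vd / R. The numbers below are illustrative, not taken from any real controller:

```python
# FET stage loses I^2 * R; IGBT stage loses roughly Vd * I.
# Below the crossover current the FET stage dissipates less;
# above it the IGBT stage wins (for these assumed values).
r_fet = 0.001   # effective on-resistance of the paralleled FETs, ohms (assumed)
v_d = 0.7       # IGBT forward drop, volts (from the text)

crossover = v_d / r_fet
print(crossover)   # roughly 700 A with these numbers
```

Adding more FETs in parallel lowers r_fet and pushes that crossover current higher, which is part of why FETs suit lower-voltage, high-current designs.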
Another consideration is the number of series cells. With lithium batteries, each group of parallel cells needs to be monitored, and a 400V system has around 4x as many parallel groups of cells to monitor as a 100V system. The number of interconnects, and other systems like heating or cooling elements, also need to be considered.
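A rough count of the series groups involved, assuming a nominal lithium cell voltage of about 3.6V (an assumption on my part; the exact figure depends on chemistry):

```python
import math

# Number of series-connected cell groups needed to reach a pack voltage,
# each of which needs its own monitoring tap.
def series_groups(pack_volts, cell_volts=3.6):
    return math.ceil(pack_volts / cell_volts)

print(series_groups(100))   # 28 groups
print(series_groups(400))   # 112 groups, about 4x the monitoring
```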
That original post is here: Voltage, high or low?
(Pretty sure Biff reads this occasionally – if you do happen to see it, sir, thank you!)