Battery recharge efficiency, versus SOC?

My Nissan Leaf Forum


donald

Well-known member
Joined
Jul 29, 2013
Messages
917
Hi.

This might prove to be a technically complex question that I'm hoping can be summed up with just a bit of simple advice - because it should result in a simple decision as to what SOC to recharge at!

Having searched around for tech info on the internet extensively today, I do not appear to be getting a clear answer on this from the theory. However, I have an energy meter shipping to me, with which I'll test this out practically in any case.

...Or maybe someone has already got a grip on this conundrum (perhaps in another thread I did not find)?

Q: My commute uses a half a battery load. Disregarding the recommendations on optimum discharge ranges for long life for a moment, if I recharge that 50% daily, is it better for wall-wheel efficiency to keep topping it up from 50% to 100%, or 40% to 90%, or 10% to 60% .. etc.. ? What is the best starting SOC to get maximum wall-wheel efficiency?
 
donald said:
...Q: My commute uses a half a battery load. Disregarding the recommendations on optimum discharge ranges for long life for a moment, if I recharge that 50% daily, is it better for wall-wheel efficiency to keep topping it up from 50% to 100%, or 40% to 90%, or 10% to 60% .. etc.. ? What is the best starting SOC to get maximum wall-wheel efficiency?
In general, wall-to-wheels efficiency is lower when charging to 100%. Some of the losses are due to the charging "overhead", such as cooling pumps. Since the charging speed ramps down when approaching 100% that means more overhead for a given amount of charge placed in the battery. If you charge to 80% you should see increased efficiency versus charging to 100%.
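The fixed-overhead argument can be put in rough numbers. Here's a minimal sketch, assuming a constant 300 W of overhead (a figure mentioned later in this thread) against a charge power that tapers near 100%; these are illustrative values, not measured LEAF data:

```python
# Sketch: a fixed charging overhead becomes a larger share of wall power
# as the charge rate tapers near 100% SOC. The 300 W overhead and the
# power levels below are illustrative assumptions, not measurements.

OVERHEAD_W = 300.0  # pumps, charger electronics (assumed constant)

def overhead_fraction(charge_power_w):
    """Fraction of wall power consumed by overhead at a given charge rate."""
    return OVERHEAD_W / (charge_power_w + OVERHEAD_W)

for power_w in (3300, 2000, 1000, 500):  # tapering charge power into the pack
    print(f"{power_w:5.0f} W into pack -> "
          f"{overhead_fraction(power_w) * 100:.1f}% of wall power is overhead")
```

At full L2 rate the overhead is under 10% of wall power, but once the rate tapers to a few hundred watts it can approach half, which is why the last few percent of charge are the least efficient.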

My impression from your other posts is that you are trying to get a handle on whether the battery itself is more efficient when being charged at various SOC levels. I can't recall ever seeing any data on this. If there is such a difference I would guess that it is pretty small.

Charging efficiency losses are small enough, especially with level 2 charging, that they are less significant than driving efficiency and other operating characteristics (climate control system use, for example). Unless charging efficiency is of academic interest it probably isn't worth worrying about. Is the difference between 86% and 89% really significant in terms of electricity use? If so, you can get more bang for your electricity buck by hypermiling, I would think.
 
The internal resistance follows roughly a bathtub-shaped curve, so efficiency will decrease at very low and very high SOC. Most of the wasted energy turns into heat within the battery pack. Assuming internal resistance goes up about 50% at either end of the SOC range, you could see an extra loss of 1% or 2% of the delivered energy. Although that might be enough to cause a measurable temperature increase in the battery, the energy loss itself is quite small and will likely get lost in the noise of other losses. If this academic argument wasn't convincing enough, you might be hard-pressed to find any owners who have observed this effect. The most plausible explanation is that the effect is small and difficult for the casual observer to see.
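A back-of-the-envelope I²R estimate shows why this effect is hard to spot at L2 rates. The pack resistance and voltage below are illustrative assumptions, not measured LEAF figures:

```python
# Rough I^2*R estimate of battery heating loss vs SOC, assuming a
# bathtub-shaped internal resistance: a nominal mid-range value that
# rises ~50% at the extremes. All figures are illustrative assumptions.

def internal_resistance_ohm(soc):
    """Assumed bathtub curve: 0.1 ohm mid-range, 50% higher below 10% / above 90% SOC."""
    base = 0.1
    return base * 1.5 if (soc < 0.1 or soc > 0.9) else base

def heating_loss_fraction(soc, current_a=9.0, pack_voltage=360.0):
    """Fraction of delivered power lost as heat: (I^2 * R) / (I * V)."""
    return current_a**2 * internal_resistance_ohm(soc) / (current_a * pack_voltage)

for amps in (9.0, 100.0):   # ~L2 charging vs ~quick-charge current
    for soc in (0.05, 0.5, 0.95):
        print(f"{amps:5.0f} A, SOC {soc:.0%}: "
              f"~{heating_loss_fraction(soc, amps) * 100:.2f}% lost as heat")
```

At L2 currents the loss is a fraction of a percent everywhere on the curve; only at quick-charge currents does the 50% resistance bump add on the order of 1%, consistent with the "lost in the noise" point above.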
 
You also lose some regen capability when driving on a fully charged pack. Since regen can contribute to better efficiency by avoiding friction losses in braking, you may find better overall efficiency by charging to less than a full charge. This seems to be less of an issue on the 2013 cars, however, since they appear to allow more regen capability at higher SOC than the earlier LEAF.
 
There's a fixed overhead for the on-board charger's cooling system (300 W). So closer to 100%, as the charge rate begins to slow, that fixed overhead becomes a larger share of the wall power and efficiency suffers.
 
But... the flip-side is that if the Li-ion pack ends its constant-current charging and enters saturation (constant-voltage) charging over the last ~30% of 'real' cell capacity, there will be less heating? So maybe the heating losses drop in the last 20% or so, no? However, the charger is then not operating at its peak current, and thus (?) not at its peak efficiency?

Then again, it may never enter saturation charging at all, because the charge current is already sufficiently low (<0.2C)?

These are the particular tech details I think might complicate matters. I suspect real measurements are the only way forward, but that this requires consistent driving cycles, which may not be achievable and/or introduce some 'user bias'.
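The "<0.2C" remark above is easy to check with nominal figures. Assuming a 24 kWh pack and a 3.3 kW on-board charger (typical nominal values for an early LEAF, used here as assumptions):

```python
# Quick C-rate check behind the "<0.2C" remark. Pack capacity and
# charger power are nominal early-LEAF figures (assumptions, not
# measurements of any particular car).

PACK_KWH = 24.0        # nominal pack capacity
L2_CHARGE_KW = 3.3     # nominal on-board charger power into the pack

c_rate = L2_CHARGE_KW / PACK_KWH
print(f"L2 charging is about {c_rate:.2f}C")   # ~0.14C, below 0.2C
```

So at L2 rates the whole charge may proceed below 0.2C, meaning the pack could stay in constant-current mode much closer to full than it would on a quick charger.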

The charging losses appear to be around 20%, based on both members' real-world posted figures and the DOE results. That seems a big wedge of inefficiency worth thinking about!
 
I believe that you are overthinking this. The EPA and DOE figures are wall-to-wheels, and they include both charging and discharging losses. Avoiding both the bottom and top of the SOC range will help battery efficiency, which is already quite good. If you wanted to measure elevated battery losses without the current ramp, you could likely do it at very low SOC. The battery will heat more near the top of the SOC range despite the current ramping down; this is based on field observation, not academic research. Gentle driving will help with discharge losses. The charger is by far the most inefficient component, and, leaving some details aside, its losses have little to do with the SOC or the battery.
 