DaveinOlyWA said:
80% provides a maximum level of safe, reliable charging. Now that does not mean that 100% is unsafe or risky. But as much as we would like to think we have battery management down, in actuality there are several things that are not completely understood.
I'm sorry Dave, but I have to respond here. I build battery packs with lithium cells, and do custom battery management system builds/installs for LiFePO4, LiMn, and LiPo packs. What I'm about to say comes from the cell level, looking 'up' the complexity scale, if that makes any sense.
In general, rechargeable lithium cells do give a very long life when 80% of capacity is used. No argument here at all. But this is 80% of the TOTAL capacity of the cell. It can be difficult for the general public to find total capacity numbers, because many manufacturers report consumer max/min voltages or consumer capacity rather than 'ultimate'.
Let's design a battery - we want to use 80% capacity for long life and want to set the max and min voltage for our new pack. Let's use A123-Systems LiFePO4 cells.
Our first stop is the publicly-available datasheet from the A123-Systems website:
http://www.a123systems.com/cms/product/pdf/1/_ANR26650M1A.pdf
Recommended standard charge method 3A to 3.6V CCCV, 45 min
Recommended fast charge current 10A to 3.6V CCCV, 15 min
Maximum continuous discharge 70A
Pulse discharge at 10 sec 120A
Recommended pulse charge/discharge cutoff 3.8V to 1.6V
We see two 'end of charge' voltages - 3.6V for standard 'continuous input' charging and 3.8V for pulse charging. We'll use continuous, so will set the BMS to 3.6V.
The pulse discharge low voltage point is 1.6V, but we notice that their continuous load charts use 2.0V for the low voltage point. We'll use 2.0V for the 'empty' but will make sure our cells don't drop under 1.6V when loaded. So far so good.
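To make the reasoning above concrete, here's a minimal sketch (not A123's numbers beyond those quoted, and certainly not real BMS firmware) of the voltage window we just chose: charge to 3.6V, call 2.0V 'empty', and never let a loaded cell sag under 1.6V.

```python
# Voltage window chosen above for an ANR26650 LiFePO4 cell (illustrative only).
CHARGE_CUTOFF_V = 3.6   # datasheet CCCV end-of-charge voltage
EMPTY_V         = 2.0   # low-voltage point used in the continuous load charts
LOADED_FLOOR_V  = 1.6   # pulse-discharge cutoff; never go below this under load

def cell_state(voltage, under_load=False):
    """Classify a single cell voltage reading against our chosen limits."""
    if voltage > CHARGE_CUTOFF_V:
        return "stop charging"      # above our 3.6V end-of-charge point
    if under_load and voltage < LOADED_FLOOR_V:
        return "cut load"           # sagged below the 1.6V pulse floor
    if voltage < EMPTY_V:
        return "empty"              # treat as discharged; stop discharging
    return "ok"

print(cell_state(3.7))                    # stop charging
print(cell_state(1.9))                    # empty
print(cell_state(1.5, under_load=True))   # cut load
print(cell_state(3.3))                    # ok
```

A real BMS does far more than this (per-cell balancing, temperature limits, current limits), but the same threshold logic sits at the bottom of it.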
Now we start to think/worry/analyze/what-if. We believe that using 80% of total capacity will give us a longer life - but are our numbers 100% or 80%? Should we contact A123-Systems and get a capacity chart and select more conservative numbers for our management system?
(Here's where we'd likely register with A123-Systems as a developer and get access to the ultimate numbers. We're not doing that, so let's see what else we can find from public sources.) Here's a report from a June 2006 Plug-in Prius conversion fire that includes developer-level info from A123-Systems:
http://www.evworld.com/library/prius_fire_forensics.pdf
Maximum cell voltage 3.85 volts recommended; 4.20 volts absolute
Minimum cell voltage 1.60 volts recommended; 0.50 volts absolute
Our public datasheet max is 3.6V, yet 3.85V is the 'recommended' max and the ultimate is 4.20V. Same for the minimum: 2.0, 1.6, and 0.5 volts.
All right! The numbers we've selected, 3.6V and 2.0V, already correspond to the conservative 80%-of-ultimate operating window. We don't have to restrict any further because we're ALREADY at the desired 80% point.
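As a quick sanity check on the conclusion above, the three sets of limits nest inside one another: our datasheet-derived window sits inside the developer-recommended window, which sits inside the absolute window from the forensics report. A couple of lines of code (illustrative only) make that nesting explicit:

```python
# Limit tiers quoted in the thread, as (min_v, max_v) pairs.
limits = {
    "ours (public datasheet)": (2.0, 3.6),
    "developer recommended":   (1.6, 3.85),
    "absolute":                (0.5, 4.20),
}

ours_lo, ours_hi = limits["ours (public datasheet)"]
rec_lo,  rec_hi  = limits["developer recommended"]
abs_lo,  abs_hi  = limits["absolute"]

# Our window nests inside 'recommended', which nests inside 'absolute'.
assert abs_lo < rec_lo < ours_lo and ours_hi < rec_hi < abs_hi
print("chosen limits sit inside both the recommended and absolute envelopes")
```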
Bottom line for us Leafers is that we do not have to second, third, or fourth guess the cell or pack voltages or charge level because Nissan has already included a capacity buffer top and bottom for their pack. We do not have access to the ultimate 100% of the capacity the way RC modelers or DIY-Ebikers or DIY EV builders do - Nissan has designed and delivered a consumer-grade product that we can simply plug-in and use.
DaveinOlyWA said:
Add to that the changing capacities of the pack based on temperature, age, depth of discharge, etc.
We know that the car is capable of measuring capacity, temperature, cell age, internal resistance/degradation, current in and out, and depth of discharge. We also know that this info is reported through the car on the network, and that Nissan's been at this 'lithium' thing for a very long time. I don't expect there's a chance in the world that they'd field a car without knowing absolutely how to manage the battery, and it appears from the info we have that Nissan is properly managing the pack.