LEAF voltage measurement accuracy impact on capacity


RegGuheert
Nissan has made the following claim, quoted in the Canberra Times in Australia (September 4, 2012 article: http://canberratimes.drive.com.au/motor-news/nissan-to-step-up-ev-production-20120904-25b60.html):
Canberra Times said:
Palmer also dismissed recent reports of battery problems in hot weather for the Leaf. A number of owners in America complained of reduced range during summer, but Palmer says the problem is a faulty battery level display.

"We don't have a battery problem," he says.
Palmer above refers to Andy Palmer, executive vice president of Nissan.

While I do not believe for one minute that the LEAF does not have a battery problem, I do think there could also be an instrumentation problem which is either adding to the battery problem or possibly even leading to some of the damage.

More specifically, if the Nissan LEAF cannot measure the DC voltage of the main traction battery accurately, and keep that accuracy stable over temperature and time, the available capacity of the LEAF battery could be severely affected. Please note that there is no doubt that voltage measurement inaccuracies will impact LEAF battery capacity; the question here is how much of what we are seeing is due to this effect.

I am focusing only on DC voltage measurement here, since both charging and discharging in the LEAF appear to be terminated based upon voltage measurement. Any errors in current measurement should be almost completely inconsequential with regard to available battery capacity and actual range.

Here are a few scenarios that might be affecting the capacity in our LEAF batteries:

1) Coming from the factory: LEAF measurements of battery voltage are higher than the actual battery voltage: In this scenario, the LEAF would terminate charge and discharge at a lower battery voltage than intended, resulting in lower overall capacity available when new. However, the battery will tend to be stored at a lower SOC which may improve calendar life of the battery.

2) Coming from the factory: LEAF measurements of battery voltage are lower than the actual battery voltage: In this scenario, the LEAF would terminate charge and discharge at a higher battery voltage than intended, resulting in higher overall capacity available when new. However, the battery will tend to be stored at a higher SOC which may reduce calendar life of the battery.

3) As temperature rises: LEAF measurements of battery voltage trend higher than the actual battery voltage: In this scenario, at higher temperatures the LEAF would terminate charge and discharge at a lower battery voltage than intended, resulting in lower overall capacity available when hot. However, the battery will tend to be stored at a lower SOC which may improve calendar life of the battery.

4) As temperature rises: LEAF measurements of battery voltage trend lower than the actual battery voltage: In this scenario, at higher temperatures the LEAF would terminate charge and discharge at a higher battery voltage than intended, resulting in higher overall capacity available when hot. However, the battery will tend to be stored at a higher SOC which may reduce calendar life of the battery.

5) As the LEAF ages: LEAF measurements of battery voltage trend higher than the actual battery voltage: In this scenario, as the LEAF ages it would terminate charge and discharge at a lower battery voltage than intended, resulting in lower overall capacity available. However, the battery will tend to be stored at a lower SOC which may improve calendar life of the battery.

6) As the LEAF ages: LEAF measurements of battery voltage trend lower than the actual battery voltage: In this scenario, as the LEAF ages it would terminate charge and discharge at a higher battery voltage than intended, resulting in higher overall capacity available. However, the battery will tend to be stored at a higher SOC which may reduce calendar life of the battery.

Note: All of the above scenarios are focused on the LEAF's ability to accurately measure a pack voltage in the neighborhood of 393 V.
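To make the sign conventions above concrete, here is a small Python sketch (my own simplification with invented numbers, not anything we know about Nissan's logic): assume the BMS stops charge or discharge when its *indicated* voltage reaches a fixed threshold, and see where the *actual* pack voltage ends up for a given measurement bias.

[code]
# Simplified illustration of scenarios 1) - 6); the 393 V threshold and the
# bias values are invented, and the real BMS logic is unknown.

def actual_termination_voltage(threshold_v, bias_v):
    """Actual pack voltage when the indicated reading hits the threshold.

    bias_v > 0: the meter reads high (scenarios 1/3/5) -> charge and discharge
    both stop at a lower real voltage -> less usable capacity, lower storage SOC.
    bias_v < 0: the meter reads low (scenarios 2/4/6) -> the opposite.
    """
    return threshold_v - bias_v

for bias in (+2.0, 0.0, -2.0):                       # pack-level error in volts
    v = actual_termination_voltage(393.0, bias)      # hypothetical 100% target
    print(f"bias {bias:+.1f} V -> charge really stops at {v:.1f} V")
[/code]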

So while I do not expect a huge variation in 1) and 2) due to the ability to calibrate at the factory, it could explain part of the range differences seen in new LEAFs.

Scenarios 3) and 4) may explain some of the seasonality we see in the LEAF. Scenario 3) may actually be intentionally implemented in the BMS to try to extend the life of the battery.

Scenarios 5) and 6) could also be an issue. If scenario 5) were occurring, we would have a difficult time distinguishing it from battery degradation. If scenario 6) were occurring, the battery would tend to get more and more abuse from the car as time goes on and would tend to wear out more quickly.

So, how hard is it to measure battery voltage? It's not as easy as many here seem to think. Let's look at the accuracy available in a few voltmeters [url=http://www.fluke.com]Fluke[/url] makes: the Fluke 80, Fluke 77 and Fluke 113.

Fluke 80: + or - 0.1% equates to a reading of 392.6 to 393.4 V for a battery at 393 V. Frankly, I seriously doubt the LEAF has the accuracy of a Fluke 80.

Fluke 77: + or - 0.3% (best) equates to a reading of 391.8 to 394.2 V for a battery at 393 V. That level of accuracy would equate to about a 1-kWh total discrepancy range in LEAF battery capacity. I HOPE that the LEAF has at least this much accuracy.

Fluke 113: + or - 3.0% equates to a reading of 381.2 V to 404.8 V for a battery at 393 V. I seriously doubt the LEAF is anywhere near this bad. If so, then Houston, we have a problem.

Possibly: + or - 1.0% equates to a reading of 389.1 to 396.9 V for a battery at 393 V. I do not think the LEAFs are this bad when new at room temperature, but I wonder if there could be this much error over time and temperature. That seems entirely reasonable to me.
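For anyone who wants to check the arithmetic, here is the same tolerance math in a few lines of Python (only the specs quoted above go in; the "in-car" figure is my hypothetical 1% case):

[code]
# A +/- percentage spec turns a true 393 V pack into a window of possible readings.

def reading_window(true_v, tol_pct):
    err = true_v * tol_pct / 100.0
    return true_v - err, true_v + err

meters = {                                  # tolerances as quoted above
    "Fluke 80 (0.1%)": 0.1,
    "Fluke 77 (0.3%)": 0.3,
    "Fluke 113 (3.0%)": 3.0,
    "hypothetical in-car meter (1.0%)": 1.0,
}
for name, tol in meters.items():
    lo, hi = reading_window(393.0, tol)
    print(f"{name}: {lo:.1f} to {hi:.1f} V")
[/code]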

I will say that TickTock's voltage charging curves appear to me to show about a 3-V difference in voltage at which the knees occur at both ends of the charging curve. This would correspond with either scenario 3) or scenario 5), but it could simply be operation of the BMS temperature compensating the voltage that is displayed.

So, how can we go about figuring out how much impact the LEAF's voltage measurement accuracy is having on our battery capacity? It's not a simple matter to resolve, but I am confident it is a part of the battery story with the Nissan LEAF. How big of a part it plays I have no idea.

Thoughts?
 
If it were a simple voltmeter problem, why would it take them 2 months to let us know? A new voltmeter in the car is a very inexpensive solution for Nissan. They should have been able to replace these sensors in the cars they took to Casa Grande, and those cars would have been restored to "like new" condition. They could also have a new, more stable sensor available for us all by now.

Fluke (287, 289) also makes meters with 0.03% accuracy, which is 10 times better than the one you chose.

I'm not convinced that there is a voltage measurement issue. Anyone with a good voltmeter and access to the data via a Gidmeter or other means should be able to quickly verify this.
 
RegGuheert said:
I will say that TickTock's voltage charging curves appear to me to show about a 3-V difference in voltage at which the knees occur at both ends of the charging curve. This would correspond with either scenario 3) or scenario 5), but it could simply be operation of the BMS temperature compensating the voltage that is displayed.
Thank you for taking this topic out of the other thread. With regards to TickTock's plots, as I mentioned earlier, the higher voltage at the knee could be indicative of higher SOC due to lower battery capacity at the end of the constant-current portion of the charging process.

Although SOC calculation is a complex game, we might be overthinking this. TickTock's graphs show no evidence of a different end voltage; it looks like 393.5 or 394 volts for both charges, which is what I always saw on my Leaf as well. Winter or summer. There was never a shade of a difference, but my local ambient does not vary a great deal, perhaps only 60 degrees.

I would like to quote a friend, who has worked on the software side of a similar design problem, but it was a private conversation, and I don't have the permission to share it publicly. Perhaps I can at least mention that SOC is (or should be) calculated on a per-cell basis. Since not all cells will charge to the exact same voltage, despite balancing, we would need to know if the indicated SOC in the Leaf was based on the low cell, average cell, or other assumed voltage.

That might be equally if not more important than any voltage sensor error consideration. He also confirmed that what ultimately matters is actual measured range and any other data or assumptions should be backed up by range tests. These can vary (even with access to a dyno), but if conducted methodically, significant trends will emerge.
 
surfingslovak said:
RegGuheert said:
I will say that TickTock's voltage charging curves appear to me to show about a 3-V difference in voltage at which the knees occur at both ends of the charging curve. This would correspond with either scenario 3) or scenario 5), but it could simply be operation of the BMS temperature compensating the voltage that is displayed.
Thank you for taking this topic out of the other thread. With regards to TickTock's plots, as I mentioned earlier, the higher voltage at the knee could be indicative of higher SOC due to lower battery capacity at the end of the constant-current portion of the charging process.

Although SOC calculation is a complex game, we might be overthinking this. TickTock's graphs show no evidence of a different end voltage; it looks like 393.5 or 394 volts for both charges, which is what I always saw on my Leaf as well. Winter or summer. There was never a shade of a difference, but my local ambient does not vary a great deal, perhaps only 60 degrees.

I would like to quote a friend, who has worked on the software side of a similar design problem, but it was a private conversation, and I don't have the permission to share it publicly. Perhaps I can at least mention that SOC is (or should be) calculated on a per-cell basis. Since not all cells will charge to the same voltage, we would need to know if the indicated SOC in the Leaf was based on the low cell, average cell, or other assumed voltage.

That might be equally if not more important than any voltage sensor error consideration. He also confirmed that what ultimately matters is actual measured range and any other data or assumptions should be backed up by range tests. These can vary (even with access to a dyno), but if conducted methodically, significant trends will emerge.

Are we talking about the same charts?

This one: http://www.mynissanleaf.com/download/file.php?id=549&mode=view from this thread:
http://www.mynissanleaf.com/viewtopic.php?f=30&t=8802&start=2336

This curve is a simple result of less capacity. The voltage is 3 volts higher because it can reach that voltage quicker due to less capacity in the pack. The voltage stops at the same place, but it just got there quicker. Put a pack with more capacity on it and it will track on the same slope a few volts lower, and end up terminating at the same voltage but with more capacity. I'm not doing a very good job of explaining this, but I think you guys are misreading this graph somehow.

surfingslovak said:
Perhaps I can at least mention that SOC is (or should be) calculated on a per-cell basis. Since not all cells will charge to the same voltage, we would need to know if the indicated SOC in the Leaf was based on the low cell, average cell, or other assumed voltage.

The total capacity of the pack is limited to the capacity of the lowest-capacity cell. If you have one cell with lower capacity, it establishes the capacity of the entire pack. Nissan is monitoring the voltage of the individual cell pairs, so if one pair starts dropping in voltage, indicating that it is at the end of the discharge, that marks the end of the entire pack, even if the other cells still show plenty of capacity remaining. So SOC and capacity come from the cell pair that has the lowest capacity.
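Here is the same idea in a couple of lines of Python, just to be explicit (the capacities are made up, and the real pack has 96 cell pairs, not four):

[code]
# The weakest cell pair sets the usable capacity of the whole series string.
cell_pair_capacity_ah = [64.5, 63.8, 61.2, 64.1]     # invented values

pack_capacity_ah = min(cell_pair_capacity_ah)
limiting_pair = cell_pair_capacity_ah.index(pack_capacity_ah)
print(f"pack limited to {pack_capacity_ah} Ah by cell pair #{limiting_pair}")
[/code]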
 
palmermd said:
If it were a simple voltmeter problem, why would it take them 2 months to let us know? A new voltmeter in the car is a very inexpensive solution for Nissan.
Sure, but only if Nissan knows how to solve the problem. Getting to 0.3% accuracy over the life of the car may not be something commonly (or even ever!) done in the automotive industry. Perhaps they don't have a solution in hand.
palmermd said:
They should have been able to replace these sensors in the cars they took to Casa Grande and they would have been restored to "like new" condition. They also could have a new more stable sensor available for us all by now.
Clearly there are degraded batteries in Phoenix. I'm not disputing that. But the degradation rate is increased in scenarios 2), 4) and 6).
palmermd said:
Fluke (287, 289) also makes meters with 0.03% accuracy, which is 10 times better than the one you chose.
O.K., I didn't see those. But, like the Fluke 80, I don't think Nissan can achieve that level of accuracy in-car in the LEAF.

I do now wonder if Nissan performs a voltmeter calibration at the annual battery checks. Does anyone know?
palmermd said:
I'm not convinced that there is a voltage measurement issue.
Neither am I, but we have virtually no evidence one way or the other.
palmermd said:
Anyone with a good voltmeter and access to the data via a Gidmeter or other means should be able to quickly verify this.
Not so. One measurement of the voltage in one car will not give ANY indication of what goes on in the entire fleet of LEAFs worldwide.
 
surfingslovak said:
Thank you for taking this topic out of the other thread. With regards to TickTock's plots, as I mentioned earlier, the higher voltage at the knee could be indicative of higher SOC due to lower battery capacity at the end of the constant-current portion of the charging process.
palmermd said:
Are we talking about the same charts?

This one: http://www.mynissanleaf.com/download/file.php?id=549&mode=view from this thread:
http://www.mynissanleaf.com/viewtopic.php?f=30&t=8802&start=2336

This curve is a simple result of less capacity. The voltage is 3 volts higher because it can reach that voltage quicker due to less capacity in the pack.
You both have said that the voltage knees are higher because of reduced capacity of the pack. I do not believe that is true. The knees in the curves should be a function of the chemistry involved, the number of series cells and the temperature.

In TickTock's graphs, both knees moved left, which is due to the change in capacity, and upward about three volts, which should be due to either temperature effects or voltage measurement issues. We cannot tell which.
surfingslovak said:
Although SOC calculation is a complex game, we might be overthinking this. TickTock's graphs show no evidence of a different end voltage; it looks like 393.5 or 394 volts for both charges, which is what I always saw on my Leaf as well. Winter or summer. There was never a shade of a difference, but my local ambient does not vary a great deal, perhaps only 60 degrees.
palmermd said:
The voltage stops at the same place, but it just got there quicker. Put a pack with more capacity on it and it will track on the same slope a few volts lower, and end up terminating at the same voltage but with more capacity. I'm not doing a very good job of explaining this, but I think you guys are misreading this graph somehow.
Yes, the two charges in TickTock's graph terminated at the same *indicated* voltage. That says nothing about the accuracy of the two indications or whether or not temperature compensation is being applied. In other words, the software MAY be applying temperature compensation to the displayed battery voltage and comparing that with a fixed value. There is no way for us to know without an independent verification of the voltages.
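Just to show what I mean (this is pure speculation on my part, with an invented coefficient, not anything we know about Nissan's firmware): if a temperature correction were applied before the comparison to a fixed cutoff, two charges at different pack temperatures would terminate at the same indicated voltage while the real voltages differed by a few volts.

[code]
# Hypothetical temperature compensation; both constants here are invented.
K_MV_PER_DEG_C = 100.0          # made-up correction, mV per deg C above 25 C
CUTOFF_V = 393.5                # fixed indicated cutoff, per TickTock's graphs

def displayed_voltage(measured_v, pack_temp_c, ref_temp_c=25.0):
    return measured_v + (pack_temp_c - ref_temp_c) * K_MV_PER_DEG_C / 1000.0

for temp_c in (25.0, 55.0):
    # real pack voltage at the moment the displayed value reaches the cutoff
    real_v = CUTOFF_V - (temp_c - 25.0) * K_MV_PER_DEG_C / 1000.0
    print(f"at {temp_c:.0f} C, charge would really stop at {real_v:.1f} V")
[/code]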
surfingslovak said:
I would like to quote a friend, who has worked on the software side of a similar design problem, but it was a private conversation, and I don't have the permission to share it publicly. Perhaps I can at least mention that SOC is (or should be) calculated on a per-cell basis. Since not all cells will charge to the same voltage, we would need to know if the indicated SOC in the Leaf was based on the low cell, average cell, or other assumed voltage.

That might be equally if not more important than any voltage sensor error consideration. He also confirmed that what ultimately matters is actual measured range and any other data or assumptions should be backed up by range tests. These can vary (even with access to a dyno), but if conducted methodically, significant trends will emerge.
I agree with you there, but perhaps we can focus just on the voltage instrumentation issues in this thread. The per-cell instrumentation is another issue entirely.
palmermd said:
The total capacity of the pack is limited to the capacity of the lowest capacity cell. If you have one cell with lower capacity then it establishes the capacity of the entire pack. Nissan is monitoring the voltage of the individual pairs, so if one cell pair starts dropping in voltage indicating that it is at the end of the discharge, then that marks the end of the entire pack, even if the other cells are still showing plenty of capacity remaining. So SOC and capacity comes from the cell pair that has the lowest capacity.
Nissan has a standard test for this and many LEAFs have been tested. There has always been some variation, but it is not a "smoking gun" for the issues in Phoenix.
 
palmermd said:
Are we talking about the same charts?

This one: http://www.mynissanleaf.com/download/file.php?id=549&mode=view from this thread:
http://www.mynissanleaf.com/viewtopic.php?f=30&t=8802&start=2336

This curve is a simple result of less capacity. The voltage is 3 volts higher because it can reach that voltage quicker due to less capacity in the pack. The voltage stops at the same place, but it just got there quicker. Put a pack with more capacity on it and it will track on the same slope a few volts lower, and end up terminating at the same voltage but with more capacity. I'm not doing a very good job of explaining this, but I think you guys are misreading this graph somehow.
Yes, and I believe that I said the same thing. In other words: the higher voltage at the knee is likely a result of diminished battery capacity and the charging protocol.
palmermd said:
The total capacity of the pack is limited to the capacity of the lowest capacity cell. If you have one cell with lower capacity then it establishes the capacity of the entire pack. Nissan is monitoring the voltage of the individual pairs, so if one cell pair starts dropping in voltage indicating that it is at the end of the discharge, then that marks the end of the entire pack, even if the other cells are still showing plenty of capacity remaining. So SOC and capacity comes from the cell pair that has the lowest capacity.
Yes, thank you for explaining that, this is generally true. I should have qualified the statement I included in my previous comment. It was from someone who has worked as systems lead on a well-known commercial EV project. This is a distinction the two of us likely don't share, and if he says that SOC calculation is fairly complex and can be based on average cell voltage or even some other value, I'm inclined to believe it.

RegGuheert said:
In TickTock's graphs, both knees moved left, which is due to the change in capacity, and upward about three volts, which should be due to either temperature effects or voltage measurement issues. We cannot tell which.
Agreed, but I would prefer to assume that there is a straightforward explanation instead of a complicated one. We know that charging protocols start with constant current. We also know that the battery in the other graph had less capacity. The more likely explanation is that the battery reached higher potential and higher voltage due to its diminished capacity. Temperature certainly plays a role in SOC and voltage readings, and although I wouldn't want to hazard a guess how much of a difference it makes, can we at least assume that the car was charged in TickTock's garage? I would think that it has a temperature equalizing effect and the difference might not be very significant.

RegGuheert said:
Yes, the two charges in TickTock's graph terminated at the same *indicated* voltage. That says nothing about the accuracy of the two indications or whether or not temperature compensation is being applied. In other words, the software MAY be applying temperature compensation to the displayed battery voltage and comparing that with a fixed value. There is no way for us to know without an independent verification of the voltages.
Agreed, but with all due respect, this is starting to sound as if we were debating relativity theory. Yes, there might be temperature compensation applied, but what would the delta be? Is it worth considering? What Michael said below is true. You can take one car and examine how the CAN bus voltage value behaves relative to temperature and other parameters. While reverse engineering one car and applying the results to the rest of the fleet is not the most scientific method, I really don't know what else we could do, short of asking Nissan.


RegGuheert said:
I agree with you there, but perhaps we can focus just on the voltage instrumentation issues in this thread. The per-cell instrumentation is another issue entirely.
Why would that be a different issue entirely? You rightly raised the point of voltage sensor accuracy and temperature compensation. While we can only speculate, the error is on the order of what, 0.5%. More or less? What if the pack voltage we are looking at on the CAN bus was a synthetic value as well? A multiple of all module averages? If that were the case, the difference from actual would be perhaps even more significant than the sensor error and temperature effect combined.

The reason why I mentioned what this person said is simple: he has worked on a commercial EV project and has gone the distance. There could be more things happening behind the scenes than we realize or want to admit.

Personally, I believe that a simple well-executed range test as discussed in the other thread is enough. If we wanted to collect more data, Gids are great and they work for us, but we may as well collect pack voltage if possible, even though we may not know where these numbers came from and how accurate they are.

What Michael suggested is likely a step in the right direction. Phil or someone else who has disassembled the car and verified some of the values from the CAN bus against accurate instruments can give us an idea of how to interpret the collected data better. I don't think that we will be able to arrive at that conclusion by talking about it.
 
The Battery controller has more varied data about voltage than it has about temperature. One can see the voltage of each module individually with Consult. This is good. We techs use this info not just for diagnosis, but for balancing a new module prior to installation. The voltage is displayed in mV.

There are 4 (if I remember right) temp probes: 2 on the rear bank, one for each side bank of modules. I'm pretty sure the battery controller can display them individually on Consult, though I've never looked.

I felt more temp sensors were needed the first time I saw the battery. But I'm a tech, not an engineer. I wish engineers asked for my input.
 
surfingslovak said:
Yes, and I believe that I said the same thing. In other words: the higher voltage at the knee is likely a result of diminished battery capacity and the charging protocol.
surfingslovak said:
Agreed, but I would prefer to assume that there is a straightforward explanation instead of a complicated one. We know that charging protocols start with constant current. We also know that the battery in the other graph had less capacity. The more likely explanation is that the battery reached higher potential and higher voltage due to its diminished capacity.
As explained previously, the knee voltage is not a function of capacity, so changing capacity can only move the knee left or right on TickTock's curve, not up or down.

Here's a simple example: Take a single cell and charge it with a constant current and observe the knee voltage. Now charge two identical cells in parallel with the same constant current source. Would you expect the knee voltage to go down because the capacity is now doubled? I wouldn't. The knee voltage would be exactly the same.
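A quick toy simulation makes the point, using an invented open-circuit-voltage curve with a knee near the top (none of these numbers are LEAF numbers, and this is not a real cell model):

[code]
# Constant-current charge of one cell vs. two identical cells in parallel.
# Doubling the capacity stretches the time axis but does not move the knee voltage.

def ocv(soc):
    """Made-up OCV curve (volts) with a knee above ~95% SOC."""
    v = 3.5 + 0.5 * soc
    if soc > 0.95:
        v += (soc - 0.95) * 4.0          # steep rise at the top = the knee
    return v

def cc_charge(capacity_ah, current_a, dt_h=0.01):
    soc, t, trace = 0.0, 0.0, []
    while soc < 1.0:
        trace.append((t, ocv(soc)))
        soc += current_a * dt_h / capacity_ah
        t += dt_h
    return trace

def hours_to_reach(trace, v_target):
    return next(t for t, v in trace if v >= v_target)

one_cell = cc_charge(capacity_ah=33.0, current_a=8.0)
two_parallel = cc_charge(capacity_ah=66.0, current_a=8.0)
print(hours_to_reach(one_cell, 4.0), hours_to_reach(two_parallel, 4.0))
# The knee shows up at the same voltage in both traces; only the time doubles.
[/code]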
surfingslovak said:
Temperature certainly plays a role in SOC and voltage readings, and although I wouldn't want to hazard a guess how much of a difference it makes, can we at least assume that the car was charged in TickTock's garage? I would think that it has a temperature equalizing effect and the difference might not be very significant.
I don't know much about the temperature effect on that knee voltage. As temperature tends to affect all chemical process, I'm sure there is some effect, but I don't really know how to find out how much.
surfingslovak said:
Agreed, but with all due respect, this is starting to sound as if we were debating relativity theory. Yes, there might be temperature compensation applied, but what would the delta be? Is it worth considering? What Michael said below is true. You can take one car and examine how the CAN bus voltage value behaves relative to temperature and other parameters. While reverse engineering one car and applying the results to the rest of the fleet is not the most scientific method, I really don't know what else we could do, short of asking Nissan.
As mentioned, measuring one car will tell us very little. If that one car matched the calibrated measurement, you cannot conclude that the entire set is accurate.
surfingslovak said:
Why would that be a different issue entirely? You rightly raised the point of voltage sensor accuracy and temperature compensation. While we can only speculate, the error is on the order of what, 0.5%. More or less? What if the pack voltage we are looking at on the CAN bus was a synthetic value as well? A multiple of all module averages? If that were the case, the difference from actual would be perhaps even more significant than the sensor error and temperature effect combined.
Sure, we do not know what Nissan is doing in their firmware, but the important point is that if they have poor measurement accuracy, the methodology really doesn't matter: the scenarios that I laid out all still apply. Of course, they can make things even worse if they do dumb things.
surfingslovak said:
The reason why I mentioned what this person said is simple, he has worked on a commercial EV project and has gone the distance. There could be more things happening behind the scenes than we realize or want to admit.
Sure, but he may have dealt with a very accurate measurement system where the algorithms were the key issue. I'm saying Nissan may have skimped (or made a mistake) on the voltage measurement system and this may have led to a wide range of symptoms throughout the fleet. I have laid out what would be the consequences in a LEAF with various forms of voltage measurement inaccuracies. What we don't know is how accurate the systems are.
surfingslovak said:
Personally, I believe that a simple well-executed range test as discussed in the other thread is enough. If we wanted to collect more data, Gids are great and they work for us, but we may as well collect pack voltage if possible, even though we may not know where these numbers came from and how accurate they are.
I am confident that Tony's test will confirm what we all feel is the case: those cars in Phoenix have lost a significant amount of range. It is important that we know that definitively and this test will give the answer.

I am discussing the next question which is why these batteries are not holding up, even in mild climates like San Diego. Battery chemistry is clearly part of the issue. But the degradation we are seeing appears to go beyond something that can be explained by the chemistry alone. To me, it seems worse than expected. One possible explanation is that at least some LEAF battery packs are being tortured by a BMS which cannot accurately determine the battery voltage. It's a theory that I have not seen discussed here before.
surfingslovak said:
What Michael suggested is likely a step in the right direction. Phil or someone else who has disassembled the car and verified some of the values from the CAN bus against accurate instruments can give us an idea of how to interpret the collected data better. I don't think that we will be able to arrive at that conclusion by talking about it.
Agreed. It would be great if Phil had some information on the voltage accuracy for his LEAF's measurement system. It would be one data point more than we have today, which is none.
 
KtG said:
The Battery controller has more varied data about voltage than it has about temperature. One can see the voltage of each module individually with Consult. This is good. We techs use this info not just for diagnosis, but for balancing a new module prior to installation. The voltage is displayed in mV.

There are 4 (if I remember right) temp probes: 2 on the rear bank, one for each side bank of modules. I'm pretty sure the battery controller can display them individually on Consult, though I've never looked.

I felt more temp sensors were needed the first time I saw the battery. But I'm a tech, not an engineer. I wish engineers asked for my input.
Thanks, again! That's good information!

Yes, I've noticed from reading the service manual that the cell voltages are displayed in millivolts. That much resolution implies an accurate measurement was made.

Hopefully Nissan made the system truly accurate rather than just deciding that high resolution would be sufficient to keep the pack balanced, which it certainly is.
 
RegGuheert said:
KtG said:
The Battery controller has more varied data about voltage than it has about temperature. One can see the voltage of each module individually with Consult. This is good. We techs use this info not just for diagnosis, but for balancing a new module prior to installation. The voltage is displayed in mV.

There are 4 (if I remember right) temp probes: 2 on the rear bank, one for each side bank of modules. I'm pretty sure the battery controller can display them individually on Consult, though I've never looked.

I felt more temp sensors were needed the first time I saw the battery. But I'm a tech, not an engineer. I wish engineers asked for my input.
Thanks, again! That's good information!

Yes, I've noticed from reading the service manual that the cell voltages are displayed in millivolts. That much resolution implies an accurate measurement was made.

Hopefully Nissan made the system truly accurate rather than just deciding that high resolution would be sufficient to keep the pack balanced, which it certainly is.

Not that I'm disagreeing at all with your theory, but I've every confidence in the design of the voltage monitoring. I can even ID the physical locations of each module based on the info Consult gives.

The ability to control voltage in/out of each is limited though. Resolution there is by stack only, as each stack has its own bus ribbon that connects to the main relays. Short of putting a microcontroller in each module (hello, costs!), there isn't much of a way to get past that.

Still, with limited temp sensing it's all a moot point. Even if the BMS saw a temp spike in one module and a corresponding voltage change, there isn't much it can do besides set a DTC.

I'd say it's obvious they (the engineers, but more likely a bean counter) made a boo-boo. But sometimes one makes a SWAG, because you can only add so many sensors before you muddy the data. I'd love AFR sensors on every cylinder of my gas engines (and aftermarket systems exist for that), but the cost vs. gain is nutty. Same thing for having intake air temp data in the port right by the valve rather than 3' up the air stream. Only development motors get that level of sensing, and only then when absolutely needed.

I'd like to think that Nissan had temp probes all over this battery in testing, and in production placed the 4 where they give the most accurate picture painted by the 100 sensors. That's how I would have done it, but again, not an engineer. I'll try to make time to get the Consult on our Leaf Shuttle and see what I can see for individual temps. I'm willing to bet the rear stack is hottest, and is where the problem lies.
 
I've not monitored my Leaf that often, but initially and as recently as yesterday (9/17/12) I did check the accuracy. According to my calibrated 100k count Fluke 45, it's still well within a half volt of the voltage reported by the Battery ECU/LBC (Lithium Battery Controller).

There is cross-checking going on in the LBC (module sums vs total pack), so I would imagine if something went awry you'd have a code P30F4. It's highly unlikely 2 different measurement systems could drift the same way so as to not be caught in self-diagnosis.
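The check amounts to something along these lines (the tolerance value here is purely illustrative, not the actual calibration):

[code]
# Plausibility check: sum of the module readings vs. the separately measured pack total.

def pack_voltage_consistent(module_voltages, pack_voltage_v, max_delta_v=2.0):
    """True if the two measurement paths agree within an assumed tolerance."""
    return abs(sum(module_voltages) - pack_voltage_v) <= max_delta_v

module_voltages = [8.19] * 48            # 48 modules, invented readings
if not pack_voltage_consistent(module_voltages, 393.4):
    print("disagreement -> the LBC would set a DTC such as P30F4")
[/code]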

There's nothing to say that somehow the chemistry hasn't been altered in the pack in such a way to alter the knee voltages though.

-Phil
 
For what it's worth, I believe that we saw higher voltages after turtle mode in degraded Leafs than I would have expected. It was around 320V from what I have seen. My Leaf would typically hit turtle around 308V, but I did not let the car rest and measure later. Has anyone done that with a relatively new Leaf?

The three-bar-loser Leaf I drove in Phoenix (courtesy of wiltingleaf) behaved rather strangely after the low battery warning. It's tough to say if it was because the owner never runs a deep cycle and the software starts misjudging remaining battery capacity or if the discharge curve changes significantly as the battery degrades. I'm under the impression that Leaf owners in Phoenix get squeezed in whatever range they have left above the low battery warning, which many don't want to hit. Driving with flashing numbers or dashes, and without a Gid meter can be anxiety inducing. It appears that the range below the low battery warning expands with time, and the range above the warning diminishes rather quickly.
 
surfingslovak said:
For what it's worth, I believe that we saw higher voltages after turtle mode in degraded Leafs than I would have expected. It was around 320V from what I have seen. My Leaf would typically hit turtle around 308V, but I did not let the car rest and measure later. Has anyone done that with a relatively new Leaf?

The three-bar-loser Leaf I drove in Phoenix (courtesy of wiltingleaf) behaved rather strangely after the low battery warning. It's tough to say if it was because the owner never runs a deep cycle and the software starts misjudging remaining battery capacity or if the discharge curve changes significantly as the battery degrades. I'm under the impression that Leaf owners in Phoenix get squeezed in whatever range they have left above the low battery warning, which many don't want to hit. Driving with flashing numbers or dashes, and without a Gid meter can be anxiety inducing. It appears that the range below the low battery warning expands with time, and the range above the warning diminishes rather quickly.

I think this makes sense, because the lower range becomes more delicate/vulnerable as the capacity degrades. In my mind, their TMS/BMS is heavily driven by machine learning, and as you know... machine learning is all about training and retraining. It works great in retrospect, but struggles with fast-moving training sets, or in our case fast-moving or unpredictable environmental swings. As such, it must use a set of guard bands, because it has nothing else to save it from challenging conditions/situations. What do I know... I am likely over-thinking it.
 
palmermd said:
If it were a simple voltmeter problem, why would it take them 2 months to let us know? A new voltmeter in the car is a very inexpensive solution for Nissan.

Probably not inexpensive at all, since you have 48 voltage measurements to do, at a millivolt scale, with lots of connectors, cables and so on... and all this just to accurately access the little bit of energy left below the knee of the battery curve. Not a good investment. Running the car below the knee will always be risky and provide low returns.
 
mdh said:
In my mind, their TMS/BMS is heavily driven by machine learning and as you know... machine learning is all about training and retraining. It works great in retrospect, but struggles with fast moving training sets or in our case fast moving or unpredictable environmental swings.

I think you are dead on..
 
surfingslovak said:
For what it's worth, I believe that we saw higher voltages after turtle mode in degraded Leafs than I would have expected. It was around 320V from what I have seen. My Leaf would typically hit turtle around 308V, but I did not let the car rest and measure later. Has anyone done that with a relatively new Leaf?
That's interesting. 2 possible reasons I can think of:

1. Some modules are weaker than the others and the LEAF uses the voltage of the weakest module (or cell-pair?) to determine when the party ends.
2. With it being Arizona, the pack will be warmer. With a warm pack, voltage will sag less under load. Of course, the test was run in the morning during pretty normal temps that one can easily recreate in the Bay Area in the summer.

surfingslovak said:
It's tough to say if it was because the owner never runs a deep cycle and the software starts misjudging remaining battery capacity or if the discharge curve changes significantly as the battery degrades. I'm under the impression that Leaf owners in Phoenix get squeezed in whatever range they have left above the low battery warning, which many don't want to hit. Driving with flashing numbers or dashes, and without a Gid meter can be anxiety inducing. It appears that the range below the low battery warning expands with time, and the range above the warning diminishes rather quickly.
If I had to guess, I'd guess that someone who rarely charges to 100% and rarely gets to LBW would make it really tough for the LBC to determine the actual health of the battery - just keeping the pack well above LBW should have a similar effect. Historically, lithium battery capacity monitors have needed to have their battery drained down fairly far periodically to get a good reading of actual capacity, thanks to the shallow discharge voltage curve typical of lithium batteries. I think we did see some evidence of the LBC learning on TickTock's car, which immediately went back to 11 bars after the test (unfortunately in the wrong direction, but that was expected).
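In rough terms the re-learning works something like this sketch (an assumption about how such monitors generally do it, not inside knowledge of the LBC):

[code]
# Capacity re-learned from charge counted between two SOC points the monitor
# can pin down from rested voltage - which, on a flat lithium curve, really
# only exist near the top of charge and down near LBW/VLBW.

def relearn_capacity_ah(ah_counted, soc_start, soc_end):
    return ah_counted / (soc_start - soc_end)

# e.g. 44 Ah counted out between a rested 100% point and a rested point
# assumed to be ~10% SOC:
print(relearn_capacity_ah(44.0, soc_start=1.00, soc_end=0.10))   # ~48.9 Ah
[/code]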

The question is - how far down do you have to run the pack to give the LBC enough information to correct itself - and is doing it once enough? One hates to cycle the pack more than they really need to....
 
Ingineer said:
I've not monitored my Leaf that often, but initially and as recently as yesterday (9/17/12) I did check the accuracy. According to my calibrated 100k count Fluke 45, it's still well within a half volt of the voltage reported by the Battery ECU/LBC (Lithium Battery Controller).
I was hoping you would see this thread and report your findings! Thanks for the information!

So for your car it seems you are seeing accuracy of around 0.1% which should be sufficient to do the job. Hopefully all of the LEAFs are approximately this good. Given that the voltage difference between a 100% charge and an 80% charge is only 1.2%, not much error can be tolerated.

Have you ever tested another LEAF besides yours?

Also, have you ever tested the voltage accuracy with the pack temperature elevated? I'm just wondering if things could get worse in a climate like Phoenix.
Ingineer said:
There is cross-checking going on in the LBC (module sums vs total pack), so I would imagine if something went awry you'd have a code P30F4. It's highly unlikely 2 different measurement systems could drift the same way so as to not be caught in self-diagnosis.
So these two systems are largely independent of each other? If so, that sounds like a good check, since it provides a way for the car to identify voltage measurement problems. It will be interesting to see if anyone ever gets that code.
Ingineer said:
There's nothing to say that somehow the chemistry hasn't been altered in the pack in such a way to alter the knee voltages though.
Perhaps. Temperature is also likely part of it and perhaps pack balancing plays a role here, too.

I have a couple of related questions about voltage measurements that perhaps you or someone else can answer:

1) How does the LEAF make measurements during charging? There are really three questions/concerns here:
a) There is resistance in the pack wiring, and the internal resistance of the cells also changes over their life. As a result, the resting voltage will be different from the charging voltage. Given the accuracy requirements on the measurement system and the potential impact on battery life, it would seem that measurements would need to be done after a resting period. Does the system use an estimate of all of these resistances in order to correct the charge termination voltage? If so, how much error does this approach introduce? Does charging accuracy degrade as the cells degrade, due to different rates of increase in the cell resistances? Is the four-hour delay between charge termination and cell balancing related to measuring these cell resistances, or is it designed to ensure all of the cells are nearly isothermal, or perhaps both? (See the sketch after this list for the kind of correction I mean.)
b) My understanding is that this battery is a top-balanced system and that the shunts are capable of about 1 A. If the pack charges at 8 A normally, the shunts will be quite limited unless the charging tapers off drastically. Does the charge terminate whenever the *highest* cell reaches 4.1 V, or does it target the *average* cell? Given that surfingslovak reports seeing only a small difference in charge-termination voltages, it seems like it may target the average. Do you know what the range of cell-pair voltages is at the end of a 100% charge (worst case, for a poorly-balanced pack)? How about at the end of an 80% charge?
c) Perhaps KtG's excellent suggestion that the pack is not sufficiently isothermal for proper charging is right on. And others have questioned Nissan's decision to place cells in multiple orientations. KtG believes the rear stack gets the hottest. I wonder if the top modules in the horizontal stacks get the hottest. The question is: How much temperature variation do you see between the four temperature probes during pack charging? Is it 2C or 20C? Something in between? Certainly having a long resting period before initiating charging gives the best possibility for an isothermal pack, at least at the beginning of charging.
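Regarding question a) above, here is the kind of correction I have in mind, with made-up numbers (this is not Nissan's algorithm, just the basic Ohm's-law issue):

[code]
# Correcting a reading taken under charge current back to an estimated resting
# voltage requires knowing the series resistance; any error in that estimate
# becomes a termination-voltage error, and it grows as the cells age.

def estimated_rest_voltage(measured_v, charge_current_a, series_resistance_ohm):
    # While charging, the terminals read high by I*R; subtract it out.
    return measured_v - charge_current_a * series_resistance_ohm

measured_v = 394.5                      # hypothetical pack reading while charging
for r_ohm in (0.10, 0.15, 0.20):        # assumed pack resistance, rising with age
    print(r_ohm, estimated_rest_voltage(measured_v, 8.0, r_ohm))
# A 0.05-ohm error at 8 A is already a 0.4 V error in the inferred resting voltage.
[/code]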

I must say that I am a big proponent of Nissan's decision to forego a TMS for the battery system. Clearly that is not workable for hot areas like Phoenix with the current generation of batteries, but I feel it is a good solution for future battery designs which incorporate heat-tolerant cells. But I must say that I am now wondering how often some cells end up sitting at or near 100% after an 80% charge due to pack imbalance and temperature variations. I am becoming more and more convinced that keeping the LEAF between about 25% and 60% SOC when not in use is a key to a long life.
 
drees said:
surfingslovak said:
For what it's worth, I believe that we saw higher voltages after turtle mode in degraded Leafs than I would have expected. It was around 320V from what I have seen. My Leaf would typically hit turtle around 308V, but I did not let the car rest and measure later. Has anyone done that with a relatively new Leaf?
That's interesting. 2 possible reasons I can think of:

1. Some modules are weaker than the others and the LEAF uses the voltage of the weakest module (or cell-pair?) to determine when the party ends.
2. With it being Arizona, the pack will be warmer. With a warm pack, voltage will sag less under load. Of course, the test was run in the morning during pretty normal temps that one can easily recreate in the Bay Area in the summer.
Personally, I think it is the first explanation. I see this as evidence that some of the cells in these degraded LEAFs are worse than others, perhaps significantly worse. I think it would be interesting to run the CELL VOLTAGE LOSS INSPECTION test in the EVB service manual and plot a histogram of the 96 cell-pair voltages to see what the distribution looks like for these degraded batteries. It might also be telling to see a chart showing module voltage from that test versus location in the pack. Hopefully we will be able to access such data using a LEAFscan.
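If someone can pull the 96 cell-pair readings off the car, the plotting part is trivial; a rough sketch (the data here is randomly generated, and nothing about the data source is real - actual values would come from Consult, LEAFscan or the CAN bus):

[code]
# Crude text histogram of 96 cell-pair voltages in 10 mV bins.
import collections
import random

cell_pair_mv = [random.gauss(4100, 15) for _ in range(96)]   # fake readings

bins = collections.Counter(int(mv // 10) * 10 for mv in cell_pair_mv)
for edge in sorted(bins):
    print(f"{edge}-{edge + 9} mV: {'#' * bins[edge]}")
[/code]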
 
The biggest problem with the Leaf's BMS (in my opinion) is the use of the Hall-effect current sensor. These are not very accurate for coulomb counting and are subject to accuracy-degrading effects, such as centerline drift, the effects of the earth's magnetic field, temperature, etc. The inaccuracy of this is why "some gids are more equal than others". Nissan compensates for this inaccuracy by applying corrections to the SoC, sampling voltage and using formulas that also take into account temperature, internal resistance, aging, etc. This is why you can sometimes gain/lose SoC suddenly after power cycling. It will apply changes all at once if the car is power cycled, but if in use, it will apply a correction in the form of a drift which appears as faster/slower SoC counting than the real energy out/in.
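In rough Python terms, the behavior looks something like this (all numbers invented, just to show the shape of it):

[code]
# Coulomb counting from a drifting Hall-effect sensor, periodically pulled back
# toward a voltage/temperature-based SOC estimate - as a step at power-up, or
# as a gradual drift while the car is in use.

def coulomb_count(soc, true_current_a, dt_h, capacity_ah, sensor_offset_a=0.5):
    measured_a = true_current_a + sensor_offset_a        # sensor reads high
    return soc - measured_a * dt_h / capacity_ah         # discharge counts down

def correct_soc(soc_counted, soc_from_voltage, at_power_cycle):
    if at_power_cycle:
        return soc_from_voltage                           # step change all at once
    return soc_counted + 0.1 * (soc_from_voltage - soc_counted)   # slow drift in use
[/code]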

I was able to meet with the Nissan engineers from Japan last December, including the battery system engineer (I had a one-on-one with him). Their explanation for why we have no direct SoC display in the car was basically that they were afraid to show it and have these corrections occasionally make it "jump", which would "confuse the customer". The battery systems engineer told me that cost was the reason they used the Hall-effect current sensor rather than a more accurate galvanic shunt.

It's looking like there is some degradation in these hot-climate packs, but it appears that the BMS (LBC) is not dealing with it properly, and is not only indicating incorrect loss figures, but also possibly not allowing for full use (charging) of the pack's real capacity.

Keep in mind, Nissan did a lot of testing, but the bulk of it is accelerated life tests, which attempt to simulate a much longer real-world use scenario. Unfortunately sometimes there is no substitute for real-world life testing, and it sounds like there are some unexpected results that the BMS software is not equipped to deal with.

Also remember that large automakers, especially Japanese ones, are very methodical about changing things, and it takes a long time to properly implement a fix. If that fix involves software in a critical system, (the LBC for example) it will take many hours of testing before they will even consider releasing it. I believe they will fix this, but it will be done on their terms which means it will take some time before we see a solution.

-Phil
 