Battery Upgrades are very possible

My Nissan Leaf Forum

DougWantsALeaf said:
I believe we have now seen at least 2 Kona fires, and a few Tesla ones.
And at least one Bolt fire, a Porsche Taycan fire, and an Audi e-tron fire. The e-tron fire was in Germany; the car went off the road at a high rate of speed. The driver was not injured, walked to the road, and caught a ride to town; the car caught fire shortly before the driver and police returned to it.

The issue is the cooling fluid is water (+ antifreeze and such). If the battery is damaged, water gets into the cells and starts a fire. These fires are generally slower and less violent than gasoline fires.

DougWantsALeaf said:
I have yet to hear about a Leaf pack blowing. Not saying that zero thermal management is a good idea. I wish it had it; I'm just saying Nissan Leafs have been extremely safe vehicles. On par with SUVs.
https://www.torquenews.com/1083/volkswagen-golf-and-nissan-leaf-defy-safety-trends-much-safer-other-small-cars
For some uses (and not others) I'd rather have the LEAF-style passively cooled pack. It is safer.

If your idea of driving is blasting across the desert at nowhere near a legal speed, the kind of driving where anyone watching looks for the legal disclaimer "Closed course. Professional driver. Do not attempt"... then the LEAF isn't for you.

If your idea of driving is a second car, very safe for around town and/or for commuting distances the LEAF can handle, then I'd recommend the LEAF. Assuming it meets your requirements other than the drivetrain, of course.

Horses for courses.
 
mux said:
As a very general rule of thumb: SOH = 80% when internal resistance at 25C is about 2x as high as a new battery.

My 2013 (11/22/13) Leaf resistance data:

11/20/14 - 13,700 miles, 76 mohms per LeafDD, 20 deg, 73% SOC
11/27 - 13,800 miles, 67 mohms per LeafDD, 25 deg, 63% SOC
11/30 - 13,900 miles, 56 mohms per LeafDD, 27 deg, 71% SOC
12/2 - 14,100 miles, 55 mohms per LeafDD, 28 deg, 67% SOC, 90% SOH
12/16 - 14,500 miles, 89 mohms per LeafDD, 15 deg, 93% SOC
12/27/14 - 14,800 miles, 103 mohms per LeafDD, 11 deg, 24% SOC
3/10 - 17,400 miles, 60 mohms per LeafDD, 30 deg, 73% SOC
3/14 - 17,550 miles, 56 mohms per LeafDD, 32 deg, 47% SOC, 85% SOH
4/14 - 19,100 miles, 59 mohms per LeafDD, 25 deg, 38% SOC
5/4 - 19,989 miles, 64 mohms per LeafDD, 24 deg, 48% SOC
5/15 - 20,400 miles, 73 mohms per LeafDD, 20 deg, 41% SOC
5/22 - 20,700 miles, 58 mohms per LeafDD, 28 deg, 50% SOC
12/10/15 - 28,000 miles, 90 mohms per LeafDD, 19 deg, 92% SOC
4/5 - 32,000 miles, 74 mohms per LeafDD, 24 deg, 55% SOC
5/16 - 33,700 miles, 89 mohms per LeafDD, 22 deg, 47% SOC
5/16 - 33,700 miles, 58 mohms per LeafDD, 31 deg, 76% SOC, 80% SOH
10/5 - 39,300 miles, 100 mohms per LeafDD, 22 deg, 50% SOC
10/6 - 39,400 miles, 61 mohms per LeafDD, 30 deg, 51% SOC
10/7 - 39,500 miles, 80 mohms per LeafDD, 25 deg, 56% SOC
10/15 - 40,000 miles, 71 mohms per LeafDD, 27 deg, 45% SOC
10/30 - 41,000 miles, 74 mohms per LeafDD, 23 deg, 66% SOC
12/26/16 - 43,000 miles, 110 mohms per LeafDD, 13 deg, 77% SOC
6/10/17 - 49,600 miles, 89 mohms per LeafDD, 19 deg, 70% SOC
7/1/17 - 51,000 miles, 62 mohms per LeafDD, 33 deg, 44% SOC
8/15/17 - 53,400 miles, 61 mohms per LeafDD, 35 deg, 57% SOC, 75% SOH
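These readings swing strongly with temperature, which makes the underlying trend hard to see. Purely as an illustration (this is not how LeafDD or the BMS computes anything, and the ~1.5%/°C coefficient is a made-up placeholder), one could normalize each reading to a common 25° before comparing across dates:

```python
# Rough temperature normalization of internal-resistance readings.
# The 1.5%/deg coefficient is a hypothetical illustration, NOT a
# measured property of the Leaf pack.
TEMP_COEFF = 0.015  # fractional IR change per degree below 25 (assumed)

def normalize_ir(ir_mohm, temp_c, ref_c=25.0):
    """Scale a measured IR to an estimated value at ref_c."""
    return ir_mohm / (1.0 + TEMP_COEFF * (ref_c - temp_c))

readings = [  # (date, miles, mohm, deg) taken from the log above
    ("11/20/14", 13700, 76, 20),
    ("12/27/14", 14800, 103, 11),
    ("8/15/17", 53400, 61, 35),
]
for date, miles, ir, t in readings:
    print(date, miles, round(normalize_ir(ir, t), 1), "mohm @25")
```

With any correction of this shape, the cold-weather spikes (e.g. the 103 mohm reading at 11 deg) compress toward the warm-weather values, leaving a much flatter series.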

mux said:
SOH scales roughly with IR^2, so you don't generally see exactly 20% capacity loss when the battery internally reports 80% SOH. And we use this to estimate SOH during driving.

Please explain.
 
Perhaps LeafDD is not as accurate as it could be. IR is also highly dependent on SOC and temperature.
 
coleafrado said:
Perhaps LeafDD is not as accurate as it could be.

Also checked & verified with LeafSpy. Besides, both app developers basically used the same BMS CAN-bus data.

coleafrado said:
IR is also highly dependent on SOC and temperature.

The data presented do corroborate temperature as a key independent variable, but show hardly any SOC effect unless you're past the voltage "knee". Most data indicate that Li-ion battery internal resistance changes with SOH, but to a lesser degree than in other battery technologies, e.g. lead-acid.
 
LeafDD does not calculate IR anywhere near correctly. You can't use that data. Neither can you use Leaf Spy. Both try to use short-term voltage drop and pack voltage, both of which are dominated by other effects. IR is not a straightforward computation in that way.

Internal resistance as calculated by e.g. LeafDD basically only measures what's commonly (definitely in Nissan's documentation) called the polarization characteristic of the pack - the short-term ability of the pack to provide current, typically on 30-60 second timescales. But batteries are chemical devices - they run on ion migration between the electrodes, and that ion migration is set up by a chemical potential inside the cell. In short bursts, almost all of the energy you extract from a li-ion cell is from ions inside or near the electrodes which can quickly migrate in and thus the cell appears to have a low internal resistance. As time goes on, the cell depletes this store of ions and has to draw ions from across the cell, and THAT regime is what we mean by internal resistance. Sort of. There's obviously even more to it. Here's a graph of how a typical current pulse train IR meter works and what you'd see:

[Figure: (a) HPPC test of a lithium-ion battery; (b) simplified lithium-ion battery ECM based on EIS]


The Leaf BMS has a - somewhat broken - algorithm for trying to extract all three of those resistance and time characteristic values from both driving and charging scenarios. These values are expressed in the engineering data in 0x7BB requests but - as we all know - aren't that straightforward to understand if you don't have a background in battery engineering.
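For concreteness, here is a toy version of the pulse analysis described above: the voltage response to a current step splits into an instantaneous ohmic jump (R0) and a slower polarization droop. The cell numbers are purely hypothetical, not Leaf values, and this is a sketch of the textbook HPPC idea, not the Leaf BMS algorithm:

```python
# Toy HPPC-style pulse analysis (illustrative values, not Leaf data).
# R0: instantaneous ohmic step; Rpol: the additional slow droop
# caused by ion depletion near the electrodes.

def pulse_resistances(v_rest, v_instant, v_end, i_pulse):
    """Split a discharge pulse into ohmic and polarization parts.

    v_rest    - open-circuit voltage before the pulse (V)
    v_instant - voltage right after current steps on (V)
    v_end     - voltage at the end of the pulse (V)
    i_pulse   - pulse current (A)
    """
    r0 = (v_rest - v_instant) / i_pulse    # instantaneous drop
    r_pol = (v_instant - v_end) / i_pulse  # slow droop over the pulse
    return r0, r_pol

# Hypothetical single cell: 4.00 V at rest, 3.95 V the instant a 50 A
# pulse starts, 3.90 V after 30 s of the pulse.
r0, r_pol = pulse_resistances(4.00, 3.95, 3.90, 50.0)
print(f"R0 = {r0*1000:.1f} mohm, Rpol = {r_pol*1000:.1f} mohm")
```

A naive ΔV/ΔI meter that samples at one arbitrary moment lumps these together, which is exactly the complaint about LeafDD-style numbers.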
 
mux said:
LeafDD does not calculate IR anywhere near correctly. You can't use that data. Neither can you use Leaf Spy. Both try to use short-term voltage drop and pack voltage, both of which are dominated by other effects. IR is not a straightforward computation in that way.

Internal resistance as calculated by e.g. LeafDD basically only measures what's commonly (definitely in Nissan's documentation) called the polarization characteristic of the pack - the short-term ability of the pack to provide current, typically on 30-60 second timescales. But batteries are chemical devices - they run on ion migration between the electrodes, and that ion migration is set up by a chemical potential inside the cell. In short bursts, almost all of the energy you extract from a li-ion cell is from ions inside or near the electrodes which can quickly migrate in and thus the cell appears to have a low internal resistance. As time goes on, the cell depletes this store of ions and has to draw ions from across the cell, and THAT regime is what we mean by internal resistance. Sort of. There's obviously even more to it. Here's a graph of how a typical current pulse train IR meter works and what you'd see:

The Leaf BMS has a - somewhat broken - algorithm for trying to extract all three of those resistance and time characteristic values from both driving and charging scenarios. These values are expressed in the engineering data in 0x7BB requests but - as we all know - aren't that straightforward to understand if you don't have a background in battery engineering.

Where are your actual longitudinal data, i.e. years of data rather than sequential charges/discharges for 10 hrs, supporting significant degradation of Li-ion battery conductance over time? That's what's key, right? You haven't provided any, only your theory. From any battery's functionality standpoint, measuring the no-load output voltage minus the full-load voltage, divided by the load current, is the typical/standard engineering approach. That approach has utility in meaningfully determining the power consumed by a battery as a function of its load in a given design application. Battery power consumption relates to the battery's efficiency and to the overall BEV's efficiency. Furthermore, when that approach is used on the Leaf, those data correlate with the LeafDD & LeafSpy data unrelated to BMS calculations/data. Where are your references to technical papers refuting that standard engineering approach for Li-ion batteries, or for that matter any battery in any application?
 
Do you need to defend something? Did you make LeafDD or something? That's an awfully strong response to a straight-up textbook explanation of how IR works. If you want an education in battery technology, take some relevant courses at a university, and don't attack the people trying to carefully explain concepts that you basically can't find anywhere outside of academic and highly specialized engineering fields.

And no, internal resistance is not volts divided by amps. This is an erroneous approach to engineering: taking a term at face value and reverse engineering its meaning. Internal resistance is something that acts LIKE resistance but is in fact made up of various different electrochemical effects, and the only way to properly measure it is by measuring its constituent parts. This is not trivial, because as discussed before a BMS can generally not do lab condition testing on its attached battery, so a bunch of very sophisticated algorithms are designed to try and isolate the effects we're interested in from the currents generated by real-world use of the battery. Even then, really estimating battery health accurately is still a challenge, something we see in the quirks of the Leaf's BMS for instance - but in other BMSes as well, like that of the Renault Zoe.

Now, you make another big mistake, and that is equating battery energy loss to dissipation in internal resistance. You'll find that the energy efficiency of a lithium ion cell is very poorly modeled this way. If we take a real-world example, the datasheet value for internal resistance of the 40kWh Leaf battery cells is approx. 1 mohm per cell. With 192 cells in 2P/96S configuration, that's 48 mohm for the entire pack. If we take this value and simulate a 40kW/50min DCFC session, over the course of that charging session, the energy dissipation in the stated datasheet value of the internal resistance of all the cells should be approx 500Wh (I^2R power is 110 * 110 * 0.048 ~ 580W), or 2MJ. At a pack weight of 300kg and 0.8kJ/kgK of thermal capacity, we'd expect 2MJ to heat up the pack by 2e3/(0.8*300)~8.3K. However, in real life we see the pack consume approx. 2kWh (close to 7MJ) extra and heat up by 25-30C in such a situation. Where did all that energy go if the internal resistance is so low - even if we measure it during charging?

TL;DR: what you're talking about is ESR, not internal resistance. ESR is not a relevant value for battery-health calculations, except in extreme situations like catastrophic cell failure, delamination, ultra-low SOH way beyond the service life of the cell even in stationary applications, etc.
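For anyone who wants to check the arithmetic in the DCFC example above, here it is as a sketch. All inputs are the ones stated in the post; note the post rounds the ~1.74 MJ result up to "2MJ", which is why it quotes 8.3 K where the unrounded figure is about 7.3 K (the conclusion is the same either way):

```python
# Reproduce the back-of-the-envelope DCFC check from the post above.
i_chg = 110.0        # A, approx. current for a ~40 kW CHAdeMO session
r_pack = 0.048       # ohm: 192 cells at ~1 mohm each in 2P/96S
t_chg_s = 50 * 60    # 50-minute charging session, in seconds

p_diss = i_chg**2 * r_pack            # W dissipated in "datasheet" IR
e_diss_mj = p_diss * t_chg_s / 1e6    # MJ over the whole session

mass_kg, c_p = 300.0, 0.8             # pack mass (kg), kJ/(kg*K)
dT = p_diss * t_chg_s / 1000 / (mass_kg * c_p)  # predicted rise, K

print(f"{p_diss:.0f} W dissipated, {e_diss_mj:.2f} MJ, dT ~ {dT:.1f} K")
# Observed in practice: ~2 kWh (~7 MJ) of extra energy and a 25-30 K
# rise, i.e. several times what the datasheet IR alone predicts.
```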
 
mux, you are a true asset to the EV community. Thanks!

And.....any suggestions for on-line university courses in Li battery tech? I'm a MSEE but I'm pretty sure Li batteries didn't exist when I was in school :eek: I'm interested in learning more about this stuff but it would have to be free or low-cost as I have no plans to enter the business.
 
mux said:
And no, internal resistance is not volts divided by amps. This is an erroneous approach to engineering: taking a term at face value and reverse engineering its meaning. Internal resistance is something that acts LIKE resistance but is in fact made up of various different electrochemical effects, and the only way to properly measure it is by measuring its constituent parts.

Still no references? Here are a few:

"Comparison of Several Methods for Determining the Internal Resistance of Lithium Ion Cells"

Using “Ohm’s Law”, the total effective resistance is subsequently calculated by dividing the change in voltage by the change in current. This common method is described in literature [3,9,10]. The internal resistance is in series with the voltage of the battery, causing an internal voltage (change) drop. With no current flow, the voltage drop at the internal resistance is zero, thus, the voltage at the output terminals is governed by open circuit voltage. If a load is applied to the battery (positive or negative during charge and discharge), the load resistance is in series with internal resistance (Ri) of the cell.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3247723/

Two methods are used to read the internal resistance of a battery: Direct current (DC) by measuring the voltage drop at a given current, and alternating current (AC), which takes reactance into account. When measuring a reactive device such as a battery, the resistance values vary greatly between the DC and AC test methods, but neither reading is right or wrong. The DC reading looks at pure resistance (R) and provides true results for a DC load such as a heating element. The AC reading includes reactive components and provides impedance (Z). Impedance provides realistic results on a digital load such as a mobile phone or an inductive motor.

https://batteryuniversity.com/learn/article/rising_internal_resistance
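The DC method in the passages quoted above is just Ohm's law applied to the loaded voltage sag. A minimal sketch of that textbook calculation, with hypothetical pack-level numbers (this shows the method as quoted; whether the result is a meaningful health metric is the point in dispute):

```python
# Textbook DC internal-resistance measurement: Ohm's law on the
# voltage sag under a known load step. Values are illustrative only.

def dc_internal_resistance(v_no_load, v_loaded, i_load):
    """Ri = (V_open - V_loaded) / I_load, the quoted DC method."""
    return (v_no_load - v_loaded) / i_load

# Hypothetical pack-level numbers: 390 V open circuit sagging to
# 384 V under a 100 A load.
ri = dc_internal_resistance(390.0, 384.0, 100.0)
print(f"Ri = {ri*1000:.0f} mohm")
```

As the Nature paper quoted further down notes, the value this yields depends heavily on the timescale of the measurement, which is the crux of the disagreement in this thread.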


mux said:
If we take a real-world example, the datasheet value for internal resistance of the 40kWh Leaf battery cells is approx. 1 mohm per cell. With 192 cells in 2P/96S configuration, that's 48 mohm for the entire pack. If we take this value and simulate a 40kW/50min DCFC session, over the course of that charging session, the energy dissipation in the stated datasheet value of the internal resistance of all the cells should be approx 500Wh (I^2R power is 110 * 110 * 0.048 ~ 580W), or 2MJ.

Your assumption is that no tapering occurs, so that a total battery energy loss of 580 * 50/60 ≈ 483 Wh occurs. So no BMS control - an unrealistic test setup.

mux said:
At a pack weight of 300kg and 0.8kJ/kgK of thermal capacity, we'd expect 2MJ to heat up the pack by 2e3/(0.8*300)~8.3K.

Based on your numbers.

mux said:
However, in real life we see the pack consume approx. 2kWh (close to 7MJ) extra and heat up by 25-30C in such a situation.

Your source for 0.8 kJ/kg·K? You haven't defined the battery's thermal resistance to the Leaf body, or at what ambient. Where are the actual, fully documented test results, e.g. 2000 Wh versus 480 Wh losses, and the total energy from the DCQC data printouts? Or is this your interpretation?

All that's provided is your interpretation of a single, arbitrary, uncontrolled/unscientific test. You still haven't provided any longitudinal internal-resistance data or referenced any papers supporting your views.
 
goldbrick said:
I'm a MSEE but I'm pretty sure Li batteries didn't exist when I was in school :eek: I'm interested in learning more about this stuff but it would have to be free or low-cost as I have no plans to enter the business.


Here's another reference;

A study of the influence of measurement timescale on internal resistance characterisation methodologies for lithium-ion cells


The pure Ohmic resistance R0 can be calculated from the falling edge of a voltage response pulse, from the instantaneous voltage drop when the current stops. The value of R0 can also be calculated from switching current, by measuring the instantaneous voltage change due to any current change. In principle, the R0 value calculated with these methods should be equal to that calculated from the rising edge of the pulse. However, due to the preceding current load, in the falling-edge scenario the electrode surface of the cell is highly polarised [30]. When the current load is switched off, the system equilibrates and the non-intercalated cations in the double layer diffuse back into the electrolyte bulk. The difference in Li-ion concentration at the electrode/electrolyte interface between the rising and falling edge of a pulse results in a small voltage difference and, consequently, in R0.

https://www.nature.com/articles/s41598-017-18424-5

Conclusion

From comparison of the results, for the first time it has been shown that it is not the non-linearity of the lithium-ion battery, as suggested in other studies, rather the timescale associated with the technique itself that influences measured internal resistance. If the timescales at which the measurements are taken can be reconciled, the resulting values of resistance are comparable across the techniques.

Again:

mux said:
However, in real life we see the pack consume approx. 2kWh (close to 7MJ) extra and heat up by 25-30C in such a situation. Where did all that energy go if the internal resistance is so low - even if we measure it during charging?

That implies a resistance of 165 mohms (2000/110^2), i.e. over 3X the measured value of the 40 kWh battery using an accepted methodology for determining battery resistance noted in research papers. The quoted statement is based on the unproven assumption of thermal capacity, and that the power dissipated will be about 2 kW versus 200 W - a circular analysis.
 
lorenfb said:
Your source for kj/kgK? You haven't defined the battery's thermal resistance to the Leaf body and at what ambient.

0.8 kJ/kg·K is pretty typical for lithium batteries; it's closer to the upper limit of thermal capacity for the cathode/anode/separator. Steel is closer to 0.5, so any additional modeling would just lower it (and increase the temperature rise). As for taper: if you can CHAdeMO a Leaf 40 to 100% in a single hour, your average power is going to be about 40 kW. Heat loss in the hour of charging is basically negligible, maybe 0.5-1 kelvin's worth. Any model can be criticized, and all models are "wrong," but some are useful (as is this one).

Disclosure: am physicist
 
lorenfb, you've kind of disqualified yourself from the discussion at this point. I've put you on my foe list so as to automatically ignore you. There is nothing worse than somebody who thinks they know something but obviously doesn't - and then becomes belligerent when you point it out. Obviously I'll unblock you if what I said is untrue or there was some misinterpretation along the way that caused this spat, but I've had enough for now.

There is also nothing wrong about being wrong. Honestly, battery engineering is a very opaque field and most misconceptions addressed here are a direct consequence of an information vacuum. Some of the biggest 'internet sources of information' on the subject are wrong or so incomplete as to be useless, but often taken as gospel, e.g. battery university and Straubel's claims. And you then see this echoing years later in e.g. the MNL threads about the engineering information given by the BMS (what's Hx? this was a mystery for 7 years - any BMS engineer could have pointed it out in seconds though).

But to get to the point: this lack of information on the wider internet does not mean this information is secret or unimportant. You need to know a fair chunk of this battery engineering stuff before you can make safe and durable batteries for electric cars. The devil really is in the details. This is obviously gatekeeping a little bit, but considering the risks and liability involved here I'd call that justified. This is the root of my criticism of RdS/EVBR. And obviously all this information is learnable, and academia is not the only place to do that.

Because it is actually fine to make mistakes and **** around a bit, within muxsan we have tons of aspects of our engineering that aren't nearly perfect yet. Many unimplemented features that I'd consider standard fare for any respectable battery manufacturer. This is just a fact of life when you have to balance business interests and engineering perfectionism, especially within a startup. But... that is engineering at heart: making tradeoffs. In order to make safe and appropriate tradeoffs, you need to know the design space - you need to be fully aware of the risks of every cut corner and weigh its pros and cons. For this, you need to have an accurate model of the world to work with and use your wording precisely.
 
mux said:
lorenfb, you've kind of disqualified yourself from the discussion at this point. I've put you on my foe list so as to automatically ignore you. There is nothing worse than somebody who thinks they know something but obviously doesn't - and then becomes belligerent when you point it out. Obviously I'll unblock you if what I said is untrue or there was some misinterpretation along the way that caused this spat, but I've had enough for now.

Typical: when one can't rationally present a plausible alternative argument, an ad hominem attack results. It's really unfortunate.
 
ginetto said:
Another custom battery for leaf... just to have more flame ;)

That looks like a bit more than a one-off custom! I can't understand what he's saying, but it looks like they've put a fair bit of engineering into it. Looks like 18650s inside all those compartmentalized packs.
Am I reading that correctly - is it 41 kWh?
 
dean said:
ginetto said:
Another custom battery for leaf... just to have more flame ;)

That looks like a bit more than a one-off custom! I can't understand what he's saying, but it looks like they've put a fair bit of engineering into it. Looks like 18650s inside all those compartmentalized packs.
Am I reading that correctly - is it 41 kWh?
Switch on CC and switch the translation to English. ;)
 
It looks like an interesting effort, and the mounting hardware looks well made, but it's a shame they didn't add cooling - cylindrical cells really hate heat. It's also unclear how they've bonded the cells mechanically and electrically.
 