Level 2 charger and rooftop solar

My Nissan Leaf Forum


Megunticook

Member
Joined
Aug 30, 2019
Messages
13
I bought a Leaf in 2019 and have been using the Nissan-supplied level 2 charger which is mounted in a shed and supplied with a 40A 240V dedicated circuit. Works great so far.

My house has a grid-tied rooftop solar system with battery backup. I've never tried charging when the grid is down, but I'm thinking I should get set up to make that possible. At solar noon I can produce as much as 6.4 kW (will be 7.2 after I install another string this spring). The battery bank is modest--pulling it down to 50% capacity would yield only about 10 kWh, so not nearly enough to charge the car fully. So I'm thinking I would just want to charge for a couple hours around the middle of the day.

My understanding is the Leaf will draw no more than 6.7 kW when charging, or about 28 amps at 240V. My system can handle that (inverter maxes out at 33 amps), but I'm wondering about getting a charger that lets you step the charging current down, which would probably work better for me when solar charging. The Grizzl-e seems like a great choice (16A, 24A, 32A, 40A). Anybody use this charger?
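For anyone double-checking the arithmetic above, here's a quick sketch using only the figures stated in the post (6.7 kW max draw, 240V, 33A inverter limit, 6.4 kW peak solar); the 16A scenario assumes an EVSE that can be stepped down:

```python
# Sanity check of the charging vs. solar numbers in the post.
VOLTS = 240
LEAF_MAX_KW = 6.7       # stated max Leaf draw
INVERTER_MAX_A = 33     # stated inverter limit
PEAK_SOLAR_KW = 6.4     # stated solar-noon output

leaf_max_amps = LEAF_MAX_KW * 1000 / VOLTS
print(f"Leaf max draw: {leaf_max_amps:.1f} A (inverter limit: {INVERTER_MAX_A} A)")

# A 16 A setting stays comfortably under the 6.4 kW solar peak:
draw_16a_kw = 16 * VOLTS / 1000
print(f"16 A draw: {draw_16a_kw:.2f} kW vs {PEAK_SOLAR_KW} kW peak solar")

# Four midday hours (roughly 10am-2pm) at 16 A:
print(f"Energy over that window: {draw_16a_kw * 4:.1f} kWh")
```

So a 16A midday window puts roughly 15 kWh through the wall, which lines up with the "charge for a couple hours around the middle of the day" plan.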

Anybody out there charging off a solar system that could offer any real-world advice?

I plan to continue using the grid to charge the car for the most part, I just don't want to be caught with my pants down when the grid goes black. And at some point, if we lose net metering, I may go completely off grid.

Thanks for any advice.
 
Given the modest need for charging, and the limited capacity of the system, I think that you should be looking at L-1 charging, with the existing EVSE. You could probably, depending on local and national codes, tap the existing circuit for a 120 volt outlet - IF you have a Neutral in place. Then you could just switch the EVSE plug between outlets, after slipping on the 120 volt adapter. If not, then how about a lower power L-2 charging station - say, 16-20 amps? Since this is essentially an emergency backup, I don't see a need for a 30A L-2.
 
Level 1 is so inefficient I think I won't even consider that for home.

A lower amp Level 2 is probably the way to go. The Grizzl-e can do 16 or 24 amps. I'm sure I could charge at 16 amps most sunny days between 10-2.
 
We have a ChargePoint EVSE which is adjustable, we have it set at the minimum amperage which is 16A. Runs on the inverter, but like you say the batteries will soon be depleted if the sun quits shining. I am always amazed at how much power it takes to drive a car compared to an entire household (with efficient utilities and lifestyle, which is critical for off-grid use).
 
Megunticook said:
Level 1 is so inefficient I think I won't even consider that for home.

A lower amp Level 2 is probably the way to go. The Grizzl-e can do 16 or 24 amps. I'm sure I could charge at 16 amps most sunny days between 10-2.

I use this for my overnight charging: https://www.amazon.com/BESENERGY-Current-Switchable-220V-240V-Portable-Compatible/dp/B07Y9WSBJS

10A @ 240V works out to about 2 kW net to the battery, so you keep the L2 charging efficiency while putting minimal load on your PV system. There's also a button to charge at 16A @ 240V, but I haven't used that setting; my goal was slower charging to allow for better cell balancing, since the overall charge rate is roughly 1/3 of the OEM L2 charge rate. Speaking of which, I also have the OEM EVSE hooked up for when I need the 'faster' L2 charge rate on occasion.
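The "about 2 kW net" figure checks out if you assume something like 85% wall-to-battery efficiency; a rough sketch (the 85% is my assumed round number, not a measurement from this setup):

```python
# Rough wall-to-battery figures for a 10 A @ 240 V EVSE.
# The ~85% charging efficiency is an assumed round number, not measured.
amps, volts = 10, 240
efficiency = 0.85

gross_kw = amps * volts / 1000   # power drawn from the wall
net_kw = gross_kw * efficiency   # power actually reaching the pack
print(f"gross: {gross_kw:.1f} kW, net: ~{net_kw:.2f} kW")
```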
 
HerdingElectrons said:
LeftieBiker said:
Heh. I've been charging on L-1 only for 8 years. Now there are two of us doing it, one an ePlus. But we're retired.

And you're consuming almost 10% more electricity by using L1 instead of L2 for 8 years...yikes

The old hydroelectric dam that produces it is 3/4 of a mile away, and if the math still bothers you, we drive - combined - less than 2k miles per year. When I think of the thousands and thousands of miles that other people drive in their grid-powered EVs, well, "yikes" is what I think. ;)
 
So about 500 kWh per year, which might cost as little as $50 annually, so L1 "wastage" is about $5. It would take a long time to offset the cost of installing an L2 EVSE...
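Spelling out the math behind that estimate (the ~4 mi/kWh figure is my assumption to connect the 2k miles to 500 kWh, and the $0.10/kWh rate is implied by the "$50 annually"):

```python
# Back-of-envelope cost of L1 "wastage" for a low-mileage driver.
# Assumptions: ~4 mi/kWh efficiency, $0.10/kWh, 10% extra L1 consumption
# (the penalty figure claimed earlier in this thread).
annual_kwh = 2000 / 4   # ~2k miles/yr -> ~500 kWh
rate = 0.10             # $/kWh
l1_penalty = 0.10       # extra consumption attributed to L1

annual_cost = annual_kwh * rate
wastage = annual_cost * l1_penalty
print(f"~{annual_kwh:.0f} kWh/yr -> ${annual_cost:.0f}/yr, L1 wastage ~${wastage:.0f}/yr")
```

At $5/year of wastage, even a cheap L2 EVSE install would take decades to pay back on efficiency alone.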
 
L1 is slower but what makes it inefficient? The same circuitry is used in the OBC which includes power factor correction.
 
nlspace said:
L1 is slower but what makes it inefficient? The same circuitry is used in the OBC which includes power factor correction.

Stray loads (cooling pumps and computers and such) and operating the charger well below designed peak efficiency operating point.

A charger is a DC to DC converter, and some power used in the charger is constant regardless of load current. So efficiency falls at lower power levels.
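The fixed-overhead point can be illustrated with a toy model (the 300 W overhead and 95% conversion efficiency are made-up illustrative values, not Leaf measurements):

```python
# Toy model: charger efficiency = useful_power / input_power, where a
# fixed overhead (pumps, electronics) is paid regardless of charge rate.
# The 300 W overhead and 95% conversion figures are illustrative guesses.

def efficiency(input_watts, overhead_w=300, conversion=0.95):
    useful = (input_watts - overhead_w) * conversion
    return useful / input_watts

for watts in (1400, 3800, 6600):  # roughly L1 12A, L2 16A, L2 27.5A
    print(f"{watts} W in -> {efficiency(watts):.0%} to the battery")
```

Whatever the real overhead number is, the shape is the same: the slower you charge, the larger the share of input power the overhead eats.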
 
LeftieBiker said:
HerdingElectrons said:
LeftieBiker said:
Heh. I've been charging on L-1 only for 8 years. Now there are two of us doing it, one an ePlus. But we're retired.

And you're consuming almost 10% more electricity by using L1 instead of L2 for 8 years...yikes

The old-hydroelectric dam that produces it is 3/4 of a mile away, and if the math still bothers you, we drive - combined - less than 2k miles per year. When I think of the thousands and thousands of miles that other people drive in their grid-powered EVs, well, "yikes" is what I think. ;)

Fair point & that makes your exclusive use of L1 practical. I drive 20k miles/year, so that 10% loss in conjunction with California's electrical rates makes L2 a minimum choice IMHO. Also, my ability to connect/wire both of my EVSEs myself makes the labor cost a non-issue for me.
 
nlspace said:
L1 is slower but what makes it inefficient? The same circuitry is used in the OBC which includes power factor correction.

Electrical 'pressure'--the lower 120V vs. the higher 240V--is where the efficiency difference comes from. The roughly 10% difference holds across any comparable electrical device that can operate at either voltage. I first learned about this reading computer power supply reviews conducted outside the US--I've been building & modifying my computers for 25 years.
 
WetEV said:
Stray loads (cooling pumps and computers and such) and operating the charger well below designed peak efficiency operating point.

A charger is a DC to DC converter, and some power used in the charger is constant regardless of load current. So efficiency falls at lower power levels.

I think you meant to say the onboard charger is an AC to DC converter, but it's not lower power levels causing the efficiency drop, it's the lower input voltage.

I bought a 240V 10A EVSE to charge my car at roughly 1/3 of the OEM 240V 28A rate, to give the individual cells more time to balance themselves while charging, but the overall power consumption from the wall is the same because the voltage is the same. In fact, ironically, it might be slightly less, because with the lower heat load in the charger the inverter cooling loop fan doesn't run the entire charging session--it tends to cycle periodically vs. running constantly. In the summer months that will likely not be true, but it's an interesting anecdotal observation.
 
nlspace said:
L1 is slower but what makes it inefficient? The same circuitry is used in the OBC which includes power factor correction.

I'm not sure of the specific reason but I know research has shown that a greater percentage of electricity is wasted when charging with L1.
 
Megunticook said:
nlspace said:
L1 is slower but what makes it inefficient? The same circuitry is used in the OBC which includes power factor correction.

I'm not sure of the specific reason but I know research has shown that a greater percentage of electricity is wasted when charging with L1.

The higher the voltage, the less resistance the wire puts up. The cooling pump, electronics, etc. are all using the same power, but given the same wire that the 120V or 240V will flow through, the 120V has more "waste" heat than the 240V going through the same wire. Granted, it's a small amount of waste heat, but little things add up.

So if (as an example) the power going in was 120V @ 12 amps instead of 240V @ 12 amps, the cable would get warm to the touch over time, as that is the waste heat you are feeling. The same line using 240V would feel right around ambient temperature after the same amount of time, because the 240V would have less resistance through the line. So it's a compounding effect: the 120V takes longer to charge and is also giving off more waste heat, and that adds up in the long run (hours or days) in the grand scheme of total power from the wall to the battery.
 
HerdingElectrons said:
WetEV said:
Stray loads (cooling pumps and computers and such) and operating the charger well below designed peak efficiency operating point.

A charger is a DC to DC converter, and some power used in the charger is constant regardless of load current. So efficiency falls at lower power levels.

I think you meant to say the onboard charger is an AC to DC converter, but it's not lower power levels causing the efficiency drop, it's the lower input voltage.

The onboard charger is converting AC to rectified DC, then that rectified DC to DC to match the voltage and control the current into the battery. Two step process.

Lower input voltage also reduces efficiency, in both the AC to DC and in the DC to DC sections.


DC to DC design is optimized with inductor and FET switch design. I've designed lower power units. The optimization is usually for the usual maximum voltage and current.


HerdingElectrons said:
but the overall power consumption from the wall is the same

More curiosity than anything. Measured how?
 
So if the (as an example), the power going in was 120V @ 12 amps instead of 240V @ 12 amps, the cable would get warm to the touch over time, as that is the waste heat you are feeling. The same line using 240V would feel right around the ambient temperature after the same amount of time because the 240V would have less resistance through the line.

12A is 12A and the resistance of the wire shouldn't change noticeably at 120V vs 240V. There may be some slight differences due to inductive changes since the power is AC but the resistance of the wire (per unit of length) varies mostly with the wire size and temperature. I've never heard of voltage being a factor.

I think you are referring to constant power instead of constant current. Comparing 120V at 12A vs 240V at 6A (for identical power transfer), the 240V case has a smaller transmission loss in the wire, since the same amount of power requires only half the current. The heat lost in the wire is mostly just I²R, so halving the current cuts the waste heat by a factor of 4.
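Putting that I²R point in numbers (the 0.1 Ω of wiring resistance is an arbitrary illustrative value):

```python
# I^2 * R wire losses for equal power delivered at 120 V vs 240 V.
# The 0.1 ohm wiring resistance is an arbitrary illustrative value.
R = 0.1  # ohms

loss_120 = 12**2 * R  # 120 V @ 12 A -> 1440 W delivered
loss_240 = 6**2 * R   # 240 V @ 6 A  -> same 1440 W delivered
print(f"120V/12A wire loss: {loss_120:.1f} W; 240V/6A: {loss_240:.1f} W")
print(f"ratio: {loss_120 / loss_240:.0f}x")
```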
 
I'm fairly sure that L-1 at 12A pulls 1300 watts as the input power. I'm also fairly sure, though, that the main loss is through the cooling fans/pumps running for twice as long.
 
Blarg... I got it all wrong in my head, amps is amps. :lol:
There is something about the voltage that affects the efficiency, so for example, trying to run 12 amps through a wire at 1V will produce more heat than trying to run the same 12 amps through a wire at 100V, but I can't remember why.... :?

 