I have been running several IoT nodes on solar-charged lithium cells (using the TP4056 charger) for years, but where more than a few hundred mA is needed, this ‘dumb’ charging did not perform well. So I decided to have a look at lead-acid battery systems, and got myself a DFR0580 “Solar Power Manager for 12V Lead-Acid Battery” (around USD 22) for a lighting project that will use 12V LED spotlights, about 3-5A max. The DFR0580 is built around the Chinese CN3767 chip, which has mixed reviews. To start, I’m testing it with a bench power supply; I will test with a solar panel later.
The graphs below show a discharge/charge/discharge cycle, very close to what we expect from the datasheet.
The first discharge phase ends at the over-discharge protection voltage of 10.8V. This means the charger can skip the TRICKLE charge phase, which is only necessary for deeply discharged batteries (the datasheet says it should trickle charge below 11.1V; maybe the battery just takes a few seconds to rebound above this level once the load is removed).
So the charging starts with CONSTANT CURRENT. My power supply is set to 18V 2.5A, and the load takes about 400mA, so around 2100mA is available for charging the battery. This continues until the voltage hits 14.5V (the datasheet says 14.8V cut-off voltage).
CONSTANT VOLTAGE charging at 14.5V (datasheet says 14.8V) while the current decreases. When the current reaches 38% of the 4A maximum charging current (which should be about 1.5A), it switches to float charge. I observe this switch at 1000mA battery current; maybe the board includes the load current, which would add up to about 1.5A.
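That explanation holds up numerically if the charger terminates on its own output current (battery plus load) rather than on battery current alone; a quick check of this assumption (the 38% fraction is from the CN3767 datasheet, the variable names are mine):

```python
MAX_CHARGE_A = 4.0
FLOAT_SWITCH_FRACTION = 0.38  # CV-to-float threshold per the CN3767 datasheet

# current at which the chip should switch to float charge
threshold_a = MAX_CHARGE_A * FLOAT_SWITCH_FRACTION  # 1.52 A

# observed at switchover: 1000 mA into the battery plus ~450 mA load
observed_a = (1000 + 450) / 1000  # 1.45 A, close to the 1.52 A threshold

print(threshold_a, observed_a)
```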
FLOAT CHARGE at 13.5V (datasheet says 13.55V) while the current decreases from about 210mA to about 40mA over 3 hours. We can assume the battery is fully charged at this point, as it is drawing less than 3% of its Ah rating, which would be 100mA for this battery. According to the current sensor, 2770mAh went into the battery. Note that the green ‘DONE’ LED is on from the start of the float phase.
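The mAh totals come from integrating the INA219 current samples over time. A minimal coulomb-counting sketch of what the logger does (the function name and the fixed sample interval are my own assumptions):

```python
def coulomb_count(samples_ma, interval_s):
    """Integrate current samples (in mA), taken every interval_s seconds,
    into a total charge in mAh."""
    return sum(samples_ma) * interval_s / 3600.0

# e.g. one hour of 1-second samples at a steady 2100 mA
# adds exactly 2100 mAh to the running total:
print(coulomb_count([2100.0] * 3600, 1.0))  # → 2100.0
```

In practice the logger keeps a running sum and flips the sign when the battery switches from charging to discharging.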
At this point, I disconnect the bench power supply from “SOLAR IN” and the battery starts discharging into a constant load of 30 Ohm, drawing around 450mA. The battery voltage starts around 12.7V and falls linearly to about 11.5V, then drops faster to the 10.8V cut-off. According to the current sensor, 2430mAh was drawn from the battery, which would mean:
- a ROUND TRIP EFFICIENCY of 88% (2770mAh in, 2430mAh out)
- an EFFECTIVE CAPACITY of 74% of the nominal 3.3Ah
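Both figures are plain ratios of the measured mAh values:

```python
mah_in = 2770.0       # charged into the battery, per the current sensor
mah_out = 2430.0      # drawn back out until the 10.8V cut-off
nominal_mah = 3300.0  # 3.3Ah rated capacity

round_trip_eff = mah_out / mah_in           # energy in vs energy out
effective_capacity = mah_out / nominal_mah  # usable fraction of the rating

print(f"{round_trip_eff:.0%}, {effective_capacity:.0%}")  # → 88%, 74%
```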
We are charging at 0.64C, double the recommended 0.3C, but the voltages used by the charger seem to be within the recommendations for fast charging.
My testing setup:
- a 12V3.3Ah/20HR SLA battery on “BAT IN”, via an INA219 current sensing module
- a constant load of 30 Ohms on the 12V output of the DFR0580 “OUT2” (3 resistors in series: 5W5R, 5W5R, 5W20R, drawing around 450mA)
- “SOLAR IN” connected to my bench power supply, set to 18V 2.5A
- an ESP32 microcontroller for data logging over wifi, powered by “USB2”
- MPPT switch OFF
Note: the INA219 uses a 0.1R current-sensing resistor, which causes a small voltage drop across the sensor. In the voltage graph above, the green line is the charger side (Vin+) and the blue line the battery side (Vin-), with a difference of about 0.2V when 2A is flowing. This may slightly influence the charger’s behaviour, because it does not see the exact battery voltage at higher charge/discharge currents.
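The INA219 measures current as the voltage across that 0.1R shunt, in 10 µV steps per the datasheet. A driver-agnostic sketch of the conversion the ESP32 logger performs (the function name is mine; a real setup would read the raw register over I2C):

```python
SHUNT_OHMS = 0.1      # shunt resistor on the INA219 module
SHUNT_LSB_V = 10e-6   # 10 µV per bit in the shunt voltage register

def shunt_raw_to_ma(raw):
    """Convert a signed raw shunt-voltage register value to current in mA."""
    v_shunt = raw * SHUNT_LSB_V          # volts dropped across the shunt
    return v_shunt / SHUNT_OHMS * 1000   # I = V / R, scaled to mA

# a raw reading of 21000 means 0.21V across the shunt,
# i.e. the ~0.2V drop mentioned above at roughly 2A:
print(shunt_raw_to_ma(21000))  # → 2100.0
```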
Running the same experiment with a lower available charging current (0.17C, well below the usually recommended 0.3C) gives a different result. With the PSU set to 18V 1A and the same constant load of 30 Ohm, we get CONSTANT CURRENT charging at around 550mA while the voltage rises from around 12.4V to 13.8V, but then for some reason it skips the CONSTANT VOLTAGE phase and goes straight into FLOAT charging at 13.5V. This starts at 285mA (higher than the 210mA above) and decreases more slowly than above, because the battery is obviously less charged; after 3 hours of float it is still drawing 80mA (vs 40mA above). I let it decrease to 50mA, 5 hours of float charge in total. At this point, 2500mAh had gone into the battery.
Again I disconnect the bench power supply, and the battery starts discharging into the 30 Ohm load at around 450mA, same as above. The voltage starts at 12.8V, and according to the current sensor, 2350mAh was drawn from the battery, which would mean:
- a ROUND TRIP EFFICIENCY of 94% (2500mAh in, 2350mAh out)
- an EFFECTIVE CAPACITY of 71% of the nominal 3.3Ah
So it appears that with a lower charging current (and the CV phase skipped), the battery charges more efficiently (94% vs 88% round-trip efficiency), but ends up with a slightly lower effective capacity (71% vs 74% of nominal).
Next step is testing with an 18V 20W solar panel.