
Solar System Battery Discharge Limit

Aussie Susan

I have a solar panel and battery system at home where the batteries store 10kW.
The way it is set up, the inverter will discharge the batteries (at night etc.) down to 40% and will then start drawing from the grid.
During summer (especially when we can get usable charge to 1900 or a bit after) this 40% is almost never a problem and we live on the panels and battery nearly all the time.
However we are now approaching winter in my part of the world and sometimes (very cloudy and/or rainy days) we don't get a full 100% charge during the day. Also the panels stop charging around 1600 (or earlier), just before we start using the heavy power-using items (induction stove top, microwave oven and occasionally the wall oven). Recently we have reached the 40% limit by 1900 or 2000, just when our power usage drops right off. (The house idles at about 200-300W during the night.)
I can monitor the system (a GoodWe inverter) using my Home Assistant system, which also gives me a little control over the operating parameters.
It might be my thrifty background, but I really like not drawing from the grid when I don't have to.

Finally my question: one of the parameters that I can alter is the 'battery discharge limit' that is set to 60% (hence the 40% remaining when it switches to the grid). What impact would this have on the battery life if I were to set that to 70% or 80% to get a bit more out of the power that has been generated during the day? (I'm thinking of doing this during the winter months and then returning to the 60% for the rest of the year.)


10kW? or 10kWh?

What does 1900 mean? Is it Wh?

To your question:
* you say you don´t get 100% fill ...
* but then you say "to get a bit more out of the power" (which should be "energy")

So let´s imagine you fill it from 40% to 80%. (charged 40%)
* Now you discharge it to 20% ... so obviously you get "a bit more energy" (60% usable)
* but the next day you start at 20% ... and it should be obvious then to get it charged up to 60% only. (40% charged)
* so the following night you again discharge it from 60% down to 20% ... (40% usable)
(ignoring battery efficiency)

Indeed, in the long term you win nothing. In this case the only option is to install more solar cells.

This is different when the battery over the day is charged up to 100%.
Then you can use down to 20% (instead of 40%) --> usable energy rises from 60% to 80%.
Here the "optimize" option is to install bigger batteries ... enough to power the house for the whole night.

--> you can´t draw more energy than you charge
--> it makes little sense to install huge batteries when you don´t need the energy
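KlausST's day-by-day arithmetic can be checked with a short simulation (my own sketch; the 4 kWh daily harvest and 6 kWh nightly demand are assumed illustration figures, not from this thread):

```python
# Sketch: simulate a 10 kWh battery over identical cloudy winter days to show
# why lowering the discharge floor only pays when the battery actually fills.

def simulate(days, capacity_kwh, daily_charge_kwh, nightly_use_kwh, floor_pct):
    soc = capacity_kwh * 0.40          # start at the old 40% floor
    drawn_from_battery = 0.0
    for _ in range(days):
        soc = min(capacity_kwh, soc + daily_charge_kwh)   # daytime charging
        floor = capacity_kwh * floor_pct / 100.0
        usable = max(0.0, min(nightly_use_kwh, soc - floor))
        soc -= usable                   # nighttime discharge down to the floor
        drawn_from_battery += usable
    return drawn_from_battery

# Cloudy winter: only 4 kWh harvested per day, 6 kWh wanted each night.
print(simulate(10, 10, 4.0, 6.0, floor_pct=40))  # 40.0 kWh over 10 days
print(simulate(10, 10, 4.0, 6.0, floor_pct=20))  # 42.0 kWh: only 2 kWh more, once
```

The lower floor yields extra energy only on the first night; after that the limited daily harvest, not the floor, caps what can be drawn.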

Aging: the battery manufacturer should be able to give reliable information.

I'm no expert in this topic, but I wonder if there is an overall advantage in setting a lower battery 'charge state' limit as long as it doesn't go below, say, 10%. Battery manufacturers give recommended lower limits for prolonged battery life, but in a situation where the batteries are charged and discharged daily and only have, say, a 1,000-cycle life expectancy (~3 years) anyway, do you get more 'bang for your buck' by pushing them hard, or by being kind to them and running up grid costs instead?

I have a similar but not identical situation here: two PV arrays at 90 degrees to each other, one (1.8kW max) facing the midday sun and the other (600W) facing the evening sun. For people living in Wales the concept of sunshine may be unfamiliar; if only we could make 'rain panels', we could supply the world's electricity demands!
I also consume about 300W overnight, but daytime use varies wildly between 300W and 10kW. Sadly I can't export electricity to offset costs, but I can make best use of what I generate. Our big difference is that I have no batteries, so I'm in a use-it-or-lose-it situation. However, I closely monitor and record my generated and consumed power by measuring both every 5 seconds, using an algorithm to calculate present costs, accumulated monthly cost and a prediction of the end-of-month bill. As the monitoring is distributed over a fairly large area, it is all WiFi linked and uses MQTT to carry the data to the recording system.
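The cost-accumulation part of a scheme like this is simple to sketch (my own illustration; the 5-second sample interval matches the post, but the tariff value and function names are assumptions):

```python
# Sketch: accumulate grid cost from 5-second power samples and linearly
# extrapolate the month-to-date total into an end-of-month bill prediction.

SAMPLE_S = 5                      # one power reading every 5 seconds
TARIFF = 0.30                     # assumed grid price per kWh

def sample_cost(grid_watts):
    """Cost of one 5-second sample drawn from the grid (export ignored)."""
    kwh = max(0.0, grid_watts) * SAMPLE_S / 3600.0 / 1000.0
    return kwh * TARIFF

def predict_bill(cost_so_far, day_of_month, days_in_month):
    """Naive end-of-month prediction from the month-to-date total."""
    return cost_so_far * days_in_month / day_of_month

# One hour of a steady 300 W load = 720 samples = 0.3 kWh billed at the tariff.
month_cost = sum(sample_cost(w) for w in [300.0] * (12 * 60))
print(round(month_cost, 4))       # about 0.09 for that hour
print(round(predict_bill(month_cost, 10, 30), 4))
```

A real system would feed `sample_cost` from the MQTT subscriber instead of a fixed list, and a smarter predictor could weight recent days more heavily.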

I think what we would both ideally have is a see-saw arrangement where PV and grid proportionately supply demand to give the best economy. Willing to share technology here if it's of any use to you.

@KlausST - Sorry - typo on my part: 10kWh
1900 - some folks call that 7PM.
Perhaps I was unclear while also providing so much unnecessary detail. Over the last month the batteries have failed to reach 100% during the day on only 6 occasions - 4 if you allow 95% as being 'full'. On the vast majority of days I get to 100% no later than about midday and therefore start exporting to the grid for at least a few hours after that. Therefore what you are saying about charging falls more into the second part of your comments.

@betwixt - I have 20 panels facing north that give me a peak of about 5.5kW (in summer I must admit - but over the last month the peak has been over 4kW - and I know it is the integral that counts!!!).
Your comment about going below 10% is in line with other things I've heard, but most of the information I can find (thanks Google) is about phone (and similar) batteries and whether you should avoid going below 20% or over 80%, etc. Therefore I came looking here for people with real knowledge of the best way to manage solar batteries.
It is difficult to determine the cost impact from your data alone, given no battery environmental conditions, age, or specs.

Aging rate depends on time spent outside the optimum %SoC range, which varies with chemistry type, and on total cumulative |kWh| transferred. Antagonists include internal power loss (I^2 * ESR), junction temperature in summer (or integrated temperature rise), outgassing, and sulphation at low SoC (< 30%), which raises ESR. The net capacity changes with min/max %SoC, and the usage/aging time is also affected.
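The internal power loss mentioned above is plain Ohm's-law arithmetic; a quick sketch (the 50 A current and 20 mOhm ESR are assumed example values, not from this thread):

```python
# Sketch: internal heating I^2 * ESR inside a battery string.
def i2r_loss_w(current_a, esr_ohm):
    """Heat dissipated inside the battery at a given current."""
    return current_a ** 2 * esr_ohm

# An assumed 50 A discharge through an assumed 20 mOhm string:
print(i2r_loss_w(50, 0.02))   # about 50 W of internal heating
```

Because the loss grows with the square of the current, heavy loads like an induction stove stress and heat the battery far more than the overnight idle load does.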

Lead acid: optimal range 30 to 80% (perhaps up to 90% for deep-discharge types).
Lithium types (e.g. LiFePO4): optimally 20 to 80%.
Flow batteries (vanadium redox, zinc-bromine): 0 to 100% SoC.

My recommendations:
  1. Change the SoC thresholds for min/max to what Battery University or the OEM suggests; don't just take my word for it. For lead acid, 100% may be too high because of outgassing, but it is also used to normalize cell imbalance. If you can sell surplus power, consider a 90% maximum.
    Pulse-charge desulphation devices are add-ons that promote low ESR. For lithium, the point where charging changes from CC to CV is often when the cell ages fastest, with voltage above nominal Vf.
  2. Converting the stove/oven to gas is a very cost-effective option (1~2 yr payback).
  3. Assuming you have time-sensitive grid rates, adjust capacity and the %SoC range to cover the prime daytime rates.
  4. Lead-acid batteries are expected to last at least 5 years to 80% capacity, but they are often over-used and capacity can be < 50% after 5 years.
  5. Get a battery load tester to record battery aging monthly or bi-annually, and consider a BMS that tracks lifetime kWh charge transfer vs. capacity.
"1900 - some folks call that 7PM."
Oh, it means time. I´m used to 19:00; I have never seen it without the dots.

So the situation to focus on is when the batteries are fully charged.
In post #4 you wrote (at least that´s what I understand) that in this case you sell the energy during the daytime ... and eventually have to re-buy it at night.
Financially in worst case we talk about: (buy_price - sell_price) * (40% - 20%) * 10kWh = (buy_price - sell_price) * 2kWh ... per day.

Now you say it is not fully charged each day ... and maybe it is also not fully discharged (to the set level) each night. Thus the above number need not be multiplied by 365 for a whole year; it will be less.

Now you know what you can expect to "win".
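Plugging assumed numbers into the worst-case formula above makes the stakes concrete (the tariffs here are my own illustration values, not from this thread):

```python
# Sketch: worst-case daily cost of selling energy by day and re-buying it at
# night when the discharge floor is lowered from 40% to 20% of a 10 kWh bank.
CAPACITY_KWH = 10.0
buy_price, sell_price = 0.35, 0.08     # assumed prices per kWh

extra_kwh = (0.40 - 0.20) * CAPACITY_KWH        # 2 kWh shifted per day
daily_cost = (buy_price - sell_price) * extra_kwh
print(round(daily_cost, 2))                     # 0.54 per day, worst case
```

Even in the worst case the saving is modest, which is why the unknown cost of extra battery wear dominates the decision.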
Still unknown is how much you lose because of reduced battery lifetime.

You could improve the situation by adding batteries.

If it was my situation, I´d surely consider what happens on power outage in the early morning. How long can you "survive" with the remaining energy without grid?
In this "extreme" case you could set the discharge SoC limit to 10%
In one case you have (40% - 10%) * 10kWh = 3kWh;
in the other case you have (20% - 10%) * 10kWh = 1kWh ... so only a third
(or half, 2kWh vs 4kWh, if you can discharge all the way to 0%).
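The same outage arithmetic, as a sketch (the 10% emergency floor and the ~300 W idle load come from the thread; the function name is my own):

```python
# Sketch: energy left for an early-morning outage above an assumed 10%
# emergency floor, and how long the ~300 W house idle load could run on it.
CAPACITY_KWH = 10.0

def reserve_kwh(floor_pct, emergency_floor_pct=10):
    """kWh available between the nightly floor and the emergency floor."""
    return (floor_pct - emergency_floor_pct) / 100.0 * CAPACITY_KWH

for floor in (40, 20):
    kwh = reserve_kwh(floor)
    print(floor, kwh, kwh / 0.3)   # hours of 300 W idle load per reserve
```

At the 40% floor the house idles for roughly 10 hours on the reserve; at 20% only about 3 hours.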

One surely would try to reduce the consumption to a minimum: refrigerators, heating control, pumps, lights ...

A user once bought some semi-trailer batteries and kept them fully charged by solar panels so they could provide energy during power failures, and was disappointed when they failed after 3 years despite hardly ever being used.

Another user had similar batteries that sat for many days at a low state of charge during the winter, trying to extend the usable capacity, and found that aging was accelerated and life expectancy reduced.

Another user in a hot climate found their car batteries had to be replaced every 2 years.

What all of these cases demonstrate are the non-ideal states for self-discharge, electrode oxidation, thermal failure-rate stress from Arrhenius effects (~50% MTBF loss for every 10 °C rise) and rapid acceleration of cell capacity (Ah) mismatch towards battery death, where the weakest cell becomes over- and under-charged first.
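The Arrhenius rule of thumb quoted above can be sketched directly (the 25 °C reference temperature is my assumption):

```python
# Sketch: roughly 50% MTBF loss for every 10 degC rise above a reference temp.
def mtbf_factor(temp_c, ref_c=25.0):
    """Relative MTBF versus the value at the reference temperature."""
    return 0.5 ** ((temp_c - ref_c) / 10.0)

print(mtbf_factor(35))   # 0.5  -> half the expected life at +10 degC
print(mtbf_factor(45))   # 0.25 -> a quarter at +20 degC
```

This is why the hot-climate car batteries above lasted only 2 years, and why battery placement and ventilation matter as much as charge management.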

All battery banks with more than one cell and no BMS equalizers will fail with the symptom of one dead cell.
The ideal case is that all cells expire equally at the same time. This cell-death syndrome is due to a runaway effect of cell capacity and voltage mismatch, where the mismatch can accelerate with every cycle. Equalizing flooded cells should normalize the cell voltages, but at the expense of some electrolyte loss. In lead-acid batteries the state of each cell can be measured directly by specific gravity (s.g.), which reflects the strength of the acid and the stored chemical energy.

While overvoltage causes some degradation through contamination, outgassing and loss of electrolyte, self-discharge is also accelerated by contaminants, which is why higher-quality purification reduces the antimony content. Car batteries are floated at overvoltage (2.37V/cell) by the alternator, are only used for a few seconds to start the engine, and still transfer a relatively small lifetime total of Wh compared to one full cycle of capacity; thus they are designed for lower ESR at the expense of self-discharge from contaminants.

To mitigate the negative effects of antimony, some modern lead-acid batteries use low-antimony or antimony-free alloys, such as calcium or silver-calcium alloys. These alternatives aim to provide the mechanical benefits without the drawbacks associated with antimony. Telephone-exchange 200kg 2V cells were often made this way, with silver-calcium plates, for long life, consequently making excellent solar and UPS cells; if you can find any surplus ones, prepare to get a forklift or pallet jack.
In summary, while antimony can improve the strength and cycling performance of lead-acid batteries, it can also lead to higher self-discharge rates and other maintenance challenges, potentially affecting the overall lifespan of the battery.
I was once Engineering Manager at CMAC, a company that produced many clients' products, including Solartech's old product. This anti-sulfation pulse generator, a little box, used only 25W; when alternator voltage was detected it generated ultrasonic, fast-risetime load and flyback pulses across the battery. I tested a large Air Canada tractor battery that suffered from poor capacity, and after one week of use the device restored perfectly balanced s.g. and full kWh capacity. With molecular imaging it has been shown to prevent and reduce short-term oxidation of electrode crystals on all battery types. On lead-acid plates these insulating crystals are called lead sulfation; they form on the metallic plates, both raising the ESR and reducing the Ah capacity of the cell by forming a series capacitance. In lead-acid batteries this is verified by the strength of the acid, via the height of floating balls in a specific-gravity (s.g.) test tool. This is why I suggested you buy this inexpensive tool to record the health of your battery bank. It is also why BMS controllers are so essential in lithium battery systems, where high currents also accelerate cell voltage mismatch.

The thermal effects of self-discharge are fairly well known, and 15 °C is a generic ideal median temperature to minimize both the loss in capacity from self-discharge and the rise in ESR. (This can be verified by measuring the ratio of cold cranking amps, CCA, to room-temperature cranking amps, CA.) Locating the batteries in a root cellar might not seem practical, but your batteries might prefer it.

You can increase the range of your crossover State of Charge only if you can keep the battery mismatch error near 0.1% and prevent extended periods at a low state of charge. These are conflicting tradeoffs if your capacity is already degraded this winter. Consider how to restore battery health if you suspect any loss of capacity since new, but minimize the time below 40% SoC, where the sulfation rate increases. So get a cost-effective pulse desulfator to slow down and reverse cumulative sulfation, and use cheaper grid power to keep the capacity above the 40% SoC voltage per cell; this may depend on the next day's solar outlook, to minimize your use of premium-time grid rates. If your battery cell voltage mismatch is more than 2%, that is an early sign of accelerated aging or short life expectancy. This is why, when replacing cells in a bank, they must be matched, or a BMS must be used to get the maximum usage.

Typical Voltage Mismatch Error

  1. New Batteries:
    • For new, well-matched cells in a lead-acid battery, the voltage mismatch error is usually very small, typically within 10-20 millivolts (mV) per cell.
  2. Aging Batteries:
    • As batteries age, the cells can become less balanced due to variations in internal resistance, capacity loss, and other degradation mechanisms. In such cases, the voltage mismatch can increase to 50-100 mV per cell or more.
  3. Severely Degraded Batteries:
    • In older or poorly maintained batteries, the voltage mismatch error can be significantly higher, potentially reaching 100-200 mV per cell or even more.
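The thresholds in the list above can be turned into a simple health check (my own sketch; the function names and the sample cell voltages are assumptions):

```python
# Sketch: flag a lead-acid bank's health from the spread of per-cell voltages,
# using the rough mV thresholds listed above.

def mismatch_mv(cell_volts):
    """Spread between the highest and lowest cell, in millivolts."""
    return (max(cell_volts) - min(cell_volts)) * 1000.0

def classify(cell_volts):
    spread = mismatch_mv(cell_volts)
    if spread <= 20:
        return "well matched"
    if spread <= 100:
        return "aging"
    return "severely degraded"

print(classify([2.12, 2.11, 2.12, 2.11]))  # ~10 mV spread: well matched
print(classify([2.15, 2.09, 2.12, 2.10]))  # ~60 mV spread: aging
```

Logging this spread each month, alongside specific-gravity readings, gives an early warning long before the weakest cell kills the bank.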

Factors Influencing Voltage Mismatch

  1. State of Charge (SoC):
    • When cells are at different states of charge, their voltages will differ. A well-balanced battery should have cells at the same SoC.
  2. Temperature:
    • Temperature differences across cells can cause voltage mismatches because the voltage of a lead-acid cell is temperature-dependent.
  3. Internal Resistance:
    • Variations in internal resistance due to manufacturing differences or aging can cause cells to have different voltage drops under load.
  4. Electrolyte Levels:
    • Uneven electrolyte levels can also lead to voltage mismatches, as cells with lower electrolyte levels may exhibit different voltages.

Addressing Voltage Mismatch

  • Balancing: Regular balancing of the cells, either through equalization charging (a controlled overcharge) or using external balancing circuits, can help minimize voltage mismatches.
  • Maintenance: Proper maintenance, including ensuring correct electrolyte levels and avoiding deep discharges, can help maintain cell balance.
  • Temperature Control: Ensuring uniform temperature across the battery pack can help reduce voltage mismatches.
In practice, maintaining voltage mismatches within 10-50 mV per cell is typically considered acceptable for most applications. However, for critical applications, tighter tolerances may be required. Regular monitoring and maintenance are key to minimizing voltage mismatches and ensuring the longevity and reliability of lead-acid batteries.
