
Charge management for standby AGM batteries


mtwieg

I have a client with a load which needs to draw high-current pulses from a 24V rail: max current 40A, max pulse duration 20ms, minimum pulse spacing 100ms. Over long timescales the load will be active arbitrarily; maybe a minute every hour, or maybe continuously for up to a few hours. The average load power over very long timespans will be <100W.

The power source for this load is only capable of supplying up to 100W (let's assume it's a 48V DC supply, Iout=2A). My task is to make a load-leveling system so the 100W source can meet the demands of the load. My intent is to use an AGM battery (24V, around 20Ah) for energy storage (supercaps are not feasible). A very simple method of implementing this is to use a stepdown converter which turns the 48V/2A supply into 27V/3.5A (the standby/float voltage of the battery). The battery then acts like a giant decoupling capacitor for the 27V rail, which connects directly to the load (27V is close enough to 24V).
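
For concreteness, a quick sanity check of those pulse numbers (just a sketch in Python; it reads "minimum pulse spacing" as one pulse per 100ms window in the worst case):

```python
# Sanity check of the pulse load numbers (assumes the nominal 24V rail;
# worst case taken as one 40A/20ms pulse per 100ms window).
V_RAIL = 24.0       # V, nominal load rail
I_PULSE = 40.0      # A, max pulse current
T_PULSE = 20e-3     # s, max pulse duration
T_WINDOW = 100e-3   # s, minimum pulse spacing

p_peak = V_RAIL * I_PULSE      # 960 W during a pulse
e_pulse = p_peak * T_PULSE     # 19.2 J per pulse
p_burst = e_pulse / T_WINDOW   # 192 W averaged over a dense burst

print(p_peak, e_pulse, p_burst)  # 960.0 W, 19.2 J, 192.0 W (vs ~100 W source)
```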

However, in practice there's a flaw in this approach. When the average Pload exceeds 100W, the charger can't keep up and the battery gradually discharges. If the battery is healthy, it will carry the load until demand decreases. But even so, the battery charges back pretty slowly, because the charger only applies the float voltage (not a bulk charge voltage, typically >28V). During these times, the charger outputs much less than its 3.5A limit. The end result is that the battery spends most of its time with SoC <70%, which severely degrades its lifespan over time.

I know the available power is enough to keep the battery's SoC much higher, but the only way to make use of that is to increase the applied voltage. But if I go too high for too long, I'll harm the battery by overcharging/outgassing/corrosion. Typically this is avoided with a simple three-stage charging algorithm, with stage transitions being governed by current taper, or a timer on the absorption stage. But in my application, my current may never taper, and the load won't stay idle long enough for a timer to expire, so I'd never actually get to the float stage.
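
To make that concrete, here's a minimal sketch of the usual three-stage logic. The setpoints are hypothetical placeholders, not values from this design; the point is just that under this load the absorption-to-float exit never fires:

```python
# Minimal three-stage (bulk/absorption/float) charger logic, sketch only.
# All thresholds are hypothetical placeholders. With the pulsed load
# sharing the rail, i_batt never tapers below I_TAPER, and (per the above)
# the load never stays idle long enough for the timer exit to matter.
V_ABSORB = 28.8          # V, absorption setpoint (placeholder)
V_FLOAT = 27.0           # V, float setpoint (placeholder)
I_TAPER = 0.5            # A, taper current ending absorption (placeholder)
T_ABSORB_MAX = 4 * 3600  # s, absorption timeout (placeholder)

state = "bulk"
t_absorb = 0.0

def charger_step(v_batt, i_batt, dt):
    """One control step: returns the voltage setpoint to apply."""
    global state, t_absorb
    if state == "bulk" and v_batt >= V_ABSORB:
        state, t_absorb = "absorption", 0.0
    elif state == "absorption":
        t_absorb += dt
        if i_batt < I_TAPER or t_absorb > T_ABSORB_MAX:
            state = "float"
    return V_FLOAT if state == "float" else V_ABSORB
```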

Is there some more general algorithm for managing charge/voltage that I can apply here? I've been looking at UPS and solar charge controllers, but those don't seem to fit my needs either.
 

The usual way around this is to charge a 420V bus and then have a fast down-converter supply the 24V DC @ 40A (~1kW),

as well as the battery, to soak up any lumps and bumps. A 20ms pulse is 19.2J, so applying E = 0.5*C*(V1^2 - V2^2) with a swing from 420V down to 300V (say) gives a C of 444uF,

so a single 470uF, 450V DC cap would do the job. Once the cap is charged again, you can then set the battery back to float.

Recharging the 470uF cap from 300V to 420V in 100ms again takes 19.2J, i.e. 192W, which you can't get from your 100W source,
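
(Those two figures check out; the same arithmetic in a few lines of Python:)

```python
# Verify the HV-cap figures above: 19.2 J drawn from a cap swinging
# 420 V -> 300 V, then refilled within the 100 ms pulse spacing.
E = 19.2                     # J per pulse
V1, V2 = 420.0, 300.0        # V, cap voltage swing
C = 2 * E / (V1**2 - V2**2)  # from E = 0.5*C*(V1^2 - V2^2)
P = E / 100e-3               # recharge power over 100 ms
print(C * 1e6, P)            # ~444 uF, 192 W
```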

so with either the battery or the HV cap you are in a power deficit (though you could use the source + battery together to charge it).

The only way around that is to have the power balance in your favour over a 24-hour period so you can fully recharge the battery, i.e. the battery will need to be large enough to cope with the worst-case power deficit scenario, which you have not provided enough detail about.

Supercaps (good-quality, low-ESR ones) in parallel with the battery provide a means of absorbing charge that the battery otherwise might not handle very well.

LiFePO4 is your friend here, as Pb-acid has a round-trip efficiency of somewhat less than 80% for discharge/charge. Also, its range is about 2.8-3.3V per cell, so 8 cells gives you 26.4V, which is the very maximum for these cells (nominal 3V/cell = 24.0V exactly). The Li-ion cells will draw whatever current you can supply, up to 3.3V per cell.
 
The usual way around this is to charge a 420V bus and then have a fast down-converter supply the 24V DC @ 40A (~1kW),

as well as the battery, to soak up any lumps and bumps. A 20ms pulse is 19.2J, so applying E = 0.5*C*(V1^2 - V2^2) with a swing from 420V down to 300V (say) gives a C of 444uF,

so a single 470uF, 450V DC cap would do the job. Once the cap is charged again, you can then set the battery back to float.
There are a few reasons why capacitors have been ruled out:
1. The AGM batteries actually already exist in the system, and they are conveniently not needed for any other purpose when the pulse load is active.
2. A properly sized capacitor bank would cost at least as much as the batteries. Keep in mind the choice of capacitors will be governed more by ripple-current handling than by energy storage.
3. Client said nothing is allowed to exceed SELV. I tried arguing against that, but no luck.
Recharging the 470uF cap from 300V to 420V in 100ms again takes 19.2J, i.e. 192W, which you can't get from your 100W source,

so with either the battery or the HV cap you are in a power deficit (though you could use the source + battery together to charge it).

The only way around that is to have the power balance in your favour over a 24-hour period so you can fully recharge the battery, i.e. the battery will need to be large enough to cope with the worst-case power deficit scenario, which you have not provided enough detail about.
Yeah, it's a given that the battery has enough capacity (so long as it remains healthy) that the battery + 100W source will sustain the varying load just fine. Let's just leave it at that.
Supercaps (good-quality, low-ESR ones) in parallel with the battery provide a means of absorbing charge that the battery otherwise might not handle very well.
I actually expected supercaps to be the best option, but couldn't find any with a low enough ESR that weren't also enormous and expensive.
LiFePO4 is your friend here, as Pb-acid has a round-trip efficiency of somewhat less than 80% for discharge/charge. Also, its range is about 2.8-3.3V per cell, so 8 cells gives you 26.4V, which is the very maximum for these cells (nominal 3V/cell = 24.0V exactly). The Li-ion cells will draw whatever current you can supply, up to 3.3V per cell.
Unfortunately, the client said right off the bat that any Li batteries (even LFP) are off the table. They don't want to deal with the shipping restrictions (even if the pack is already UN certified). Plus, the cost is about 5x higher. It's true, though, that LFP doesn't need the absorption stage nearly as much as PbA batteries do. I wouldn't be surprised if there are some other caveats with LFP, though.

Your comment about efficiency is concerning though... I was assuming that the only loss mechanism in the battery would be its ESR (30mohm for the AGM), but even at 40A that would only account for 5% power loss. What other mechanisms are at work?
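
(For reference, the 5% estimate is just the I^2*R fraction during a pulse:)

```python
# Where the 5% comes from: I^2*R loss in the 30 mohm ESR during a 40 A pulse.
I, R, V = 40.0, 0.030, 24.0
p_loss = I**2 * R        # 48 W dissipated in the ESR
p_pulse = V * I          # 960 W delivered to the load
print(p_loss / p_pulse)  # 0.05 -> 5%
```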
 

Typical Pb-acid batteries, especially as they age, have only ~50-70% round-trip efficiency if you take the required float current (1-4%) into account.

There are plenty of supercaps out there to choose from; you just need to search aggressively...

If the battery is not large enough to handle the worst-case power deficit scenario, then there is no solution with that battery.
 

Typical Pb-acid batteries, especially as they age, have only ~50-70% round-trip efficiency if you take the required float current (1-4%) into account.
Any sources on this? I've never seen charge efficiency claimed to be anywhere near that low.

My understanding is that charge inefficiency is due to self-discharge and gassing, so the observed charge efficiency will depend greatly on the details of the charge profile. For my application, self-discharge is negligible over these timescales, and gassing should only occur if I exceed the float voltage.
There are plenty of supercaps out there to choose from; you just need to search aggressively...
I've looked quite a bit and haven't found a good fit (ESR < 50mohm), even if I were to arrange my own bank out of smaller cells.
If the battery is not large enough to handle the worst-case power deficit scenario, then there is no solution with that battery.
Like I said, this isn't a concern.
 

You may need to monitor just the battery current, not the total current, and adjust the charge based upon the battery current (and voltage, of course).
Calculating the net charge into and out of the battery may also help determine the amount of charging needed.

That would likely require some sort of microcontroller.
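
Something like this, perhaps (a sketch only; the sampling rate is a placeholder, and the current is assumed to come from a dedicated battery-current sense path):

```python
# Sketch of the net-charge bookkeeping suggested above: integrate the
# *battery* current (signed, + = charging) at a fixed rate. The running
# total shows how much charge has gone in versus out.
DT = 0.001   # s, sampling period (placeholder)
net_c = 0.0  # coulombs, net charge into the battery since start

def coulomb_step(i_batt):
    """Call every DT seconds with the signed battery current in amps."""
    global net_c
    net_c += i_batt * DT
    return net_c
```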
 

I'll definitely use a coulomb counter for verification. But even if I had that information in-system, I'm not sure how to design a control law which makes use of it properly. That's really the core question here.
 

I'll definitely use a coulomb counter for verification. But even if I had that information in-system, I'm not sure how to design a control law which makes use of it properly. That's really the core question here.
I would charge at least 10-15% more coulombs into the battery than come out of it, while limiting the battery charge current and voltage to their maximum rated values (whichever limit engages first).
When you reach that charge point, drop the voltage to the trickle-charge value.

Make sense?
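
One way to turn that into a control law, as a rough sketch; the 12% return factor sits inside the suggested 10-15% band, and the voltage setpoints are placeholders:

```python
# Sketch of the suggested rule: return RETURN_FACTOR times the charge that
# came out of the battery (while respecting current/voltage limits), then
# drop back to the trickle/float voltage. Setpoints are placeholders.
RETURN_FACTOR = 1.12   # 12%, inside the suggested 10-15% band
V_CHARGE = 28.8        # V, elevated recharge setpoint (placeholder)
V_FLOAT = 27.0         # V, float/trickle setpoint (placeholder)

owed_c = 0.0           # coulombs still to be returned to the battery

def control_step(i_batt, dt):
    """i_batt: signed battery current (A), + = charging.
    Returns the charger voltage setpoint."""
    global owed_c
    if i_batt < 0:                             # discharging into the load
        owed_c += -i_batt * dt * RETURN_FACTOR
    else:                                      # charging back
        owed_c = max(0.0, owed_c - i_batt * dt)
    return V_CHARGE if owed_c > 0.0 else V_FLOAT
```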
 
