
designing a charge controller for hybrid system

Status
Not open for further replies.
My observation with solar has always been with higher voltages, I have never played around personally with a 12v system.

The aim should always be to keep all the conduction losses as low as possible.
That is especially difficult with a 12v system.

If your MPPT voltage is close to 15v, and your fully charged battery voltage 14.5v, then the whole exercise is pretty pointless if you cannot pull the two voltages almost right together at full maximum duty cycle.

There should be about six amps maximum perhaps.
So all the resistive losses must be brought down to less than 80 milliohms total. The mosfet should be only 17 milliohms, the choke should not be much, and all the external wiring kept as short as possible.

See if you can measure all the voltage drops with the mosfet turned on hard 100%, right around the whole system, and see which is the worst voltage drop offender.

It may need a physically bigger choke with heavier wire, or two mosfets in parallel, or much thicker external wiring to solar panel and battery.
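The resistance budget above is just the available headroom divided by the charge current. A quick sanity check in Python, using the panel and battery figures quoted in this thread:

```python
# Resistance budget for pulling the panel right down to battery voltage
# at 100% duty cycle. Figures are the ones quoted above.
v_mppt = 15.0        # panel voltage at maximum power point (V)
v_batt_full = 14.5   # fully charged battery voltage (V)
i_charge = 6.0       # maximum charge current (A)

headroom = v_mppt - v_batt_full      # voltage we can afford to lose (V)
r_budget = headroom / i_charge       # total series resistance budget (ohms)
print(f"Total resistance budget: {r_budget * 1000:.0f} milliohms")

# Subtract the known MOSFET contribution (17 milliohms, per the post above)
# to see what is left for the choke, wiring, and connectors.
r_mosfet = 0.017
print(f"Left for choke + wiring: {(r_budget - r_mosfet) * 1000:.0f} milliohms")
```

This is why a 12V system is so unforgiving: the entire loop, including every lug and crimp, has to stay under roughly 80 milliohms.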
 

Does battery Ah matter? I am actually going to use this system on a 12V 110Ah battery, but at present I have a 12V 7Ah battery.
People who work on 12V systems find that the solar panel voltage should pull right down to the battery voltage, which isn't happening in my case.
Even forgetting about the controller, the voltage across the combination should drop to nearly the battery voltage when the battery is hooked directly to the solar panel.
 

Think about it like this.
The battery is charging and the battery terminal voltage is very slowly rising, with the solar panel voltage fixed at 15v.

With a good efficient controller, the current going into the battery will be at maximum right up to the 14.5v cut off point, where charging will abruptly cease.

With a lossy controller the current starts to taper off well below 14.5v.
A relatively large battery such as your 110Ah should easily take six amps right up to the maximum charge voltage.

Anyhow, make all the voltage drop measurements, and find out where the voltage drops are. If they are evenly distributed it is more difficult. But it is more likely that one cause is mostly to blame, and fixing that should give a big improvement.

It does not have to be perfect, but if one or a couple of changes can vastly improve things, it's well worth doing.
 

It may need a physically bigger choke with heavier wire, or two mosfets in parallel, or much thicker external wiring to solar panel and battery.
But I have already designed the inductor value for a particular power rating, and it came out to be 30-40uH for the 75W panel, 90-100uH for the 110W panel, and 200uH for the 350W panel.
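Values in this range come from the standard buck-converter ripple-current formula, L = Vout*(Vin - Vout)/(dI*f*Vin). A sketch in Python; the 20kHz switching frequency and 35% ripple target are figures mentioned later in the thread, while the 17V/13V operating points are illustrative assumptions, so the results will not match any particular design exactly:

```python
# Buck converter inductor sizing from allowed ripple current:
#   L = Vout * (Vin - Vout) / (delta_I * f_sw * Vin)
f_sw = 20e3          # switching frequency (Hz), assumed from later posts
ripple_frac = 0.35   # allowed peak-to-peak ripple as a fraction of load current

def buck_inductor(p_panel, v_in, v_out):
    """Return inductance (H) for a buck stage from v_in to v_out at p_panel watts."""
    i_out = p_panel / v_out            # average output (charge) current (A)
    delta_i = ripple_frac * i_out      # allowed peak-to-peak ripple current (A)
    return v_out * (v_in - v_out) / (delta_i * f_sw * v_in)

# Assumed operating points: 17V panel into a ~13V battery.
for watts in (75, 110, 350):
    L = buck_inductor(watts, v_in=17.0, v_out=13.0)
    print(f"{watts}W panel: L = {L * 1e6:.0f} uH")
```

Bigger panels mean more current, so a smaller inductance gives the same fractional ripple; the exact numbers shift with the frequency and ripple target chosen.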
 

If you design a dc choke to have a certain inductance and to carry a certain current, it may still have too much resistance for a particular purpose.
It may be possible to rewind it with thicker wire, keeping the turns the same.

If not, a physically larger part will be needed to carry thicker wire.

The choke you have may carry six amps perfectly well, but get rather warm doing it.
For some applications, for instance a much higher voltage solar controller, that may be perfectly acceptable, as it allows a physically compact and less costly choke.

But if voltage drop is absolutely critical in a low voltage system, as it is for us here, a very small six amp rated choke may not be good enough for our application.

It may require a larger ten or fifteen amp rated choke even though it only has to carry six amps, just to reduce the dc resistance sufficiently.

That is assuming it's mainly the choke that is causing our problem...
 
Today I plugged it in once again, step by step.
First, I connected the 110Ah battery directly to the solar panel and the panel voltage dropped to the battery voltage, but the same did not happen for the 7Ah battery.
Another thing I observed was that there must be a finite resistance across the G-S terminals of the MOSFET (say 50K) to turn it off in the absence of gate drive, because when it was not present the MOSFET remained on irrespective of the gate drive signal.
Then I took just the MOSFET, applied an external 16V (G-S), and replaced the inductor with a wire (shorted the inductor), with the capacitor and battery in parallel. Everything was as in the previous circuit but with the inductor removed; what I observed was 15.5V solar panel voltage and 15V battery voltage.
So there is no improvement: initially, when directly connected, the panel voltage dropped to the battery voltage = 13V, but the above connection showed a rise of 2.5V.
Then I removed the MOSFET and connected the battery to the capacitor with connecting wires in between; again the same results.
How can wires drop so much?! When I directly connected the battery to the solar panel it worked fine, but the same thing done through the connecting wires gives different results.
 

Today I plugged it in once again, step by step.
First, I connected the 110Ah battery directly to the solar panel and the panel voltage dropped to the battery voltage, but the same did not happen for the 7Ah battery.

The small battery may not absorb the current quickly enough (convert Amperes to electrochemical charge). Therefore it acts as a resistor to some extent.

Another thing I observed was that there must be a finite resistance across the G-S terminals of the MOSFET (say 50K) to turn it off in the absence of gate drive, because when it was not present the MOSFET remained on irrespective of the gate drive signal.

This can be expected, if the gate was floating, or disconnected. The gate is very high impedance. It's much like an antenna which can pick up ambient 60 cycle hum. It can even be turned on by static charge from an object several feet away, including yourself.

That is why a pull-down resistor is recommended.
 
Thanks,
but at 100% duty cycle the inductor acts as a short circuit, hence the voltage drop across it should be minimal, with all the voltage across the battery. Yet when I rigged up MOSFET + capacitor with the battery across the capacitor (no inductor), the voltage across the solar panel = 15V, and with MOSFET + inductor + capacitor with the battery across the capacitor, the solar voltage = 16V.
At least in the first case, where no inductor was present, the panel voltage should have dropped to the battery voltage.
 

MOSFET + capacitor with battery across capacitor, no inductor: voltage across solar = 15V; MOSFET + inductor + capacitor with battery across capacitor: solar voltage = 16V.

So the choke drops about one volt ?
That is far too much.
 

Sounds as though you've got parasitic resistance in your components or wiring. It's common to encounter this. You need to test voltage drop across each component individually, while you pass the maximum expected Amperes through it. That gives you an idea which component needs further work, to minimize its resistance. Post #121 gave some details about this.
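The per-component check reduces to Ohm's law: pass the full current through one element at a time, measure the drop directly across it, and divide. A small helper; the drop figures below are hypothetical placeholders, not measurements from this thread:

```python
# Find the worst offender: R = V_drop / I for each element at full current.
i_test = 6.0  # test current (A) -- the maximum expected charge current

# Hypothetical measured drops (V); replace with your own meter readings.
drops = {
    "MOSFET D-S":    0.10,
    "choke":         0.90,
    "wiring + lugs": 0.45,
}

# Sort worst-first so the biggest contributor is at the top of the list.
for name, v in sorted(drops.items(), key=lambda kv: -kv[1]):
    print(f"{name:15s}: {v:.2f} V -> {v / i_test * 1000:.0f} milliohms")
```

Whichever line dominates the list is the part to rewind, parallel up, or rewire first.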
 

But with no inductor, and MOSFET + capacitor with the battery across it at 100% duty cycle, a fully turned-on MOSFET should drop the panel voltage right down to battery voltage + 0.5V, which is what happened when I connected the battery directly to the panel.
 

PLEASE do two things.

Turn the mosfet on continuously 100%

Measure the voltage drop across mosfet drain/source and the voltage drop directly across the choke.
 

The battery is charged to 13.5V.
These tests I am doing with just the MOSFET and battery in series (no inductor or capacitor used):
1) MOSFET at 100% duty cycle: panel voltage = 15.66V. The connecting wires were heating a bit, with most of the drop across those short connecting wires; the Vds drop was negligible.
2) Connecting the battery terminals directly to the solar panel: voltage across them = 14.5V.
The solar panel is mounted on the rooftop and we are working on the ground floor.
 

the connecting wires were heating a bit

This suggests they are of too thin a gauge. It is also possible you need to check all connectors: lug type, crimp type, etc. Any and every contact point might have some undesirable resistance, causing voltage drop.

The solar panel is mounted on the rooftop and we are working on the ground floor.

If this is a long distance, then it could account for 1V drop or more. Your connecting wires ought to be thicker gauge.
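The drop over a long, thin run is easy to estimate from copper's resistivity. A sketch; the 10m run length is an assumption for illustration, since the actual rooftop-to-ground distance isn't given in the thread:

```python
# Voltage drop in a copper wire run: R = rho * length / area, V = I * R.
RHO_CU = 1.72e-8      # resistivity of copper (ohm*metres)

def wire_drop(one_way_m, area_mm2, current_a):
    """Voltage lost over a round trip (out and back) of copper wire."""
    length = 2 * one_way_m                     # both conductors carry the current
    r = RHO_CU * length / (area_mm2 * 1e-6)    # cross-section converted to m^2
    return current_a * r

# Assumed example: 10 m one-way run, 6 A charge current, two wire sizes.
for area in (0.5, 2.5):
    v = wire_drop(one_way_m=10, area_mm2=area, current_a=6.0)
    print(f"{area} mm^2 wire: {v:.2f} V drop")
```

At 12V-system headrooms, even a fraction of a volt lost in wiring is significant, which is why the run needs to be heavily oversized compared with what its current rating alone would suggest.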
 

If this is a long distance, then it could account for 1V drop or more. Your connecting wires ought to be thicker gauge.
The wire used for bringing the solar output down to the ground floor is thick; only the short connecting wires (the miscellaneous wires needed to complete the circuit) are thinner, and the maximum drop is across them. Even if they account for some resistance they are unavoidable; I need them to make the proper connections in the circuit, otherwise how could I build the circuit at all?
I suspected the MOSFET turn-on might be causing the problem, but removing the MOSFET and making the connection using just those connecting wires gave the same problem.
The problem was resolved only when I connected the battery terminals directly to the solar output wire terminals.
 

Hi, with some updates:
The problem was that I worked on a breadboard all these days, and a breadboard supports hardly 0.5A, as do the connecting wires I used for my experiment.
That, plus some mistakes here and there and a faulty multimeter, all led to this delay.
As I already mentioned, my panel Vmpp = 14.85V, and programmed as such it worked perfectly fine with everything soldered on a PCB and wired with thick wires.
Now, with 14.85V set, the panel voltage sits at 14.9V with the 110Ah battery connected as the charging load.
Another thing: if I set Vmpp = 17V, then the voltage across the panel = 18V or 19V, with most of the drop occurring across the MOSFET, or else the voltage applied across the battery is higher. But there was not even a 1V DC drop across the inductor, and I suspect I have not built the inductor properly.

- - - Updated - - -

Also, the core I am using is not painted, hence it is a ferrite core, and its dimensions are:
inner dia: 0.3937 inches
outer dia: 0.7874 inches
height: 0.27559 inches
I have a chart which asks for all the above parameters, plus the permeability of the core (which I don't know) and the AL value; knowing all five of them, I need to use an online calculator to find the number of turns required to get a particular inductance value.
Ferrite is usually suitable when the desired inductance is in mH, because for 0.045mH = 45uH (for the 75W panel) the number of turns is just 3-4.
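The turns count follows from L = AL * N^2, so N = sqrt(L / AL). A sketch; the high-permeability AL value below is an assumption (roughly what "3-4 turns for 45uH" would imply), since the actual core's AL is unknown, and the powdered-iron figure is likewise only a typical order of magnitude:

```python
import math

# Inductance of a wound toroid: L = AL * N^2, so N = sqrt(L / AL).
def turns_for(l_target_uh, al_uh_per_t2):
    """Whole turns needed for l_target_uh, given AL in uH per turn^2."""
    return math.ceil(math.sqrt(l_target_uh / al_uh_per_t2))

# Assumed AL ~3.7 uH/N^2: what "3-4 turns for 45uH" would imply for a
# high-permeability ferrite with no air gap.
print(turns_for(45, al_uh_per_t2=3.7))    # very few turns on bare ferrite

# Assumed AL ~0.05 uH/N^2: typical order of magnitude for powdered iron.
print(turns_for(45, al_uh_per_t2=0.05))   # many more turns on powdered iron
```

The very low turns count on bare ferrite is itself the warning sign: with so few turns the DC ampere-turns saturate the ungapped core almost immediately, which is the point made in the reply below about needing a gapped or powdered-iron core.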
 

Ferrite core without an air gap is unsuitable.
The reason being that the dc current will saturate the core before you can get enough inductance.

It really needs a powdered iron core of known type so that suitable turns and wire size can be worked out.
Go back and read post #27, which suggests both a suitable design and a very low cost ready-made choke available from eBay.
 

Also, I used the CRO to measure the inductance. What I did was:
I applied around a 2V p-p sine wave and connected the inductor across the signal generator. Most internet sources say the generator has an internal resistance of 50 ohms, and the inductor resistance = 1.15 ohms, so using voltage division and Ohm's law I got the expression:
V(cro)/V(sig gen) = omega*L / sqrt(R^2 + (omega*L)^2)
As per the CRO settings: V/div = 0.5, time/div = 1us.
Vcro(p-p)/Vsig gen(p-p) = 0.8/2 after connecting the designed inductor,
and 2 divisions on the time scale = 2 * 1us = 2us = time period.
Substituting in the formula, I got:
L = 21.22uH, so I was using a 21.22uH inductor.
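Rearranging that divider expression for L gives L = R*r / (omega*sqrt(1 - r^2)), where r is the measured voltage ratio. A sketch using the figures quoted in the post (50 ohm generator plus 1.15 ohm winding resistance, r = 0.4, 2us period); note that this rearrangement, with those figures, gives a value rather lower than the 21.22uH stated, so the arithmetic is worth re-checking rather than taken as definitive:

```python
import math

# Solve  r = wL / sqrt(R^2 + (wL)^2)  for L:
#   L = R * r / (w * sqrt(1 - r^2))
def inductance_from_ratio(r, resistance, period_s):
    """Inductance implied by a measured divider ratio r at the given period."""
    w = 2 * math.pi / period_s             # angular frequency from the period
    return resistance * r / (w * math.sqrt(1 - r * r))

# Figures from the post: 0.8V/2V ratio, 50 + 1.15 ohm series R, 2 us period.
L = inductance_from_ratio(r=0.8 / 2, resistance=50 + 1.15, period_s=2e-6)
print(f"L = {L * 1e6:.1f} uH")
```

Either way, the signal-generator method only gives the small-signal inductance; it says nothing about how the core behaves with 5-6 A of DC bias through it, which is the real concern here.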

- - - Updated - - -

Thanks, I will do that and see.
 

Hi, I have now designed a powdered-iron toroid core for the 75W panel, with 20kHz switching frequency and inductance = 45uH, allowing 35% ripple current. It's a yellow-white colored core.
Here are the pics: IMG_20160223_145120745.jpg
IMG_20160223_145125817.jpg
I also implemented the Vmpp tracking algorithm; the solar panel voltage sits at 14.82V (as my Vmpp = 14.85V), and I will now implement V*I tracking and compare results.
I would just like to know one thing, as I have doubts about the inductor design: it was designed for a 3V drop across it, i.e. with 14.85-15V as Vmpp and battery voltage = 12V, there should be a 3V DC drop observed across the inductor, right?
 

battery voltage = 12V

12V is the all-purpose term we use. In fact the battery V changes by several volts in use. As we start a charge session, it could be 10.5 (discharged), and rise to 14.4 by the end of the taper charge.

When you take it off the charger it quickly drops to 13.8. The reading is elevated because the electrolyte is warm (although there might be additional factors).

Eventually it settles to 12.8V.
 