Welcome to EDAboard.com


PWM LED driver - how to connect constant-current sink output to MOSFET?

Status
Not open for further replies.

ikorman

Hi all

First of all, I would like to apologize for my ignorance of basic electronics (I'm more oriented toward uC coding and similar), but I need your help here :oops:

I want to control an RGB LED strip with the TLC5940 LED driver. The TLC5940 is a 16-channel PWM LED driver which uses constant-current sink outputs to control LEDs. By varying the PWM duty cycle it makes the LEDs brighter or darker. There is a limit of around 100 mA per channel. As I want to drive a long LED strip which can take up to several amps, I need to drive the strip through transistors, and this is where I'm stuck....

The problem is that I'm not qualified to properly design this drive stage. I have seen a solution where a TIP122 was used, connected to the TLC output through a 1k resistor with a 10k pull-up resistor, but there were complaints that the transistor was not able to completely shut off the LED strip (remember that the TLC output, when active, sinks a predefined (configurable) current - it was something about not being able to get 0V on the base of the TIP through the resistors used). Also, with higher PWM frequencies and currents, there could be heating and switching issues.

Therefore I would like to use a MOSFET for this. The problem is that I don't know how to choose a proper MOSFET and which resistors give optimum switching (my head hurts from on/off thresholds, input capacitance, turn on/off times). I believe the circuit is quite simple (one or two resistors and an N-channel MOSFET), but ...:-D

Can somebody help me with this? As I'm planning to use a lot of RGB LED segments (32 PWM channels per device in total), I'm trying to keep the component count to a minimum...

Here are my parameters:
- RGB LED strip is driven by +12V, connected in common anode mode
- PWM freq will be around 1000 Hz, PWM steps 4096
- max current per LED strip/transistor 6A
- the current sunk by a TLC output pin when the output is active is 10mA

I'm aware that by using a transistor I will get an inverted PWM signal, but that is completely OK as I can adjust for it in the microcontroller code.

Note: although the TLC5940 also supports digital control of current per channel (in addition to PWM), I don't need that part (no current mirrors or whatever) - the current will be constant (defined by the TLC external resistor); I will use only PWM for control.

BR
Ivan
 

Ivan,

I'm a little confused as to what you are trying to do.

If I understand you correctly, you want to vary the perceived brightness of the LED strips with PWM?

However, you want to set the current available to the LED strips via another device that only has a current capability of up to a few milliamperes?

The obvious question is: why do you want this device to control the current to the LED strips? Why not do it independently?

That is, make yourself a constant-current power supply, connect the LED strip to it, then have a power FET or similar from the LED strip to ground, and drive the gate of that FET from the PWM output?
 

This chip has one external resistor that sets the desired output current (the same for all outputs);
the current sensing for each channel is done inside the chip.

When you use PWM to drive any device (MOSFET, transistor) you have to be able to measure its output,
so that you can calculate the amount of current going to the LED(s)
and then increase or decrease the PWM duty cycle to correct the error.

There is no way to do something like that with your chip: you have 16 outputs that can provide a constant current to the driver,
but you can't control the output current based on a constant driver current, and there is no way in your application to add external load sensing to the 16 outputs.

Alex
 

Ivan,

I'm a little confused as to what you are trying to do.

If I understand you correctly, you want to vary the perceived brightness of the LED strips with PWM?

However, you want to set the current available to the LED strips via another device that only has a current capability of up to a few milliamperes?

The obvious question is: why do you want this device to control the current to the LED strips? Why not do it independently?

That is, make yourself a constant-current power supply, connect the LED strip to it, then have a power FET or similar from the LED strip to ground, and drive the gate of that FET from the PWM output?

You are right, I want to control the intensity of the LED strip. The LED strip is designed so that it can be driven directly from 12V without a current source (the SMD 5050 RGB LEDs are organized in groups of three with appropriate resistors). You can cut the strip (keeping groups of three LEDs together) and still use the same voltage. Therefore, by PWM-ing the supply to the LED strip, I can control the LED intensity.

The TLC5940 is a single-chip 16-channel LED driver that is easy to integrate with a microcontroller (Arduino), offers independent PWM control of 16 channels, comes in PDIP, and has an existing Arduino library, so it is my ideal solution for PWM-ing. Besides PWM-ing the output, this chip also limits the current through the LED, which is very handy when you want to drive small LEDs directly - you can limit the current through the diode to e.g. 20 mA and control the intensity in 4096 steps by PWM-ing the current through the diode.

As its outputs have power limitations, some help is needed on the output side. This is where a power MOSFET should come in. But this is where I lack "proper MOSFET switch design" knowledge :)

What is maybe confusing is that the TLC5940 does not have classic logic output levels; instead, each output is a current sink that is quickly turned on and off (PWM-ed).

---------- Post added at 16:17 ---------- Previous post was at 16:12 ----------

There is no way to do something like that with your chip, you have 16 outputs that can provide a constant current to the driver
but you can't control the output current based on a constant driver current and there is no way in your application to add external load sense in the 16 outputs.

Alex

I don't want to control the current through the LED strip (it is designed to work directly off 12V); instead I want to PWM the supply to it.
 

I don't want to control the current through the LED strip (it is designed to work directly off 12V); instead I want to PWM the supply to it.

OK, so your LEDs work with 12V and you want to control the current level using the PWM, but the problem is that you can't, because the outputs of the chip will provide less than 100mA to the driver and the chip will increase the PWM to the max.
You don't have control over the PWM duty cycle; you said in your first post "current will be constant (defined by TLC external resistor)"

You can get some control of the chip's output current using the digital interface, but again you will not be able to make a precise adjustment, because the adjustment you can make has to do with the chip output current; the PWM duty will change to try to match that.

This may work only if you have some kind of load that pulls a known current, so that when you set a value above that (digitally) the PWM will increase, and below it, decrease.

Alex

---------- Post added at 17:41 ---------- Previous post was at 17:37 ----------

I don't know if I understand something different from what you want to do;
you say "I don't want to control current through LED strip" - then why would you use PWM?
 

You don't have control over the PWM duty cycle; you said in your first post "current will be constant (defined by TLC external resistor)"

Maybe I did not explain well how the TLC5940 works. It can control two things:

- the current that will go through the LED (defined by an external chip resistor)
- the PWM that controls how often the current is flowing

Consider it like a current source through the LED, with a switch that is controlled via PWM. I can control the PWM and therefore use this feature for high-current LED applications.
 

I thought that the chip was using the PWM to adjust the output current, but
apparently the output current regulation is independent of the PWM control,
which is used to turn the output on/off at a programmed rate.

OK, so the next question is: what will be the PWM frequency at the output?
I can't find it in the datasheet.

Alex

---------- Post added at 18:14 ---------- Previous post was at 18:12 ----------

You wrote that in the first post, sorry:
PWM freq will be around 1000 Hz, PWM steps 4096

---------- Post added at 18:21 ---------- Previous post was at 18:14 ----------

The frequency is not high, but the PWM steps are too many, and at an extreme position (duty cycle)
the MOSFET would have to switch on and off in 1ms * (1/4096) = 0.24 ns.
You need a dedicated high-current MOSFET driver to achieve such a speed, and I think it will still be difficult to do it that fast.

Alex
 

The frequency is not high, but the PWM steps are too many, and at an extreme position (duty cycle)
the MOSFET would have to switch on and off in 1ms * (1/4096) = 0.24 ns.
You need a dedicated high-current MOSFET driver to achieve such a speed, and I think it will still be difficult to do it that fast.

Alex

I understand. I believe I can go with lower values, like a PWM freq of 100 Hz (enough to avoid LED flickering) and 256 steps, which would then give about 39 µs at the extreme position.
 

Ivan,

Just to clear one thing up: the PWM is not being used to control the LED current, but the ON/OFF ratio, to control the apparent brightness?

Also, rather than use a bare MOSFET, why not use a MOSFET gate driver? Some of them will quite happily dump 6A continuously with a very, very low on-impedance, and some will quite happily switch well beyond 100MHz (I used some a few years ago to make a class D driver for a medium-power HF CW system for plastic welding).
 

The frequency is not high, but the PWM steps are too many, and at an extreme position (duty cycle)
the MOSFET would have to switch on and off in 1ms * (1/4096) = 0.24 ns.
You need a dedicated high-current MOSFET driver to achieve such a speed, and I think it will still be difficult to do it that fast.

Alex

Hm, I think that in this case the time is not 0.24 ns but 0.24 µs, i.e. 240 ns. At first I took your number and was in despair about where I was going to find such a quick FET. With 240ns on/off in the worst case, I feel more relaxed :)
 

Hm, I think that in this case the time is not 0.24 ns but 0.24 µs, i.e. 240 ns. At first I took your number and was in despair about where I was going to find such a quick FET. With 240ns on/off in the worst case, I feel more relaxed :)

Ooops... yes, you are correct, the correct result is 240ns. But also note that this will be the duration that the MOSFET stays on; the turn-on and turn-off delays (the time it takes the MOSFET to turn on/off) should be a small percentage of this time.

Alex
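The two timing cases discussed in this exchange can be checked with a few lines of Python (illustrative only; the function name is my own):

```python
# Duration of the shortest PWM on-pulse (one greyscale LSB) the MOSFET
# must resolve, for the settings discussed above.

def min_pulse_ns(freq_hz: float, steps: int) -> float:
    """Return the width of a single PWM step in nanoseconds."""
    return 1e9 / (freq_hz * steps)

# Original plan: ~1 kHz PWM with the TLC5940's full 4096 steps.
print(min_pulse_ns(1000, 4096))  # 244.140625 -> the corrected ~240 ns figure

# Ivan's relaxed plan: 100 Hz and 256 steps.
print(min_pulse_ns(100, 256))    # 39062.5 ns, i.e. about 39 us
```

As the correction above notes, the MOSFET's own transition times must be a small fraction of the shortest pulse for the lowest duty-cycle steps to be meaningful.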
 

Ooops... yes, you are correct, the correct result is 240ns. But also note that this will be the duration that the MOSFET stays on; the turn-on and turn-off delays (the time it takes the MOSFET to turn on/off) should be a small percentage of this time.

Alex

OK, let's go back to my initial problem: a PWM output which is actually a controlled current source (20mA). If I connect an N-channel MOSFET directly to the output and use a pull-up resistor (600 ohm) from gate to +12V, I should have the following:

- when the PWM output is off (no current), the MOSFET gate will be charged through the pull-up resistor with a current of 12/600 = 20 mA. For a MOSFET with 20 nC gate charge this should give me a switch-on time of 20nC/0.02A = 1 µs

- when the PWM output is on, the current sink becomes active, creating a voltage drop of 12V across the pull-up, pulling the gate to 0 and discharging it with 20mA, therefore shutting the MOSFET off in 1µs

I'm not sure if I'm on the right track... I doubt that the PWM output can drop to 0V while the current sink is running.

Ivan
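Ivan's estimate above can be sanity-checked numerically. A small Python sketch, treating the pull-up as a constant-current source (a rough simplification; real gate charging follows an RC-like curve, and the 20 nC gate charge is just an example value):

```python
# Back-of-the-envelope gate switching time for the pull-up scheme above.
V_SUPPLY = 12.0   # V, rail the gate pull-up connects to
R_PULLUP = 600.0  # ohm, pull-up resistor value
Q_GATE = 20e-9    # C, total gate charge (example MOSFET datasheet value)

i_pullup = V_SUPPLY / R_PULLUP  # initial charging current: 0.02 A = 20 mA
t_switch = Q_GATE / i_pullup    # crude estimate assuming constant current

print(f"pull-up current: {i_pullup * 1e3:.0f} mA")     # 20 mA
print(f"approx switch time: {t_switch * 1e6:.1f} us")  # 1.0 us
```

Note that 1 µs is a sizeable fraction of the 240 ns minimum pulse discussed earlier, which is exactly the concern raised about needing faster gate drive.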
 

I don't know if your chip can sink 20mA when the output is off (that is the current needed for the 600 ohm resistor), plus some more for the MOSFET gate (let's say 20mA more), because it has to be able to ground the MOSFET gate.

Alex

---------- Post added at 14:56 ---------- Previous post was at 14:07 ----------

Actually, your chip works on the ground side, so it can definitely sink up to 100mA;
I was thinking of something else, sorry.
The MOSFET gate doesn't have to go down to 0V to turn the MOSFET off; for each MOSFET you have to provide at least the threshold voltage to turn it on.
You can select a MOSFET that needs a few volts to turn on, and then it won't matter if the switching voltage at the gate is, for example, 2V (off) and 12V (on).

Normally a gate resistor is used in series with the gate to limit the gate current, but since your chip has current regulation,
I suppose the resistor is not needed, because the chip will limit that current using its internal circuits.

Check this link for the meaning of mosfet characteristics.
http://www.irf.com/technical-info/appnotes/mosfet.pdf

Alex
 

Hi,

Just signed up because I'm doing exactly what you are in my current project:

Arduino -> TLC5940 -> MOSFET -> SMD5050 LED strip

Now, I know you can do this (but it's not a very good solution!):

Code:
                         5v           12v
                          |             |
                          |            LED STRIP
                          |             |
                      PULLUP            |
                          |            /
                          |           D
OUTx -------------------------------G||
                                      S
                                       \
                                        |
                                        |
                                       GND

I have had this running for hours on an RGB LED strip (using 3 channels) and it works, but the TLC5940 does get ever so slightly warm.

So, why is this no good ?

Well, the gate of a MOSFET has capacitance, so every time we apply a pulse from the TLC5940, this capacitor is charged. Now, we know that when we apply voltage to a capacitor we see a peak in current (when the voltage across the cap is 0) which tails off (as the voltage increases) as it becomes charged.

Depending on your MOSFET, this very short blip of current can be pretty large, enough to upset or stress the outputs of the TLC5940.

So our first thought is "No problem, I'll put a resistor between the TLC5940's output and the gate". Well, this brings up a second issue: gate charge time. We know that the voltage across the gate dictates the conduction state of the transistor, so while it is charging the transistor is not fully on. This means there is power dissipation in the transistor and, with a large load, this could quickly cause problems.

So, on to how we fix this. From where I stand, I can see two immediately visible places to start:

A: We do some maths and balance the gate charging currents required from the TLC5940 by determining the maximum switching time we can have before power dissipation becomes a problem.

B: Design and place something between the TLC5940 and the gate which will ask for little current from the TLC5940 and provide a large instantaneous current to the gate to charge it quickly.

My problem at the moment is that I don't know what gate transition time I need for my given load to keep the MOSFET's power dissipation below its rating. Once that is established, the path to follow will be more black and white:

IG = QG / t(transition)

where QG is the total gate charge (as defined above) and t(transition) is the time in which the MOSFET must turn on.

Sorry for the essay. Thoughts?

Louis
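One common first-order way to attack option A is the standard switching-loss approximation P_sw ≈ ½ · V · I · (t_rise + t_fall) · f_PWM. A Python sketch with hypothetical numbers for this thread's application (12 V strip, 6 A, 1 kHz; the 100 ns transition times are assumed, not measured):

```python
# First-order MOSFET switching-loss estimate: the device dissipates
# roughly half of V*I during each transition, for (t_rise + t_fall)
# seconds out of every PWM period.

def switching_loss_w(v: float, i: float, t_rise: float,
                     t_fall: float, f_pwm: float) -> float:
    """Approximate average switching loss in watts."""
    return 0.5 * v * i * (t_rise + t_fall) * f_pwm

# Hypothetical values: 12 V, 6 A strip, 100 ns transitions, 1 kHz PWM.
p_sw = switching_loss_w(12.0, 6.0, 100e-9, 100e-9, 1000.0)
print(f"{p_sw * 1e3:.1f} mW")  # 7.2 mW
```

At only 1 kHz even fairly slow transitions keep switching loss small; conduction loss (I² · R_DS(on)) is more likely to dominate the MOSFET's dissipation here.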
 
Well, the gate of a MOSFET has capacitance, so every time we apply a pulse from the TLC5940, this capacitor is charged. Now, we know that when we apply voltage to a capacitor we see a peak in current (when the voltage across the cap is 0) which tails off (as the voltage increases) as it becomes charged.

Depending on your MOSFET, this very short blip of current can be pretty large, enough to upset or stress the outputs of the TLC5940.

So our first thought is "No problem, I'll put a resistor between the TLC5940's output and the gate". Well, this brings up a second issue: gate charge time. We know that the voltage across the gate dictates the conduction state of the transistor, so while it is charging the transistor is not fully on. This means there is power dissipation in the transistor and, with a large load, this could quickly cause problems.

Actually, the exact opposite is happening:
this is an N-channel MOSFET with the source connected to GND,
and it turns on with a positive voltage at the gate.
The gate capacitance charging current is controlled by the pull-up resistor, and the MOSFET turns on;
the TLC5940 can only sink current and discharge the gate capacitance so that the MOSFET turns off.
If you put a resistor between the chip and the MOSFET gate, you will only control the discharge rate (turn-off).

Alex
 
Yes, of course, my apologies. The same theory still stands, though.

We want both turn on and off to be as quick as possible to minimize power dissipation across the MOSFET.

Thanks alexan_e.
 

Assuming that you are using a 12V power supply and there is no need for a level translator, a dedicated discrete driver could look like this:

N-Mosfet_driver.jpg

The MOSFET at the input has the 1K pull-up resistor because the TLC5940 can only sink current.
The output of that MOSFET drives the totem pole with 0V-12V, which provides the high current to the output MOSFET's gate.
Reducing the 2N7000 drain resistor would provide more current to the bases of the transistors for a faster turn-on of the output MOSFET (turn-off is very fast because of the 2N7000), but would also increase the losses of the driver.
The transistors are 3A types with high gain; the output MOSFET is a random model.

Alex
 
I can see that working. Seems simple enough.

I'm going to get the whole lot on the scope in the lab on Monday and see exactly how much peak current I'm getting. I was prototyping with some IRF610s, but I decided to solder up some of these: **broken link removed** since that's what I would be using. The system has been on and fading for getting on an hour now and it's stone cold.

I'm limiting the current through the TLC5940 (using its own limiting system) to just under 10mA per channel. This should keep the power dissipation for the whole package at an absolute maximum of 800mW, well below the rated value.
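That 800 mW figure can be reproduced as follows (the ~5 V across each sinking output is an assumption for the worst case, not a datasheet number):

```python
# Worst-case TLC5940 package dissipation with all 16 channels sinking.
N_CHANNELS = 16
I_SINK = 0.010  # A per channel (just under 10 mA, as set above)
V_OUT = 5.0     # V assumed across each output stage while sinking

p_total = N_CHANNELS * I_SINK * V_OUT
print(f"{p_total * 1e3:.0f} mW")  # 800 mW
```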

If the traces show anything potentially harmful, I will give your circuit a shot and report back on performance with traces etc. I think ikorman will still require some kind of drive circuit, as he is likely to be using much beefier MOSFETs than me. I am only driving metre lengths of LED strip per channel.

Cheers, Alex. The help is much appreciated!
 

I'm missing a means for constant current in the discussed MOSFET drivers. You mentioned current limiting resistors in your first post, but I wonder if that is a good idea considering the type and temperature variation of LED forward voltage. You'll notice that most high-current LED drivers on the market use PWM buck converters with current control.
 

Didn't want to leave my part of this thread hanging without a conclusion.

I connected up some MOSFETs, both directly and via resistors etc., and used a CRO to measure charge times; I also introduced a 2 ohm resistor to get an idea of the current spikes from the gate into the TLC. To be honest, with a direct connection from gate to TLC the spikes are nowhere near the kind of current I thought they would be. My measurements were showing nothing too drastic, with peaks at around 100mA lasting only nanoseconds! Either way, if you do your calculations and set the current limiting in the TLC correctly, it seems you're really not going to exceed its power handling with any kind of reasonable MOSFET. (Mine are: **broken link removed**)

Also bear in mind that the TLC introduces a 20ns delay between the leading edges of each output, and the main current spike will fit within that timeframe, so this also helps to reduce the instantaneous current through the TLC5940.

I don't claim to be an expert by any means; in fact, this is my first dive into this area. I can understand why, and would strongly suggest, NOT connecting a MOSFET directly to the PWM output of any microcontroller, but from what I can see, the TLC5940 and its current limiting seem to be working well. It will obviously also differ on a device-by-device basis as the gate charge changes, so DO take your own measurements and decide.

Thanks for all the help and info, my understanding is so much clearer thanks to you guys.
 