
Very basic resistor connection question

Status
Not open for further replies.

Tyler Grey

Junior Member level 1
Hi, let's say that I have a simple circuit with a battery connected to an LED.
If I connect a resistor to the ( - ) pin of the LED, would this resistor keep the voltage that goes to the LED low?
I thought it wouldn't, until I found a bunch of circuit designs that connect a resistor right next to the ( - ) pin of the LED.
 

All diodes, including Zeners and LEDs, are modeled as a fixed voltage threshold plus a saturated ESR, an approximation of the dynamic resistance after the current saturates.

Thus, just as batteries have a voltage drop with current, all diodes have a voltage rise above the fixed threshold with current. To limit the current between the supply and the LED, I use the threshold voltage Vth (typically taken at 10% of rated current on the V-I curve) to get Vf = Vth + ESR*If(rated) for the diode.

The ESR of a diode is typically 1 Ohm per watt, or 1/10 Ohm per 10 watts, such that the Ohm-Watt(max) product is roughly 1.
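As a quick sketch of that model (the function names here are mine, assuming the simple threshold-plus-ESR approximation described above, not any standard library):

```python
def esr_estimate(p_rated_w):
    """Rule of thumb from above: Ohm * Watt(max) ~ 1, so ESR ~ 1 / rated power."""
    return 1.0 / p_rated_w

def led_vf(vth, esr, i_f):
    """Forward voltage model: fixed threshold plus the ESR drop at current i_f."""
    return vth + esr * i_f

# A 1 W LED -> ESR ~ 1 Ohm; a 65 mW 5 mm LED -> ESR ~ 15 Ohms
print(esr_estimate(1.0))
print(esr_estimate(0.065))
print(led_vf(2.0, 15.0, 0.02))  # 2.0 V threshold + 0.3 V ESR drop at 20 mA
```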

The simplest case is when the voltage drop across the series resistor, placed on either the + or - side of the diode, is more than 10% of the diode voltage (for Zeners and LEDs); then the ESR of the diode is negligible. Otherwise you must include the ESR of the diode, as with a low-voltage source, which is actually the most efficient way to run a string of LEDs from a battery or any power source, e.g. a 3 V lithium cell on a 2.9 V white 1 W LED.

But most newbies insist on using a 9 V battery or something equally excessive to run a 2 V (red) to 3 V (white or blue) LED, so the current limit in the LED is set by design simply using Ohm's law on the voltage difference. Ideally you want the voltage dropped outside the LED string to be less than the Vf of one LED so you can maximize efficiency, and you can regulate current with Rs on the + side to V+ or on the - side to ground; either works.

A simple case: a red 5 mm ultrabright LED at 2.2 V, three in series, so 6.6 V from a 9 to 9.3 V battery, leaving 2.4 V to 2.7 V across the series R. Then solve for the desired current, say 20 mA: Rs = 2.7 V / 0.02 A = 135 Ohms (or the nearest standard value), then check power dissipation: 2.7 V * 0.02 A = 54 mW, no problem.
Or a 3 V lithium CR123 on a 2.2 V red LED: Rs = 0.8 V / 0.02 A = 40 Ohms. A 5 mm, 65 mW LED has an ESR of roughly 15 Ohms, which is no longer negligible, so you could use 40 - 15 = 35 Ohms instead for 20 mA.

But if you use 9 V to drive a 20 mA red LED rated at 2.4 V typical, then 6.6 V / 0.02 A = 330 Ohms, and 0.02 A * 6.6 V = 132 mW, so use a 1/4 W resistor, not a 1/8 W.
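The worked examples above all follow the same two-step recipe, which can be sketched like this (illustrative helper names, not a real library):

```python
def series_resistor(v_supply, v_led_total, i_f):
    """Ohm's law on the voltage left over after the LED string."""
    return (v_supply - v_led_total) / i_f

def resistor_power(v_supply, v_led_total, i_f):
    """Power dissipated in the series resistor."""
    return (v_supply - v_led_total) * i_f

# Three 2.2 V red LEDs in series from a 9.3 V battery at 20 mA:
print(series_resistor(9.3, 3 * 2.2, 0.02))  # ~135 Ohms
print(resistor_power(9.3, 3 * 2.2, 0.02))   # ~54 mW
```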
 
Hi, let's say that I have a simple circuit with a battery connected to an LED.
Then you do not have a series resistor to limit the current, and the LED will instantly burn out.

If I connect a resistor to the ( - ) pin of the LED, would this resistor keep the voltage that goes to the LED low?
The series resistor can connect from the ( - ) pin of the LED to the ( - ) terminal of the battery, OR from the ( + ) pin of the LED to the ( + ) terminal of the battery.
The LED has its own voltage drop. The resistor does not reduce the voltage; it limits the current. You could even use a 1000 V power supply feeding a resistor in series with an LED, and if the resistor value is calculated correctly it works fine, except the resistor will get very hot.
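To make that last point concrete, a quick sketch with illustrative values (a 2 V red LED at 20 mA):

```python
# Even a 1000 V supply can drive a 2 V LED safely through a resistor,
# but nearly all of the power ends up in the resistor, not the LED:
v_supply, v_led, i_f = 1000.0, 2.0, 0.02

rs = (v_supply - v_led) / i_f       # ~49.9 kilohms
p_resistor = (v_supply - v_led) * i_f  # ~19.96 W -> very hot
p_led = v_led * i_f                    # only 0.04 W in the LED

print(rs, p_resistor, p_led)
```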
 
