ibrahim03
Full Member level 1
I am facing a rather simple problem and want to confirm my doubts. I wanted to use LEDs in a particular circuit of mine; the supply was 32 V DC, and I wanted about 10 mA through each (green) LED for the appropriate brightness, so I used good old Ohm's law to calculate the resistance:
R = 32 V / 10 mA = 3.2 kOhm
So I took ordinary 3.3 kOhm, 0.25 W resistors and put them in series with the LEDs. But when I turned the circuit on, the resistors started to heat up! So I calculated the power:
P = V * I
P = 32 V * 10 mA = 0.32 W
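To double-check my arithmetic, here is a quick Python sketch of the same calculation (the ~2 V forward drop I assume for a green LED is a guess on my part, not a measured value):

# Sanity check of the series-resistor numbers for the 32 V DC case.
V_SUPPLY = 32.0   # supply voltage (V)
V_LED = 2.0       # assumed forward drop of a green LED (V) - my guess
I_LED = 0.010     # target LED current (A)

R_ideal = (V_SUPPLY - V_LED) / I_LED   # ideal series resistance: ~3000 ohm
P_fitted = I_LED ** 2 * 3300           # dissipation in the 3.3 kOhm I fitted: ~0.33 W

print(f"Ideal R = {R_ideal:.0f} ohm")
print(f"P in 3.3 kOhm at 10 mA = {P_fitted:.2f} W")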
That was the problem: I needed higher-wattage resistors, 3.2 kOhm at 0.5 W. All seemed fine up to this point. But in another device (an extension socket), I found an LED connected across 220 V AC through a 100 kOhm series resistor, and that resistor was rated 0.25 W! Performing the same calculation for this scenario:
I = 220 V / 100 kOhm = 2.2 mA
P = V * I = 220 V * 2.2 mA = 0.484 W
But the resistor fitted was rated 0.25 W! Can anyone tell me why a 0.5 W resistor was not used in this case? And what if I want to pass 10 mA at 220 V?
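For reference, here is the same arithmetic in Python for the 220 V case, treating 220 V AC as if it were DC and ignoring the LED drop (which may itself be where I am going wrong):

# The 220 V AC case with the 100 kOhm resistor found in the extension socket.
V = 220.0
R_FITTED = 100e3

I = V / R_FITTED      # ~2.2 mA
P = V * I             # ~0.484 W, yet the fitted resistor is rated 0.25 W

R_10mA = V / 0.010    # resistor needed for 10 mA: 22 kOhm
P_10mA = V * 0.010    # dissipation at 10 mA: 2.2 W

print(f"I = {I * 1000:.1f} mA, P = {P:.3f} W")
print(f"For 10 mA: R = {R_10mA / 1000:.0f} kOhm, P = {P_10mA:.1f} W")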
I might be missing something very basic here; please point out the problem.
I also found a similar topic at:
In it, a circuit diagram is given for connecting an LED to the mains. What will happen if I don't include the diode bridge? The mains frequency is 50 Hz, so in my view it shouldn't have any visible effect, right? Also, the capacitor in the circuit needs to be rated at about 500 V; isn't that too high? Wouldn't such a capacitor be a bit expensive? (I might be wrong.)
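On the capacitor rating, here is my back-of-envelope check (assuming the part has to withstand the mains peak rather than the RMS value, which is just my guess):

import math

V_RMS = 220.0
V_PEAK = V_RMS * math.sqrt(2)   # ~311 V peak for a 220 V RMS sine

print(f"Peak of {V_RMS:.0f} V RMS mains = {V_PEAK:.0f} V")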