
Light Source: Doubling intensity versus doubling sources

Status: Not open for further replies.

mdwebster (Newbie level 1, joined Mar 6, 2006)
Greetings,
I'm trying to figure out an optical intensity question. I'm looking at ways to increase the range of an IR emitter. Would it be better to add a second emitter, or to increase the current to double the intensity of a single emitter?

Intuitively, it seems that multiple emitters wouldn't increase, say, the half-brightness range as much as doubling the intensity of a single source, but I'm not sure. It seems that multiple sources mainly give you the opportunity to widen your beam by situating the LEDs at slightly offset angles from one another.

But ... I'm not sure, so I thought I'd ask ... :D

Thanks for any help,
Mike
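For what it's worth, under a simple inverse-square model with incoherent IR LEDs, the on-axis range gain is the same either way: doubling the total radiant intensity, whether by driving one emitter harder (ignoring efficiency loss) or by co-locating two emitters, extends the detection range by a factor of √2. A minimal sketch (the 10 mW/sr intensity and 0.1 mW/m² detector threshold are made-up illustrative values, not from any datasheet):

```python
import math

def range_for_threshold(radiant_intensity_mw_sr, threshold_mw_m2):
    """Distance at which on-axis irradiance I/d^2 falls to the receiver threshold."""
    return math.sqrt(radiant_intensity_mw_sr / threshold_mw_m2)

# Hypothetical baseline: 10 mW/sr emitter, 0.1 mW/m^2 detection threshold
base = range_for_threshold(10.0, 0.1)     # single emitter
doubled = range_for_threshold(20.0, 0.1)  # 2x intensity: one hotter LED or two co-located LEDs

print(doubled / base)  # ~1.414, i.e. a sqrt(2) range gain either way
```

So the half-brightness range improves identically in both schemes; the real differences come from beam shaping (offset angles, as you note) and from LED efficiency at high current.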
 

Doubling sources, my friend, will only produce interference. If it's constructive, the intensity will increase. But if you want higher range, just increase the current, because interference depends on the ambient conditions. Hope you succeed, Mike.
 

You can get effective interference only with monochromatic sources (e.g. laser diodes); ordinary LEDs add incoherently.

Above a certain drive current, IR LEDs show an efficiency drop with increasing current; at that point, using multiple chips is advisable, provided no imaging optics are involved.
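That efficiency droop can be illustrated with a toy sublinear power-versus-current model. The constants k and alpha below are arbitrary illustrative values (alpha < 1 models droop), not from any device datasheet:

```python
def radiant_power_mw(current_ma, k=0.5, alpha=0.8):
    """Toy droop model: radiant output grows sublinearly with drive current.
    k and alpha are illustrative constants, not datasheet values."""
    return k * current_ma ** alpha

single_at_2x = radiant_power_mw(200.0)   # one LED driven at 200 mA
two_at_1x = 2 * radiant_power_mw(100.0)  # two LEDs at 100 mA each

print(two_at_1x > single_at_2x)  # True: with droop, splitting the current wins
```

With no droop (alpha = 1) the two options would be equal; the stronger the droop, the bigger the advantage of spreading the current across multiple chips.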
 
