
First-bit loss issue in optical receiver systems


chang830

Hi,
I am designing a DC-10 MHz optical receiver, mainly for industrial monitoring and control applications. Because it works down to DC, the receiver has to handle burst-mode-like signals. To meet the large dynamic-range requirement, the receiver uses an AGC loop that adjusts the gain to the input signal strength. However, I found that in some cases, because of the limited time constant of the AGC loop, the first bit (or the first and second bits) of a burst is lost at the output.
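
To illustrate the effect, here is a minimal Python sketch of a first-order AGC loop recovering its gain at the start of a weak burst. All numbers (sample rate, bit rate, loop bandwidth, signal levels, initial gain, slicer threshold) are assumed for demonstration only and are not taken from the actual design:

```python
import numpy as np

fs = 200e6                       # simulation sample rate (assumed)
bit_rate = 10e6                  # top of the DC-10 MHz band
sps = int(fs / bit_rate)         # 20 samples per bit

bits = np.array([1, 1, 0, 1, 1, 0, 1, 0, 1, 1])
x = np.repeat(bits, sps) * 0.1   # weak burst: 0.1 V "1" level (assumed)

target = 1.0     # output amplitude the AGC regulates toward
g = 1.0          # gain left low by a preceding strong burst (assumed)
alpha = 0.015    # per-sample loop gain -> settling time of ~2 bit periods

y = np.zeros_like(x)
env = 1e-3                                   # peak-detector state (small seed)
for n, s in enumerate(x):
    y[n] = g * s                             # variable-gain amplifier output
    env = max(abs(y[n]), env * 0.995)        # peak detector with slow decay
    g += alpha * (target / env - 1.0) * g    # first-order gain update

# Slice each bit at mid-point against a fixed threshold of target/2
mid = np.arange(len(bits)) * sps + sps // 2
detected = (y[mid] > 0.5 * target).astype(int)
print("sent    :", bits.tolist())
print("detected:", detected.tolist())
```

With these assumed values the gain needs roughly two bit periods to settle, so the slicer reads the first two "1" bits as "0" even though the rest of the burst is recovered cleanly, which is exactly the symptom I am seeing.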

I am not familiar with such systems. I wonder whether this would be a problem in real applications.

Thanks
 
