First-bit loss issue in optical receiver systems

Status
Not open for further replies.

chang830

Hi,
I am designing a DC-10 MHz optical receiver intended mainly for industrial monitoring and control applications. Because the passband extends down to DC, the receiver has to handle burst-mode-like signals. To cover the large dynamic range requirement, the receiver uses an AGC loop that adjusts the gain to the input signal strength. However, I have found that in some cases, because of the limited time constant of the AGC loop, the 1st bit (or the 1st and 2nd bits) of a burst is lost at the output.
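To illustrate what I am seeing, here is a rough simulation sketch (not my actual design; the bit rate, burst amplitude, starting gain and the ~3-bit-period AGC time constant are all assumed numbers) of a first-order AGC loop settling on a weak burst that arrives after the gain has adapted to a strong signal. With the loop time constant spanning a few bit periods, the first '1' bit of the burst does not reach the decision threshold and is lost:

Code:
import numpy as np

# All numbers below are illustrative assumptions, not measured values.
bit_rate = 10e6                     # 10 Mb/s, near the receiver's upper band edge
samples_per_bit = 50
fs = bit_rate * samples_per_bit
bit_period = 1.0 / bit_rate

# Weak NRZ burst arriving after an idle gap.
bits = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 1])
idle = np.zeros(20 * samples_per_bit)
burst = 0.05 * np.repeat(bits, samples_per_bit).astype(float)   # 50 mV burst
x = np.concatenate([idle, burst])

# First-order AGC: gain relaxes toward target/envelope with time constant tau.
target = 1.0                        # desired output amplitude (V)
tau = 3 * bit_period                # loop time constant spanning ~3 bits (assumption)
alpha = 1.0 / (fs * tau)            # per-sample smoothing factor

g = 1.0                             # gain still low, left over from a previous strong burst
env = 0.0
gain = np.empty_like(x)
for i, xi in enumerate(x):
    env = max(abs(xi), env * (1 - alpha))        # crude peak-hold envelope detector
    if env > 0:
        g += alpha * (target / env - g)          # drag gain toward the target level
    gain[i] = g

out = gain * x

# Slice each bit at its centre against a fixed half-amplitude threshold.
centres = len(idle) + samples_per_bit // 2 + samples_per_bit * np.arange(len(bits))
decided = (out[centres] > 0.5 * target).astype(int)
print("sent   :", bits.tolist())
print("decided:", decided.tolist())              # the first '1' bit falls below threshold

Reducing tau to well under one bit period in this sketch recovers the first bit, which matches my suspicion that the AGC settling time is the cause.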

I am not very familiar with these systems, so I wonder whether losing those bits would be a problem in real applications.

Thanks
 
