Welcome to EDAboard.com


Question on Optical Receiver Test Results

Status
Not open for further replies.

suria3
Full Member level 5 · Joined Mar 5, 2004
Hi guys,

I designed an optical receiver circuit intended to operate at 1.25 Gbps. The circuit consists of a transimpedance amplifier (TIA), a limiting amplifier (LA), and a low-voltage differential signaling (LVDS) driver. Simulations on the extracted layout showed a sensitivity of 10 uApp input current with an LVDS output swing of 350 mVpp, at a total current consumption of 35 mA. When we tested the chip, however, we did not get the expected sensitivity: we only obtained a clear eye diagram down to 50 uApp input current, and below that the output was noisy. So there is a large sensitivity gap between the simulated result (with noise enabled in the simulator) and the lab measurement. We also tested another optical receiver chip, a lower-power design consuming 12 mA, and it reached 10 uApp in the lab, matching the simulation.
I would like to seek input from the people in this forum on what could have gone wrong, and what factors might explain the failure to achieve the simulated sensitivity.
Attached are the simulated and measured eye diagrams at the output of the optical receiver.
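As a back-of-envelope check (my own sketch, not a claim about this particular chip): for NRZ data with Gaussian noise, sensitivity and input-referred noise are commonly related by i_pp ≈ 2·Q·i_n,rms, where Q ≈ 7 for a BER target of 1e-12. Under that assumption, the jump from 10 uApp (simulated) to 50 uApp (measured) implies the input-referred noise is roughly 5x higher on the bench than in simulation, which may help bound what to look for (supply noise, substrate coupling, photodiode capacitance, test-setup noise).

```python
import math

def q_for_ber(ber, lo=0.0, hi=20.0):
    """Invert BER = 0.5 * erfc(Q / sqrt(2)) for Q by bisection."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if 0.5 * math.erfc(mid / math.sqrt(2)) > ber:
            lo = mid  # BER still too high -> need larger Q
        else:
            hi = mid
    return (lo + hi) / 2

def input_noise_from_sensitivity(i_pp_ua, ber=1e-12):
    """Input-referred RMS noise current (uA) implied by an NRZ
    sensitivity of i_pp_ua (uA peak-to-peak) at the given BER,
    assuming i_pp = 2 * Q * i_n,rms."""
    q = q_for_ber(ber)
    return i_pp_ua / (2 * q)

sim_noise = input_noise_from_sensitivity(10.0)   # simulated: 10 uApp
meas_noise = input_noise_from_sensitivity(50.0)  # measured: 50 uApp
print(f"Q for BER 1e-12: {q_for_ber(1e-12):.2f}")         # ~7.03
print(f"Simulated input noise: {sim_noise:.2f} uA rms")    # ~0.71
print(f"Measured input noise:  {meas_noise:.2f} uA rms")   # ~3.56
print(f"Implied noise increase: {meas_noise/sim_noise:.1f}x")
```

This treats BER 1e-12 as the pass criterion and ignores ISI and deterministic jitter, so it is only a rough bound on the excess noise.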

Thanks,
Suria.
 
