suria3
Full Member level 5
Hi Guys,
I have designed an LVDS driver with a 3.3 V power supply. I used 1.8 V transistors in the driver to ease the voltage-headroom limitation, and it works fine. Now I have an issue when I ramp the power supply from 0 V to 3.3 V with a 10 ns rise time. In the transient simulation I see voltage-breakdown violations on some of the transistors, whose terminal voltages exceed 1.98 V. My question is: do I need to worry about this transistor breakdown during ramp-up, or is it normal to see such violations while the supply is ramping? Please explain.
Thanks,
suria.
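For anyone wanting to quantify the violations: a minimal sketch of a post-processing check, assuming you export the transient waveform from your simulator as (time, voltage) sample pairs. The 1.98 V threshold is the limit mentioned above; the function name and the sample ramp data are hypothetical, for illustration only.

```python
# Scan exported transient samples for overstress windows.
# 1.98 V is the breakdown check limit quoted for the 1.8 V devices.
VMAX = 1.98

def overstress_windows(samples, vmax=VMAX):
    """Return (t_start, t_end, v_peak) for each interval where v > vmax."""
    windows = []
    start = end = peak = None
    for t, v in samples:
        if v > vmax:
            if start is None:
                start, peak = t, v
            else:
                peak = max(peak, v)
            end = t
        elif start is not None:
            windows.append((start, end, peak))
            start = None
    if start is not None:
        windows.append((start, end, peak))
    return windows

# Hypothetical Vds samples during a 10 ns VDD ramp (time in ns, volts)
ramp = [(0, 0.0), (2, 0.7), (4, 1.5), (6, 2.4), (8, 2.1), (10, 1.6)]
print(overstress_windows(ramp))  # → [(6, 8, 2.4)]
```

This tells you how long each device is overstressed and by how much, which matters because ramp-up overstress is usually judged by duration and magnitude, not just by whether the limit is crossed.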