AllenD
Member level 5
inverter output Vhigh voltage
Hi Team,
I have designed a circuit with a 1.2 V DC power supply. Part of the circuit is a clock generator, and I route a few of its clock signals off-chip for testing. To drive a larger load, I used a chain of four series-connected inverter buffers before the output pad. Simulation suggested the chain can deliver 0-1.2 V square waves.
When I got my IC back, I glued a die to a PCB and wire-bonded it to a 50 Ohm SMA connector, which feeds a 50 Ohm oscilloscope input. The phase relationship of the outputs is correct, but the amplitude is wrong. The output inverter buffer should swing 0-1.2 V, but the oscilloscope shows only 0-0.9 V. Moreover, when I decrease the power supply voltage (Vdd) from 1.2 V to 0.9 V, the output high level also drops, giving roughly 0-0.6 V.
I have multiple ground and Vdd planes on my IC and PCB, and in my opinion the IR drop should not be this large...
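To put numbers on this, here is a quick sanity check of my measurements, under the simplifying assumption that the buffer behaves as an ideal Vdd source in series with an output resistance Rout driving the 50 Ohm scope termination (the Rout model and helper function are my own assumption, not from simulation):

```python
# Assumed model: ideal Vdd source + series driver resistance Rout,
# loaded by the 50-ohm oscilloscope input.
# Voltage divider: Vout = Vdd * RL / (RL + Rout)
#   =>  Rout = RL * (Vdd / Vout - 1)

RL = 50.0  # ohms, oscilloscope input impedance

def implied_rout(vdd, vout, rl=RL):
    """Driver output resistance implied by the measured divider ratio."""
    return rl * (vdd / vout - 1.0)

# My two measurements: (Vdd, measured output high level)
for vdd, vout in [(1.2, 0.9), (0.9, 0.6)]:
    rout = implied_rout(vdd, vout)
    print(f"Vdd = {vdd} V, Vout = {vout} V -> implied Rout = {rout:.1f} ohm")
```

If an Rout in this range matches the buffer's simulated on-resistance, the reduced swing would just be loading by the 50 Ohm termination rather than IR drop, but I am not sure this model applies.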
Does anyone have any insight into what is going on?
Thanks