# about voltage divider

Status
Not open for further replies.

#### needforspeed

##### Member level 1
hi,
I want to implement a voltage divider with capacitors in series to drive a MOS amplifier.
Simulators can't handle the circuit because there is no DC bias for the gate of the MOS transistor.
I want to know whether this circuit would work on an actual chip. Thanks.

Actually, the circuit would work only for AC signals, because the capacitors don't form a voltage divider at DC. So try applying an AC signal and simulating.

It will never work on chip. Like you said, you need to define the DC voltage at the gate of the device.

Conceptually, the capacitive divider works for both AC and DC *changes* in the input voltage, because (1/sC2)/(1/sC1 + 1/sC2) = C1/(C1 + C2), which is independent of frequency. Thus any change in the input voltage causes a proportional change in the output. However, the absolute output level depends strongly on the initial condition of the circuit (the charge stored on the capacitors), which cannot be guaranteed unless extra circuitry is added to fix that initial condition. The circuit is also not easy to simulate with AC analysis, unless you force the simulator to skip the operating-point analysis and supply the operating point yourself (you may need to add a dummy resistor to ground with a huge value).
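The point above can be checked with a few lines of charge-conservation arithmetic. This is a minimal sketch (capacitor values and the trapped charge `q0` are hypothetical): for C1 from Vin to the output node and C2 from the output node to ground, the divider ratio for *changes* is C1/(C1+C2), while the absolute level shifts with whatever charge is trapped on the floating node.

```python
# Ideal series capacitive divider: C1 from Vin to Vout, C2 from Vout to ground.
# The middle node is floating, so its total charge q0 is conserved:
#   C1*(Vout - Vin) + C2*Vout = q0  =>  Vout = (C1*Vin + q0) / (C1 + C2)
C1, C2 = 2e-12, 6e-12   # hypothetical capacitor values (farads)

def vout(vin, q0=0.0):
    """Output voltage for input vin with trapped node charge q0 (coulombs)."""
    return (C1 * vin + q0) / (C1 + C2)

# A 1 V change in Vin moves Vout by C1/(C1+C2) = 0.25 V regardless of q0...
print(vout(1.0) - vout(0.0))       # 0.25
# ...but the absolute output depends entirely on the initial condition:
print(vout(0.0, q0=1e-12))         # 0.125 V offset from trapped charge alone
```

This shows why the divider "works" for signal swings at any frequency yet is useless as a bias generator: nothing in the ideal circuit determines `q0`.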

Conceptually, a capacitor has infinite impedance at DC; Vin is thus disconnected from the circuit.

I think in a silicon implementation the cap is never ideal: there is always a finite impedance, i.e. a leakage current path, in parallel with it. In the end the DC operating point is decided by these leakage currents, which leads to huge variation. So dividing a DC voltage with caps is a bad idea, although the divider can still be used for high-frequency signals.
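To make the leakage argument concrete: at DC each cap behaves like its leakage resistance, so the divider degenerates into a resistive divider of two poorly controlled leakage paths. A small sketch with made-up leakage values (real on-chip leakage varies wildly with process and temperature) shows how far the DC point can drift:

```python
# At DC the series caps look like their parallel leakage resistances,
# so the output settles to a resistive-divider voltage set by leakage.
def dc_point(vin, r_leak1, r_leak2):
    """DC output of the divider when only leakage conducts.

    r_leak1: leakage resistance of the top cap (Vin to Vout)
    r_leak2: leakage resistance of the bottom cap (Vout to ground)
    """
    return vin * r_leak2 / (r_leak1 + r_leak2)

vdd = 1.8
# Hypothetical leakage corners: matched, top-cap leaky, bottom-cap leaky.
for r1, r2 in [(1e12, 1e12), (1e11, 1e13), (1e13, 1e11)]:
    print(dc_point(vdd, r1, r2))
# Output swings from mid-rail to near VDD to near ground,
# so the gate bias is essentially undefined.
```

This is why the thread's conclusion holds: the AC divider ratio is well defined by C1/(C1+C2), but the gate's DC level is at the mercy of leakage mismatch unless a dedicated bias network sets it.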
