elonjia
Newbie level 6
We all know that a single-ended amplifier needs a DC bias voltage to set its DC operating point, but I wonder what the influence of circuit noise on that DC bias voltage at the amplifier input is. Generally, how should we simulate this problem? For example, how wide a noise bandwidth (e.g. 0.001 Hz~0.1 Hz) should we use when simulating the effect of noise on the input pins or on the DC bias voltage?
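For context on why the lower frequency limit matters here: the RMS noise seen on a bias node is the square root of the noise PSD integrated over the chosen bandwidth, and for 1/f (flicker) noise that integral grows with the log of the frequency ratio, so the choice of lower edge (0.001 Hz vs. something smaller) directly changes the answer. A minimal sketch of that arithmetic, assuming a purely 1/f PSD S_v(f) = k_f / f with a hypothetical coefficient k_f (not from any specific device):

```python
import math

def rms_noise_1f(k_f, f_lo, f_hi):
    """RMS voltage of a 1/f noise PSD S_v(f) = k_f / f [V^2/Hz],
    integrated from f_lo to f_hi:
    v_rms = sqrt( integral k_f/f df ) = sqrt( k_f * ln(f_hi / f_lo) )."""
    return math.sqrt(k_f * math.log(f_hi / f_lo))

# Hypothetical flicker coefficient: k_f = 1e-12 V^2 (1 uV/sqrt(Hz) at 1 Hz)
k_f = 1e-12

# Bandwidth from the question: 0.001 Hz to 0.1 Hz
v1 = rms_noise_1f(k_f, 0.001, 0.1)
# Extending the lower edge a decade further down adds noise only slowly (log growth)
v2 = rms_noise_1f(k_f, 0.0001, 0.1)
print(v1, v2)  # v2 > v1, but only by the ratio sqrt(ln(1000)/ln(100))
```

This suggests the lower band edge should be set by how long the bias point must stay stable (roughly 1/observation-time), since pushing it lower keeps adding 1/f noise, just logarithmically.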