
About the DC bias input voltage for a single-ended amplifier

elonjia

We all know that for a single-ended amplifier we need to provide a DC bias voltage to set the DC operating point, but I wonder what the influence of circuit noise on that DC bias voltage at the amplifier input is. Generally, how should we simulate this problem? For example, how large a noise bandwidth (such as 0.001 Hz to 0.1 Hz) should we use to simulate the effect of noise on the input pins or on the DC bias voltage?
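
One way to put a number on this (a minimal sketch for an ngspice-style simulator; the node names, the 0.9 V bias level and the my_amp subcircuit are placeholder assumptions, not taken from this thread) is to run a small-signal noise analysis with the sweep limited to the band of interest and read off the integrated input/output noise:

    * sketch only: names and values are placeholders
    VBIAS nbias 0 DC 0.9                   ; input DC bias source whose noise contribution is studied
    VDD   vdd   0 DC 1.8
    XAMP  nbias nout vdd 0 my_amp          ; your single-ended amplifier subcircuit
    .noise v(nout) VBIAS dec 10 0.001 0.1  ; sweep restricted to 1 mHz .. 0.1 Hz
    * inoise_total / onoise_total in the results give the noise integrated over that band
    .end

The band limits here are just the 0.001 Hz to 0.1 Hz you quoted; if you are unsure where the flicker-noise corner sits, widen the sweep until the integrated value stops changing appreciably.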
 
Mains hum is often troublesome, either at its fundamental frequency of 50 or 60 Hz or at 2×f after rectification. It can come from the supply rails or from an incoming signal. In simulations I simply add a small-amplitude sine or triangle wave.

In real life I've observed noise at about 48 kHz riding on the house mains voltage. I don't know whether it originates in modern light bulbs or in other electronic devices.
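
For what it's worth, in a standard SPICE transient run that can be a single line (the node name and levels below are placeholders):

    VBIAS nbias 0 SIN(0.9 10m 50)   ; 0.9 V DC bias with 10 mV of 50 Hz hum superimposed

i.e. the SIN source's offset carries the DC bias while its amplitude and frequency model the hum you want to inject.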
 
