Welcome to EDAboard.com

EDAboard.com is an international Electronics Discussion Forum focused on EDA software, circuits, schematics, books, theory, papers, ASIC, PLD, 8051, DSP, Network, RF, Analog Design, PCB, Service Manuals, and more.

Why does the amplifier saturate at the supply voltage?


SherlockBenedict
Member level 4, joined Dec 26, 2011
I know that an amplifier's output can never be greater than its supply voltage. What's the reason behind this?


Thanks a lot.
 

SherlockBenedict
We don't have a law of conservation of voltage, right? Only power needs to be conserved. In that case, why can't the output current decrease while the voltage increases, so that energy is still conserved?
 

crutschow
Advanced Member level 5, joined Feb 22, 2012, Colorado USA (Zulu -7)
Unless you have an inductance or capacitance to store energy and supply the extra voltage (as in a switching regulator or charge pump), an amplifier can only deliver an output voltage up to the supply voltage. Think of a standard amp as a variable resistor connected between the power supply and the load. Even if you reduce the resistance to zero, the output voltage can still be no greater than the supply voltage.
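The variable-resistor analogy above can be checked with a quick numerical sketch (names and values here are illustrative, not from the thread): model the output stage as a resistance R_amp in series with the load R_load across a single supply, and sweep R_amp toward zero.

```python
# Sketch of crutschow's analogy: the amplifier output stage as a variable
# resistor R_amp in series with the load R_load, fed from supply V_supply.
# (Illustrative model only; real output stages also lose a transistor
# saturation/dropout voltage, so they clip somewhat below the rail.)

def output_voltage(v_supply, r_amp, r_load):
    """Load voltage of the series divider: V_out = V_supply * R_load / (R_amp + R_load)."""
    return v_supply * r_load / (r_amp + r_load)

V_SUPPLY = 12.0   # supply voltage, volts (assumed example value)
R_LOAD = 8.0      # load resistance, ohms (assumed example value)

# Sweep the "amplifier resistance" from large to zero.
for r_amp in (100.0, 10.0, 1.0, 0.0):
    v_out = output_voltage(V_SUPPLY, r_amp, R_LOAD)
    print(f"R_amp = {r_amp:6.1f} ohm -> V_out = {v_out:5.2f} V")

# Even at R_amp = 0 the load sees exactly the supply voltage, never more:
# there is no series element left to "trade" current for extra voltage
# without an energy-storage component (inductor or capacitor).
```

No setting of R_amp can push V_out above V_SUPPLY, which is why trading current for voltage requires an energy-storage element, as the reply notes.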
 
