1dB Compression voltage simulation

Status
Not open for further replies.

freewing

I'm simulating the 1-dB compression voltage of an amplifier. It's weird that the gain does not decrease monotonically as the amplitude of the input signal increases. For instance, from 25 mV to 150 mV the gain increases from 1.55 to 1.63, and then from 150 mV to 300 mV it decreases from 1.63 to 1.33. What's the reason, and how can I find the 1-dB compression voltage?
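The amplitude sweep described above can be post-processed numerically: take the small-signal gain as the reference and find the first input amplitude where the gain has dropped 1 dB below it. The sketch below is not the poster's circuit — it assumes a hypothetical tanh soft-compression amplifier model with made-up values `A0` (small-signal gain, chosen near the 1.55–1.63 reported) and `vsat` (saturation voltage), just to illustrate the extraction step.

```python
import numpy as np

# Hypothetical amplifier model: ideal gain A0 with tanh soft compression.
# A0 and vsat are assumed values for illustration, not from the real circuit.
A0 = 1.6      # small-signal voltage gain (assumed)
vsat = 0.4    # saturation voltage of the tanh model (assumed)

def vout(vin):
    return vsat * np.tanh(A0 * vin / vsat)

vin = np.linspace(1e-3, 0.5, 500)           # swept input amplitude [V]
gain_db = 20 * np.log10(vout(vin) / vin)    # large-signal gain [dB]
g0_db = 20 * np.log10(A0)                   # small-signal reference gain [dB]

# 1-dB compression: first swept amplitude where gain is >= 1 dB below g0_db
idx = int(np.argmax(gain_db <= g0_db - 1.0))
v_1db = vin[idx]
print(f"1-dB compression input amplitude ~ {v_1db * 1000:.0f} mV")
```

Note this only works cleanly if the gain curve is monotonic past the small-signal region; with gain expansion like the poster describes, the reference gain itself is ambiguous, which is one reason the PSS-based extraction suggested below in the thread is more robust.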
 

The problem is the circuit's nonlinearity.
What are the DC operating points? Keep them fixed when extracting the 1-dB compression voltage.
 

In order to get the 1-dB compression point, you need to run a PSS simulation.
 

I successfully extracted the 1-dB compression point using PSS. It's more convenient.
 
