mw_rookie
Member level 2
Hello,
I am trying to use a general purpose wideband amplifier in the S band. It is a 50 Ohm matched amplifier. When I give an input power of about 0 dBm, or even -12 dBm, it amplifies with a gain of 8-9 dB. However, just as a test, if I input a power lower than -30 dBm there is no gain; in fact, there is a loss. This is an HBT based amplifier.
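For reference, this is how I am working out the gain from the analyzer readings, just the difference of output and input power in dBm (a minimal sketch; the numbers are only the readings quoted above):

def gain_db(p_in_dbm: float, p_out_dbm: float) -> float:
    # Gain in dB is the difference of output and input powers expressed in dBm.
    return p_out_dbm - p_in_dbm

print(gain_db(0.0, 8.5))      # ~8.5 dB, as observed at 0 dBm input
print(gain_db(-30.0, -32.0))  # negative, i.e. an apparent loss at the low input level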
My question is, what is the minimum threshold for an amplifier to amplify a signal? Does such a threshold exist? Is there any way to calculate it? When you bias an amplifier, aren't you bringing it into the required region of operation? Once it is in the correct region it should amplify any signal, since any amplifier amplifies noise as well, and the noise is going to be lower than -30 dBm in any case (a rough kTB check is below). What difference does the input power level make?
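Just to show where that noise figure comes from, here is a quick sketch of the thermal noise floor kTB at 290 K (the bandwidth values are only assumptions for illustration):

import math

k = 1.380649e-23  # Boltzmann constant, J/K
T = 290.0         # standard noise temperature, K

def thermal_noise_dbm(bandwidth_hz: float) -> float:
    # Thermal noise power kTB, converted from watts to dBm.
    return 10.0 * math.log10(k * T * bandwidth_hz / 1e-3)

print(thermal_noise_dbm(1e6))  # ~ -114 dBm in a 1 MHz bandwidth
print(thermal_noise_dbm(2e9))  # ~ -81 dBm even over roughly the full 2 GHz of S band

Either way, the thermal noise is far below the -30 dBm input at which the amplifier stops showing gain.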
Please advise.
Thanks.