
[SOLVED] Reducing the power of a power amplifier (changing drain voltage?)

rfhobbit

Hello amplifier experts,

Given condition: :smile:

I have a power amplifier that I designed for, say, 100 W at a 50 V drain voltage. This gives me a first-order approximation of the optimum load resistance I need to present at the amplifier output: Ropt = (Vdd - Vsat)^2 / (2*Pout). Taking Vsat ~ 0, that works out to Ropt = 12.5 ohms.
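For illustration, a minimal Python sketch of that load-line estimate (the function name is just mine; it plugs in the 50 V / 100 W numbers above with Vsat taken as 0):

```python
def r_opt(vdd, pout, vsat=0.0):
    """Optimum load resistance (ohms) for drain supply vdd (V) and output power pout (W)."""
    return (vdd - vsat) ** 2 / (2.0 * pout)

print(r_opt(50.0, 100.0))  # -> 12.5 ohms
```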

Now I would like to reduce the power to 50 W and then to 10 W (assuming Vsat ~ 0). I would compute the required drain voltage for each of these power levels from the same formula, keeping Ropt fixed at 12.5 ohms.
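Rearranging the same formula for the drain voltage, a quick sketch (assuming Vsat ~ 0 and Ropt held at 12.5 ohms) gives roughly 35.4 V for 50 W and 15.8 V for 10 W:

```python
import math

def vdd_for_power(pout, ropt, vsat=0.0):
    """Drain supply (V) needed to deliver pout (W) into a fixed optimum load ropt (ohms)."""
    return vsat + math.sqrt(2.0 * pout * ropt)

for p in (100.0, 50.0, 10.0):
    print(f"{p:5.0f} W -> Vdd ~ {vdd_for_power(p, 12.5):.1f} V")
# 100 W -> 50.0 V, 50 W -> 35.4 V, 10 W -> 15.8 V
```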

Questions: :idea:

1. Since the amplifier is designed and matched for 100 W with Ropt = 12.5 ohms, will it still be properly matched at 50 W and 10 W operation when I lower the drain voltage as computed from the formula above?

2. What can you say about the efficiency?

Thank you ;-)
 
