
LDMOS Impedance over Temperature

Status
Not open for further replies.

languer (Newbie level 4, joined Sep 22, 2005)
We have been working with some LDMOS amplifiers for some time now. These are used for pre-driver applications in VHF radios. In some recent pulsed measurements we found something really odd: as the device temperature fell below -25°, the output pulse rise time went from 10 ms to 3 s (the device gets within 2 dB of its final output almost immediately, but takes an additional 3 s to reach full output power).
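As a sanity check on what that "within 2 dB of final output" figure means in power terms, here is a minimal sketch of the dB arithmetic (the function name and values are illustrative, not from the original post):

```python
# Convert a dB shortfall below final output into a fraction of full power.
def db_to_power_fraction(db_below: float) -> float:
    return 10 ** (-db_below / 10)

# 2 dB below final output corresponds to roughly 63% of full power,
# so the slow 3 s tail covers only the last ~37% of the output power.
frac = db_to_power_fraction(2.0)
print(f"{frac:.3f}")  # ~0.631
```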

To verify the pulsed operation, the bias is applied about 1 ms before the RF. When this behavior was observed, the RF input and the DC bias (gate voltage and drain voltage/current) were monitored. All came up within 10 ms, yet the RF output had a 3 s rise time. Monitoring the input of the device, you can see that the input impedance (S11) changes with temperature, as does the gain.
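The kind of rise-time measurement described above can be sketched numerically. This is a hypothetical illustration, not the original test setup: it builds a synthetic output-power envelope with a fast initial rise plus a slow seconds-scale tail (mimicking the reported behavior) and extracts a 10%–90% rise time from it:

```python
import numpy as np

def rise_time(t, p, lo=0.1, hi=0.9):
    """10%-90% rise time of a power envelope (assumes a roughly monotonic rise)."""
    p_final = p[-1]
    t_lo = t[np.argmax(p >= lo * p_final)]  # first sample crossing 10% of final
    t_hi = t[np.argmax(p >= hi * p_final)]  # first sample crossing 90% of final
    return t_hi - t_lo

# Synthetic envelope: fast ms-scale rise plus a slow thermal-like settling tail.
# All time constants and the 80/20 split are made-up illustrative values.
t = np.linspace(0, 5, 50001)           # seconds
fast = 1 - np.exp(-t / 0.003)          # ~ms-scale rise
slow = 1 - np.exp(-t / 1.0)            # ~seconds-scale settling
p = 0.8 * fast + 0.2 * slow            # last 20% of power arrives slowly
print(f"10-90% rise time: {rise_time(t, p):.2f} s")
```

With a split like this, the envelope crosses 10% of final power almost instantly but only crosses 90% once the slow tail has settled, so the measured 10%–90% rise time lands on the seconds scale even though most of the power arrives in milliseconds.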

The device in question is an old Motorola part (MRF9482); we have tried several units and they all behave the same. However, a similar device from another manufacturer does not appear to exhibit this issue. I was wondering if anybody has had similar experience with LDMOS devices.