how to measure input impedance of a power detector (diode detector) circuit


per_lube

Hi all,

I have a zero-bias Schottky diode based power detector. It gives a DC output for input power levels down to -30 dBm.

I want to measure the input impedance of this circuit.
I connected the input of the circuit to VNA port 1; the magnitude plot shows about -3 dB.

1. Is the method I used to measure the input impedance correct? If not, what is the proper method?

2. What can I do to improve the input matching?
If I use an impedance-matching circuit, will it improve the sensitivity (will it detect signals below -30 dBm)?
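
For reference, converting the VNA reading into an impedance and mismatch loss would look roughly like this. This is a quick Python sketch that assumes the -3 dB value is |S11| on a 50 ohm port; the phase is an illustrative placeholder, since the magnitude plot alone does not give it.

```python
# Quick sketch: S11 -> input impedance and mismatch loss.
# Assumes the -3 dB reading is |S11| on a 50 ohm reference port;
# the 0-degree phase below is an illustrative placeholder.
import cmath
import math

Z0 = 50.0  # VNA reference impedance, ohms

def s11_to_impedance(s11: complex, z0: float = Z0) -> complex:
    """Input impedance from the reflection coefficient: Z = Z0*(1 + S11)/(1 - S11)."""
    return z0 * (1 + s11) / (1 - s11)

def mismatch_loss_db(s11_mag: float) -> float:
    """Fraction of power lost to reflection: -10*log10(1 - |S11|^2)."""
    return -10.0 * math.log10(1.0 - s11_mag ** 2)

s11_mag = 10 ** (-3.0 / 20.0)               # |S11| = -3 dB -> about 0.708
s11 = cmath.rect(s11_mag, math.radians(0))  # phase assumed, not measured

print(f"|S11|          = {s11_mag:.3f}")
print(f"Z_in (example) = {s11_to_impedance(s11):.1f} ohm")
print(f"Mismatch loss  = {mismatch_loss_db(s11_mag):.2f} dB")
```

With |S11| around 0.7, roughly half the incident power is reflected, i.e. about 3 dB never reaches the diode.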

cheers,
per_lube
 


As you wrote, your detector should operate with -30 dBm input, and the VNA generates -3 dBm, so you should use a 27 dB attenuator to get the required input power.
Then the detector will be well matched (the VNA will "see" only the attenuator).
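
To put numbers on that, here is a small Python sketch of the pad arithmetic; the 3 dB detector return loss is an assumed example, not a measured value:

```python
# Sketch of the pad arithmetic: the reflection from the detector passes
# through the attenuator twice, so the return loss the VNA sees improves
# by twice the pad value, while the incident power drops by the pad value.
pad_db = 27.0          # attenuator suggested above
vna_power_dbm = -3.0   # VNA source level quoted above
detector_rl_db = 3.0   # detector's own return loss (assumed example)

power_at_detector = vna_power_dbm - pad_db       # -30.0 dBm at the detector
rl_seen_by_vna = detector_rl_db + 2.0 * pad_db   # 57.0 dB seen by the VNA

print(f"Power at detector input : {power_at_detector:.1f} dBm")
print(f"Return loss seen by VNA : {rl_seen_by_vna:.1f} dB")
```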

Detectors without input attenuators have an input impedance that varies with input power.
Many detectors are used as square-law ("quadratic response") devices whose output voltage is proportional to input power for levels below about -20 dBm. Their output load should be >10 kOhm, and their video-frequency response is then often slow, below ~10 kHz.
For a fast response a low-impedance load is needed, and the detector sensitivity suffers: the K-factor, typically 2-3 V per 1 mW of input, may drop to about 0.2 V/mW with a load under 100 Ohm. The video-frequency response may then exceed 1 GHz, depending on the video-output capacitor value.
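
A rough Python sketch of that sensitivity/bandwidth trade-off; the K-factors are the values quoted above, while the load and video-capacitor values are assumptions chosen only for illustration:

```python
# Square-law detector: output voltage proportional to input power,
# video bandwidth set by the load/video-capacitor RC corner.
import math

def vout_square_law(pin_dbm: float, k_v_per_mw: float) -> float:
    """Output voltage in the square-law region: V = K * Pin."""
    return k_v_per_mw * 10 ** (pin_dbm / 10.0)   # Pin converted to mW

def video_bw_hz(r_load_ohm: float, c_video_f: float) -> float:
    """Video bandwidth of the output RC corner: f = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r_load_ohm * c_video_f)

# High-impedance load: good sensitivity, slow video response (C assumed 10 nF)
print(vout_square_law(-30.0, 2.5), "V,", video_bw_hz(10e3, 10e-9), "Hz")

# Low-impedance load: sensitivity drops, fast video response (C assumed 1.5 pF)
print(vout_square_law(-30.0, 0.2), "V,", video_bw_hz(100.0, 1.5e-12), "Hz")
```

Under these assumed numbers the high-impedance case gives only a few millivolts of DC output at -30 dBm, with a video bandwidth in the low-kHz range, while the low-impedance case trades most of that output voltage for a video bandwidth around 1 GHz.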

A simple diode detector can be quite a complex device, depending on the application.
 