Name of the metric that describes the minimal power difference


elgen

I have a radar-related problem. Consider a target that reflects 1 mW of power from 1 GHz to 2 GHz. Then I replace the target with another one, and the reflected power is 1.1 mW over the same frequency range. The power variation is 0.1 mW.

My question is: what is the name of the metric that describes the minimal discernible difference in reflected power for a spectrum analyzer? In the above example, subtracting the two power readings should give 0.1 mW.

Does this refer to the "dynamic range" of a spectrum analyzer? My feeling says "no", as "dynamic range" specifies the minimal detectable signal, not the minimal difference between two signals.

Any comment is welcome. Thank you.
 

The term "resolution" -- or a variant, resolving power -- is often used generically to describe the ability to distinguish two closely spaced estimates or objects. I don't know whether there is a specific term for radar, but resolution of the reflected power would apply to your example.

John
 

Thx for the pointer. I looked up the spec of the vector network analyzer in the lab, and indeed it has something called "power resolution", specified in dB.

Would anyone be able to comment on why the power resolution is specified in dB rather than on a linear scale for the network analyzer?
 

Thx for the reply. Power margin seems to have a good definition as the excess power above the minimum required power. I initially thought I was looking for the "sensitivity" of a system, but "sensitivity" seems to be defined as the minimum detectable power above the noise floor, whereas my signal is interference-limited, not noise-limited.

My current understanding is that this "power resolution" is an analog quantity. On the digital side, would this "power resolution" also depend on the ADC inside the receiver? For instance, given a power resolution of 0.01 dB (as in the case of the network analyzer), would the numerical value saved to a computer be the same if the power variation is within this resolution?
 

For instance, given a power resolution of 0.01 dB (as in the case of the network analyzer), would the numerical value saved to a computer be the same if the power variation is within this resolution?
Not necessarily. Numerical truncation or rounding is only one possible limitation on result accuracy. The most common is noise-originated measurement uncertainty, which is usually at least an order of magnitude larger than the magnitude resolution.
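To see how this plays out, here is a minimal Python sketch of repeated power readings; the 0.01 dB recording step and the 0.1 dB RMS noise are illustrative assumptions, not values from any particular instrument:

Code:
import math
import random

RESOLUTION_DB = 0.01   # assumed display/recording resolution
NOISE_RMS_DB = 0.1     # assumed noise-induced uncertainty (RMS)

def recorded_reading(true_power_mw):
    """True power in dBm, plus Gaussian noise, rounded to the recording step."""
    true_dbm = 10 * math.log10(true_power_mw)
    noisy_dbm = true_dbm + random.gauss(0, NOISE_RMS_DB)
    return round(noisy_dbm / RESOLUTION_DB) * RESOLUTION_DB

random.seed(0)
# Two targets whose true reflected powers differ by 0.1 mW (about 0.41 dB).
for label, p_mw in [("target A", 1.0), ("target B", 1.1)]:
    readings = [recorded_reading(p_mw) for _ in range(5)]
    print(label, [f"{r:+.2f} dBm" for r in readings])

With 0.1 dB of noise, repeated readings of the same target scatter by far more than the 0.01 dB recording step, so two powers that differ by less than the resolution are not guaranteed to produce identical stored values.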
 

...Would anyone be able to comment on why the power resolution is specified in dB rather than on a linear scale for the network analyzer?

hi elgen,

In communications, the dB scale is often used because of its intrinsic simplicity for calculating losses in a chain of connections.
The only math operation required is addition (+), which can be done in field installations without an electronic calculator.
On a linear scale, by contrast, you have to multiply the power loss factors at each node.

Hope this helps.
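To illustrate the bookkeeping, here is a minimal Python sketch with made-up stage losses (the 0.5 dB, 0.2 dB, and 3.5 dB values are arbitrary examples):

Code:
import math

stage_losses_db = [0.5, 0.2, 3.5]   # assumed losses of cable, connector, splitter
stage_ratios = [10 ** (-l / 10) for l in stage_losses_db]   # same losses as power ratios

total_loss_db = sum(stage_losses_db)     # dB bookkeeping: just add
total_ratio = math.prod(stage_ratios)    # linear bookkeeping: multiply each stage

print(f"total loss (dB sum)        : {total_loss_db:.2f} dB")
print(f"total loss (linear product): {-10 * math.log10(total_ratio):.2f} dB")

Both routes give the same 4.20 dB total, but the dB version needs only mental addition.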
 

There are two commonly used methods to describe this phenomenon.

1) dBc. If there is a main signal at 1 mW and a nearby signal at 0.1 mW (like a spurious component), then you could say the nearby signal is -10 dBc, or 10 dB below the carrier.

The benefit of this approach is that if you attenuate the signals, the dBc relationship still applies. If you, for instance, apply a 10 dB pad to the signals, the main signal is now at 0.1 mW and the nearby signal is now at 0.01 mW. The nearby signal is still -10 dBc.

2) Return loss. If you send a 1 mW signal at a target and it reflects 0.1 mW, you have a 10 dB return loss. Once again, the quantity measured is independent of power. If you sent 100 mW to that same target, you would see a 10 mW reflection, which is still a 10 dB return loss (the same numbers are worked out below).
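Both quantities are simple log ratios; a minimal Python sketch using the power levels from the two examples above:

Code:
import math

def dbc(nearby_mw, carrier_mw):
    """Level of a nearby signal relative to the carrier, in dBc."""
    return 10 * math.log10(nearby_mw / carrier_mw)

def return_loss_db(incident_mw, reflected_mw):
    """Return loss: incident power over reflected power, in dB."""
    return 10 * math.log10(incident_mw / reflected_mw)

print(dbc(0.1, 1.0))                 # -10.0 dBc
print(dbc(0.01, 0.1))                # still -10.0 dBc after a 10 dB pad
print(return_loss_db(1.0, 0.1))      # 10.0 dB return loss
print(return_loss_db(100.0, 10.0))   # still 10.0 dB at higher incident power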

Rich
 

Thank you for the constructive answers. They have shaped my understanding of power resolution and the dB scale.
 

I was asked a related question. A time-domain oscilloscope measures voltage, and one parameter in its spec gives a maximum RMS noise of 2 mV.

Is this "maximum RMS noise" considered an equivalent of some "voltage resolution", i.e. the oscilloscope cannot differentiate two signals with amplitudes of 100 mV and 102 mV?

A natural extension to this question is: why is this resolution specified as an absolute value, i.e. 2 mV, while the power resolution of the network analyzer is specified in decibels?

Any comment is welcome. Thx.
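One way to relate the absolute spec to a dB-style spec is to express the noise as a ratio against the signal level in question; a minimal Python sketch using only the numbers from the post above:

Code:
import math

noise_rms_v = 0.002   # 2 mV RMS noise from the scope spec
signal_v = 0.100      # 100 mV signal from the question
other_v = 0.102       # 102 mV signal from the question

# Voltage ratios expressed in dB (20*log10 for voltage quantities).
noise_rel_db = 20 * math.log10(noise_rms_v / signal_v)
delta_rel_db = 20 * math.log10(other_v / signal_v)

print(f"noise floor relative to 100 mV : {noise_rel_db:.1f} dB")   # about -34.0 dB
print(f"102 mV relative to 100 mV      : {delta_rel_db:.2f} dB")   # about 0.17 dB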
 
