mcyberey
Junior Member level 1
I am trying to use NIST StatistiCAL software for a one-tier calibration. It asks for a "switch-term calibration" standard. I have no idea what this is and cannot find information on it in their documentation or in my textbooks.
Does anyone have any information or know what a switch-term measurement is?
Here is what the NIST manual states about it:
The Switch-term calibration standard is not really a calibration standard per se. It contains the
switch-term measurements. These measurements are typically taken in one-tier calibrations
when the Thru calibration standard is measured. MultiCal adds the prefix "G" to the Thru
calibration standard filename when it saves the switch-terms to disk.
In SOLT calibrations the switch terms are often measured indirectly. This results in measurement
imprecision. In these cases, the switch terms used by StatistiCAL should be measured directly,
as opposed to determining them from the 12-term error model. See section 5.5 "Switch-term file
format" for instructions on how to measure the switch terms.
Switch-term files are ignored in two-tier calibrations.
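In case it helps to see what the switch terms actually do numerically: the standard formulation (Marks' switch-term error model, which is what NIST's tools are built on) uses the two reflection coefficients looking back into the non-driven port, one for the forward sweep and one for the reverse sweep, to "unterminate" the raw two-port measurement. Below is a minimal NumPy sketch of that correction, not StatistiCAL's actual code; the function names and test values are my own, and a real calibration would apply this per frequency point before solving the error model.

```python
import numpy as np

def terminate(S, gf, gr):
    """Simulate a raw VNA measurement: embed the forward (gf) and reverse (gr)
    switch terms into the true 2x2 S-matrix S (Marks' switch-term model)."""
    s11, s12, s21, s22 = S[0, 0], S[0, 1], S[1, 0], S[1, 1]
    m = np.empty((2, 2), dtype=complex)
    m[0, 0] = s11 + s12 * s21 * gf / (1 - s22 * gf)  # forward sweep, port 1
    m[1, 0] = s21 / (1 - s22 * gf)                   # forward sweep, port 2
    m[1, 1] = s22 + s12 * s21 * gr / (1 - s11 * gr)  # reverse sweep, port 2
    m[0, 1] = s12 / (1 - s11 * gr)                   # reverse sweep, port 1
    return m

def unterminate(M, gf, gr):
    """Switch-term correction: recover the true S-matrix from the raw
    measurement M given directly measured switch terms gf and gr."""
    m11, m12, m21, m22 = M[0, 0], M[0, 1], M[1, 0], M[1, 1]
    d = 1 - m12 * m21 * gf * gr  # common denominator
    S = np.empty((2, 2), dtype=complex)
    S[0, 0] = (m11 - m12 * m21 * gf) / d
    S[0, 1] = (m12 - m11 * m12 * gr) / d
    S[1, 0] = (m21 - m22 * m21 * gf) / d
    S[1, 1] = (m22 - m12 * m21 * gr) / d
    return S
```

Round-tripping a made-up device through `terminate` and then `unterminate` returns the original S-parameters, which is a quick way to convince yourself the correction is exact when the switch terms are measured directly rather than estimated through the 12-term model.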