low resistance measurement
nzkunal said:
Hi,
I need to design a very accurate ohmmeter for a copper wire tube. At present I put 0.5 A through the wire (26.4-28.4 ohms), and the voltage is then shifted down through a potential divider. The tester seems to drift with time, so I am currently running some tests to monitor the effect of the current on the resistance. The reason I am using the high current is so I can get higher resolution on my ADC; I believe this is more accurate than putting less current through and then amplifying it later in the circuit, where the SNR would be worse.
Does anyone know of any techniques (bridges etc.) to measure low resistance values (around 20-30 ohms) to an accuracy of around +/-0.05 ohms?
Thanks in advance
Let's try to estimate a proper readout for your case, assuming a 4-wire (Kelvin) measurement technique.
1. If the resistance resolution you want is 50 mOhm, and you choose an instrumentation amplifier as a front end with a noise of approximately 20 nV/rtHz above its 1/f knee of, say, 100 Hz, and the measurement bandwidth is 1 Hz, then the rms value of the preamp noise will be 20 nVrms and the peak-to-peak value (taking a crest factor of about 2*pi) will be roughly 125 nV. From here the minimum bias current at unity SNR is 125 nV / 50 mOhm = 2.5 uA; the 400 uA bias used below therefore carries a safety margin of better than 100x.
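A quick numeric check of the estimate above. Taking the 20 nV/rtHz figure and a 2*pi crest factor literally gives a somewhat different minimum current than a first reading might suggest; either way, a 400 uA bias leaves a very large margin over the noise floor:

```python
# Sanity-check the front-end noise arithmetic (values from the post:
# 20 nV/rtHz amplifier noise, 1 Hz bandwidth, 50 mOhm resolution target).
import math

e_n = 20e-9                     # amplifier voltage noise density, V/sqrt(Hz)
bw = 1.0                        # measurement bandwidth, Hz
r_step = 50e-3                  # resistance resolution target, ohm

v_rms = e_n * math.sqrt(bw)     # 20 nV rms in a 1 Hz bandwidth
v_pp = 2 * math.pi * v_rms      # ~2*pi crest-factor estimate of pp noise
i_min = v_pp / r_step           # minimum bias current at unity SNR

print(f"pp noise ~ {v_pp*1e9:.0f} nV, minimum bias ~ {i_min*1e6:.1f} uA")
```

With these numbers the pp noise comes out near 125 nV and the unity-SNR bias near 2.5 uA, so 400 uA is a comfortable choice.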
2. To provide a 400 uA constant current through the 30 Ohm resistance, the easiest way is to use a voltage reference and a large resistance in series. To minimize mains pickup, two such bias resistors should actually be used, one in each input of the instrumentation amplifier, to increase the CMRR. In general the value of these bias resistors should be at least ten times the maximum Rx of 30 Ohm -> Rl = 300 Ohm (2x150 Ohm). The minimum value of the voltage reference would then be 400 uA * 300 Ohm = 120 mV. That is too low a voltage for a standard voltage reference. If you take a 1.2 V bandgap reference instead, the bias resistances can be increased tenfold, which makes the bias current even more "constant" against variations in Rx; we end up with 2x1.5 kOhm bias resistors.
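The point about the current being "constant" against Rx can be checked directly: with 3 kOhm of total bias resistance, sweeping Rx over its full range barely moves the current:

```python
# Check the bias network above: a 1.2 V bandgap reference driving two
# 1.5 kOhm resistors in series with the unknown Rx (sense taps across Rx).
v_ref = 1.2                      # bandgap reference voltage, V
r_bias = 2 * 1500.0              # two 1.5 kOhm resistors, one per lead

for rx in (0.0, 30.0):           # current variation over the full Rx range
    i = v_ref / (r_bias + rx)
    print(f"Rx = {rx:4.1f} ohm -> bias = {i*1e6:.1f} uA")
```

The bias current only changes from 400 uA to about 396 uA (roughly 1 %) over the whole 0-30 Ohm range, and since the readout measures the sense voltage ratiometrically against a known current this residual variation can be corrected in software if needed.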
3. The gain of the readout is defined by your data acquisition. The DAQ resolution should cover the dynamic range of your measurement, which you defined as 30 Ohm / 50 mOhm = 600, so you'll need a 10-bit ADC. If a microcontroller is used for the DAQ (for instance the micropower MSP430F149 is a very good choice) and the reference of the ADC is its 3.3 V supply, the maximum gain of the preamplifier will be 3.3 V / (400 uA * 30 Ohm) = 3.3 V / 12 mV, i.e. about 275.
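The same dynamic-range and gain arithmetic, written out (the 3.3 V supply-as-reference is the assumption from the point above):

```python
# ADC resolution and preamp gain budget from the post's numbers.
import math

dyn_range = 30.0 / 50e-3                 # 600 resolvable steps
bits = math.ceil(math.log2(dyn_range))   # ADC bits needed to cover them
v_full = 3.3                             # ADC reference = 3.3 V supply
v_sig = 400e-6 * 30.0                    # 12 mV max sense voltage at Rx = 30 ohm
gain_max = v_full / v_sig                # gain that just fills the ADC range

print(f"{bits}-bit ADC, max preamp gain ~ {gain_max:.0f}")
```

So a 10-bit converter is sufficient and a gain somewhat below 275 keeps the signal inside the ADC range with headroom.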
4. All the estimates above assume an AC bias current at 100 Hz, obtained by chopping the 1.2 V bandgap reference with an electronic switch controlled by the uC. The lock-in loop is done in software by multiplying the ADC readout by +/-1 synchronously with the bias chopper drive and integrating over 10 bias periods, which gives nulls at multiples of 10 Hz and a relatively flat response at 1 Hz.
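A toy simulation of that software lock-in may make it concrete. The sample rate, noise level, and DC offset below are illustrative assumptions, not values from the post; the point is that the synchronous +/-1 demodulation recovers the 12 mV sense amplitude while rejecting the DC offset:

```python
# Toy software lock-in: 12 mV sense signal chopped at 100 Hz, digitized,
# multiplied by +/-1 in sync with the chopper, averaged over 10 periods.
import random

random.seed(0)
fs = 10_000                  # sample rate, Hz (assumed)
f_chop = 100                 # chopper frequency, Hz
n = fs // 10                 # 100 ms of data = 10 chopper periods
v_sig = 12e-3                # true synchronous amplitude, V
v_offset = 5e-3              # DC offset the lock-in should reject (assumed)

spp = fs // f_chop           # samples per chopper period
acc = 0.0
for k in range(n):
    ref = 1.0 if (k % spp) < spp // 2 else -1.0      # chopper drive +/-1
    sample = ref * v_sig + v_offset + random.gauss(0.0, 1e-4)
    acc += ref * sample      # synchronous demodulation
v_est = acc / n              # integrate (average) over 10 bias periods

print(f"recovered amplitude ~ {v_est*1e3:.3f} mV")
```

The 5 mV offset cancels exactly because each averaging window contains equal numbers of + and - half-periods; uncorrelated noise averages down by roughly sqrt(n).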
5. I'm not sure that applying 0.5 A buys you much, because that current heats the copper (and the contacts), and the resistance of copper has a strong temperature dependence (roughly +0.4 %/K). As small a bias current as possible is always recommended, so as not to garble the measurement with additional effects.
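To see why the 0.5 A bias is suspect, compare the power dissipated in the wire at the two bias levels (27.4 Ohm is just the midpoint of the 26.4-28.4 Ohm range given in the question):

```python
# Self-heating comparison: power dissipated in the copper wire under
# the original 0.5 A bias versus the proposed 400 uA bias.
r = 27.4                             # mid-range wire resistance, ohm
for i in (0.5, 400e-6):
    p = i**2 * r                     # dissipated power, W
    print(f"I = {i*1e3:8.3f} mA -> P = {p:.3g} W")
```

Roughly 7 W at 0.5 A versus a few microwatts at 400 uA; at 0.4 %/K tempco, even a 1 K temperature rise already moves a 27 Ohm wire by about 0.1 Ohm, twice the +/-0.05 Ohm accuracy target, which would explain the drift you are seeing.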