karbiuch
Hi,
I made simply simulation for difference op amp. In simulation I am using resistors: Rf=10k, Rg=1k, so gain is 10. When I change value of one resistor I have error on output. I know, this oamp is very sensitive to resistors tolerance. For example, in ideal world:
R1=1k, R2f=10k, R3=1k, R4f=10k, In+=50mV, In-=0mV, Vout is = 500mV
but for R1=1010 ohm, Vout is = 499,54mV,
Are there any methods/equation to calibrate such amplifier? I know the input voltage and output voltage so Can I determine the actual value of the resistors (gains for In+ and In-) ? Of course I can buy a monolithic amplifier, or precision resistors. However, the difference in resistance will always occur. After all, the gain error (resistors) is always the same, so the fault must be constant.
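A minimal sketch of both halves of the question. It assumes the labeling where R1/R2f form the divider on the non-inverting input and R3/R4f set the feedback gain (this assumption reproduces the 499.54 mV number above). Because the amplifier stays linear even with mismatched resistors, Vout = Ap*Vp + An*Vn, so two calibration measurements with independent (Vp, Vn) pairs are enough to solve for the two actual gains. The function names `diff_amp_vout` and `solve_gains` are made up for illustration:

```python
def diff_amp_vout(vp, vn, r1=1000.0, r2f=10000.0, r3=1000.0, r4f=10000.0):
    """Ideal-op-amp difference amplifier output.

    Assumed topology: R1/R2f divide the non-inverting input Vp,
    R3 (input) and R4f (feedback) set the inverting gain for Vn.
    """
    a_p = (r2f / (r1 + r2f)) * (1.0 + r4f / r3)  # gain seen by Vp
    a_n = -r4f / r3                              # gain seen by Vn
    return a_p * vp + a_n * vn

def solve_gains(m1, m2):
    """Two-point calibration: recover the actual gains Ap, An from
    two measurements m = (vp, vn, vout_measured), via Cramer's rule
    on  Ap*vp + An*vn = vout."""
    (vp1, vn1, vo1), (vp2, vn2, vo2) = m1, m2
    det = vp1 * vn2 - vp2 * vn1   # must be nonzero: use independent pairs
    a_p = (vo1 * vn2 - vo2 * vn1) / det
    a_n = (vp1 * vo2 - vp2 * vo1) / det
    return a_p, a_n

# Ideal resistors: gain is exactly 10, so 50 mV in -> 500 mV out
print(diff_amp_vout(0.050, 0.0))             # 0.5
# R1 off by 1% (1010 ohm): output drops to ~499.55 mV, as in the post
print(diff_amp_vout(0.050, 0.0, r1=1010.0))

# Calibrate the mismatched amp from two known input/output points
m1 = (0.050, 0.0, diff_amp_vout(0.050, 0.0, r1=1010.0))
m2 = (0.0, 0.050, diff_amp_vout(0.0, 0.050, r1=1010.0))
a_p, a_n = solve_gains(m1, m2)
print(a_p, a_n)   # actual gains for In+ and In-
```

Once Ap and An are known, the gain error can be divided out in software; this only corrects the resistor-ratio error, not op-amp offset or drift, which would need a third (offset) term and measurement.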