
[SOLVED] Op-amp offset compensation


mrinalmani

What is the most cost-effective way of compensating the offset voltage of op-amps? (I have a microcontroller on board, and using it is permitted, since it is already there and adds no extra cost.)
 

1) Digital Pot
2) DAC
3) If you've got a lot of pins available on your MCU and don't need great accuracy, you could make your own DAC by connecting the MCU output pins through weighted resistors and summing them at the op-amp. This would only cost a few resistors (see the sketch below).
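
A minimal firmware sketch of driving such a homemade DAC, assuming four pins through binary-weighted resistors; PORT_DAC and the pin assignment are hypothetical placeholders, not a real device register:

Code:
/* Four MCU pins drive binary-weighted resistors (R, 2R, 4R, 8R) summed
   at the op-amp input. PORT_DAC is an assumed GPIO output register. */
#include <stdint.h>

extern volatile uint8_t PORT_DAC;

/* Write a 4-bit code (0..15); the summed voltage is roughly
   Vcc * code / 16 before any attenuating divider. */
static void trim_write(uint8_t code)
{
    PORT_DAC = (uint8_t)((PORT_DAC & 0xF0u) | (code & 0x0Fu));
}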
 

How will you determine the offset?
Can you just compensate for it by adding an offset correction factor to the analog signal in the µC?
Could you use a low offset op amp?
 

1. Yes, you got it absolutely right, I don't mean to compensate it in hardware. I just need to determine it so that the MCU 'knows' it.
2. There are around 8 op-amps on board, and using lower-offset amplifiers roughly doubles the cost. Think of mass production.
3. Half a dollar or so is acceptable (for the extra calibration circuitry).
4. I did not understand the concept of making a DAC. Why would I want to do that?
5. I am sensing current, with a full-range output of 50 mV. Even a 5 mV offset is unacceptable, and 2 mV amplifiers, though still not readily acceptable, cost almost double!
6. The MCU's ADC has 10-bit resolution with a full range of about 2 V. A direct measurement of the offset through the MCU thus seems no good, as the ADC resolution itself is about 2 mV, and of course we all know how reliable the LSB of an MCU-based ADC is!

7. Is manually measuring the offset and feeding it into the MCU the only way?
8. Since I do not have the luxury of a negative signal voltage... measuring negative offsets can be even more painful.
Any suggestions? Please help.
 

You didn't state the exact application problem. If it's about offset compensation for a digital measurement, you usually calibrate offset and gain and store the calibration parameters in an EEPROM.
 
What reads the normal current-measurement output from the op-amps? If it's the MCU, then the offset is multiplied by the gain of the op-amp and can be readily measured by the MCU. (I assume the op-amp has a gain of 40 or so to increase the 50 mV signal to 2 V.)
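For example, a 2 mV input offset times a gain of 40 appears as 80 mV at the output, which is about 40 LSBs of a 10-bit ADC spanning 2 V, comfortably above the LSB uncertainty.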

Negative offset is indeed a problem. The only solution I can think of is to apply a small positive voltage to the op amp positive input to give a known output voltage. Any deviation from that is the offset.

Post your schematic.
 
In your first post you state that you want to COMPENSATE the offset. Then, in post #4 you state that you DON'T need to compensate it; you just need to "know" it, whatever that means.

Do you mean to apply a known voltage to the input and measure the output? (That's one way to do it.) Can you "compensate" for the offset in software?

As far as using a DAC, you said you need to compensate the offset voltage; that means apply a voltage opposite to the offset voltage. The DAC would generate that voltage under program control.
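
If you go the DAC route, a minimal nulling loop might look like this; dac_write and adc_read are assumed helper functions, and the sketch assumes the amplifier output rises monotonically with the DAC code:

Code:
/* Hypothetical sketch: step the compensation DAC (successive
   approximation) until the amplifier output reaches the level expected
   for zero input, then save the code for reuse. */
#include <stdint.h>

extern void     dac_write(uint16_t code);  /* assumed compensation DAC */
extern uint16_t adc_read(void);            /* assumed 10-bit ADC read  */

uint16_t null_offset(uint16_t target_counts)
{
    uint16_t code = 0;

    for (uint16_t bit = 1u << 9; bit != 0; bit >>= 1) {
        dac_write(code | bit);             /* try this bit */
        /* allow settling before reading in a real system */
        if (adc_read() < target_counts)
            code |= bit;                   /* output still low: keep bit */
    }
    dac_write(code);
    return code;                           /* store in EEPROM for reuse */
}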
 

I need to compensate the offset by feeding the offset value into the EEPROM of the MCU. But for this I need to know the offset. And yes, perhaps I'll have to apply some input voltage until it balances. I got your idea of using a DAC to detect negative offset. But a DAC will produce a voltage in the range of volts, while the offset is in the range of millivolts, so do you think an op-amp stage with a "reduction" factor of about 1/100 would be a good idea?
Positive offset can simply be detected by measuring it at the ADC. And as crutschow said, since the offset is multiplied by the gain of the amplifier, it will not be difficult to measure.

- - - Updated - - -

@diarmuid
what do you mean by chopping to a higher frequency and filtering? Please explain.

- - - Updated - - -

@FVM
Of course we can feed the offset to the EEPROM, but for this we need to know the offset. The question is about a cost-effective way of detecting the offset value without removing the chip from the board (perhaps by adding some extra calibration circuitry and using the MCU).

- - - Updated - - -

The proposed schematic for calibration is attached below. But this creates an immediate problem: once the calibration is over, the DAC is still connected to the input, which will interfere with the input signal. Is there any way of "floating" the DAC output to high impedance? Even if it can be floated, it appears that the DAC output will be subjected to non-zero voltages during normal operation of the amplifier. Is an applied voltage acceptable on the output pin of the DAC?

- - - Updated - - -

There's a slight problem... the circuit is intended to be a NON-INVERTING amplifier; however, an inverting configuration is mistakenly shown in the attachment.
 

Attachments

  • Capture5.jpg

But a DAC will produce a voltage in the range of volts
Really? A 12-bit DAC with a 1V reference is able to generate voltages with a resolution of 244 microvolts.

But forget that. With all due respect, you are all over the place.

Do you realize that the offset voltage will change over time and temperature? How will you handle that with only a single value stored in memory? Will you periodically recalibrate? You haven't told us your requirements: resolution, accuracy. Is there any reason you can't compensate in software?
 

The way I've done it is to write a calibration routine which disconnects the input signal and connects the input to a known, stable reference voltage with an analog switch like a 74HC4053.
Then measure the reference voltage and store the correction factor in memory. Needless to say, one must take many samples to ensure that the correction factor is indeed an average offset value and not instantaneous noise. A sketch of the idea follows.
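
A minimal sketch of that routine, assuming hypothetical helpers for the 74HC4053 and the ADC (none of these names are a real API):

Code:
/* Switch the input to a known reference, average many ADC samples to
   reject noise, and return the signed offset in ADC counts. */
#include <stdint.h>

#define CAL_SAMPLES 64u

extern void     mux_select_reference(void); /* assumed: drives 74HC4053 */
extern void     mux_select_signal(void);
extern uint16_t adc_read(void);             /* assumed: 10-bit result   */

int16_t calibrate_offset(uint16_t expected_counts)
{
    uint32_t acc = 0;

    mux_select_reference();
    for (uint16_t i = 0; i < CAL_SAMPLES; i++)
        acc += adc_read();                  /* average out noise */
    mux_select_signal();

    /* Signed correction in counts; subtract from later readings. */
    return (int16_t)(acc / CAL_SAMPLES) - (int16_t)expected_counts;
}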

If the operating temperature does not change too much, you can get away with performing the calibration routine only during power-up. Otherwise you would have to add a temperature sensor and re-run the calibration whenever the temperature deviates more than, say, 10 °C from the initial calibration temperature.
 
Really?
...With all due respect, you are all over the place.
1. Thanks, I'll take that as a compliment!
2. I assume that by "compensating in software" you mean calibrating the offset in the MCU program. Yes, I want to do exactly that; maybe I couldn't convey it properly.
3. MCUs with 12-bit ADCs are not cheap; a 10-bit one is being used. And even if they were, we don't typically want to use only a few LSBs of the ADC to measure a signal; it would be nice if the entire range could be covered. Even if the ADC resolution were 0.2 mV, it doesn't sound convincing to measure offsets as low as 2 mV, or perhaps even 1 mV, with 0.2 mV resolution. And on top of that, MCU-based ADCs generally do not guarantee the correctness of the LSB!
4. Once a calibration circuit is on board, why does its use have to be limited to one time only? Of course re-calibration is possible.
5. OK, forgetting the op-amp, MCU, etc. for a moment: my application is to measure current from a 12 V battery, which can be as high as 100 A.
6. As for accuracy: the measured current will be used in a closed-loop system driving an HF transformer connected to the battery through an H-bridge. So I think an accuracy of at least 5% should be maintained, don't you agree? (A quick error-budget check is below.)
7. Any suggestions about other cost-effective ways of going about current measurement?
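
To put numbers on this, using the figures above: 5% of the 50 mV full-scale shunt signal is only 2.5 mV referred to the input, so an untrimmed 5 mV offset alone is a 10% error, and even a 2 mV amplifier eats most of the error budget. That is why I'm after calibration rather than better op-amps.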

- - - Updated - - -

Thanks, Schmitt Trigger! I never thought in this direction. And since an accuracy of only around 4% to 5% is required, I don't think temperature should play a significant role.
Are you able to measure offsets in the range of a few millivolts with your calibration routine?
 

As long as you have a known and stable voltage reference, divided down with stable precision resistors, you can easily resolve a few millivolts.
This assumes that the op-amp's feedback network also uses precision resistors, so that the gain is a known quantity.
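
As a worked example (numbers assumed for illustration): if the reference is divided down to 10 mV and the amplifier gain is 40, the ideal output is 400 mV. If you measure 480 mV instead, the input-referred offset is (480 mV - 400 mV) / 40 = 2 mV.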
 

If the current control is purely analog and doesn't use a digital setpoint from the µP, you should correct the offset voltage by analog means, e.g. a potentiometer.

If it has a digital setpoint, digital adjustment is appropriate.

Regarding accuracy, if the quantity of interest is the output current, it should be measured during calibration with an external instrument.
 

Here's my take: Use a 10-bit DAC to generate a small voltage at the input of the amp (through a voltage divider) to give about a 2V full scale (all 10 bits high) output at the op amp. Use a couple of CMOS analog switches to switch the DAC and sensor signal off and on as desired. Calculate the op amp offset from that and use that to correct the subsequent signal readings. Perform this offset correction as often as feasible, depending upon the signal update rate. That will minimize the effect of any drift due to temperature, etc.

You can also measure a low and high voltage level to calibrate the gain as well as the offset of the op amp circuit, if desired. That way the accuracy of your circuit measurement is determined mainly by the accuracy of the DAC and the voltage divider.
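
A short sketch of the two-point idea, assuming floating-point math and hypothetical names:

Code:
/* Apply a low and a high DAC level through the divider, read the op-amp
   output with the ADC, then solve out = gain*in + offset for both
   unknowns. corrected_input() then recovers the true input. */
#include <stdint.h>

typedef struct {
    float gain;    /* output counts per input count */
    float offset;  /* output counts at zero input   */
} cal_t;

cal_t two_point_cal(float in_lo, float out_lo, float in_hi, float out_hi)
{
    cal_t c;
    c.gain   = (out_hi - out_lo) / (in_hi - in_lo);
    c.offset = out_lo - c.gain * in_lo;
    return c;
}

float corrected_input(const cal_t *c, float reading)
{
    return (reading - c->offset) / c->gain;  /* undo offset, then gain */
}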
 

Thanks for the suggestions, crutschow. I think I'll go for this. Voltage division through resistors had slipped my mind completely. Thanks again.
 

... please see section 3 of the attached paper. It takes you through all the principles of chopping.
 

Attachments

  • enz_procieee_1996.pdf
Thanks, diarmuid! Excellent paper. I think it's a must-read for everyone working with amplifiers.
 
