If you are talking about differential input for an ADC, there is a matching tolerance between devices in a differential bipolar input stage. Typical state of the art for bipolar matching is 1 mV dc; CMOS runs about 5 mV dc.
There are various methods to calibrate out offset.
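As a hedged sketch of one common software method (function names and the 12-count offset are illustrative, not from any particular part): short the ADC input, average several readings to estimate the offset code, then subtract that stored value from later samples.

```python
def estimate_offset(read_adc, n=256):
    """Average n readings taken with the input shorted to estimate the offset code."""
    return sum(read_adc() for _ in range(n)) / n

def corrected(read_adc, offset_code):
    """Subtract the stored calibration offset from a raw sample."""
    return read_adc() - offset_code

# Fake ADC that always reads 12 counts of offset, for demonstration:
fake_adc = lambda: 12
off = estimate_offset(fake_adc)
print(corrected(fake_adc, off))  # prints 0.0 once the offset is calibrated out
```

Averaging many samples also suppresses noise in the offset estimate; hardware alternatives (auto-zero amplifiers, chopper stabilization) do the equivalent in the analog domain.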
Offset means to shift the dc level of the signal.
In your case, offsetting a bipolar signal means shifting the dc level (zero-crossing reference) of the true bipolar signal.
The dc level is the zero-crossing reference of a bipolar signal, which swings above it for '1' and below it for '0' (hence bipolar).
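To make the dc-shift idea concrete, here is a minimal sketch (illustrative voltages, not from your part's datasheet) of offsetting a +/-5 V bipolar signal into a 0 to 10 V unipolar range by adding a 5 V dc level:

```python
V_OFFSET = 5.0  # dc level added to re-center the bipolar signal (assumed mid-scale)

def level_shift(v_bipolar):
    """Shift a +/-5 V bipolar signal into the 0 to 10 V unipolar range."""
    return v_bipolar + V_OFFSET

print(level_shift(-5.0))  # 0.0  : old negative full scale -> new zero
print(level_shift(0.0))   # 5.0  : old zero crossing -> new mid-scale
print(level_shift(5.0))   # 10.0 : old positive full scale -> new full scale
```

In hardware this is typically done with a summing amplifier or a bias on the ADC driver; the arithmetic is the same.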
Some ADCs have unipolar (e.g. 0 to +10 V) and bipolar (e.g. -10 V to +10 V)
input modes. Of course those ranges are not the norm these days, but
the spec may refer to the offset error for the "bipolar input mode".
You should read more about the test conditions that the data sheet specifies.
Because the configuration differs between unipolar and bipolar
mode, you may see key specs (offset error, gain error, linearity
errors) duplicated per mode.
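Because the coding usually differs per mode as well, a conversion routine typically has to branch on the mode. A minimal sketch for a hypothetical 12-bit ADC, assuming straight binary in unipolar mode and offset binary in bipolar mode (check your datasheet; some parts use two's complement instead):

```python
N_BITS = 12
FS_UNI = 10.0  # assumed unipolar range: 0 to +10 V
FS_BIP = 10.0  # assumed bipolar range: -10 V to +10 V

def code_to_volts(code, bipolar):
    """Convert a raw ADC code to volts for the selected input mode."""
    n_codes = 1 << N_BITS  # 4096 codes for 12 bits
    if bipolar:
        # Offset binary: code 0 -> -FS, mid-code -> 0 V (zero crossing)
        return (code - n_codes // 2) * (2 * FS_BIP / n_codes)
    # Straight binary: code 0 -> 0 V
    return code * (FS_UNI / n_codes)

print(code_to_volts(2048, bipolar=True))   # 0.0  V (mid-code = zero crossing)
print(code_to_volts(0, bipolar=True))      # -10.0 V
print(code_to_volts(2048, bipolar=False))  # 5.0  V
```

Note the LSB size doubles in bipolar mode for the same code count, which is one reason offset and gain errors are specified separately per mode.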