seyyah
ADC conversion problem
I have a delta-sigma ADC (CS5532). I'm trying to convert small signals, i.e., millivolt-level signals, into digital. It has a 5 V supply and a 2.5 V reference voltage, which sits at 2.490 to 2.495 V and is maintained by a low-ppm, low-noise voltage reference IC. The ADC is in unipolar mode, so negative values should be measured as zero, and that is mostly the case. However, some of my measurements are not consistent. For example:
@ gain 1:
If the input voltage is between 0 and 5.5 mV, the output is the same as the input.
If the input is between 5.5 mV and ~10 mV, the output is 10 mV below the input, i.e., it goes from -4.5 mV to 0.
If the input is above ~10 mV, the output again matches the input.
I could not understand or solve this. I have redesigned the circuit from scratch and tried different CS5532s, but nothing changed. Does this make sense to you? Please help me if you have any idea. Thanks.
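To show the scale of the jump, here is a minimal C sketch of the ideal code-to-voltage conversion I expect in unipolar mode. The 2^24 full-scale span of VREF/gain is my reading of the datasheet, and code_to_mv is just an illustrative helper, not code from my actual firmware:

#include <stdio.h>
#include <stdint.h>

/* Illustrative helper (my naming): convert a raw 24-bit unipolar
 * code to millivolts, assuming full scale = VREF / gain. This is
 * the ideal transfer function I expect, for reference only. */
static double code_to_mv(uint32_t code, double vref_mv, double gain)
{
    return ((double)code / 16777216.0) * (vref_mv / gain); /* 2^24 counts */
}

int main(void)
{
    /* With VREF = 2490 mV and gain = 1, one LSB is about 0.148 uV,
     * so a 10 mV step in the reported result corresponds to a jump
     * of roughly 67,000 codes -- far too large to be noise. */
    printf("5.5 mV input -> code %.0f\n", 5.5 / 2490.0 * 16777216.0);
    printf("10 mV input  -> code %.0f\n", 10.0 / 2490.0 * 16777216.0);
    printf("code 37059   -> %.3f mV\n", code_to_mv(37059, 2490.0, 1.0));
    return 0;
}

So the 10 mV discontinuity I see between 5.5 mV and ~10 mV is tens of thousands of codes wide, which is why I suspect something systematic rather than noise.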