tomk
I have a circuit that consists of an analog sensor, some signal conditioning, and an ADC (ADuCM360), all powered from the same supply. The ADC supply can be anywhere between 1.8 and 3.6 V. From an ADC noise-performance standpoint, I'm trying to decide whether there's any advantage to running the supply at 3.6 V (the top of the ADC's range) rather than a more standard 3.3 V. I understand that the resolution (volts per LSB) will be different, but what about noise? How should I approach this problem?
For the sake of discussion, assume the other noise sources (voltage regulator, analog sensor, conditioning electronics) scale with the supply voltage.
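To make the trade-off concrete, here's a minimal back-of-envelope sketch in Python. The numbers in it are assumptions for illustration, not ADuCM360 datasheet values: the full-scale range is taken equal to the supply, and external noise is modeled as a made-up fixed fraction (NOISE_FRACTION) of Vdd, per the scaling assumption above.

```python
# Back-of-envelope comparison of resolution (volts per code) at two
# candidate supply voltages. Assumptions, not datasheet facts: the ADC
# full-scale range equals the supply, and (per the problem statement)
# external noise sources scale linearly with the supply.

N_BITS = 24            # ADuCM360 sigma-delta ADC resolution
NOISE_FRACTION = 1e-5  # hypothetical external noise as a fraction of Vdd

for vdd in (3.3, 3.6):
    lsb_v = vdd / 2**N_BITS        # volts per code
    noise_v = NOISE_FRACTION * vdd # external noise, scaling with supply
    noise_codes = noise_v / lsb_v  # the same noise expressed in LSBs
    print(f"Vdd = {vdd:.1f} V: LSB = {lsb_v*1e9:.1f} nV, "
          f"noise = {noise_codes:.0f} codes")

# If every noise source scales with Vdd, the noise expressed in codes is
# identical at both supplies, so a higher supply would only help against
# noise that does NOT scale with it (e.g. the ADC's own fixed noise floor).
```

If that reasoning is right, the comparison really comes down to which noise terms are fixed in volts versus proportional to the supply, but I'd like a sanity check on the approach.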
Thanks for your help.