
ADC settings


Mtech1

Hello,

I'm reading the general description of the ADC features in a typical microcontroller, and I have some questions. I'm trying to understand what configuration settings I need to set in a program to read a 0-12V DC voltage with a 10-bit ADC.

Specifically: reference voltage, sampling time and trigger mode. I'm confused between using a 5V or 12V DC reference voltage: 5V because the microcontroller operates at 5V, but 12V because it's the maximum voltage I want to read. Additionally, I'm not sure about the sampling time. There are two options, manual or automatic, and I want to set the automatic conversion mode.
 

Trying to answer a probably too general question: generally, no microcontroller can process a 12 V analog input directly. It needs a voltage divider to match the ADC input range. Likewise, the 12 V source voltage doesn't determine the reference voltage.
 

Hi,

Every ADC is different.
Thus please give a link to the datasheet.

Klaus
 

Hi,

This still is not a datasheet...
You've chosen not to provide one ... so for every specific item I need to refer you to the according datasheet...

******
Voltage considerations
* there is the "absolute maximum rating" for (pin) input voltage. It usually depends on the applied supply voltage VCC.
Often it is like: -0.5V ... VCC + 0.5V. --> read datasheet. If your input voltage exceeds these limits, it may immediately and permanently kill the ADC/microcontroller. Mind: this is also true when the ADC/microcontroller is not powered. Then VCC is effectively 0V ... and thus the pin voltage may not exceed +/-0.5V or so.

* there is the decodable input voltage range. It usually depends on the "ADC Reference Voltage" VRef. Often this is 0V ... VRef. --> read datasheet. It also depends on ADC settings.
Some ADCs can work with bipolar input voltages, some can work in differential mode, some ADCs have internal amplifiers (PGAs). -> Read datasheet

* VRef. You may have various options for which VRef to choose. Often: internal (bandgap) VRef, external VRef, or VCC used as VRef. Read datasheet. Each determines the decodable voltage range. VRef needs to be accurate, stable and clean (low noise). Every error on VRef will be reflected 1:1 in the decoded digital value. Mind: VCC is a supply voltage; it usually is not accurate, is noisy, and drifts with time, temperature and load current. If you need accurate, reliable ADC results ... it means you cannot use VCC as the reference. Also, a VRef often needs external capacitors --> read datasheet.

* voltage resolution:
An ADC decodes the analog input voltage into integer steps. Usually 2^n steps, where n is the ADC resolution in bits. A 10 bit ADC will give 2^10 = 1024 steps. Often one can additionally choose to reduce resolution to gain conversion speed. Read datasheet.
If you use a 3.00V VRef on a 10 bit ADC, then the voltage resolution often (not always, see decodable input voltage range) is 3.00V / 1024 = 2.93 mV. With the digital output range of 0...1023 (= 1024 steps) it usually represents 0V to 2.997V. The full-scale reading usually is one LSB below VRef.
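A minimal sketch of this count-to-voltage arithmetic (assuming the 3.00V / 10 bit example; the macro names and the adc_to_mv() function are illustrative, not from any vendor library):

Code:
#include <stdint.h>

#define VREF_MV   3000UL   /* 3.00 V reference, in millivolts */
#define ADC_STEPS 1024UL   /* 2^10 steps for a 10 bit ADC */

/* Convert a raw 10 bit ADC reading (0..1023) to millivolts.
   Full scale (1023) maps to one LSB below VRef, as noted above. */
static uint32_t adc_to_mv(uint16_t raw)
{
    return ((uint32_t)raw * VREF_MV) / ADC_STEPS;
}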

* voltage errors, problems:
Every ADC introduces errors. Offset error means that "virtually" a small voltage is added to the input voltage. The error may be positive as well as negative. Read datasheet. On a system without a negative supply voltage the minimum input voltage is 0V ... so if one considers an offset error of, let's say, +10mV ... you may never get an ADC output of 0..3 (3V VRef, 10 bits).
You cannot expect the software line " IF (ADC_value < 2) ... " to work reliably.
The same applies to gain errors.
Because of this (and additionally the rail limitations of any opamps in the signal path) you need to add some headroom to your desired range.
For the above setup you are never able to distinguish between 2.997V, 3.00V, ... 3.1V ... it will always show 1023 max.
Thus if you really want to work with 12V signals you should choose a decodable input voltage range up to 12.5V or so.
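As a hedged sketch of how such offset and gain errors are often corrected in software (a generic two-point calibration; the struct and field names are assumptions for illustration, and the actual error figures come from your datasheet or your own reference measurements, not from this sketch):

Code:
#include <stdint.h>

/* Two-point calibration data, measured once with known inputs */
typedef struct {
    int32_t offset_counts; /* ADC reading at a known 0V input */
    int32_t gain_num;      /* gain correction: ideal span ... */
    int32_t gain_den;      /* ... divided by measured span    */
} adc_cal_t;

/* Remove the offset first, then scale out the gain error */
static int32_t adc_corrected(int32_t raw, const adc_cal_t *cal)
{
    return ((raw - cal->offset_counts) * cal->gain_num) / cal->gain_den;
}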

* external voltage dividers:
Are used to extend the input voltage range. Often two simple resistors connected as a voltage divider.
So if your input voltage is 12V, then you need a 4:1 divider to get the signal down to 3.0V. (Again, mind the headroom.)
Resistor values need to be chosen to meet the ADC's input specifications. Often a maximum "source" impedance is specified, let's say 10 kOhms. Read datasheet.
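For illustration only (the resistor values are an assumption, not from this thread): a 30 kOhm / 10 kOhm divider gives the 4:1 ratio mentioned above, and its source impedance (30k || 10k = 7.5 kOhm) stays below the example 10 kOhm limit. The reading is then scaled back to the 12V domain in software:

Code:
#include <stdint.h>

#define VREF_MV   3000UL  /* 3.00 V reference */
#define ADC_STEPS 1024UL  /* 10 bit ADC */
#define DIV_RATIO 4UL     /* (R1 + R2) / R2 with R1 = 30k, R2 = 10k */

/* Scale a reading taken behind the 4:1 divider back to input millivolts */
static uint32_t adc_to_input_mv(uint16_t raw)
{
    uint32_t pin_mv = ((uint32_t)raw * VREF_MV) / ADC_STEPS;
    return pin_mv * DIV_RATIO;  /* 0..1023 maps to roughly 0..12 V */
}

Per the headroom note above, a slightly larger ratio (5:1, say) would let you decode somewhat beyond 12V.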

**********
Now one could write the same lengthy text for ADC timing, sampling frequencies, Nyquist limits, anti-aliasing filters ... and so on.

But all this has already been discussed in numerous documents, tutorials, even videos.
Read through a couple of them ... then come back with detailed questions.

Klaus
 
