
How can I code the 12-bit output of an ADC?


DuftDerBlumen

Hi,

I am using an ADC (AD7266) to convert an input voltage into a 12-bit value.

I would like to ask whether there is a standard method for the coding of these bits. For example, when I apply an input voltage of 1 V, I get varying values such as 11010011110, 11010101000 or 11010011001. As I am not familiar with ADCs, I would also like to ask whether this instability is normal and whether there are ways to reduce it.

Thank you very much in advance for your help.
Best regards,
Chadha

The output coding depends on the input selection: differential inputs are output in twos complement, and single-ended inputs in straight binary.
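
For illustration, here is a minimal C sketch of how those two codings could be converted back to a voltage. It assumes a 2.5 V reference (the AD7266's internal reference; adjust for your design) and a differential LSB of VREF/2048, so check the datasheet for your exact configuration:

```c
#include <stdint.h>

#define VREF 2.5f  /* assumed reference voltage; adjust for your design */

/* Single-ended input: straight binary, codes 0..4095 map to 0 V..VREF */
float adc_single_to_volts(uint16_t code)
{
    return (float)(code & 0x0FFFu) * VREF / 4096.0f;
}

/* Differential input: twos complement, codes -2048..2047 */
float adc_diff_to_volts(uint16_t code)
{
    uint16_t c = code & 0x0FFFu;
    /* sign-extend bit 11 portably: codes >= 0x800 are negative */
    int16_t signed_code = (c & 0x0800u) ? (int16_t)(c - 4096) : (int16_t)c;
    return (float)signed_code * VREF / 2048.0f;
}
```

As a quick sanity check, a single-ended code of 2048 (0x800) would correspond to 2048 x 2.5 / 4096 = 1.25 V.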

That sounds like noise on your input signal. Even if you simply supply the ADC from a potentiometer, it could have noise on it. Also, some ADCs need a low impedance source; a capacitor from the input to ground may help. Read the recommendations in the micro/ADC datasheet.

Keith
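
If the noise cannot be removed entirely at the source, averaging several conversions in software is a common complementary fix. Here is a minimal sketch, assuming a hypothetical adc_read_raw() that returns one raw 12-bit conversion:

```c
#include <stdint.h>

/* Hypothetical driver call: returns one raw 12-bit conversion from the ADC */
extern uint16_t adc_read_raw(void);

/* Average n conversions to smooth random noise (SNR improves roughly with sqrt(n)) */
uint16_t adc_read_averaged(unsigned n)
{
    uint32_t sum = 0;

    if (n == 0)
        return 0;
    for (unsigned i = 0; i < n; i++)
        sum += adc_read_raw() & 0x0FFFu;

    return (uint16_t)(sum / n);
}
```

Averaging 16 or 64 samples is a typical starting point; it trades conversion rate for a more stable reading.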

Hi... After seeing the replies to your post, I am confused about whether you are asking how to get the digital code from your ADC or how to process the 12-bit output from the ADC in software.
