
Low SNR in ADC at high input frequencies


Chinmaye

Dear All,
In an ADC, the SNR is seen to degrade as the input frequency approaches fs/2, where fs is the sampling frequency. Why is that?
 

Hi,

The input signal amplitude might be attenuated by an input filtering network in front of the ADC, leading to an SNR reduction. What does your setup look like? Is this theoretical or practical?
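For illustration, here is a minimal sketch of how a hypothetical first-order RC anti-alias filter ahead of the ADC would attenuate a tone near fs/2; the sampling rate and corner frequency below are assumed example values, not from your setup:

```python
import numpy as np

# Minimal sketch, assuming a first-order RC anti-alias filter in front of
# the ADC. fs and fc are made-up example values.
fs = 100e6   # sampling frequency (assumed)
fc = 40e6    # -3 dB corner of the assumed RC filter

def rc_gain_db(fin, fc):
    """Magnitude response of a first-order low-pass at frequency fin, in dB."""
    return -10 * np.log10(1 + (fin / fc) ** 2)

for fin in (1e6, 10e6, 30e6, 49e6):   # test tones up to just below fs/2
    att_db = rc_gain_db(fin, fc)
    # If the ADC noise floor is unchanged, the measured SNR drops by |att_db| dB.
    print(f"fin = {fin/1e6:5.1f} MHz: tone attenuated {att_db:6.2f} dB "
          f"-> SNR lower by {-att_db:.2f} dB")
```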

It might also be caused by the sampling itself. Imagine the (sinusoidal) input waveform is never or barely sampled at its maximum, leading to a reduced signal amplitude in the digital domain. Here the question is how the SNR is determined (FFT?) and how long the data set is.
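As a sketch of how such an FFT-based SNR measurement could look (record length, sampling rate and noise level are assumed, and coherent sampling is used so the tone falls into a single bin):

```python
import numpy as np

# Minimal sketch of an FFT-based SNR measurement on a captured sine wave.
# All values (fs, record length, noise level) are assumptions for illustration.
fs  = 100e6                 # sampling rate
N   = 4096                  # record length (FFT size)
cyc = 2011                  # odd integer number of cycles -> coherent sampling
fin = cyc * fs / N          # test tone close to fs/2
n   = np.arange(N)

# Stand-in for real captured data: ideal sine plus a small white-noise floor.
x = np.sin(2 * np.pi * fin / fs * n) + 1e-4 * np.random.randn(N)

X = np.abs(np.fft.rfft(x)) ** 2      # one-sided power spectrum
p_sig   = X[cyc]                     # coherent sampling: tone lands in one bin
p_noise = X.sum() - p_sig - X[0]     # everything else except DC

print("measured SNR = %.1f dB" % (10 * np.log10(p_sig / p_noise)))
```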

BR
 

Classical flash ADCs usually have a constant noise density up to fs/2 (and even above); the acquisition bandwidth is >> fs.
 

I would also consider the noise bandwidth to be constant. Hence my thoughts about a decreased signal amplitude.
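If the noise power really stays constant, any drop in the digitized tone amplitude shows up one-to-one in the SNR. A quick numeric example (the 30 % amplitude loss is just an assumed figure):

```python
import numpy as np

# With constant noise power, the SNR change equals the signal amplitude
# change in dB. The 30 % amplitude loss below is an arbitrary example.
a_ref  = 1.0    # relative tone amplitude at low input frequency
a_high = 0.7    # relative tone amplitude observed near fs/2 (assumed)
print(f"SNR change: {20 * np.log10(a_high / a_ref):.1f} dB")   # about -3.1 dB
```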
 
