Low SNR in ADC at high input frequencies

Status
Not open for further replies.

Chinmaye

Dear All,
In an ADC, the SNR is seen to degrade as the input frequency approaches fs/2, where fs is the sampling frequency. Why?
 

Hi,

the input signal amplitude might be attenuated by an input filtering network in front of the ADC, thus leading to an SNR reduction. What does your setup look like? Is this theoretical or practical?
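To illustrate the filtering point: if a simple first-order RC low-pass sits in front of the ADC and its corner is not far above fs/2, the signal is attenuated near fs/2 while the ADC noise floor stays put, and the SNR drops by the same number of dB. A minimal sketch (the corner frequency and sampling rate here are made-up example values, not from your setup):

```python
import math

def snr_loss_db(fin, fc):
    """Amplitude loss (dB) of a first-order RC low-pass at input frequency fin.

    If only the signal is attenuated while the ADC noise floor stays
    constant, the SNR drops by the same amount.
    """
    gain = 1.0 / math.sqrt(1.0 + (fin / fc) ** 2)
    return 20.0 * math.log10(gain)

# hypothetical example: fs = 1 MHz, filter corner at 300 kHz
print(snr_loss_db(10e3, 300e3))    # far below the corner: negligible loss
print(snr_loss_db(490e3, 300e3))   # near fs/2: several dB of loss
```

So even a gentle anti-alias filter can explain a measured SNR roll-off toward fs/2.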

Might also be caused by the sampling itself. Imagine the (sinusoidal) input waveform is never/barely sampled at its maximum, leading to a reduced signal amplitude in the digital domain. Here the question is how the SNR is determined (FFT?) and how large/long the data set is.
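For reference, this is roughly how an FFT-based SNR estimate works. A minimal sketch with an ideal N-bit quantizer and coherent sampling (all parameters are assumptions for illustration, not your ADC); note that for an ideal quantizer the result stays near 6.02·N + 1.76 dB at both low and high input frequencies, so a real roll-off must come from elsewhere:

```python
import numpy as np

def fft_snr(fs, fin, n_bits=10, n_samples=4096):
    """Estimate the SNR of an ideal n_bits quantizer from an FFT.

    Coherent sampling: fin is snapped to an odd integer number of FFT
    bins, so no window is needed and there is no spectral leakage.
    """
    k = round(fin / fs * n_samples) | 1      # odd bin -> coherent sampling
    t = np.arange(n_samples) / fs
    x = np.sin(2 * np.pi * (k * fs / n_samples) * t)
    levels = 2 ** (n_bits - 1) - 1
    q = np.round(x * levels) / levels        # ideal quantizer
    spec = np.abs(np.fft.rfft(q)) ** 2       # power spectrum
    signal = spec[k]
    noise = spec[1:].sum() - signal          # everything but DC and the carrier
    return 10 * np.log10(signal / noise)

print(fft_snr(fs=1e6, fin=10e3))    # low input frequency
print(fft_snr(fs=1e6, fin=490e3))   # input close to fs/2
```

If your measured SNR drops near fs/2 but this idealized model does not, the cause is in the analog front end (attenuation, distortion) rather than in the quantization itself.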

BR
 

Classical flash ADCs usually have a constant noise density up to fs/2 (and even above); the acquisition bandwidth is >> fs.
 

I would also consider the noise bandwidth as constant. Hence my thoughts about a decreased signal amplitude.
 
