specifies the frequency at which a full-scale input of an ADC leads to a reconstructed output 3 dB below its low-frequency value. This definition differs from the one used for amplifiers, which usually assumes a small-signal input.
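This full-power bandwidth measurement can be sketched numerically. The model below is an assumption for illustration: the ADC's analog front end is treated as a first-order RC low-pass with a hypothetical cutoff `f_cutoff`, followed by an ideal quantizer; the reconstructed amplitude is taken from the RMS of the quantized samples.

```python
import numpy as np

def adc_tone_amplitude_db(f_in, fs, f_cutoff, n_bits=10, n_samples=4096):
    """Amplitude (dB) of a quantized full-scale sine after a first-order
    low-pass front end (hypothetical model of the ADC's analog input)."""
    # Snap to a coherent frequency so an integer number of cycles fits.
    k = max(1, round(f_in / fs * n_samples))
    f = k * fs / n_samples
    t = np.arange(n_samples) / fs
    gain = 1.0 / np.sqrt(1.0 + (f / f_cutoff) ** 2)  # RC attenuation
    x = gain * np.sin(2 * np.pi * f * t)
    lsb = 2.0 / 2 ** n_bits
    q = np.round(x / lsb) * lsb           # ideal uniform quantizer
    amp = np.sqrt(2) * np.std(q)          # RMS -> peak amplitude
    return 20 * np.log10(amp)

# Sweeping f_in and finding where this curve crosses -3 dB relative to
# its low-frequency value locates the full-power bandwidth; with the RC
# model above that happens, as expected, at f_in ≈ f_cutoff.
low = adc_tone_amplitude_db(1e3, fs=1e6, f_cutoff=1e5)
hi = adc_tone_amplitude_db(1e5, fs=1e6, f_cutoff=1e5)
```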
But how do you select the sample clock?
And as the input frequency increases, do we convert the digital output back to analog and see at which frequency the amplitude falls by 3 dB?
No, it's not signal power. It's rather the signal-to-noise ratio (signal-to-noise-and-distortion ratio, SNDR, to be more precise)!
The input signal bandwidth is the bandwidth at which your SNDR decreases by 3 dB. Why 3 dB? Here is the reason.
For an ADC, the effective number of bits (ENOB) is one of the key parameters:
ENOB = (SNDR - 1.76)/6.02
You can also see it like this: SNR = 6.02*N + 1.76 dB, where N is the number of bits of your ADC. So for 10 bits the SNR must be around 62 dB!
A decrease of 3 dB in SNDR will decrease your ENOB by approximately 0.5 bits. For instance, if it's a 10-bit ADC, the signal bandwidth extends up to the frequency where your ENOB is 0.5 bits lower, something like 9 instead of 9.5 ENOB!
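The arithmetic above is easy to check directly. A minimal sketch of the ENOB formula and the 3 dB ≈ 0.5 bit rule of thumb:

```python
def enob(sndr_db):
    """Effective number of bits from a measured SNDR in dB."""
    return (sndr_db - 1.76) / 6.02

# Ideal SNDR of an N-bit ADC is 6.02*N + 1.76 dB; for 10 bits:
sndr_10bit = 6.02 * 10 + 1.76   # 61.96 dB, i.e. "around 62 dB"

full_enob = enob(sndr_10bit)        # 10.0 bits
at_band_edge = enob(sndr_10bit - 3) # a 3 dB drop costs ~0.5 bit
```

Since each bit is worth 6.02 dB of SNDR, a 3 dB loss is 3/6.02 ≈ 0.498 bits, which is where the half-bit rule comes from.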
For more information, please refer to the CMOS data converters book by Gustavsson.
As an additional remark, the ADC analog bandwidth isn't related to the sampling frequency. In the case of an undersampling ADC, it may even be higher than the sampling frequency. Basically, the analog bandwidth is defined by the ADC acquisition time. Of course, if signals above the Nyquist frequency (0.5 × sampling frequency) are acquired, they can't be reconstructed unambiguously, but in some applications (e.g. digital receiver IF processing) this isn't necessary.
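The IF-sampling idea above relies on knowing where an out-of-band tone folds to. A small sketch of the folding rule (the 70 MHz / 100 MS/s numbers are a hypothetical example, not from the post):

```python
def alias_frequency(f_in, fs):
    """Apparent frequency of a sampled tone, folded about Nyquist (fs/2)."""
    f = f_in % fs
    return f if f <= fs / 2 else fs - f

# A 70 MHz IF sampled at 100 MS/s appears at 30 MHz. Because the signal
# band is known a priori, the fold is harmless and can even replace a mixer.
folded = alias_frequency(70e6, 100e6)   # 30 MHz
```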