israel_Y
I have an ADC that is somewhat similar to a digital-ramp ADC (3-bit). There is one clock for the counter, another clock for the sample-and-hold (to sample the input signal), and a sine-wave input to be digitized. My setup for characterizing the ADC is: S/H --> ADC --> DAC --> oscilloscope (DA).
I am using the code from the Maxim-IC application note to calculate SNDR, ENOB, and related parameters, after modifying it slightly (https://www.maximintegrated.com/app-notes/index.mvp/id/729). Every time I record the data and run the analysis, the ENOB varies by up to 0.5 bits, which is huge for a 3-bit ADC. My question is: how should I set the sampling frequency of the oscilloscope? (A rough sketch of the kind of FFT analysis I mean is below.)
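For reference, this is a minimal Python sketch of the FFT-based SNDR/ENOB calculation, not the Maxim code itself. It assumes a coherently sampled record (so the fundamental falls in a single FFT bin and no window is needed) exported from the oscilloscope as one column of values; the file name capture.csv is a placeholder.

```python
import numpy as np

def sndr_enob(samples):
    """Estimate SNDR and ENOB from a captured sine wave, in the spirit of
    Maxim app note 729: FFT of a coherently sampled record."""
    x = np.asarray(samples, dtype=float)
    x = x - np.mean(x)                      # remove the DC offset
    n = len(x)
    spectrum = np.abs(np.fft.rfft(x)) / n   # single-sided magnitude spectrum
    spectrum[0] = 0.0                       # ignore any residual DC bin
    sig_bin = int(np.argmax(spectrum))      # fundamental = largest remaining bin
    sig_power = spectrum[sig_bin] ** 2
    noise_dist_power = np.sum(spectrum ** 2) - sig_power
    sndr_db = 10 * np.log10(sig_power / noise_dist_power)
    enob = (sndr_db - 1.76) / 6.02          # standard SNDR-to-ENOB conversion
    return sndr_db, enob

# Example usage (capture.csv is a placeholder for the oscilloscope export):
# samples = np.loadtxt("capture.csv")
# print(sndr_enob(samples))
```

If the sampling is not coherent, the fundamental leaks into neighboring bins and gets counted as noise, which is one common reason for run-to-run ENOB variation.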
In the code there are variables to set, such as Fclk, Fin, and Fs. I set them as Fclk = sample-and-hold (S/H) frequency, Fin = input sine frequency, and Fs = sampling frequency of the oscilloscope. Is that right? Please help. (A small sketch of how Fin could be chosen relative to Fs for coherent sampling follows below.)
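One common way to make the ENOB numbers repeatable is coherent sampling: pick the input frequency so that an integer, odd number of cycles fits exactly into the capture record. The sketch below assumes you know the oscilloscope sampling rate and record length and can tune the signal generator; the numbers in the usage example are hypothetical.

```python
from math import gcd

def coherent_fin(fs, n_samples, fin_target):
    """Pick an input frequency near fin_target so that an integer, odd number
    of cycles M fits exactly into n_samples points at sample rate fs.
    Keeping M and n_samples mutually prime makes every sample land on a
    unique phase of the sine."""
    m = round(fin_target * n_samples / fs)  # nearest integer cycle count
    if m % 2 == 0:
        m += 1                              # force an odd cycle count
    while gcd(m, n_samples) != 1:           # ensure M and N share no factor
        m += 2
    return m * fs / n_samples

# Example with hypothetical numbers: scope at 1 MS/s, 4096-point record,
# input aimed near 10 kHz -> Fin of about 10009.77 Hz.
# print(coherent_fin(1e6, 4096, 10e3))
```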
Thank you in advance.