#### coffeelox



Hi there,

I am trying to measure the ENOB of my 10-bit ADC.

My input signal is a 7.5 kHz sine wave, and the sampling frequency is around 552 kHz (required by the application). The output is the sampled data passed through a DAC (written in Verilog-A, no filter).

First, I selected my input signal and ran the DFT with these settings:

- from: 30 µs to 1 ms (transient simulation ran from 0 to 1 ms)
- number of samples: 4096
- window type: Rectangular
- smoothing factor: 1
- coherent gain: default
- coherent gain factor: 1

Then I selected dB20 and got the graph on the right in red. Why is it not just one vertical line at 7.5 kHz?
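(For reference, this spread can be reproduced numerically. With fs = 552 kHz and N = 4096, a 7.5 kHz tone completes a non-integer number of cycles in the record, so a rectangular window leaks the tone's energy into neighbouring bins. A minimal NumPy sketch, where the "coherent" frequency choice of 75 cycles is just an illustration, not something from my setup:)

```python
import numpy as np

fs = 552e3   # sampling frequency from the post
fin = 7.5e3  # input tone from the post
N = 4096     # DFT length from the post

# Number of input cycles captured in the record:
cycles = fin * N / fs  # ~55.65 -- NOT an integer, so the record is non-coherent.
# With a rectangular window, a non-integer cycle count means the tone does not
# fall on a single DFT bin; its energy leaks into the surrounding bins.

t = np.arange(N) / fs
x = np.sin(2 * np.pi * fin * t)
spec = np.abs(np.fft.rfft(x)) / (N / 2)  # amplitude-normalized spectrum
peak_bin = int(np.argmax(spec))

# Coherent alternative (hypothetical): pick fin so the record holds an integer
# number of cycles, e.g. 75 cycles -> fin_coh = 75 * fs / N (~10.1 kHz).
fin_coh = 75 * fs / N
x_coh = np.sin(2 * np.pi * fin_coh * t)
spec_coh = np.abs(np.fft.rfft(x_coh)) / (N / 2)
# spec_coh has essentially all its energy in bin 75: a single vertical line.
```

(With the non-coherent tone, the peak bin carries only part of the amplitude and the adjacent bins are far above the noise floor; with the coherent tone, everything lands in one bin.)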

Second, I selected my output, ran the DFT with exactly the same settings, and got the graph on the right in blue.

It seems that neither my input signal nor my output signal looks good. Based on SNR = 6.02·ENOB + 1.76, I cannot get an ENOB anywhere near 10 bits. However, the time-domain plots look fine (top left is the input signal, bottom left is the DAC output).
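(As a sanity check on the formula: quantizing a coherent full-scale sine with an ideal 10-bit quantizer and measuring SNDR from the power spectrum should give an ENOB close to 10. A sketch under assumed conditions; the bin choice k = 75 is hypothetical, chosen only so the record is coherent:)

```python
import numpy as np

fs, N = 552e3, 4096
bits = 10
k = 75                    # integer bin count -> coherent record (hypothetical)
fin = k * fs / N

t = np.arange(N) / fs
x = np.sin(2 * np.pi * fin * t)  # full-scale sine over [-1, 1]

# Ideal 10-bit quantizer (rounding to the nearest LSB over a 2 V range)
lsb = 2.0 / (1 << bits)
xq = np.round(x / lsb) * lsb

spec = np.abs(np.fft.rfft(xq)) ** 2            # power spectrum
signal_p = spec[k]                             # tone power sits in one bin
noise_p = np.sum(spec) - spec[0] - spec[k]     # everything except DC and tone
sndr_db = 10 * np.log10(signal_p / noise_p)

# Invert SNR = 6.02*ENOB + 1.76 to recover the effective number of bits
enob = (sndr_db - 1.76) / 6.02
```

(Any shortfall from ENOB = 10 in a real measurement then comes from the ADC and from the test setup, e.g. leakage from a non-coherent record, not from the formula itself.)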

Why? Did I do something wrong?

Thanks...

JS