I'm simulating an 8-bit folding and interpolating ADC and
just found out that its SFDR is 60 dB at a sampling rate
of 25 MHz. When I increase the sampling rate to 40 MHz,
the SFDR degrades to 30 dB. Does anyone have an idea which
main factors cause this? Thanks in advance.
I would imagine it relates to the settling time inside
the pipeline(?) stages: finite time to get to sub-LSB
error against a shortening clock period.
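
As a back-of-the-envelope check (assuming a single-pole
settling model and the full clock period available for
settling, both of which are my assumptions rather than
anything from your design):

import numpy as np

# Single-pole settling: residual error after time t is
# exp(-t / tau). Settling to within 1/2 LSB of an N-bit
# converter therefore needs t > tau * ln(2^(N+1)).
N = 8
n_tau = np.log(2 ** (N + 1))      # about 6.24 for 8 bits

for fs in (25e6, 40e6):
    tau_max = (1 / fs) / n_tau    # largest tolerable time constant
    print("fs = %.0f MHz -> tau must be under %.2f ns"
          % (fs / 1e6, tau_max * 1e9))

The budget tightens from roughly 6.4 ns to 4.0 ns, so a
stage that was marginal at 25 MHz can fall apart quickly
at 40 MHz.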
Have you determined which element of the SFDR rollup
is driving the degradation (noise vs offset vs gain error
or whatever)?
I am somewhat surprised at getting 60 dB SFDR (1000:1)
out of an 8-bit (256:1) machine. That is suspicious to
my barely-DAQ-educated eye. Perhaps this initial result
is unrealistically good, and the nonidealities just bite
harder at the simulated 40 MHz.
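
For calibration, the textbook ideal quantization-noise
figure (SNR rather than SFDR, so treat it as a yardstick
only):

N = 8
print(6.02 * N + 1.76)    # ideal quantization SNR, ~49.9 dB

SFDR tracks the single worst spur rather than total noise,
so it can legitimately sit above that number; it just makes
me want a sanity check on the testbench (record length,
coherent sampling, and so on).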
I mean that SFDR is a combined figure of merit that
embodies both DC (INL, DNL, etc.) and frequency-dependent
(bandwidth, settling time) effects. As such it is perhaps
a good quick-comparison metric, but not necessarily
enlightening about the dominant effects. By decomposing it
(or resimulating with the different error sources switched
in individually) you might see more clearly what drives
the degradation.
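
If it helps, here is a minimal sketch of pulling SFDR out
of a simulated output record, so you can rerun it with the
individual error sources enabled one at a time (the function
name, Hann window, and guard band are my own choices, not
anything standard):

import numpy as np

def sfdr_db(samples):
    # Fundamental magnitude over worst-spur magnitude, in dB.
    # Assumes the fundamental is the largest non-DC FFT bin.
    windowed = samples * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    spectrum[0:3] = 0.0                 # drop DC and its leakage
    fund = np.argmax(spectrum)
    fund_mag = spectrum[fund]
    # Null the fundamental plus a leakage guard band; the
    # largest remaining bin is the worst spur.
    spectrum[max(fund - 3, 0):fund + 4] = 0.0
    return 20 * np.log10(fund_mag / spectrum.max())

# Toy example: ideally quantized, non-bin-centered sine
fs, fin, n = 40e6, 2.111e6, 8192
t = np.arange(n) / fs
codes = np.round(127 * np.sin(2 * np.pi * fin * t))
print("SFDR = %.1f dB" % sfdr_db(codes))

Swap the toy quantizer for your simulator's output codes
and diff the runs with offsets, gain errors, and finite
bandwidth switched on one at a time.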