Djaferbey
Newbie level 6
I'm trying to build a digital decoder. I sample a digitally modulated signal with an LTC2387, an 18-bit, 15 MHz SAR ADC. The carrier is at 2.1 MHz and is digitally mixed down to DC; after that, decimation is performed. The ADC has a dynamic range of 95 dBFS. I then run a BERT (bit error rate test), and for some reason errors only start occurring once the signal drops to -106 dBFS. This doesn't make any sense to me, since I only have 95 dBFS of dynamic range. I suspect that the decimation might be improving the signal-to-noise ratio, but I'm not sure. I hope someone can provide an explanation.
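As a quick numerical check of my suspicion, here is a minimal sketch (the record length, decimation ratio, and boxcar filter are illustrative assumptions, not my actual chain): averaging R samples after mixing to DC cuts the broadband noise power by roughly 10*log10(R) dB while a coherent tone at DC survives, so a tone below the full-band noise floor can come out above the decimated noise floor.

```python
import numpy as np

fs = 15e6   # LTC2387 sample rate
fc = 2.1e6  # carrier frequency
n = 1 << 18  # illustrative record length (assumption)

rng = np.random.default_rng(0)
t = np.arange(n) / fs

# Tone amplitude at -106 dBFS, broadband noise at -95 dBFS rms
# (illustrative levels matching the figures above).
tone = 10 ** (-106 / 20) * np.cos(2 * np.pi * fc * t)
noise = 10 ** (-95 / 20) * rng.standard_normal(n)
x = tone + noise

# Digital downconversion: mix to DC with a complex NCO, then
# decimate by R using a simple boxcar average as the filter.
R = 1024
bb = (x * np.exp(-2j * np.pi * fc * t)).reshape(-1, R).mean(axis=1)

# Full-band SNR is negative: the tone is buried in the noise.
snr_full = 10 * np.log10((10 ** (-10.6) / 2) / 10 ** (-9.5))

# After decimation the noise power per sample drops by about
# 10*log10(R) = 30 dB, while the DC tone is preserved.
# np.var subtracts the mean, so it estimates the noise alone.
snr_dec = 10 * np.log10(np.abs(bb.mean()) ** 2 / np.var(bb))

print(f"full-band SNR: {snr_full:.1f} dB, "
      f"after decimate-by-{R}: {snr_dec:.1f} dB")
```

With these assumed numbers the full-band SNR is around -14 dB, and the decimated output shows the tone comfortably above the noise, which would explain a working BERT below the ADC's quoted dynamic range.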