s55
Junior Member level 2
I have some questions about ADC performance with a jittery clock.
Here are the equations used in my questions.
Let's ignore other noise sources except the quantization and the clock jitter for simple discussion.
SNR_j = -20 log(tj * 2*pi * fo) --- (eq.1)
= 10 log (P_sig / P_j) --- (eq.2)
where SNR_j: SNR between the signal power and the jitter induced noise power
tj: rms clock jitter
fo: ADC input signal frequency
P_sig: signal power in watts
P_j: jitter-induced noise power in watts
Here are my questions:
1. Theoretically, the signal power plus the harmonic power at the ADC output should be reduced by the total phase-noise power caused by the clock jitter. I calculated that reduction (ignoring the harmonic power for simplicity) as follows.
For example, assume tj = 20 ps rms, fo = 10 MHz, and a signal power of 10 mW, i.e. 10 dBm (a sinusoid swinging from +1 V to -1 V, which gives 10 mW into a 50-ohm load). Then:
(using (eq.1))
SNR_j = -20 log(tj * 2*pi * fo) = 58dB
(using (eq.2))
SNR_j = 10 log (P_sig / P_j)
P_j = P_sig / (10^(SNR_j / 10)) = 15.8nW
The remaining signal power is then: P_sig - P_j = 10 mW - 15.8 nW = 0.0099999842 W
or 10 log (0.0099999842 W / 1 mW) = 9.9999931 dBm (still almost exactly 10 dBm)
To me, 20 ps of clock jitter is far too high for ADC operation, so I expected the fundamental signal power to be significantly reduced; yet, as the calculation shows, it is hardly reduced at all. Please point out any error in my derivation.
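The arithmetic above can be checked with a short script. This is only a sketch of the numbers in the example (it assumes the 10 dBm figure, i.e. a 50-ohm load for the ±1 V swing); it applies (eq.1) and then rearranges (eq.2) for P_j:

```python
import math

# Example parameters (50-ohm load assumed for the 10 mW / 10 dBm figure)
tj = 20e-12      # rms clock jitter, seconds
fo = 10e6        # ADC input signal frequency, Hz
P_sig = 10e-3    # signal power, W (10 dBm)

# (eq.1): jitter-limited SNR in dB
SNR_j = -20 * math.log10(tj * 2 * math.pi * fo)

# (eq.2) rearranged: jitter-induced noise power in W
P_j = P_sig / (10 ** (SNR_j / 10))

# Signal power left after subtracting the jitter noise, in dBm
P_rem_dBm = 10 * math.log10((P_sig - P_j) / 1e-3)

print(f"SNR_j = {SNR_j:.1f} dB")            # ~58.0 dB
print(f"P_j = {P_j * 1e9:.1f} nW")          # ~15.8 nW
print(f"remaining = {P_rem_dBm:.7f} dBm")   # ~9.9999931 dBm
```

This reproduces the numbers in the example: the jitter noise is about 58 dB below a 10 mW carrier, so subtracting it barely moves the signal power in dBm.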
2. It is known that clock jitter produces two types of phase noise: close-in phase noise and broadband phase noise.
If we know the jitter-induced noise power, 15.8 nW, can we split it into a close-in portion and a broadband portion? If so, how can we calculate the split?