NikosTS
Advanced Member level 4
Hello everyone,
Suppose we have a Fractional-N frequency synthesizer employing an MMD + 3rd-order MASH loop. If we want to divide by the fractional value 100.XX (where .XX is the fractional part), the MASH will instruct the MMD to divide by integer values between 97 and 104.
My question is: what is the maximum allowed delay for a new division value to be loaded from the MASH into the divider?
At first glance, you would say it should be 1 period of the VCO frequency, so as not to lose a single period between the two division values. However, because of the architecture of the MMD, as I understand it, there is some extra time during which you can still load a new value without dividing wrongly.
If Tvco is the output period of the VCO, could we quantify this maximum delay in terms of VCO periods? E.g., max delay = 10*Tvco?
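To make the intuition above concrete, here is a minimal behavioral sketch, under an assumption I am making about the retiming (it is not taken from any specific MMD design): the MASH word is sampled by the divider only once per division cycle, at the divider's own output edge (terminal count). Under that assumption, a new word arriving any time before the next terminal count is still applied cleanly, so the slack is almost a full division cycle, not a single Tvco. Real 2/3-cell MMDs retime the modulus-control bits stage by stage, so the actual margin is architecture-dependent.

```python
def max_update_slack_tvco(current_modulus: int) -> int:
    """Latest arrival time (in VCO periods after the divider's output
    edge) at which a new MASH word is still applied on the next cycle,
    assuming the word is sampled once per cycle at terminal count.

    The counter reloads current_modulus VCO periods after the output
    edge, so the word must arrive at least one Tvco before that.
    """
    return current_modulus - 1

# For a 3rd-order MASH around N = 100, the moduli span 97..104:
for n in range(97, 105):
    print(f"modulus {n}: new word may arrive up to {max_update_slack_tvco(n)}*Tvco late")
```

Under this (assumed) sampling scheme the answer to the question would be roughly (Nmin - 1)*Tvco, i.e. about 96*Tvco in this example, rather than 1*Tvco.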
If I wasn't clear enough, please tell me so I can explain it better.
Thank you in advance.