jbord39
Newbie level 3
Hey all,
I am currently working on a Master's thesis in Electrical Engineering and am very interested in Delta-Sigma ADCs built around a VCO. I am having trouble finding a solid starting reference for this topic; in the meantime, I have a few questions about these papers if anyone can help.
"A Time-Based Analog-to-Digital Converter
Using a Multi-Phase Voltage-Controlled Oscillator" (IEEE)
My question concerns the following paragraph:
"The output phase is coarsely quantized by the counter which in this case counts the rising and falling edges of the VCO output. The quantization error of the counter, which we will call the residual phase, is quantized by the phase detector with the resolution of pi/(Number of VCO phases) with the help from the multi-phase outputs of the VCO."
I understand how the phase is coarsely quantized by the counter watching a single phase of the VCO output (leaving a quantization error between 0 and 2*pi). However, I do not understand how the phase detector (which apparently has N inputs) can use the multi-phase outputs of the VCO to quantize that residual error more finely.
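To check my reading, here is a quick numeric sketch (my own toy model, not the paper's circuit; N = 8 phases is chosen arbitrarily for illustration). If the N ring-oscillator outputs are equally spaced by pi/N, then sampling all of them at once yields a thermometer code, and summing that code locates the residual phase with resolution pi/N:

```python
import math

N = 8  # hypothetical number of VCO phases, chosen only for this sketch

def sample_phases(residual):
    """Snapshot the N ring-oscillator outputs when phase 0 has advanced
    by `residual` radians (residual in [0, pi)).
    Phase k lags phase 0 by pi/N radians."""
    return [1 if math.sin(residual - k * math.pi / N) >= 0 else 0
            for k in range(N)]

def fine_code(residual):
    """The sampled word is a thermometer code: every phase whose edge has
    already passed reads 1.  Summing it bins the residual into steps of
    pi/N, i.e. floor(residual / (pi/N))."""
    return sum(sample_phases(residual)) - 1
```

Is this thermometer-code reading the mechanism the paper means by "with the help from the multi-phase outputs of the VCO"?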
The other reference I have questions about is:
"Analysis and Design of Voltage-Controlled Oscillator
Based Analog-to-Digital Converter" (IEEE)
First, regarding Fig. 1: I understand how counting each phase of the VCO and summing the counts could give better resolution. However, the sum appears to be just a single bit, and each counter seems to be a single bit, so how can it count more than one VCO rising/falling edge per clock cycle?
Also, in the implementation used in this paper:
"The VCO was implemented using the 32-stage differential delay cell shown in Fig. 19 [18]. The ring VCO is followed by 64 1-bit quantized reset counters and a 64-bit adder. The reset counter consists of one divider, two DFFs, and an XOR gate."
Again, I do not understand how a 1-bit counter can really count the number of VCO rising edges if there is more than one per sample period.
It also says the reset counter consists of a divider, two DFFs, and an XOR gate. How can you divide a single bit by two?
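For reference, here is how I currently read the reset counter, written as pseudo-hardware in Python (this is my own guess at the circuit, not something stated in the paper): the divider toggles its output level on each rising edge of its phase, one DFF samples that level at the sample clock, the second DFF holds the previous sample, and the XOR flags whether the level changed during the period:

```python
class OneBitResetCounter:
    """My guess at the divider + two-DFF + XOR 'reset counter':
    it outputs 1 per sample period iff an odd number of phase edges
    arrived, so it only counts correctly if at most one edge occurs
    per phase per sample period."""

    def __init__(self):
        self.div = 0   # divide-by-2 state: toggles on each phase edge
        self.prev = 0  # previous sampled divider level (second DFF)

    def phase_edge(self):
        """Model a rising edge on this VCO phase."""
        self.div ^= 1

    def sample(self):
        """Model the sample clock: XOR of current and previous level."""
        cur = self.div
        out = cur ^ self.prev
        self.prev = cur
        return out
```

If this reading is right, two edges in one sample period would toggle the divider twice and the XOR would read 0, losing a count, which is exactly what confuses me about how 64 of these plus an adder can cover higher VCO rates.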
If anyone could help explain this or give me a good reference I would really appreciate it.
Sincerely,
John