Welcome to EDAboard.com

Switch Cap Sigma Delta ADC Reference, use supply or need an LDO?

Status
Not open for further replies.

jgk2004

Full Member level 5
Hello,

I am wondering if anyone has industry experience with this topic. The switched-cap feedback in a sigma-delta ADC (we can also consider a SAR ADC) samples a Vrefp and Vrefn. Let's take a single-bit feedback example. If I make this Vrefp = VDD and Vrefn = VSS, this means I maximize my signal to noise ratio, since this allows for a larger input signal into the ADC. Now if I look at the peak currents pulled from this switched cap and compare them to a high-speed LDO which would be needed to deliver these currents, the supply of the LDO sees about the same peak currents. Doesn't that defeat the purpose of using an LDO? How many people just connect their references right up to VDD/VSS? Do you always use LDOs? One other issue with an on-chip LDO is that I would also have to pick Vrefp/Vrefn smaller than VDD/VSS, maybe VDD-200mV and VSS+200mV, to keep the LDO pass transistors in saturation... which is bad for the signal to noise ratio...
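As a sanity check on those peak currents, here is a back-of-the-envelope sketch. All component values below are hypothetical placeholders, just to show the orders of magnitude involved:

```python
# Rough average vs. peak current drawn by a switched-cap reference.
# C_S, R_SW etc. are assumed values, not from the thread.
C_S   = 4e-12    # sampling/feedback cap (F)
VREF  = 1.0      # Vrefp - Vrefn (V), i.e. the full supply in this scenario
F_CLK = 20e6     # sampling clock (Hz)
R_SW  = 100.0    # on-resistance of the sampling switch (ohm)

i_avg  = C_S * VREF * F_CLK   # average current the reference must supply
i_peak = VREF / R_SW          # worst-case instantaneous peak at switch closure
tau    = R_SW * C_S           # settling time constant of the sampling event

print(f"average: {i_avg*1e6:.1f} uA, peak: {i_peak*1e3:.1f} mA, tau: {tau*1e9:.2f} ns")
```

With these numbers the average draw is tens of microamps, but the instantaneous peak is limited only by the switch resistance and can be milliamps — which is exactly why the question of what actually delivers that peak (LDO loop, decap, or supply) arises.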

NOTE: when I say connect up to VDD/VSS I still assume separate bumps/bondpads for some isolation. But then off chip it would have its own big LDO with large decap.

I am looking for votes and reasons if possible... I can't really find anything on the web :(

Thanks for all feedback
JGK
 

I think there are more reasons to use a reference voltage and an LDO:
- VDD tolerance is too high,
- unknown supply noise in many cases,
- modularity if the ADC must be reused in other chips,
- non-linearity at the edges of the input range,
- it may be unwise to connect low-voltage devices to rails that are somehow connected to the environment (ESD) or to a different voltage domain.
 

Hi,

I have no experience with IC design ... but with precise analog and digital signal processing for the industry.

From my experience ... and my error calculations:

If I make this Vrefp = VDD and Vrefn=VSS this means I maximize my signal to noise ratio since this allows for larger input signal into the ADC.
This is not generally true.
It just gives you a higher input voltage range. But not necessarily a better signal to noise performance.
It depends on the signal source and how the signal is "conditioned" on the analog side.
Example:
If you have an input signal with 1Vpp range at an ADC with 1Vpp range
and now you increase the ADC input range to 3Vpp
--> usually you want to amplify the input signal, usually with an opamp. But this opamp circuit will introduce new errors like offset, offset drift, nonlinearities and - last but not least - noise.
You have to take all this into account to see whether this gives an improvement in overall performance.
Sometimes it does, sometimes not. It´s not clear from the beginning.

Now if I look at the peak currents pulled from this switch cap and compare them to an high speed LDO which would be needed to deliver these currents, the supply on the LDO has about the same peak currents.
You should not do this as described.
Instead you should have a suitable capacitor (in size and type) at the LDO output to provide this short peak current. For sure this still will require the LDO to deliver the current, but over a longer time. Much more relaxed.
You should avoid ringing - or in other words: improve stability. This may be satisfied with a series resistor.
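A quick sketch of the cap sizing Klaus describes, with assumed values for illustration:

```python
# How big must the LDO output cap be so that one sampling event
# droops the reference by less than some budget? Values are hypothetical.
C_S   = 4e-12    # charge drawn per sample: Q = C_S * VREF
VREF  = 1.0      # reference voltage (V)
DROOP = 0.5e-3   # allowed droop per sample (V), e.g. roughly LSB/2

q_sample = C_S * VREF          # charge removed from the cap each sample
c_dec    = q_sample / DROOP    # cap that limits the droop to the budget
print(f"need roughly {c_dec*1e9:.1f} nF of decoupling")
```

The LDO then only has to replace the charge over a whole clock period instead of delivering the peak itself — the "much more relaxed" case above.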

How many people just connect up their references right up to VDD/VSS?
Many. Too many for my taste.
I don´t like the idea of using VDD as reference for an ADC. VDD is a supply voltage, not an accurate, precise, stable, reliable reference. VDD may be inaccurate, drift with time and load current, and will carry a lot of noise.
These errors you will see 100% at the ADC result.
You may suppress high frequency noise, but not drift and load current problems.
For sure it depends on the requirements of the application, but as soon as you expect more than 8bit precision you need a clean REF. (There are exceptions, like ratiometric measurement)

Some people use a high resolution 24bit ADC .... but 16 of the 24 bits are useless because of noise and drift.
To verify on your own:
* check the variation of let's say 100 consecutive samples (on a known stable input)
* check the variation of a known input today and tomorrow, or in winter and in summer

To get a clue about the noise source: Run an FFT on a lot of ADC samples to see noise frequencies.
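A minimal sketch of that FFT check, using synthetic samples in place of real ADC captures (the tone frequency, sample rate and noise level are made up):

```python
import numpy as np

# Synthetic stand-in for captured ADC samples: a 50 kHz tone plus
# white noise, sampled at 1 MS/s. Replace x with your real capture.
rng = np.random.default_rng(0)
fs, n = 1_000_000, 4096
t = np.arange(n) / fs
x = 0.5 * np.sin(2 * np.pi * 50_000 * t) + 100e-6 * rng.standard_normal(n)

win   = np.hanning(n)                            # window to reduce leakage
spec  = np.abs(np.fft.rfft(x * win)) * 2 / win.sum()
freqs = np.fft.rfftfreq(n, d=1/fs)

peak_f = freqs[np.argmax(spec)]
print(f"dominant bin near {peak_f/1e3:.1f} kHz")
```

On real data, spurs at the mains frequency, the switching regulator frequency, or the sampling clock and its harmonics point directly at the noise source.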

One other issue with an on chip LDO would be I also have to pick Vrefp/Vrefn smaller then VDD/VSS, maybe VDD-200mV and VSS+200mV to keep pass LDO transistors in saturation.... which is bad for signal to noise ratios...
I can´t fully agree (only partly).
An example: 12 bit ADC with an input range of 3.3V vs 3.0V
an LSB represents:
* 0.806mV @ 3.3V
* 0.732mV @ 3.0V
--> the difference is just 73uV. I doubt one really cares about 73uV at a 12bit ADC.
(you may adjust the calculation with the values from your application)
Further calculation: what does this mean for the ADC results?
Let´s convert a 100.00mV signal:
* The ADC output is 124 (3.300V) --> calculates back to 99.90mV (with an uncertainty of 0.806mV)
* The ADC output is 137 (3.000V) --> calculates back to 100.34mV (with an uncertainty of 0.732mV)
And here (just for this 100mV example) the deviation of 0.34mV is almost worst case.
If you use a rounding method, both values result in 100mV (don´t round to less than the LSB step size)

The error (uncertainty) on a 100mV signal is
* 0.806mV (3.3V) = +/-0.403% wrt. 100mV
* 0.732mV (3.0V) = +/-0.366% wrt. 100mV
--> The error of a resistive voltage divider usually is higher. And for sure the error of VDD drift is much much higher.
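The arithmetic above can be reproduced in a few lines (nearest-code rounding is assumed):

```python
# Reproducing the 12-bit LSB arithmetic for both reference choices.
BITS = 12
VIN  = 0.100  # the 100.00 mV example input (V)

for vref in (3.3, 3.0):
    lsb  = vref / 2**BITS      # one step of the ADC
    code = round(VIN / lsb)    # nearest ADC output code
    back = code * lsb          # voltage reconstructed from that code
    print(f"Vref={vref} V: LSB={lsb*1e3:.3f} mV, code={code}, back={back*1e3:.2f} mV")
```

This reproduces the codes 124 and 137 and the back-calculated 99.90 mV and 100.34 mV quoted above.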

****
Conclusion:
It depends on the application whether one solution is better than the other.
Like whether you are interested in AC, low DC values, high DC values, ratiometric, absolute, relative....
--> to get best overall performance you have to take care for the whole measurement chain. And you have to focus on the part with the highest error.

Klaus
 

Hi KlausST,
I like all your feedback, and really like your conclusion. Let me answer some points inline with your comments, and I will make some points which maybe we can chat about in more detail.

My old statement copied: If I make this Vrefp = VDD and Vrefn=VSS this means I maximize my signal to noise ratio since this allows for larger input signal into the ADC.

This is not generally true.
It just gives you a higher input voltage range. But not necessarily a better signal to noise performance.
It depends on the signal source and how the signal is "conditioned" on the analog side.
Example:
If you have an input signal with 1Vpp range at an ADC with 1Vpp range
and now you increase the ADC input range to 3Vpp
--> usually you want to amplify the input signal, usually with an opamp. But this opamp circuit will introduce new errors like offset, offset drift, nonlinearities and - last but not least - noise.
You have to take all this into account to see whether this gives an improvement in overall performance.
Sometimes it does, sometimes not. It´s not clear from the beginning.
When designing an ADC I want to maximize my input signal. If my VDD and VSS are my references, I can then have my max signal between these values. With a single-bit sigma-delta (DSM) ADC I can only achieve -4 to -5dBFS within this VDD-to-VSS range, so I am not at 0dBFS. Thus there is no problem for the driver amplifier/buffer, since it is not going to go to 0dBFS (rail to rail) for the input signal anyway. The DSM ADC can never achieve 0dBFS due to feedback stability. So if I maximize my input signal, I can then design, for example, my kT/C noise to meet my ADC requirements, maximizing my SNR with minimum power dissipation. Of course I need to take into account lineup noise/linearity and offset from preceding blocks, but still, maximizing the signal is, I would assume, the best thing possible.
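For reference, a rough kT/C sizing under these constraints might look like this. The SNR target, OSR and signal levels below are assumptions for illustration, not jgk2004's actual numbers:

```python
import math

# Minimum sampling cap for a target SNR, kT/C-noise limited.
K_B    = 1.380649e-23  # Boltzmann constant (J/K)
T      = 300.0         # temperature (K)
SNR_DB = 86.0          # ~14-bit level (6.02*14 + 1.76), assumed target
OSR    = 64            # assumed oversampling ratio; spreads kT/C over the band
A_FS   = 0.5           # full-scale sine amplitude for a 1 V reference (V)
DBFS   = -4.0          # max stable DSM input level, per the post

v_rms   = A_FS * 10**(DBFS / 20) / math.sqrt(2)   # usable signal, Vrms
snr_lin = 10**(SNR_DB / 10)
c_min   = K_B * T * snr_lin / (v_rms**2 * OSR)    # C >= kT*SNR / (Vrms^2 * OSR)
print(f"minimum sampling cap ~ {c_min*1e12:.2f} pF")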

My old statement copied: Now if I look at the peak currents pulled from this switch cap and compare them to an high speed LDO which would be needed to deliver these currents, the supply on the LDO has about the same peak currents.
You should not do this as described.
Instead you should have a suitable capacitor (in size and type) at the LDO output to provide this short peak current. For sure this still will require the LDO to deliver the current, but over a longer time. Much more relaxed.
You should avoid ringing - or in other words: improve stability. This may be satisfied with a series resistor.
I unfortunately don't have this option. At each sampling instant my LDO needs to settle to below LSB/2, so you either need a power-hungry LDO (class A) which burns some current to drop the output impedance and deliver the peak currents, and/or a very large cap. I don't have those options, since I need a local LDO that doesn't consume more current than the ADC. I have designed a very wide-bandwidth LDO with fast settling to do the job with no decap. My ADC clock is at 20MHz, and this LDO's GBW is at 300MHz while only consuming 20uA... This also meets my noise requirements. That is why I was looking at peak currents and questioning using it... Have you ever done high speed LDOs before? Capless?
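The settling requirement can be estimated with a single-pole model. This is a simplification: the bit depth and clock follow the post, the rest (half a period available, pure exponential recovery) is assumed:

```python
import math

# How fast must the reference LDO be for a single-pole exponential
# recovery to reach LSB/2 within half a clock period?
BITS  = 14
F_CLK = 20e6

t_settle = 0.5 / F_CLK               # half period available: 25 ns
n_tau    = math.log(2**(BITS + 1))   # ln(2^15) time constants to reach LSB/2
tau_max  = t_settle / n_tau          # largest allowed time constant
f_3db    = 1 / (2 * math.pi * tau_max)  # required closed-loop bandwidth
print(f"tau < {tau_max*1e9:.2f} ns -> f_3dB > {f_3db/1e6:.0f} MHz")
```

About 10.4 time constants are needed, giving a closed-loop bandwidth requirement in the tens of MHz — consistent with the 300 MHz GBW figure mentioned for the loop.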

My old statement copied: How many people just connect up their references right up to VDD/VSS?
Many. Too many for my taste.
I don´t like the idea of using VDD as reference for an ADC. VDD is a supply voltage, not an accurate, precise, stable, reliable reference. VDD may be inaccurate, drift with time and load current, and will carry a lot of noise.
These errors you will see 100% at the ADC result.
You may suppress high frequency noise, but not drift and load current problems.
For sure it depends on the requirements of the application, but as soon as you expect more than 8bit precision you need a clean REF. (There are exceptions, like ratiometric measurement)

Some people use a high resolution 24bit ADC .... but 16 of the 24 bits are useless because of noise and drift.
To verify on your own:
* check the variation of let's say 100 consecutive samples (on a known stable input)
* check the variation of a known input today and tomorrow, or in winter and in summer

To get a clue about the noise source: Run an FFT on a lot of ADC samples to see noise frequencies.
I agree here... I think a lot of people do do it... but: if you have dedicated bumps/balls/pads with separate LDOs just for these pins, can't you meet the requirements of being accurate, stable and low noise? What is the difference between an LDO on chip and off chip, besides the inductance/capacitance issues of getting into the chip? Is that the only issue, or do you feel external LDOs can never be as good as on-chip ones?

As for your 24 bit example, I'm not at that range, only 14 bits, but I can see your point. You could be shopping for external LDOs for a while, or burn even more current to make them clean off chip.


My old statement copied: One other issue with an on chip LDO would be I also have to pick Vrefp/Vrefn smaller then VDD/VSS, maybe VDD-200mV and VSS+200mV to keep pass LDO transistors in saturation.... which is bad for signal to noise ratios...
I can´t fully agree (only partly).
An example: 12 bit ADC with an input range of 3.3V vs 3.0V
an LSB represents:
* 0.806mV @ 3.3V
* 0.732mV @ 3.0V
--> the difference is just 73uV. I doubt one really cares about 73uV at a 12bit ADC.
(you may adjust the calculation with the values from your application)
Further calculation: what does this mean for the ADC results?
Let´s convert a 100.00mV signal:
* The ADC output is 124 (3.300V) --> calculates back to 99.90mV (with an uncertainty of 0.806mV)
* The ADC output is 137 (3.000V) --> calculates back to 100.34mV (with an uncertainty of 0.732mV)
And here (just for this 100mV example) the deviation of 0.34mV is almost worst case.
If you use a rounding method, both values result in 100mV (don´t round to less than the LSB step size)

The error (uncertainty) on a 100mV signal is
* 0.806mV (3.3V) = +/-0.403% wrt. 100mV
* 0.732mV (3.0V) = +/-0.366% wrt. 100mV
--> The error of a resistive voltage divider usually is higher. And for sure the error of VDD drift is much much higher.
I see your example, and for large ranges it makes sense, but I have a lower supply, only 1V. When decreasing to 0.6V (since I lose 200mV at top and bottom) I lose ~40% of my signal range. Thus I am almost doubling my power consumption to meet the noise requirements.
Also, your statements on errors all make sense, I think, but when considering power consumption, maximizing these references matters most of all. I could always argue that if I need more accuracy I could just increase my resolution... but then I would also need to increase power again.
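A crude kT/C argument quantifies the headroom penalty. The assumption (a simplification) is that at fixed SNR and clock rate, power scales roughly linearly with the sampling cap, and the cap must grow as 1/Vfs² when the full scale shrinks:

```python
# kT/C-limited power penalty of shrinking the reference span.
v_full = 1.0   # rail-to-rail reference span (V)
v_ldo  = 0.6   # span left after 200 mV of LDO headroom top and bottom (V)

# Same SNR with 0.6x the signal needs (1/0.6)^2 times the sampling cap,
# and power ~ C * f at a fixed clock.
penalty = (v_full / v_ldo) ** 2
print(f"cap (and roughly power) must grow ~{penalty:.1f}x")
```

Under this crude model the penalty is closer to 2.8x than 2x, which only strengthens the argument against burning 400 mV of reference span on LDO headroom.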

****
Conclusion:
It depends on the application whether one solution is better than the other.
Like whether you are interested in AC, low DC values, high DC values, ratiometric, absolute, relative....
--> to get best overall performance you have to take care for the whole measurement chain. And you have to focus on the part with the highest error.
Very true. That's why I am challenging the regulators at this point. I need to meet my power requirements and I am concerned with the LDOs eating my headroom. I am also worried about the settling requirements on the LDO, since if it doesn't settle, the linearity degrades badly.

I am looking at three choices:

1. Fast LDOs which operate within my supply, using core devices for speed. This costs signal headroom and thus increases ADC power, but it seems to be the best option at this point: minimum power increase in the LDOs, meets linearity, small area.

2. Slow LDOs with an increased supply on chip. Here high-VT devices are used; they are slow and I need a large decap. Best power for the ADC, but I am not sure about linearity due to the LDO regulation speed and the decap needed. Maybe the decap can be off chip, though bond/bump capacitance and inductance could be a problem. (Is this the most common approach? Just put the decap off chip?)

3. Push it off chip and use external LDOs and decap. Bond/bump capacitance and inductance are an issue with a large decap, and an external LDO can be an issue for accuracy and noise.

I am voting for choice 1, but not 100% sure at this point... that's why I am posting here and wanting feedback!

JGK
 

Hi,

I see there is a lot of confusion because of my lack of experience in IC design.
I hope I did not confuse you too much.

*****
I can only speak as an IC_user with experience in designing industrial measurement devices.

No need to read through the lengthy post... it´s just my feedback on your questions.


When designing an ADC I want to max my input signal.
This may be true for many cases, but not generally. Maybe it´s good for your application, I don´t know.

I bought a delta sigma ADC powered with 3.3V. It has a max input voltage range of 600mV and I´m happy with it.
It even has an input gain stage to make the usable input voltage range even smaller.
Having the same ADC with 3.3V input voltage range does not make my measurement system better in any way I can think of.

If you just focus on the ADC, then you may be right (in most cases) -- that's my conclusion.

but still max signal, I would assume the best thing possible.
When we just look at the SNR of the ADC, this might be correct.

Different opinions are no bad thing. We both have different jobs. You are an IC designer - and thus you are focussed on IC performance. I have no experience with IC design, I´m just an IC user. I´m a measurement device designer, thus I´m focussed on overall performance. These are different views with different focus. None is better than the other.

I unfortunately don't have this option.
This is why I wrote that I have no experience with IC design. I don´t have a clue about the options you have or don't have. I´m not even a beginner in this field. :-(
My experience is limited to using an LDO as one IC and an ADC as another IC.

--> There are many experienced IC designers in this forum. I hope they can help you more than I with this.

Have you ever done high speed LDOs before? capless?
Definitely not.

can't you meet the requirements of being accurate, stable and low noise?
As a non-IC-designer: for me an LDO is a voltage regulator for a power supply. It´s not that important for it to be low noise, low drift ... unlike a voltage reference (in my thinking, a separate IC that generates a low noise, very accurate, low drift voltage).

I have a lower supply, only 1V
I´m used to 3.3V ...

if I need more accuracy, I could just increase my resolution...
...but accuracy and resolution are two different and independent things.
The accuracy of an 8 bit ADC can be better than the accuracy of a 12 bit ADC.

I am looking at three choices:
Although I can´t help you ... I guess it could be useful to give values for
* expected accuracy
* expected current consumption
* linearity
* and so on
so others with more experience in IC design can better assist you and focus on one thing or the other.

Klaus
 
