
How to reduce the noise of an opamp design?

hearter

I am working on a single-stage, 5-transistor opamp and trying to minimize the noise (thermal and flicker) by sizing the MOS devices and the bias current, but I really can't come up with an optimal solution. For example, W*L needs to be big to reduce the flicker noise, while the thermal noise increases if you have higher transconductance. If I increase the current, the thermal noise increases (the output noise current of each MOS is proportional to gm), but the input-referred noise voltage is actually reduced, because the input-referred noise is proportional to 1/gm.

Does anyone here with more experience know what the rule of thumb is for reducing the noise of an opamp design?

Really appreciate it.

BTW, how do you simulate noise in Spectre or HSPICE, and which is the important quantity: the output noise or the input-referred noise?

Attached is a picture.
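
To make the trade-off above concrete, here is a rough Python sketch of the input-referred noise PSD of a 5-transistor OTA (input pair plus mirror load), using the standard 4kT*gamma/gm thermal and KF/(Cox*W*L*f) flicker expressions. All numbers (gamma, KF, Cox, the gm values and device sizes) are placeholders chosen only to show the trends, not taken from any real process or from the attached schematic.

Code:
import numpy as np

# Input-referred noise PSD of a 5-transistor OTA (input pair + mirror load).
# All parameters are illustrative placeholders, not real process data.
k_B = 1.380649e-23   # Boltzmann constant [J/K]
T = 300.0            # temperature [K]
gamma = 2.0 / 3.0    # long-channel thermal-noise coefficient (assumed)
KF = 1e-25           # flicker-noise coefficient [V^2*F] (assumed)
Cox = 5e-3           # oxide capacitance per unit area [F/m^2] (assumed)

def input_referred_psd(f, gm_in, gm_load, W, L):
    """Input-referred noise PSD [V^2/Hz]; flicker of the input pair only."""
    # Thermal: 2*(4kT*gamma/gm_in) for the input pair, plus the load pair
    # referred to the input by (gm_load/gm_in)^2.
    thermal = 2 * 4 * k_B * T * gamma / gm_in * (1 + gm_load / gm_in)
    # Flicker of the two input devices, already referred to the input.
    flicker = 2 * KF / (Cox * W * L * f)
    return thermal + flicker

f = np.logspace(0, 8, 9)   # 1 Hz ... 100 MHz
base      = input_referred_psd(f, gm_in=1e-3, gm_load=0.5e-3, W=20e-6, L=1e-6)
more_gm   = input_referred_psd(f, gm_in=2e-3, gm_load=0.5e-3, W=20e-6, L=1e-6)
more_area = input_referred_psd(f, gm_in=1e-3, gm_load=0.5e-3, W=80e-6, L=4e-6)

# Doubling the input-pair gm lowers the input-referred thermal floor,
# while a 16x larger gate area lowers only the 1/f part at low frequency.
print(np.sqrt(base[-1]), np.sqrt(more_gm[-1]))   # high-frequency floor
print(np.sqrt(base[0]),  np.sqrt(more_area[0]))  # 1 Hz, flicker-dominated

It just shows numerically that raising the input-pair gm lowers the input-referred thermal floor, while only a larger gate area lowers the 1/f part, which is exactly the tension described above.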
 

Re: opamp noise question

Do you need to minimize the total integrated noise - Vrms or Irms? If so, then depending on your BW, the contribution of the flicker noise may be quite insignificant compared to the contribution of the thermal noise in the total integrated noise. If this is your goal, concentrate mainly on minimizing the thermal noise.
 

opamp noise question

Thanks, Sutapanaki.

I thought the flicker noise is more significant because it is at lower frequencies compared to the thermal noise (of course the thermal noise has a low-frequency part too). If you integrate from low frequency to high frequency, the low-frequency part is more significant, isn't it?

I would also appreciate it if you could elaborate on the difference between the integrated noise Vrms and Irms, and why they are different in terms of design.
 

Re: opamp noise question

I mentioned in my previous post that what I said depends on the BW of the circuit. Of course, if your circuit only works at low frequencies and you cut out the high frequencies, then flicker noise is dominant. If your circuit is still processing low-frequency signals but is a sampling kind of architecture, then thermal noise is more significant (the sampling folds the wideband thermal noise down into the signal band). And finally, if your circuit works up to high frequencies, then thermal noise is far more important than flicker noise. To get the total noise power, we usually integrate the noise PSD over the band of interest. The square root of this total noise power is what I designated as Vrms or Irms - the noise voltage or the noise current, depending on which one is of more interest to you.
It is misleading to think that if you integrate over frequency, the flicker noise will be the more significant component. The flicker noise PSD is something like k/f. Integrating between f1 and f2 results in a total noise power of k*ln(f2/f1), so it doesn't matter whether you integrate between 1 Hz and 1 kHz or between 1 MHz and 1 GHz - you get the same amount of total flicker noise power. The integrated thermal noise is something like S_th*(f2 - f1), where S_th is the flat thermal PSD, and it makes a huge difference whether you integrate from 1 MHz to 1 GHz or from 1 Hz to 1 kHz. The higher the BW, the higher the significance of the thermal noise compared to the flicker noise.
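
As a quick numeric check of the ln(f2/f1) versus (f2 - f1) argument, here is a short Python sketch; the two PSD levels are arbitrary placeholders, and only the comparison between the two bands matters.

Code:
import numpy as np

# Placeholder PSD levels; only the band-to-band comparison is meaningful.
k_flicker = 1e-12    # flicker PSD = k_flicker / f  [V^2]
S_thermal = 1e-17    # flat thermal PSD             [V^2/Hz]

def integrated_noise(f1, f2):
    """Integrated noise power [V^2] of each component from f1 to f2."""
    flicker = k_flicker * np.log(f2 / f1)   # k*ln(f2/f1)
    thermal = S_thermal * (f2 - f1)         # S_th*(f2 - f1)
    return flicker, thermal

for f1, f2 in [(1.0, 1e3), (1e6, 1e9)]:
    fl, th = integrated_noise(f1, f2)
    print(f"{f1:.0e} Hz to {f2:.0e} Hz: "
          f"flicker = {np.sqrt(fl)*1e6:.2f} uVrms, "
          f"thermal = {np.sqrt(th)*1e6:.2f} uVrms")

# The flicker term depends only on the ratio f2/f1, so it is the same
# ~2.6 uVrms in both bands; the thermal term grows with (f2 - f1), from
# ~0.1 uVrms over 1 Hz-1 kHz to ~100 uVrms over 1 MHz-1 GHz.

In the narrow 1 Hz to 1 kHz band the 1/f term dominates, while over 1 MHz to 1 GHz the white term is roughly 40 times larger in rms terms, which is the point made above.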
 
