Hello
I am simulating a CG low-noise amplifier, and I found that the flicker noise of the bias transistor dominates. How can it contribute more than the input transistor? The bias transistor is circled in the following picture. And how can I reduce its flicker noise?
Thanks
You should increase the area of the transistors (both the bias and the CG transistor), i.e. increase W and L while keeping W/L constant.
The bias transistor's flicker noise is amplified by the mirror ratio, which may be why the bias noise dominates. Therefore, make sure the reference current is not much smaller than the CG current. At the expense of higher power consumption (due to the increased reference bias current), your noise should improve.
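The two suggestions above can be sketched numerically. This is only an illustration of the standard SPICE-style input-referred flicker model S_v = KF/(Cox·W·L·f); the KF and Cox values are made-up placeholders, not from any real PDK:

```python
# Hedged sketch: flicker-noise scaling with gate area.
# KF and Cox are illustrative assumptions, not real process values.
KF = 1e-24      # flicker noise coefficient (illustrative)
Cox = 8e-3      # gate oxide capacitance per unit area, F/m^2 (illustrative)

def sv_flicker(W, L, f):
    """Input-referred flicker noise voltage PSD, V^2/Hz: KF / (Cox * W * L * f)."""
    return KF / (Cox * W * L * f)

f = 1e3  # evaluate at 1 kHz
small = sv_flicker(2e-6, 0.18e-6, f)
big   = sv_flicker(8e-6, 0.72e-6, f)   # 4x W and 4x L, same W/L
print(small / big)                      # -> 16.0: 16x area, 16x lower PSD

# In a current mirror, the reference device's gate-voltage noise also drives
# the output device, so the output noise current is gm_out * v_n; for ratioed
# widths gm_out scales with the mirror ratio, and the noise *power* with its
# square -- which is why a large mirror ratio makes the bias noise dominate.
```

The same W/L keeps the bias point unchanged while the larger W·L product suppresses the 1/f noise.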
If the gate bypass capacitor has no effect in the frequency range of interest, the bias transistor will contribute to the overall noise regardless. Increase it.
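To see whether the bypass capacitor actually helps in band, one can estimate the low-pass corner it forms with the impedance at the CG gate node. The resistance value below is a placeholder assumption, not taken from the circuit in question:

```python
import math

# Hedged sketch: corner frequency of the gate bypass filter.
# R_bias is the impedance seen looking into the bias network at the CG gate
# (assumed value). Below f_c the cap shunts the bias transistor's noise to
# ground; above f_c that noise reaches the CG gate largely unattenuated.
def corner_hz(R_bias, C_bypass):
    return 1.0 / (2.0 * math.pi * R_bias * C_bypass)

print(corner_hz(1e3, 100e-12))  # ~1.59e6 Hz with R = 1 kOhm, C = 100 pF
```

If the corner lands above the band where the flicker noise matters, a larger capacitor (or a larger series bias resistor) pushes it down.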
Thanks. I agree with you about the influence of transistor size on flicker noise, but I don't think there is a direct link between the consumed current and flicker noise. You could reduce flicker noise by using a bias transistor with a larger width and a smaller mirror ratio, but the decrease in flicker noise comes from the larger width rather than the larger current. The higher current does, however, reduce the thermal noise.
Flicker noise is directly related to the current density in the transistor.
My simulations show that I get minimum flicker noise when I use a bigger transistor supporting a higher Id/Ic, but operated at the lowest possible Id/Ic.
I found that this holds for both CMOS and BJTs.
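For the MOS case, the current-density argument can be illustrated with the classic SPICE drain-current flicker model, S_id = KF·Id^AF / (Cox·Leff²·f). The coefficient values are illustrative assumptions only:

```python
# Hedged sketch: SPICE-style drain-current flicker noise PSD (A^2/Hz),
#   S_id = KF * Id**AF / (Cox * Leff**2 * f)
# KF, AF, Cox are illustrative placeholders, not real model parameters.
KF, AF, Cox = 1e-25, 1.0, 8e-3

def s_id(Id, Leff, f):
    return KF * Id**AF / (Cox * Leff**2 * f)

f = 1e3  # 1 kHz
# Same current through a 4x-longer (bigger) device, i.e. lower current density:
print(s_id(1e-3, 0.18e-6, f) / s_id(1e-3, 0.72e-6, f))  # -> 16.0
```

In this model the noise grows with Id at fixed geometry, so carrying a given current in a larger device (lower density) is the lower-noise choice, consistent with the simulation result above.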
How do you define current density?
So you are talking about BJTs. I am not familiar with those, but in a MOSFET, Id equals Is, so I don't understand how to apply the relationship between flicker noise and Id/Ic to MOS devices.
I guess you are wondering how flicker noise changes with the drain-source voltage, but I don't think that is a problem.
In a bias circuit, the transistor is diode-connected, so there is enough Vds.
For the CG structure discussed in this thread, the input transistor is also set to operate in saturation. If its Vds changes, its current and transconductance change too, and so does the noise transfer function. Therefore, analyzing the influence of Vds on flicker noise alone is not easy.