Source degeneration for differential amplifier

Junus2012

Dear friends,

I need to ask why designers go for a source-degenerated differential amplifier when they could simply reduce the W/L of the input transistors; if we choose L greater than W, it gives us a wider and wider linear range.
Both techniques reduce the input transconductance by the same amount as the linearity we gain.
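(As a rough sketch of the trade-off I mean, assuming the standard long-channel square-law model in saturation, with $I_D$ the per-device bias current:

$$ g_m = \sqrt{2\,\mu_n C_{ox}\,\tfrac{W}{L}\,I_D}, \qquad V_{ov} = V_{GS} - V_{TH} = \sqrt{\frac{2\,I_D}{\mu_n C_{ox}\,(W/L)}} $$

and the plain diff pair fully steers its tail current at a differential input of about $\sqrt{2}\,V_{ov}$, so halving $W/L$ at fixed current widens $V_{ov}$ by $\sqrt{2}$ but also lowers $g_m$ by the same factor.)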

Thank you in advance
 

Increasing L doesn't help linearity; it just moves the operating point to a less steep part of the square-law characteristic. Degeneration, on the other hand, uses the high, non-linear gain of the diff pair to produce a very linear gain through negative feedback. Negative feedback is necessary to reduce distortion, and degeneration is a very simple way to implement it.
Also, source degeneration of the diff pair doesn't increase Vgs the way increasing the length does, so it cannot ruin the input common-mode voltage range (just another reason, with no connection to linearity).
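(A minimal sketch of the feedback argument, using the usual small-signal half-circuit with a degeneration resistor $R_S$ in each source, ignoring body effect and $r_o$:

$$ G_m = \frac{i_{out}}{v_{in}} = \frac{g_m}{1 + g_m R_S} \;\approx\; \frac{1}{R_S} \quad \text{for } g_m R_S \gg 1 $$

so the effective transconductance is set mainly by the linear resistor $R_S$, and the square-law variation of $g_m$ is suppressed by the loop gain $1 + g_m R_S$ instead of merely being biased at a shallower point of the curve.)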
 

Thank you, Frank, for your reply; it answered my question completely.
 
