I was wondering why a diode-connected MOSFET should have an output resistance of 1/gm. I understand that a voltage-controlled current source, i.e. Gm*Vgs, would present a resistance of 1/Gm if the same voltage is applied across it, but I am not able to figure out what makes this happen in a MOS taken as a device, rather than as a model.
Any help would be appreciated, as this one has been bugging me a lot.
I got the circuit part of it. But could you explain it in physical terms, i.e. in terms of the working of the MOS? I guess this is too much to ask, but I'm just curious.
Is it that connecting the drain to the gate, which are 180 degrees out of phase for small signals, results in lowering the output resistance from ro to 1/gm? I mean, I don't understand why the MOSFET, while in saturation (even though it is diode-connected), should drop its resistance from ro to 1/gm.
A MOS transistor has two controls for its current: one is Vgs and the other is Vds.
The Vgs control is the transconductance gm, and the Vds control gives the output impedance ro.
For a MOS, generally gm*ro >> 1, so ro >> 1/gm...
When you make the diode-connected configuration, the 1/gm seen through the gm*vgs source comes in parallel with ro, and the effective output resistance becomes approximately 1/gm.
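The parallel combination above can be checked numerically. A minimal sketch, with assumed example values for gm and ro (not from this thread): with gate tied to drain, a test voltage vx at the drain draws both gm*vx (since vgs = vx) and vx/ro, so the exact small-signal resistance is ro/(1 + gm*ro), i.e. (1/gm) in parallel with ro.

```python
# Diode-connected MOSFET small-signal resistance sketch.
# Assumed example values: gm = 1 mS, ro = 100 kOhm (so gm*ro = 100 >> 1).
gm = 1e-3   # transconductance, siemens
ro = 100e3  # output resistance, ohms

# Test source vx at the drain (= gate): ix = gm*vx + vx/ro,
# so rx = vx/ix = ro / (1 + gm*ro) = (1/gm) || ro.
r_exact = ro / (1 + gm * ro)
r_approx = 1 / gm  # the usual approximation when gm*ro >> 1

print(f"exact  : {r_exact:.1f} ohm")
print(f"approx : {r_approx:.1f} ohm")
```

With these numbers the exact value is about 990 ohms versus the 1000 ohm approximation, about 1% apart, which is why 1/gm is the standard shorthand.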