mtwieg
Advanced Member level 6
Hi all, for a project I need to design some very low noise OTAs operating up to 10MHz in UMC 65nm, with a supply of around 2.5V. Using the 1.2V MOS devices is attractive, and the small-signal simulations look good, but I'm not really comfortable with it. I don't think I fully understand what the different voltage ratings mean, which operating conditions they refer to, or how much margin they allow. Things are simple when you're just building logic gates, since the source and bulk are always biased near a supply rail, but in an OTA this is not the case.
So what actually defines the safe operating limits of a 1.2V MOS? Vgs, Vgb, Vdg, Vdb, Vds, etc.? Should I be running some large-signal tests to make sure nothing terrible is happening? Maybe simulating the rise of the supply rails at power-on? How do I know what's tolerable and what isn't?
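For the large-signal check I have in mind, one approach would be to export the transient waveforms of each device's terminals and script a pass/fail check on every terminal-pair voltage. A minimal sketch of that idea, assuming the waveforms are already available as sampled arrays; the function names, the data layout, and the 1.32V limit (1.2V rating plus 10%) are my own assumptions, not anything from the PDK:

```python
# Hypothetical post-sim overstress check: for each pair of terminals
# (g, d, s, b), flag any sample where the voltage difference exceeds
# the assumed rated limit. Waveforms would come from the simulator's
# exported transient data (e.g. a supply ramp-up sim).

from itertools import combinations

def check_overstress(terminals, vmax=1.32):
    """terminals: dict of equal-length voltage arrays,
    e.g. {'g': [...], 'd': [...], 's': [...], 'b': [...]}.
    Returns a list of ((term_a, term_b), sample_index, voltage)
    for every sample violating |V| > vmax."""
    violations = []
    n = len(next(iter(terminals.values())))
    for a, b in combinations(terminals, 2):
        for i in range(n):
            v = terminals[a][i] - terminals[b][i]
            if abs(v) > vmax:
                violations.append(((a, b), i, v))
    return violations

# Example: drain ramps to 2.5V while gate, source and bulk sit at 0V,
# so Vgd, Vds and Vdb all end up overstressed at the final sample.
wave = {
    'g': [0.0, 0.0, 0.0],
    'd': [0.0, 1.2, 2.5],
    's': [0.0, 0.0, 0.0],
    'b': [0.0, 0.0, 0.0],
}
print(len(check_overstress(wave)))  # 3 terminal pairs violated
```

This only checks pairwise terminal differences, which is the part I'm unsure about in the first place, so it would catch gross problems during a supply ramp but says nothing about which pairs actually matter for long-term reliability.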