
How can I implement the voltage biasing in CMOS circuit?

Status
Not open for further replies.

katrin

In bipolar circuits, people can shift the voltage level down with diodes, whose forward voltage drop is nearly constant.

But in a CMOS circuit, I want to obtain a certain voltage drop from the supply voltage to bias the other transistors. A resistor is usually too big to be implemented on-chip. So what can I do in this case? How do people implement this?
 

Use a diode-connected MOS biased by a current source to create a bias voltage. These voltages vary considerably with process and temperature, though, so in general they are only used to bias cascode transistors.
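As a rough illustration of the diode-connected approach, the long-channel square-law model gives the bias voltage such a device settles at for a given current. The device parameters below (k_n, W/L, V_TH) are made-up example values, not figures from this thread:

```python
import math

def diode_connected_vgs(i_d, k_n, w_over_l, v_th):
    """Gate-source voltage of a diode-connected NMOS in saturation,
    from the long-channel square-law model:
        I_D = 0.5 * k_n * (W/L) * (V_GS - V_TH)^2
    Ignores channel-length modulation and body effect.
    """
    v_ov = math.sqrt(2.0 * i_d / (k_n * w_over_l))  # overdrive voltage
    return v_th + v_ov

# Example: 10 uA through a device with k_n = 200 uA/V^2, W/L = 10, V_TH = 0.5 V
vgs = diode_connected_vgs(10e-6, 200e-6, 10, 0.5)
print(f"{vgs:.3f} V")  # prints "0.600 V"
```

Since V_TH shifts with process corner and temperature, this bias point moves with it, which is why the reply above restricts its use to cascode gates.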
 

You can use a current mirror to do the job, or a voltage reference circuit such as a bandgap reference. But it all depends on what type of circuit you want to bias.
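In the ideal case, a simple MOS current mirror copies the reference current scaled by the W/L ratio of the output device relative to the diode-connected reference device. A minimal sketch of that relation (function name and values are illustrative, not from the thread):

```python
def mirrored_current(i_ref, wl_ref, wl_out):
    """Ideal output current of a simple MOS current mirror:
    I_out = I_ref * (W/L)_out / (W/L)_ref.
    Ignores channel-length modulation, mismatch, and output-voltage headroom.
    """
    return i_ref * (wl_out / wl_ref)

# Example: 20 uA reference, output device half the width of the reference device
i_out = mirrored_current(20e-6, wl_ref=10, wl_out=5)
print(f"{i_out * 1e6:.1f} uA")  # prints "10.0 uA"
```

In practice the mirrored current also depends on the drain voltage of the output device (channel-length modulation), which is one reason cascode mirrors are common.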
 

If one resistor is already too large to be implemented, a voltage reference circuit is also not a good fit for you. Normally gingerjxb's method is right, if you just need a bias voltage without current drive capability.
 

