
How Bias voltage changes from 5v to 1.8,1.5,1.2 etc

Status
Not open for further replies.

vikramc98406

Can anyone give me more details on why bias voltages keep being reduced?

Earlier there were 5 V MOS transistors; nowadays, as MOS devices shrink, the applied bias voltage is also coming down to 1.8 V, 1.35 V, and so on.

On what factors do these values depend?
 

Smaller technology nodes mean smaller transistor areas (and also lower Vth), so you need lower bias voltages to turn the transistors on.
That's my idea and it seems correct to me...
 

It could also be because lowering the voltage helps reduce the power dissipated. Since power reduction has gone hand in hand with transistor-size reduction, I guess it was natural.

-Aravind
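The power argument above follows from the standard dynamic switching-power relation P ≈ α·C·V²·f: voltage enters squared, so supply reduction is the biggest single lever. A minimal sketch (the capacitance, frequency, and activity-factor values below are illustrative assumptions, not data from any particular process):

```python
# Dynamic switching power of a CMOS node: P = alpha * C * V^2 * f.
# Because V enters squared, dropping from 5 V to 1.2 V cuts
# switching power by (5/1.2)^2 at the same C and f.

def dynamic_power(alpha, c_farads, vdd, freq_hz):
    """Switching power in watts for activity factor alpha."""
    return alpha * c_farads * vdd**2 * freq_hz

# Illustrative values: 10 fF switched load, 1 GHz, 10% activity
p_5v  = dynamic_power(0.1, 10e-15, 5.0, 1e9)   # 2.5e-5 W
p_1v2 = dynamic_power(0.1, 10e-15, 1.2, 1e9)   # 1.44e-6 W

print(p_5v / p_1v2)  # (5.0 / 1.2)^2, roughly 17.4x
```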
 

The progress of semiconductor technology in recent decades was mainly due to scaling - proportional reduction of transistor geometrical features: gate length, gate oxide thickness, junction depth, etc. As the dimensions scale down, we need to reduce the applied voltages to preserve device reliability (too high a gate voltage will cause oxide breakdown, too high a drain voltage will cause hot-carrier degradation, etc.). As the electric field is held approximately constant, the device performance (drive current per unit gate width) stays approximately constant, but the device density (number of devices per unit area) increases. The threshold voltage does not scale much, since the leakage should stay low (source-drain subthreshold leakage is proportional to exp(-q·Vth/(n·kT))).
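These constant-field (Dennard) scaling rules can be summarized numerically. A rough sketch for an ideal scaling factor k (real processes deviate from this, especially for Vth, as noted above; the starting device values are illustrative):

```python
# Ideal constant-field (Dennard) scaling by a factor k > 1:
# dimensions and voltages shrink by 1/k, so the vertical field
# Vdd / t_ox is unchanged, while device density grows as k^2.

def dennard_scale(gate_len, t_ox, vdd, k):
    """Return scaled (gate_len, t_ox, vdd, field, density_gain)."""
    scaled_len = gate_len / k
    scaled_tox = t_ox / k
    scaled_vdd = vdd / k
    field = scaled_vdd / scaled_tox  # equals the original vdd / t_ox
    return scaled_len, scaled_tox, scaled_vdd, field, k**2

# Example: scale a 0.5 um gate / 10 nm oxide / 5 V device by k = 2
L, tox, vdd, field, density = dennard_scale(0.5e-6, 10e-9, 5.0, 2.0)
print(vdd)      # 2.5 V
print(field)    # 5e8 V/m, same as the original 5.0 / 10e-9
print(density)  # 4x more devices per unit area
```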

These days, the thicknesses have become so small that further scaling is practically prohibited (i.e. there is very high gate leakage due to quantum-mechanical tunneling of carriers through a ~10 Å thick gate oxide). That's why semiconductor companies are looking at other alternatives to improve performance - such as metal gates (to replace polysilicon gates, which are prone to the depletion effect), high-K dielectrics (to increase the physical gate oxide thickness and suppress tunneling, but keep the electrical thickness low), strained Si (to improve carrier mobility), etc.
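The high-K trade-off mentioned above can be made concrete with the equivalent-oxide-thickness relation: a high-K film of physical thickness t has the same gate capacitance as SiO2 of thickness t·(ε_SiO2/ε_highK). A sketch using commonly quoted relative permittivities (3.9 for SiO2, roughly 25 for HfO2; these are textbook ballpark figures, not values for a specific process):

```python
# Equivalent oxide thickness: EOT = t_phys * (k_SiO2 / k_highK).
# A physically thicker high-K film (far less tunneling) can match
# the capacitance of a ~1 nm SiO2 layer.

K_SIO2 = 3.9   # relative permittivity of SiO2
K_HFO2 = 25.0  # approximate relative permittivity of HfO2

def eot_nm(t_phys_nm, k_highk):
    """SiO2-equivalent electrical thickness of a high-K film, in nm."""
    return t_phys_nm * (K_SIO2 / k_highk)

print(eot_nm(6.4, K_HFO2))  # ~1.0 nm electrical for 6.4 nm physical
```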

The scaling of supply voltage has practically stopped at ~1.0V (maybe 0.9 or 0.8V) - due to stability, noise immunity, and other effects.
 

Does that mean the applied bias voltage does not depend on the drain/source areas/sizes?
 

vikramc98406 said:
Does that mean the applied bias voltage does not depend on the drain/source areas/sizes?

Applied voltage is determined mainly by gate oxide thickness and gate/channel length.
 

timof,

Maybe you misunderstood me: I mean the bias voltage applied to the source of a MOS transistor.

For the gate, it is always the signal-level voltage that differentiates 0 from 1.
 
