
What is the actual Vth displayed in Spectre with DC analysis?

Status
Not open for further replies.

jts


Hi everybody,

I run a DC analysis in Spectre and display the DC operating points of each transistor (Vth, Vgs, Vds, Ids, ...). The Vth looks weird: when I increase the device's length, its Vth decreases correspondingly (a lot). Why does Vth change with the length? Is it the actual threshold voltage of the device, or just the way Spectre defines it? Has anyone else seen this?

Thanks.
 


Hi,

It's always better to check how the Vth equation is implemented for your model level/version; they differ between BSIM, EKV, etc.

If you're using BSIM, you can find the related equations in the BSIM documentation.

Thanks,
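To illustrate the previous point, here is a minimal sketch of the *shape* of a BSIM-style Vth(L) expression: a short-channel roll-off term that lowers Vth at small L, plus a reverse-short-channel (halo) term that raises it. All coefficient names and values below are made-up placeholders for illustration, not parameters from any real model card; check your own model's documentation for the actual equations.

```python
import math

# Illustrative BSIM-style Vth(L) sketch -- every coefficient below is a
# hypothetical placeholder, NOT a value from any real PDK or model card.
VTH0 = 0.45      # long-channel threshold voltage (V)
DVT0 = 0.05      # short-channel roll-off magnitude (hypothetical)
DVT1 = 0.53      # roll-off length-scale factor (hypothetical)
LT = 40e-9       # characteristic length l_t (m, hypothetical)
K_RSC = 0.2      # reverse-short-channel (halo) strength (V, hypothetical)
L_RSC = 400e-9   # halo influence decay length (m, hypothetical)
VBI_PHIS = 0.6   # (Vbi - phi_s) prefactor (V, hypothetical)

def vth(L):
    """Threshold voltage vs drawn length -- BSIM-like in form only."""
    # classic short-channel roll-off: pushes Vth DOWN as L shrinks
    theta = math.exp(-DVT1 * L / (2 * LT)) + 2 * math.exp(-DVT1 * L / LT)
    roll_off = DVT0 * theta * VBI_PHIS
    # reverse short-channel effect: halo implants push Vth UP at short L
    rsce = K_RSC * math.exp(-L / L_RSC)
    return VTH0 - roll_off + rsce

for L in (60e-9, 600e-9, 6e-6):
    print(f"L = {L*1e9:7.0f} nm  ->  Vth ~ {vth(L)*1e3:6.1f} mV")
```

With the RSCE term dominating, this toy model reproduces the trend reported in this thread: the highest Vth at the minimum length, falling toward the long-channel value as L grows.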
 


Regardless of the model used, is it true that the threshold voltage decreases when the channel length increases? For example, when the length is 60 nm, Spectre displays a Vth of 580 mV; when the length is increased to 600 nm, Vth = 390 mV.
Can anyone explain this?

Thanks.
 


You can find the answer in Behzad Razavi's book ("Design of Analog CMOS Integrated Circuits", McGraw-Hill 2001).
Basically, the reason is a non-uniform doping profile that fights the DIBL (drain-induced barrier lowering) effect; it is needed for short-channel transistors.
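This halo/pocket-implant argument can be sketched numerically: the heavily doped pockets near source and drain occupy a larger *fraction* of a short channel, so the average channel doping, and with it the textbook long-channel Vth, rises as L shrinks. All process numbers below are hypothetical, chosen only to show the direction of the effect.

```python
import math

q = 1.602e-19             # elementary charge (C)
eps_si = 11.7 * 8.854e-12 # silicon permittivity (F/m)
# Hypothetical process numbers for illustration only:
Na = 3e23        # background channel doping (m^-3)
Na_halo = 1.2e24 # halo/pocket doping (m^-3)
L_halo = 15e-9   # extent of each halo pocket into the channel (m)
Cox = 0.012      # gate oxide capacitance per area (F/m^2)
Vfb = -0.9       # flat-band voltage (V)
phiF = 0.42      # Fermi potential (V)

def vth(L):
    """Textbook long-channel Vth with an L-dependent AVERAGE doping:
    the two halo pockets fill a larger fraction of a short channel."""
    frac = min(2 * L_halo / L, 1.0)            # halo fraction of channel
    Na_eff = frac * Na_halo + (1 - frac) * Na  # length-averaged doping
    Qdep = math.sqrt(2 * q * eps_si * Na_eff * 2 * phiF)
    return Vfb + 2 * phiF + Qdep / Cox

for L in (60e-9, 600e-9, 6e-6):
    print(f"L = {L*1e9:7.0f} nm  ->  Vth ~ {vth(L):.3f} V")
```

The absolute values are meaningless, but the trend matches the thread: Vth is highest at the shortest L and settles to the background-doping value for long channels.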
 


Interesting, edge_tv. Is this for the 65 nm node and below?

From what I understand, traditionally (e.g. at 90 nm) Vth increases when the channel length increases. The increase is of course dependent on Vds: the higher the Vds, the more Vth increase you get, up to a certain channel length where Vds has minimal impact.
 


When the feature size (minimum length) scales down, everything scales down, but not at the same ratio. That's why stacking becomes a problem at low internal VDD levels: Vth scales down, but not at the same rate as VDD. Anyway, DIBL causes the threshold voltage to decrease as Vd increases. As much as I hate Wikipedia, it has a decent explanation: https://en.wikipedia.org/wiki/DIBL
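A common first-order way to write the DIBL effect just described is Vth(Vds) = Vth0 - eta * Vds, where the DIBL coefficient eta grows rapidly as the channel shortens. A quick sketch, with all numbers illustrative rather than from any real PDK:

```python
import math

def eta(L, eta0=0.15, L0=60e-9):
    """Hypothetical DIBL coefficient (V/V): largest at the minimum
    length L0, decaying quickly for longer channels."""
    return eta0 * math.exp(-(L - L0) / (2 * L0))

def vth(L, Vds, Vth0=0.45):
    """First-order DIBL model: Vth falls linearly with Vds."""
    return Vth0 - eta(L) * Vds

for L in (60e-9, 120e-9, 600e-9):
    print(f"L = {L*1e9:5.0f} nm: Vth(Vds=0.05 V) = {vth(L, 0.05):.3f} V, "
          f"Vth(Vds=1.0 V) = {vth(L, 1.0):.3f} V")
```

Note this models the Vth-vs-Vds dependence; it does not by itself explain the Vth-vs-L trend the original poster measured at a fixed supply.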
 


DIBL explains the decrease of Vth when Vd increases, but my question is that the Vth displayed in Spectre decreases when the channel length increases. Let me show you what I've done: I test an inverter in the IBM 65 nm process; the widths of the PMOS and NMOS are 1.05 um and 350 nm, respectively, and VDD = 0.5 V. I do a DC analysis with Vin sweeping from 0 to 0.5 V to find the switching point, with the lengths of both devices set to 60 nm, 600 nm, 6 um, and 60 um. For each case, Spectre displays Vth as follows:
Length  | NMOS Vth (mV) | PMOS Vth (mV) | Switching point (mV)
60 nm   | 558           | -462          | 257
600 nm  | 391           | -313          | 266
6 um    | 343           | -260          | 274
60 um   | 336           | -249          | 275

The threshold voltages decrease as the channel length increases, and the change is huge at small lengths (from 60 nm to 600 nm).

How can we explain this?
 

 


OK, I just fired up the 65 nm process from STMicroelectronics and did a DC sweep.

Using the low-power transistors, I could not see a Vth decrease when increasing the length of the transistor (60 nm -> 600 nm).

Using the general-purpose transistors, a 2x increase in channel length (60 nm -> 120 nm) gave a substantial increase in Vth.

Must be something process dependent?
 

I have the same problem with the UMC 90 nm process.
When I plot Vth as a function of L, I see that Vth decreases as L increases.

e.g. nmos 12_LL: 100 nm -> 630 mV, 10 um -> 320 mV!!

Instead, when I plot sqrt(I) as a function of Vgs (saturation region) and extrapolate Vth from the slope, I don't see such a huge spread.

The variation of Vth with L is indeed technology dependent. As far as I know, if no doping gradient is used in the channel, Vth decreases as L decreases (as the width of the depletion regions around source and drain becomes more important for small channel lengths). To counteract this, a higher doping profile is used in the channel near the diffusion regions (drain and source).
Anyway, I don't think this doping would cause such a significant increase of the threshold voltage at small channel lengths!

The model I use is BSIM4.

Has anyone solved this problem?
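The sqrt(I)-vs-Vgs extrapolation mentioned above works because in strong-inversion saturation the square law gives Id ~ K*(Vgs - Vth)^2, so sqrt(Id) is linear in Vgs and its x-axis intercept estimates Vth. A sketch on synthetic square-law data (K and Vth below are hypothetical, and a real device would deviate from the ideal square law):

```python
# Vth extraction by linear extrapolation of sqrt(Id) vs Vgs.
# Synthetic ideal square-law data; K and Vth_true are hypothetical.
Vth_true = 0.40  # V (the value we hope to recover)
K = 2e-3         # A/V^2 (hypothetical transconductance factor)

vgs = [0.55 + 0.05 * i for i in range(8)]     # strong-inversion bias points
ids = [K * (v - Vth_true) ** 2 for v in vgs]  # ideal saturation current
y = [i ** 0.5 for i in ids]                   # sqrt(Id), linear in Vgs

# ordinary least-squares fit of the line y = a*vgs + b
n = len(vgs)
sx, sy = sum(vgs), sum(y)
sxx = sum(v * v for v in vgs)
sxy = sum(v * yi for v, yi in zip(vgs, y))
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

Vth_extracted = -b / a  # x-intercept of the fitted line
print(f"extracted Vth = {Vth_extracted:.3f} V")  # prints: extracted Vth = 0.400 V
```

Since this extraction uses measured (or simulated) currents rather than a model-internal definition, it is a useful cross-check against the Vth number the simulator reports in the operating-point printout.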
 
