
How to define input slew rate range when doing standard cell characterization?

Status
Not open for further replies.

Brucecwang

Newbie level 3
Joined May 21, 2014
Hi,

When using SiliconSmart or ELC to characterize standard cells and generate a .lib, how does one define a realistic input slew (transition time) range for the delay/power lookup table indices?

In practice, does one run SPICE simulations to measure the output slews of a standard X4 inverter/buffer under FO1 to FO10 loading? Or is there some other standard method adopted by standard cell vendors?
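Once the fastest and slowest realistic edges are known (e.g. from the FO1 and FO10 SPICE measurements described above), a common convention is to space the table index points geometrically between them, since delay varies roughly logarithmically with slew. This is a minimal illustrative sketch, not taken from any vendor flow; the function name and point count are my own:

```python
def slew_index_points(t_min_ns: float, t_max_ns: float, n: int) -> list[float]:
    """Geometrically spaced input-slew index points for a .lib lookup table.

    t_min_ns: fastest realistic input transition (e.g. an FO1 edge)
    t_max_ns: slowest transition the library must still cover
    n:        number of index points per axis (typically 5-8)
    """
    # Constant ratio between consecutive points: (t_max/t_min)^(1/(n-1))
    ratio = (t_max_ns / t_min_ns) ** (1.0 / (n - 1))
    return [round(t_min_ns * ratio ** i, 4) for i in range(n)]

# Example using the 0.05 ns .. 4.5 ns range seen in commercial .lib files:
print(slew_index_points(0.05, 4.5, 7))
```

With geometric spacing, the fine resolution lands at the fast end where delay sensitivity to slew is highest, while still covering very slow edges with few points.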

Thanks.
Bruce
 

For a 0.18 um commercial standard cell library, SPICE simulation of a minimum-size buffer driving ten X4 buffers (the .lib defines the max fanout as 10) gives an output slew of less than 1 ns. Yet in the .lib, the input slew index range of the timing and power tables for all cells runs from 0.05 ns up to 4.5 ns.

Does anybody have an idea why the characterized range is so much wider than the measured one? Thanks in advance.
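One plausible explanation (my assumption, not confirmed in this thread): vendors characterize well past the slowest realistic gate-driven edge so that STA tools can interpolate within the table, rather than extrapolate, when an input is degraded by long RC interconnect, a weak driver, or a worst-case corner. A hypothetical sketch of that guard-banding, with an illustrative 4x margin (the function names and the margin value are mine, not a vendor rule):

```python
def choose_table_range(fastest_slew_ns: float, slowest_realistic_slew_ns: float,
                       margin: float = 4.0) -> tuple[float, float]:
    """Pick .lib slew index bounds with a guard band beyond the slowest
    realistic edge, so degraded on-chip transitions still fall inside
    the characterized table."""
    return (fastest_slew_ns, slowest_realistic_slew_ns * margin)

def lookup_mode(slew_ns: float, bounds: tuple[float, float]) -> str:
    """Report whether an STA tool would interpolate within the table
    or have to extrapolate outside it for this input slew."""
    lo, hi = bounds
    if slew_ns < lo:
        return "extrapolate-low"
    if slew_ns > hi:
        return "extrapolate-high"
    return "interpolate"

bounds = choose_table_range(0.05, 1.0)  # ~1 ns slowest measured FO10 edge
print(bounds)                           # (0.05, 4.0) with the assumed 4x margin
print(lookup_mode(2.5, bounds))         # a degraded 2.5 ns edge still interpolates
```

Under this reading, a 0.05 ns to 4.5 ns index range over a sub-1 ns measured maximum is deliberate margin, not a characterization error.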
 

