Welcome to EDAboard.com

SDC in RTL Synthesis

Status
Not open for further replies.

sun_ray

How are the input delay, output delay, and clock uncertainty values determined for RTL synthesis? On what basis are these values calculated?
 

Take any synthesis reference manual (Synopsys, Mentor Graphics, Cadence, Xilinx/Altera) and read the initial chapters. All of your answers will be there.
 


No, it is not stated there. Could people who work on this please reply to post no. 1?
 

So your question is about how, and on what basis, input delay, output delay, and clock uncertainty are determined. I will try to answer!

In short, these values depend on the environment in which the chip being designed will be used (board design, interfacing with other chips, etc.).
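As an illustrative sketch of that environment dependence, clock uncertainty is typically budgeted from the clock source's jitter specification plus a margin for skew; in SDC it might look like the following (the clock name, period, and numbers are assumptions for illustration, not from any specific data sheet):

```tcl
# Hypothetical 100 MHz clock from an on-board oscillator.
create_clock -name clk -period 10.0 [get_ports clk]

# Uncertainty budget = oscillator jitter + expected skew + margin,
# e.g. 0.15 ns jitter + 0.10 ns skew + 0.05 ns margin = 0.3 ns.
set_clock_uncertainty -setup 0.3 [get_clocks clk]
set_clock_uncertainty -hold  0.1 [get_clocks clk]
```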

Assume that the chip being designed takes its inputs from an ADC, and that its on-board clock source is a crystal or RC oscillator. If the ADC output driving your chip's inputs has a 1 ns delay on the data (meaning the clock rises, then 1 ns later the data changes), then the data input delay is 1 ns. You then need to look in the data sheet of that ADC, where the min/max values for the output data lines will be specified. Of course, your data (the output of the ADC) is going to be invalid around that 1 ns point as it switches, so your design needs to know how long the input takes to transition (from 0 to 1 or 1 to 0) at the input.

These min/max input delay values tell your design that the data at your input pins may change at any time from that min value through to the max value (since it might take anywhere between those two times to be stable). So you might, for example, set the min value to 0ns and the max to 2ns, and that tells your design that the data is changing during the period from 0ns to 2ns after the clock rose - and hence your design should automatically include clock or data delays to ensure that the data is not sampled during that time.
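The 0 ns / 2 ns scenario above can be written directly as SDC input-delay constraints (the port and clock names here are made up for illustration):

```tcl
# Data from the external ADC may change anywhere from 0 ns to 2 ns
# after the rising edge of clk, per the (hypothetical) ADC data sheet.
set_input_delay -clock clk -min 0.0 [get_ports adc_data*]
set_input_delay -clock clk -max 2.0 [get_ports adc_data*]
```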

My argument for determining the output delay would be similar: the question to consider is whether your output ports connect to another chip placed nearby on the board, or whether there is a long metal trace from your output to some corner of the board.
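Likewise, output delay is usually budgeted from the receiving chip's setup/hold requirements plus the board trace delay. A hedged sketch (the names and numbers are assumptions, not taken from a specific data sheet):

```tcl
# -max: receiver setup time (1.0 ns) + board trace delay (0.5 ns).
# -min: trace delay minus the receiver's hold requirement.
set_output_delay -clock clk -max  1.5 [get_ports dout*]
set_output_delay -clock clk -min -0.2 [get_ports dout*]
```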
 


Does this mean that when the RTL of a block is synthesized, the input and output delays for all its inputs and outputs are provided in this way? This may not be the standard procedure.
 

Yes, this is called design constraining, and it should be done for all input/output/inout ports of a design.

This may not be the standard procedure.
Then what is it, may I know?
 

Then what is it, may I know?

I have heard that there is a standard procedure. I too want to know what it is.
 

