Static Timing Analysis

Status
Not open for further replies.

preethi19

Full Member level 5
Timing Analysis

Hi, I am learning timing analysis in digital design. When we normally simulate a design we generally check functionality with the clock, inputs, and outputs modeled with no delays. But in real life that's not the case, and the logic elements always cause delay. I understand the concepts of setup, hold, WNS violations, CTS, and so on. The input and output delays I give are up to me, so I can set an input delay within a certain range, and when it goes beyond that range I get a violation, which I should not have. My questions are:

1) What am I going to learn from giving this delay? Say an input delay of 5 ns is tolerable, and when it exceeds that I get a WNS violation. What does this 5 ns represent? Does it represent the maximum delay my design can tolerate, beyond which the design's performance is no good? Is this what checking the design under worst-case conditions means?

2) Secondly, how would I know the delays caused by the logic elements? Say I give a manual input delay of 5 ns but the actual delay in the design is 6 ns. How can I set the input delay for a design? Shouldn't the tool analyze the design and find on its own how much delay there is on the input, rather than my having to enter a delay value?

Can anyone kindly help me with this? Thanks in advance!

Super Moderator
Staff member

Input delays represent the part of the clock period used up by some external device's clock-to-output delay (Tco) plus the routing delay on the board to the input of your ASIC. You need this to make sure your ASIC will still work with both the minimum and maximum delay of that external device.
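As a sketch, that Tco-plus-board-delay budget turns into a min/max pair of input-delay constraints. All numbers and port names below are assumed for illustration, not taken from any real datasheet:

```tcl
# Assumed: external device Tco is 1.0 ns (min) to 3.0 ns (max),
# and the board trace to our input pin adds 0.4-0.6 ns.
create_clock -name clk -period 10.0 [get_ports clk]

# Worst case (checked against setup): max Tco + max board delay
set_input_delay -clock clk -max 3.6 [get_ports data_in]

# Best case (checked against hold): min Tco + min board delay
set_input_delay -clock clk -min 1.4 [get_ports data_in]
```

The tool then verifies the internal path from data_in to the capturing flip-flop against whatever is left of the 10 ns period after those external delays.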

Output delays represent the part of the clock period that the ASIC can't use: the time required to meet the setup time of the external device plus the flight time of the routing to that device.
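The output side mirrors the input side. A sketch with assumed numbers (the external setup/hold figures and flight times are made up for illustration):

```tcl
# Assumed: the external receiver needs Tsu = 2.0 ns and Th = 0.5 ns,
# and the board trace to it takes 0.4-0.6 ns.

# Worst case (setup at the receiver): Tsu + max flight time
set_output_delay -clock clk -max 2.6 [get_ports data_out]

# Best case (hold at the receiver): -Th + min flight time
set_output_delay -clock clk -min -0.1 [get_ports data_out]
```

Note the -min value can legitimately be negative: it says the output may change no earlier than 0.1 ns before the clock edge without violating the receiver's hold time.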

secondly the delays caused by the logical elements how would i know that. Meaning say i give a manual input delay of 5ns but the delay caused by the design is 6ns.
This is really hard to understand, what are you asking?
If you gave a constraint of set_input_delay -clock clk 5 [get_ports my_input_pin] and you have a delay of 6 ns (including the FF setup time) from the pin my_input_pin to the input FF, then you can't run the design any faster than 1/(5 ns + 6 ns) ≈ 90.9 MHz.

- - - Updated - - -

It just dawned on me that maybe you are trying to determine what would be a good value for an ASIC set_input_delay constraint when you don't know what is driving the input? In that case I would either a) make the input as fast as possible, constraining it to nearly the limit of the technology so the input setup is as small as possible, or b) use half the period for the set_input_delay value, so that any external device gets half the clock period and the ASIC gets the rest.
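Option b) would look something like this for an assumed 10 ns clock (the clock and port names are illustrative only):

```tcl
# Half of the 10 ns period is budgeted to the unknown external driver,
# leaving the other 5 ns for the ASIC's input path plus FF setup.
create_clock -name clk -period 10.0 [get_ports clk]
set_input_delay -clock clk -max 5.0 [get_ports data_in]
```

This 50/50 split is just a starting budget; once the real driving device is known, the constraint should be tightened to its actual Tco plus board delay.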


ThisIsNotSam

It just dawned on me that maybe you are trying to determine what would be a good value for an ASIC set_input_delay constraint when you don't know what is driving the input? In that case I would either a) make the input as fast as possible, constraining it to nearly the limit of the technology so the input setup is as small as possible, or b) use half the period for the set_input_delay value, so that any external device gets half the clock period and the ASIC gets the rest.

Actually, it is not one or the other. You need both a best case and a worst case for input_delay, for hold and setup checks respectively.

Say my 'fast' clock is 1 GHz, period = 1 ns. I could set my input delays as:
set_input_delay -clock [get_clocks fast] -add_delay 0.500 [get_ports rst_n] -max
set_input_delay -clock [get_clocks fast] -add_delay 0.020 [get_ports rst_n] -min
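Put together with the clock definition it depends on (the clock-port name here is an assumption), the example reads:

```tcl
create_clock -name fast -period 1.0 [get_ports clk]  ;# 1 GHz

# Worst case: the outside world may eat up to 0.5 ns -> setup check
set_input_delay -clock [get_clocks fast] -add_delay 0.500 [get_ports rst_n] -max

# Best case: data may arrive as little as 0.02 ns after the edge -> hold check
set_input_delay -clock [get_clocks fast] -add_delay 0.020 [get_ports rst_n] -min
```

With the -max constraint, only 0.5 ns of the 1 ns period remains for the internal input path plus FF setup; with the -min constraint, the tool checks that an input arriving almost immediately after the edge does not violate hold at the capturing FF.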
