csarami
Newbie level 4
Hi,
On page https://billauer.co.il/blog/2017/04/io-timing-constraints-meaning/ we read that
Code:
In short,
set_input_delay -clock … -max … : The maximal clock-to-output of the driving chip + board propagation delay
set_input_delay -clock … -min … : The minimal clock-to-output of the driving chip. If not given, choose zero (maybe a future revision of the driving chip will be manufactured with a really fast process)
set_output_delay -clock … -max … : The t_setup time of the receiving chip + board propagation delay
set_output_delay -clock … -min … : Minus the t_hold time of the receiving chip (e.g. set to -1 if the hold time is 1 ns).
Note that if neither -min or -max are given, it’s like two assignments, one with -min and one with -max. In other words: Poor constraining.
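For concreteness, the quoted rules could be applied with hypothetical datasheet numbers. All values below are made-up assumptions for illustration, not from any real part:

```tcl
# Assumed example values (hypothetical, not from a real datasheet):
#   Driving chip clock-to-output: 2.0 ns min, 5.5 ns max
#   Board trace delay, driver -> our chip: 0.5 ns
#   Receiving chip: t_setup = 1.5 ns, t_hold = 1.0 ns
#   Board trace delay, our chip -> receiver: 0.5 ns

# Input side: driver's max Tco plus board delay for -max,
# driver's min Tco alone (conservative) for -min
set_input_delay -clock sys_clk -max 6.0 [get_ports data_in]    ;# 5.5 + 0.5
set_input_delay -clock sys_clk -min 2.0 [get_ports data_in]    ;# min Tco

# Output side: receiver's t_setup plus board delay for -max,
# minus receiver's t_hold for -min
set_output_delay -clock sys_clk -max 2.0 [get_ports data_out]  ;# 1.5 + 0.5
set_output_delay -clock sys_clk -min -1.0 [get_ports data_out] ;# -t_hold
```

Here sys_clk, data_in, and data_out are placeholder names; substitute your own clock and port names.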
The point being emphasized is that we must always constrain both min and max.
Assuming the scenario above (picture attached), what is the formula for setting the min and max constraints for my chip in the middle, in terms of the timing requirements of the receiving chip and the driving chip?
How do we calculate the propagation delay of our chip?
I would appreciate it if you could clarify this.
CS