When using set_input_delay/set_output_delay, how do you determine the -max/-min values? Are they calculated by hand, calculated by a tool, or given by some standard specification?
Code:
# External-to-chip delays on the DDC SDA input, relative to the EXTSCL clock
set_input_delay -max 498 -clock EXTSCL [find port ddc_sda_i]
set_input_delay -min 0 -clock EXTSCL [find port ddc_sda_i]
# Delays required by the external receiver on the DDC SDA output, relative to CLK1MHZ
set_output_delay -max 498 -clock CLK1MHZ [find port ddc_sda_o]
set_output_delay -min 0 -clock CLK1MHZ [find port ddc_sda_o]
These values should come from the interface specification.
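For instance, here is a minimal sketch (all numbers and values hypothetical) of turning datasheet figures for the external driver into input constraints: its worst-case clock-to-output delay plus the worst-case board trace delay becomes -max, and the best-case sum becomes -min.
Code:
# Hypothetical datasheet/board numbers for the device driving ddc_sda_i
set tco_max 3.0   ;# external clock-to-output, worst case (ns)
set tco_min 0.5   ;# external clock-to-output, best case (ns)
set trace_max 1.2 ;# board trace delay, worst case (ns)
set trace_min 0.3 ;# board trace delay, best case (ns)
set_input_delay -max [expr {$tco_max + $trace_max}] -clock EXTSCL [find port ddc_sda_i]
set_input_delay -min [expr {$tco_min + $trace_min}] -clock EXTSCL [find port ddc_sda_i]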
In general, when you synthesize a module standalone, a common rule of thumb is to budget about 40% of the clock period as external delay. But when you constrain the same interface at the SoC level, you need to go through the datasheet of the external device and derive the actual values.
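As a sketch of that rule of thumb (hypothetical 10 ns clock and port names):
Code:
create_clock -name clk -period 10 [find port clk]
# Assume 40% of the period is consumed outside the chip,
# leaving 60% of the period for the internal paths.
set_input_delay  -max [expr {0.4 * 10}] -clock clk [find port data_in]
set_output_delay -max [expr {0.4 * 10}] -clock clk [find port data_out]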
It depends entirely on the environment your design is going to sit in. For example, if your input is driven by a block or chip that is fast enough to produce its output within 20% of the clock period, then your design's budget is the remaining 80% of the period.
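In constraint form, that budget might look like this (hypothetical 10 ns clock and port name): the 20% consumed by the external block is modeled as input delay, and the tool then checks your internal logic against the remaining 80%.
Code:
set period 10
set_input_delay -max [expr {0.2 * $period}] -clock clk [find port data_in]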
For a block-level or IP design these margins are specified by the architect, whereas at the chip level they are driven by the application requirements.
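For the output side, the receiver's datasheet drives the numbers the same way; a sketch with hypothetical values: the external setup time plus the worst-case trace delay becomes -max, and the best-case trace delay minus the external hold time becomes -min.
Code:
set ext_setup 2.0 ;# setup time of the receiving device (ns)
set ext_hold  0.5 ;# hold time of the receiving device (ns)
set trace_max 1.0 ;# board trace delay, worst case (ns)
set trace_min 0.2 ;# board trace delay, best case (ns)
set_output_delay -max [expr {$ext_setup + $trace_max}] -clock CLK1MHZ [find port ddc_sda_o]
set_output_delay -min [expr {$trace_min - $ext_hold}] -clock CLK1MHZ [find port ddc_sda_o]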