
SDC set input delay has no effect

Status
Not open for further replies.

shaiko

Hello,

I have a parallel, source synchronous video bus connected to my Altera FPGA.
IN_CLOCK (100MHz) is the clock signal and IN_DATA ( 11 downto 0 ) is the data synchronous to that clock. The access time of the video source data can be as high as 1ns according to the video device's documentation.

So, this is how I wrote my SDC constraints:
Code:
create_clock -period 10 -waveform {0 5} -name IN_CLOCK [get_ports IN_CLOCK]

derive_pll_clocks ;# The above clock drives a PLL and this PLL's output actually clocks the design

set_input_delay -clock IN_PCLOCK -max 1.000 [get_ports {IN_DATA*}]

I started playing with the 1.000 value, incrementing it just to see when the tool fails to meet the constraint.
From 1.000 I got up to 9.500 and still saw no failure.
Why is that?
 


Because it is enough. I don't think you understand the meaning of the constraint you are trying to set.

It means the signal has 0.5 ns to reach the first flop. If you are not doing anything with it, just storing into flops, that could be enough time.
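To put numbers on that 0.5 ns (a back-of-the-envelope sketch using the values from this thread; the register setup and routing figures are assumptions, not datasheet values):

Code:
# capture budget = clock period - set_input_delay -max
#   original constraint: 10.0 ns - 1.0 ns = 9.0 ns for the data to reach the register
#   probed value:        10.0 ns - 9.5 ns = 0.5 ns
# An I/O-cell register may need only a few hundred picoseconds of setup plus
# routing, so even the 0.5 ns case can pass.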
 

I don't think you understand the meaning of the constraint you are trying to set.
I know what this constraint does. It informs the tool about the worst possible delay from clock edge to data availability, so that setup time can be met.
I just find it hard to believe that 0.5 ns would be enough.
 

It could be, if the signal is latched as soon as it enters the FPGA. A typical flop would have a setup time of tens of picoseconds; add some more because the FPGA has all that complex routing, I/O, and LUTs, and maybe it still fits within 500 ps.
 

It could be, if the signal is latched as soon as it enters the FPGA. A typical flop would have a setup time of tens of picoseconds; add some more because the FPGA has all that complex routing, I/O, and LUTs, and maybe it still fits within 500 ps.

Changed it to:
Code:
set_input_delay -clock IN_PCLOCK -max 9.999 [get_ports {IN_DATA*}]
No effect. Timing is met.
I also checked the "ignored constraints" report; it's not there.
 

You should also specify the minimum input delay.
 

Changed it to:
Code:
set_input_delay -clock IN_PCLOCK -max 9.999 [get_ports {IN_DATA*}]
No effect. Timing is met.
I also checked the "ignored constraints" report; it's not there.

Something else is off then, maybe the clock hierarchy/naming. Do a report_timing, make sure the constrained path is indeed part of the design, and check the names letter by letter.
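In the TimeQuest Tcl console, that check could look something like this (a sketch; verify the exact option spellings against your Quartus version's help):

Code:
# list every clock the timing netlist knows about, with its source and period
report_clocks

# show the worst setup paths starting at the data ports;
# if nothing is reported, the port or clock names in the SDC don't match the netlist
report_timing -setup -from [get_ports {IN_DATA*}] -npaths 10 -detail full_path

# list any paths that ended up unconstrained
report_ucp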
 

std_match,
You should also specify the minimum input delay.
Of course, but is it possible that not specifying it causes the "max" delay to be ignored?
 

The input registers are clocked by the PLL clock?
It should then be possible to handle any input delay by adjusting the phase of the PLL clock.
If you didn't specify the PLL clock phase, maybe the tools did it automatically to handle the 9.999 ns input delay.
If you also specify the minimum input delay so the range is 0.000 - 9.999 ns, it should be difficult or impossible to meet timing.
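A minimal pair along those lines (clock and port names copied from the earlier posts):

Code:
set_input_delay -clock IN_PCLOCK -max 9.999 [get_ports {IN_DATA*}]
set_input_delay -clock IN_PCLOCK -min 0.000 [get_ports {IN_DATA*}]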
 
The input registers are clocked by the PLL clock?
Indeed.

If you didn't specify the PLL clock phase
The only place where I specified the PLL's phase (which I set to 0) is in the IP catalog (while configuring the component for generation).
Is there another configuration I should adjust?

After your note, I set the minimum input delay to be as conservative as possible (0.000) while leaving the maximum at 9.999.
Timing is still met (!)
 

Update.
I removed the PLL and used the input clock to directly sample the data signals.
Timing violations on the nets in question appear.

It seems like the presence of the PLL somehow interferes with proper timing calculations.
Ideas anyone?
 

What is the name of the generated clock post-PLL? Maybe try to set a constraint with respect to that clock. It seems that somehow the clock info is not being propagated correctly.
 

What is the name of the generated clock post-PLL?
"pll_video_out_clock"
it seems that somehow the clock info is not being propagated correctly
Maybe, but the generated clock DOES appear (with all parameters correctly set) in TimeQuest's clock reports.

maybe try to set a constraint with respect to that clock.
OK. Do you know how I can refer to the PLL output port for reference in the SDC command?
I don't think I can use the "pll_video_out_clock" signal name...
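One possibility (a sketch, assuming derive_pll_clocks has already created the generated clock and that "pll_video_out_clock" is the exact name shown in the clock report):

Code:
# refer to the generated clock by the name TimeQuest reports, via get_clocks
set_input_delay -clock [get_clocks {pll_video_out_clock}] -max 1.000 [get_ports {IN_DATA*}]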
 
