
Gate level simulation not working with SDF


inputoutput

I have finished synthesizing my circuit in Design Compiler. Now I want to do a gate-level simulation in ModelSim in order to get the power consumption. The gate-level simulation works correctly, but generates "xxx" values when annotated with the SDF generated by Design Compiler. I've tried reducing the clock frequency but still get the same result. I should add that I specified input and output delays on the ports (using the set_input_delay and set_output_delay DC commands) during synthesis; could this be related in any way? Any thoughts?
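
For reference, the SDF is applied to the gate-level netlist either through the simulator's SDF options (e.g. vsim -sdfmax) or with an $sdf_annotate call in the testbench; a minimal sketch of the latter, with all names as placeholders rather than the actual design:

Code:
`timescale 1ns/1ps
// Minimal sketch of SDF annotation from the testbench (all names are placeholders).
module tb;
  reg clk = 1'b0;
  reg rst_n;

  // Gate-level netlist produced by Design Compiler (hypothetical module/port names).
  my_design dut (.clk(clk), .rst_n(rst_n) /* , ... */);

  initial begin
    // Annotate the DC-generated SDF onto the netlist instance before time advances.
    // "MAXIMUM" picks the max-delay triplet; "TYPICAL"/"MINIMUM" are also possible.
    $sdf_annotate("my_design.sdf", dut, , "sdf_annotate.log", "MAXIMUM");
  end

  always #5 clk = ~clk;  // arbitrary 10 ns clock for the sketch
endmodule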
 

Does your testbench correctly handle the input and output delays of the gate-level design that were specified in DC? The testbench has to have matching delays for stimulus (inputs) and capture (outputs).

Or perhaps you aren't supplying a clock or reset to the gate-level DUT from your testbench, or the reset is too short.
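
To make the "matching delays" idea concrete, here is a rough sketch of such a testbench, assuming a 10 ns clock and 2 ns input/output delay constraints; all names and numbers are invented for illustration, not taken from the actual design:

Code:
`timescale 1ns/1ps
// Sketch of stimulus/capture timing that mirrors the SDC constraints (assumed values).
module tb;
  localparam real CLK_PERIOD = 10.0;  // assumed clock period
  localparam real IN_DLY     = 2.0;   // should match set_input_delay
  localparam real OUT_DLY    = 2.0;   // should match set_output_delay

  reg        clk = 1'b0;
  reg  [7:0] din;
  wire [7:0] dout;

  always #(CLK_PERIOD/2.0) clk = ~clk;

  my_design dut (.clk(clk), .din(din), .dout(dout));  // hypothetical gate-level DUT

  // Drive inputs IN_DLY after the active edge, like the upstream device would.
  task drive(input [7:0] value);
    begin
      @(posedge clk);
      #IN_DLY din = value;
    end
  endtask

  // Sample outputs OUT_DLY before the next capturing edge.
  task sample(output [7:0] value);
    begin
      @(posedge clk);
      #(CLK_PERIOD - OUT_DLY) value = dout;
    end
  endtask
endmodule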
 

First do a gate-level simulation WITHOUT SDF. This will allow you to fix functional problems before you dig deeper into the timing problems.
 

@ThisIsNotSam As I mentioned in my post, the gate level simulation without SDF works correctly. The problem occurs when I apply the SDF.

@ads-ee I suspected that could be the problem, which is why I mentioned the input/output delays, but I haven't figured out how to fix it. I'm using the same testbench for the RTL and for the gate-level simulation. None of the material I've read so far makes any adjustments to the testbench for the gate-level simulation. Yes, I'm applying reset and clock to the DUT.
 


I see. In that case, what ads-ee hinted at is probably the cause. The testbench you are using does not match the environment you modeled with the SDC constraints. The first thing to check is the signal-to-clock timing: if data and clock switch at the same time, that can cause a lot of X's.
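
For example, inside the testbench module (the 2 ns below is just an assumed value that should match the set_input_delay used during synthesis):

Code:
// Two alternative ways to drive a DUT input from the testbench.

// BAD: data toggles in the same time step as the clock edge, so the annotated
// flops see setup/hold violations and their timing checks force the state to X.
always @(posedge clk) din = $random;

// BETTER: push the stimulus past the hold window, consistent with the input
// delay that was constrained in DC.
always @(posedge clk) #2 din = $random;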
 

Yeah, just like ThisIsNotSam says, you need to match the simulation environment to the SDC constraints you have specified. Namely, your testbench should model the behavior of the devices connected to your DUT. So inputs (to your design) that are synchronous with a clock will arrive after the clock edge by some delay determined by the device sending the data.

Back when I worked on ASICs, my testbench would have a `define that set whether the simulation was functional or timing. This would get set in the simulation scripts to enable or disable the delays in the testbench, depending on whether an RTL, gate, or back-annotated gate simulation was being run.
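
A rough sketch of that `define approach; the macro name and delay value are invented, and the macro would typically be passed from the script (e.g. vlog +define+GATE_SIM):

Code:
// Testbench fragment: zero-delay stimulus for RTL sims, realistic input delays
// for gate-level / back-annotated sims, selected by a compile-time macro.
`ifdef GATE_SIM
  localparam real IN_DLY = 2.0;  // model the external device's clock-to-out delay
`else
  localparam real IN_DLY = 0.0;  // zero delay is fine for RTL simulation
`endif

  always @(posedge clk)
    #IN_DLY din <= next_din;     // stimulus arrives IN_DLY after the clock edge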
 
1. Are there any constraints other than set_input_delay and set_output_delay in your SDC?

2. Is STA successful after synthesis?

3. Are there any reports of timing check violations in the simulator console? (See the sketch below for why such violations turn flop outputs into X.)
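
On point 3: the X's typically come from the timing checks inside the library's flip-flop models, which corrupt the flop state whenever a check fails and usually print a violation message in the console at the same time. A highly simplified sketch of such a cell model, not taken from any particular library:

Code:
// Simplified flip-flop model showing how a violated timing check turns Q into X.
module dff_cell (input D, CK, output reg Q);
  reg notifier;  // toggled by the simulator when a timing check below fails

  // Normal behaviour: capture D on the rising clock edge.
  always @(posedge CK) Q <= D;

  // Any toggle on 'notifier' corrupts the stored value, which then propagates as X.
  always @(notifier) Q <= 1'bx;

  specify
    (CK => Q) = (0.10, 0.12);                         // clk-to-Q delays, overridden by the SDF
    $setuphold(posedge CK, D, 0.05, 0.05, notifier);  // setup/hold check, flags 'notifier'
  endspecify
endmodule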
 
