
Post synthesis timing issues

Status
Not open for further replies.

user_asic

Hi,

I synthesized using DC (TSMC 0.18 technology). When I try to simulate the synthesized netlist with my testbench, none of the outputs are correct and the output timing is way off; none of the results from the synthesized netlist match the behavioral model. In the behavioral simulation the monitored output changes in steps of 10 time units (0, 10, 20, 30, 40) — that is, a signal generated by the testbench changes every 10 time units. With the synthesized netlist the timing is 0, 10, 15, 16, 17, 19, 20, 25, 27, 30, 34, 35, 36, 37, 39, 40, ... Not only does the timing not match, the expected results are also incorrect.

The behavioral simulation finishes ($finish) at simulation time 410,
and the post-synthesis simulation finishes at simulation time 41000.

Is there something I forgot in the post synthesis simulation?
 

Default time scale being set via your standard cell libraries?
 

I really don't know; I'm just learning about all this. However, I have not used the `timescale directive in my code.
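For what it's worth, the 410 vs. 41000 finish times (a factor of 100) are consistent with the testbench being resolved at a different timescale than the cell library. A minimal sketch of an explicit directive — the exact unit/precision values here are an assumption, not taken from the thread:

```verilog
// Sketch: if the TSMC cell models declare something like `timescale 1ns/10ps
// while the testbench has no `timescale at all, the simulator may resolve the
// testbench at a different unit, scaling $time values by the ratio.
// An explicit directive in the testbench keeps the two aligned:
`timescale 1ns/1ps   // time unit / precision (assumed values for illustration)

module tb;
  reg clk = 0;
  always #5 clk = ~clk;   // 10 ns period, so events land on 0, 10, 20, ...
endmodule
```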
 

ljxpjpjljx said:
you do simulation using this lib?

When I compile the synthesized netlist, I include the technology libraries, which contain all the needed modules.
 

Best way to debug this is through the waveform.
If you have a tool like verdi it could come in handy. If not some more manual work is needed.

This I believe is not something that can be resolved through a forum.

Cheers,
eChipDesign
 

eChipDesign said:
Best way to debug this is through the waveform.
If you have a tool like verdi it could come in handy. If not some more manual work is needed.

This I believe is not something that can be resolved through a forum.

Cheers,
eChipDesign

This is the strange thing: I'm using Synopsys VCS for simulations. In DVE, when I try to view the waveform from the dump file, it is empty! All the signals show up as solid red in the waveform viewer.
 

Are you getting a lot of timing-check violations in your simulation? You need to supply us with more info. Your VCS logs (compile-time and run-time) would help a lot.

Are you back-annotating your SDF file? If so, you need to find all the multi-flop synchronizers in your netlist and disable timing checks on them; otherwise they will cause X's in your simulation. There are a couple of ways to do this: you can use VCS's feature that disables timing checks on a per-instance basis, or edit the SDF file and zero out the values (setup/hold/...) on those instances.

I highly recommend the following progression:

1. First run your netlist simulation without back-annotation, with all timing checks disabled and all gate delays disabled. In effect you're only doing a functional simulation on the netlist, very similar to your RTL simulation, so the results should match what you get from RTL.

2. If that works, enable gate delays but keep timing checks disabled. This most likely means you need to change your timescale, because the gate delays in your library cells have ps resolution. I typically use a 1ns/1ps timescale for netlist simulation, and that should be good for yours as well.

3. If the netlist simulates correctly with gate delays and without timing checks, enable the timing checks. At this step the simulator just uses the default timing values (setup/hold/etc.) in your library cells, which are not as accurate as the ones in your SDF file (which comes after PAR), but that's fine here. Any time you enable timing checks, you will get a lot of violations unless you disable the checks on your sync flops; you will know where they are when you run the simulation and see a bunch of timing violations in your VCS log file. You need to disable them manually.

4. The last step is to back-annotate your SDF file and re-run your simulation.

The following compile-time options do the following:
+nospecify : Disables the specify blocks in the gate models, which means all timing checks are disabled, as well as the gate delays.
+notimingcheck : Disables the timing checks only; you still have gate delays.
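As a rough sketch of how the steps above map onto the VCS command line — the file names (`tb.v`, `netlist.v`, `tsmc18_cells.v`) are placeholders, not taken from the thread:

```shell
# Step 1: pure functional gate sim -- no gate delays, no timing checks
vcs tb.v netlist.v tsmc18_cells.v +nospecify -l step1_compile.log

# Step 2: gate delays on, timing checks still off
vcs tb.v netlist.v tsmc18_cells.v +notimingcheck -l step2_compile.log

# Step 3: default library timing checks enabled (expect violations on sync flops)
vcs tb.v netlist.v tsmc18_cells.v -l step3_compile.log

# Run the compiled simulator and capture the run-time log
./simv -l run.log
```

The `-l <filename>` option is the same one mentioned later in this thread for saving the VCS output to a log file.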

I can take a look at your VCS log file if you like.

- Hung
 

Hi skyfaye,

I got lost for a second. An SDF file? I do not have that. I am simulating using the synthesized netlist and testbench (along with the TSMC Verilog lib files).

I will post my log file when I get access to the machine. Also, are there any special commands I need to pass to VCS to generate compile-time and run-time logs, or are they generated by default?

Thank you.
 

Hung wrote everything you need to do. SDF includes the timing of all timing arcs. If you'd like to run gate level simulations, you should make sure your timing (STA) is clean. If you don't have SDF or the timing is not clean, then you shouldn't use any delays.
 

I should have read your title more carefully. If you're just running gate simulation with the netlist after synthesis, then you don't have an SDF file. By default VCS does not save the log file. You can add "-l <filename>" at the end of your VCS command to save the output to a log file.

Also, you do know that it is better to run gate simulation with the PAR netlist right? It will have more accurate timings.

- Hung
 

The thing is that with multiple clocks and clock gating, prelayout simulations as described by Hung (zero delays, no timing checks, etc.) often don't work. You can try that, but you may want to wait until you have an SDF and clean timing.
 

Hi,

Here is the VCS log file (synthesized netlist):
**broken link removed**

Run time log file of simv (synthesized netlist)
**broken link removed**

Run time log file of simv (behavioral netlist)
**broken link removed**
 

Your circuit is simple enough that if you can generate a VPD dump, I can debug the issue further. But from the log file, it looks like the gate delays are causing the mismatch between your gate sim and your RTL sim. You can disable the gate delays by adding the compile-time option +nospecify.

- Hung
 

Hi Hung,

Adding +nospecify did the trick. Thanks!

So is this usually the situation when #delay is used in the testbench and run with synthesized code?
 

If your testbench is done right, it will usually work regardless of whether you're running RTL simulation or gate simulation. For your case, it's difficult to tell exactly what's causing the problem without looking at the waveform and debugging it.

By the way, it's generally a bad idea to have your testbench drive X's into the design at time 0 ns. Initialize it to some known values. It may be as simple as that.
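A minimal sketch of that initialization — signal names here (`rst_n`, `in_valid`) are illustrative, not from the original design:

```verilog
`timescale 1ns/1ps

// Sketch: drive known values from time 0 instead of leaving inputs X.
module tb;
  reg clk, rst_n, in_valid;

  initial begin
    clk      = 1'b0;   // known values at t = 0, so nothing starts out X
    rst_n    = 1'b0;
    in_valid = 1'b0;
    #20 rst_n = 1'b1;  // release reset after a couple of clock cycles
  end

  always #5 clk = ~clk;  // 10 ns clock period
endmodule
```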

Also, in your gate sim, because of the gate delays, the monitored signals change at 0, 10, 15, 16, 17, etc., whereas in the RTL sim, since there are no gate delays, they only change every 10 ns. That's why the output of the gate sim is different from your RTL sim.

- Hung
 
