After the RTL is synthesized with a synthesis tool such as Synopsys Design Compiler, we get a netlist. The simulation of this netlist is called gate-level simulation. Is it clear now?
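To make that concrete, here is a sketch of the two views of the same design. The cell names (AND2_X1, OR2_X1) and their pin names are placeholders; your standard cell library will have its own.

```verilog
// RTL view: what you write and simulate before synthesis.
module and_or_rtl (input a, b, c, output y);
  assign y = (a & b) | c;
endmodule

// Gate-level view: the kind of netlist a synthesis tool might emit.
// It instantiates standard cells instead of using behavioral operators,
// so simulating it requires the library's Verilog cell models.
module and_or_gate (input a, b, c, output y);
  wire n1;
  AND2_X1 u1 (.A1(a), .A2(b), .ZN(n1)); // n1 = a & b
  OR2_X1  u2 (.A1(n1), .A2(c), .ZN(y)); // y  = n1 | c
endmodule
```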
At a bare minimum, a Verilog file of the standard cell library is needed. This should be combined with an SDF file; otherwise you get a timing model that is not realistic, such as a unit delay model (i.e., every cell takes one time unit to compute its output).
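As a minimal sketch of how the SDF gets applied, assuming a netlist module named top_netlist and an SDF file named top.sdf (both placeholder names):

```verilog
`timescale 1ns/1ps

module tb;
  reg clk = 1'b0;
  wire [7:0] dout;

  // Instance of the synthesized netlist, compiled together with
  // the standard cell library's Verilog models.
  top_netlist dut (.clk(clk), .dout(dout));

  initial begin
    // Back-annotate the delays from the SDF onto this instance.
    // Skipping this (or your simulator's equivalent +sdf option)
    // leaves you with the library's default delays instead of
    // the real post-synthesis timing.
    $sdf_annotate("top.sdf", dut);
    #1000 $finish;
  end

  always #5 clk = ~clk; // 100 MHz clock
endmodule
```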
But the equation in the .lib may not match the behavioral model file. For example, take a D flip-flop: its behavioral code is readily understood as a D flip-flop, but the equation in the .lib does not express that behavior the same way. A LEC tool can read those cell equations from a .lib, if I am not mistaken.
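For reference, here is a sketch of how a Liberty file typically describes a D flip-flop; the cell and pin names are illustrative, not taken from any real library:

```
cell (DFF_X1) {
  ff (IQ, IQN) {
    clocked_on : "CK"; /* state updates on the rising edge of CK */
    next_state : "D";  /* the next value of the internal state IQ */
  }
  pin (D)  { direction : input; }
  pin (CK) { direction : input; clock : true; }
  pin (Q)  {
    direction : output;
    function : "IQ"; /* the "equation" is just a reference to the
                        internal state, not a combinational function
                        of the inputs as it would be for a gate */
  }
}
```

So the sequential behavior lives in the ff group, not in the pin equation, which is why the equation alone does not read like a D flip-flop the way the behavioral Verilog does.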
What do you want me to say? If you try to hack a solution, you only get as far as the hack takes you. There are standard ways of doing things; stick to those and you will not have to worry about model incoherence.