
Timescale in Verilog Simulation


digi001

I don't really understand why digital Verilog simulators need a timescale and a time resolution.

Doesn't all combinational logic in a simulator happen instantaneously? Doesn't all sequential logic occur on a clock edge? So why the need for a time resolution?


Whether 5 clock pulses take 5 s or 5 ns, the simulator has no idea whether the design is synthesizable. So why is a resolution between clock pulses needed if everything happens instantaneously at every clock edge?
 

That's true, the simulator doesn't care about time resolution, but only if there is a single clock and no delays are modeled in your system (and you've done a perfect job of using non-blocking assignments for sequential logic and regular blocking assignments with zero delay for combinational logic). As soon as you have to deal with multiple clocks or other asynchronous stimulus, you need delays with a timescale to specify the relationships between when these events happen.
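For instance, here is a minimal testbench sketch (module and signal names are made up) where two unrelated clocks only make sense once `timescale` gives the `#` delays a unit and a precision:

Code:
`timescale 1ns / 1ps   // time unit = 1 ns, precision = 1 ps

module tb_two_clocks;
  reg clk_100mhz = 1'b0;
  reg clk_33mhz  = 1'b0;

  // With a 1 ns time unit, #5 means 5 ns, i.e. a 10 ns (100 MHz) period.
  always #5      clk_100mhz = ~clk_100mhz;

  // ~33 MHz clock: a 15.151 ns half period is only representable because
  // the precision is 1 ps; with a 1 ns precision it would round to 15 ns.
  always #15.151 clk_33mhz  = ~clk_33mhz;

  initial begin
    #1000;   // let 1 us of simulated time elapse
    $finish;
  end
endmodule

The relative timing of the two clock edges is carried entirely by those delay values, which is exactly the information the time unit and precision provide.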

People do use simulation for timing analysis, although it is not as accurate as dedicated static timing analysis tools. Many gate-level libraries have delays modeled in them.
And as soon as one part of your simulation, either in your testbench or in your design, introduces a timescale with delays, your whole design has to deal with time as a single global quantity with a common time resolution.
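A rough illustration of that global-resolution rule, assuming two hypothetical modules compiled into the same simulation:

Code:
// a.v
`timescale 1ns / 1ns        // unit 1 ns, precision 1 ns
module block_a(output reg y);
  initial begin
    y = 0;
    #2.7 y = 1;             // rounded to 3 ns by this module's 1 ns precision
  end
endmodule

// b.v
`timescale 1ns / 1ps        // unit 1 ns, precision 1 ps
module block_b(output reg y);
  initial begin
    y = 0;
    #2.7 y = 1;             // kept as 2.700 ns thanks to the 1 ps precision
  end
endmodule

// The simulator's overall resolution is the finest precision of any compiled
// module (1 ps here), so all events are scheduled on a common 1 ps time base:
// block_b's event lands at 2700 ps, block_a's at 3000 ps.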
 
Great, thanks for the info!

It looks like the Altera ModelSim starter edition lets you compile your design and then go into ModelSim for this kind of detailed gate-delay timing analysis. I was mistaken in thinking that all simulation ran on the HDL alone.
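For reference, a gate-level timing run in ModelSim usually looks roughly like the sketch below; the netlist name, SDF file name, and port list are purely illustrative, not taken from any particular Quartus/ModelSim project:

Code:
`timescale 1ns / 1ps

module tb_timing;
  reg  clk = 1'b0;
  reg  rst = 1'b1;
  wire [7:0] q;

  always #5 clk = ~clk;   // 100 MHz clock in the 1 ns time unit

  // Post-fit gate-level netlist produced by the FPGA tools (hypothetical name).
  my_design dut (.clk(clk), .rst(rst), .q(q));

  initial begin
    // Back-annotate the extracted cell and routing delays onto the netlist.
    $sdf_annotate("my_design_v.sdo", dut);
    #20 rst = 1'b0;
    #1000 $finish;
  end
endmodule

The plain RTL simulation is essentially the same testbench with the RTL source instead of the netlist and without the SDF annotation.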
 

The time resolution also affects how long the simulation takes to run.
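One hedged sketch of why that can be (module name invented): an unnecessarily fine precision keeps events at distinct time steps that a coarser precision would round together, which is one reason simulator vendors recommend not using a finer resolution than the design needs.

Code:
`timescale 1ns / 1fs   // 1 fs precision is far finer than most designs need

module tb_resolution_demo;
  initial begin
    // Report the time unit and precision in effect for this module.
    $printtimescale(tb_resolution_demo);

    // At 1 fs precision this delay stays 1.000000001 ns; at a coarser 1 ps
    // precision it would round to 1.000 ns and merge with nearby events.
    #1.000000001;
    $display("time = %t", $time);
    $finish;
  end
endmodule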
 

Regarding FPGA design flow:

Should the design first be simulated without delays modeled, to confirm the logic is correct, and then simulated again with all delays and clocks modeled, to verify the design?
 

