EDAboard.com is an international Electronics Discussion Forum focused on EDA software, circuits, schematics, books, theory, papers, ASIC, PLD, 8051, DSP, Network, RF, Analog Design, PCB, Service Manuals, and a whole lot more.
I have read many pages of it already and I think it is a good book. Maybe this should be done using the GUI, as explained here:
https://www.eng.auburn.edu/~agrawvd/COURSE/E6200_Spr09/HW/Modelsim%20Tutorial.pdf
I am reading a book and this kind of testing (on the fly at the ModelSim terminal) is mentioned there. The example above is from page 7 of Digital System Design by Roth et al. The book was published in 2017! The method used above is mentioned throughout the book. It uses ModelSim.
Hi,
As a newbie I am trying to test a very simple Verilog module, as follows:
module gates(A, B, C, D, E);
  input  A, B, C;
  output D, E;
  assign #5 D = A || B;  // OR with a 5 time-unit delay
  assign #5 E = C || D;
endmodule
I issue the following command:
add list A B C D E
force A 0
However, I get the following error message...
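For reference, a minimal ModelSim command-line session for a module like this might look like the sketch below. The file name `gates.v`, the stimulus values, and the run time are assumptions; note that `add list` and `force` only work after a design has been loaded with `vsim`, so issuing them earlier is one common source of errors.

```tcl
# Compile the source into the default "work" library and load the design
vlib work
vlog gates.v
vsim work.gates

# List the signals, drive the inputs, and advance simulation time
add list A B C D E
force A 0
force B 1
force C 0
run 20 ns
```

Exact command syntax can vary slightly between ModelSim versions, so the tool's own reference manual is the authority here.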
I just got it. That post refers to FPGA. For ASIC the answer is very simple and is common sense: the min input_delay for the receiving module (say, an HDL module) = the minimum of all output delays from the last flip-flop of the connecting modules to the inputs of the receiving module, and so on. So in the case...
Hi,
On page https://billauer.co.il/blog/2017/04/io-timing-constraints-meaning/ we read that
In short,
set_input_delay -clock … -max … : The maximal clock-to-output of the driving chip + board propagation delay
set_input_delay -clock … -min … : The minimal clock-to-output of the driving chip...
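As an illustration of those two definitions, a constraint fragment might look like the following. All numbers, port names, and the clock name are hypothetical: here the driving chip's clock-to-output is assumed to be 2.0 ns (max) / 0.5 ns (min), and the board propagation delay 0.6 ns (max) / 0.3 ns (min).

```tcl
# Hypothetical example: clock-to-output of driving chip + board delay
# -max: 2.0 + 0.6 = 2.6 ns; -min: 0.5 + 0.3 = 0.8 ns
create_clock -name sys_clk -period 10.0 [get_ports clk]
set_input_delay -clock sys_clk -max 2.6 [get_ports data_in]
set_input_delay -clock sys_clk -min 0.8 [get_ports data_in]
```

The -max value constrains setup analysis (latest the data can arrive) and the -min value constrains hold analysis (earliest it can arrive), which is why both are needed.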