Hello, could someone explain to me what is going on in my simulation? I have assigned a clock signal that drives some logic, cnt and sig1. This clock signal I assigned to another signal, which I use to drive sig2. I expect sig1 and sig2 to be exactly the same, but the simulator seems to add a delta time to it?
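Here is a minimal sketch of what I mean (the 4-bit counter width and sig1/sig2 mirroring the counter LSB are just placeholders for illustration):

    library ieee;
    use ieee.std_logic_1164.all;
    use ieee.numeric_std.all;

    entity delta_demo is
      port (CLK1 : in std_logic);
    end entity;

    architecture sim of delta_demo is
      signal CLK2 : std_logic;
      signal cnt  : unsigned(3 downto 0) := (others => '0');
      signal sig1, sig2 : std_logic := '0';
    begin
      -- CLK2 is just a rename of CLK1.
      CLK2 <= CLK1;

      -- cnt and sig1 are clocked on the original clock.
      process (CLK1)
      begin
        if rising_edge(CLK1) then
          cnt  <= cnt + 1;
          sig1 <= cnt(0);
        end if;
      end process;

      -- sig2 is driven by the renamed clock. I expected it to match
      -- sig1, but in the waveform it is off by a delta.
      process (CLK2)
      begin
        if rising_edge(CLK2) then
          sig2 <= cnt(0);
        end if;
      end process;
    end architecture;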
Yes, that is what I can see in the simulator. But is it expected to behave this way? I assume CLK2 <= CLK1 will be synthesized into one common signal name, right? This should not take one delta time, even in simulation!
This is how VHDL works, not necessarily how it will work in reality. This is the danger of renaming clocks inside your design. Also note that a port map counts as a signal assignment, incurring a one-delta penalty. But because you usually clock everything inside a single entity off the same clock signal, you don't see any issues.
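If you just need a second name for the clock inside one architecture, an alias avoids the problem, because it creates no new signal and therefore no assignment and no delta (a sketch, reusing the names from your post; CLK2 must not also be declared as a signal):

    alias CLK2 : std_logic is CLK1;

With the alias, anything clocked on CLK2 is literally clocked on CLK1, so sig1 and sig2 line up in simulation exactly as they would in hardware.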