Apr 6, 2015 #1 L littlebu Member level 1 Joined May 19, 2007 Messages 33 Helped 4 Reputation 8 Reaction score 0 Trophy points 1,286 Activity points 1,452 I am running an OVM environment together with a behavioral model described in Verilog, but the following delayed continuous assignment never seems to work: assign #0.2ns A = B; However, the equivalent always block works fine: always @* A <= #0.2ns B; Since another environment without OVM works correctly, I suspect the problem is related to OVM or some delay-mode setting.
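For reference, the two constructs from the post can be placed side by side in a minimal sketch (the module and port names here are illustrative, not from the original post). Note that per the Verilog/SystemVerilog standard the delayed continuous assignment uses an inertial delay, while the intra-assignment delay on a nonblocking assignment behaves like a transport delay, which can itself explain differing waveforms even before any tool option is involved:

```verilog
`timescale 1ns/1ps

module delay_demo (
  input  wire B,
  output wire A_cont,
  output reg  A_nb
);

  // Delayed continuous assignment: #0.2ns is an inertial delay,
  // so pulses on B narrower than 0.2 ns are filtered out entirely.
  assign #0.2ns A_cont = B;

  // Nonblocking assignment with an intra-assignment delay: this is
  // transport-style, so every edge of B is scheduled 0.2 ns later.
  always @* A_nb <= #0.2ns B;

endmodule
```

Which form is appropriate depends on whether short glitches on B should propagate; simulator delay-mode options (such as the SEQ_UDP_DELAY setting mentioned below) can further override these delays at elaboration time.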
Apr 8, 2015 #2 L littlebu littlebu said: I am running an OVM environment together with a behavioral model described in Verilog, but the delayed continuous assignment assign #0.2ns A = B; never seems to work... I have found the root cause: it is the SEQ_UDP_DELAY option. The simulation works correctly after removing this option!