owen_li
Hi.
I just came up with a question about gate-level simulation.
For deep-submicron technologies, we usually run STA with the OCV analysis type.
So when the SDF file is generated, it carries three delay values for each cell, i.e. min/typ/max.
As we know, with OCV the STA tool picks the worst-case combination when analyzing timing.
Now, for gate-level simulation with that SDF file: will the simulator also mix the delay values, e.g. maximum delay on the data path and minimum delay on the clock path, the way setup analysis does? If so, how does the simulator cover the hold check?
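To make the question concrete, here is a minimal sketch of what I mean. The cell, instance, and file names (NAND2X1, dut, netlist.sdf) are just placeholders I made up; $sdf_annotate itself is the standard IEEE 1364 annotation task, and its mtm argument selects which one of the three SDF values gets applied.

`timescale 1ns/1ps

// A hypothetical standard-cell model with a specify path. SDF IOPATH entries
// would override these delays at annotation time; in the SDF each delay is a
// (min:typ:max) triple, for example:
//   (IOPATH A Y (0.012:0.025:0.041) (0.014:0.027:0.045))
module NAND2X1 (input A, B, output Y);
  nand (Y, A, B);
  specify
    (A => Y) = (0.025, 0.027);  // placeholder rise/fall delays
    (B => Y) = (0.025, 0.027);
  endspecify
endmodule

module tb;
  reg a, b;
  wire y;
  NAND2X1 dut (.A(a), .B(b), .Y(y));

  initial begin
    // "netlist.sdf" is a placeholder file name (with no real SDF on disk the
    // simulator will just report an annotation error). The mtm argument
    // selects which of the (min:typ:max) values is used, "MINIMUM",
    // "TYPICAL" or "MAXIMUM", for every annotated delay in the design.
    $sdf_annotate("netlist.sdf", dut, , "sdf.log", "MAXIMUM");
    a = 0; b = 0;
    #1 a = 1; b = 1;
    #1 $finish;
  end
endmodule

In other words, it looks to me as if a single simulation run applies just one of the three values everywhere, which is exactly what prompted the question above.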
Sorry, I am a backend engineer and don't know much about the frontend side.
Thanks all!