The_Dutchman
Member level 1
Hi everyone,
I'm an analog designer working on my first mixed-signal chip, and I'm confused about input and output delay constraints.
Basically I have a 5-bit thermometer decoder, written in RTL, that takes its input data from the bondpads of the chip; the decoder's outputs drive switches in an analog block.
Eventually the parallel inputs will be driven by a ParBERT, which has delay control. So how should I set my input delay constraint?
Also, since the decoder's outputs drive switches in an analog block, how should I set the output delay constraint? These outputs aren't sampled by a clock or by any downstream module.
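For context, this is roughly what I was planning to try: a virtual clock to reference the I/O timing against. All the names and numbers here (clk_virt, the 10 ns period, the delay values, the port names) are placeholders I made up, not real values from my design:

```tcl
# Sketch only -- clock name, period, delay values, and port names are placeholders.
# Virtual clock to reference I/O timing against (nominally the ParBERT data rate):
create_clock -name clk_virt -period 10.0

# Input delay: modeled worst-/best-case arrival of data at the pads
# relative to clk_virt:
set_input_delay -clock clk_virt -max 2.0 [get_ports data_in*]
set_input_delay -clock clk_virt -min 0.5 [get_ports data_in*]

# Output delay: a budget for the analog switches, expressed as "the outside
# world consumes the data this long before the next clk_virt edge":
set_output_delay -clock clk_virt -max 3.0 [get_ports therm_out*]
set_output_delay -clock clk_virt -min 0.0 [get_ports therm_out*]
```

Since the decoder is purely combinational, would this virtual-clock approach even make sense, or should I constrain the pad-to-pad paths directly with set_max_delay instead?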
Furthermore, do inputs connected directly to a bondpad need any special attention? I'm guessing I should add e.g. a couple of inverters to restore the signal edges, but won't the synthesis tool optimize them out?
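If I do instantiate such buffers by hand, I assume I'd have to protect them with something like the following (instance name u_pad_buf is a placeholder of mine):

```tcl
# Sketch -- u_pad_buf* is a made-up instance name.
# Keep hand-instantiated pad buffers/inverters from being optimized away:
set_dont_touch [get_cells u_pad_buf*]
```

Is set_dont_touch the right mechanism for this, or is there a better-accepted practice (e.g. size_only so the tool can still resize the cells)?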
Is there any best practice advice for this?