Hi all!
During the CTS stage in Synopsys ICC I looked through the .log file and found some information about technology-based delay scale factors (see the .log below):
I use all the necessary technology files for CTS (TLU+, etc.), and the main question is: why is the wire-delay scale factor constant (1.0)? Why can't it change across corners like, for example, the factors for logic gates do?
It does change, there are process corners for wires just like for logic: best, typical, worst, etc. In Cadence environment this comes from the qrc files, not sure what is the equivalent in synopsys world.
Thank you for your answer!
Yes, I understand that it does. But why does ICC report technology-based wire delay scale factors that are equal for all corners? How can I determine the exact delay scale factor for wires for a specific corner?
Hard to say; I am not a heavy Synopsys user. What I can say for sure is that wire scale factor tables are not so popular with modern technologies, as they are not accurate enough. However, the tools still support them. In my most recent tapeout I had all my tables at 100/100/100, whereas extraction would actually do the job of calculating pessimistic/optimistic Rs and Cs for the wires.
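If you do want explicit, per-corner wire derating instead of relying on those tables, Synopsys tools (PrimeTime, and ICC scenarios) support it via `set_timing_derate`. A minimal sketch — the 0.95/1.05 values are made-up examples, not recommendations:

```tcl
# Illustrative only: apply explicit net (wire) delay derates for the
# current corner/scenario. The numeric values are hypothetical.
set_timing_derate -net_delay -early 0.95
set_timing_derate -net_delay -late  1.05

# Inspect the derates currently in effect
report_timing_derate
```

In a multi-corner flow you would issue these per scenario, which gives you direct control over (and visibility into) the wire derate for each corner.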
In the Synopsys world (StarRC world), the direct analog of QRC techfiles is the pair of ITF files (source text files, encrypted or not, that define the BEOL stack: metal and dielectric thicknesses, dielectric constants, resistivities, manufacturing effects, etc.) and NXTGRD files (binary databases generated from the ITF files, covering various metal-line patterns, used for capacitance extraction). A StarRC-generated SPF file contains a pointer to the NXTGRD file used for extraction.
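As a quick way to check which NXTGRD a given netlist was extracted with, you can scan the SPF header comments for that pointer. A minimal sketch — the exact header comment format varies by StarRC version, so the `NXTGRD_FILE` tag and the sample header below are assumptions for illustration:

```python
import re

def find_nxtgrd_pointer(spf_text):
    """Scan SPF header comment lines (starting with '*') for an
    NXTGRD file reference and return its path, or None.
    Note: the comment format here is an illustrative assumption,
    not the official StarRC specification."""
    for line in spf_text.splitlines():
        if line.startswith("*") and "NXTGRD" in line.upper():
            m = re.search(r"(\S+\.nxtgrd)", line, re.IGNORECASE)
            if m:
                return m.group(1)
    return None

# Hypothetical SPF header fragment, for demonstration only
sample = """*|DSPF 1.0
* EXTRACTION_TOOL: StarRC
* NXTGRD_FILE: /proj/tech/beol/typical.nxtgrd
.SUBCKT top in out
"""

print(find_nxtgrd_pointer(sample))  # /proj/tech/beol/typical.nxtgrd
```

Pointing this at the SPF for each corner tells you which BEOL database (and hence which extraction corner) actually produced the wire Rs and Cs.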