This is a misconception. The distinction isn't higher vs. smaller, it's estimated vs. real.
A poor designer could just as easily come up with a smaller or a higher estimate if they are off.
Once you know what your clock tree looks like, why would you use an estimated value instead of the real one?
I don't think this is correct, at least not at physical synthesis. During logic synthesis you might try some really tough scenarios to see what the absolute best performance could be. But during physical synthesis, you want to be as close to reality, and as close to the spec, as possible.
Still, my previous reply gave the real reason we do this: it's a game of decreasing uncertainty as you move down the implementation flow.
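To make the decreasing-uncertainty point concrete, here is a rough SDC sketch (the clock name and margin numbers are made up for illustration) of how the constraint typically tightens across the flow:

```tcl
# Logic synthesis: no clock tree exists yet, so skew and insertion-delay
# variation are covered by an *estimated* uncertainty on an ideal clock.
create_clock -name clk -period 2.0 [get_ports clk_in]
set_clock_uncertainty -setup 0.30 [get_clocks clk]  ;# estimate: jitter + expected skew

# Post-CTS / routing: the real tree is in place. Switch to propagated
# clock latency so the tool measures actual skew, and keep only the
# margin you still cannot measure (jitter, OCV).
set_propagated_clock [get_clocks clk]
set_clock_uncertainty -setup 0.10 [get_clocks clk]  ;# real skew now seen by the tool
```

The numbers shrink precisely because each stage replaces an estimate with a measured value.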
Over-constraining is a common practice during synthesis and place and route. In my opinion it is not the correct thing to do: the tools may end up adding more buffers or over-optimizing the design, and the design ends up consuming more power.
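As a concrete sketch of the over-constraining practice being described (the 10% margin and clock name are made-up values, not a recommendation):

```tcl
# The spec clock period is 2.0 ns, but synthesis is run against 1.8 ns
# (roughly a 10% over-constraint) in the hope that later P&R degradation
# still leaves the design meeting the real 2.0 ns target.
create_clock -name clk -period 1.8 [get_ports clk_in]

# The tool now chases the tighter target: it upsizes cells and inserts
# buffers on paths that would already have met 2.0 ns, which is exactly
# the extra area and power cost described above.
```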