While simulation of carrier transport in semiconductor devices is more or less "first principles", simulation of degradation is not.
That is true for any type of degradation - HCE (Hot Carrier Effect), NBTI (Negative Bias Temperature Instability), TDDB (Time-Dependent Dielectric Breakdown), etc.
There are no first-principles models that predict degradation dynamics (or statics) with any real accuracy.
Degradation is strongly affected by obscure process, chemical, and other effects that are far from being well understood or described by any theory: nitridation of gate oxides, hydrogenation or deuteration of the interface defects/traps, presence of fluorine or other chemicals, and so on.
It appears that there are some empirical models in device simulation tools (including Silvaco's Atlas) that do allow you to simulate some degradation effects.
If so, there should be some tuning or fitting parameters that you can change in the input deck to calibrate your simulation results.
Then you will get a few mV, or a few tens of mV, of degradation after X hours of "stress", instead of an unreasonable few volts.
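To make that "tuning to mV-scale degradation" concrete: a very common empirical form for both HCI and NBTI shift is a power law in stress time, ΔVth = A·t^n. This is a sketch only; the model form is standard, but the data points, A, and n below are made-up illustration values, not anything from a real device or from Atlas.

```python
import math

# Hypothetical stress measurements: (stress time in s, |dVth| in mV).
# A power law dVth = A * t**n is a common empirical form for HCI/NBTI;
# the numbers here are invented for illustration only.
data = [(1e2, 2.0), (1e3, 4.1), (1e4, 8.3), (1e5, 16.9)]

# Least-squares fit of log10(dVth) = log10(A) + n * log10(t).
xs = [math.log10(t) for t, _ in data]
ys = [math.log10(dv) for _, dv in data]
N = len(data)
mx, my = sum(xs) / N, sum(ys) / N
n = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
A = 10 ** (my - n * mx)

# Extrapolate the fitted law to a 10-year stress.
ten_years = 10 * 365 * 24 * 3600  # seconds
dvth_10y = A * ten_years ** n
print(f"n = {n:.3f}, A = {A:.3f} mV, predicted dVth after 10 years = {dvth_10y:.1f} mV")
```

The exponent n and prefactor A play exactly the role of the "fitting parameters" mentioned above: you adjust them until the simulated shift matches measured stress data, then extrapolate.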
Hopefully, those empirical models account for the effect of the electric field on the hot carrier degradation, so that when you change the LDD doping profile, you will see a change in the electric field and then, hopefully, a change in degradation.
In the past, an indirect indication of the strength of the hot carrier effect was the substrate current (which is also generated by hot carriers).
People observed, experimentally, a correlation between the degradation and the peak substrate current (vs gate voltage, at max Vds).
So, the magnitude of the substrate current was used as a "proxy" for degradation.
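The classic quantitative version of that proxy is Hu's lucky-electron correlation: lifetime to a fixed degradation criterion scales roughly as τ ∝ (W/Id)·(Isub/Id)^−m, with m around 3 for n-MOSFET interface damage. The sketch below uses that correlation to compare two hypothetical LDD splits; the exponent, prefactor, and all current values are illustrative assumptions, not measurements.

```python
# Hu's lucky-electron correlation (an empirical result, not a
# first-principles law): tau ~ (W / Id) * (Isub / Id)**-m, m ~ 3.
# All numbers below are made up for illustration.
m = 3.0      # typical fitted exponent for n-MOSFET interface-state damage
C = 1.0e-2   # arbitrary calibration prefactor (units absorbed here)

def lifetime(isub, idrain, width_um):
    """Relative hot-carrier lifetime estimated from peak substrate current."""
    return C * (width_um / idrain) * (isub / idrain) ** -m

# Two hypothetical LDD splits: a stronger LDD lowers the peak lateral
# field, hence lower peak Isub at the same drain current.
tau_weak_ldd = lifetime(isub=5e-6, idrain=1e-3, width_um=1.0)
tau_strong_ldd = lifetime(isub=2e-6, idrain=1e-3, width_um=1.0)
print(f"lifetime ratio (strong/weak LDD): {tau_strong_ldd / tau_weak_ldd:.1f}x")
```

Note how steep the dependence is: because of the cubic exponent, cutting peak Isub by 2.5x improves the projected lifetime by more than an order of magnitude, which is why peak substrate current was such a convenient screening metric.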