Hi, I am having trouble understanding your question. Yes, like you said, first the process engineers plan out what they want, then they use a number of tools to try to simulate and predict how the process will behave. After that they produce a test chip that is measured at each corner. Depending on the process node, this can include different issues, like optical problems, temperature ranges, derates, and so on. Like you said, this is why we wind up with some chips coming out slightly faster, some much faster, some with, say, the PMOS much stronger, some with it weaker, and so on. This gives us the corners. So we aim for the typical but get everything.
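Just to make the corner idea concrete, here is a toy Python sketch of the five classic process corners (slow/typical/fast NMOS paired with slow/typical/fast PMOS). The delay factors are invented for illustration; real derates come from the foundry's characterization data, not from anything this simple.

```python
# Toy model: the five classic process corners. The delay factors are
# made-up illustrative numbers, NOT real foundry derates.
factor = {"slow": 1.15, "typ": 1.00, "fast": 0.85}  # relative gate delay

corners = {  # (NMOS speed, PMOS speed) per corner
    "SS": ("slow", "slow"),
    "SF": ("slow", "fast"),   # skewed: weak NMOS, strong PMOS
    "TT": ("typ",  "typ"),    # the "typical" target
    "FS": ("fast", "slow"),   # skewed: strong NMOS, weak PMOS
    "FF": ("fast", "fast"),
}

for name, (n, p) in corners.items():
    delay = (factor[n] + factor[p]) / 2  # crude average of the two devices
    print(f"{name}: ~{delay:.2f}x typical gate delay")
```

The skewed corners (SF, FS) are where the "PMOS much stronger than NMOS" chips from above end up.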
Next, the wafers are tested to gather actual statistics, and the corners of what is acceptable are defined. I remember, several years ago, seeing the device physics guys going to the lab and measuring each transistor by hand, blowing each resistor, and so on, to find the limits. Today it might be more automatic; this was 20 years ago.
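The statistical step above is often done as something like "mean plus or minus three sigma over the measured wafers". A minimal sketch, with invented numbers standing in for real measurement data:

```python
# Hypothetical sketch: deriving slow/fast corner limits from wafer
# measurements as mean +/- 3 sigma. The numbers are invented, not a
# real process; real corner-setting is considerably more involved.
import random
import statistics

random.seed(0)
# Pretend these are measured NMOS threshold voltages (V) across many dies.
vth_samples = [random.gauss(0.45, 0.02) for _ in range(10_000)]

mu = statistics.mean(vth_samples)
sigma = statistics.stdev(vth_samples)

slow_corner = mu + 3 * sigma   # high Vth -> weaker, slower device
fast_corner = mu - 3 * sigma   # low Vth -> stronger, faster, leakier device

print(f"typical Vth ~ {mu:.3f} V, corners at [{fast_corner:.3f}, {slow_corner:.3f}] V")
```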
Once all this is done, the data is fed into the tools. Of course, if it is not satisfactory, one can always tweak the process, and there are often several rounds of that before the process is ready to be used. So the early preliminary PDK can be quite different from the final PDK sent to the customer. Sometimes customers are in too much of a hurry to get something out, and then a product is already built on these preliminary runs, but then the designer often puts in lots of margin to account for changes.
Now, what do you mean? To shift the process towards FF? Think about that. This would mean the transistors need to switch faster. So, yes, one could design a variant process where some parameters were tweaked; one could perhaps change the K (the device transconductance parameter). These variant processes do exist sometimes. This takes time, though, and you would probably just define a new typical and then shift everything around it. Another thing that happens is that sometimes your library provider will design libraries with different VTs, to get a faster library at the cost of more energy, and a slower, more power-friendly library.
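The multi-VT tradeoff can be sketched with the textbook square-law drive current and an exponential subthreshold leakage model. All the constants here are illustrative assumptions, not data from any real library:

```python
# Hypothetical sketch of the low-VT vs high-VT tradeoff. Square-law
# on-current plus an exponential subthreshold leakage model; every
# constant below is an invented illustrative value.
import math

def drive_current(vdd, vt, k=2e-4, w_over_l=10):
    """Square-law saturation current: I = (k/2)(W/L)(Vdd - Vt)^2."""
    return 0.5 * k * w_over_l * (vdd - vt) ** 2

def leakage_current(vt, i0=1e-7, n=1.5, vt_thermal=0.026):
    """Subthreshold leakage: grows exponentially as Vt drops."""
    return i0 * math.exp(-vt / (n * vt_thermal))

for name, vt in [("HVT (low power)", 0.50),
                 ("SVT (standard)",  0.40),
                 ("LVT (fast)",      0.30)]:
    ion = drive_current(vdd=1.0, vt=vt)
    ioff = leakage_current(vt)
    print(f"{name}: Ion={ion*1e3:.3f} mA, Ioff={ioff*1e9:.3f} nA")
```

The low-VT flavor switches faster (more overdrive) but leaks much more, which is exactly the faster-library-at-the-cost-of-energy tradeoff mentioned above.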
Is this what you mean? Or do you mean perhaps making processes that work at extreme temperatures? Those exist too, but usually only a few foundries focus on them, for example for chips that are expected to operate above 100 C. But the whole process is quite odd in that case.