Pretty vague question. What kind of sensor, and what do you mean by “selectivity”? Are you talking about the impact of averaging, maybe?
The question is generic on purpose: it is about a generic sensor, since most sensors show this behavior. I will explain what I have understood so far from my research.
When we talk about sensitivity, we mean that small changes in the measured quantity produce a change in the output, an electrical signal for example. This can be expressed graphically or mathematically as the relationship between the output and the input: on the graph, the sensitivity is the derivative (the slope) of the output-versus-input curve.
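A minimal sketch of that idea, using hypothetical calibration data for a linear temperature sensor (the temperature and voltage values below are made up for illustration):

```python
import numpy as np

# Hypothetical calibration data: input temperature (degC) vs. output voltage (V).
temp_c = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
volts  = np.array([0.50, 0.75, 1.00, 1.25, 1.50, 1.75])

# Sensitivity is the slope of the output-vs-input curve, dV/dT,
# estimated numerically from the calibration points.
sensitivity = np.gradient(volts, temp_c)

print(sensitivity)  # 0.025 V/degC at every point, since this sensor is linear
```

For a nonlinear sensor the same code would give a different slope at each point, which is why sensitivity is often quoted at a specific operating point.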
Selectivity, on the other hand, means being selective about the type of signal you want to detect (read). For example, an ideal sensor (which does not exist in the real world) can completely separate two different quantities in its reading.
To take an example: suppose we have both temperature and a gas concentration to measure in the same place. An ideal sensor would read a single quantity with no interference from the other. A real sensor can still take the reading, but with a smaller or larger margin of error caused by the other quantity.
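One way signal processing connects to selectivity: if the cross-interference is known (for instance from calibration), the mixed readings of two non-ideal sensors can be unmixed by solving a small linear system. The cross-sensitivity coefficients below are invented for illustration:

```python
import numpy as np

# Hypothetical cross-sensitivity model: each sensor responds mostly to its
# target quantity but also slightly to the other one.
#   gas_reading  = 1.00 * gas + 0.05 * temp
#   temp_reading = 0.02 * gas + 1.00 * temp
A = np.array([[1.00, 0.05],
              [0.02, 1.00]])

true_values = np.array([3.0, 25.0])   # actual gas concentration, temperature
readings = A @ true_values            # what the imperfect sensors report

# Processing step: invert the known cross-sensitivity matrix to recover
# the individual quantities from the mixed readings.
recovered = np.linalg.solve(A, readings)
print(recovered)  # ~ [3.0, 25.0]
```

This is only a sketch under the assumption that the interference is linear and the coefficients are known; in practice they must be measured, and noise limits how well the separation works.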
My question: when we process the signals generated by the sensor, does that processing come at the expense of sensitivity/selectivity? That is exactly the relationship I have not managed to understand.