As I mentioned before, I would keep the input at zero across the shift operation, so the shift amount appears only as a common-mode voltage to the ADC and thus isn't involved in the calibration. Of course, the IA still contributes to the measurement error, both offset and scale. Offset can be eliminated by a system calibration, and possibly scale as well, depending on the design.
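Just to illustrate what I mean by a system calibration, here is a minimal sketch of a two-point offset/gain correction applied to the raw ADC codes. The constant names (CAL_ZERO_CODE, CAL_REF_CODE, CAL_REF_VOLTS) are made up for the example; in practice the zero reading would be taken with the IA input shorted and the reference reading from a known source.

```c
#include <stdint.h>

/* Hypothetical calibration constants, captured once during system cal:
 *  - CAL_ZERO_CODE: average ADC code with the IA input shorted (offset)
 *  - CAL_REF_CODE:  average ADC code with a known reference applied
 *  - CAL_REF_VOLTS: value of that reference in volts                     */
#define CAL_ZERO_CODE   2051.0f
#define CAL_REF_CODE    3276.0f
#define CAL_REF_VOLTS   1.000f

/* Convert a raw ADC code to volts, removing the combined IA/ADC offset
 * and correcting the scale error in one step. */
static float adc_code_to_volts(uint16_t raw_code)
{
    float gain = CAL_REF_VOLTS / (CAL_REF_CODE - CAL_ZERO_CODE);
    return ((float)raw_code - CAL_ZERO_CODE) * gain;
}
```

Whether the scale term is worth calibrating depends, as said, on the design; with a stable reference and gain-setting resistors it may be enough to correct offset only.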
But I agree that the restricted common-mode range of single-supply ADCs is always somewhat annoying. I don't see a general solution to it; the trade-off has to be found anew for each individual design.