The accuracy is mainly determined by the resistor tolerances and the circuit offset voltage.
The common-mode rejection is set by how well the resistors match, so with 1% resistors the worst-case CM rejection is about 40dB. For a 3.6V common-mode voltage and a gain of 18, that gives a worst-case zero-current offset of about 650mV at the output, which is roughly 36% of the 1.8V full-scale signal. You could put a pot in series with one of the resistors to trim this offset to zero (say a 1k pot in series with a 17.5k resistor).
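To make the arithmetic explicit, here is a quick sketch of the numbers above (the gain, tolerance, 40dB figure, and voltages are the ones used in this answer; the CM error is treated as referred through the differential gain):

```python
# Worst-case common-mode error for a four-resistor difference amplifier.
# Numbers assumed from the discussion above: gain G = 18, 1% resistors
# giving ~40 dB worst-case CM rejection, 3.6 V common-mode input,
# 1.8 V full-scale output.

G = 18          # differential gain (e.g. 360k / 20k)
Vcm = 3.6       # common-mode voltage at the inputs, volts
cmrr_db = 40    # worst-case common-mode rejection for 1% resistors, dB

cmrr = 10 ** (cmrr_db / 20)   # 40 dB -> a factor of 100
v_offset = Vcm * G / cmrr     # CM error referred to the output
print(f"worst-case output offset: {v_offset * 1000:.0f} mV")

full_scale = 1.8              # full-scale signal output, volts
print(f"fraction of full scale: {v_offset / full_scale:.0%}")
```

Running this gives about 648mV of offset, i.e. 36% of the 1.8V signal, matching the figures quoted above.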
The input bias current is not a problem as long as the resistor values are kept low (20k or less).
If you don't want to have to adjust for offset, then you need to use more accurate resistors, use an instrumentation amp (which has much higher common-mode rejection), or use a dedicated high-side current-monitor IC. These are made by several IC vendors.