I have a variable resistor R2 whose value ranges from 5–111 Ω. I am thinking of the following circuit, which has another resistor R1 of 39 Ω in series with R2.
V1 is fed into the MCU ADC pin.
Here we have \[ V_1 = \frac{5\,\mathrm{V}}{R_1+R_2} \times R_1 \]
--> The relationship between V1 and R2 is of course not linear, but I think it is still acceptable.
The problem is that the circuit draws too much current (up to about 0.11 A when R2 is at its minimum of 5 Ω), and the 5 V supply is shared with the MCU.
Is there any way to improve this?
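A quick sanity check of the divider numbers above (assuming V1 is taken across R1, as in the formula):

```python
# Voltage divider check: 5 V supply, R1 = 39 ohm fixed,
# R2 = 5..111 ohm variable, V1 measured across R1.
VSUP = 5.0
R1 = 39.0

for r2 in (5.0, 39.0, 111.0):
    i = VSUP / (R1 + r2)   # current drawn from the 5 V rail
    v1 = i * R1            # voltage at the ADC pin
    print(f"R2={r2:6.1f} ohm  I={i*1000:6.1f} mA  V1={v1:.3f} V")
```

At R2 = 5 Ω the divider pulls about 114 mA, which is where the excessive-current complaint comes from.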
Why wouldn't you increase the value of R1 from 39 Ω to at least 100 Ω, or more?
Another option would be to use a simple constant current source that gives you, say, 10–20 mA; V1 then becomes directly proportional to the variable resistor.
Use larger resistors. Maybe 100 times larger, if the ADC input current is low.
By the way, driving that much current through the wiper of a small pot will eventually damage it.
-------------------
Oops, never mind! You can't change the variable resistor.
You could build a signal conditioning circuit using a low-voltage op-amp: feed a small current through the variable resistor and amplify the resulting voltage before sending it to the ADC.
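A rough sketch of the numbers for that idea. The sense current and gain resistors here are my own illustrative choices, not values from the answer: 1 mA through R2, then a non-inverting stage with gain 1 + RF/RG = 45.

```python
# Op-amp signal conditioning sketch (assumed values, for illustration):
# drive ~1 mA through R2, amplify the 5..111 mV drop with gain 45.
I_SENSE = 1e-3          # 1 mA sense current (assumption)
RF, RG = 44e3, 1e3      # non-inverting gain = 1 + RF/RG = 45 (assumption)

gain = 1 + RF / RG
for r2 in (5.0, 111.0):
    v_sense = I_SENSE * r2    # voltage developed across R2
    v_adc = gain * v_sense    # amplified output to the ADC
    print(f"R2={r2:5.1f} ohm  sense={v_sense*1000:6.1f} mV  ADC={v_adc:.3f} V")
```

With these (hypothetical) values the ADC sees roughly 0.23–5.0 V over the full R2 range, while R2 itself only carries 1 mA.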
In this case I would use a simple constant current source built around an LM317L voltage regulator and one resistor (see picture below). Supply it from 12 V (not from 5 V, both to ensure at least a 3 V drop across the 317 and to avoid drawing current from the 5 V rail), and set the current to 45 mA.
This current generates at most 4.995 V at 111 Ω, so you get almost the full ADC range.
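The LM317 current-source math behind that: the regulator holds 1.25 V between OUT and ADJ, so I_out = 1.25 V / R_set. A quick check of the set resistor and the resulting V1 range:

```python
# LM317(L) constant current source: I_out = V_REF / R_SET,
# with V_REF = 1.25 V between OUT and ADJ.
V_REF = 1.25
I_OUT = 0.045           # 45 mA target from the answer

r_set = V_REF / I_OUT   # ~27.8 ohm
print(f"R_SET = {r_set:.2f} ohm")

# V1 across the 5..111 ohm variable resistor:
for r2 in (5.0, 111.0):
    print(f"R2={r2:5.1f} ohm -> V1 = {I_OUT * r2:.3f} V")
```

R_set comes out to about 27.8 Ω, and V1 spans roughly 0.225 V to 4.995 V over the pot's range.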