
Converting sensor output 0 to 3 V DC to 0 to 1.5 V DC


dsk2858

Hi,

I am using a soil moisture sensor whose output voltage is 0 to 3 V DC, which I am feeding to the ADC of the MCU.

The ADC is a 10-bit ADC and its maximum reference voltage is 1.8 V DC. Since the reference voltage is less than the sensor's output voltage, I cannot measure the sensor output when it exceeds 1.8 V. What should I do to solve this problem?
 

You can use a resistor divider, or you can make a simple attenuation stage using an opamp. Either will work, since I'm assuming that your sensor output voltage is a slowly fluctuating DC voltage.

Depending on the output current that your sensor can deliver, a resistor divider may be appropriate; it will probably be slightly less accurate, but it is much simpler to implement (two resistors). If you use a resistor divider, make sure your resistance values are large (hundreds of kOhm or MegOhm), because low-value resistors will draw too much current out of the soil moisture sensor. If you draw too much current out of your sensor's output, the output voltage will droop and won't be accurate. I've used soil sensors before and found that an op-amp stage was much better. If you want to use an op-amp stage, make sure you get an op-amp with very low input bias current (Ib). You can find ones with Ib in the nanoamp or picoamp range.

The name of the game is to not ask the sensor to output much current--because if you do, the voltage will fall.
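
To put rough numbers on that loading effect, here is a minimal Python sketch; the 3 kΩ sensor output impedance is an assumed, purely illustrative figure, not a datasheet value:

```python
# Minimal sketch of divider loading. The sensor is modelled as an ideal 3 V
# source behind an assumed output impedance; all values are illustrative.

def divider_out(v_src, r_source, r_top, r_bottom):
    """Midpoint voltage of a divider driven by a source with internal resistance."""
    # The source impedance is effectively in series with the top resistor.
    return v_src * r_bottom / (r_source + r_top + r_bottom)

V_SENSOR = 3.0       # open-circuit sensor output, volts
R_SOURCE = 3_000.0   # assumed sensor output impedance, ohms (illustrative only)

# Low-value divider: the source impedance badly skews the intended 2:1 ratio.
print(divider_out(V_SENSOR, R_SOURCE, 1_200, 1_200))      # ~0.67 V instead of 1.5 V

# High-value divider: the same source impedance is negligible by comparison.
print(divider_out(V_SENSOR, R_SOURCE, 470_000, 470_000))  # ~1.49 V, close to ideal
```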
 
Thank you for your explanation. Before posting this thread I made a resistor divider with 1.2 kΩ resistors and found there was a tremendous voltage drop, from 3 V to 0.68 V. I was confused and could not understand it at the time; now I understand.

Could you please link me to a tutorial on how to use an op-amp for this application?
 

Sure. A plain non-inverting op-amp stage can't have a gain below 1, so the usual approach is a high-value resistor divider feeding a unity-gain buffer (voltage follower), giving an overall gain of 0.5 that scales your 3 V output to a full scale of 1.5 V. Your software can simply ignore the top 0.3 V of the ADC range, which will never be used. Check out the first few pages of this PDF:

**broken link removed**

Try using a TLC2264 op-amp for this application. The input bias current is very low, which means you won't be "bogging down" the output voltage very much:

https://www.ti.com/lit/ds/symlink/tlc2264.pdf

You may have to get creative with the op-amp circuit and hook it up in a non-standard way so that you draw as little current from the sensor output as possible. Be careful about the current paths from the sensor output to ground in your op-amp design, and maximize the resistance values while still keeping the circuit correct.
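
As a firmware-side sanity check on that scaling (assuming the 10-bit ADC, 1.8 V reference, and 0.5x attenuation discussed in this thread; the function name is just illustrative), the conversion back to sensor volts is a one-liner:

```python
# Sketch of the software scaling, assuming a 10-bit ADC, a 1.8 V reference,
# and a 0.5x attenuator in front of the ADC pin.

ADC_BITS = 10
V_REF = 1.8        # ADC reference voltage, volts
ATTENUATION = 0.5  # divider/buffer gain ahead of the ADC

def adc_to_sensor_volts(code):
    """Convert a raw ADC reading back to the original 0-3 V sensor voltage."""
    v_adc = code * V_REF / (2**ADC_BITS - 1)  # voltage at the ADC pin
    return v_adc / ATTENUATION                # undo the 0.5x attenuation

# 3.0 V at the sensor -> 1.5 V at the ADC -> code ~853; codes above that
# (the top ~0.3 V of the ADC range) simply never occur.
print(adc_to_sensor_volts(853))  # ~3.0
```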
 
Thank you for your explanation.

As I don't have an op-amp readily available, I would like to use a resistor divider circuit.

Just a few minutes back I used two 1 MΩ resistors in series and found that the input voltage was 3 V DC, but when I measured at the junction of the two resistors it was only 1.05 V DC. Why could I not measure 1.5 V, half of the 3 V DC input?
 

The name of the game is to not ask the sensor to output much current--because if you do, the voltage will fall.
Is that a problem, since reducing the voltage is exactly what he wants?

Before posting this thread I made a resistor divider with 1.2 kΩ resistors and found there was a tremendous voltage drop, from 3 V to 0.68 V. I was confused and could not understand it at the time; now I understand.
So perhaps if you connect a 4.7 kΩ resistor across the probe, the voltage will drop from 3 V to 1.8 V. Does the solution need to be more complicated than one resistor?
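
For what it's worth, here is the back-of-envelope arithmetic behind that 4.7 kΩ figure, assuming the earlier test used two equal 1.2 kΩ resistors (that detail isn't stated explicitly) and modelling the sensor as 3 V behind a fixed source resistance:

```python
# Back-of-envelope sketch: estimate the sensor's source resistance from the
# reported 1.2 kOhm divider test (3 V open-circuit, 0.68 V at the midpoint),
# then predict the voltage with a single 4.7 kOhm resistor across the probe.
# Assumes two equal 1.2 kOhm resistors were used in that test.

V_OC = 3.0             # open-circuit sensor voltage, volts
R_DIV = 1_200 + 1_200  # total divider resistance in the earlier test, ohms
V_DIV = 0.68 * 2       # 0.68 V at the midpoint => 1.36 V across the whole divider

# Solve V_DIV = V_OC * R_DIV / (r_source + R_DIV) for the source resistance.
r_source = R_DIV * (V_OC - V_DIV) / V_DIV
print(round(r_source))  # ~2894 ohms

# Predicted voltage with just one 4.7 kOhm resistor loading the probe.
r_load = 4_700
print(V_OC * r_load / (r_source + r_load))  # ~1.86 V, close to the 1.8 V target
```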
 

I'm not so sure that you can use it as an accurate linear sensor if you do this.

Do you have something else connected to that resistor divider at the middle? If you have 3VDC at the top, and GND at the bottom, and nothing else in the circuit besides the two resistors which are the same value, then at the middle, you should have half the top voltage.
 
Just a few minutes back I used two 1 MΩ resistors in series and found that the input voltage was 3 V DC, but when I measured at the junction of the two resistors it was only 1.05 V DC. Why could I not measure 1.5 V, half of the 3 V DC input?
The input impedance of the ADC (apparently also about 1 MΩ) is in parallel with the second resistor. A divider made with two 100 kΩ resistors should work much better.
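
A quick sketch of that parallel-loading effect (assuming a roughly 1 MΩ load from the meter or ADC input, and neglecting the sensor's own much smaller source impedance):

```python
# Sketch of a divider whose midpoint is loaded by a ~1 MOhm measuring
# instrument (multimeter or ADC input); the load value is an assumption.

def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

def loaded_divider(v_in, r_top, r_bottom, r_load):
    r_eff = parallel(r_bottom, r_load)  # the load sits across the bottom leg
    return v_in * r_eff / (r_top + r_eff)

V_IN = 3.0
R_LOAD = 1e6  # assumed ~1 MOhm input impedance of the meter/ADC

print(loaded_divider(V_IN, 1e6, 1e6, R_LOAD))      # ~1.0 V, close to the reported 1.05 V
print(loaded_divider(V_IN, 100e3, 100e3, R_LOAD))  # ~1.43 V, much closer to the ideal 1.5 V
```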
 

@milesguidon
Do you have something else connected to that resistor divider at the middle? If you have 3VDC at the top, and GND at the bottom, and nothing else in the circuit besides the two resistors which are the same value, then at the middle, you should have half the top voltage.

I did not have anything besides the two resistors.

But as godfreyl suggested, I connected 100 kΩ resistors and found that half the top voltage was observed. Can I proceed now with 100 kΩ resistors?
 

Ah--looks like your multimeter input impedance must be in the ballpark of 1 MegOhm. Might as well try it with the A/D now.
 
@milesguidon,
When I used 100 kΩ resistors on a breadboard it displayed exactly half the voltage, but after I soldered them onto the PCB and started measuring the analog value, it was not accurately 1.5 V. Then I replaced them with 1 MΩ resistors and it is working fine now.

Thank you for helping me.
 
