
Accuracy of analog sensors


howie

Let's say an analog water level sensor with a 0-5 V output over a 0-500 ft range is installed with a cable length of 300 ft. The sensor outputs 2.5 V, corresponding to a level of 250 ft.

How can I compensate for the voltage drop due to cable length, cable size, temperature, etc., so that the reading stays accurate?

Is there any circuit that solves this problem?

Or would the voltage drop (for any cable length) be negligible enough not to affect the voltage output much?

Thanks!
 

I advise you to do a small test yourself.
But in general, the impedance of the wire must be compared to the impedance of the sensor.
What sensor are you using? Does it have a datasheet?
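
To put a number on that comparison: the sensor's output impedance, the wire resistance, and the input impedance of the reading circuit form a voltage divider, roughly

V_measured = V_sensor * Z_in / (Z_in + R_wire + R_out)

where R_out is the sensor's output impedance and R_wire the round-trip cable resistance. As long as Z_in is a few thousand times larger than R_wire + R_out, the error stays well under 0.1% of reading.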
 

howie said:
Or would the voltage drop (for any cable length) be negligible enough not to affect the voltage output much?

How much current is running through the signal lines of the cable? Probably under 100 mA, since it's a voltage-output sensor (you can check the sensor datasheet for the maximum output current). What's the input impedance of the circuit that reads the sensor? The current, the cable resistance (which you can measure), and Ohm's law will give you the voltage drop. You can convert that to feet and see whether it's negligible.
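
Here is a rough sketch of that calculation in Python, with assumed numbers that are not from this thread (22 AWG copper at about 16 ohms per 1000 ft, a 1 Mohm input impedance on the reading circuit); substitute your real cable gauge and meter spec:

R_PER_1000FT = 16.1      # ohms per 1000 ft, 22 AWG copper (assumed)
LENGTH_FT = 300.0        # cable run from the question
R_CABLE = 2 * (LENGTH_FT / 1000.0) * R_PER_1000FT  # signal + return conductor
Z_IN = 1e6               # input impedance of the reading circuit, ohms (assumed)

V_SENSOR = 2.5           # sensor output at 250 ft of water
FULL_SCALE_V = 5.0
FULL_SCALE_FT = 500.0

# The cable resistance and the input impedance form a voltage divider.
v_measured = V_SENSOR * Z_IN / (Z_IN + R_CABLE)
v_drop = V_SENSOR - v_measured
error_ft = (v_drop / FULL_SCALE_V) * FULL_SCALE_FT

print(f"round-trip cable resistance: {R_CABLE:.1f} ohm")
print(f"voltage drop: {v_drop * 1e6:.0f} uV")
print(f"level error: {error_ft:.4f} ft")

With those assumed numbers the drop works out to roughly 25 uV, a level error of a few thousandths of a foot, so into a high-impedance input the cable is effectively negligible. It only starts to matter if the input impedance of the reading circuit is low, in the kilohm range or below.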
 

