Hi,
I have wires connected to an ADC. I would like a simple circuit that detects whether the signal is OK or the wire has come detached. In other words, I want to design a circuit that can differentiate between a disconnected wire and a genuine zero-volt signal. I would be grateful for your comments. Any idea how to design this without sacrificing the circuit's precision?
If you can design this yourself, then shift the voltage up by a known amount. That way, if the line is disconnected it will read as 0 V, whereas the lowest valid voltage will read as the offset amount.
This is one of the reasons that RS-232 and the like use ±3 to ±12 V: 0 V is not a valid signal level and so can be recognized as a broken connection. (I know that is a different context, but the idea is the same.)
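If the detection ends up in firmware rather than hardware, it reduces to a simple threshold test. A minimal sketch in C, assuming a hypothetical 12-bit ADC, a 5 V reference, a 0.5 V sender-side offset, and a read_adc_counts() stub standing in for your actual driver:

```c
#include <stdbool.h>
#include <stdint.h>

#define ADC_FULL_SCALE  4095u   /* 12-bit ADC (assumed) */
#define VREF_VOLTS      5.0f    /* ADC reference (assumed) */
#define OFFSET_VOLTS    0.5f    /* known offset added at the sender (assumed) */
#define FAULT_THRESHOLD 0.25f   /* anything below this reads as "wire off" */

extern uint16_t read_adc_counts(void);  /* stand-in for your ADC driver */

/* Writes the de-offset signal voltage to *signal and returns true,
 * or returns false when the input sits near 0 V, i.e. disconnected. */
bool read_signal(float *signal)
{
    float v = (float)read_adc_counts() * VREF_VOLTS / ADC_FULL_SCALE;

    if (v < FAULT_THRESHOLD)
        return false;               /* near 0 V: broken wire */

    *signal = v - OFFSET_VOLTS;     /* remove the known offset */
    return true;
}
```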
Susan
Thank you for your response.
I am designing only the receiver; the sender uses a standard 0-5 V or 0-10 V output. I have no problem with the 4-20 mA standard since, as you stated, its zero is shifted. My problem is with standards that use zero volts as a valid level.
On the receiver side you can use weak resistors (100 kΩ, say) to pull the signal just beyond the sender's output limits.
For a 0 V / 5 V system you might pull toward -0.2 V / +5.2 V.
Decide which of the two directions is the safe one for your application.
As long as the wire is connected, the weak resistor cannot drag the signal beyond those limits. (Keep the pull current small so it does not introduce too much voltage error: with a 1 kΩ source impedance and a 100 kΩ pull to +5.2 V, for example, the worst-case error is about 5.2 V × 1 k / 101 k ≈ 50 mV.)
But as soon as the connection breaks, the voltage goes beyond the limits, and a comparator can detect it.
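If you would rather let the ADC itself do the detecting instead of a separate comparator, the same idea becomes a window check on the raw code. A sketch in C, assuming a hypothetical front end that maps -0.2 V .. +5.2 V onto the full range of a 12-bit ADC:

```c
#include <stdint.h>

/* Assumed front end: a small level shift/attenuator maps -0.2 V .. +5.2 V
 * linearly onto the full 0 .. 4095 range of a 12-bit ADC. */
#define ADC_FULL_SCALE 4095
#define CODE_0V  ((int)(ADC_FULL_SCALE * 0.2 / 5.4))  /* ~151: input at 0 V  */
#define CODE_5V  ((int)(ADC_FULL_SCALE * 5.2 / 5.4))  /* ~3943: input at 5 V */
#define GUARD    20   /* noise guard band in codes (assumed) */

extern uint16_t read_adc_counts(void);  /* stand-in for your ADC driver */

typedef enum { SIGNAL_OK, WIRE_BROKEN } line_state_t;

/* Codes outside the 0 V .. 5 V window can only occur when the weak pull
 * resistor has taken over, i.e. when the sender wire is disconnected. */
line_state_t check_line(void)
{
    int c = read_adc_counts();

    if (c < CODE_0V - GUARD || c > CODE_5V + GUARD)
        return WIRE_BROKEN;
    return SIGNAL_OK;
}
```

The guard band trades false alarms against detection margin, so size it from your observed noise floor.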
Thank you for the response. The problem is that the line is already connected to ground through a 500 Ω resistor. I am thinking of disconnecting that resistor with an analog-switch chip, but I am not sure how much error the switch will introduce, since I am dealing with a current input. I am working on that, and I would be grateful if you share any new ideas you come up with.
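For a rough feel of the error the switch would add: in series with the 500 Ω resistor, the switch's on-resistance simply adds to the burden seen by the current input, so with an assumed on-resistance of 5 Ω (check your part's datasheet),

\[
\frac{\Delta R}{R} = \frac{R_{\text{on}}}{500\,\Omega} \approx \frac{5\,\Omega}{500\,\Omega} = 1\%
\]

and since the on-resistance varies with signal level and temperature, that 1 % is not a fixed gain error you can simply calibrate out.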