Jay_
Member level 3
Hi,
How do we ensure that, when communicating serially, we don't lose any of the data (say, the start bit) along the way? And if we do, by what means does the receiver actually 'know' this has happened?
Let's assume I have an asynchronous communication and I am sending some data:
Data : 3DE2 (Hex)
Start sequence : ASCII '@' (viz. Hex 40)
End sequence : ASCII '%' (viz. Hex 25)
So I would send: 0x403DE225. But what happens if one bit is lost in transmission? The receiver wouldn't recognize the end sequence, and it would treat the following bytes as continuing data. And what if a bit of the data itself were lost?
By what logic is the receiver supposed to be coded so that it recognizes these errors? I recently worked with Arduino sending data through XBees and encountered these problems.