ZacMan
G'day All.
I'm working on a project that involves receiving and transmitting data on a single wire serial bus. I have the documentation for the physical and message layers of the protocol, but cannot share them here as it's proprietary information.
I had a good study of AVR274 (software interrupt-driven single-wire UART) and it gave me a good jump-off point for the design. I've got a decent receiver and transmitter sorted out, and each works reasonably well on its own.
I am having a slight problem with the receiver however. My program detects transitions on the data bus, which trigger an interrupt. This interrupt looks at the program state and acts accordingly. If the program is currently receiving a message, it will re-sync the reception timer clock to the message, as the protocol uses bit stuffing to increase the clock variation tolerance. If the program state is currently idle, it will change the program state to receiving and kick the reception timer into gear.
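On real AVR hardware this logic would live in a pin-change ISR, but the flow described above can be sketched as a plain C function so it's easy to follow and test. All of the names here (`rx_state`, `on_bus_edge`, `resync_sample_timer`, and so on) are my own placeholders, not identifiers from AVR274; the timer calls are stubs standing in for programming a real hardware timer.

```c
#include <stdbool.h>

/* Receiver states -- placeholder names, not from AVR274. */
typedef enum { RX_IDLE, RX_RECEIVING } rx_state_t;

static rx_state_t rx_state = RX_IDLE;
static bool timer_running = false;

/* Stand-ins for the real timer operations: on an AVR these would
 * program a hardware timer so samples land mid-bit. */
static void start_sample_timer(void) { timer_running = true; }
static void resync_sample_timer(void) { /* reload the timer so the
                                           next sample lands mid-bit */ }

/* Called from the pin-change (edge) interrupt. */
void on_bus_edge(void)
{
    if (rx_state == RX_RECEIVING) {
        /* Bit stuffing guarantees regular edges, so every edge is a
         * chance to re-sync the sampling clock to the transmitter. */
        resync_sample_timer();
    } else {
        /* Idle: treat this edge as the start bit of a new message. */
        rx_state = RX_RECEIVING;
        start_sample_timer();
    }
}
```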
The issue I'm having is with buffer overflow. If there isn't a free buffer available when the start bit of a new message arrives, the reception timer is not started and the transition is ignored. If a buffer then becomes free partway through that message, the edge detector sees an edge mid-message, thinks it's the start of a new message, and kicks the reception timer into gear, sampling the message from the middle onwards! The result is occasional corrupt message reception.
The problem is compounded slightly, as the reception timer is coded to turn itself off at the end of message reception, as the messages are variable in length and have a fixed footer structure which indicates the end of a message.
What would the standard/best practice be here? It seems to me that there are two ways I could go:
Either I just overwrite the buffers, and maybe set some flag that signals there has been an overflow event
Or I could have the edge detector still kick the reception timer into gear even if the buffers are full, but just not store the sampled bits, instead just look for the message end footer, at which stage it turns itself off...
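The second option can be sketched like this. It's a minimal model, not a drop-in implementation: I'm assuming the footer can be recognised as a fixed bit pattern in a sliding window (`FOOTER_PATTERN` below is an arbitrary stand-in for whatever the real protocol footer is), and `buffer_available`/`store_bit` are trivial stubs for the real buffer pool.

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical footer: assumes the end-of-message marker is a fixed
 * 8-bit pattern; substitute the real footer from the protocol spec. */
#define FOOTER_PATTERN 0x7Eu

static bool discarding = false;    /* no free buffer: track, don't store */
static bool timer_running = false;
static uint8_t last_bits = 0;      /* sliding window of received bits */

/* Stubs for the real buffer pool. */
static bool buffer_free = false;
static bool buffer_available(void) { return buffer_free; }
static int  bits_stored = 0;
static void store_bit(bool bit) { (void)bit; bits_stored++; }

static void stop_sample_timer(void) { timer_running = false; }

/* Edge ISR at start of message: start the timer even when no buffer
 * is free, but flag the message as one to be discarded. */
void on_start_edge(void)
{
    discarding = !buffer_available();
    timer_running = true;
}

/* Sample-timer ISR: called once per received bit. */
void on_sample(bool bit)
{
    if (!discarding)
        store_bit(bit);

    /* Track the last 8 bits so the footer is spotted even while
     * discarding, and the timer still turns itself off on time. */
    last_bits = (uint8_t)((last_bits << 1) | (bit ? 1u : 0u));
    if (last_bits == FOOTER_PATTERN) {
        stop_sample_timer();
        discarding = false;
    }
}
```

The appeal of this approach is that the receiver's notion of "where are we in the message" stays correct even when storage is exhausted, so a buffer freeing up mid-message can't be mistaken for a new start bit.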
Help me internets! I can't decide which way to go.
Cheers.