im&u
Hi!
I'm designing a 1553 decoder.
The 1553 bit rate is as described here: https://en.wikipedia.org/wiki/MIL-STD-1553
The bit rate is 1.0 megabit per second (1 bit per μs). The combined accuracy and long-term stability of the bit rate is only specified to be within ±0.1%; the short-term clock stability must be within ±0.01%
I am trying to find a way to handle this accuracy in RTL, but I don't understand this requirement well.
Here is how I see it:
1 Mb/s ±0.1% (worst case) => 1 bit every 1 µs ±1 ns
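Just to make my reading of the numbers explicit (a quick sanity check in Python, only to illustrate how I interpret the spec, not part of the design):

Code:
# How I read the tolerance figures from the standard (illustration only)
nominal_bit_ns = 1000.0                      # 1 Mb/s -> 1 us per bit
long_term_tol_ns  = nominal_bit_ns * 0.001   # +/-0.1 %  -> +/-1 ns
short_term_tol_ns = nominal_bit_ns * 0.0001  # +/-0.01 % -> +/-0.1 ns
print(long_term_tol_ns, short_term_tol_ns)   # 1.0 0.1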
Here is what I did:
I chose to sample the data with an 8 MHz clock, and I tolerate a deviation of 1 clock cycle (125 ns).
This means that I will consider a bit period of 1 µs ±125 ns as valid.
For example, 1 µs - 125 ns = 875 ns, but this is outside the ±0.1% (±1 ns) required by the standard.
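To show what I mean, here is a rough behavioural sketch of the check I have in mind (Python used as pseudo-code for the RTL; the names are made up for illustration, this is not my actual decoder):

Code:
SAMPLE_CLK_HZ = 8_000_000                     # 8 MHz sampling clock
SAMPLE_PERIOD_NS = 1e9 / SAMPLE_CLK_HZ        # 125 ns per sample
NOMINAL_BIT_NS = 1000                         # 1 Mb/s -> 1 us bit period
NOMINAL_COUNT = round(NOMINAL_BIT_NS / SAMPLE_PERIOD_NS)  # 8 samples per bit

def bit_period_ok(measured_cycles, tolerance_cycles=1):
    # Accept a bit period measured in sampling-clock cycles if it is within
    # +/- tolerance_cycles of the nominal 8 cycles, i.e. 1 us +/- 125 ns.
    return abs(measured_cycles - NOMINAL_COUNT) <= tolerance_cycles

print(bit_period_ok(8))   # True : 1000 ns, nominal
print(bit_period_ok(7))   # True : 875 ns, accepted, yet far outside +/-0.1% (1 ns)
print(bit_period_ok(6))   # False: 750 ns, rejected

So with this check, anything between 875 ns and 1125 ns is accepted, which seems much looser than the ±0.1% the standard talks about.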
I am confused.
Any idea/suggestion/link will be helpful.