neoflash
Advanced Member level 1
Quoted from some lecture notes:
A channel with a BER of 10^-7 and an average burst size of 1000 bits is very different from one with independent random errors
• Example: For an average frame length of 10^4 bits
– random channel: E[Frame error rate] ~ 10^-3
– burst channel: E[Frame error rate] ~ 10^-6
Why does the burst channel have a lower frame error rate? Is it because most of the errored bits are concentrated in a single frame, so fewer frames are hit overall and the total frame error rate is lower?
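The intuition can be checked with a quick back-of-the-envelope calculation. With independent errors, every errored bit tends to land in a different frame, so the frame error rate is roughly (frame length) × BER. With bursts, ~1000 errored bits are clumped together, so one error *event* ruins roughly one frame (two if it straddles a boundary), and the rate of events per bit is BER divided by the burst length. A minimal sketch using the numbers from the notes (the one-burst-kills-one-frame model is an illustrative assumption):

```python
BER = 1e-7        # bit error rate, same for both channels
FRAME_LEN = 1e4   # bits per frame (from the example)
BURST_LEN = 1e3   # average bits per burst (burst channel)

# Random channel: bits err independently, so a frame is bad if any
# of its FRAME_LEN bits errs.
fer_random = 1 - (1 - BER) ** FRAME_LEN   # ~ FRAME_LEN * BER = 1e-3

# Burst channel: the same long-run BER is delivered in clumps of
# ~BURST_LEN bits, so error events occur at rate BER / BURST_LEN
# per bit, and each event corrupts roughly one frame.
burst_rate = BER / BURST_LEN              # events per bit = 1e-10
fer_burst = FRAME_LEN * burst_rate        # ~ 1e-6

print(f"random channel FER ~ {fer_random:.1e}")
print(f"burst  channel FER ~ {fer_burst:.1e}")
```

So yes: the burst channel makes the same total number of bit errors, but piles them into far fewer frames, which is why its frame error rate is about three orders of magnitude lower here.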