itsthetimetodisco
Member level 4
Hello,
Can somebody point out a way to simulate channel errors for RTP packetized data?
Specifically, I have a bitstream consisting of roughly 300 to 400 RTP packets, and I need a way to introduce loss so that some number of packets are randomly dropped. I would prefer to do this using a 2-state Markov model.
Any pointers please?
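For reference, the 2-state Markov model described above is usually called the Gilbert-Elliott model: a Good state and a Bad state, with per-packet transition probabilities between them and a loss probability attached to each state. Here is a minimal sketch in Python; the function name and the default parameter values (`p_gb`, `p_bg`, the state loss probabilities) are illustrative assumptions, not part of any RTP tool or standard API:

```python
import random

def gilbert_elliott_losses(num_packets, p_gb=0.05, p_bg=0.4,
                           loss_good=0.0, loss_bad=1.0, seed=None):
    """Simulate per-packet loss with a 2-state (Gilbert-Elliott) Markov chain.

    p_gb: probability of moving Good -> Bad after each packet (assumed value)
    p_bg: probability of moving Bad -> Good after each packet (assumed value)
    loss_good / loss_bad: loss probability while in the Good / Bad state
    Returns a list of booleans, one per packet; True means the packet is lost.
    """
    rng = random.Random(seed)
    state_bad = False          # start in the Good state
    lost = []
    for _ in range(num_packets):
        # decide whether this packet is lost in the current state
        p_loss = loss_bad if state_bad else loss_good
        lost.append(rng.random() < p_loss)
        # state transition for the next packet
        if state_bad:
            if rng.random() < p_bg:
                state_bad = False
        else:
            if rng.random() < p_gb:
                state_bad = True
    return lost

# Example: generate a loss pattern for a stream of 350 RTP packets,
# then keep only the indices of packets that survived the channel.
loss_pattern = gilbert_elliott_losses(350, seed=1)
surviving = [i for i, dropped in enumerate(loss_pattern) if not dropped]
```

Applying this to your bitstream would just mean skipping (not transmitting or not decoding) the RTP packets whose index is marked `True`. The average loss rate and burst length are controlled by the four probabilities; the mean burst length in the Bad state is 1/p_bg packets.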