Hi everyone,
In a wireless scenario, say a source starts transmitting data (+1, -1, 0) to a receiver, and there exist many other nodes that are also transmitting (say +1, -1, 0) and are within range of the receiver.
My queries are, supposing the source transmits +1:
1. Do the other nodes become interferers if they all transmit +1 as well?
2. With the same scenario as above, does the BER (bit error rate) increase if I reduce the number of simultaneous transmissions (with all of them transmitting +1 as well)?
3. Do collisions at the receiver play any role, or are they the same as interference?
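To make the scenario concrete, here is a minimal Monte Carlo sketch of it in Python: one source sending ±1, several co-channel interferers, and Gaussian noise at the receiver. All the amplitudes and the noise level are illustrative assumptions of mine, not values from this thread.

```python
import math
import random

def simulate_ber(num_interferers, interferer_amp=0.2, noise_std=0.5,
                 num_bits=100_000, seed=1):
    """Monte Carlo BER estimate: a +/-1 source corrupted by
    `num_interferers` co-channel nodes (each independently sending
    +1 or -1 with equal probability) plus Gaussian noise.
    All parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(num_bits):
        bit = rng.choice((+1, -1))                  # source symbol
        interference = sum(interferer_amp * rng.choice((+1, -1))
                           for _ in range(num_interferers))
        received = bit + interference + rng.gauss(0.0, noise_std)
        if (received >= 0) != (bit == +1):          # sign detector errs
            errors += 1
    return errors / num_bits

ber_many = simulate_ber(num_interferers=8)
ber_few = simulate_ber(num_interferers=2)
```

With these (assumed) parameters, the run with fewer simultaneous interferers comes out with the lower error rate, which is the intuition behind question 2.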
haisrilatha said:
In a wireless scenario, say a source starts transmitting data (+1, -1, 0) to a receiver, and there exist many other nodes that are also transmitting (say +1, -1, 0) and are within range of the receiver.
My query is, supposing the source transmits +1: 1. Do the other nodes become interferers if they all transmit +1 as well?
If they all transmit at the same frequency, time, or code, then yes.
haisrilatha said:
2. With the same scenario as above, does the BER (bit error rate) increase if I reduce the number of simultaneous transmissions (with all of them transmitting +1 as well)?
That is an incorrect way to look at it. You can never be sure that all of them transmit +1, so do not assume it; they might well be transmitting -1. You need to design your system for the worst-case scenario.
haisrilatha said:
3. Do collisions at the receiver play any role, or are they the same as interference?
So for the second question, can I proceed by taking the worst case, i.e. considering that every node transmits -1? Then the BER will decrease if I reduce the number of interferers. Is this analysis correct?
I have seen a formula for the BER:

BER = Σ P{ signal energy + (number of interfering nodes × interferer energy) + Gaussian noise < 0 | interference vector } × P{ interference vector }

where P{x | y} is the probability of x given y, and the signal energy is the one when the source transmits +1. The conditional probability is

P{ signal energy + (number of interfering nodes × interferer energy) + Gaussian noise < 0 | interference vector } = Q( (signal energy + number of interfering nodes × interferer energy) / σ )

where Q is the Gaussian Q-function and σ is the noise standard deviation.
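The formula above can be sketched directly in Python: enumerate every equally likely interference vector b ∈ {+1, -1}^K, evaluate Q of the conditional decision statistic, and weight by the vector's probability. The amplitudes and σ are illustrative assumptions, and Q(x) is computed from the complementary error function as Q(x) = ½·erfc(x/√2).

```python
import math
from itertools import product

def q_func(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def exact_ber(num_interferers, signal_amp=1.0, interferer_amp=0.2,
              noise_std=0.5):
    """BER from the posted formula: sum over all interference vectors
    b in {+1,-1}^K of Q((signal + a * sum(b)) / sigma) * P{b},
    with each vector equally likely. Parameter values are
    illustrative assumptions, not from the thread."""
    ber = 0.0
    p_vec = 1.0 / (2 ** num_interferers)      # P{interference vector}
    for b in product((+1, -1), repeat=num_interferers):
        total = signal_amp + interferer_amp * sum(b)
        ber += q_func(total / noise_std) * p_vec
    return ber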
My doubt is this: if I use some good specific MAC (medium access control) protocol, then the number of interfering nodes will be reduced, which normally should reduce the BER.
But only if I take the worst case does the argument of the Q function clearly show the BER being reduced; otherwise it does not.
So am I proceeding in the right way? That is, when I reduce the number of interferers, the BER should reduce, and should that always hold?
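Under the worst-case assumption, the monotonic behaviour is easy to check numerically: every interferer opposes the source, so its amplitude subtracts from the signal inside the Q function, and since Q is strictly decreasing, fewer interferers means a larger argument and a lower BER. A minimal sketch, with the same illustrative amplitude and noise assumptions as above:

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def worst_case_ber(num_interferers, signal_amp=1.0,
                   interferer_amp=0.2, noise_std=0.5):
    """Worst case: every interferer transmits the opposite of the
    source, so its amplitude subtracts from the useful signal.
    All parameter values are illustrative assumptions."""
    effective = signal_amp - num_interferers * interferer_amp
    return q_func(effective / noise_std)

# BER for 0..5 worst-case interferers: strictly increasing with K,
# i.e. removing interferers always lowers the worst-case BER here.
bers = [worst_case_ber(k) for k in range(6)]
```

Note this monotonicity is guaranteed for the worst case; with random ±1 interferers the averaged BER also tends to fall as K shrinks, but that follows from averaging the Q terms, not from a single Q argument.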