# Error rate vs. bandwidth

1. ## Error rate vs. bandwidth

Hi,

The attached chart is Bit Error Probability vs. IF SNR. It seems the wider IF bandwidth (1.5x) tends to have a higher error rate. This goes against my understanding that more bandwidth is better. Can anyone explain it?

Senmeis

2. ## Re: Error rate vs. bandwidth

This looks like a very carelessly produced graph (e.g. what is "Bit Rage" supposed to mean?).

Perhaps the x-axis is supposed to be related to bit rate. Given a fixed baud rate, higher bit rates will usually lead to higher bit error probability (e.g. think about 64-QAM vs BPSK).

However, I usually find that it is not worth wasting time trying to understand badly presented information like this. If the author isn't even capable of producing a basic graph, there is a good chance that whatever they are trying to tell you is unhelpful or incorrect.

3. ## Re: Error rate vs. bandwidth

The diagram is confusing. The X-axis is the SNR when the bandwidth = bit rate. When the IF is widened to 1.5 * bit rate, the noise power increases by 1.76 dB (assuming white noise), and the actual SNR goes down if the bandwidth was already wide enough to pass most of the wanted signal.
The diagram basically shows that there is an optimum IF bandwidth for a certain bit rate. If you decrease (obvious, but not shown in the diagram) or increase the IF BW from the optimum value, the error probability goes up.
The optimum bandwidth for a bit rate depends on the modulation method (NRZ-L PCM/FM in this case).
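The 1.76 dB figure is just the noise power ratio 10·log10(1.5): white noise power scales linearly with bandwidth, so widening the IF by a factor of 1.5 raises the noise floor by that amount. A one-line check in Python:

```python
import math

# White noise power N = N0 * B scales linearly with bandwidth B,
# so widening the IF from B to 1.5 * B raises the noise power by:
noise_increase_db = 10 * math.log10(1.5)
print(f"{noise_increase_db:.2f} dB")  # -> 1.76 dB
```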

1 member found this post helpful.

4. ## Re: Error rate vs. bandwidth

> Originally Posted by std_match
> The X-axis is the SNR when the bandwidth = bit rate.
I'm still confused. What bandwidth are we talking about on the x-axis? I'm confused that at the top we have "IF bandwidth = 1.5 times bit rate" and at the bottom we have "bandwidth = bit rate".

Can you describe, for example, what the data point at the top tells us? It looks to me like "the bit error probability when the IF SNR is ~9 dB (with bandwidth = bit rate) is ~7e-3 when bandwidth = 1.5x bit rate" ... which is obviously nonsense.

5. ## Re: Error rate vs. bandwidth

I think the X-axis only shows the SNR for the case "bandwidth = bit rate". The other curve "bandwidth = 1.5 bit rate" can only be used to see the change in error probability when the IF BW is changed.
The SNR will also change, but that value can't be read from the diagram. We can draw the conclusion that the SNR gets worse, since the error probability increases.

To really understand the diagram I think we also need a diagram that plots the SNR as a function of the IF BW. I expect a peak not far from BW = bit rate.
If the wanted signal already "fits" inside the BW, nothing is gained by increasing the BW. The added signal will mostly be noise, so the SNR will be worse.
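The argument above can be sketched numerically. This is my own toy model, not from the thread or the standard: random NRZ data with a sinc² power spectrum, an ideal brick-wall IF filter of two-sided width B, and white noise whose power grows linearly with B. It ignores intersymbol interference, so it only illustrates the "too wide" side of the optimum:

```python
import math

def nrz_fraction(bw, bit_rate=1.0, steps=20000):
    """Fraction of NRZ signal power inside a brick-wall filter of
    two-sided width bw, using the sinc^2 PSD of random NRZ data."""
    def psd(f):
        x = f / bit_rate
        if x == 0.0:
            return 1.0
        s = math.sin(math.pi * x) / (math.pi * x)
        return s * s

    def integrate(lo, hi):
        # simple trapezoidal rule
        h = (hi - lo) / steps
        total = 0.5 * (psd(lo) + psd(hi))
        for i in range(1, steps):
            total += psd(lo + i * h)
        return total * h

    # normalize by the power in a very wide band (~ total power)
    return integrate(0.0, bw / 2.0) / integrate(0.0, 50.0 * bit_rate)

# Relative SNR: captured signal power / noise power (noise ~ bw)
ref = nrz_fraction(1.0) / 1.0
for bw in (1.0, 1.5, 2.0):
    snr = nrz_fraction(bw) / bw
    print(f"B = {bw:.1f} x bit rate: SNR change = "
          f"{10 * math.log10(snr / ref):+.2f} dB")
```

In this model the captured signal power rises from roughly 77% at B = bit rate to about 90% at B = 2x bit rate, but the noise grows faster, so the SNR falls as B increases past the bit rate, consistent with the higher error probability of the 1.5x curve in the chart.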

1 member found this post helpful.

6. ## Re: Error rate vs. bandwidth

I attached the entire page including this chart. It is an annex of a standard. It seems std_match is correct in terms of the optimal IF bandwidth:

> The IF bandwidth, for data receivers, should typically be selected so that 90 to 99 percent of the transmitted power spectrum is within the receiver 3-decibel (dB) bandwidth.

I suppose the 1.5x bandwidth leads to a higher error rate because more noise is introduced. Is that correct?
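To put rough numbers on that intuition, the sketch below uses coherent BPSK, purely because it has a simple closed-form BER; the chart's NRZ-L PCM/FM behaves differently, so only the direction of the effect carries over. A 1.5x wider noise bandwidth costs 10·log10(1.5) ≈ 1.76 dB of effective SNR, which raises the BER substantially at SNRs like those on the chart:

```python
import math

def q(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def ber_bpsk(ebn0_db):
    """BER of coherent BPSK: Q(sqrt(2 * Eb/N0)). Used here only as
    an illustrative modulation; the chart is NRZ-L PCM/FM."""
    ebn0 = 10 ** (ebn0_db / 10)
    return q(math.sqrt(2 * ebn0))

snr_db = 9.0
loss_db = 10 * math.log10(1.5)   # noise increase for 1.5x bandwidth
print(f"BER at {snr_db:.2f} dB: {ber_bpsk(snr_db):.2e}")
print(f"BER at {snr_db - loss_db:.2f} dB: {ber_bpsk(snr_db - loss_db):.2e}")
```

With this (assumed) modulation, the 1.76 dB loss increases the BER by more than an order of magnitude at 9 dB, which is the same qualitative behavior the two curves in the chart show.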

Senmeis