
# Coherence bandwidth and delay spread


#### luckyvictor

Hi all

My understanding of coherence bandwidth is that it is the portion of the channel's frequency spectrum over which the response is approximately flat. How can that be related to delay spread?

Thanks

Hello,

Imagine a transfer function whose magnitude is flat, but whose phase isn't linear with frequency.

If the phase is linear with frequency, your filter behaves just like a delay with the same delay time for all frequencies, so coherence is maintained.

When the phase doesn't change linearly with frequency, the delay time becomes frequency dependent. You may search for "group delay" (-d(phi)/d(omega)). So there will be a spread in the delay, depending on frequency.

A wideband signal fed through a flat transfer function with non-constant group delay within the passband will spread out in time, hence reducing coherence.
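To make this concrete, here is a small numerical sketch in Python (the phase responses and numbers are my own illustrative assumptions, not from this thread). It compares the group delay -d(phi)/d(omega) of a linear-phase channel with that of a channel whose phase has an extra quadratic term:

```python
import numpy as np

# Illustrative sketch: group delay -d(phi)/d(omega) for a linear-phase
# channel (a pure delay) vs. a channel with a quadratic phase term,
# which makes the delay frequency dependent.

f = np.linspace(1e6, 10e6, 1000)            # frequency axis [Hz] (assumed)
omega = 2 * np.pi * f

tau = 2e-6                                  # 2 us pure delay (assumed)
phi_linear = -omega * tau                   # linear phase -> constant delay
phi_quad = -omega * tau - 1e-14 * omega**2  # quadratic term -> delay varies

# numerical group delay: -d(phi)/d(omega)
gd_linear = -np.gradient(phi_linear, omega)
gd_quad = -np.gradient(phi_quad, omega)

print(f"linear phase:    delay spread = {gd_linear.max() - gd_linear.min():.3e} s")
print(f"quadratic phase: delay spread = {gd_quad.max() - gd_quad.min():.3e} s")
```

The linear-phase case shows essentially zero spread (every frequency is delayed by the same 2 us), while the quadratic-phase case spreads the delay over roughly a microsecond across the band.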

Wim

The channel impulse response transforms in the frequency domain into a summation of complex exponentials. The magnitude spectrum has spectral peaks that are quasi-constant over a minimum band equal to the inverse of the maximum delay (the delay spread); the same holds for the phase spectrum, which is linear only within that band. The coherence bandwidth statistically quantifies this band and is therefore a function of the delay spread.
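A minimal numerical sketch of this inverse relation (the two-path channel and its values are my own illustrative assumptions): the frequency response of h(t) = δ(t) + 0.5·δ(t − τ) repeats every 1/τ, so the band over which it stays roughly flat is tied to the inverse of the delay spread.

```python
import numpy as np

# Two-path channel: h(t) = delta(t) + 0.5*delta(t - tau_max).
# Its frequency response H(f) = 1 + 0.5*exp(-j*2*pi*f*tau_max) is
# periodic in f with period 1/tau_max, so the response can only stay
# "flat" over a fraction of 1/tau_max (the coherence bandwidth).

tau_max = 1e-6                      # 1 us maximum excess delay (assumed)
f = np.linspace(0, 5 / tau_max, 2001)
H = 1 + 0.5 * np.exp(-1j * 2 * np.pi * f * tau_max)

mag = np.abs(H)
period = 1 / tau_max                # fading pattern repeats every 1 MHz here
print(f"|H| swings between {mag.min():.2f} and {mag.max():.2f}")
print(f"fade-to-fade spacing = {period/1e6:.1f} MHz = 1/(delay spread)")
```

Doubling tau_max halves the fade spacing, i.e., a larger delay spread means a smaller coherence bandwidth.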


Okay. Consider this: I send a narrow frequency range through the channel, so that it's within the coherence bandwidth of the channel. It still has some delay spread, mainly because of multipath propagation. This means that even though the signal bandwidth is within the coherence bandwidth, it has still suffered delay spread.
Now take another situation: I send a wide frequency range through the channel, wider than the coherence bandwidth, so it should suffer. But there are no buildings along its propagation path, so there is no multipath and hence no delay spread.
I mean, delay spread is mainly caused by multipath propagation, so how can you relate it to the channel bandwidth?

Hello,

If you have multipath propagation (which also means delay spread), the channel isn't flat anymore. Only when the multipath delay is small with respect to the symbol duration can the channel be considered flat and coherence maintained.

You are right that delay spread is caused by multipath. When the delay spread is no longer small with respect to the symbol duration, energy leaks from one symbol into adjacent symbols, making reliable communication impossible (unless you use echo cancellation).

(C)OFDM uses a very low symbol rate per subcarrier to make sure that the delay spread is always small with respect to the symbol duration.
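A back-of-the-envelope sketch of that OFDM point (the bandwidth, subcarrier count, and delay spread below are assumed, DVB-T-like illustrative values, not from this thread):

```python
# OFDM splits a wide band into many narrow subcarriers, so each
# subcarrier carries a much longer symbol than a single-carrier
# system using the same total bandwidth would.

bandwidth = 8e6          # total channel bandwidth [Hz] (assumed)
n_subcarriers = 8192     # "8k" mode subcarrier count (assumed)
delay_spread = 5e-6      # 5 us multipath delay spread (assumed)

# a single-carrier symbol would last roughly 1/bandwidth
t_single = 1 / bandwidth
# each OFDM subcarrier symbol lasts n_subcarriers times longer
t_ofdm = n_subcarriers / bandwidth

print(f"single-carrier symbol: {t_single*1e6:.3f} us (shorter than the delay spread)")
print(f"OFDM symbol:           {t_ofdm*1e6:.0f} us (delay spread is small w.r.t. this)")
```

With these numbers a single-carrier symbol (0.125 us) would be buried under 5 us of multipath echoes, while the OFDM symbol (about 1 ms) barely notices them.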

You say the phase should be linear with frequency for there to be no delay spread. But shouldn't the phases be the same within the coherence bandwidth?


Hello,

Do you agree that a pure delay will not harm coherence between input and output signals?

You can have a very long single delay (for example, a line-of-sight space path, ignoring the ionosphere). The phase shift due to the delay is: Phase = -(time delay)*2*pi*f [radians].

So changing the frequency gives a different phase shift. As long as the phase shift is linear with frequency and the amplitude remains the same, you have maximum coherence between input and output. Zero phase shift at every frequency is also linear with frequency (it corresponds to zero delay, i.e., a zero-length cable).
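A quick numerical check of this in Python (the delay and test frequencies are illustrative assumptions): the phase of a pure delay is linear in f, with the same slope −2πτ everywhere, so all frequencies are delayed equally.

```python
import numpy as np

# Phase of a pure delay: phi = -2*pi*f*tau. The slope d(phi)/df is the
# constant -2*pi*tau, so every frequency sees the same delay and the
# waveform shape (coherence) is preserved.

tau = 3e-6                        # 3 us line-of-sight delay (assumed)
f = np.array([1e6, 2e6, 3e6])     # three test frequencies [Hz] (assumed)
phase = -2 * np.pi * f * tau      # [radians]

slope = np.diff(phase) / np.diff(f)
print("phase slope per Hz:", slope)   # constant -2*pi*tau for a pure delay
```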

### iVenky
Okay, understood. This means that even if two frequencies are coherent, they both undergo a delay, but there is no delay spread between them. If the phase does not vary linearly with frequency, there is a delay spread between the two, and hence the signal spreads out at the receiver.

I think your summary is fully correct.
