Re: Slow Fading
sinu_gowde said:
Q: What does "delay constraint for the user" mean?
Ans: Delay, with respect to fading, works like this: when a signal propagates in all directions and reaches the receiver antenna, the antenna picks up copies of the same signal from different paths. Some copies arrive a fraction of a second later, i.e., with some delay. The combined effect of all these copies is what causes signal fading.
I think you now get the delay concept involved here: the signal copies that arrive after some time duration are the delayed candidates.
This answer is totally misleading!!!
Most of the delay in digital communication comes from processing delay, not propagation delay. For example, in wireless communications one needs interleaving to get good coding performance, and the interleaver introduces delay, as does the deinterleaver. In a 3G system, the interleaver alone may introduce tens of ms of delay.
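To put a rough number on that, here is a minimal sketch of the buffering delay of a rectangular block interleaver. The dimensions and symbol rate are illustrative choices, not taken from any specific standard:

```python
# Sketch: end-to-end delay added by a rectangular block interleaver.
# Assumption (illustrative): a rows x cols interleaver must buffer a
# full block before reading it out, and the deinterleaver does the same.

def interleaver_delay_ms(rows, cols, symbol_rate_hz):
    """Delay in ms from buffering one full block at each end."""
    block_symbols = rows * cols
    one_side_s = block_symbols / symbol_rate_hz
    return 2 * one_side_s * 1e3  # interleaver + deinterleaver

# e.g. a 32x30 block at 38.4 ksym/s (made-up but plausible numbers):
print(f"{interleaver_delay_ms(32, 30, 38_400):.1f} ms")  # 50.0 ms
```

With these numbers the buffering alone costs tens of ms, which is consistent with the 3G-era figure quoted above.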
What is the delay constraint?
It depends on the QoS, which in turn depends on the type of service. For voice, people tolerate hundreds of ms; for data services, on the other hand, tens of seconds are OK.
Why is the delay constraint an indicator of slow/fast fading? Why is it relevant?
The answer is partially due to channel coding: if the delay tolerance is long, we can rely on strong, long channel codes to operate close to the Shannon limit. In information-theoretic jargon, we can achieve the ergodic capacity. On the contrary, if the delay budget is tight, we cannot use long codes and consequently cannot operate close to the capacity limit. In that case another concept, outage capacity, is more appropriate.
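The distinction is easy to see numerically. Below is a Monte Carlo sketch for a flat Rayleigh-fading channel with per-realization rate C = log2(1 + SNR*|h|^2), where |h|^2 is exponentially distributed; the SNR and outage target are illustrative choices:

```python
# Monte Carlo sketch: ergodic vs. outage capacity over flat Rayleigh
# fading. All parameters (SNR, outage target) are illustrative.
import math
import random

random.seed(0)
snr = 10.0  # average SNR, linear scale (10 dB)
samples = [math.log2(1 + snr * random.expovariate(1.0))
           for _ in range(200_000)]

# Ergodic capacity: the long-term average rate a long code can exploit
# by averaging over many fading states.
ergodic = sum(samples) / len(samples)

# 1%-outage capacity: the rate the channel supports in 99% of fading
# states; a short code must back off to roughly this rate.
samples.sort()
outage = samples[int(0.01 * len(samples))]

print(f"ergodic capacity   ~ {ergodic:.2f} bit/s/Hz")
print(f"1%-outage capacity ~ {outage:.2f} bit/s/Hz")
```

At 10 dB average SNR the ergodic capacity comes out near 2.9 bit/s/Hz while the 1%-outage capacity is an order of magnitude lower, which is exactly why the delay budget (and hence the allowable code length) changes which capacity notion applies.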
urwelcome said:
Ok.
In Rappaport's book slow fading is defined as:
Coherence time > symbol period
In David Tse’s book this is:
Coherence time > Delay constraint of user
Now this means that if the delay constraint of a user is small, say one microsecond, and the coherence time is two microseconds, then this is slow fading.
Does a one-microsecond delay constraint then mean that, at the receiving end, any reflected signal arriving after that time will be rejected?
Is that so?
Thanks..
David Tse's interpretation makes sense! The other one is useless. According to Rappaport's definition, almost all practical scenarios (typical channel-model parameters encountered in reality, at the symbol durations used in practical systems) come out as slow fading, which makes the definition nearly vacuous: it doesn't shed any light on understanding the theory.
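The two definitions can be compared side by side. A minimal sketch, using the common rule of thumb Tc ~ 1/(4*Doppler spread) (the exact constant varies by textbook) and illustrative numbers for the symbol period and the voice delay budget:

```python
# Sketch comparing the two "slow fading" tests from the thread.
# The coherence-time constant and all timescales are illustrative.

def coherence_time_s(doppler_hz):
    """Rule-of-thumb coherence time, Tc ~ 1/(4*Doppler spread)."""
    return 1.0 / (4.0 * doppler_hz)

def fading_regime(tc_s, timescale_s):
    """Slow if the channel stays roughly constant over the timescale."""
    return "slow" if tc_s > timescale_s else "fast"

tc = coherence_time_s(doppler_hz=100.0)  # -> 2.5 ms coherence time
symbol_period = 1e-6                     # 1 us symbol (Rappaport's test)
delay_constraint = 20e-3                 # 20 ms voice budget (Tse's test)

print(fading_regime(tc, symbol_period))     # slow
print(fading_regime(tc, delay_constraint))  # fast
```

With the same channel, Rappaport's test (against the symbol period) declares slow fading, while Tse's test (against the delay constraint) declares fast fading, so only the latter actually discriminates between practical scenarios.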