Hi,
By definition, channel capacity (C) is the maximum data rate at which reliable transmission of information over the channel is possible.
At rates R < C reliable transmission of information over the channel is possible, while at rates R > C it is not.
Shannon's formula states that:

C = max_{p(x)} I(X;Y)

where I(X;Y) denotes the mutual information between X (channel input) and Y (channel output), and the maximization is carried out over all input probability distributions p(x) of the channel.
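To make the formula concrete, here is a small sketch for the binary symmetric channel (the BSC is my example, not something from the post above): for crossover probability ε the maximizing input distribution is uniform, and the maximum works out to C = 1 - H2(ε) bits per channel use.

```python
import math

def h2(p):
    """Binary entropy function H2(p) in bits."""
    if p == 0 or p == 1:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover probability eps.

    For the BSC the maximizing p(x) is uniform, so C = 1 - H2(eps).
    """
    return 1.0 - h2(eps)

print(bsc_capacity(0.0))  # noiseless channel → 1.0 bit per use
print(bsc_capacity(0.5))  # completely noisy channel → 0.0 bits
```

Note how the two extremes match intuition: a noiseless channel carries one full bit per use, while a channel that flips bits half the time carries nothing.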
The mutual information between two random variables X and Y is defined as:

I(X;Y) = ∑x ∑y p(x) p(y|x) log[ p(x,y) / ( p(x) p(y) ) ]

where the mutual information is in bits when the logarithm is taken in base 2.
The channel itself determines the conditional probability p(y|x) for each particular input x and output y.
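The formula above can be evaluated directly for any discrete channel. A minimal sketch (the function name and the BSC example are mine, not from the post): given the input distribution p(x) and the channel matrix p(y|x), first compute the output marginal p(y), then sum the weighted log-ratios.

```python
import math

def mutual_information(p_x, p_y_given_x):
    """I(X;Y) in bits, for input distribution p_x and channel matrix
    p_y_given_x[i][j] = P(Y=j | X=i)."""
    nx, ny = len(p_x), len(p_y_given_x[0])
    # Output marginal: p(y) = sum_x p(x) p(y|x)
    p_y = [sum(p_x[i] * p_y_given_x[i][j] for i in range(nx))
           for j in range(ny)]
    mi = 0.0
    for i in range(nx):
        for j in range(ny):
            px, pyx = p_x[i], p_y_given_x[i][j]
            if px > 0 and pyx > 0:
                # p(x,y) = p(x) p(y|x), and p(x,y)/(p(x)p(y)) = p(y|x)/p(y)
                mi += px * pyx * math.log2(pyx / p_y[j])
    return mi

# BSC with crossover 0.1 and uniform input:
print(mutual_information([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]]))
# ≈ 0.531 bits, i.e. 1 - H2(0.1)
```

A deterministic input (all mass on one symbol) gives I(X;Y) = 0, as expected: the output reveals nothing we did not already know.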
Mutual information is defined as the logarithm of the ratio of the a posteriori to the a priori probability, i.e., log[ P(x|y) / P(x) ]. Clearly, mutual information is itself a random variable, with its own mean, variance, etc. The mean value is given by:

I(X;Y) = ∑x ∑y P(x,y) log[ P(x|y) / P(x) ]
Note that I(X;Y) depends on both the channel P(y|x) and the input distribution P(x).
Channel capacity goes one step further than mutual information by maximizing over all possible P(x). This makes sense in that we have control over the distribution of information bits out of the channel encoder.
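For a general discrete memoryless channel this maximization over P(x) has no closed form, but it can be done numerically. The thread doesn't mention it, but the standard iterative method is the Blahut-Arimoto algorithm; here is a minimal sketch (all names and the convergence details are my assumptions, not from the posts above):

```python
import math

def blahut_arimoto(p_y_given_x, iters=500):
    """Numerically maximize I(X;Y) over the input distribution p(x)
    for a discrete memoryless channel (Blahut-Arimoto algorithm).
    Returns (capacity_in_bits, optimizing_p_x)."""
    nx, ny = len(p_y_given_x), len(p_y_given_x[0])
    p_x = [1.0 / nx] * nx  # start from the uniform distribution
    for _ in range(iters):
        # Output marginal induced by the current p(x)
        p_y = [sum(p_x[i] * p_y_given_x[i][j] for i in range(nx))
               for j in range(ny)]
        # c(x) = exp( sum_y P(y|x) ln[ P(y|x) / p(y) ] )
        c = []
        for i in range(nx):
            s = sum(p_y_given_x[i][j] * math.log(p_y_given_x[i][j] / p_y[j])
                    for j in range(ny) if p_y_given_x[i][j] > 0)
            c.append(math.exp(s))
        # Reweight p(x) toward inputs with larger c(x)
        z = sum(p_x[i] * c[i] for i in range(nx))
        p_x = [p_x[i] * c[i] / z for i in range(nx)]
    # At convergence, log2(z) equals the capacity in bits
    return math.log2(z), p_x

cap, p_opt = blahut_arimoto([[0.9, 0.1], [0.1, 0.9]])
print(cap)    # ≈ 0.531 bits, matching 1 - H2(0.1) for the BSC
print(p_opt)  # ≈ [0.5, 0.5]: uniform input is optimal for the BSC
```

The iteration alternately recomputes the output marginal and tilts p(x) toward inputs whose output distributions are far (in KL divergence) from the current marginal, which is exactly what maximizing I(X;Y) rewards.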
I have to say the above explanations are taken from Gallager's book.