
Mutual Information vs. Channel Capacity


malaylah

Is there any difference between mutual information and channel capacity?
 

Hi,
By definition, channel capacity (C) is the maximum data rate at which reliable transmission of information over the channel is possible.
At rates R < C, reliable transmission over the channel is possible; at rates R > C, it is not.

Shannon's formula states that:
C = max_p(x) { I(X;Y) }

where I(X;Y) denotes the mutual information between X (the channel input) and Y (the channel output), and the maximization is carried out over all input probability distributions p(x) of the channel.
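
For example (a standard textbook result, not specific to this thread): for a binary symmetric channel with crossover probability p, the maximum is achieved by the uniform input distribution, and

C = 1 - H(p), where H(p) = -p*log2(p) - (1-p)*log2(1-p)

so a BSC with p = 0.1 has C ≈ 0.531 bits per channel use.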

The mutual information between two random variables X and Y is defined as:

I(X;Y) = ∑x ∑y p(x) p(y|x) log[ p(x,y) / ( p(x) p(y) ) ]
where the mutual information is in bits and the logarithm is in base 2.
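
To make this concrete, here is a small Python sketch (the function name mutual_information and the numbers are my own choices, not from any particular library) that evaluates the double sum above for a given input distribution p(x) and channel matrix p(y|x):

Code:
import numpy as np

def mutual_information(p_x, p_y_given_x):
    # I(X;Y) in bits, for input distribution p_x[x] and channel matrix p_y_given_x[x][y]
    p_x = np.asarray(p_x, dtype=float)
    p_xy = p_x[:, None] * np.asarray(p_y_given_x, dtype=float)   # joint p(x,y) = p(x) p(y|x)
    p_y = p_xy.sum(axis=0)                                        # output marginal p(y)
    mask = p_xy > 0                                               # 0*log(0) terms contribute nothing
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask])))

# binary symmetric channel, crossover probability 0.1, uniform input
bsc = [[0.9, 0.1], [0.1, 0.9]]
print(mutual_information([0.5, 0.5], bsc))   # about 0.531 bits, i.e. 1 - H(0.1)

With the uniform input this reproduces the BSC capacity value mentioned above, because the uniform distribution happens to be the maximizing one for that channel.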


It's that simple. :D

Please press "helped me" if it was helpful for you.

regards
 

Let's take an example:

X --> channel --> Y

where the channel governs the conditional probability P(y|x) for a particular input x and output y.

Mutual information is defined as the logarithm of the ratio of the a posteriori to the a priori probability, i.e., log P(x|y)/P(x). Clearly, mutual information itself is a random variable, with its own mean, variance, etc. The mean value is given by
I(X;Y) = ∑x ∑y P(x,y) log P(x|y)/P(x).
Since P(x|y)/P(x) = P(x,y)/( P(x)P(y) ), this is the same quantity defined in the previous reply. Note that for a given input distribution P(x), I(X;Y) is determined by the channel P(y|x).

Channel capacity goes one step further than mutual information by maximizing over all possible P(x). This makes sense in that we have control over the distribution of the information bits out of the channel encoder.
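
To illustrate the "maximize over all P(x)" step, here is a rough Python sketch (the channel numbers and the brute-force sweep are my own choices for illustration, not from Gallager): it evaluates I(X;Y) for many candidate input distributions of a binary-input channel and keeps the largest value. For larger input alphabets one would normally use the Blahut-Arimoto algorithm instead of a sweep.

Code:
import numpy as np

def mutual_information(p_x, p_y_given_x):
    # same double-sum formula as in the previous reply, repeated so the snippet is self-contained
    p_x = np.asarray(p_x, dtype=float)
    p_xy = p_x[:, None] * np.asarray(p_y_given_x, dtype=float)
    p_y = p_xy.sum(axis=0)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask])))

# capacity of a binary-input channel by sweeping the input distribution p(x) = (q, 1-q)
channel = [[0.9, 0.1], [0.2, 0.8]]            # an asymmetric binary channel (made-up numbers)
qs = np.linspace(0.001, 0.999, 999)
rates = [mutual_information([q, 1.0 - q], channel) for q in qs]
best = int(np.argmax(rates))
print("C ~", rates[best], "bits, achieved at p(x=0) ~", qs[best])

For an asymmetric channel like this one, the maximizing input distribution is not uniform, which is exactly why the maximization over P(x) matters.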

I have to say the above explanations are taken from Gallager's book.
 
