:::
A measure of how much information can be
transmitted and received with a negligible probability
of error is called the channel capacity.
To determine this measure of channel potential, assume that a channel
encoder receives a source symbol every Ts seconds.
With an optimal source code, the average code length of all source symbols is equal to the entropy rate of the source.
If S represents the set of all source symbols and the entropy rate of the source is written as H(S),
the channel encoder will receive, on average, H(S)/Ts information bits per second.
"""
The lines above are from a paper. What I don't understand is whether the entropy rate of the source, H(S), means
the information contained in one transmitted symbol, or
the information contained in all of the possible source symbols the source has.
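To make the quantities in the excerpt concrete: for a memoryless source, the entropy H(S) = -Σ p(s) log2 p(s) is an average over the source distribution, measured in bits per emitted symbol, so H(S)/Ts gives bits per second. A minimal sketch with a hypothetical four-symbol source and an assumed symbol period Ts:

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits per symbol: H(S) = -sum p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical memoryless source with four symbols.
probs = [0.5, 0.25, 0.125, 0.125]

H = entropy_bits(probs)  # average information per emitted symbol
Ts = 0.001               # assumed: one symbol every millisecond

print(H)        # 1.75 bits/symbol
print(H / Ts)   # 1750.0 information bits per second
```

Note that H is not the information of any single outcome (the symbol with probability 0.5 carries 1 bit, the rarest ones 3 bits each); it is their probability-weighted average, which is why an optimal source code achieves an average code length equal to it.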