Measuring Information: Bits
Adding Up Bits
Using Bits to Count and Label
Physical Forms of Information
Entropy
Information and Probability
Fundamental Formula of Information
Computation and Logic: Information Processing
Mutual Information
Shannon's Coding Theorem
The Manifold Things Information Measures
Homework
Homework Solutions
8.1 Shannon's Coding Theorem » Some Notes on Channel Capacity
Note that in the tutorial, “channel capacity” is used to refer to two different, though closely related, things.
Remember that a communication channel is defined by a conditional probability distribution of output signals given input signals, which we can write as P(y|x). The tutorial sometimes uses the “capacity” of the channel to mean the mutual information across the channel, I(X:Y), for some particular distribution of sender signals, P(x). At other times, however, “capacity” refers to the maximum possible mutual information across the channel, taken over all distributions of sender signals:
C = max_{P(x)} I(X:Y)
It is this latter meaning of capacity that determines the ultimate bound on the amount of information that can be transmitted across the channel, and it is usually the meaning implied by the term “channel capacity” in the information theory literature.
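
To make the distinction concrete, here is a minimal numerical sketch in Python (assuming NumPy; the binary symmetric channel example, the function names, and the brute-force sweep over P(x) are choices made for this note, not part of the tutorial). For a binary symmetric channel that flips each bit with probability f, the maximum is known in closed form: C = 1 - H(f), attained at the uniform sender distribution.

    import numpy as np

    def entropy(p):
        """Shannon entropy H(p) in bits of a probability vector p."""
        p = np.asarray(p, dtype=float)
        nz = p[p > 0]                      # use the convention 0 log 0 = 0
        return -np.sum(nz * np.log2(nz))

    def mutual_information(p_x, p_y_given_x):
        """I(X:Y) = H(Y) - H(Y|X), for sender distribution p_x and a
        channel matrix with entries p_y_given_x[x][y] = P(y|x)."""
        p_x = np.asarray(p_x, dtype=float)
        p_y_given_x = np.asarray(p_y_given_x, dtype=float)
        p_y = p_x @ p_y_given_x            # marginal distribution P(y)
        h_y_given_x = np.sum(p_x * np.array([entropy(row) for row in p_y_given_x]))
        return entropy(p_y) - h_y_given_x

    f = 0.1                                # crossover probability of the channel
    channel = [[1 - f, f],
               [f, 1 - f]]

    # First meaning: I(X:Y) for one particular sender distribution P(x).
    print(mutual_information([0.8, 0.2], channel))

    # Second meaning: C = max over P(x) of I(X:Y), found here by a
    # brute-force sweep over P(x) = (q, 1 - q); compare with 1 - H(f).
    qs = np.linspace(0.001, 0.999, 999)
    best = max(mutual_information([q, 1 - q], channel) for q in qs)
    print(best, 1 - entropy([f, 1 - f]))

For f = 0.1, both the sweep and the closed form give about 0.531 bits per use of the channel, while the fixed sender distribution (0.8, 0.2) achieves only about 0.358 bits; the difference is exactly the gap between the two uses of the word “capacity” described above.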
For more information, see Chapter 7 of Cover, T. M., and Thomas, J. A., Elements of Information Theory, 2nd ed., Wiley, 2006.