Complexity Explorer Santa Fe Institute


Introduction to Information Theory

8.1 Shannon's Coding Theorem » Some Notes on Channel Capacity

Note that in the tutorial, “channel capacity” is used to refer to two different, though closely related, quantities.
Remember that a communication channel is defined by a conditional probability distribution of output signals given input signals, which we can write as P(y|x). The tutorial sometimes uses “capacity” to mean the mutual information across the channel, I(X:Y), for a given distribution of sender signals, P(x). At other times, however, “capacity” refers to the maximum possible mutual information across the channel over all distributions of sender signals:
  C = max_{P(x)} I(X:Y)

It is this latter meaning of capacity that determines the ultimate bound on the amount of information that can be transmitted across the channel, and it is usually the meaning implied by the term “channel capacity” in the information theory literature.
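The distinction can be made concrete with a short numerical sketch. The helper below (an illustrative function, not from the tutorial) computes I(X:Y) for a given sender distribution P(x) and a channel given as a matrix of conditional probabilities P(y|x); for a binary symmetric channel with crossover probability 0.1, the mutual information for a skewed input distribution falls short of the capacity C = 1 − H(0.1) ≈ 0.531 bits, which a uniform input achieves. The grid search over P(x) stands in for the maximization in the definition of C.

```python
import numpy as np

def mutual_information(p_x, channel):
    """I(X:Y) in bits, where channel[i][j] = P(y=j | x=i)."""
    p_x = np.asarray(p_x, dtype=float)
    channel = np.asarray(channel, dtype=float)
    joint = p_x[:, None] * channel          # P(x, y) = P(x) P(y|x)
    p_y = joint.sum(axis=0)                 # marginal P(y)
    mask = joint > 0                        # skip zero-probability terms
    ratio = joint[mask] / (p_x[:, None] * p_y)[mask]
    return float(np.sum(joint[mask] * np.log2(ratio)))

# Binary symmetric channel with crossover probability 0.1.
bsc = [[0.9, 0.1],
       [0.1, 0.9]]

# First sense of "capacity": I(X:Y) for one particular P(x).
i_skewed = mutual_information([0.8, 0.2], bsc)   # about 0.36 bits

# Second sense: the maximum of I(X:Y) over all P(x).
# A coarse grid search over binary input distributions suffices here;
# the maximum, about 0.531 bits, occurs at the uniform input.
capacity = max(mutual_information([q, 1 - q], bsc)
               for q in np.linspace(0.01, 0.99, 99))
```

For the binary symmetric channel the maximizing distribution happens to be uniform, but in general the optimal P(x) depends on the channel and is found by methods such as the Blahut–Arimoto algorithm.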

For more information, see Chapter 7 of
Cover and Thomas, Elements of Information Theory, Wiley, 2006.