# Complexity Explorer Santa Fe Institute

## Introduction to Information Theory

• Measuring Information: Bits
• Using Bits to Count and Label
• Physical Forms of Information
• Fundamental Formula of Information
• Computation and Logic: Information Processing
• Mutual Information
• Communication Capacity in a Noisy Channel
• Shannon's Coding Theorem
• Homework Solutions

#### 8.1 Shannon's Coding Theorem » Some Notes on Channel Capacity

Note that in the tutorial, “channel capacity” is used to refer to two different, though closely related, things.

Remember that a communication channel is defined by a conditional probability distribution of output signals given input signals, which we can write as P(y|x). The tutorial sometimes refers to the “capacity” of the channel as the mutual information across the channel, I(X:Y), for a given distribution of sender signals, P(x). “Capacity” is also sometimes used to refer, however, to the maximum possible mutual information across the channel, given any distribution of sender signals:
C = max over P(x) of I(X:Y)

It is this latter meaning of capacity that determines the ultimate bound on the amount of information that can be transmitted across the channel, and it is usually the meaning implied by the term “channel capacity” in the information theory literature.
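As a concrete illustration of the difference between the two meanings, the sketch below computes the mutual information I(X:Y) across a binary symmetric channel for many different sender distributions P(x), and then takes the maximum to find the capacity C. The channel, its crossover probability of 0.1, and the function names are illustrative assumptions, not part of the tutorial; for this channel the known closed form is C = 1 − H(p), attained at the uniform input distribution.

```python
import math

def h2(p):
    """Binary entropy in bits; h2(0) = h2(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(q, p):
    """I(X:Y) for a binary symmetric channel (illustrative example).

    q = P(x=0), the sender's input distribution;
    p = crossover probability, i.e. P(y != x).
    """
    # Output distribution: P(y=0) = q(1-p) + (1-q)p
    py0 = q * (1 - p) + (1 - q) * p
    # I(X:Y) = H(Y) - H(Y|X); for this channel H(Y|X) = h2(p)
    return h2(py0) - h2(p)

p = 0.1  # assumed crossover probability, chosen for illustration

# First meaning: mutual information for one particular P(x)
i_skewed = mutual_information(0.2, p)

# Second meaning: capacity = maximum over all sender distributions
qs = [i / 1000 for i in range(1001)]
best_q = max(qs, key=lambda q: mutual_information(q, p))
capacity = mutual_information(best_q, p)

print(best_q)              # maximizing input distribution: 0.5 (uniform)
print(round(capacity, 4))  # matches 1 - h2(0.1), about 0.531
```

Any particular P(x), such as the skewed q = 0.2 above, gives a mutual information at or below the capacity; only the maximizing distribution achieves C.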