# Complexity Explorer Santa Fe Institute

## Introduction to Information Theory

• Measuring Information: Bits
• Adding Up Bits
• Using Bits to Count and Label
• Physical Forms of Information
• Fundamental Formula of Information
• Computation and Logic: Information Processing
• Mutual Information
• Communication Capacity in a Noisy Channel
• Shannon's Coding Theorem
• Homework Solutions

#### 8.1 Shannon's Coding Theorem » Quiz Solutions

Question 1:

Shannon's Noisy Channel Coding Theorem states that the maximal rate at which information can be sent reliably across a channel is determined by the mutual information between the sender and receiver.
In particular, it is given by the channel capacity: the maximum of that mutual information over all possible sender (input) distributions.
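As an illustration of this maximization, consider the standard binary symmetric channel example (not part of the original quiz): each bit is flipped with probability p, and the capacity C = 1 − H₂(p) is achieved by the uniform input distribution. A minimal sketch in Python, where the function names are my own:

```python
import math

def binary_entropy(p):
    # H2(p) = -p*log2(p) - (1-p)*log2(1-p), with H2(0) = H2(1) = 0
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_prob):
    # Capacity of a binary symmetric channel: the mutual information
    # between input and output is maximized by the uniform input
    # distribution, giving C = 1 - H2(p) bits per channel use.
    return 1.0 - binary_entropy(flip_prob)

print(bsc_capacity(0.0))   # noiseless channel: 1 bit per use
print(bsc_capacity(0.5))   # pure noise: 0 bits per use
```

A noiseless channel (p = 0) carries one full bit per use, while at p = 0.5 the output is independent of the input and the capacity drops to zero.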

Question 2:

We first determine the maximum amount of information that Alice needs to send.
We know that there are 100 possible messages, but not which distribution over them Alice will use. However, the entropy of a distribution over 100 messages is maximized by the uniform distribution, whose entropy is

H_max = log₂(100) ≈ 6.64 bits

Thus, in the worst case, Alice needs a channel with a capacity of at least 6.64 bits.
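The worst-case entropy above is a one-line computation, and it agrees with summing −p·log₂(p) over 100 equally likely messages. A quick check in Python:

```python
import math

n_messages = 100

# Entropy of the uniform distribution over n messages: H = log2(n)
h_uniform = math.log2(n_messages)

# Equivalently, sum -p*log2(p) over all 100 equally likely messages
p = 1.0 / n_messages
h_sum = sum(-p * math.log2(p) for _ in range(n_messages))

print(round(h_uniform, 2))  # 6.64
```

Both routes give the same value, ≈ 6.64 bits, because every term in the sum equals (1/100)·log₂(100).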
