9.1 The Manifold Things Information Measures » Quiz Solutions
Question 1:
Let X represent the outcome of Alice’s coin toss, and Y the outcome Bob receives at the other end of the channel. Recall that the information loss about Alice’s toss given Bob’s received result is the conditional information I(X|Y), which can be computed as I(X|Y) = I(XY) - I(Y).
To compute the joint information I(XY), we first calculate the joint probability of Alice’s and Bob’s outcomes. Given the bias of Alice’s coin (1/3 probability of heads) and the noise in the channel (1/8 probability of error), the joint probabilities are
P(X=heads, Y=heads) = 1/3 * (1-1/8) = 7/24
P(X=heads, Y=tails) = 1/3 * 1/8 = 1/24
P(X=tails, Y=heads) = 2/3 * 1/8 = 2/24
P(X=tails, Y=tails) = 2/3 * (1-1/8) = 14/24
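If you want to verify these numbers programmatically, the joint distribution follows from multiplying the coin bias by the channel’s keep/flip probabilities. Below is a minimal Python sketch (the variable names are ours, not from the text; the parameters come straight from the problem statement):

```python
from fractions import Fraction

p_heads = Fraction(1, 3)   # P(X = heads): Alice's coin is biased
p_error = Fraction(1, 8)   # probability the channel flips the result

joint = {
    ("heads", "heads"): p_heads * (1 - p_error),        # 7/24
    ("heads", "tails"): p_heads * p_error,              # 1/24
    ("tails", "heads"): (1 - p_heads) * p_error,        # 2/24
    ("tails", "tails"): (1 - p_heads) * (1 - p_error),  # 14/24
}
assert sum(joint.values()) == 1  # sanity check: a valid distribution
```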
The joint information can then be computed as
I(XY) = -(7/24) log2 (7/24) - (1/24) log2 (1/24) - (2/24) log2 (2/24) - (14/24) log2 (14/24)
≈ 1.46 bits
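As a quick numerical check (a sketch we add here, not part of the original solution), the joint information can be evaluated directly:

```python
import math

# Joint probabilities from the table above.
joint = [7/24, 1/24, 2/24, 14/24]
i_xy = -sum(p * math.log2(p) for p in joint)
print(f"I(XY) = {i_xy:.2f} bits")  # -> I(XY) = 1.46 bits
```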
To compute I(Y), we first calculate the marginal distribution P(Y),
P(Y=heads) = P(X=heads, Y=heads) + P(X=tails, Y=heads) = 7/24 + 2/24 = 9/24 = 3/8
P(Y=tails) = P(X=heads, Y=tails) + P(X=tails, Y=tails) = 1/24 + 14/24 = 15/24 = 5/8
Then, the information in Y is
I(Y) = -3/8 log2 3/8 - 5/8 log2 5/8 ≈ 0.95 bits
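The marginalization and the resulting information in Y can be checked the same way (again a sketch with our own variable names):

```python
import math

# Marginalize the joint distribution over X to obtain P(Y),
# then evaluate the information in Y.
p_y_heads = 7/24 + 2/24   # = 9/24 = 3/8
p_y_tails = 1/24 + 14/24  # = 15/24 = 5/8
i_y = -(p_y_heads * math.log2(p_y_heads)
        + p_y_tails * math.log2(p_y_tails))
print(f"I(Y) = {i_y:.2f} bits")  # -> I(Y) = 0.95 bits
```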
Finally, we can compute the bits of information loss as
I(X|Y) = I(XY) - I(Y) ≈ 1.46 - 0.95 = 0.51 bits
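Putting the whole of Question 1 together in one runnable check (the helper function `information` is our own naming, and subtracting the exact values avoids rounding drift):

```python
import math

def information(probs):
    """Shannon information of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs)

i_xy = information([7/24, 1/24, 2/24, 14/24])  # joint information
i_y = information([3/8, 5/8])                  # information in Y
print(f"I(X|Y) = {i_xy - i_y:.2f} bits")       # -> I(X|Y) = 0.51 bits
```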
Question 2:
Recall that the bits of noise introduced by the channel correspond to the conditional information I(Y|X), which can be computed as I(XY) - I(X).
We already computed the joint information I(XY) in the answer to question 1 and found it to be ≈ 1.46 bits.
To compute the information in X, we use the marginal distribution over X given in question 1: P(X=heads) = 1/3, which means that P(X=tails) = 2/3. Thus, the information in X is
I(X) = -1/3 log2 1/3 - 2/3 log2 2/3 ≈ 0.92 bits
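This, too, is a one-line check in the same style as before:

```python
import math

# Information in X from its marginal distribution.
i_x = -(1/3 * math.log2(1/3) + 2/3 * math.log2(2/3))
print(f"I(X) = {i_x:.2f} bits")  # -> I(X) = 0.92 bits
```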
Thus, the bits of noise introduced by the channel are
I(Y|X) = I(XY) - I(X) ≈ 1.46 - 0.92 = 0.54 bits
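And the corresponding end-to-end check for Question 2 (reusing our assumed `information` helper; the exact values confirm the 0.54-bit result):

```python
import math

def information(probs):
    """Shannon information of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs)

i_xy = information([7/24, 1/24, 2/24, 14/24])
i_x = information([1/3, 2/3])
print(f"I(Y|X) = {i_xy - i_x:.2f} bits")  # -> I(Y|X) = 0.54 bits
```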