Complexity Explorer, Santa Fe Institute

Introduction to Information Theory

6.1 Mutual Information » Quiz Solutions

Question 1:

Let xh indicate the outcome that the first toss was heads, xt the outcome that the first toss was tails, yh the outcome that the second toss was heads, and yt the outcome that the second toss was tails.

Then, there are four possible joint outcomes of tossing a coin twice in a row: (xh, yh), (xh, yt), (xt, yh), and (xt, yt).  The joint information can then be written,
  I(XY) = -P(xh,yh) log2 P(xh,yh) - P(xh,yt) log2 P(xh,yt)
          - P(xt,yh) log2 P(xt,yh) - P(xt,yt) log2 P(xt,yt)

Let us assume that getting tails (or heads) on the first toss does not change the probability of getting tails or heads on the second toss. Then we can write P(xh,yh) = P(xh)*P(yh), where P(xh) is the probability of landing heads on a single toss.  Since the coin is assumed to be fair, P(xh) = P(xt) = 0.5, meaning that P(xh,yh) = 0.5*0.5 = 0.25.

The same argument applies to all four of the possible outcomes, not just the (xh,yh) outcome.  Thus, all four outcomes have probability 0.25.

Plugging this into the above equation for I(XY) gives
      I(XY) = -4*0.25*log2 0.25 = 2 bits
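
As an optional check, not part of the original solution, the same computation can be done in a few lines of Python (the variable names are just for illustration):

  from math import log2

  # Joint distribution of two independent fair coin tosses:
  # each of the four outcomes has probability 0.25
  joint = {('h', 'h'): 0.25, ('h', 't'): 0.25,
           ('t', 'h'): 0.25, ('t', 't'): 0.25}

  # Joint information I(XY) = -sum of p * log2(p) over all outcomes
  I_XY = -sum(p * log2(p) for p in joint.values())
  print(I_XY)  # 2.0 bits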


Question 2:

The mutual information is given by the sum of the marginal information of X and the marginal information of Y, minus their joint information:
    I(X:Y) = I(X) + I(Y) - I(XY)

We know that the amount of information in a single toss of a fair coin is 1 bit, so I(X) (the information in the first toss) is 1 bit, and I(Y) (the information in the second toss) is 1 bit.

Given the answer to the previous question, we also know that the joint information I(XY) = 2 bits.

Combining gives
  I(X:Y) = 1 bit + 1 bit - 2 bits = 0 bits
Thus, the mutual information between the two tosses is 0 bits. This makes sense: the tosses are independent, so learning the outcome of the first toss tells us nothing about the second.
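
As another optional Python check, assuming the probabilities worked out in Question 1:

  from math import log2

  # Information in each single fair toss (1 bit each)
  I_X = -(0.5 * log2(0.5) + 0.5 * log2(0.5))
  I_Y = I_X

  # Joint information of the four equally likely joint outcomes (2 bits)
  I_XY = -4 * 0.25 * log2(0.25)

  # Mutual information I(X:Y) = I(X) + I(Y) - I(XY)
  print(I_X + I_Y - I_XY)  # 0.0 bits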

 

Question 3: 

Remember the rule for finding marginal probabilities given joint probabilities, as stated in the video:
  P(xrain) = P(xrain, yno-sun) + P(xrain, ysun)

Plugging in the probabilities specified in the question, we get
 P(xrain) = 0.3 + 0.1 = 0.4
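
A minimal Python sketch of this marginalization, using the two joint probabilities given in the question (the variable names are illustrative):

  # P(xrain) = P(xrain, yno-sun) + P(xrain, ysun)
  p_rain_nosun = 0.3
  p_rain_sun = 0.1
  p_rain = p_rain_nosun + p_rain_sun
  print(p_rain)  # 0.4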
 

Question 4: 

Recall again the formula for mutual information,
  I(X:Y) = I(X) + I(Y) - I(XY)
We first compute the marginal information terms, I(X) and I(Y), and the joint information term, I(XY).

We use the marginalization rule (see the video) to calculate the marginal probabilities of X:
  P(xrain) = P(xrain , yno-sun) + P(xrain , ysun) = 0.3 + 0.1 = 0.4
  P(xno-rain) = P(xno-rain , yno-sun) + P(xno-rain , ysun) = 0.2 + 0.4 = 0.6

We can do the same to calculate the marginal probabilities of Y:
 P(ysun) = P(xrain , ysun) + P(xno-rain , ysun) = 0.1 + 0.4 = 0.5
 P(yno-sun) = P(xno-rain , yno-sun) + P(xrain , yno-sun) = 0.2 + 0.3 = 0.5

Recalling the Fundamental Formula for the information in a random variable with two outcomes, we compute
I(X) = -0.4 log2 0.4 - 0.6 log2 0.6 ≈ 0.97 bits
and
I(Y) = -0.5 log2 0.5 - 0.5 log2 0.5 = 1 bit

Then, we compute the joint information by considering all four possible joint outcomes of X and Y (also see the answer to the last question):
I(XY) = - P(xrain , ysun) log2 P(xrain , ysun) - P(xno-rain , ysun) log2 P(xno-rain , ysun)
           - P(xrain , yno-sun) log2 P(xrain , yno-sun) - P(xno-rain , yno-sun) log2 P(xno-rain , yno-sun)
        = - 0.1 log2 0.1 - 0.4 log2 0.4 - 0.3 log2 0.3 - 0.2 log2 0.2 ≈ 1.85 bits

Finally, we combine and compute the mutual information:
I(X:Y) = I(X) + I(Y) - I(XY) ≈ 0.97 + 1 - 1.85 = 0.12 bits
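
As an optional check, the whole calculation can be reproduced with a short Python sketch, assuming the joint probability table given in the question; carrying full precision gives roughly 0.12 bits:

  from math import log2

  # Joint distribution over (rain, sun) outcomes, as given in the question
  joint = {('rain', 'sun'): 0.1, ('rain', 'no-sun'): 0.3,
           ('no-rain', 'sun'): 0.4, ('no-rain', 'no-sun'): 0.2}

  # Marginal distributions, obtained by summing joint probabilities
  p_x = {'rain': 0.1 + 0.3, 'no-rain': 0.4 + 0.2}
  p_y = {'sun': 0.1 + 0.4, 'no-sun': 0.3 + 0.2}

  def info(dist):
      # Information of a distribution: -sum of p * log2(p)
      return -sum(p * log2(p) for p in dist.values())

  I_X, I_Y, I_XY = info(p_x), info(p_y), info(joint)
  print(I_X, I_Y, I_XY)    # approx 0.971, 1.0, 1.846
  print(I_X + I_Y - I_XY)  # approx 0.12 bits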