Complexity Explorer, Santa Fe Institute

Introduction to Information Theory


6.1 Mutual Information » A Note on Information Notation

It is common to find the “marginal information” I(X) and the “joint information” I(XY) referred to as the “marginal entropy” and the “joint entropy,” respectively, and denoted H(X) and H(X,Y).
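For reference, for discrete random variables these are the usual Shannon quantities (the base of the logarithm, commonly 2, only sets the units and is an assumption here, not something fixed by the note above):

\[
H(X) = -\sum_{x} p(x)\,\log p(x),
\qquad
H(X,Y) = -\sum_{x,y} p(x,y)\,\log p(x,y).
\]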

The mutual information between X and Y is usually denoted I(X:Y) or I(X;Y), and sometimes MI(X:Y) or MI(X;Y).
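Whichever symbol is used, the quantity is the same, and in the H(·) notation it satisfies the standard identity

\[
I(X:Y) = H(X) + H(Y) - H(X,Y),
\]

which makes explicit how the marginal and joint entropies above combine into the mutual information.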