
Joint Entropy

1 Statement

The joint entropy of \(X\) and \(Y\) with outcome spaces \(\mathcal{X}\) and \(\mathcal{Y}\) is: \[H(X,Y) = -\sum_{x\in \mathcal{X}}\sum_{y\in \mathcal{Y}} p(x,y) \log p(x,y)\]
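
As a quick numerical illustration, here is a minimal Python sketch that evaluates this double sum over a small joint distribution; the function name `joint_entropy` and the example pmf are my own for illustration, not from the source:

```python
import math

def joint_entropy(pmf):
    """H(X, Y) = -sum_{x,y} p(x, y) * log2 p(x, y), in bits."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# Hypothetical example: joint pmf over X in {0, 1}, Y in {0, 1}.
p_xy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
print(joint_entropy(p_xy))  # 1.75 bits
```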

2 Relationship to conditional entropy

The joint entropy can be written in terms of the conditional entropy: \[ H(X,Y) = H(X) + H(Y \mid X) \]

2.1 Proof

\[\begin{align*} H(X, Y) &= \sum_{x,y} p(x,y) \log \frac{1}{p(x,y)} \\ &= \sum_{x,y} p(x) p(y\mid x) \left(\log \frac{1}{p(x)} + \log \frac{1}{p(y \mid x)}\right) \\ &= \sum_{x} p(x) \log \frac{1}{p(x)} \sum_{y} p(y \mid x) + \sum_{x} p(x) \sum_{y} p(y\mid x) \log \frac{1}{p(y \mid x)}\\ &= H(X) + \mathbb{E}_{X} [H(Y \mid X = x)] \qquad \text{since } \textstyle\sum_{y} p(y \mid x) = 1 \\ & = H(X) + H(Y\mid X) \end{align*}\]
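
As a sanity check on the identity, the following sketch (continuing the hypothetical example above, and reusing `math` and `p_xy` from it) computes \(H(X) + H(Y \mid X)\) directly and compares it against `joint_entropy(p_xy)`:

```python
def marginal_x(pmf):
    """Marginal p(x) from a joint pmf over (x, y) pairs."""
    p_x = {}
    for (x, _), p in pmf.items():
        p_x[x] = p_x.get(x, 0.0) + p
    return p_x

def conditional_entropy(pmf):
    """H(Y | X) = sum_x p(x) * H(Y | X = x), with p(y | x) = p(x, y) / p(x)."""
    h = 0.0
    for x, px in marginal_x(pmf).items():
        cond = [p / px for (xx, _), p in pmf.items() if xx == x]
        h -= px * sum(q * math.log2(q) for q in cond if q > 0)
    return h

h_x = -sum(p * math.log2(p) for p in marginal_x(p_xy).values() if p > 0)
print(h_x + conditional_entropy(p_xy))  # 1.75, equal to joint_entropy(p_xy)
```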


Created: 2021-09-14 Tue 21:43