conditional entropy

The entropy of a random variable \(Y\) given an observation \(X=x\) is: \[ H(Y \mid X=x) = \sum_{y\in Y} p(y \mid x) \log \left(\frac{1}{p(y \mid x)}\right) \]

Averaging over all possible values of \(x\) gives: \[ H(Y\mid X) = \sum_{x\in X} p(x) H(Y\mid X=x) \]
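As a sanity check on both definitions, here is a minimal Python sketch (the joint table and the helper name `conditional_entropy` are invented for illustration) that computes \(H(Y\mid X=x)\) for each \(x\) and then averages them with weights \(p(x)\):

```python
import math

# Joint distribution p(x, y) as a dict; values sum to 1.
# This particular table is made up for illustration.
joint = {
    ("a", 0): 0.25, ("a", 1): 0.25,
    ("b", 0): 0.40, ("b", 1): 0.10,
}

def conditional_entropy(joint):
    """H(Y|X) in bits: sum over x of p(x) * H(Y | X=x)."""
    # Marginal p(x)
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    h = 0.0
    for x, p_x in px.items():
        # Inner sum: H(Y | X=x) = sum_y p(y|x) log(1/p(y|x))
        h_given_x = 0.0
        for (x2, _), p in joint.items():
            if x2 == x and p > 0:  # skip p = 0: 0 * log(1/0) is taken as 0
                p_y_given_x = p / p_x
                h_given_x += p_y_given_x * math.log2(1 / p_y_given_x)
        h += p_x * h_given_x
    return h

print(conditional_entropy(joint))  # ≈ 0.861 bits for the table above
```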

1 properties

Note that if \(Y\) is a deterministic function of \(X\), then for each \(x\in X\) there is exactly one \(y\in Y\) for which \(p(y\mid x) = 1\), and \(p(y\mid x) = 0\) for all other \(y\). The first term contributes \(1 \cdot \log 1 = 0\), and each remaining term \(0 \cdot \log(1/0)\) is taken to be \(0\) by convention, since \(p \log(1/p) \to 0\) as \(p \to 0\). So \[ \sum_{y\in Y} p(y\mid x) \log\left(\frac{1}{p(y\mid x)}\right) = 0 \] for every \(x\), and therefore \(H(Y\mid X) = 0\): having observed \(X\), there is no remaining uncertainty about \(Y\).
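This can be checked numerically with the `conditional_entropy` sketch above: when \(Y = f(X)\), every row of the conditional distribution puts all its mass on a single \(y\), so each inner entropy vanishes. A small made-up example:

```python
# Y = f(X) with an arbitrary f: f(0) = 0, f(1) = 1, f(2) = 0.
# All conditional mass sits on one y, so every H(Y | X=x) is 0.
deterministic_joint = {
    (0, 0): 0.3,
    (1, 1): 0.5,
    (2, 0): 0.2,
}

print(conditional_entropy(deterministic_joint))  # 0.0
```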

Created: 2021-09-14 Tue 21:44