
bayesian hypothesis test

1 likelihood ratio test

If we know the costs and the prior probabilities, then we can characterize how good a decision rule is according to its Bayes risk.

We can think of the Bayes risk as a measure of our decision rule that is weighted by:

  • how much we care about each type of mistake (the costs)
  • how often we think each hypothesis will occur (which comes from the prior probabilities)

If we know the costs and the prior probabilities, then it turns out the optimal decision rule that minimizes the Bayes risk is given by Theorem 1.
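
For reference, written out (a standard definition, filled in here since the note doesn't state it explicitly), the Bayes risk of a decision rule \(\hat{H}\) is the expected cost, averaged over both the data and the hypotheses:

\[ r(\hat{H}) = \sum_{i=0}^{1} \sum_{j=0}^{1} C_{ij} \, P_j \, \Pr\left[ \hat{H}(\mathsf{y}) = H_i \mid \mathsf{H} = H_j \right] \]

where \(C_{ij}\) is the cost of deciding \(H_i\) when \(H_j\) is actually true, and \(P_j\) is the prior probability of \(H_j\).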

2 theorem 1 – Bayes LRT

If the costs \(C_{ij}\) and the prior probabilities \(P_i\) are known, then the optimal decision rule is given by: \[ L(\mathbf{y}) \triangleq \frac {p_{\mathsf{y} \mid \mathsf{H}} (\mathbf{y} \mid H_1)} {p_{\mathsf{y} \mid \mathsf{H}} (\mathbf{y} \mid H_0)} \underset{\hat{H}(\mathbf{y}) = H_0}{\overset{\hat{H}(\mathbf{y}) = H_1}{\gtreqless}} \frac{P_0(C_{10} - C_{00})}{P_1(C_{01} - C_{11})} \triangleq \eta \]
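
As a sanity check, here is a minimal numerical sketch of this rule. Everything model-specific in it — the Gaussian likelihoods \(\mathsf{y} \mid H_0 \sim N(0,1)\) and \(\mathsf{y} \mid H_1 \sim N(1,1)\), the particular priors, and the costs — is a made-up example, not something fixed by the theorem:

    import math

    # Hypothetical model (not from the note): y | H0 ~ N(0, 1), y | H1 ~ N(1, 1).
    def gauss_pdf(y, mean, var=1.0):
        return math.exp(-(y - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

    # Made-up priors and costs, chosen only to make the threshold concrete.
    P0, P1 = 0.7, 0.3
    C00, C11 = 0.0, 0.0  # correct decisions cost nothing
    C10 = 1.0            # cost of deciding H1 when H0 is true
    C01 = 2.0            # cost of deciding H0 when H1 is true

    # Threshold from Theorem 1.
    eta = (P0 * (C10 - C00)) / (P1 * (C01 - C11))

    def bayes_lrt(y):
        # Likelihood ratio L(y) = p(y | H1) / p(y | H0); decide H1 iff L(y) >= eta.
        L = gauss_pdf(y, mean=1.0) / gauss_pdf(y, mean=0.0)
        return "H1" if L >= eta else "H0"

    print(eta)             # ~1.167
    print(bayes_lrt(0.2))  # H0: weak evidence for H1
    print(bayes_lrt(2.0))  # H1: strong evidence for H1

Since \(\ln L(y) = y - 1/2\) in this Gaussian example, the rule is equivalent to comparing \(y\) against \(1/2 + \ln \eta \approx 0.65\), the familiar threshold-on-\(y\) form.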

2.1 intuitive quick read

\(P_0(C_{10} - C_{00})\) is the prior probability that hypothesis 0 is true, i.e. that a 0 is sent over the wire, weighted by the extra cost of misclassifying that 0 as a 1. The higher this value is, the more you're incentivized not to classify a 0 as a 1, so the higher the threshold for declaring \(H_1\) becomes.
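
A quick made-up numerical check: with \(P_0 = P_1 = 1/2\), \(C_{00} = C_{11} = 0\), and \(C_{10} = C_{01} = 1\), the threshold is \(\eta = \frac{0.5 \cdot 1}{0.5 \cdot 1} = 1\), i.e. a plain maximum-likelihood test. Doubling the false-alarm cost to \(C_{10} = 2\) doubles the threshold to \(\eta = 2\): the data must now be at least twice as likely under \(H_1\) before we declare \(H_1\).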

3 related links
