convergence of random variables
1 types of convergence
There are three notions, from strongest to weakest: convergence almost surely (a.s.), convergence in probability, and convergence in distribution. Convergence a.s. implies convergence in probability, which implies convergence in distribution.
2 law of large numbers
The strong law states that the sample mean converges to the expectation almost surely, while the weak law states that it converges in probability. The difference between the two laws is exactly the difference between these two modes of convergence.
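A quick numerical sketch of the law of large numbers (the coin-flip setup and the name `sample_mean` are ours, not from the notes): the sample mean of fair-coin flips settles near the true mean \(0.5\) as \(n\) grows.

```python
import random

random.seed(0)

# Sample mean of n fair-coin flips; the true mean is 0.5.
# Weak law: P(|mean_n - 0.5| > eps) -> 0 (convergence in probability).
# Strong law: mean_n -> 0.5 with probability 1 (convergence a.s.).
def sample_mean(n):
    flips = [random.random() < 0.5 for _ in range(n)]
    return sum(flips) / n

means = {n: sample_mean(n) for n in (10, 1000, 100000)}
for n, m in means.items():
    print(n, m)
```

For small \(n\) the sample mean fluctuates noticeably; by \(n = 100000\) it is within a fraction of a percent of \(0.5\).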
3 continuous functions
If \(f\) is a continuous function and \(Y_n\) converges to \(Y\) (a.s., in probability, or in distribution), then \(f(Y_n)\) converges to \(f(Y)\) in the same sense (the continuous mapping theorem).
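A minimal sketch of the continuous mapping idea (the choice \(Y_n = Y + 1/n\) and \(f = \exp\) are our illustrative assumptions): since \(Y_n \to Y\) surely, \(f(Y_n) \to f(Y)\) for any continuous \(f\).

```python
import math
import random

random.seed(1)

# One draw of the limit Y; the sequence Y_n = Y + 1/n converges to Y
# surely (hence a.s. and in probability), so f(Y_n) -> f(Y) for any
# continuous f.
y = random.gauss(0, 1)
f = math.exp  # exp is just one example of a continuous function

gaps = [abs(f(y + 1 / n) - f(y)) for n in (1, 10, 100, 1000)]
print(gaps)
```

The gap \(|f(Y_n) - f(Y)|\) shrinks monotonically as \(n\) grows.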
4 marginal convergence
If \(X_n\) converges to \(X\) (a.s. or in probability) and \(Y_n\) converges to \(Y\) (a.s. or in probability), then \((X_n, Y_n)\) converges to \((X, Y)\) in the same sense.
BUT: \(X_n \overset{d}{\rightarrow} X\) and \(Y_n \overset{d}{\rightarrow} Y\) do not imply \((X_n, Y_n) \overset{d}{\rightarrow} (X, Y)\)
4.0.1 example
Let \(X_1 = X_2 = \cdots = X_n = \cdots\) where \(X_1 \sim \mathcal{N}(0,1)\), and let \(X\) be an independent \(\mathcal{N}(0,1)\) variable. The \(X_n\)'s converge in distribution to \(X\), because every \(X_n\) has the same distribution as \(X\).
Let \(Y_n = -X_n\) for every \(n\), so \(Y_1 = Y_2 = \cdots = -X_1\) with \(Y_1 \sim \mathcal{N}(0,1)\) by symmetry of the Gaussian, and let \(Y\) be an independent \(\mathcal{N}(0,1)\) variable. Similarly, the \(Y_n\)'s converge in distribution to \(Y\).
But \(X_n + Y_n = 0\) for every \(n\), while \(X + Y \sim \mathcal{N}(0,2)\). Since addition is continuous, \((X_n, Y_n)\) cannot converge in distribution to \((X, Y)\).
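This counterexample can be checked by simulation (the variable names and sample size are ours): the sum \(X_n + Y_n\) is identically zero, while the sum of the independent limits has variance \(2\).

```python
import random
import statistics

random.seed(2)
N = 10_000  # number of simulated realizations

# Realizations of the degenerate sequence: X_n is always the single
# draw X_1, and Y_n = -X_1, so X_n + Y_n = 0 on every realization.
x1 = [random.gauss(0, 1) for _ in range(N)]
xy_sum = [x - x for x in x1]

# Independent limits X, Y ~ N(0,1): their sum is N(0, 2).
xs = [random.gauss(0, 1) for _ in range(N)]
ys = [random.gauss(0, 1) for _ in range(N)]
indep_sum = [a + b for a, b in zip(xs, ys)]

print(statistics.pvariance(xy_sum))     # exactly 0.0
print(statistics.pvariance(indep_sum))  # roughly 2
```

The two variances (0 versus roughly 2) show the pairs \((X_n, Y_n)\) and \((X, Y)\) have genuinely different joint distributions, even though each coordinate converges marginally.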
5 Slutsky's theorem
If \(X_n \overset{d}{\rightarrow} X\) and \(Y_n \overset{d}{\rightarrow} c\) for some constant \(c\), then \((X_n, Y_n) \overset{d}{\rightarrow} (X, c)\).
5.1 consequences
- \(X_n + Y_n\) converges in distribution to \(X+c\)
- \(X_nY_n\) converges in distribution to \(cX\)
- \(X_n / Y_n\) converges in distribution to \(X/c\), provided \(c \neq 0\)
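A numerical sketch of these consequences (the specific sequences \(X_n \sim \mathcal{N}(0,1)\) and deterministic \(Y_n = c + 1/n\) are our illustrative assumptions): the sum should look like \(X + c\) and the product like \(cX\).

```python
import random
import statistics

random.seed(3)
N = 10_000  # number of simulated draws
c = 2.0
n = 1_000

# X_n ~ N(0,1) for every n, so trivially X_n ->d X ~ N(0,1).
# Y_n = c + 1/n is deterministic and converges to the constant c.
x_n = [random.gauss(0, 1) for _ in range(N)]
y_n = c + 1 / n

sums = [x + y_n for x in x_n]   # should look like X + c ~ N(c, 1)
prods = [x * y_n for x in x_n]  # should look like cX ~ N(0, c^2)

print(statistics.mean(sums))        # roughly c = 2
print(statistics.pvariance(prods))  # roughly c^2 = 4
```

The empirical mean of the sums is close to \(c = 2\) and the empirical variance of the products is close to \(c^2 = 4\), matching the distributions of \(X + c\) and \(cX\).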