3.2 Limiting Distributions
3.2.2 Convergence in distribution
Theorem 3.1 Continuous Mapping Theorem
Let $\{X_n\}$ be a sequence of random variables such that $X_n \xrightarrow{d} X$, where $X$ is some random variable (or constant). Let $g(\cdot)$ be a continuous function. Then
$$g(X_n) \xrightarrow{d} g(X).$$
If $X$ is a constant $c$, then $g(X_n) \xrightarrow{p} g(c)$. (Convergence in distribution to a constant implies convergence in probability to that constant.)
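To make the theorem concrete, here is a minimal simulation sketch (not part of the original notes; it assumes NumPy and SciPy are available, and the choice of Uniform(0,1) draws, $g(x) = x^2$, and the sample sizes are illustrative). By the CLT the standardized sample mean $X_n \xrightarrow{d} N(0,1)$, and since $g(x) = x^2$ is continuous, the theorem gives $g(X_n) \xrightarrow{d} \chi^2_1$.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# X_n = standardized sample mean of n Uniform(0,1) draws;
# by the CLT, X_n converges in distribution to N(0, 1).
n, reps = 500, 20_000
u = rng.uniform(size=(reps, n))
xn = (u.mean(axis=1) - 0.5) / np.sqrt(1 / 12 / n)

# g(x) = x^2 is continuous, so g(X_n) converges in distribution
# to a chi-square distribution with 1 degree of freedom.
gx = xn ** 2

# Compare the empirical distribution of g(X_n) with the chi2(1) limit
# via a Kolmogorov-Smirnov statistic (a small value indicates a close fit).
ks = stats.kstest(gx, stats.chi2(df=1).cdf)
print(f"KS statistic vs chi2(1): {ks.statistic:.4f}")
```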
Slutsky’s Theorem is a collection of results concerning the asymptotic behavior of random variables. It is extremely useful in establishing the limiting distributions of estimators and test statistics.
Slutsky's Theorem
If $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{p} c$, where $c$ is a constant, then
$$X_n + Y_n \xrightarrow{d} X + c, \qquad Y_n X_n \xrightarrow{d} cX, \qquad X_n / Y_n \xrightarrow{d} X / c \quad (\text{provided } c \neq 0).$$
If $X_n \xrightarrow{p} X$ and $Y_n \xrightarrow{p} Y$, then
$$X_n + Y_n \xrightarrow{p} X + Y, \qquad X_n Y_n \xrightarrow{p} XY.$$
Slutsky's theorem essentially says that convergence in distribution and convergence in probability behave "nicely" with continuous functions and arithmetic operations, especially when one of the sequences converges to a constant. You can often treat limits of random variables much like limits of ordinary sequences, provided you are careful about the mode of convergence.
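A classic application is the studentized sample mean: by the CLT the numerator $\sqrt{n}(\bar{X}_n - \mu) \xrightarrow{d} N(0, \sigma^2)$, the sample standard deviation $S_n \xrightarrow{p} \sigma$, and Slutsky's theorem lets us divide the two, so $\sqrt{n}(\bar{X}_n - \mu)/S_n \xrightarrow{d} N(0, 1)$. The following sketch checks this numerically (again not from the original notes; the Exponential(1) population, sample sizes, and KS comparison are choices made here for illustration).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Draw Exponential(1) samples: mean mu = 1 and sd sigma = 1.
n, reps = 500, 20_000
x = rng.exponential(scale=1.0, size=(reps, n))

# Numerator: sqrt(n) * (xbar - mu) converges in distribution
# to N(0, sigma^2) by the CLT.
num = np.sqrt(n) * (x.mean(axis=1) - 1.0)

# Denominator: the sample sd S_n converges in probability to sigma = 1.
sn = x.std(axis=1, ddof=1)

# Slutsky's theorem: the ratio converges in distribution to N(0, 1),
# even though the denominator is itself random.
t = num / sn
ks = stats.kstest(t, stats.norm.cdf)
print(f"KS statistic vs N(0,1): {ks.statistic:.4f}")
```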