3.2 Limiting Distributions

3.2.1 Convergence in probability
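For reference, the standard definition: a sequence of random variables $X_n$ converges in probability to a random variable $X$, written $X_n \xrightarrow{p} X$, if for every $\varepsilon > 0$,

$$\lim_{n \to \infty} P\left(|X_n - X| > \varepsilon\right) = 0.$$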

3.2.2 Convergence in distribution
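Again for reference: a sequence $X_n$ with distribution functions $F_n$ converges in distribution to $X$ with distribution function $F$, written $X_n \xrightarrow{d} X$, if

$$\lim_{n \to \infty} F_n(x) = F(x)$$

at every point $x$ at which $F$ is continuous.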

Theorem 3.1 Continuous Mapping Theorem

Let $X_n$ be a sequence of random variables such that $X_n \xrightarrow{d} X$, where $X$ is some random variable (or constant). Let $g(\cdot)$ be a continuous function. Then

$$g(X_n) \xrightarrow{d} g(X).$$

If $X$ is a constant $c$, then $g(X_n) \xrightarrow{p} g(c)$. (Convergence in distribution to a constant implies convergence in probability to that constant.)
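As a quick illustration (a standard textbook example, not specific to these notes): if the central limit theorem gives $\sqrt{n}(\bar{X}_n - \mu)/\sigma \xrightarrow{d} Z$ with $Z \sim N(0, 1)$, then applying the continuous function $g(t) = t^2$ yields

$$\frac{n(\bar{X}_n - \mu)^2}{\sigma^2} \xrightarrow{d} Z^2 \sim \chi^2_1.$$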

Slutsky’s Theorem is a collection of results concerning the asymptotic behavior of random variables. It is extremely useful in establishing the limiting distributions of estimators and test statistics.

Slutsky's Theorem

If $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{p} c$, where $c$ is a constant, then

$$X_n + Y_n \xrightarrow{d} X + c,$$

$$Y_n X_n \xrightarrow{d} cX,$$

$$X_n / Y_n \xrightarrow{d} X / c, \quad \text{provided } c \neq 0.$$

If $X_n \xrightarrow{p} X$ and $Y_n \xrightarrow{p} Y$, then

$$X_n + Y_n \xrightarrow{p} X + Y \quad \text{and} \quad X_n Y_n \xrightarrow{p} XY.$$
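A canonical application (again a standard example): the studentized mean. Suppose $X_1, \ldots, X_n$ are i.i.d. with mean $\mu$ and variance $\sigma^2 < \infty$. The CLT gives $\sqrt{n}(\bar{X}_n - \mu)/\sigma \xrightarrow{d} N(0, 1)$, and the sample standard deviation satisfies $s_n \xrightarrow{p} \sigma$, so $\sigma / s_n \xrightarrow{p} 1$ by the continuous mapping theorem. Slutsky's theorem then gives

$$\frac{\sqrt{n}(\bar{X}_n - \mu)}{s_n} = \frac{\sigma}{s_n} \cdot \frac{\sqrt{n}(\bar{X}_n - \mu)}{\sigma} \xrightarrow{d} N(0, 1).$$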

Slutsky’s theorem essentially says that convergence in distribution and convergence in probability behave “nicely” with continuous functions and arithmetic operations, especially when one of the sequences converges to a constant. You can often treat limits of random variables much like limits of ordinary sequences, provided you are careful about the mode of convergence.
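A minimal simulation sketch of the studentized-mean example above (assuming NumPy; the exponential population, sample sizes, and cutoff are illustrative choices, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

def studentized_mean(n, reps=100_000, mu=2.0):
    """Simulate sqrt(n) * (xbar - mu) / s for Exponential samples with mean mu."""
    x = rng.exponential(scale=mu, size=(reps, n))
    xbar = x.mean(axis=1)
    s = x.std(axis=1, ddof=1)  # sample standard deviation: s_n ->p sigma
    return np.sqrt(n) * (xbar - mu) / s

# By Slutsky's theorem the statistic is approximately N(0, 1) for large n,
# so P(T <= 1.96) should approach Phi(1.96) ≈ 0.975 as n grows.
for n in (10, 100, 1000):
    t = studentized_mean(n)
    print(n, np.mean(t <= 1.96))
```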