Convergence
References
- STATS 203 - Large Sample Theory
- Master Program: Probability Theory - Lecture 3: Applications of independence
Introduction
TLDR
The distinguishing feature of each mode of convergence is the restriction it places on the underlying probability space.
- In convergence in Law/distribution, we do not require \(X_n\) and \(X\) to be defined on the same probability space.
- In convergence in probability or expectation, we require that for each \(n\), \(X_n\) and \(X\) are defined on the same probability space, but this probability space is allowed to change with \(n\).
- In almost sure convergence, the underlying probability space for \(X_n\) and \(X\) must be the same and fixed for all \(n\).
Weak Law of Large Numbers
Presentation
It seems like there are at least two distinct ways to present this result. The first places more emphasis on random variables; the second, which I follow below, places more emphasis on empirical measures. The two approaches are related by the projection construction: the i.i.d. random variables of the first approach can be realized as coordinate projections on a product space.
We begin by defining a probability space parameterized by \(n\): the product space
\[
(\Omega_n, \mathcal{F}_n, P_n) = \left(\mathbb{R}^n, \mathcal{B}(\mathbb{R}^n), \mu^{\otimes n}\right),
\]
where \(\mu\) is the common law of the observations.
Projection Random Variables
Defined coordinate-wise as follows:
\[
\pi_i(\omega_1, \dots, \omega_n) = \omega_i, \qquad i = 1, \dots, n,
\]
giving \(n\) i.i.d. projection random variables, each with law \(\mu\).
From this, we can define the following sequence of random variables:
\[
\bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} \pi_i.
\]
We can show the following result.
Result
For every \(\varepsilon > 0\), \(P_n\left(\left|\bar{X}_n - \mathbb{E}_\mu[\pi_1]\right| > \varepsilon\right) \to 0\); this is the Weak Law of Large Numbers, stated again below.
We say a sequence \(X_n\) converges to \(X\) in one of the following senses.
Convergence in Distribution (or Law)
- A random variable \(X_n\) converges in distribution (or in law) to \(X\) when the corresponding sequence of CDFs converges pointwise to the CDF of \(X\) at every continuity point of that CDF:
\[
F_{X_n}(x) \to F_X(x) \quad \text{for every } x \text{ at which } F_X \text{ is continuous}.
\]
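A minimal simulation sketch of this definition (not from the lecture notes; the Uniform example and grid are my own choices): by the CLT, the standardized mean of \(n\) Uniform(0,1) draws converges in distribution to \(N(0,1)\), so its empirical CDF should approach the standard normal CDF.

```python
import math
import numpy as np

rng = np.random.default_rng(0)
n, reps = 200, 5000

# X_n: standardized mean of n Uniform(0,1) draws.
# By the CLT, X_n converges in distribution to N(0, 1).
samples = rng.uniform(size=(reps, n))
z = (samples.mean(axis=1) - 0.5) / (math.sqrt(1 / 12) / math.sqrt(n))

def Phi(t):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Compare the empirical CDF of z with the N(0,1) CDF on a grid of
# continuity points; the maximal gap should be small.
grid = np.linspace(-2.0, 2.0, 9)
ecdf = np.array([(z <= t).mean() for t in grid])
gap = max(abs(ecdf[i] - Phi(t)) for i, t in enumerate(grid))
print(gap)
```

Note that only the CDFs are compared; nothing here requires the \(X_n\) to live on a common probability space, matching the TLDR above.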
Helly-Bray Theorem
TLDR
If the parameters of our model converge in distribution (law), then expectations of bounded continuous functions of them converge, which gives an asymptotically unbiased estimate.
Let \(\theta_n \to_D \theta\) and let \(g\) be a continuous and bounded function. Then:
\[
\mathbb{E}[g(\theta_n)] \to \mathbb{E}[g(\theta)].
\]
Note, if we define our estimator as \(\hat{\beta}_n = g(\theta_n)\) for such a \(g\), then this theorem tells us that
\[
\mathbb{E}[\hat{\beta}_n] = \mathbb{E}[g(\theta_n)] \to \mathbb{E}[g(\theta)],
\]
i.e. \(\hat{\beta}_n\) is asymptotically unbiased for \(\mathbb{E}[g(\theta)]\).
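As an illustrative simulation (my own example, not from the notes): take \(\theta_n\) to be the sample mean of \(n\) Exp(1) draws, so \(\theta_n \to_D 1\), and take \(g = \tanh\) as a bounded continuous test function. Helly-Bray then predicts \(\mathbb{E}[g(\theta_n)] \to g(1) = \tanh(1)\).

```python
import numpy as np

rng = np.random.default_rng(1)
g = np.tanh  # bounded, continuous test function (illustrative choice)

def mean_g_of_theta_n(n, reps=20000):
    # theta_n: sample mean of n Exp(1) draws; theta_n -> 1 in distribution.
    theta = rng.exponential(size=(reps, n)).mean(axis=1)
    # Monte Carlo estimate of E[g(theta_n)].
    return g(theta).mean()

for n in [1, 10, 100, 1000]:
    print(n, mean_g_of_theta_n(n))
# E[g(theta_n)] approaches g(1) = tanh(1) ≈ 0.7616 as n grows.
```

Boundedness of \(g\) matters: with unbounded \(g\), convergence in distribution alone does not guarantee convergence of expectations.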
Convergence in Probability
If the limit is a random variable (rather than a constant), then the joint distribution of \(X_n\) and \(X\) matters, which is why they must live on the same probability space!
- A random variable \(X_n\) converges in probability to \(X\) when for all \(\varepsilon > 0\) the following holds:
\[
\lim_{n \to \infty} P\left(|X_n - X| > \varepsilon\right) = 0.
\]
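A minimal sketch of this definition in action (my own example): Monte Carlo estimates of the tail probability \(P(|\bar{X}_n - 0.5| > \varepsilon)\) for the mean of \(n\) Uniform(0,1) draws, which should shrink to 0 as \(n\) grows.

```python
import numpy as np

rng = np.random.default_rng(2)
eps, reps = 0.1, 10000

# Monte Carlo estimate of P(|X_bar_n - 0.5| > eps) for X_i ~ Uniform(0,1):
# convergence in probability says this tail probability tends to 0.
tail_prob = {}
for n in [10, 100, 1000]:
    means = rng.uniform(size=(reps, n)).mean(axis=1)
    tail_prob[n] = float(np.mean(np.abs(means - 0.5) > eps))
    print(n, tail_prob[n])
```

Here the limit is the constant 0.5, so the joint-distribution subtlety above does not bite; it matters when \(X\) is genuinely random.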
Applications
Weak Law of Large Numbers
Proof given here
Let \(X_1, \dots, X_n\) be a family of i.i.d. random variables with a finite second moment. Then
\[
\bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i \to_P \mathbb{E}[X_1].
\]
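The linked proof is not reproduced here; under the finite-second-moment assumption, the standard one-line argument is via Chebyshev's inequality:

```latex
P\left(\left|\bar{X}_n - \mathbb{E}[X_1]\right| > \varepsilon\right)
\;\le\; \frac{\operatorname{Var}(\bar{X}_n)}{\varepsilon^2}
\;=\; \frac{\operatorname{Var}(X_1)}{n\,\varepsilon^2}
\;\xrightarrow[n \to \infty]{}\; 0.
```

Independence enters only through \(\operatorname{Var}(\bar{X}_n) = \operatorname{Var}(X_1)/n\), so pairwise uncorrelatedness already suffices for this argument.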