To emphasize the fact that we are working with sequences of random variables, we may place a subscript $n$ on the appropriate random variable, e.g., write a sequence of random variables as $\{X_{n}\}$.
Definition 5.1.1: Convergence in probability
Definition
Let
- $\{X_{n}\}$: Sequence of random variables
- $X$: Random variable
If $\lim_{ n \to \infty } P[|X_{n}-X|\geq \epsilon] = 0, \quad \forall \epsilon>0$
- Or equivalently $\lim_{ n \to \infty } P[|X_{n}-X| < \epsilon] = 1, \quad \forall \epsilon>0$
Then
- We say $X_{n}$ converges in probability to $X$
- We write $X_{n} \xrightarrow{P} X$
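A minimal simulation sketch of what this definition measures (the particular sequence is an assumption for illustration, not from the text): take $X_{n}=X+Z_{n}/n$ with $X, Z_{n} \sim N(0,1)$, and estimate $P[|X_{n}-X|\geq \epsilon]$ by Monte Carlo, which should shrink toward $0$ as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sequence (an assumption for this sketch, not from the text):
# X_n = X + Z_n / n with X, Z_n ~ N(0, 1), so |X_n - X| = |Z_n| / n.
eps = 0.05           # the epsilon in the definition
reps = 100_000       # Monte Carlo repetitions used to estimate the probability

for n in [1, 5, 25, 125]:
    X = rng.standard_normal(reps)
    Z = rng.standard_normal(reps)
    X_n = X + Z / n
    prob = np.mean(np.abs(X_n - X) >= eps)
    print(f"n={n:4d}  estimated P[|X_n - X| >= {eps}] = {prob:.4f}")
```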
Theorem 5.1.1: Weak law of large numbers
Theorem
Let
- $\{X_{n}\}$: Sequence of iid random variables (a random sample), with
    - Common mean $\mu$
    - Common variance $\sigma^{2}<\infty$
- $\bar{X}_{n}=\frac{1}{n}\sum_{i=1}^{n}X_{i}$: Sample mean
Then $\bar{X}_{n} \xrightarrow{P} \mu$
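As a quick sanity check, the sketch below assumes $X_{i} \sim \text{Exponential}(1)$ (so $\mu=\sigma^{2}=1$), estimates $P[|\bar{X}_{n}-\mu|\geq \epsilon]$ by Monte Carlo, and compares it with the Chebyshev bound $\sigma^{2}/(n\epsilon^{2})$ that drives the usual proof of the weak law.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed setup for illustration: X_i ~ Exponential(1), so mu = 1 and sigma^2 = 1.
mu, sigma2, eps, reps = 1.0, 1.0, 0.1, 1_000

for n in [10, 100, 1_000, 10_000]:
    x_bar = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
    prob = np.mean(np.abs(x_bar - mu) >= eps)    # Monte Carlo estimate
    bound = sigma2 / (n * eps**2)                # Chebyshev bound sigma^2 / (n eps^2)
    print(f"n={n:6d}  estimated P[|X_bar_n - mu| >= {eps}] = {prob:.4f}  (bound {min(bound, 1.0):.4f})")
```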
In the following theorems (Theorem 5.1.2 to Theorem 5.1.5) we state some results about convergence of sequences of random variables. For brevity, we implicitly let:
- $X, Y$: Random variables
- $\{X_{n}\}, \{Y_{n}\}$: Sequences of random variables, for $X$ and $Y$ respectively
- $a$: Some constant
Theorem 5.1.2
Suppose $X_{n} \xrightarrow{P} X$ and $Y_{n} \xrightarrow{P} Y$
Then $X_{n}+Y_{n} \xrightarrow{P} X+Y$
Theorem 5.1.3
Suppose $X_{n} \xrightarrow{P} X$
Then $aX_{n} \xrightarrow{P} aX$
Theorem 5.1.4
Let
- $g$: Real function, continuous at $a$
Suppose $X_{n} \xrightarrow{P} a$
Then $g(X_{n}) \xrightarrow{P} g(a)$
Theorem 5.1.5
Suppose $X_{n} \xrightarrow{P} X$ and $Y_{n} \xrightarrow{P} Y$
Then $X_{n}Y_{n} \xrightarrow{P} XY$
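For example, these results can be combined: if $X_{n} \xrightarrow{P} \mu$ for a constant $\mu$, then Theorem 5.1.4 with the continuous function $g(x)=x^{2}$ gives $X_{n}^{2} \xrightarrow{P} \mu^{2}$, and Theorems 5.1.2 and 5.1.3 then give, e.g., $X_{n}^{2}-2X_{n} \xrightarrow{P} \mu^{2}-2\mu$.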
Definition 5.1.2: Consistent estimator
Definition
Let
- $X$: Random variable, with
    - $\Omega$: Parameter space, where $\theta \in \Omega$
    - cdf $F(x;\theta)$
- $X_{1},\dots,X_{n}$: Random sample of $X$
- $T_{n}$: Statistic
If $T_{n} \xrightarrow{P} \theta$
Then we say $T_{n}$ is a consistent estimator of $\theta$
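For example, by the weak law of large numbers (Theorem 5.1.1), the sample mean $\bar{X}_{n}$ is a consistent estimator of the population mean $\mu$.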
Theorem: Law of large numbers for sample variance
Let
- $\{X_{n}\}$: Sequence of iid random variables (a random sample), with
    - Common mean $\mu$
    - Common variance $\sigma^{2}<\infty$
- $S_{n}^{2}=\frac{1}{n-1}\sum_{i=1}^{n}(X_{i}-\bar{X}_{n})^{2}$: Sample variance
Then $S_{n}^{2} \xrightarrow{P} \sigma^{2}$
Note
This theorem states that the sample variance $S_{n}^{2}$ is a consistent estimator of the population variance $\sigma^{2}$.
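A minimal simulation sketch of this consistency claim, assuming $X_{i} \sim \text{Uniform}(0,1)$ (so $\sigma^{2}=1/12$): estimate $P[|S_{n}^{2}-\sigma^{2}|\geq \epsilon]$ by Monte Carlo and watch it shrink as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed setup for illustration: X_i ~ Uniform(0, 1), so sigma^2 = 1/12.
sigma2, eps, reps = 1.0 / 12.0, 0.01, 5_000

for n in [10, 100, 1_000]:
    samples = rng.uniform(size=(reps, n))
    s2 = samples.var(axis=1, ddof=1)             # sample variance with the 1/(n-1) factor
    prob = np.mean(np.abs(s2 - sigma2) >= eps)
    print(f"n={n:5d}  estimated P[|S_n^2 - sigma^2| >= {eps}] = {prob:.4f}")
```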
Before stating the strong law of large numbers, we need to introduce the concept of almost sure convergence, which is a stronger form of convergence than convergence in probability.
Definition: Almost sure convergence
Definition
Let
- $\{X_{n}\}$: Sequence of random variables
- $X$: Random variable
If $P\left[\lim_{n \to \infty} X_{n} = X\right] = 1$
Then
- We say $X_{n}$ converges almost surely to $X$
- We write $X_{n} \xrightarrow{a.s.} X$
Theorem: Strong law of large numbers
Let
- $\{X_{n}\}$: Sequence of iid random variables (a random sample), with
    - Common mean $\mu$
    - Common variance $\sigma^{2}<\infty$
Then $\bar{X}_{n} \xrightarrow{a.s.} \mu$
The strong law of large numbers provides a stronger guarantee than the weak law of large numbers.
While the weak law states that $\bar{X}_{n}$ converges to $\mu$ in probability, the strong law states that ==$\bar{X}_{n}$ converges to $\mu$ almost surely==, meaning that with probability 1, the sample mean will eventually stabilize around the true mean $\mu$ as $n \to \infty$.
The key differences between the strong and weak laws of large numbers are:
- Weak: For any $\epsilon>0$, the probability that $\bar{X}_{n}$ is “far” from $\mu$ (by at least $\epsilon$) becomes small as $n$ gets large
- Strong: With probability $1$, the sequence $\{\bar{X}_{n}\}$ converges to $\mu$
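To see the path-wise behaviour the strong law describes, the sketch below fixes a single realization of an assumed $N(\mu,1)$ sequence (with $\mu=3$ chosen arbitrarily) and prints the running mean $\bar{X}_{n}$, which settles around $\mu$ as $n$ grows; the weak-law simulations above instead look at the probability of a large deviation at each fixed $n$.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed setup for illustration: X_i ~ N(mu, 1) with mu = 3, and we fix a
# single realization (one "omega") of the whole sequence.
mu, N = 3.0, 100_000
xs = rng.normal(loc=mu, scale=1.0, size=N)
running_mean = np.cumsum(xs) / np.arange(1, N + 1)   # X_bar_n along this one path

for n in [10, 100, 1_000, 10_000, 100_000]:
    print(f"n={n:7d}  X_bar_n = {running_mean[n - 1]:.4f}")
```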