Modes of Convergence

When we compute probabilities and expectations involving a large number of random variables, we often settle for an approximate solution, or for an approximate value of an average computed from a large sample of independently observed values of the random variables. To make "approximately" precise, we must take a limit as the number of variables increases. This is the idea of "convergence": a quantity tends to a limiting value as the number of variables, or the sample size n, becomes sufficiently large. Thus, "convergence" means "it tends to something" in the limit as n tends to infinity ( n → ∞ ).
Definition:
The different ways in which a sequence of random variables defined on a probability space (Ω, 𝒜, P) can converge to a limit are called modes of convergence.
The important modes of convergence are the following :
- Convergence in probability
- Convergence almost surely
- Convergence in the rth mean
- Convergence in distribution
Convergence in probability
Definition: Let {Xn} = {X1, X2, . . .} be a sequence of random variables defined on the same probability space (Ω, 𝒜, P). The sequence {Xn} is said to converge to a random variable X in probability (also called stochastic or weak convergence), denoted by
$$X_n \overset{P}{\rightarrow} X,$$
if for every ε > 0, as n → ∞,
$$P \left[ \, | X_n - X | \geq \varepsilon \, \right] \rightarrow 0.$$
Equivalently, if for every ε > 0, as n → ∞,
$$P \left[ \, | X_n - X | < \varepsilon \, \right] \rightarrow 1.$$
This definition says that the sequence {Xn} converges in probability to X precisely when the sequence {Xn − X} converges to zero in probability as n → ∞. In other words, for large n the difference between Xn and X is small with high probability. Symbolically,
$$X_n \overset{P}{\rightarrow} X,$$
if for every ε > 0,
$$\lim_{n \to \infty} P \left[ \, | X_n - X | < \varepsilon \, \right] = 1.$$
This mode of convergence occurs frequently in practical problems. In particular, the weak law of large numbers is a statement of convergence in probability: the sample mean converges in probability to the population mean.
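The definition can be made concrete with a small simulation of the weak law of large numbers. The sketch below is not from the text: the function name `tail_prob`, the Uniform(0, 1) population (mean μ = 0.5), and all parameter values are illustrative choices. It uses Monte Carlo to estimate P[ |X̄n − μ| ≥ ε ] for the sample mean X̄n, and the estimate shrinks toward 0 as n grows, as convergence in probability requires.

```python
import random

def tail_prob(n, eps=0.1, trials=2000, mu=0.5, seed=1):
    """Monte Carlo estimate of P[|Xbar_n - mu| >= eps] for the mean of
    n independent Uniform(0, 1) draws (population mean mu = 0.5)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xbar = sum(rng.random() for _ in range(n)) / n
        if abs(xbar - mu) >= eps:
            hits += 1
    return hits / trials

# The estimated tail probability decreases toward 0 as n increases:
for n in (5, 50, 500):
    print(n, tail_prob(n))
```

Nothing here depends on the uniform distribution; any population with finite mean shows the same pattern, by the weak law of large numbers.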
Properties of Convergence in Probability:
Property 1:
If $X_n \overset{P}{\rightarrow} X$ and $C$ is a constant, then $CX_n \overset{P}{\rightarrow} CX$.
Proof: If C = 0, the result is trivial.
If C ≠ 0, then for every ε > 0,
$$P \left[ \, | CX_n - CX | \geq \varepsilon \, \right] = P \left[ \, | C | \, | X_n - X | \geq \varepsilon \, \right]$$
$$= P \left[ \, | X_n - X | \geq \frac{\varepsilon}{|C|} \, \right] \rightarrow 0 \quad \text{as } n \rightarrow \infty.$$
Hence, the result follows.
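Property 1 can be checked numerically. In the sketch below (our own illustrative construction, not from the text), we take $X_n - X = Z/n$ with $Z$ standard normal, so $X_n \overset{P}{\rightarrow} X$, and estimate the tail probability $P[\, |CX_n - CX| \geq \varepsilon \,]$; scaling by $|C|$ simply rescales $\varepsilon$, exactly as in the proof.

```python
import random

def scaled_tail(n, c=3.0, eps=0.1, trials=2000, seed=2):
    """Estimate P[|c*X_n - c*X| >= eps] where X_n - X = Z/n, Z ~ N(0, 1).
    Note |c*X_n - c*X| = |c| * |Z| / n, as in the proof's rearrangement."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        z = rng.gauss(0.0, 1.0)
        if abs(c) * abs(z) / n >= eps:
            hits += 1
    return hits / trials

# Even after scaling by c, the tail probability vanishes as n grows:
for n in (1, 10, 1000):
    print(n, scaled_tail(n))
```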
Property 2 :
If $X_n \overset{P}{\rightarrow} 0$, then $X_n^2 \overset{P}{\rightarrow} 0$.
Proof:
Since $X_n \overset{P}{\rightarrow} 0$, for every ε > 0 we have
$$P \left[ \, | X_n - 0 | \geq \sqrt{\varepsilon} \, \right] \rightarrow 0 \quad \text{as } n \rightarrow \infty$$
$$\Rightarrow P \left[ \, | X_n | \geq \sqrt{\varepsilon} \, \right] \rightarrow 0 \quad \text{as } n \rightarrow \infty.$$
Since $| X_n | \geq \sqrt{\varepsilon}$ if and only if $X_n^2 \geq \varepsilon$, this gives
$$P \left[ \, X_n^2 \geq \varepsilon \, \right] \rightarrow 0,$$
that is,
$$P \left[ \, | X_n^2 - 0 | \geq \varepsilon \, \right] \rightarrow 0 \quad \text{as } n \rightarrow \infty.$$
Hence, the property is proved.
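The proof rests on the event identity $\{ |X_n| \geq \sqrt{\varepsilon} \} = \{ X_n^2 \geq \varepsilon \}$ for $\varepsilon > 0$: both describe exactly the same outcomes, so their probabilities are equal. A quick numerical check of this identity on random draws (a sketch; the function name and sampling range are arbitrary choices):

```python
import math
import random

def events_agree(eps=0.25, trials=10_000, seed=3):
    """Check that {|x| >= sqrt(eps)} and {x**2 >= eps} hold for exactly
    the same values x, on random draws from [-2, 2]."""
    rng = random.Random(seed)
    root = math.sqrt(eps)
    for _ in range(trials):
        x = rng.uniform(-2.0, 2.0)
        if (abs(x) >= root) != (x * x >= eps):
            return False
    return True

print(events_agree())
```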
Property 3:
If $X_n \overset{P}{\rightarrow} X$ and $Y_n \overset{P}{\rightarrow} Y$, then
$$X_n + Y_n \overset{P}{\rightarrow} X + Y.$$
Proof:
For every ε > 0, we have
$$P \left[ \, | ( X_n + Y_n ) - ( X + Y ) | \geq \varepsilon \, \right] = P \left[ \, | ( X_n - X ) + ( Y_n - Y ) | \geq \varepsilon \, \right]$$
$$\leq P \left[ \, | X_n - X | + | Y_n - Y | \geq \varepsilon \, \right]$$
$$\leq P \left[ \, | X_n - X | \geq \frac{\varepsilon}{2} \, \right] + P \left[ \, | Y_n - Y | \geq \frac{\varepsilon}{2} \, \right]$$
$$\rightarrow 0 \quad \text{as } n \rightarrow \infty,$$
where the last inequality holds because if $| X_n - X | + | Y_n - Y | \geq \varepsilon$, then at least one of the two terms must be at least $\frac{\varepsilon}{2}$; each of the two probabilities tends to 0 by hypothesis.
Hence, this property follows.
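Property 3, and the union bound used in its proof, can also be checked by simulation. The sketch below is our own illustrative construction (not from the text): $X_n - X = Z_1/n$ and $Y_n - Y = Z_2/n$ with independent standard normals, so both hypotheses hold. It estimates the left-hand tail probability together with the proof's bound $P[\, |X_n - X| \geq \varepsilon/2 \,] + P[\, |Y_n - Y| \geq \varepsilon/2 \,]$.

```python
import random

def sum_tail(n, eps=0.1, trials=2000, seed=4):
    """Return (tail, bound): Monte Carlo estimates of
    P[|(X_n + Y_n) - (X + Y)| >= eps] and of the proof's bound
    P[|X_n - X| >= eps/2] + P[|Y_n - Y| >= eps/2],
    where X_n - X = Z1/n and Y_n - Y = Z2/n with Z1, Z2 ~ N(0, 1)."""
    rng = random.Random(seed)
    tail = 0
    bound = 0
    for _ in range(trials):
        d1 = rng.gauss(0.0, 1.0) / n  # X_n - X
        d2 = rng.gauss(0.0, 1.0) / n  # Y_n - Y
        tail += abs(d1 + d2) >= eps
        bound += (abs(d1) >= eps / 2) + (abs(d2) >= eps / 2)
    return tail / trials, bound / trials

# The tail estimate never exceeds the bound, and both vanish as n grows:
for n in (1, 10, 1000):
    print(n, sum_tail(n))
```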