Properties of Convergence Almost Surely, Convergence in rth Mean and Convergence in Distribution
Convergence Almost Surely
Definition:
A sequence of random variables {Xn} is said to converge to X almost surely (a.s.), or strongly, denoted by
$$X_n \overset{a.s.}{\rightarrow} X, \quad \text{if}$$
$$P\left( \lim_{n \to \infty} X_n = X \right) = 1.$$
In other words, a sequence of random variables {Xn} is said to converge to X almost surely, if
$$\lim_{n \to \infty} X_n(w) = X(w)$$
for almost all members w of the sample space S on which the random variables are defined, i.e. for all w ∈ S except possibly a set of probability zero.
Symbolically, $$X_n \overset{a.s.}{\rightarrow} X \text{ iff } X_n(w) \rightarrow X(w) \text{ for almost all } w \in S.$$
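As a rough numerical illustration of this definition (not part of the original notes, and a sketch rather than a proof), one can simulate many sample paths w and check how often the running mean X_n(w) of iid Uniform(0, 1) draws stays within a tolerance eps of its almost-sure limit 0.5 for every n beyond a cutoff N; the strong law of large numbers is what guarantees the almost sure convergence being checked, and the names n_paths, n_steps, eps and N are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative settings: number of simulated paths w, length of each path,
# a tolerance eps, and a cutoff index N (all arbitrary choices).
n_paths, n_steps, eps, N = 2000, 5000, 0.03, 1000

draws = rng.uniform(0.0, 1.0, size=(n_paths, n_steps))           # iid Uniform(0, 1)
running_mean = draws.cumsum(axis=1) / np.arange(1, n_steps + 1)  # X_n(w), the running mean

# By the strong law of large numbers, X_n -> 0.5 almost surely, so the
# proportion of paths with |X_n(w) - 0.5| < eps for every n >= N should be
# close to 1, and it tends to 1 as N grows.
stays_close = (np.abs(running_mean[:, N - 1:] - 0.5) < eps).all(axis=1)
print("fraction of paths staying eps-close to 0.5 for all n >= N:",
      stays_close.mean())
```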
Properties of Convergence Almost Surely:
- If $$X_n \overset{a.s.}{\rightarrow} X$$, then $$X_n \overset{P}{\rightarrow} X$$.
- If $$X_n \overset{a.s.}{\rightarrow} X$$, then every subsequence $$\{X_{n_k}\}$$ of the sequence $$\{X_n\}$$ also converges almost surely to X, i.e. $$X_{n_k} \overset{a.s.}{\rightarrow} X$$ (a short numerical sketch of this subsequence property follows this list).
- Almost sure convergence and almost sure mutual convergence are equivalent, i.e. $$X_n \overset{a.s.}{\rightarrow} X \Leftrightarrow X_n - X_m \overset{a.s.}{\rightarrow} 0$$ as $$n, m \to \infty$$.
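Continuing the same illustrative setup, the sketch below (not from the notes) checks the subsequence property along the hypothetical subsequence n_k = 2^k on a single simulated path: the values X_{n_k}(w) drift toward the same almost-sure limit 0.5.

```python
import numpy as np

rng = np.random.default_rng(1)

# One simulated path w: the running mean X_n(w) of iid Uniform(0, 1) draws,
# which converges almost surely to 0.5 by the strong law of large numbers.
n_steps = 2 ** 14
path = rng.uniform(0.0, 1.0, n_steps)
running_mean = path.cumsum() / np.arange(1, n_steps + 1)

# Subsequence property: along the (illustrative) subsequence n_k = 2^k, the
# values X_{n_k}(w) approach the same limit 0.5.
n_k = 2 ** np.arange(4, 15)              # 16, 32, ..., 16384
for n, value in zip(n_k, running_mean[n_k - 1]):
    print(f"n_k = {n:5d}   X_nk = {value:.4f}")
```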
Convergence in rth Mean:
Definition:
A sequence of random variables {Xn} is said to converge to X in rth mean, denoted by
$$X_n \overset{r.m.}{\rightarrow} X, \quad \text{if}$$
$$E|X_n - X|^r \rightarrow 0 \quad \text{as } n \rightarrow \infty,$$
i.e. if the rth mean of the difference between Xn and X tends to zero as n tends to infinity.
Thus, we give the following two important definitions:
1. $$X_n \overset{m}{\rightarrow} X, \quad \text{if}$$
$$E|X_n - X| \rightarrow 0 \quad \text{as } n \rightarrow \infty,$$
$$\text{i.e. } \lim_{n \to \infty} E|X_n - X| = 0,$$
or, if the mean of the difference between Xn and X tends to zero as n → ∞.
2. A sequence of random variables {Xn} is said to converge to X in mean square or quadratic mean, written as $$X_n \overset{q.m.}{\rightarrow} X, \quad \text{if}$$
$$E|X_n - X|^2 \rightarrow 0 \quad \text{as } n \rightarrow \infty,$$
$$\text{i.e. } \lim_{n \to \infty} E(X_n - X)^2 = 0.$$
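These definitions can be probed numerically. In the sketch below (an illustration, not part of the notes), X_n is taken to be the mean of n iid Exponential(1) draws, whose limit is the constant X = 1, and E|X_n − X|^r is approximated by averaging over many independent replications; for r = 2 the exact value is Var(X_n) = 1/n, so the estimates should shrink at roughly that rate.

```python
import numpy as np

rng = np.random.default_rng(2)

def rth_mean_error(n, r, reps=10000):
    """Monte Carlo estimate of E|X_n - X|^r, where X_n is the mean of n iid
    Exponential(1) draws and X = 1 is its limit (illustrative choice)."""
    sample_means = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
    return np.mean(np.abs(sample_means - 1.0) ** r)

# Convergence in mean (r = 1) and in quadratic mean (r = 2): both estimated
# rth absolute moments of X_n - X should tend to 0 as n grows.
for n in (10, 100, 1000):
    print(n, round(rth_mean_error(n, r=1), 5), round(rth_mean_error(n, r=2), 6))
```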
Convergence in Distribution:
Definition:
A sequence of distribution functions {Fn(x)} of random variables Xn is said to converge to F in distribution, or in law, or weakly, denoted by
$$F_n(x) \overset{D}{\rightarrow} F(x) \quad \text{or} \quad F_n \overset{D}{\rightarrow} F, \quad \text{if}$$
$$F_n(x) \rightarrow F(x) \quad \text{or} \quad F_n \rightarrow F \quad \text{as } n \rightarrow \infty$$
for all x ∈ C(F), the set of points of continuity of F.
Similarly, we give another definition:
A sequence of random variables {Xn} is said to converge in distribution (or law or weakly) to X with distribution function F, written as
$$X_n \overset{L}{\rightarrow} X, \quad \text{if}$$
$$\lim_{n \to \infty} F_n(x) = F(x) \quad \text{for all } x \in C(F),$$
where Fn(x) is the distribution function of Xn and F(x) is the distribution function of X.
Convergence in distribution means that, for sufficiently large n, the sequence {Xn} has approximately the limiting distribution F of the random variable X. Thus, the central limit theorem (CLT), an important limit theorem in probability theory, is established by applying this mode of convergence in distribution.
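To make the link with the CLT concrete, the sketch below (illustrative, not from the notes) standardizes the mean of n iid Uniform(0, 1) draws and compares the empirical distribution function F_n(x) with the standard normal distribution function F(x) at a few points x, all of which are continuity points of F; the sizes n and reps are arbitrary choices.

```python
import math
import numpy as np

rng = np.random.default_rng(3)

def std_normal_cdf(x):
    """Standard normal distribution function F(x), via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Illustrative CLT setup: Z_n = sqrt(n) * (mean of n Uniform(0,1) draws - 0.5) / sd,
# whose distribution function F_n(x) should approach F(x) as n grows.
n, reps = 1000, 10000
sd = math.sqrt(1.0 / 12.0)                       # standard deviation of Uniform(0, 1)
means = rng.uniform(0.0, 1.0, size=(reps, n)).mean(axis=1)
z = math.sqrt(n) * (means - 0.5) / sd

for x in (-1.5, 0.0, 1.5):
    empirical_Fn = np.mean(z <= x)               # empirical estimate of F_n(x)
    print(f"x = {x:+.1f}   F_n(x) ~ {empirical_Fn:.4f}   F(x) = {std_normal_cdf(x):.4f}")
```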
Relationship among the Various Modes of Convergence:
Among the above modes of convergence, the following relationships are established (a small numerical check is sketched after this list):
- Convergence almost surely implies convergence in probability. That is, $$X_n \overset{a.s.}{\rightarrow} X \Rightarrow X_n \overset{P}{\rightarrow} X.$$ In other words, if a sequence Xn converges almost surely to X, then Xn also converges in probability to X.
- Convergence in rth mean implies convergence in probability, i.e. $$X_n \overset{r.m.}{\rightarrow} X \Rightarrow X_n \overset{P}{\rightarrow} X.$$
- Convergence in quadratic mean implies convergence in probability, i.e. $$X_n \overset{q.m.}{\rightarrow} X \Rightarrow X_n \overset{P}{\rightarrow} X.$$
- Convergence in probability implies convergence in distribution, i.e. if $$X_n \overset{P}{\rightarrow} X$$, then $$F_n(x) \rightarrow F(x)$$ for all $$x \in C(F)$$.
- When X has a degenerate distribution, convergence in distribution and convergence in probability are equivalent.
A discrete distribution having probability 1 at a single point is called a degenerate distribution.
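The numerical check promised above (illustrative, not from the notes): with X_n the mean of n iid Uniform(0, 1) draws and X the constant, degenerate limit 0.5, X_n converges almost surely and in quadratic mean, so P(|X_n − X| > eps) should fall toward 0 as n grows, which is exactly convergence in probability; eps is an arbitrary tolerance.

```python
import numpy as np

rng = np.random.default_rng(4)

# X_n = mean of n iid Uniform(0, 1) draws; its limit X = 0.5 is degenerate
# (all probability mass at a single point), so convergence in distribution
# and convergence in probability coincide here.  We estimate P(|X_n - X| > eps).
eps, reps = 0.02, 5000
for n in (10, 100, 1000, 2000):
    sample_means = rng.uniform(0.0, 1.0, size=(reps, n)).mean(axis=1)
    print(n, np.mean(np.abs(sample_means - 0.5) > eps))
```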