problem of Chebyshev's Inequality and Weak Law of Large Number (WLLN)
Example :
If X is the number scored in a throw of a fair die, show that Chebyshev's inequality gives $$P \ ( \ | \ X \ - \ \mu \ | \ > \ 2.5 \ ) \ < \ 0.47,$$ where μ is the mean of X, while the actual probability is zero.
Solution :
Let X be a random variable which takes the values 1, 2, 3, . . ., 6, each with probability 1/6, in a throw of a fair die. Therefore,
$$E (X) \ = \ \sum_{i \ = \ 1}^{6} \ X_i \ p_i \ = \ \frac{1}{6} \ [ 1 \ + \ 2 \ + \ 3 \ + \ 4 \ + \ 5 \ + \ 6 ] \ = \ \frac{1}{6} \ \left [ \ \frac{(6) \ (7)}{2} \ \right ] \ = \ \frac{7}{2}$$
$$E (X^2) \ = \ \sum_{i \ = \ 1}^{6} \ X_i^{2} \ p_i \ = \ \frac{1}{6} \ [ 1^2 \ + \ 2^2 \ + \ . \ . \ . \ + \ 6^2 ] \ = \ \frac{1}{6} \ \left [ \ \frac{6 \ (6 \ + \ 1) \ (2 \ \times \ 6 \ + \ 1)}{6} \ \right ] \ = \ \frac{91}{6}$$
$$\therefore \ V (X) \ = \ E(X^2) \ - \ [E(X)]^2 \ = \ \frac{91}{6} \ - \ \left ( \ \frac{7}{2} \ \right )^2 \ = \ \frac{35}{12} \ \approx \ 2.9167$$
Now, for k > 0, Chebyshev's inequality gives,
$$P \ ( \ | \ X \ - \ E(X) \ | \ > \ k \ ) \ < \ \frac{V(X)}{k^2}$$
Choosing k = 2.5, we get
$$P \ ( \ | \ X \ - \ \mu \ | \ > \ 2.5 \ ) \ < \ \frac{2.9167}{(2.5)^2} \ \approx \ 0.47$$
$$\therefore \ P \ ( \ | \ X \ - \ \mu \ | \ > \ 2.5 \ ) \ < \ 0.47$$
But the actual probability is given by
$$P \ ( \ | \ X \ - \ \mu \ | \ > \ 2.5 \ ) \ = \ P \ \left ( \ | X \ - \ \frac{7}{2} | \ > \ 2.5 \ \right )$$
= P [ X lies outside the limits (3.5 - 2.5, 3.5 + 2.5), i.e. (1, 6) ] = 0
Since X is the number shown on the die, it cannot take a value outside the limits 1 and 6.
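The figures in this solution can be verified numerically. A minimal Python sketch (variable names are illustrative) recomputes E(X), V(X), the Chebyshev bound, and the exact probability:

```python
# Fair die: X takes the values 1..6, each with probability 1/6.
values = range(1, 7)
p = 1 / 6

EX = sum(x * p for x in values)           # E(X) = 7/2 = 3.5
EX2 = sum(x * x * p for x in values)      # E(X^2) = 91/6
var = EX2 - EX ** 2                       # V(X) = 35/12, about 2.9167

k = 2.5
bound = var / k ** 2                      # Chebyshev bound, about 0.4667 < 0.47

# Exact probability that |X - mu| > 2.5: no die face lies outside (1, 6).
actual = sum(p for x in values if abs(x - EX) > k)

print(EX, var, bound, actual)
```

The bound (about 0.467) is valid but very loose here: the exact probability is 0.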
Weak Law of Large Number (WLLN) :
Let X1, X2, . . ., Xn be a sequence of random variables and μ1, μ2, . . ., μn be their respective expectations, and let Bn = Var (X1 + X2 + . . . + Xn) < ∞. Then
$$P \ [ \ | \ \bar{X}_n \ - \ \bar{\mu}_n \ | \ < \ \epsilon \ ] \ \geq \ 1 \ - \ \eta$$
for all n > n0, where ε and η are arbitrarily small positive numbers, provided that $$\frac{B_n}{n^2} \ \rightarrow \ 0 \ \ \text{as} \ \ n \ \rightarrow \ \infty.$$
Proof :
Using Chebyshev's inequality, for ε > 0 we have
$$P \ [ \ | \ \bar{X}_n \ - \ \bar{\mu}_n \ | \ < \ \epsilon \ ] \ \geq \ 1 \ - \ \frac{B_n}{n^2 \epsilon^2}$$
$$\text{because} \ \ Var \ ( \bar{X}_n ) \ = \ \frac{V \ \left ( \ \sum_{k \ = \ 1}^{n} \ X_k \ \right )}{n^2} \ = \ \frac{B_n}{n^2}$$
$$\text{where} \ \ \bar{X}_n \ = \ \frac{\sum_{k \ = \ 1}^{n} \ X_k}{n} \ = \ \frac{X_1 \ + \ X_2 \ + \ . \ . \ . \ + \ X_n}{n}$$
$$\text{and} \ \ \bar{\mu}_n \ = \ \frac{\sum_{k \ = \ 1}^{n} \ \mu_k}{n} \ = \ \frac{\mu_1 \ + \ \mu_2 \ + \ . \ . \ . \ + \ \mu_n}{n}$$
Since ε is fixed, the hypothesis gives $$\frac{B_n}{n^2 \epsilon^2} \ \rightarrow \ 0 \ \ \text{as} \ \ n \ \rightarrow \ \infty.$$
On choosing two arbitrarily small positive numbers ε and η, a number n0 can be found so that $$\frac{B_n}{n^2 \epsilon^2} \ < \ \eta \ \ \text{for} \ \ n \ > \ n_0 \ \left ( \ \text{i.e.} \ \ n \ > \ \frac{\sqrt{B_n}}{\epsilon \ \sqrt{\eta}} \ \right ).$$ In particular, the least value of n0 can be obtained from $$n_0 \ \geq \ \frac{\sigma^2}{\epsilon^2 \ \eta}, \ \ \text{where} \ \ Var( \bar{X}_n ) \ = \ \frac{\sigma^2}{n}.$$
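As a numerical illustration of this step (using the fair die of the earlier example, so σ² = 35/12; the tolerances ε = 0.5 and η = 0.05 are chosen arbitrarily here, not taken from the text):

```python
import math

# Illustrative values only: i.i.d. fair-die throws give sigma^2 = 35/12,
# so Var(Xbar_n) = sigma^2 / n; eps and eta are assumed tolerances.
sigma2 = 35 / 12
eps, eta = 0.5, 0.05

# An n0 satisfying n0 >= sigma^2 / (eps^2 * eta), per the bound above.
n0 = math.ceil(sigma2 / (eps ** 2 * eta))
print(n0)   # 234
```

So roughly 234 throws suffice to guarantee, via Chebyshev alone, that the sample mean is within 0.5 of 3.5 with probability at least 0.95.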
$$\therefore \ P \ [ \ | \ \bar{X}_n \ - \ \bar{\mu}_n \ | \ < \ \epsilon \ ] \ \geq \ 1 \ - \ \eta$$
$$\text{i.e.} \ \ P \ [ \ | \ \bar{X}_n \ - \ \bar{\mu}_n \ | \ < \ \epsilon \ ] \ \rightarrow \ 1 \ \ \text{as} \ \ n \ \rightarrow \ \infty$$
$$\text{or,} \ \ P \ [ \ | \ \bar{X}_n \ - \ \bar{\mu}_n \ | \ \geq \ \epsilon \ ] \ \rightarrow \ 0 \ \ \text{as} \ \ n \ \rightarrow \ \infty$$
This proves that Chebyshev's inequality leads to the weak law of large numbers (WLLN).
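The convergence P[ |X̄n − μ̄n| < ε ] → 1 can also be seen empirically. The following Monte Carlo sketch (fair-die throws; the choices of ε, seed, and trial count are illustrative assumptions) estimates this probability for increasing n:

```python
import random

random.seed(0)
mu, eps, trials = 3.5, 0.25, 2000   # fair-die mean; eps and trials are illustrative

def coverage(n):
    """Estimate P(|Xbar_n - mu| < eps) from repeated samples of size n."""
    hits = 0
    for _ in range(trials):
        xbar = sum(random.randint(1, 6) for _ in range(n)) / n
        hits += abs(xbar - mu) < eps
    return hits / trials

# The estimated probability rises toward 1 as n grows.
for n in (10, 100, 1000):
    print(n, coverage(n))
```

With n = 10 the sample mean frequently strays outside (μ − ε, μ + ε); by n = 1000 nearly every trial lands inside, matching the WLLN.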