Generalized form of Bienaymé-Chebyshev's Inequality, Applications of Chebyshev's Inequality and Markov's Inequality
Generalized form of Bienaymé-Chebyshev's Inequality :
Let g(X) be a non-negative function of a random variable X. Then for every k > 0,
$$P \left[ \, g(X) \geq k \, \right] \ \leq \ \frac{E[g(X)]}{k}$$
Proof :
Let f(x) be the probability density function of a continuous random variable X, and let S be the set of all x for which g(x) ≥ k, i.e. S = {x : g(x) ≥ k}. Then
$$P \left[ \, g(X) \geq k \, \right] \ = \ \int_{g(x) \, \geq \, k} f(x) \, dx \ = \ \int_S f(x) \, dx$$
$$E\left[ g(X) \right] \ = \ \int_{-\infty}^{\infty} g(x) \, f(x) \, dx \ \geq \ \int_S g(x) \, f(x) \, dx \ \geq \ k \int_S f(x) \, dx \ = \ k \, P \left[ \, g(X) \geq k \, \right]$$
since g(x) ≥ k at every point of S.
$$\therefore \ P \left[ \, g(X) \geq k \, \right] \ \leq \ \frac{E[g(X)]}{k}$$
The proof for a discrete random variable follows on replacing the integrals by sums over the range of the variable.
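The inequality can be checked numerically. The following minimal Monte Carlo sketch estimates both sides; the choices g(x) = x², X ~ Normal(0, 1) and k = 2 are illustrative assumptions, not from the text.

```python
# Monte Carlo sketch of the Bienaymé-Chebyshev inequality
# P[g(X) >= k] <= E[g(X)]/k for a non-negative function g.
# The function g, the distribution of X, and k are assumptions.
import random

random.seed(0)
n = 100_000
k = 2.0
g = lambda x: x * x                    # non-negative function of X

samples = [random.gauss(0.0, 1.0) for _ in range(n)]
prob = sum(g(x) >= k for x in samples) / n      # estimates P[g(X) >= k]
bound = sum(g(x) for x in samples) / n / k      # estimates E[g(X)]/k

print(f"P[g(X) >= k] ~ {prob:.4f}  <=  E[g(X)]/k ~ {bound:.4f}")
```

With these choices E[X²] = 1, so the bound is about 0.5, while the estimated probability is well below it, as the inequality guarantees.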
Remark 1:
In particular, if we take g(X) = [X - E(X)]² = (X - μ)² and replace k by k²σ², then we get,
$$P[(X - \mu)^2 \geq k^2\sigma^2] \ \leq \ \frac{E(X - \mu)^2}{k^2 \sigma^2} \ = \ \frac{\sigma^2}{k^2 \sigma^2} \ = \ \frac{1}{k^2}$$
$$\Rightarrow \ P[ \, | X - \mu | \geq k\sigma \, ] \ \leq \ \frac{1}{k^2}$$
which is Chebyshev's inequality.
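A quick numerical illustration of the remark: the sketch below compares the actual tail probability P(|X - μ| ≥ kσ) with the bound 1/k² for an exponential population, whose density is far from bell-shaped. The Exp(1) choice and the values of k are assumptions made for this example.

```python
# Sketch comparing the Chebyshev bound 1/k**2 with the actual tail
# probability P(|X - mu| >= k*sigma) for an Exp(1) population
# (mean 1, standard deviation 1) -- an illustrative assumption.
import random

random.seed(0)
n = 200_000
mu = sigma = 1.0
samples = [random.expovariate(1.0) for _ in range(n)]

for k in (1.5, 2.0, 3.0):
    tail = sum(abs(x - mu) >= k * sigma for x in samples) / n
    print(f"k={k}:  P(|X-mu| >= k*sigma) ~ {tail:.4f}  <=  1/k^2 = {1/k**2:.4f}")
```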
Applications of Chebyshev's Inequality :
1. Chebyshev's inequality can be applied to any probability distribution, whether or not the probability histogram or frequency curve is normal (bell-shaped).
2. Chebyshev's inequality $$P[ \, | X - \mu | \geq k\sigma \, ] \ \leq \ \frac{1}{k^2}$$ can be applied to find an upper bound on the probability that the value of the variate X chosen at random differs from the mean E(X) by k or more times the standard deviation σ.
Likewise, the complementary form $$P[ \, | X - \mu | < k\sigma \, ] \ \geq \ 1 - \frac{1}{k^2}$$ can be applied to obtain a lower bound on the probability that the random variable X lies in the interval μ ± kσ; the actual probability that X falls in μ ± kσ usually exceeds the lower bound 1 - 1/k².
Thus Chebyshev's inequality enables us to find an upper bound and a lower bound on the respective probabilities P( | X - E(X) | ≥ k ) and P( | X - E(X) | < k ) for any k > 0.
3. Chebyshev's inequality may also be used to establish convergence in probability.
4. Chebyshev's inequality leads to the weak law of large numbers (WLLN).
These applications of Chebyshev's inequality are illustrated in the sketch below, in the following example, and in later sub-sections.
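Points 3 and 4 can be made concrete: applying the inequality to the sample mean of n observations, which has variance σ²/n, gives $$P \left[ \, | \bar{X} - \mu | \geq \varepsilon \, \right] \ \leq \ \frac{\sigma^2}{n \varepsilon^2}$$ which tends to 0 as n → ∞; this is the weak law of large numbers. The simulation sketch below shows the bound shrinking with n; the Uniform(0, 1) population, ε = 0.05, and the repetition count are illustrative assumptions, not part of the text.

```python
# Sketch of the WLLN via Chebyshev: the bound sigma**2/(n*eps**2) on
# P(|Xbar_n - mu| >= eps) tends to 0 as n grows.  The Uniform(0, 1)
# population and eps are assumptions made for this illustration.
import random

random.seed(0)
mu, var, eps = 0.5, 1.0 / 12.0, 0.05    # mean/variance of Uniform(0, 1)
reps = 2_000

for n in (50, 500, 5_000):
    hits = 0
    for _ in range(reps):
        xbar = sum(random.random() for _ in range(n)) / n
        hits += abs(xbar - mu) >= eps
    print(f"n={n}:  P(|Xbar-mu| >= eps) ~ {hits/reps:.4f}"
          f"  <=  sigma^2/(n*eps^2) = {var/(n*eps*eps):.4f}")
```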
Markov's Inequality :
Let X be a discrete or continuous random variable. Then for every k > 0, Markov's inequality is
$$P \left[ \, |X| \geq k \, \right] \ \leq \ \frac{E[\,|X|\,]}{k}$$
Proof :
We know that the Bienaymé-Chebyshev inequality is
$$P \left[ \, g(X) \geq k \, \right] \ \leq \ \frac{E[g(X)]}{k}$$
If we take g(X) = |X|, we get for any k > 0
$$P \left[ \, |X| \geq k \, \right] \ \leq \ \frac{E[\,|X|\,]}{k}$$
which is Markov's inequality.
But if we choose g(X) = |X|^r and replace k by k^r above, where k > 0 and r > 0, we get a more generalized form of Markov's inequality,
$$P \left[ \, |X|^r \geq k^r \, \right] \ \leq \ \frac{E[\,|X|^r\,]}{k^r}$$
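A short numerical check of both forms; the Exp(1) population and the values k = 3, r = 2 are assumptions made for this sketch.

```python
# Sketch of Markov's inequality P[|X| >= k] <= E[|X|]/k and its
# generalized form P[|X|**r >= k**r] <= E[|X|**r]/k**r.
# X ~ Exp(1), k = 3 and r = 2 are illustrative assumptions.
import random

random.seed(0)
n = 200_000
k, r = 3.0, 2.0
samples = [random.expovariate(1.0) for _ in range(n)]   # non-negative X

p = sum(abs(x) >= k for x in samples) / n
markov = sum(abs(x) for x in samples) / n / k
print(f"P[|X| >= k] ~ {p:.4f}  <=  E[|X|]/k ~ {markov:.4f}")

p_r = sum(abs(x) ** r >= k ** r for x in samples) / n
gen = sum(abs(x) ** r for x in samples) / n / k ** r
print(f"P[|X|^r >= k^r] ~ {p_r:.4f}  <=  E[|X|^r]/k^r ~ {gen:.4f}")
```

For a non-negative variable the two events coincide, but the bounds differ, so one form can be sharper than the other depending on the moments available.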
Example :
Let X be a random variable with a mean of 11 and a variance of 9. Using Chebyshev's inequality, find the lower bound for P(6 < X < 16).
Solution :
We know by Chebyshev's inequality that, for any k > 0,
$$P \left( \, | X - \mu | < k\sigma \, \right) \ \geq \ 1 - \frac{1}{k^2}$$
$$or, \ P \left( \mu - k\sigma < X < \mu + k\sigma \right) \ \geq \ 1 - \frac{1}{k^2}$$
Since μ = 11 and σ² = 9, i.e. σ = 3, to find P(6 < X < 16) we set
$$\mu - k\sigma = 6 \quad \text{and} \quad \mu + k\sigma = 16$$
These imply that k = 5/3.
Thus, $$P \left( 6 < X < 16 \right) \ \geq \ 1 - \frac{1}{\left( \frac{5}{3} \right)^2} \ = \ 1 - \frac{9}{25} \ = \ \frac{16}{25}$$
∴ The lower bound for P(6 < X < 16) is 16/25 or 0.64. That is, the random variable X lies between 6 and 16 with probability at least 0.64.
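The same computation in code. The Chebyshev bound uses only μ and σ², so it is distribution-free; the comparison against a Normal(11, 9) population is purely an illustrative assumption, showing that the actual probability typically exceeds the bound, as noted in the applications above.

```python
# Recomputing the worked example: mu = 11, sigma^2 = 9, interval (6, 16).
# The Chebyshev lower bound needs only the mean and variance; the
# simulated check with a Normal(11, 9) population is an assumption,
# since the example fixes only those two moments.
import random

mu, sigma = 11.0, 3.0
lo, hi = 6.0, 16.0
k = (hi - mu) / sigma                    # = (mu - lo)/sigma = 5/3
bound = 1.0 - 1.0 / k**2
print(f"k = {k:.4f}, Chebyshev lower bound = {bound:.4f}")   # 16/25 = 0.64

random.seed(0)
n = 200_000
inside = sum(lo < random.gauss(mu, sigma) < hi for _ in range(n)) / n
print(f"P(6 < X < 16) for Normal(11, 9) ~ {inside:.4f}")     # ~ 0.90
```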