Geometric Distribution, Moments Of Geometric Distribution And Mode Of Geometric Distribution


Summary

A random variable (r.v.) X is said to have a geometric distribution with parameter p if it assumes only non-negative integer values and its probability mass function is given by

$$P(X = x) = p(x) = q^x p \ ; \quad x = 0, 1, 2, 3, \ldots,\ 0 < p < 1,\ q = 1 - p$$

$$= 0 \ ; \quad \text{otherwise}$$

Things to Remember

  1. The mean of the geometric distribution with parameter p is $\frac{q}{p}$.
  2. The variance of the geometric distribution is $\frac{q}{p^2}$.
  3. The mode of the geometric distribution is $\frac{1}{x+1}$.



Introduction

Let us suppose a sequence of independent Bernoulli trials in which the probability of success p remains the same from trial to trial. The experiment is called a geometric experiment if the trials are repeated until an event A occurs for the first time. In such an experiment we find the probability that there are x failures preceding the first success, i.e. that the first success occurs on trial x + 1. The geometric distribution is also a special case of the negative binomial distribution when k = 1. The geometric distribution with parameter p and random variable X is denoted by X ~ Geom(p).

The random variable (r.v.) X, the number of failures before the first success, is called the geometric variable, and its distribution is known as the geometric distribution. The name arises because the probabilities for x = 0, 1, 2, 3, ... are the successive terms of a geometric series, with the probability of success p remaining the same for each trial.

Hence, a random variable (r.v) x is said to have a geometric distribution with parameter p, if it assumes only non-negative values and its probability mass function is given by

$$P(X = x) = p(x) = q^x p \ ; \quad x = 0, 1, 2, 3, \ldots,\ 0 < p < 1,\ q = 1 - p$$

$$= 0 \ ; \quad \text{otherwise}$$

Another form of geometric distribution: A random variable (r.v.) X, the number of trials required to obtain the first success, has a geometric distribution with parameter p if its probability mass function (p.m.f.) is given by

$$P(X = x) = p(x) = q^{x-1}\, p \ ; \quad x = 1, 2, 3, \ldots$$

$$= 0 \ ; \quad \text{otherwise}$$
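For a quick numeric look at how the two forms relate, here is a minimal Python sketch (it assumes scipy is installed; p = 0.3 is only an example value, not from the text):

```python
# Minimal sketch comparing the two parameterizations of the geometric pmf.
from scipy.stats import geom

p, q = 0.3, 0.7

# Failures before the first success: P(X = x) = q**x * p, x = 0, 1, 2, ...
pmf_failures = [q**x * p for x in range(5)]

# Trials up to and including the first success: P(Y = y) = q**(y-1) * p, y = 1, 2, ...
# scipy.stats.geom uses this trial-counting form, so Y = X + 1 links the two.
pmf_trials = [geom.pmf(y, p) for y in range(1, 6)]

print(pmf_failures)  # [0.3, 0.21, 0.147, 0.1029, 0.07203]
print(pmf_trials)    # the same values, indexed from y = 1
```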

Figure: geometric distribution (source: boost.org)

Derivation of geometric distribution

The geometric distribution is a special case of the negative binomial distribution when k = 1. The probability mass function (p.m.f.) of the negative binomial distribution is

$$P(X=x) \ = p(x) \ = \binom{x+k-1}{k-1} p^k q^x ;x \ = 0, 1, 2, ...$$

When k = 1, we get

$$p(x) \ = \binom{x}{0} p q^x$$

Therefore, $$p(x) = q^x p \ ; \quad x = 0, 1, 2, \ldots$$

This is the probability function of the geometric distribution with parameter p and random variable X. This geometric probability function gives the probability that there are x failures preceding the first success. The function $p(x) = q^x p$ is a probability mass function because $\sum_{x=0}^\infty p(x) = 1$.

Thus, it can be easily verified that

$$\sum_{x=0}^\infty p(x) \ = \sum_{x=0}^\infty q^x p$$

$$= \ p \sum_{x=0}^\infty q^x$$

$$= p\,(1 + q + q^2 + \cdots)$$

$$= \frac{p}{1-q}$$

$$= \frac{p}{p}$$

= 1
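This normalisation can also be checked numerically; the small Python sketch below truncates the infinite sum at 1000 terms, which is an assumption made purely for illustration:

```python
# Truncated numeric check that the probabilities q**x * p sum to 1.
p, q = 0.3, 0.7
total = sum(q**x * p for x in range(1000))
print(total)  # approximately 1.0, up to floating-point error
```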

We use the notation X ~ Geom(p) to denote that the random variable (r.v.) X follows a geometric distribution with parameter p.

Moments of geometric distribution

Let a random variable X follow a geometric distribution with parameter p, i.e. X ~ Geom(p). Then the r-th moment about the origin of the geometric distribution with parameter p is given by

$$\mu_r' = E(X^r)$$

$$= \sum_{x=0}^\infty x^r p(x)$$

$$= \sum_{x=0}^\infty x^r q^x p$$

Now, putting r = 1, we get

$$\mu_1' = E(X)$$

$$= \sum_{x=0}^\infty x\, q^x p$$

$$= \ p \sum_{x=0}^\infty x q^x$$

$$= pq\,[\,1 + 2q + 3q^2 + \cdots\,]$$

Let us suppose,

$$S = 1 + 2q + 3q^2 + \cdots$$

$$Sq = q + 2q^2 + 3q^3 + \cdots$$

$$\Rightarrow S(1-q) = 1 + q + q^2 + \cdots$$

$$= \frac{1}{1-q}$$

$$S \ = \frac{1}{(1-q)^2}$$
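As a quick aside, this arithmetico-geometric sum can be checked numerically (a small Python sketch; q = 0.7 and the truncation at 1000 terms are arbitrary illustrative choices):

```python
# Numeric check of S = 1 + 2q + 3q^2 + ... = 1/(1-q)^2 for an example q.
q = 0.7
S = sum((k + 1) * q**k for k in range(1000))
print(S, 1 / (1 - q)**2)  # both approximately 11.111...
```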

Hence, $\mu_1' = E(X)$

$$= \frac{pq}{(1-q)^2}$$

$$= \frac{pq}{p^2}$$

$$= \frac{q}{p}$$

Therefore, the mean of the geometric distribution with parameter p is $\frac{q}{p}$.
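The mean can also be checked by simulation; the sketch below uses numpy, and the sample size, seed, and p = 0.3 are arbitrary choices made only for illustration:

```python
# Simulation check of E(X) = q/p for the failure-counting geometric variable.
import numpy as np

rng = np.random.default_rng(0)
p, q = 0.3, 0.7

# numpy's geometric sampler counts trials (1, 2, ...); subtracting 1 gives
# the number of failures before the first success.
samples = rng.geometric(p, size=200_000) - 1

print(samples.mean(), q / p)  # both close to 2.333...
```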

Similarly,

$$\mu_2' = E(X^2) = E[\,X(X-1) + X\,]$$

= E [ X ( X - 1 ) ] + E ( X )

Now, evaluating the term E [ X ( X - 1 ) ] first, we get

$$E[X(X-1)] \ = \sum_{x=0}^\infty x (x-1) p(x)$$

$$= \sum_{x=2}^\infty x (x-1) q^x p$$

$$= \ 2pq^2 \sum_{x=2}^\infty \frac{x(x-1)}{2}\, q^{x-2}$$

$$= \ 2pq^2\,[\,1 + 3q + 6q^2 + \cdots\,]$$

$$= \ 2pq^2 \frac{1}{(1-q)^3}$$

$$= \frac{2q^2}{p^2}$$

Again we have,

$$\mu_2 = \mu_2' - (\mu_1')^2$$

$$= \frac{2q^2}{p^2} \ + \frac{q}{p} \ - \frac{q^2}{p^2}$$

$$= \frac{q^2}{p^2} \ + \frac{q}{p}$$

$$= \frac{q^2 + pq}{p^2}$$

$$= \frac{q(q+p)}{p^2} = \frac{q}{p^2} \qquad (\text{since } p + q = 1)$$

Hence, the variance of the geometric distribution is $\frac{q}{p^2}$.

Since $0 < p < 1$ and $0 < q < 1$,

$$\frac{q}{p} \ < \frac{q}{p^2}$$

Hence, it is important to note that for the geometric distribution, the mean is less than the variance.
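Both moments, and the fact that the mean is smaller than the variance, can be confirmed with scipy (a minimal sketch, assuming scipy is available; loc=-1 only shifts scipy's trial-counting version onto the failure-counting one used here):

```python
# Mean and variance of the failure-counting geometric distribution via scipy.
from scipy.stats import geom

p, q = 0.3, 0.7
mean, var = geom.stats(p, loc=-1, moments='mv')

print(float(mean), q / p)        # 2.333... = q/p
print(float(var), q / p**2)      # 7.777... = q/p^2
print(float(mean) < float(var))  # True: mean < variance
```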

The other higher order moments of geometric distribution are not calculated because they are not frequently used in practice.

Moment generating function of geometric distribution

Let a random variable X follow a geometric distribution with parameter p, i.e. X ~ Geom(p). Then the moment generating function of X is given by

$$M_X(t) = E[e^{tX}]$$

$$= \sum_{x=0}^\infty \ (e^t)^x q^x p$$

$$= \ p \sum_{x=0}^\infty \ (qe^t)^x$$

$$= \ p\, \frac{1}{1-qe^t}$$

$$= \frac{p}{1-qe^t}$$

Now,

$$\mu_1' = \frac{d}{dt} M_X(t)\Big|_{t=0} = p(-1)(1-qe^t)^{-2}(-qe^t)\Big|_{t=0}$$

$$= \ pq\,(1-q)^{-2}$$

$$= \frac{q}{p}$$

Hence, the mean of the geometric distribution is $\frac{q}{p}$.

Now

$$\mu_2' = \frac{d^2}{dt^2} M_X(t)\Big|_{t=0} = pq\left[\,2q\,e^{2t}(1-qe^t)^{-3} + e^t(1-qe^t)^{-2}\,\right]\Big|_{t=0}$$

$$= \ pq\left[\,2q(1-q)^{-3} + (1-q)^{-2}\,\right]$$

$$= \frac{2pq^2}{(1-q)^3} \ + \frac{pq}{(1-q)^2}$$

$$= \frac{2q^2}{p^2} \ + \frac{q}{p}$$

Hence, $\mu_2 = \mu_2' - (\mu_1')^2$

$$= \frac{2q^2}{p^2} \ + \frac{q}{p} \ - \frac{q^2}{p^2}$$

$$= \frac{q}{p^2}$$

So the variance is $\frac{q}{p^2}$.
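As a rough cross-check, the moments can also be recovered from $M_X(t) = \frac{p}{1-qe^t}$ by finite differences (a sketch only; the step size h and p = 0.3 are arbitrary choices):

```python
# Finite-difference check of the mgf-based moments of the geometric distribution.
from math import exp

p, q = 0.3, 0.7
M = lambda t: p / (1 - q * exp(t))  # M_X(t) = p / (1 - q e^t), valid while q*e^t < 1

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)            # ~ mu_1' = E(X)   = q/p
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2    # ~ mu_2' = E(X^2) = 2q^2/p^2 + q/p

print(m1, q / p)               # ~2.333
print(m2 - m1**2, q / p**2)    # variance, ~7.778
```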

Mode of geometric distribution

The mode is the value of the variable X for which the geometric probability function $p(x) = q^x p$ is maximum. So, the mode is the solution of

$$\frac{d}{dp}\, p(x) = 0 \quad \text{and} \quad \frac{d^2}{dp^2}\, p(x) < 0.$$

Thus, $$\frac{d}{dp}\,(q^x p) = 0$$

or, $$\frac{d}{dp}\left[(1-p)^x\, p\right] = 0$$

or, $$(1-p)^x + px(1-p)^{x-1}(-1) = 0$$

or, $$(1-p)^{x-1}\left\{(1-p) - px\right\} = 0$$

or, $$1 - p - px = 0$$

$$\Rightarrow \ p = \frac{1}{x+1}$$

And $$\frac{d^2}{dp^2}\,(q^x p) = \frac{d}{dp}\left[(1-p)^x - px(1-p)^{x-1}\right]$$

$$= x(1-p)^{x-1}(-1) - px(x-1)(1-p)^{x-2}(-1) - x(1-p)^{x-1}\cdot 1$$

$$= -2x(1-p)^{x-1} + px(x-1)(1-p)^{x-2},$$

which is negative for $p = \frac{1}{x+1}$.

Hence, the mode of the geometric distribution is $\frac{1}{x+1}$.
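The sketch below mirrors the differentiation with respect to p carried out above by locating the maximising p on a grid for a fixed x (x = 4 and the grid resolution are arbitrary illustrative choices):

```python
# For a fixed x, check that (1 - p)**x * p peaks near p = 1/(x + 1).
import numpy as np

x = 4
p_grid = np.linspace(0.001, 0.999, 10_000)
values = (1 - p_grid)**x * p_grid

print(p_grid[values.argmax()], 1 / (x + 1))  # both close to 0.2
```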


