Moments Of Negative Binomial Distribution


Things to Remember

  1.  In the negative binomial distribution, the number of trials is a random variable and the number of successes is fixed.
  2.  Mean of the negative binomial distribution is $$\frac{kq}{p}$$
  3.  Variance of the negative binomial distribution is $$\frac{kq}{p^2}$$
  4.  In the negative binomial distribution, mean < variance.



Moments of negative binomial distribution

Let X follow a negative binomial distribution, i.e. X ~ B- (k, p) or X ~ NB (k, p), where X is a random variable (r.v.) and k and p are the parameters. Then, the rth moment about the origin of X is given by

$$\mu_{r}^{'} = E(X^r) \ = \sum_{x=0}^\infty x^r \ p(x)$$

$$= \sum_{x=0}^\infty x^r \binom{x+k-1}{k-1} p^k q^x$$

Now, putting r = 1, we get,

$$\mu_{1}^{'} = E(X) \ = \sum_{x=0}^\infty x \binom{x+k-1}{k-1} p^k q^x$$

$$= \sum_{x=1}^\infty \frac{(x+k-1)!}{(x-1)! \ (k-1)!} \ p^k q^x$$

$$= \frac{kq}{p} \sum_{x=1}^\infty \binom{x+k-1}{k} p^{k+1} q^{x-1}$$

$$= \frac{kq}{p}$$

since the remaining sum is the total probability of a negative binomial distribution with parameters k + 1 and p, and hence equals 1.

∴ Mean of the negative binomial distribution is $$\frac{kq}{p}$$
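As a quick numerical sanity check (a sketch only; the values of k and p below are arbitrary illustrative choices), the mean kq/p can be verified by summing x·p(x) directly over the pmf:

```python
from math import comb

# Illustrative parameters (any integer k >= 1 and 0 < p < 1 would do)
k, p = 4, 0.3
q = 1 - p

# pmf of the negative binomial: p(x) = C(x+k-1, k-1) p^k q^x
def pmf(x):
    return comb(x + k - 1, k - 1) * p**k * q**x

# Truncate the infinite sum; the tail beyond x = 2000 is negligible here
mean = sum(x * pmf(x) for x in range(2000))
print(mean)        # close to k*q/p
print(k * q / p)   # 9.333...
```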

Putting r = 2, we get,

$$\mu_{2}^{'} = E(X^2)$$

$$= \sum_{x=0}^\infty x^2\binom{x+k-1}{k-1}p^kq^x$$

$$= \sum_{x=0}^\infty \left[ x(x-1) + x \right] \binom{x+k-1}{k-1} p^k q^x$$

$$= \frac{k(k+1)q^2}{p^2} \sum_{x=2}^\infty \binom{x+k-1}{k+1} p^{k+2} q^{x-2} \ + \ \frac{kq}{p} \sum_{x=1}^\infty \binom{x+k-1}{k} p^{k+1} q^{x-1}$$

$$= \frac{k (k + 1)q^2}{p^2} \ + \ \frac{kq}{p}$$

$$\therefore V(X) \ = \mu_{2} \ = \mu_{2}^{'} \ - \ (\mu_{1}^{'})^2$$

$$= \frac{k(k+1)q^2}{p^2} + \frac{kq}{p} - \frac{k^2q^2}{p^2}$$

$$= \frac{kq}{p^2}$$

∴ Variance of negative binomial distribution is

$$V(x)= \frac{kq}{p^2}$$
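The variance kq/p² can be checked numerically in the same way (again a sketch, with illustrative parameter values):

```python
from math import comb

# Illustrative parameters
k, p = 5, 0.4
q = 1 - p

def pmf(x):
    return comb(x + k - 1, k - 1) * p**k * q**x

xs = range(2000)  # truncation point beyond which the tail is negligible
mean = sum(x * pmf(x) for x in xs)
second_moment = sum(x * x * pmf(x) for x in xs)
variance = second_moment - mean**2

print(variance)   # close to k*q/p**2 = 18.75
```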

$$Since \ 0 \ < \ p \ < \ 1, \ we \ have \ \frac{kq}{p} \ < \frac{kq}{p^2}$$

i.e. mean < variance, which is a characteristic property of the negative binomial distribution.

The higher-order moments of the negative binomial distribution can be obtained by putting r = 3 and r = 4.

Moment generating function of negative binomial distribution

Let X follow a negative binomial distribution with parameters k and p. Then the probability mass function of X is

$$ p(x) \ = \binom{x+k-1}{k-1} p^kq^x$$

$$If \ p \ = \frac{1}{Q} \ and \ q \ = \frac{P}{Q}, \ so \ that \ Q \ = \ 1 + P \ (since \ p + q = 1), \ we \ get$$

$$ p(x) = \binom{x+k-1}{k-1} \ Q^{-k} \ \frac{P^x}{Q^x}$$

The moment generating function of X ~ B- (k, p) is

$$M_X(t) = E(e^{tX})$$

$$= \sum_{x=0}^\infty e^{tx} \ p(x)$$

$$= \sum_{x=0}^\infty e^{tx} \binom{x+k-1}{k-1} Q^{-k} \frac{P^x}{Q^x}$$

$$= \sum_{x=0}^\infty \binom{x+k-1}{k-1} Q^{-k} \frac{(Pe^t)^x}{Q^x}$$

$$= \ Q^{-k} \sum_{x=0}^\infty \binom{x+k-1}{k-1} \left(\frac{Pe^t}{Q}\right)^x$$

$$= \ Q^{-k} \left(1 - \frac{Pe^t}{Q}\right)^{-k} \qquad \left(using \ \sum_{x=0}^\infty \binom{x+k-1}{k-1} z^x = (1-z)^{-k}\right)$$

$$= (Q - Pe^t)^{-k}$$

$$\therefore M_X(t) = (Q - Pe^t)^{-k}$$
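The closed form can be compared against a direct evaluation of E[e^{tX}] (a numerical sketch; k, p and t below are illustrative, with t small enough that qe^t < 1 so the series converges):

```python
from math import comb, exp

# Illustrative parameters
k, p, t = 3, 0.5, 0.1
q = 1 - p
Q, P = 1 / p, q / p   # Q = 1/p, P = q/p as in the text

# Direct evaluation of E[e^{tX}] by summing the (truncated) series
direct = sum(exp(t * x) * comb(x + k - 1, k - 1) * p**k * q**x
             for x in range(3000))

closed_form = (Q - P * exp(t)) ** (-k)
print(direct, closed_form)  # the two values agree to many decimal places
```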

Now, the rth moment about origin of negative binomial distribution can be obtained by

$$\mu_{r}^{'} \ = \frac{d^r M_X(t)}{dt^r} \Big|_{t=0}$$

$$\therefore \mu_{1}^{'} \ = \frac{d M_X(t)}{dt} \Big|_{t=0}$$

$$= \frac{d}{dt} (Q - Pe^t)^{-k} |_{t=0}$$

$$= \ (-k) \ (Q - Pe^t)^{-k-1} \ (-Pe^t) |_{t=0}$$

$$= kP \qquad (since \ Q - P = 1)$$

$$\therefore Mean, \ \mu_{1}^{'} \ = \ kP$$

Hence, the mean of the negative binomial distribution is kP.

$$Since \ p \ = \frac{1}{Q} \Rightarrow \ Q \ = \frac{1}{p}, \ and \ q = \frac{P}{Q} \ \Rightarrow \ P \ = \ qQ \ = \frac{q}{p}$$

$$\mu_{1}^{'} \ = \ kP \ = \frac{kq}{p}$$

Similarly

$$\mu_{2}^{'} = \frac{d^2 M_X(t)}{dt^2} \Big|_{t=0}$$

$$= \frac{d}{dt} \left[ kPe^t (Q - Pe^t)^{-k-1} \right] \Big|_{t=0}$$

$$= \ kPe^t (Q - Pe^t)^{-k-1} \ + \ (-k-1) \ kPe^t (Q - Pe^t)^{-k-2} \ (-Pe^t) \Big|_{t=0}$$

$$= \ kP \ + \ k(k+1) P^2$$
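The value μ2′ = kP + k(k+1)P² can be confirmed by computing E[X²] directly from the pmf (a sketch with illustrative parameters):

```python
from math import comb

# Illustrative parameters
k, p = 4, 0.55
q = 1 - p
P = q / p   # P = q/p as defined in the text

# E[X^2] by direct (truncated) summation over the pmf
second_moment = sum(x * x * comb(x + k - 1, k - 1) * p**k * q**x
                    for x in range(2000))
print(second_moment)
print(k * P + k * (k + 1) * P**2)  # should agree with the line above
```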

We have,

$$\mu_{2} \ = \mu_{2}^{'} \ - (\mu_{1}^{'})^2$$

$$= \ kP \ + \ k (k + 1) P^2 \ - (kP)^2$$

$$= \ kP \ + \ kP^2$$

$$= \ kP(1 + P) \ = \ kPQ \qquad (since \ Q = 1 + P)$$

$$\therefore Variance, \mu_{2} \ = \ kPQ$$

Hence, variance of a negative binomial distribution is kPQ.

$$Also, \ V(X) \ = \mu_{2} \ = \ kPQ \ = \frac{kq}{p^2}$$

Since Q > 1, kP < kPQ, i.e. mean < variance. Thus, it is important to note that for the negative binomial distribution,

Mean < Variance

The other higher moments can be obtained in the same manner from the moment generating function Mx(t).

Converting P and Q in terms of p and q, we get

$$ \mu_{3} \ = \frac{kq(1+q)}{p^3} \ and \ \mu_{4} \ = \frac{kq \ [p^2 + 3q(k+2)]}{p^4}$$

Then, the moment coefficient of skewness is

$$\beta_{1} = \frac{(1+q)^2}{kq}$$

$$\gamma_{1} = \frac{ 1+ q} {\sqrt{kq}}$$

The moment coefficient of kurtosis is

$$\beta_{2} \ = \frac{p^2 + 3q(k+2)}{kq}$$

$$\gamma_{2} \ = \frac{p^2 + 6q}{kq}$$
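These coefficients can be cross-checked by computing the central moments directly from the pmf (a numerical sketch; the values of k and p are illustrative):

```python
from math import comb, sqrt

# Illustrative parameters
k, p = 6, 0.45
q = 1 - p

def pmf(x):
    return comb(x + k - 1, k - 1) * p**k * q**x

xs = range(3000)
m = sum(x * pmf(x) for x in xs)                 # mean
mu2 = sum((x - m) ** 2 * pmf(x) for x in xs)    # central moments
mu3 = sum((x - m) ** 3 * pmf(x) for x in xs)
mu4 = sum((x - m) ** 4 * pmf(x) for x in xs)

gamma1 = mu3 / mu2 ** 1.5
gamma2 = mu4 / mu2 ** 2 - 3
print(gamma1, (1 + q) / sqrt(k * q))       # skewness vs. formula
print(gamma2, (p * p + 6 * q) / (k * q))   # excess kurtosis vs. formula
```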

Additive property of negative binomial distribution

Let X1, X2, ..., Xn be independent NB (ki, p) random variables, i = 1, 2, ..., n. Then we have

$$M_{X_i}(t) = (Q - Pe^t)^{-k_i}$$

Now, the moment generating function of the sum $$S_n \ = \sum_{i=1}^n X_i$$ is given by

$$M_{S_n}(t) = M_{\sum X_i}(t) = M_{X_1}(t) \ M_{X_2}(t) \cdots M_{X_n}(t)$$

$$= (Q - Pe^t)^{-k_1} \ (Q - Pe^t)^{-k_2} \cdots (Q - Pe^t)^{-k_n}$$

$$= (Q - Pe^t)^{-(k_1 + k_2 + \cdots + k_n)}$$

which is the moment generating function of a negative binomial distribution with parameters k1 + k2 + ... + kn and p. Hence, by the uniqueness theorem of mgfs,

the sum $$S_n \ = \sum_{i=1}^n X_i$$ has a negative binomial distribution with parameters $$\sum_{i=1}^n k_i$$ and p.
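A small numerical check of the additive property (a sketch; p, k1 and k2 are illustrative): convolving the pmfs of NB(k1, p) and NB(k2, p) reproduces the NB(k1 + k2, p) pmf.

```python
from math import comb

# Illustrative parameters
p = 0.35
q = 1 - p
k1, k2 = 2, 3

def pmf(x, k):
    return comb(x + k - 1, k - 1) * p**k * q**x

# P(X1 + X2 = s) by convolution vs. the NB(k1 + k2, p) pmf
for s in range(8):
    conv = sum(pmf(i, k1) * pmf(s - i, k2) for i in range(s + 1))
    print(s, conv, pmf(s, k1 + k2))  # the two columns agree
```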

Recurrence relation for negative binomial distribution (fitting of NBD):

Let X follow a negative binomial distribution, i.e. X ~ B- (k, p). Then we have

$$p (x) = \binom{x+k-1}{k-1} \ p^k \ q^x$$

$$and \ p(x+1) \ = \binom{x+k}{k-1} \ p^k \ q^{x+1}$$

Now,

$$\frac{p(x+1)}{p(x)} \ = \frac{(x+k)! (k-1)! x!}{(x+1)! (k-1)! (x+k-1)!} \ . \ q$$

$$= \frac{x+k}{x+1} \ q$$

$$p(x+1) \ = (\frac{x+k}{x+1}) \ q \ p(x)$$

This is the recurrence relation for the probabilities of the negative binomial distribution for x = 0, 1, 2, ... and k > 0. This recurrence formula is useful for fitting the negative binomial distribution to given empirical data. The expected frequency of X is given by

$$f_e(x) \ = \ N \cdot p(x) \quad for \ x = 0, 1, 2, \ldots \ and \ k > 0$$

where N = Σfo is the sum of the observed frequencies.
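A sketch of how the recurrence might be used in practice (N, k and p below are illustrative; when fitting real data, k and p would first be estimated, e.g. from the sample mean and variance):

```python
from math import comb

# Illustrative total observed frequency and parameters
N, k, p = 200, 3, 0.4
q = 1 - p

# Build probabilities with the recurrence p(x+1) = ((x+k)/(x+1)) q p(x)
probs = [p ** k]          # starting value: p(0) = p^k
for x in range(10):
    probs.append((x + k) / (x + 1) * q * probs[-1])

expected = [N * px for px in probs]   # expected frequencies f_e(x) = N p(x)

# Cross-check against the direct pmf formula
direct = [comb(x + k - 1, k - 1) * p**k * q**x for x in range(11)]
print(expected[:3])
```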


Lesson

Discrete probability distribution

Subject

Statistics

Grade

Bachelor of Science
