Bayes' Theorem: Prior and Posterior Probabilities

Summary

This note defines Bayes' theorem and states its basic properties.

Things to Remember

In case \(A_1,A_2,\ldots,A_n\) are equally likely events, i.e. if \(P(A_1)=P(A_2)=\cdots=P(A_n)\), then Bayes' theorem reduces to

$$P(\frac{A_i}{B})=\frac {P(\frac {B}{A_i})}{\sum_{i=1}^{n} P(\frac {B}{A_i})}$$


Bayes' Theorem: Prior and Posterior Probabilities

Bayes' theorem was developed by the British mathematician Thomas Bayes (1702–61) but published only in 1763, after his death. It provides a powerful probability law for revising prior probabilities in the light of new information obtained from an experiment. The probabilities of events that are known before new information is gained from the experiment are called prior probabilities. The revised probabilities, i.e. the probabilities calculated after the new information is obtained from the experiment, are known as posterior probabilities.

Theorem 21.

If \(A_1,A_2,\ldots,A_n\) are n mutually exclusive and exhaustive events of an experiment and B is an event which happens in conjunction with any one of the \(A_i\) (i = 1, 2, ..., n), then the probability of happening of \(A_i\) when B has actually occurred is given by

$$P(\frac {A_i}{B})=\frac{P(A_i)P(\frac {B}{A_i})}{\sum_{i=1}^{n} P(A_i)P(\frac{B}{A_i})}$$

$$ or \; \; \; P(\frac {A_i}{B})=\frac{P(A_i)P(\frac {B}{A_i})}{P(B)}$$

Proof: By the multiplication theorem of probability, we have

$$P(B \cap A_i)=P(B).P(\frac {A_i}{B})$$

$$ \Rightarrow \; \; P(\frac {A_i}{B})=\frac{P(B \cap A_i)}{P(B)} \; \; \; \; ........ (1)$$

[Figure: Bayes' theorem]

Since the event B occurs with any one of the n mutually exclusive events \(A_1,A_2,\ldots,A_n\), we have

$$B=(B \cap A_1) \cup (B \cap A_2) \cup ….. \cup (B \cap A_n)$$

$$=\bigcup_{i=1}^{n}(B \cap A_i)$$

$$ or \; P(B) =P[\bigcup_{i=1}^{n}(B \cap A_i)]$$

$$=\sum_{i=1}^{n}P(B \cap A_i) \; \; [\text{by the addition theorem, since the } B \cap A_i \text{ are mutually disjoint}]$$

$$\therefore P(B)=\sum_{i=1}^{n}P(A_i)P(\frac{B}{A_i}) \; \; \; \; \; ............ \; (2)$$

From equations (1) and (2), we get

$$P(\frac{A_i}{B})=\frac {P(A_i)P(\frac{B}{A_i})}{\sum_{i=1}^{n} P(A_i)P(\frac {B}{A_i})}$$

In Bayes' theorem, the probabilities \(P(A_i)\), i = 1, 2, ..., n, are called prior probabilities, and the probabilities \(P(\frac {A_i}{B})\) are called posterior probabilities, i.e. revised probabilities calculated on the basis of the new event B that has actually occurred in the experiment. Also, the probabilities \(P(\frac {B}{A_i})\) are called "likelihoods" because they show how likely the event B under consideration is to occur given the prior probabilities; \(P(\frac {B}{A_i})\) is the conditional probability of B given that \(A_i\) has already occurred.

Bayes' theorem is to be applied only when the events \(A_1,A_2,\ldots,A_n\) are mutually exclusive and the sum of their probabilities is equal to unity, i.e. \(\sum_{i=1}^{n} P(A_i)=1\).
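The theorem and the check above can be sketched in a few lines of Python. This is a minimal illustration, not part of the source text; the function and argument names (`posterior`, `priors`, `likelihoods`) are illustrative assumptions.

```python
# A minimal sketch of Bayes' theorem for a finite partition A_1, ..., A_n.
# priors[i] holds P(A_i) and likelihoods[i] holds P(B | A_i).

def posterior(priors, likelihoods):
    """Return the posterior probabilities P(A_i | B)."""
    # The A_i must be exhaustive, so their prior probabilities sum to 1.
    if abs(sum(priors) - 1.0) > 1e-9:
        raise ValueError("priors of an exhaustive partition must sum to 1")
    # Numerators P(A_i) * P(B | A_i); by the total probability step in the
    # proof above, their sum equals P(B).
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)  # P(B)
    return [j / total for j in joint]
```

For example, `posterior([0.5, 0.3, 0.2], [0.1, 0.2, 0.3])` normalizes the products 0.05, 0.06, 0.06 by their sum P(B) = 0.17.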

A tree diagram for Bayes' theorem is given below.

[Figure: Tree diagram for Bayes' theorem]

Bayes' Theorem for equally likely events

In case \(A_1,A_2,\ldots,A_n\) are equally likely events, i.e. if \(P(A_1)=P(A_2)=\cdots=P(A_n)\), then Bayes' theorem reduces to

$$P(\frac{A_i}{B})=\frac {P(\frac {B}{A_i})}{\sum_{i=1}^{n} P(\frac {B}{A_i})}$$
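The cancellation is easy to see in code: with equal priors \(P(A_i)=\frac{1}{n}\), the common factor drops out of numerator and denominator, leaving the normalized likelihoods. A hypothetical sketch (the function name is illustrative):

```python
# Equally likely case of Bayes' theorem: P(A_i) = 1/n for every i, so the
# priors cancel and the posterior is just the normalized likelihoods.

def posterior_equally_likely(likelihoods):
    """Return P(A_i | B) assuming P(A_1) = P(A_2) = ... = P(A_n)."""
    total = sum(likelihoods)  # proportional to P(B); the factor 1/n cancels
    return [l / total for l in likelihoods]
```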

Bayes' theorem for future events

Theorem 22

If C is a future event in regard to B, then

$$P(\frac {C}{B})=\frac {\sum_{i=1}^{n} P(A_i)P(\frac {B}{A_i}) P(\frac {C}{A_i \cap B})}{\sum_{i=1}^{n}P(A_i)P(\frac{B}{A_i})}$$

Proof: Like the event B, the future event C can occur with any one of the n mutually exclusive events \(A_1,A_2,\ldots,A_n\), so we have

$$ C=(C \cap A_1) \cup (C \cap A_2) \cup ...... \cup (C \cap A_n)$$

$$\Rightarrow \; \; \; \frac {C}{B}=[\frac{(C \cap A_1)}{B}] \cup [\frac{(C \cap A_2)} {B}] \; \; \cup ......... \cup \; \; [\frac{(C \cap A_n)}{B}]$$

$$= \bigcup_{i=1}^{n} [\frac {(C \cap A_i)} {B}]$$

$$\Rightarrow P(\frac {C}{B})=P[\bigcup_{i=1}^{n} \frac {(C \cap A_i)}{B}]$$

$$=\sum_{i=1}^{n}P[\frac {(C \cap A_i)}{B}]$$

$$=\sum_{i=1}^{n}P(\frac {A_i}{B})P(\frac {C}{A_i \cap B})$$

Substituting the value of \(P(\frac{A_i}{B})\) from Bayes' theorem, we get

$$P(\frac{C}{B})=\frac{ \sum_{i=1}^{n} P(A_i)P(\frac{B}{A_i})P(\frac {C}{A_i \cap B})}{ \sum_{i=1}^{n} P(A_i)P(\frac {B}{A_i})}$$
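Theorem 22 says P(C | B) is a weighted average of the conditional probabilities \(P(\frac{C}{A_i \cap B})\), with weights \(P(A_i)P(\frac{B}{A_i})\). A hypothetical sketch under that reading (all names are illustrative):

```python
# Sketch of Theorem 22: P(C | B) as a likelihood-weighted average of
# P(C | A_i and B).  priors[i] = P(A_i), likelihoods_b[i] = P(B | A_i),
# cond_c[i] = P(C | A_i and B).

def future_event_prob(priors, likelihoods_b, cond_c):
    """Return P(C | B) by the future-event form of Bayes' theorem."""
    # Weights P(A_i) * P(B | A_i); their sum is P(B) as in Theorem 21.
    weights = [p * lb for p, lb in zip(priors, likelihoods_b)]
    numer = sum(w * c for w, c in zip(weights, cond_c))
    return numer / sum(weights)
```

As a sanity check, if C is certain under every \(A_i \cap B\) (all conditionals equal 1), the formula gives P(C | B) = 1, since the numerator and denominator coincide.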

Example: Suppose a B.Sc. class contains 60 boys and 40 girls. Among these students, 8% of the boys and 4% of the girls got a freeship. A student is selected at random from the class. If the freeship is received by the selected student, what is the probability that the student is (i) a boy, (ii) a girl?

Solution

Let B and G denote the events that the selected student is a boy and a girl respectively. Also, let F denote the event that the selected student received the freeship. Then we have,

$$P(B)=\frac{60}{100}=0.6, \; \; \; \; \; P(G)=\frac{40}{100}=0.4$$

\(P(\frac {F}{B})\) = probability of selecting a student who received the freeship from the boys = 8% = 0.08, and \(P(\frac{F}{G})\) = probability of selecting a student who received the freeship from the girls = 4% = 0.04.

(i) If the freeship is received by the student, the probability that the selected student is a boy is given by Bayes' theorem:

$$ P(\frac {B}{F})=\frac{P(B)P(\frac{F}{B})}{P(B)P(\frac{F}{B})+P(G)P(\frac {F}{G})}=\frac{0.6 \times 0.08}{0.6 \times 0.08+0.4 \times 0.04}=\frac{0.048}{0.064}=\frac{3}{4}$$

(ii) The probability of selecting a girl who received the freeship is given by

$$P(\frac{G}{F})=\frac{P(G)P(\frac {F}{G})}{P(F)}=\frac{ 0.4 \times 0.04}{0.064}=\frac{0.016}{0.064}=\frac{1}{4}$$

Alternatively:

$$P(\frac{G}{F})=1-P(\frac{B}{F})=1-\frac{3}{4}=\frac{1}{4}$$

This is because the freeship is received by either a boy or a girl, and therefore

$$P(\frac{B}{F})+P(\frac{G}{F})=\frac{3}{4}+\frac{1}{4}=1$$

Note that the probabilities \(P(\frac{B}{F})\) and \(P(\frac{G}{F})\) are the posterior probabilities of the prior (original) events B and G respectively, after the freeship F has been received.
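The arithmetic of the example can be checked directly; the variable names below are illustrative, and the numbers are exactly those used in the worked solution.

```python
# Numeric check of the freeship example using Bayes' theorem directly.
p_boy, p_girl = 0.6, 0.4          # P(B), P(G)
p_f_boy, p_f_girl = 0.08, 0.04    # P(F|B), P(F|G)

# Total probability: P(F) = P(B)P(F|B) + P(G)P(F|G) = 0.048 + 0.016 = 0.064
p_f = p_boy * p_f_boy + p_girl * p_f_girl

p_boy_f = p_boy * p_f_boy / p_f     # P(B|F) = 0.048 / 0.064 = 3/4
p_girl_f = p_girl * p_f_girl / p_f  # P(G|F) = 0.016 / 0.064 = 1/4
```

The two posteriors sum to 1, as the text notes, since the freeship recipient must be either a boy or a girl.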

