Bayes' Theorem¹ is a logical extension of the conditional probability arguments we examined in the Venn diagram section. We saw that:
P(A \mid B)=\frac{P(A \cap B)}{P(B)} \quad\text{and}\quad P(B \mid A)=\frac{P(B \cap A)}{P(A)}
Since P(A \cap B)=P(B \cap A):
P(B)\,P(A \mid B)=P(A \cap B)=P(B \cap A)=P(A)\,P(B \mid A)
Hence:
P(B \mid A)=\frac{P(B)\,P(A \mid B)}{P(A)}
which is Bayes' Theorem,
and, in general,
P(A_i \mid B)=\frac{P(B \mid A_i)\,P(A_i)}{\displaystyle\sum_{j=1}^{n}P(B \mid A_j)\,P(A_j)}
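As a quick numerical sketch of the general formula (not part of the original text), the hypothetical function below computes the posteriors P(A_i | B) from a list of priors P(A_i) and matching likelihoods P(B | A_i); the function and argument names are illustrative assumptions:

```python
# Illustrative sketch of the general form of Bayes' Theorem.
# The function name and argument names are assumptions, not from the text.

def bayes_posteriors(priors, likelihoods):
    """Return P(A_i | B) for each hypothesis A_i.

    priors      -- list of P(A_i)
    likelihoods -- list of P(B | A_i), in the same order
    """
    # Denominator of Bayes' Theorem: P(B) = sum_j P(B | A_j) P(A_j)
    total = sum(p * l for p, l in zip(priors, likelihoods))
    # Each numerator P(B | A_i) P(A_i), normalised by P(B)
    return [p * l / total for p, l in zip(priors, likelihoods)]
```

With the machine data from the example below, `bayes_posteriors([0.2, 0.45, 0.35], [0.02, 0.01, 0.03])` returns posteriors that sum to 1, the first of which is P(A | X).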
The following example illustrates the use of this equation. Many more are given in the section on Bayesian inference.
Example
Three machines A, B and C produce 20%, 45% and 35% respectively of a factory's output of wheel nuts. Of these machines' outputs, 2%, 1% and 3% respectively are defective.
a) What is the probability that a wheel nut randomly selected from the factory's stock will be defective?
Let X be the event that the wheel nut is defective, and let A, B and C be the events that the selected wheel nut came from machines A, B and C respectively. By the law of total probability:
P(X) = P(A)\,P(X \mid A)+P(B)\,P(X \mid B)+P(C)\,P(X \mid C)
= (0.2)(0.02)+(0.45)(0.01)+(0.35)(0.03)
= 0.019
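The arithmetic in part a) can be checked with a short script (the variable names are illustrative):

```python
# Law of total probability for the defective-wheel-nut example.
shares = {"A": 0.20, "B": 0.45, "C": 0.35}        # P(A), P(B), P(C)
defect_rates = {"A": 0.02, "B": 0.01, "C": 0.03}  # P(X | machine)

# P(X) = sum over machines of P(machine) * P(X | machine)
p_defective = sum(shares[m] * defect_rates[m] for m in shares)
print(round(p_defective, 3))  # 0.019
```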
b) What is the probability that a randomly selected wheel nut comes from machine A if it is defective?
From Bayes' Theorem,
P(A \mid X)=\frac{P(A)\,P(X \mid A)}{P(A)\,P(X \mid A)+P(B)\,P(X \mid B)+P(C)\,P(X \mid C)}
=\frac{(0.2)(0.02)}{(0.2)(0.02)+(0.45)(0.01)+(0.35)(0.03)}=\frac{0.004}{0.019}\approx 0.211
In other words, in Bayes' Theorem we divide the probability of the required path (the probability that the nut came from machine A and was defective) by the total probability of all possible paths (the probability that it came from any machine and was defective).
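The path interpretation above translates directly into code (a small illustration with assumed variable names): each product below is one "path", and the posterior for machine A is its path divided by the sum of all paths.

```python
# Bayes' Theorem as a ratio of path probabilities.
shares = [0.20, 0.45, 0.35]        # P(A), P(B), P(C)
defect_rates = [0.02, 0.01, 0.03]  # P(X | A), P(X | B), P(X | C)

# Probability of each path: came from machine i AND is defective
paths = [s * d for s, d in zip(shares, defect_rates)]
total = sum(paths)  # P(X): the probability over all possible paths

p_A_given_X = paths[0] / total  # required path / all paths
print(round(p_A_given_X, 3))  # 0.211
```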
¹Rev. Thomas Bayes (1702–1761), English philosopher. A short biography and a reprint of his original paper describing Bayes' Theorem appear in Press (1989).