1.5 Bayes’ Theorem
Suppose a sequence of sets \(A_1,A_2,\cdots,A_n \subseteq \Omega\) satisfies \(A_i \cap A_j = \varnothing\) for all \(i \neq j\) and \(\bigcup_{i=1}^n A_i = \Omega\). Then we say this sequence of sets is a partition of \(\Omega\), denoted by \(\sum_{i=1}^n A_i = \Omega\).
Suppose a sequence of sets \(\{A_i\}_{i=1}^n\) is a partition of \(\Omega\) with \(P(A_i) > 0\) for each \(i\). Then for any event \(B \subseteq \Omega\), the law of total probability gives \[ P(B) = \sum_{i=1}^n P(A_i \cap B) = \sum_{i=1}^n P(B|A_i)P(A_i) \]
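As a numerical sketch of the law above, the following Python snippet computes \(P(B)\) by summing \(P(B|A_i)P(A_i)\) over a three-set partition; the probabilities are illustrative assumptions, not values from the text:

```python
# Law of total probability: P(B) = sum_i P(B|A_i) * P(A_i).
# Hypothetical setup: three machines A_1, A_2, A_3 partition the sample space,
# and B is the event that an item is defective.
priors = [0.5, 0.3, 0.2]          # P(A_i); must sum to 1 for a partition
likelihoods = [0.01, 0.02, 0.03]  # P(B|A_i): assumed defect rate per machine

# Sanity check: the priors of a partition sum to 1.
assert abs(sum(priors) - 1.0) < 1e-12

# P(B) as the weighted sum of conditional probabilities.
p_b = sum(p_a * p_b_given_a for p_a, p_b_given_a in zip(priors, likelihoods))
print(p_b)  # 0.5*0.01 + 0.3*0.02 + 0.2*0.03 = 0.017
```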
Bayes’ Theorem is a fundamental principle in probability theory and statistics. It provides a way to update the probability of a hypothesis based on new evidence or information. The theorem is named after the Reverend Thomas Bayes, who introduced the concept in the 18th century.
Bayes’ Theorem
Suppose a sequence of sets \(\{A_i\}_{i=1}^n\) is a partition of \(\Omega\) with \(P(A_i) > 0\) for each \(i\). Then for any event \(B \subseteq \Omega\) with \(P(B) > 0\), we have \[ P(A_k|B)=\frac{P(B|A_k)P(A_k)}{\sum_{i=1}^n P(B|A_i)P(A_i)}, \quad k = 1,\cdots,n. \]
*Proof.*
\[ P(A_k|B) = \frac{P(A_k \cap B)}{P(B)} = \frac{P(A_k \cap B)}{\sum_{i=1}^n P(A_i \cap B)} =\frac{P(B|A_k)P(A_k)}{\sum_{i=1}^n P(B|A_i)P(A_i)} \]
In words, Bayes' Theorem states that the posterior probability \(P(A_k|B)\) of a hypothesis \(A_k\) given new evidence \(B\) is the product of the prior probability \(P(A_k)\) and the likelihood \(P(B|A_k)\) of the evidence under that hypothesis, divided by the overall probability \(P(B)\) of the evidence.
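The update rule can be sketched numerically. The snippet below applies Bayes' Theorem to a two-event partition \(\{D, D^c\}\), using hypothetical diagnostic-test numbers (a 1% prior, 99% sensitivity, 5% false-positive rate) chosen for illustration only:

```python
# Bayes' Theorem: P(A_k|B) = P(B|A_k) P(A_k) / sum_i P(B|A_i) P(A_i).
# Hypothetical two-event partition {D, not-D} (disease present / absent);
# the numbers below are assumptions for illustration, not from the text.
prior_d = 0.01     # P(D): prior probability of disease
sens = 0.99        # P(+|D): likelihood of a positive test given disease
false_pos = 0.05   # P(+|not D): false-positive rate

# Denominator: P(+) by the law of total probability over the partition.
evidence = sens * prior_d + false_pos * (1 - prior_d)

# Posterior: P(D|+) via Bayes' Theorem.
posterior = sens * prior_d / evidence
print(posterior)  # about 0.167: a positive test raises the 1% prior to roughly 17%
```

Note how modest the posterior is despite the high sensitivity: because the prior \(P(D)\) is small, the false positives from the large \(D^c\) group dominate the evidence term in the denominator.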