1.4 Conditional Probability and Independence

1.4.1 Conditional Probability

Conditional probability deals with the likelihood of an event occurring given that another event has already occurred. Here is the formal definition.

Let events \(A,B \subseteq \Omega\). If \(P(B) \neq 0\), then the conditional probability of \(A\) given \(B\) is \[ P(A|B) = \frac{P(A \cap B)}{P(B)} \]

The event \(B\) can be regarded as a new sample space. Conditional probability also satisfies the three axioms of probability, so \(P(\cdot\,|B)\) is itself a probability measure on this restricted space.
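As a quick illustration of the definition, the following minimal Monte Carlo sketch (the two-dice events are my own illustrative choice, not from the notes) estimates \(P(A|B)\) as the relative frequency of \(A\) among the trials where \(B\) occurred.

```python
# Monte Carlo sketch of P(A|B) = P(A ∩ B) / P(B) for two fair dice,
# with A = "the sum is at least 10" and B = "the first die shows 6".
import random

random.seed(0)
n = 100_000
count_B = 0          # occurrences of B
count_A_and_B = 0    # occurrences of A ∩ B

for _ in range(n):
    d1, d2 = random.randint(1, 6), random.randint(1, 6)
    if d1 == 6:                      # event B occurred: B acts as the new sample space
        count_B += 1
        if d1 + d2 >= 10:            # event A, counted only within B
            count_A_and_B += 1

# Empirical conditional probability: frequency of A within B.
print(count_A_and_B / count_B)       # ≈ 1/2, matching P(A ∩ B)/P(B) = (3/36)/(1/6)
```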

Proposition:

Suppose events \(A,B,C \subseteq \Omega\), then

  1. \(P(A \cap B) = P(A|B)P(B) = P(B|A)P(A)\) (multiplication rule)

  2. \(P(C|C)=1\)

  3. \(P(A^c|C) = 1 - P(A|C)\)

  4. \(P(\varnothing|C)=0, \quad P(\Omega|C)=1\)

  5. If \(A \subseteq B\), then \(P(A|C) \leq P(B|C)\).

  6. \(P(A \cup B|C) = P(A|C) + P(B|C) - P(A \cap B|C)\). If \(A,B\) are disjoint, then \(P(A \cup B|C) = P(A|C) + P(B|C)\) (addition rule)

  7. The law of total probability \(P(A|C)=P(A \cap B|C)+P(A \cap B^c|C)\)

  8. Boole’s inequality \(P(A \cup B|C) \leq P(A|C) + P(B|C)\)

  9. Bonferroni’s inequality \(P(A \cap B|C) \geq P(A|C)+P(B|C)-1\)

Note that probabilities conditioned on different events are not directly comparable, since each conditioning event defines its own probability measure.
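As a sanity check on the proposition above, the sketch below verifies items 1 and 7 by exact enumeration over the 36 outcomes of two fair dice (the specific events \(A,B,C\) are my own illustrative choice).

```python
# Exact-enumeration check of the multiplication rule and the law of total probability.
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))          # 36 equally likely outcomes

def P(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] + w[1] >= 10          # sum is at least 10
B = lambda w: w[0] == 6                  # first die shows 6
C = lambda w: w[0] % 2 == 0              # first die is even (conditioning event)

def P_given(event, cond):
    return P(lambda w: event(w) and cond(w)) / P(cond)

# 1. Multiplication rule: P(A ∩ B) = P(A|B) P(B) = P(B|A) P(A)
assert P(lambda w: A(w) and B(w)) == P_given(A, B) * P(B) == P_given(B, A) * P(A)

# 7. Law of total probability: P(A|C) = P(A ∩ B|C) + P(A ∩ B^c|C)
assert P_given(A, C) == (P_given(lambda w: A(w) and B(w), C)
                         + P_given(lambda w: A(w) and not B(w), C))

print("multiplication rule and law of total probability hold for this example")
```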

1.4.2 Independence

Suppose events \(A,B \subseteq \Omega\) with \(P(B) > 0\). We say \(A\) and \(B\) are independent, denoted \(A \perp \!\!\! \perp B\), if \[ P(A|B)=P(A) \]

Note that when \(P(A) > 0\) and \(P(B) > 0\), the following are equivalent

\[ \begin{align*} A \perp \!\!\! \perp B &\Longleftrightarrow P(A|B)=P(A) \\ &\Longleftrightarrow P(B|A)=P(B) \\ &\Longleftrightarrow P(A \cap B)=P(A)P(B) \end{align*} \]

It is important to note that independent events and mutually exclusive events are different concepts: if \(A\) and \(B\) are disjoint and both have positive probability, then \(P(A \cap B) = 0 \neq P(A)P(B)\), so they cannot be independent.
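The short sketch below (the dice events are my own illustrative choice) contrasts the two notions: one pair of events satisfies the product rule \(P(A \cap B)=P(A)P(B)\), while a mutually exclusive pair does not.

```python
# Independence versus mutual exclusivity on two fair dice, by exact enumeration.
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))

def P(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] % 2 == 0              # first die is even
B = lambda w: w[0] + w[1] == 7           # sum equals 7
D = lambda w: w[0] + w[1] == 2           # sum equals 2 (disjoint from B)

# Independent: the product rule holds.
print(P(lambda w: A(w) and B(w)) == P(A) * P(B))        # True

# Mutually exclusive but NOT independent: P(B ∩ D) = 0 while P(B) P(D) > 0.
print(P(lambda w: B(w) and D(w)) == P(B) * P(D))        # False
```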

Proposition:

Suppose two events \(A,B \subseteq \Omega\), then \[ A \perp \!\!\! \perp B \Longleftrightarrow A^c \perp \!\!\! \perp B \Longleftrightarrow A \perp \!\!\! \perp B^c \Longleftrightarrow A^c \perp \!\!\! \perp B^c \]
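Continuing the dice example above, this brief sketch checks the proposition: since \(A \perp \!\!\! \perp B\), each of the four complement pairs also satisfies the product rule.

```python
# Check that independence is preserved under complements (same dice events as above).
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))

def P(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] % 2 == 0              # first die is even
B = lambda w: w[0] + w[1] == 7           # sum equals 7 (independent of A)
Ac = lambda w: not A(w)
Bc = lambda w: not B(w)

for X, Y in [(A, B), (Ac, B), (A, Bc), (Ac, Bc)]:
    assert P(lambda w: X(w) and Y(w)) == P(X) * P(Y)
print("all four product-rule checks pass")
```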

Pairwise independence

The following discusses independence among more than two events.

Suppose there are three events \(A,B,C \subseteq \Omega\). We say \(A,B,C\) are mutually independent if and only if the following two conditions are satisfied:

  1. \(P(A \cap B)=P(A)P(B), \quad P(A \cap C)=P(A)P(C), \quad P(B \cap C)=P(B)P(C)\)
  2. \(P(A \cap B \cap C)=P(A)P(B)P(C)\)

If only condition 1 is satisfied, we say \(A,B,C\) are pairwise independent.

It is clear that mutual independence implies pairwise independence, but the converse is not necessarily true, as the example below shows.
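A standard counterexample (my own choice of setup, not from the notes) uses two fair coin tosses: with \(A\) = "first toss is heads", \(B\) = "second toss is heads", and \(C\) = "the two tosses agree", condition 1 holds but condition 2 fails.

```python
# Pairwise independent but not mutually independent: two fair coin tosses.
from fractions import Fraction
from itertools import product

omega = list(product("HT", repeat=2))                    # 4 equally likely outcomes

def P(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] == "H"                # first toss is heads
B = lambda w: w[1] == "H"                # second toss is heads
C = lambda w: w[0] == w[1]               # both tosses agree

# Condition 1 (pairwise): each pair satisfies the product rule.
print(P(lambda w: A(w) and B(w)) == P(A) * P(B))         # True
print(P(lambda w: A(w) and C(w)) == P(A) * P(C))         # True
print(P(lambda w: B(w) and C(w)) == P(B) * P(C))         # True

# Condition 2 fails: P(A ∩ B ∩ C) = 1/4, but P(A)P(B)P(C) = 1/8.
print(P(lambda w: A(w) and B(w) and C(w)) == P(A) * P(B) * P(C))   # False
```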

Gambler’s fallacy & Hot hand fallacy