4 Probability Models: Probability Measures
\[ \def\IP{{\bf P}} \]
- Probability models can be applied to any situation in which there are multiple potential outcomes and there is uncertainty about which outcome will occur.
- Outcomes, events, and random variables define what is possible.
- Probability measures determine how probable.
- A probability measure, typically denoted \(\text{P}\), assigns probabilities to events to quantify their relative likelihoods according to the assumptions of the model of the random phenomenon.
- The probability of event \(A\), computed according to probability measure \(\text{P}\), is denoted \(\text{P}(A)\).
- A valid probability measure \(\text{P}\) must satisfy the following three logical consistency “axioms”.
- For any event \(A\), \(0 \le \text{P}(A) \le 1\).
- If \(\Omega\) represents the sample space then \(\text{P}(\Omega) = 1\).
- (Countable additivity.) If \(A_1, A_2, A_3, \ldots\) are disjoint then \[ \text{P}(A_1 \cup A_2 \cup A_3 \cup \cdots) = \text{P}(A_1) + \text{P}(A_2) +\text{P}(A_3) + \cdots \]
- Additional properties of a probability measure follow from the axioms.
- Complement rule. For any event \(A\), \(\text{P}(A^c) = 1 - \text{P}(A)\).
- Subset rule. If \(A \subseteq B\) then \(\text{P}(A) \le \text{P}(B)\).
- Addition rule for two events. If \(A\) and \(B\) are any two events \[ \text{P}(A\cup B) = \text{P}(A) + \text{P}(B) - \text{P}(A \cap B) \]
- Law of total probability. If \(C_1, C_2, C_3, \ldots\) are disjoint events with \(C_1\cup C_2 \cup C_3\cup \cdots =\Omega\), then \[ \text{P}(A) = \text{P}(A \cap C_1) + \text{P}(A \cap C_2) + \text{P}(A \cap C_3) + \cdots \]
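The axioms and the derived rules above can be verified mechanically on any small finite sample space. The following sketch (not part of the original notes) uses a hypothetical weighted four-sided die with probabilities 0.1, 0.2, 0.3, 0.4, and exact `Fraction` arithmetic to avoid floating-point noise:

```python
from fractions import Fraction

# Hypothetical weighted four-sided die (probabilities sum to 1)
p = {1: Fraction(1, 10), 2: Fraction(2, 10), 3: Fraction(3, 10), 4: Fraction(4, 10)}
omega = set(p)

def P(event):
    # a probability measure on a finite sample space: add up outcome probabilities
    return sum(p[o] for o in event)

A, B = {1, 2}, {1, 2, 3}
print(P(omega))                             # axiom 2: P(Omega) = 1
print(P(omega - A) == 1 - P(A))             # complement rule
print(P(A) <= P(B))                         # subset rule, since A is a subset of B
print(P(A | B) == P(A) + P(B) - P(A & B))   # addition rule for two events
```

Changing the weights in `p` (as long as they are nonnegative and sum to 1) gives a different probability measure on the same sample space, and the checks still pass, which is exactly the point of the axioms.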
- A probability model (or probability space) is the collection of all outcomes, events, and random variables associated with a random phenomenon along with the probabilities of all events of interest under the assumptions of the model.
- The axioms of a probability measure are minimal logical consistency requirements that ensure that probabilities of different events fit together in a valid, coherent way.
- A single probability measure corresponds to a particular set of assumptions about the random phenomenon.
- There can be many probability measures defined on a single sample space, each one corresponding to a different probability model for the random phenomenon.
- Probabilities of events can change if the probability measure changes.
Example 4.1 Consider a Cal Poly student who frequently has blurry, bloodshot eyes, generally exhibits slow reaction time, always seems to have the munchies, and disappears at 4:20 each day. Which of the following events, \(A\) or \(B\), has a higher probability? (Assume the two probabilities are not equal.)
- \(A\): The student has a GPA above 3.0.
- \(B\): The student has a GPA above 3.0 and smokes marijuana regularly.
Example 4.2 Consider a single roll of a four-sided die, but suppose the die is weighted so that the outcomes are no longer equally likely. Suppose that the probability of event \(\{2, 3\}\) is 0.5, of event \(\{3, 4\}\) is 0.7, and of event \(\{1, 2, 3\}\) is 0.6. In what particular way is the die weighted? That is, what is the probability of each of the four possible outcomes?
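One way to check the algebra for Example 4.2 (this sketch is not part of the original notes): by the complement rule, \(\text{P}(\{4\}) = 1 - \text{P}(\{1,2,3\})\), and the remaining outcome probabilities then peel off one at a time by additivity.

```python
# Unknowns p1, p2, p3, p4 with p1 + p2 + p3 + p4 = 1, and
# given: p2 + p3 = 0.5, p3 + p4 = 0.7, p1 + p2 + p3 = 0.6
p4 = 1 - 0.6          # complement rule applied to {1, 2, 3}
p3 = 0.7 - p4         # additivity on {3, 4}
p2 = 0.5 - p3         # additivity on {2, 3}
p1 = 0.6 - p2 - p3    # additivity on {1, 2, 3}
print(round(p1, 10), round(p2, 10), round(p3, 10), round(p4, 10))
```

The solution is \(\text{P}(\{1\}) = 0.1\), \(\text{P}(\{2\}) = 0.2\), \(\text{P}(\{3\}) = 0.3\), \(\text{P}(\{4\}) = 0.4\): the die is weighted toward larger rolls.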
Example 4.3 Consider again a single roll of a weighted four-sided die. Suppose that
- Rolling a 1 is twice as likely as rolling a 4
- Rolling a 2 is three times as likely as rolling a 4
- Rolling a 3 is 1.5 times as likely as rolling a 4
In what particular way is the die weighted? That is, what is the probability of each of the four possible outcomes? Compute the probability of these events: \(\{2, 3\}\), \(\{3, 4\}\), \(\{1, 2, 3\}\).
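The relative weights in Example 4.3 only need to be normalized so that they sum to 1. A quick sketch (not part of the original notes), using exact `Fraction` arithmetic with 4 as the reference outcome:

```python
from fractions import Fraction

# relative likelihoods, measured against rolling a 4
w = {1: Fraction(2), 2: Fraction(3), 3: Fraction(3, 2), 4: Fraction(1)}
total = sum(w.values())                    # 7.5 "units" of weight in all
p = {k: v / total for k, v in w.items()}   # normalize so probabilities sum to 1

print(p)                                   # p4 = 2/15, and the rest scale up from it
for event in [{2, 3}, {3, 4}, {1, 2, 3}]:
    print(sorted(event), sum(p[k] for k in event))
```

This gives \(\text{P}(\{2,3\}) = 3/5\), \(\text{P}(\{3,4\}) = 1/3\), and \(\text{P}(\{1,2,3\}) = 13/15\), illustrating the point above that changing the probability measure changes event probabilities: these differ from the values assumed in Example 4.2.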
4.1 Equally Likely Outcomes
- For a sample space \(\Omega\) with finitely many possible outcomes, assuming equally likely outcomes corresponds to a probability measure \(\text{P}\) which satisfies \[ \text{P}(A) = \frac{|A|}{|\Omega|} = \frac{\text{number of outcomes in $A$}}{\text{number of outcomes in $\Omega$}} \qquad{\text{when outcomes are equally likely}} \]
Example 4.4 Roll a four-sided die twice, and record the result of each roll in sequence. One choice of probability measure \(\text{P}\) corresponds to assuming that the die is fair, the rolls are independent, and the 16 possible outcomes are equally likely.
- Compute \(\text{P}(A)\), where \(A\) is the event that the sum of the two dice is 4.
- Compute \(\text{P}(C)\), where \(C\) is the event that the larger of the two rolls (or the common roll if a tie) is 3.
- Compute and interpret \(\text{P}(A\cap C)\). (Is it equal to the product of \(\text{P}(A)\) and \(\text{P}(C)\)?)
- Let \(X\) be the sum of the two dice. Compute \(\text{P}(X = 4) \equiv \text{P}(\{X = 4\})\). Then interpret the probability both as a long-run relative frequency and as a relative likelihood.
- Construct a table and plot of \(\text{P}(X = x)\) for each possible value \(x\) of \(X\).
- Let \(Y\) be the larger of the two rolls (or the common value if both rolls are the same). Construct a table and plot of \(\text{P}(Y = y)\) for each possible value \(y\) of \(Y\).
- Compute \(\text{P}(X = x, Y = y)\) for each possible \((x, y)\) pair.
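With only 16 equally likely outcomes, the counting in Example 4.4 can be done by brute-force enumeration. The following sketch (not part of the original notes) applies \(\text{P}(A) = |A|/|\Omega|\) directly:

```python
from fractions import Fraction
from itertools import product
from collections import Counter

# all 16 equally likely ordered pairs of rolls
outcomes = list(product([1, 2, 3, 4], repeat=2))

def P(event):
    # equally likely outcomes: P(A) = (number of outcomes in A) / (number in Omega)
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

pA = P(lambda o: o[0] + o[1] == 4)                      # A: sum of the rolls is 4
pC = P(lambda o: max(o) == 3)                           # C: larger roll is 3
pAC = P(lambda o: o[0] + o[1] == 4 and max(o) == 3)     # A and C
print(pA, pC, pAC, pA * pC)   # note: P(A ∩ C) is not P(A)P(C)

# table of P(X = x) for X = sum of the two rolls
counts = Counter(o[0] + o[1] for o in outcomes)
for x in sorted(counts):
    print(x, Fraction(counts[x], len(outcomes)))
```

The enumeration gives \(\text{P}(A) = 3/16\), \(\text{P}(C) = 5/16\), and \(\text{P}(A \cap C) = 2/16\), which is not \((3/16)(5/16)\). Swapping `max` for `min` or tabulating \((x, y)\) pairs handles the remaining parts the same way.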
4.2 Uniform Probability Measures
- The continuous analog of equally likely outcomes is a uniform probability measure. When the sample space is uncountable, size is measured continuously (length, area, volume) rather than discretely (counting). \[ \text{P}(A) = \frac{|A|}{|\Omega|} = \frac{\text{size of } A}{\text{size of } \Omega} \qquad \text{if $\text{P}$ is a uniform probability measure} \]
Example 4.5 Regina and Cady plan to meet for lunch between noon and 1:00 but they are not sure of their arrival times. Recall the sample space from Example 3.2. Let \(R\) be the random variable representing Regina’s arrival time (minutes after noon), and let \(Y\) represent Cady’s. Assume a uniform probability measure \(\text{P}\), which corresponds to assuming that they each arrive uniformly at random at a time between noon and 1:00, independently of each other. Compute and interpret the following probabilities.
- \(\text{P}(R > Y)\).
- \(\text{P}(T < 30)\), where \(T = \min(R, Y)\).
- \(\text{P}(R > Y, W < 15)\), where \(W = |R - Y|\).
- \(\text{P}(R < 24)\).
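The exact answers to Example 4.5 come from areas of regions in the \(60 \times 60\) square, but a Monte Carlo simulation offers a useful check and reinforces the long-run relative frequency interpretation. A sketch (not part of the original notes), simulating the assumed uniform, independent arrival times:

```python
import random

random.seed(2024)  # arbitrary seed, for reproducibility
n = 100_000
c_RgtY = c_T = c_joint = c_R24 = 0
for _ in range(n):
    r = random.uniform(0, 60)   # Regina's arrival time, minutes after noon
    y = random.uniform(0, 60)   # Cady's arrival time, independent of Regina's
    if r > y:
        c_RgtY += 1
    if min(r, y) < 30:          # T = min(R, Y)
        c_T += 1
    if r > y and abs(r - y) < 15:   # W = |R - Y|
        c_joint += 1
    if r < 24:
        c_R24 += 1

print(c_RgtY / n, c_T / n, c_joint / n, c_R24 / n)
# exact values from areas: 1/2, 3/4, (60**2 - 45**2)/2 / 60**2 = 0.21875, 24/60 = 0.4
```

Each relative frequency should land close to the corresponding area ratio, e.g. \(\text{P}(T < 30) = 1 - (30/60)^2 = 3/4\), since \(T \ge 30\) requires both arrivals in the second half hour.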