## 1.1 Some probability notation

### Sample Space

The sample space is the set of all possible outcomes of an experiment. Usually, we denote the sample space by $$\Omega$$ (a Greek letter pronounced 'omega').

For example, suppose a fair coin is to be tossed once. The outcomes are either head ($$H$$) or tail ($$T$$). So, we can write down the sample space of this experiment as:

$\Omega = \left\{H, T \right\}$

Now let's suppose the experiment were a bit more complicated so that this time the fair coin is to be tossed twice and the result ($$H$$ or $$T$$) noted each time. This time, we could toss a head and then a head $$HH$$, or a head and then a tail $$HT$$, and so on. These are the possible outcomes. We can therefore write down the sample space of this slightly more complicated experiment as:

$\Omega = \left\{HH, HT, TH, TT \right\}$
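These two sample spaces can also be built in code. The sketch below (in Python, with variable names of my own choosing) enumerates every ordered pair of outcomes for the two-toss experiment:

```python
from itertools import product

# Sample space for a single toss of the coin
omega_one_toss = {"H", "T"}

# Sample space for two tosses: every ordered pair of single-toss outcomes
omega_two_tosses = {first + second for first, second in product("HT", repeat=2)}

print(sorted(omega_two_tosses))  # ['HH', 'HT', 'TH', 'TT']
```

Note that the order matters: $$HT$$ (head then tail) and $$TH$$ (tail then head) are distinct outcomes, which is why the two-toss sample space has four elements rather than three.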

### Events

A simple event is any event with just one outcome. From the second experiment above, $$\left\{HH \right\}$$ is an example of a simple event, as are all of the other single outcomes, $$\left\{HT\right\}$$, $$\left\{TH\right\}$$, and $$\left\{TT\right\}$$.

More generally, an event is any set of outcomes from the sample space; that is, any subset of $$\Omega$$.

Events are normally denoted $$A$$, $$B$$, $$C$$, etc. As an example, considering the second experiment above, let $$A$$ be the event that one or more heads are tossed; let $$B$$ be the event that exactly one head is tossed; let $$C$$ be the event that no heads are tossed; and let $$D$$ be the event that no heads or tails are tossed. These events can be written down as:

$A = \left\{HH, HT, TH\right\}, B = \left\{HT, TH\right\}, C = \left\{TT\right\}, D = \emptyset$

Note that $$A, B$$, and $$C$$ are all events, and that in addition, $$C$$ is also a simple event. The symbol $$\emptyset$$ denotes a null event (or null set). Recalling that $$D$$ is the event that no heads or tails are tossed, it makes sense that $$D = \emptyset$$, because there are no outcomes within $$\Omega$$ where this can occur.
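Since events are just subsets of the sample space, they map naturally onto sets in code. A small Python sketch (set names follow the events defined above):

```python
# Sample space for two tosses of a fair coin
omega = {"HH", "HT", "TH", "TT"}

# Events as subsets of the sample space
A = {o for o in omega if "H" in o}            # one or more heads
B = {o for o in omega if o.count("H") == 1}   # exactly one head
C = {o for o in omega if "H" not in o}        # no heads
D = set()                                     # no heads or tails: the null event

print(A == {"HH", "HT", "TH"})  # True
print(B == {"HT", "TH"})        # True
print(C == {"TT"})              # True
print(len(D))                   # 0 -- the null event contains no outcomes
```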

### Probability of an Event

Normally, we denote the probability of an event with $$P$$, so that for some event $$A$$, the probability of $$A$$ occurring can be denoted $$P(A)$$. Probabilities are always between 0 and 1 so that:

• $$P(A) = 0$$ means that event $$A$$ will definitely not occur
• $$P(A) = 0.5$$ means that event $$A$$ is equally likely to occur or not occur
• $$P(A) = 1$$ means that event $$A$$ will definitely occur.

For example, from our first experiment above, where we only tossed the coin once, we know that:

• The probability the coin toss would result in a 'tail' is equal to 0.5 (there is a 50% chance this would occur)
• The probability the coin toss would result in either a 'head' or a 'tail' is equal to 1 (this will definitely occur)
• The probability the coin toss would result in neither a 'head' nor a 'tail' is equal to 0 (this will definitely not occur, because the coin toss has to result in either a 'head' or a 'tail').

The last event described above, where the coin toss results in neither a 'head' nor a 'tail', is another example of a null event (or null set), $$\emptyset$$. This event does not contain any outcomes from the sample space $$\Omega$$. That is why it is a null, or empty, set. In other words, it is impossible for this event to occur, meaning it has probability zero.

### Some probability facts

Before we go on, now is a good time to define a few more symbols, some of which may be revision for you. In addition to the 'equals' sign $$=$$, we have:

• $$\neq$$ means 'not equal to'
• $$\approx$$ means 'approximately equal to'
• $$>$$ means 'greater than'
• $$<$$ means 'less than'
• $$\geq$$ means 'greater than or equal to'
• $$\leq$$ means 'less than or equal to'

Some very important facts about probability can be summarised as follows:

**Some probability facts**

• $$P(A) \geq 0$$, meaning a probability cannot be negative
• $$P(\Omega) = 1$$, meaning it is guaranteed that one of the outcomes in the sample space will occur

The above facts can help lead to other important probability results, one of which is the complement rule. Let $$A^C$$ be the complement of $$A$$, meaning it contains all of the outcomes that are not in $$A$$. Then the complement rule is:

**The complement rule**

$P(A^C) = 1 - P(A)$

For example, considering the first experiment from above where a fair coin was tossed just once, we know that

$\Omega = \left\{H, T \right\}.$

Since the coin is fair, it follows that the probability the coin toss results in a 'tail', $$P(T)$$, is 0.5. $$P(T^C)$$ is the probability the coin toss does not result in a 'tail', i.e., that the coin toss results in a 'head', because this is the only other possible outcome within the sample space. From the complement rule above, it follows that

$P(T^C) = 1 - P(T) = 1 - 0.5 = 0.5.$

In other words, the probability the coin toss results in a 'head' is 0.5. What if we were told that the coin was biased, so that the probability of the coin toss resulting in a 'tail' was 0.7? We would then have that

$P(T^C) = 1 - P(T) = 1 - 0.7 = 0.3.$

In other words, the probability the coin toss results in a 'head' is 0.3.
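The complement rule is simple enough to express as a one-line function. A Python sketch (the function name is my own), checking both the fair and the biased coin worked through above:

```python
def complement_probability(p_event):
    """Complement rule: P(A^C) = 1 - P(A)."""
    if not 0 <= p_event <= 1:
        raise ValueError("a probability must lie between 0 and 1")
    return 1 - p_event

# Fair coin: P(T) = 0.5, so the probability of a head is P(T^C) = 0.5
print(complement_probability(0.5))

# Biased coin: P(T) = 0.7, so the probability of a head is P(T^C) = 0.3
print(round(complement_probability(0.7), 1))
```

The guard clause reflects the first probability fact above: a value outside $$[0, 1]$$ is not a valid probability, so the function refuses it rather than returning a meaningless complement.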

### Equally likely events

Tossing a fair coin is an example of an experiment with equally likely events. Equally likely events are events that have an equal probability of occurring. If all simple events in $$\Omega$$ are equally likely to occur, the probability of each one can simply be calculated as

$\displaystyle\frac{1}{\text{number of simple events in }\Omega}.$

Since the toss of a coin has two equally likely outcomes ($$H$$ or $$T$$), we have that $$P(H) = P(T) = \displaystyle\frac{1}{2} = 0.5$$. How about throwing a fair, 6-sided die? Since there are six possible outcomes that are all equally likely to occur, the probability of any one of those outcomes occurring is equal to

$\displaystyle\frac{1}{\text{number of simple events in }\Omega} = \frac{1}{6}.$
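This counting formula is one line of code. A Python sketch (the function name is my own; `Fraction` from the standard library keeps the answers exact rather than as rounded decimals):

```python
from fractions import Fraction

def equally_likely_probability(omega):
    """Probability of any single outcome when all outcomes in omega are equally likely."""
    return Fraction(1, len(omega))

coin = {"H", "T"}
die = {1, 2, 3, 4, 5, 6}

print(equally_likely_probability(coin))  # 1/2
print(equally_likely_probability(die))   # 1/6
```

Remember that this shortcut only applies when the outcomes really are equally likely; for the biased coin above, with $$P(T) = 0.7$$, dividing 1 by the number of outcomes would give the wrong answer.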