Chapter 4 Independence and Conditional Statistics
4.1 Independence
Recall Definition 2.3.1, which states that two random variables \(X,Y\) are independent if
\[f_{X,Y}(x,y) = f_X(x)f_Y(y).\]
Substituting this identity into Definition 3.2.1 gives an equivalent condition for two random variables to be independent. Specifically, for any \(y\) such that \(f_Y(y)>0\):
\[f_{X|Y}(x \mid y) = \frac{f_{X,Y}(x,y)}{f_Y(y)} = \frac{f_X(x)\,f_Y(y)}{f_Y(y)} = f_X(x).\]
This is to say that the conditional probability density function \(f_{X|Y}(x \mid y)\) is equal to the probability density function \(f_X(x)\), and does not depend on \(y\) at all. This makes sense: information on the outcome of \(Y\) is irrelevant to the outcome of \(X\), since the two variables are independent.
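For instance, with the joint PDF \(f_{X,Y}(x,y) = e^{-(x+y)}\) for \(x,y>0\) (a made-up density, used here purely for illustration, which factorises as \(e^{-x} \cdot e^{-y}\)), the recipe \(f_{X|Y} = f_{X,Y}/f_Y\) can be carried out numerically, and the answer does not change as \(y\) varies. A minimal sketch in Python:

```python
import numpy as np
from scipy.integrate import quad

# Hypothetical joint PDF f(x, y) = e^{-(x+y)} on x, y > 0; it
# factorises as e^{-x} * e^{-y}, so X and Y are independent.
def f_joint(x, y):
    return np.exp(-(x + y))

def f_cond(x, y):
    # f_{X|Y}(x | y) = f_{X,Y}(x, y) / f_Y(y), with the marginal
    # f_Y(y) obtained by integrating out x numerically.
    f_Y, _ = quad(lambda t: f_joint(t, y), 0, np.inf)
    return f_joint(x, y) / f_Y

# The conditional PDF carries no information about y:
for y in [0.5, 1.0, 2.0]:
    print(f"f_X|Y(1.0 | y={y}) = {f_cond(1.0, y):.4f}")  # all ~0.3679 = e^{-1}
```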
Consider again the game played by Annie and Bertie in Example 3.2.2, the scores of which are governed by the random variables \(X\) and \(Y\). We saw in Example 2.3.3 that \(X\) and \(Y\) are dependent. Verify this using the new equivalent condition for independence.
In Example 3.2.2, we calculated that the conditional PDF \(f_{X \mid Y}(x \mid y)\) is given by
From Example 2.1.8:
Note that the result of Example 4.1.1 could have been deduced solely from the fact that the expression for \(f_{X|Y}(x|y)\) contains \(y\): since \(f_X(x)\) can contain no reference to \(y\), it cannot equal \(f_{X|Y}(x|y)\), even without computing it explicitly.
Similarly, independence is equivalent to the condition that \(f_{Y|X}(y|x) = f_{Y}(y)\). This criterion is deduced in a fashion analogous to the identity \(f_{X|Y}(x|y) = f_{X}(x)\) above.
4.2 Conditional Expectation
In the Nike versus Adidas example at the opening of the chapter, two random variables were considered: \(S_{\text{Nike}}\) and \(S_{\text{Adidas}}\). One piece of information that we could give to Phil Knight is the expectation of \(S_{\text{Adidas}}\). However, this doesn't take into account all the information at our disposal: we know the value that the random variable \(S_{\text{Nike}}\) takes. The quantity we really want to calculate is the expectation of \(S_{\text{Adidas}}\) given the value of \(S_{\text{Nike}}\).
Posing this question in a general setting motivates the following definition:
The conditional expectation of \(X\) given \(Y=y\) is defined by \[\mathbb{E}[X|Y=y] = \begin{cases} \sum\limits_x xp_{X|Y}(x|y), & \text{if $X$ is discrete,} \\[7pt] \int_{-\infty}^\infty xf_{X|Y}(x|y) \,dx, & \text{if $X$ is continuous,} \end{cases} \] where either \(p_{X|Y}(x|y)\) is the conditional PMF of \(X\) given \(Y\), or \(f_{X|Y}(x|y)\) is the conditional PDF of \(X\) given \(Y\).
Note that this definition covers both discrete and continuous random variables.
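To illustrate the discrete case, the sketch below computes \(\mathbb{E}[X \mid Y=y]\) directly from a small joint PMF; the table of probabilities here is invented purely for illustration.

```python
import numpy as np

# Made-up joint PMF p_{X,Y}(x, y) with x in {0,1,2} and y in {0,1};
# rows index x, columns index y. Entries sum to 1.
p_XY = np.array([[0.10, 0.15],
                 [0.20, 0.05],
                 [0.30, 0.20]])
x_vals = np.array([0, 1, 2])

def cond_expectation(y):
    # p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y), then E[X|Y=y] = sum of x * p_{X|Y}(x|y).
    p_Y = p_XY[:, y].sum()
    return np.sum(x_vals * p_XY[:, y] / p_Y)

print(cond_expectation(0))  # (0*0.10 + 1*0.20 + 2*0.30) / 0.60 = 1.333...
print(cond_expectation(1))  # (0*0.15 + 1*0.05 + 2*0.20) / 0.40 = 1.125
```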
You are seeking to buy a car, and want to estimate the cost. You find an article online that has collected data on cars listed across various sites. The article models the cost of the car by a random variable \(X\) and the age of the car by a random variable \(Y\). The probability of finding a car with \(X=x\) and \(Y=y\) is governed by the joint PDF
\[ f_{X,Y}(x,y)= \begin{cases} \frac{1}{y} e^{-(\frac{x}{y}+y)}, & 0<x,y<\infty, \\[3pt] 0, & \text{otherwise.}\end{cases} \]
For some fixed age of car \(y>0\), find the expected cost.
Mathematically, the question is asking us to calculate \(\mathbb{E}[X \mid Y=y]\).
In general it is enough to consider only the region on which \(f_{X,Y}(x,y)\) is non-zero.
We first want to calculate \(f_{X \mid Y}(x \mid y)\); to do so, we calculate the marginal distribution \(f_Y(y)\):
\[\begin{align*} f_Y(y) &= \int_{-\infty}^\infty f_{X,Y}(x,y) \,dx \\[3pt] &= \int_0^\infty \frac{1}{y}e^{-(\frac{x}{y}+y)} \,dx \\[3pt] &= \left[ - e^{-\frac{x}{y}-y} \right]_{0}^{\infty} \\[3pt] &= e^{-y} \end{align*}\]
Hence, for \(y>0\), the conditional probability density function is
\[\begin{align*} f_{X|Y}(x|y) &= \frac{f_{X,Y}(x,y)}{f_Y(y)} \\[5pt] &= \begin{cases} \frac{\frac{1}{y}e^{-\left(\frac{x}{y}+y\right)}}{e^{-y}} & \text{if } x>0,\\[3pt] 0, & \text{if } x \leq 0. \end{cases} \\[5pt] &= \begin{cases} \frac{1}{y}e^{-\frac{x}{y}} & \text{if } x>0,\\[3pt] 0, & \text{if } x \leq 0. \end{cases} \end{align*}\]
Notice that this is the PDF of an exponential distribution with mean \(y\). Therefore, by Definition 4.2.1, the conditional expectation of \(X\) is
\[\begin{align*} \mathbb{E}[X|Y=y] &= \int_{-\infty}^\infty x f_{X|Y}(x|y) \,dx \\[3pt] &= \int_0^\infty\frac{x}{y}e^{-\frac{x}{y}} \,dx \\[3pt] &= \left[ -x e^{-\frac{x}{y}} \right]_{0}^{\infty} + \int_{0}^{\infty} e^{-\frac{x}{y}} \,dx \\[3pt] &= 0 + \left[ -ye^{-\frac{x}{y}} \right]_{0}^{\infty} \\[3pt] &=y, \end{align*}\]
where the third equality follows from integration by parts.
Critique the model that the article uses in Example 4.2.2.
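Returning to the calculation above: since the conditional PDF \(\frac{1}{y}e^{-x/y}\) is that of an exponential distribution with mean \(y\), the result \(\mathbb{E}[X \mid Y=y] = y\) can be sanity-checked by simulation. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Conditional on Y = y, X has PDF (1/y) e^{-x/y}: an exponential
# distribution with mean y, which NumPy samples via scale=y.
for y in [0.5, 1.0, 3.0]:
    x = rng.exponential(scale=y, size=1_000_000)
    print(f"y = {y}: sample mean of X = {x.mean():.3f} (theory: {y})")
```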
Many of the nice properties of expectation such as linearity are inherited by conditional expectation.
Let \(a\) be a real number, and \(X,Y,Z\) be random variables. Then the following rules, which mirror the linearity of ordinary expectation, hold in general:
\[\begin{align*} \mathbb{E}[aX \mid Z=z] &= a\,\mathbb{E}[X \mid Z=z], \\[3pt] \mathbb{E}[X+Y \mid Z=z] &= \mathbb{E}[X \mid Z=z] + \mathbb{E}[Y \mid Z=z]. \end{align*}\]
Similarly to the case of joint expectation in Definition 2.5.4, the definition of conditional expectation can extend to functions of random variables:
Let \(X,Y\) be continuous random variables, with conditional PDF \(f_{X \mid Y}\). Then, for a function \(g(X)\) of \(X\), we have \[\mathbb{E}[g(X)\mid Y=y] = \int_{-\infty}^{\infty} g(x) f_{X\mid Y}(x\mid y) \,dx.\]
Consider the random variables \(X\) and \(Y\) from Example 4.2.2. Calculate \(\mathbb{E}[X^2 \mid Y=y]\).
In Example 4.2.2, we found that, for \(y>0\),
\[ f_{X|Y}(x|y) = \begin{cases} \frac{1}{y}e^{-\frac{x}{y}}, & \text{if } x>0,\\[3pt] 0, & \text{if } x \leq 0. \end{cases} \]
Therefore applying Theorem 4.2.4
\[\begin{align*} \mathbb{E} \left[ X^2 \mid Y=y \right] &= \int_{-\infty}^{\infty} x^2 f_{X\mid Y}(x \mid y) \,dx \\[5pt] &= \int_{0}^{\infty} \frac{x^2}{y} e^{- \frac{x}{y}} \,dx \\[5pt] &= \left[ -x^2 e^{-\frac{x}{y}} \right]_0^\infty + \int_0^\infty 2x e^{-\frac{x}{y}} \,dx \\[5pt] &= 2 \int_0^\infty x e^{-\frac{x}{y}} \,dx \\[5pt] &= 2 \left[ -xye^{-\frac{x}{y}} \right]_0^\infty + 2 \int_0^\infty y e^{-\frac{x}{y}} \,dx \\[5pt] &= 2 \left[ -y^2 e^{-\frac{x}{y}} \right]_0^\infty \\[5pt] &= 2y^2. \end{align*}\]
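As a numerical cross-check of \(\mathbb{E}[X^2 \mid Y=y] = 2y^2\), one can hand the integral to a quadrature routine; a quick sketch:

```python
import numpy as np
from scipy.integrate import quad

# Evaluate E[X^2 | Y=y] = integral of x^2 (1/y) e^{-x/y} over (0, inf)
# and compare with the closed form 2y^2 derived above.
for y in [0.5, 1.0, 3.0]:
    val, _ = quad(lambda x: x**2 * np.exp(-x / y) / y, 0, np.inf)
    print(f"y = {y}: integral = {val:.4f}, 2y^2 = {2 * y**2:.4f}")
```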
Note that integration by parts has been used twice in the analytic evaluation of the integral above.
4.3 Conditional Variance
Considering again the opening Nike versus Adidas example, it would be sensible to provide Phil Knight with the variance of \(S_{\text{Adidas}}\). However calculating the variance of \(S_{\text{Adidas}}\) doesn’t utilise our knowledge of the \(S_{\text{Nike}}\) random variable. What we really want to calculate is the variance of \(S_{\text{Adidas}}\) given the value of \(S_{\text{Nike}}\).
Moving this idea to the general setting leads to the following definition:
The conditional variance of \(X\) given \(Y=y\) is defined by \[\text{Var}(X \mid Y=y) = \mathbb{E} \left[ \big(X-\mathbb{E}[X \mid Y=y] \big)^2 \,\Big|\, Y=y \right].\]
Note that both of the expectations that appear on the right hand side of Definition 4.3.1 are conditional expectations. Calculating the conditional variance using this expression can be computationally difficult. The following proposition offers an alternative method by which to calculate conditional variance.
An equivalent definition of conditional variance is \[\text{Var}(X \mid Y=y) = \mathbb{E} [X^2 \mid Y=y] - \left( \mathbb{E}[X \mid Y=y] \right)^2.\]
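The proof mirrors that of the unconditional identity \(\text{Var}(X) = \mathbb{E}[X^2] - (\mathbb{E}[X])^2\): expand the square in Definition 4.3.1 and apply linearity, noting that once \(y\) is fixed, \(\mathbb{E}[X \mid Y=y]\) is a constant and can be taken outside the conditional expectation:
\[\begin{align*} \text{Var}(X \mid Y=y) &= \mathbb{E}\left[ X^2 - 2X\,\mathbb{E}[X \mid Y=y] + \big( \mathbb{E}[X \mid Y=y] \big)^2 \,\Big|\, Y=y \right] \\[3pt] &= \mathbb{E}[X^2 \mid Y=y] - 2\big( \mathbb{E}[X \mid Y=y] \big)^2 + \big( \mathbb{E}[X \mid Y=y] \big)^2 \\[3pt] &= \mathbb{E}[X^2 \mid Y=y] - \big( \mathbb{E}[X \mid Y=y] \big)^2. \end{align*}\]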
The right hand side of Proposition 4.3.2 can be calculated using Theorem 4.2.4.
Consider again the random variables from Example 4.2.2. Calculate the conditional variance of \(X\) given \(Y=y\).
From Example 4.2.2 and Example 4.2.5, we know
\[\begin{align*} \mathbb{E}[X \mid Y=y] &= y, \\[3pt] \mathbb{E}[X^2 \mid Y=y] &= 2y^2. \end{align*}\]
Therefore by Proposition 4.3.2:
\[\begin{align*} \text{Var}(X \mid Y=y) &= \mathbb{E}[X^2 \mid Y=y ] - \left( \mathbb{E}[X \mid Y=y] \right)^2 \\[3pt] &= 2y^2 - y^2 \\[3pt] &= y^2. \end{align*}\]
There is an interesting result linking the variance of a random variable \(X\) to an expression in terms of the conditional expectations and variances of \(X\) given a second random variable \(Y\). This result is known as the Law of Total Variance:
Let \(X,Y\) be two random variables. Then \[\text{Var}(X) = \mathbb{E}\big[ \text{Var}(X \mid Y) \big] + \text{Var} \big( \mathbb{E}[X \mid Y] \big),\] where \(\text{Var}(X \mid Y)\) and \(\mathbb{E}[X \mid Y]\) denote the random variables obtained by evaluating \(\text{Var}(X \mid Y=y)\) and \(\mathbb{E}[X \mid Y=y]\) at \(y=Y\); both are functions of \(Y\), and so are themselves random variables.
The Law of Total Variance states that the variance of \(X\) decomposes as the sum of the expected variance of \(X\) given \(Y\) and the variance of the expectation of \(X\) given \(Y\). This is a particularly powerful result that will be explored further in MATH1058: Statistical Models and Methods.
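For the car model of Example 4.2.2 we found \(\mathbb{E}[X \mid Y=y] = y\) and \(\text{Var}(X \mid Y=y) = y^2\), and the marginal PDF \(f_Y(y) = e^{-y}\) means \(Y\) is exponential with mean 1, so \(\mathbb{E}[Y^2] = 2\) and \(\text{Var}(Y) = 1\). The Law of Total Variance therefore predicts \(\text{Var}(X) = 2 + 1 = 3\), which a short simulation sketch can confirm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate the model of Example 4.2.2: Y has PDF e^{-y} (exponential,
# mean 1), and given Y = y, X is exponential with mean y.
y = rng.exponential(scale=1.0, size=2_000_000)
x = rng.exponential(scale=y)

# Law of Total Variance: Var(X) = E[Var(X|Y)] + Var(E[X|Y])
#                               = E[Y^2] + Var(Y) = 2 + 1 = 3.
print(f"Var(X) from simulation:      {x.var():.3f}")
print(f"E[Var(X|Y)] + Var(E[X|Y]):   {((y**2).mean() + y.var()):.3f}")
```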