22 Expected Values of Linear Combinations of Random Variables
22.1 Linear rescaling
If \(X\) is a random variable and \(a, b\) are non-random constants, then
\[\begin{align*} \text{E}(aX + b) & = a\text{E}(X) + b\\ \text{SD}(aX + b) & = |a|\text{SD}(X)\\ \text{Var}(aX + b) & = a^2\text{Var}(X) \end{align*}\]
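As a quick check of these properties, here is a minimal simulation sketch; the Exponential distribution and the constants \(a, b\) are arbitrary illustrative choices, not part of any example in this section.

```python
import numpy as np

rng = np.random.default_rng(12345)

# Simulate many values of X; for an Exponential(scale=2) distribution,
# E(X) = 2 and SD(X) = 2 (an arbitrary illustrative choice)
x = rng.exponential(scale=2.0, size=100_000)

a, b = -3.0, 10.0            # non-random constants
y = a * x + b                # the linearly rescaled random variable aX + b

# Simulated mean and SD of aX + b
print(y.mean(), y.std())

# Values predicted by the formulas: E(aX + b) = a E(X) + b, SD(aX + b) = |a| SD(X)
print(a * 2.0 + b, abs(a) * 2.0)
```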
22.2 Linearity of expected value
Example 22.1 Refer to the tables and plots in Example 5.29 in the textbook. Each scenario contains SAT Math (\(X\)) and Reading (\(Y\)) scores for 10 hypothetical students, along with the total score (\(T = X + Y\)) and the difference between the Math and Reading scores (\(D = X - Y\); negative values indicate a Math score lower than the Reading score). Note that the 10 \(X\) values are the same in each scenario, and the 10 \(Y\) values are the same in each scenario, but the \((X, Y)\) values are paired in different ways: the correlation is 0.78 in scenario 1, -0.02 in scenario 2, and -0.94 in scenario 3.
What is the mean of \(T = X + Y\) in each scenario? How does it relate to the means of \(X\) and \(Y\)? Does the correlation affect the mean of \(T = X + Y\)?
What is the mean of \(D = X - Y\) in each scenario? How does it relate to the means of \(X\) and \(Y\)? Does the correlation affect the mean of \(D = X - Y\)?
- Linearity of expected value. For any two random variables \(X\) and \(Y\), \[\begin{align*} \text{E}(X + Y) & = \text{E}(X) + \text{E}(Y) \end{align*}\]
- That is, the expected value of the sum is the sum of expected values, regardless of how the random variables are related.
- Therefore, you only need to know the marginal distributions of \(X\) and \(Y\) to find the expected value of their sum. (But keep in mind that the distribution of \(X+Y\) will depend on the joint distribution of \(X\) and \(Y\).)
- Whether in the short run or the long run, \[\begin{align*} \text{Average of $X + Y$ } & = \text{Average of $X$} + \text{Average of $Y$} \end{align*}\] regardless of the joint distribution of \(X\) and \(Y\) (see the simulation sketch after this list).
- A linear combination of two random variables \(X\) and \(Y\) is of the form \(aX + bY\) where \(a\) and \(b\) are non-random constants. Combining properties of linear rescaling with linearity of expected value yields the expected value of a linear combination. \[ \text{E}(aX + bY) = a\text{E}(X)+b\text{E}(Y) \]
- Linearity of expected value extends naturally to more than two random variables.
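The following simulation sketch illustrates linearity of expected value. It is not part of Example 22.1 itself: the pairs are simulated from Bivariate Normal distributions whose marginal means and SDs are the same in every case, with correlations that mimic the three scenarios; the particular SAT-like means and SDs are arbitrary choices. The simulated mean of \(X + Y\) is about the same regardless of the correlation.

```python
import numpy as np

rng = np.random.default_rng(12345)
n = 100_000
means = [527, 533]                # illustrative marginal means of X and Y
sd_x, sd_y = 107, 100             # illustrative marginal SDs

for rho in [0.78, -0.02, -0.94]:  # correlations like the three scenarios
    cov = [[sd_x**2,           rho * sd_x * sd_y],
           [rho * sd_x * sd_y, sd_y**2]]
    x, y = rng.multivariate_normal(means, cov, size=n).T
    # E(X + Y) = E(X) + E(Y) = 527 + 533 = 1060, whatever the correlation
    print(rho, (x + y).mean())
```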
22.3 Variance of linear combinations of random variables
Example 22.2 Consider a random variable \(X\) with \(\text{Var}(X)=1\). What is \(\text{Var}(2X)\)?
- Walt says: \(\text{SD}(2X) = 2\text{SD}(X)\) so \(\text{Var}(2X) = 2^2\text{Var}(X) = 4(1) = 4\).
- Jesse says: the variance of a sum is the sum of the variances, so \(\text{Var}(2X) = \text{Var}(X+X)\), which equals \(\text{Var}(X)+\text{Var}(X) = 1+1=2\).
Who is correct? Why is the other wrong?
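A simulation sketch that may help separate the two claims (the standard Normal distribution is just a convenient choice with \(\text{Var}(X) = 1\)): \(2X\) doubles a single random variable, while \(X_1 + X_2\) adds two independent copies of it, and the two do not have the same variance.

```python
import numpy as np

rng = np.random.default_rng(12345)
n = 100_000

x1 = rng.standard_normal(n)    # X, with Var(X) = 1
x2 = rng.standard_normal(n)    # an independent copy of X

print(np.var(2 * x1))          # Var(2X): close to 4
print(np.var(x1 + x2))         # Var(X1 + X2) for independent copies: close to 2
```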
Example 22.3 Recall Example 22.1.
In which of the three scenarios is \(\text{Var}(X + Y)\) the largest? Can you explain why?
In which of the three scenarios is \(\text{Var}(X + Y)\) the smallest? Can you explain why?
In which scenario is \(\text{Var}(X + Y)\) roughly equal to the sum of \(\text{Var}(X)\) and \(\text{Var}(Y)\)?
In which of the three scenarios is \(\text{Var}(X - Y)\) the largest? Can you explain why?
In which of the three scenarios is \(\text{Var}(X - Y)\) the smallest? Can you explain why?
In which scenario is \(\text{Var}(X - Y)\) roughly equal to the sum of \(\text{Var}(X)\) and \(\text{Var}(Y)\)?
- Variance of sums and differences of random variables. \[\begin{align*} \text{Var}(X + Y) & = \text{Var}(X) + \text{Var}(Y) + 2\text{Cov}(X, Y)\\ \text{Var}(X - Y) & = \text{Var}(X) + \text{Var}(Y) - 2\text{Cov}(X, Y) \end{align*}\]
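To see where the covariance term comes from, expand the definition of variance, writing \(\mu_X = \text{E}(X)\) and \(\mu_Y = \text{E}(Y)\): \[\begin{align*} \text{Var}(X + Y) & = \text{E}\left[\left((X - \mu_X) + (Y - \mu_Y)\right)^2\right]\\ & = \text{E}\left[(X - \mu_X)^2\right] + \text{E}\left[(Y - \mu_Y)^2\right] + 2\text{E}\left[(X - \mu_X)(Y - \mu_Y)\right]\\ & = \text{Var}(X) + \text{Var}(Y) + 2\text{Cov}(X, Y) \end{align*}\] The formula for the difference follows by replacing \(Y\) with \(-Y\), since \(\text{Var}(-Y) = \text{Var}(Y)\) while \(\text{Cov}(X, -Y) = -\text{Cov}(X, Y)\).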
Example 22.4 Assume that SAT Math (\(X\)) and Reading (\(Y\)) scores follow a Bivariate Normal distribution, that Math scores have mean 527 and standard deviation 107, and that Reading scores have mean 533 and standard deviation 100. Compute \(\text{Var}(X + Y)\) and \(\text{SD}(X+Y)\) for each of the following correlations.
\(\text{Corr}(X, Y) = 0.77\)
\(\text{Corr}(X, Y) = 0.40\)
\(\text{Corr}(X, Y) = 0\)
\(\text{Corr}(X, Y) = -0.77\)
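As a template for these computations (a worked instance of the variance-of-a-sum formula above), here is the case \(\text{Corr}(X, Y) = 0.40\); the other correlations follow the same pattern. \[\begin{align*} \text{Cov}(X, Y) & = (0.40)(107)(100) = 4280\\ \text{Var}(X + Y) & = 107^2 + 100^2 + 2(4280) = 11449 + 10000 + 8560 = 30009\\ \text{SD}(X + Y) & = \sqrt{30009} \approx 173.2 \end{align*}\]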
Example 22.5 Continuing the previous example, compute \(\text{Var}(X - Y)\) and \(\text{SD}(X-Y)\) for each of the following correlations.
\(\text{Corr}(X, Y) = 0.77\)
\(\text{Corr}(X, Y) = 0.40\)
\(\text{Corr}(X, Y) = 0\)
\(\text{Corr}(X, Y) = -0.77\)
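A short sketch that carries out the arithmetic for all four correlations in Examples 22.4 and 22.5; only the variance formulas are needed here (the Bivariate Normal assumption matters for the full distribution of the sum and difference, not for these calculations).

```python
import numpy as np

sd_x, sd_y = 107, 100            # SDs of Math (X) and Reading (Y) scores

for corr in [0.77, 0.40, 0, -0.77]:
    cov = corr * sd_x * sd_y                  # Cov(X, Y) = Corr(X, Y) SD(X) SD(Y)
    var_sum  = sd_x**2 + sd_y**2 + 2 * cov    # Var(X + Y)
    var_diff = sd_x**2 + sd_y**2 - 2 * cov    # Var(X - Y)
    print(corr, var_sum, np.sqrt(var_sum), var_diff, np.sqrt(var_diff))
```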
- The variance of the sum is the sum of the variances if and only if \(X\) and \(Y\) are uncorrelated. \[\begin{align*} \text{Var}(X+Y) & = \text{Var}(X) + \text{Var}(Y)\qquad \text{if $X, Y$ are uncorrelated}\\ \text{Var}(X-Y) & = \text{Var}(X) + \text{Var}(Y)\qquad \text{if $X, Y$ are uncorrelated} \end{align*}\]
- The variance of the difference of uncorrelated random variables is the sum of the variances.
- If \(a, b, c\) are non-random constants and \(X\) and \(Y\) are random variables then \[ \text{Var}(aX + bY + c) = a^2\text{Var}(X) + b^2\text{Var}(Y) + 2ab\text{Cov}(X, Y) \]
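A simulation sketch of this general formula, using an arbitrary Bivariate Normal distribution and arbitrary constants \(a, b, c\) purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(12345)
n = 1_000_000

mu_x, mu_y = 527, 533                      # illustrative means
sd_x, sd_y, rho = 107, 100, 0.40           # illustrative SDs and correlation
cov_xy = rho * sd_x * sd_y
x, y = rng.multivariate_normal(
    [mu_x, mu_y],
    [[sd_x**2, cov_xy], [cov_xy, sd_y**2]],
    size=n).T

a, b, c = 2.0, -1.5, 40.0                  # non-random constants

# Simulated variance of aX + bY + c
print(np.var(a * x + b * y + c))

# Value predicted by the formula
print(a**2 * sd_x**2 + b**2 * sd_y**2 + 2 * a * b * cov_xy)
```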
22.4 Bilinearity of covariance
If \(a, b, c, d\) are non-random constants and \(X, Y, U, V\) are random variables, then \[\begin{align*} \text{Cov}(X, X) &= \text{Var}(X)\qquad\qquad\\ \text{Cov}(X, Y) & = \text{Cov}(Y, X)\\ \text{Cov}(X, c) & = 0 \\ \text{Cov}(aX+b, cY+d) & = ac\text{Cov}(X,Y)\\ \text{Cov}(X+Y,\; U+V) & = \text{Cov}(X, U)+\text{Cov}(X, V) + \text{Cov}(Y, U) + \text{Cov}(Y, V) \end{align*}\]
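As one application of these properties, the variance formula for a linear combination in the previous section follows directly from bilinearity, since \(\text{Var}(W) = \text{Cov}(W, W)\) and the constant \(c\) contributes nothing: \[\begin{align*} \text{Var}(aX + bY + c) & = \text{Cov}(aX + bY + c,\; aX + bY + c)\\ & = a^2\text{Cov}(X, X) + ab\text{Cov}(X, Y) + ab\text{Cov}(Y, X) + b^2\text{Cov}(Y, Y)\\ & = a^2\text{Var}(X) + b^2\text{Var}(Y) + 2ab\text{Cov}(X, Y) \end{align*}\]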