# 21 Covariance and Correlation

- Quantities like expected value and variance summarize characteristics of the *marginal* distribution of a single random variable.
- When there are multiple random variables, their *joint* distribution is of interest.
- Covariance and correlation summarize a characteristic of the joint distribution of two random variables, namely, the degree to which they "co-deviate from their respective means".

## 21.1 Covariance

- The **covariance** between two random variables \(X\) and \(Y\) is \[\begin{align*} \text{Cov}(X,Y) & = \text{E}\left[\left(X-\text{E}[X]\right)\left(Y-\text{E}[Y]\right)\right]\\ & = \text{E}(XY) - \text{E}(X)\text{E}(Y) \end{align*}\]
- Covariance is defined as the long run average of the products of the paired deviations from the respective means.
- Covariance can be computed as the expected value of the product minus the product of expected values.

**Example 21.1** Consider the probability space corresponding to two rolls of a fair four-sided die. Let \(X\) be the sum of the two rolls and \(Y\) the larger of the two rolls. The joint pmf is contained in Example 10.1. It can also be shown that \(\text{E}(X) = 5\) and \(\text{E}(Y) = 3.125\).

Find \(\text{E}(XY)\). Is it equal to \(\text{E}(X)\text{E}(Y)\)?

Find \(\text{Cov}(X, Y)\). Why is the covariance positive?
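One way to check the arithmetic is to enumerate all 16 equally likely outcomes of the two rolls directly (a sketch in Python; the variable names are ours, and exact fractions are used to avoid rounding):

```python
from itertools import product
from fractions import Fraction

# All 16 equally likely outcomes of two rolls of a fair four-sided die
outcomes = list(product(range(1, 5), repeat=2))
p = Fraction(1, len(outcomes))  # each outcome has probability 1/16

# X = sum of the two rolls, Y = larger of the two rolls
ex = sum(p * (i + j) for i, j in outcomes)               # E(X) = 5
ey = sum(p * max(i, j) for i, j in outcomes)             # E(Y) = 25/8 = 3.125
exy = sum(p * (i + j) * max(i, j) for i, j in outcomes)  # E(XY), via LOTUS

cov = exy - ex * ey  # covariance via the shortcut formula

print(ex, ey, exy, cov)  # 5 25/8 135/8 5/4
```

Since \(\text{E}(XY) = 16.875 \neq 15.625 = \text{E}(X)\text{E}(Y)\), the covariance is positive: large sums tend to occur together with large maxima.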

- \(\text{Cov}(X,Y)>0\) (positive association): above average values of \(X\) tend to be associated with above average values of \(Y\)
- \(\text{Cov}(X,Y)<0\) (negative association): above average values of \(X\) tend to be associated with below average values of \(Y\)
- \(\text{Cov}(X,Y)=0\) indicates that the random variables are **uncorrelated**: there is no overall positive or negative association. \[ X \text{ and } Y \text{ are uncorrelated} \Longleftrightarrow \text{Cov}(X, Y) = 0 \Longleftrightarrow \text{E}(XY) = \text{E}(X)\text{E}(Y) \]
- But be careful! Uncorrelated is *not* the same as independent.
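A standard counterexample (our own illustrative sketch, not from the examples in this chapter): let \(X\) be uniform on \(\{-1, 0, 1\}\) and \(Y = X^2\). Then \(Y\) is completely determined by \(X\), so the two are as dependent as possible, yet their covariance is 0:

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}; Y = X^2 is a deterministic function of X
xs = [-1, 0, 1]
p = Fraction(1, 3)

ex = sum(p * x for x in xs)           # E(X) = 0 by symmetry
ey = sum(p * x ** 2 for x in xs)      # E(Y) = 2/3
exy = sum(p * x * x ** 2 for x in xs) # E(XY) = E(X^3) = 0

cov = exy - ex * ey
print(cov)  # 0 -> uncorrelated, even though Y is a function of X
```

Zero covariance rules out an overall linear trend, but says nothing about nonlinear dependence.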

**Example 21.2** What is another name for \(\text{Cov}(X, X)\)?

## 21.2 Law of the unconscious statistician (LOTUS)

- In general \[\begin{align*} \text{E}\left(g(X, Y)\right) & \neq g\left(\text{E}(X), \text{E}(Y)\right) \end{align*}\]
- LOTUS for two random variables \[\begin{align*} & \text{Discrete $X, Y$ with joint pmf $p_{X, Y}$:} & \text{E}[g(X, Y)] & = \sum_{x}\sum_{y} g(x, y) p_{X, Y}(x, y)\\ & \text{Continuous $X, Y$ with joint pdf $f_{X, Y}$:} & \text{E}[g(X, Y)] & = \int_{-\infty}^\infty\int_{-\infty}^\infty g(x, y) f_{X, Y}(x, y)\,dxdy \end{align*}\]
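The two rolls of a four-sided die from Example 21.1 already illustrate the first point with \(g(x, y) = \max(x, y)\). Writing \(U_1, U_2\) for the two individual rolls (our notation), the discrete LOTUS sum gives \(\text{E}[\max(U_1, U_2)] = 3.125\), whereas \(\max(\text{E}(U_1), \text{E}(U_2)) = 2.5\). A sketch:

```python
from itertools import product
from fractions import Fraction

# Two rolls of a fair four-sided die; the joint pmf puts mass 1/16 on each pair
p = Fraction(1, 16)
pairs = list(product(range(1, 5), repeat=2))

# LOTUS: E[g(U1, U2)] = double sum of g(u1, u2) * p(u1, u2)
e_max = sum(p * max(u1, u2) for u1, u2 in pairs)  # E[max] = 25/8

# g evaluated at the expected values instead
e_u = sum(Fraction(u, 4) for u in range(1, 5))    # E(U1) = E(U2) = 5/2
g_of_means = max(e_u, e_u)                        # = 5/2

print(e_max, g_of_means)  # 25/8 5/2 -> not equal
```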

**Example 21.3** Let \(X\) and \(Y\) denote the resistances (ohms) of two randomly selected resistors, with, respectively, Uniform(135, 165) and Uniform(162, 198) marginal distributions. Assume \(X\) and \(Y\) are independent. Suppose the resistors are connected in parallel so that the system resistance is \[
R = \frac{1}{1/X + 1/Y} = \frac{XY}{X+Y}
\] Compute \(\text{E}(R)\).
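Evaluating the continuous LOTUS here requires a double integral. A Monte Carlo approximation is a quick sanity check on that computation (a sketch with our own variable names; the exact value still comes from the integral):

```python
import random

random.seed(21)
n = 100_000

# X ~ Uniform(135, 165) and Y ~ Uniform(162, 198), independent,
# so simulating each marginal separately simulates from the joint distribution
total = 0.0
for _ in range(n):
    x = random.uniform(135, 165)
    y = random.uniform(162, 198)
    total += (x * y) / (x + y)  # system resistance R for resistors in parallel

est = total / n
print(est)  # roughly 81.8; note E(R) is NOT E(X)E(Y) / (E(X) + E(Y)) exactly
```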

## 21.3 Correlation

- The **correlation (coefficient)** between random variables \(X\) and \(Y\) is \[\begin{align*} \text{Corr}(X,Y) & = \text{Cov}\left(\frac{X-\text{E}(X)}{\text{SD}(X)},\frac{Y-\text{E}(Y)}{\text{SD}(Y)}\right)\\ & = \frac{\text{Cov}(X, Y)}{\text{SD}(X)\text{SD}(Y)} \end{align*}\]
- The correlation between two random variables is the covariance between the corresponding standardized random variables. Therefore, correlation is a *standardized* measure of the association between two random variables.
- A correlation coefficient has no units and is measured on a universal scale: regardless of the original measurement units of the random variables \(X\) and \(Y\), \[ -1\le \textrm{Corr}(X,Y)\le 1 \]
- \(\textrm{Corr}(X,Y) = 1\) if and only if \(Y=aX+b\) for some \(a>0\)
- \(\textrm{Corr}(X,Y) = -1\) if and only if \(Y=aX+b\) for some \(a<0\)
- Therefore, correlation is a standardized measure of the strength of the *linear* association between two random variables.
- Covariance is the correlation times the product of the standard deviations. \[ \text{Cov}(X, Y) = \text{Corr}(X, Y)\text{SD}(X)\text{SD}(Y) \]
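To see the boundary cases concretely, here is a small check (our own illustrative sketch) that a perfect linear relationship \(Y = aX + b\) gives correlation \(\pm 1\), with the sign matching the sign of \(a\). We take \(X\) to be a single roll of a fair four-sided die:

```python
import math

# X = a single roll of a fair four-sided die (equally likely values 1..4)
xs = [1, 2, 3, 4]

def corr_with_linear(a, b):
    """Correlation between X and Y = aX + b, computed from the definition."""
    ys = [a * x + b for x in xs]
    ex, ey = sum(xs) / 4, sum(ys) / 4
    cov = sum((x - ex) * (y - ey) for x, y in zip(xs, ys)) / 4
    sdx = math.sqrt(sum((x - ex) ** 2 for x in xs) / 4)
    sdy = math.sqrt(sum((y - ey) ** 2 for y in ys) / 4)
    return cov / (sdx * sdy)

print(corr_with_linear(2, 3))    # ≈ 1.0  (a > 0)
print(corr_with_linear(-5, 10))  # ≈ -1.0 (a < 0)
```

Any nonlinear but monotone relationship would instead give a correlation strictly between \(-1\) and \(1\).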

**Example 21.4** Let \(X\) be the time (hours), starting now, until the next earthquake (of any magnitude) occurs in SoCal, and let \(Y\) be the time (hours), starting now, until the second earthquake from now occurs (so that \(Y-X\) is the time between the first and second earthquake). Suppose that \(X\) and \(Y\) are continuous RVs with joint pdf

\[ f_{X, Y}(x, y) = \begin{cases} 4e^{-2y}, & 0 < x< y < \infty,\\ 0, & \text{otherwise} \end{cases} \]

Recall that in Example 20.5 we found the marginal distributions of \(X\) and \(Y\). Using the marginal distributions, it can be shown that \(\text{E}(X) = 1/2\), \(\text{E}(Y) = 1\), \(\text{SD}(X)=1/2\), and \(\text{SD}(Y)=1/\sqrt{2}.\)

Compute \(\text{Cov}(X, Y)\).

Compute \(\text{Corr}(X, Y)\).
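A simulation can check the answers numerically. One convenient way to simulate from this joint pdf (an assumption consistent with the earthquake story: the two interarrival times \(X\) and \(Y - X\) are independent Exponential(2) variables, since \(2e^{-2x}\cdot 2e^{-2(y-x)} = 4e^{-2y}\) for \(0 < x < y\)) is the following sketch:

```python
import random

random.seed(20)
n = 200_000

# Interarrival times W1, W2 iid Exponential(rate 2); X = W1, Y = W1 + W2
# then has joint pdf f(x, y) = 4e^{-2y} on 0 < x < y, as in the example
sx = sy = sxy = sx2 = sy2 = 0.0
for _ in range(n):
    x = random.expovariate(2)
    y = x + random.expovariate(2)
    sx += x; sy += y; sxy += x * y; sx2 += x * x; sy2 += y * y

ex, ey = sx / n, sy / n
cov = sxy / n - ex * ey           # sample covariance via the shortcut formula
sdx = (sx2 / n - ex * ex) ** 0.5  # should be near SD(X) = 1/2
sdy = (sy2 / n - ey * ey) ** 0.5  # should be near SD(Y) = 1/sqrt(2)
corr = cov / (sdx * sdy)

print(cov, corr)  # near 0.25 and 0.71, respectively
```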