2.2 Expected Values and Variance

Definition:

Let \(X\) be a random variable with range \(R_X\).

  1. Define the expected value of a random variable \(\mathbb{E}(X)\) by \[ \mu = \mathbb{E}(X) = \begin{cases} \displaystyle \sum_{x \in R_X} xp_X(x), & X \mbox{ is discrete} \\ \displaystyle \int_{x \in R_X} xf_X(x) \, dx, & X \mbox{ is continuous} \\ \end{cases} \]
  2. Define the variance of a random variable \(\text{Var}(X)\) by \[ \sigma^2 = \text{Var}(X) = \mathbb{E}[(X-\mu)^2] = \begin{cases} \displaystyle \sum_{x \in R_X} (x-\mu)^2 p_X(x), & X \mbox{ is discrete} \\ \displaystyle \int_{x \in R_X} (x-\mu)^2 f_X(x) \, dx, & X \mbox{ is continuous} \\ \end{cases} \]
  3. Define the standard deviation of a random variable \(\text{SD}(X)\) by \[\sigma = \text{SD}(X) = \sqrt{\text{Var}(X)}\]
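The three definitions above can be checked numerically for a simple discrete case. As a hypothetical example, take \(X\) to be a fair six-sided die, so \(R_X = \{1,\dots,6\}\) and \(p_X(x) = 1/6\):

```python
# Expectation, variance, and SD of a fair six-sided die (hypothetical example).
# R_X = {1, ..., 6}, p_X(x) = 1/6 for each x.
R_X = [1, 2, 3, 4, 5, 6]
p = {x: 1 / 6 for x in R_X}

mu = sum(x * p[x] for x in R_X)                # E(X) = sum of x * p_X(x)
var = sum((x - mu) ** 2 * p[x] for x in R_X)   # Var(X) = sum of (x - mu)^2 * p_X(x)
sd = var ** 0.5                                # SD(X) = sqrt(Var(X))

print(mu, var, sd)   # E(X) = 3.5, Var(X) = 35/12 ≈ 2.9167
```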


Theorem: (the rule of the lazy statistician)

Let \(X\) be a random variable with range \(R_X\), and let \(g(\cdot)\) be any real-valued function. Then \[ \mathbb{E}[g(X)] = \begin{cases} \displaystyle \sum_{x \in R_X} g(x) p_X(x), & X \mbox{ is discrete} \\ \displaystyle \int_{x \in R_X} g(x) f_X(x) \, dx, & X \mbox{ is continuous} \\ \end{cases} \]

(the proof is not trivial!)
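The point of the theorem is that we can compute \(\mathbb{E}[g(X)]\) directly from the distribution of \(X\), without first deriving the distribution of \(Y = g(X)\). A sketch for a fair die with \(g(x) = x^2\) (both choices are illustrative assumptions) shows the two routes agree:

```python
# Lazy-statistician sketch for a fair die and g(x) = x^2 (hypothetical example).
from collections import Counter

R_X = [1, 2, 3, 4, 5, 6]
p = {x: 1 / 6 for x in R_X}
g = lambda x: x ** 2

# Direct route (the theorem): E[g(X)] = sum of g(x) * p_X(x) over R_X
e_direct = sum(g(x) * p[x] for x in R_X)

# Long route: first derive the pmf of Y = g(X), then compute E(Y) = sum of y * p_Y(y)
p_Y = Counter()
for x in R_X:
    p_Y[g(x)] += p[x]
e_long = sum(y * py for y, py in p_Y.items())

print(e_direct, e_long)  # both equal 91/6 ≈ 15.1667
```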


Proposition:

Suppose \(X\) is a random variable, \(a,b,c \in \mathbb{R}\), and \(g(\cdot)\) is any real-valued function. Then

  1. \(\mathbb{E}(X^2) = \sigma^2 + \mu^2\)
  2. \(\mathbb{E}[ag(X)+b] = a\mathbb{E}[g(X)]+b\)
  3. \(\text{Var}(X) \geq 0\)
  4. \(\text{Var}(c)=0\)
  5. \(\text{Var}(ag(X)+b) = a^2\text{Var}(g(X))\)
  6. \(\text{Var}(X) = \mathbb{E}(X^2) - [\mathbb{E}(X)]^2\)
  7. \(\text{SD}(X) \geq 0\)
  8. \(\text{SD}(c)=0\)
  9. \(\text{SD}(ag(X)+b) = |a|\text{SD}(g(X))\)
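Several of these identities can be verified numerically. A minimal sketch, again using a fair die and the identity function \(g(x) = x\) as an assumed example, checks items 1, 2, 5, and 6:

```python
# Numeric check of proposition items 1, 2, 5, 6 on a fair die (hypothetical example).
R_X = [1, 2, 3, 4, 5, 6]
p = 1 / 6
a, b = 2.0, 5.0

E = lambda h: sum(h(x) * p for x in R_X)             # E[h(X)] via the lazy statistician
Var = lambda h: E(lambda x: h(x) ** 2) - E(h) ** 2   # item 6: Var = E(h^2) - [E(h)]^2

mu = E(lambda x: x)
sigma2 = Var(lambda x: x)

assert abs(E(lambda x: x ** 2) - (sigma2 + mu ** 2)) < 1e-9     # item 1
assert abs(E(lambda x: a * x + b) - (a * mu + b)) < 1e-9        # item 2
assert abs(Var(lambda x: a * x + b) - a ** 2 * sigma2) < 1e-9   # item 5
print(mu, sigma2)  # 3.5, 35/12 ≈ 2.9167
```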

2.2.1 *Approximation of a random variable

If we want to calculate the expectation and variance of a transformation \(g(X)\) of a random variable, we can use a Taylor expansion to approximate their values.

Expand \(g(X)\) at \(X=\mu\): \[ \begin{align*} g(X) &= \frac{g(\mu)}{0!}(X-\mu)^0 + \frac{g'(\mu)}{1!}(X-\mu)^1 + \frac{g''(\mu)}{2!}(X-\mu)^2 + \cdots \\ &= g(\mu) + g'(\mu)(X-\mu) + \frac{g''(\mu)}{2}(X-\mu)^2 + \cdots \end{align*} \]

Taking the expectation of the first three terms (note \(\mathbb{E}[g(\mu)] = g(\mu)\) and \(\mathbb{E}(X-\mu) = 0\)): \[ \begin{align*} \mathbb{E}[g(X)] &\approx \mathbb{E}[g(\mu)] + g'(\mu)\mathbb{E}[(X-\mu)] + \frac{1}{2}g''(\mu)\mathbb{E}[(X-\mu)^2] \\ &= g(\mu) + \frac{1}{2}g''(\mu)\sigma^2 \end{align*} \]

Taking the variance of the first two terms (note \(\text{Var}[g(\mu)] = 0\) since \(g(\mu)\) is a constant): \[ \begin{align*} \text{Var}[g(X)] &\approx \text{Var}[g(\mu)] + [g'(\mu)]^2 \text{Var}[(X-\mu)] \\ &= [g'(\mu)]^2 \sigma^2 \end{align*} \]
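These two approximations can be compared against a Monte Carlo estimate. A sketch with the assumed choices \(g(x) = e^x\) and \(X \sim N(\mu, \sigma^2)\) with \(\mu = 0.5\), \(\sigma = 0.1\) (here \(g = g' = g''\), which keeps the code short):

```python
# Monte Carlo check of the Taylor approximations for g(x) = exp(x),
# X ~ N(0.5, 0.1^2). The choices of g, mu, sigma, and sample size are assumptions.
import math
import random

random.seed(0)
mu, sigma = 0.5, 0.1
g = g1 = g2 = math.exp   # g, g', g'' all equal exp

xs = [random.gauss(mu, sigma) for _ in range(200_000)]
gs = [g(x) for x in xs]

mc_mean = sum(gs) / len(gs)
mc_var = sum((y - mc_mean) ** 2 for y in gs) / len(gs)

approx_mean = g(mu) + 0.5 * g2(mu) * sigma ** 2   # E[g(X)] ≈ g(mu) + g''(mu) sigma^2 / 2
approx_var = (g1(mu) ** 2) * sigma ** 2           # Var[g(X)] ≈ [g'(mu)]^2 sigma^2

print(mc_mean, approx_mean)   # both ≈ 1.657
print(mc_var, approx_var)     # both ≈ 0.027
```

For this choice of \(g\) the exact values are known (\(e^X\) is lognormal), so the closeness of the simulated and approximated moments is a genuine check of the expansion, not a coincidence of the example.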

This is related to a technique in statistics called the \(\Delta\)-method, which derives the asymptotic distribution of a transformed random variable. We will introduce it in Chapter ??.