4.1 Definitions
The mean (or the expected value) of a rv, say \(X\), is denoted \(E[X]\) (and sometimes by \(\mu\)) and is defined as \[ E[X] = \begin{cases} \displaystyle \int_S x f(x)\, dx & \qquad\text{for $X$ continuous};\notag\\ ~&\\ \displaystyle \sum_S x f(x) & \qquad\text{for $X$ discrete}.\notag\\ \end{cases} \]
where \(S\) is the sample space. The mean is also called the first moment about \(0\).
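For a concrete illustration of the discrete case, the mean is just a probability-weighted sum over the sample space. The following sketch (the fair six-sided die is a hypothetical example, not from the text) computes \(E[X] = \sum_S x f(x)\) directly:

```python
# Mean of a discrete rv: E[X] = sum over S of x * f(x).
# Hypothetical example: X is a fair six-sided die, so f(x) = 1/6
# for every x in S = {1, 2, ..., 6}.
S = range(1, 7)
f = {x: 1/6 for x in S}

mean = sum(x * f[x] for x in S)
print(mean)  # 3.5
```

Note that the probabilities must sum to 1 for `f` to be a valid pmf; the same weighted-sum pattern works for any finite discrete distribution.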
More generally, the expected value of any function of \(X\), say \(g(X)\), is denoted \(E[g(X)]\) and is defined as \[ E[g(X)] = \begin{cases} \displaystyle\int_S g(x) f(x)\, dx & \qquad\text{for $X$ continuous};\notag\\ ~&\\ \displaystyle\sum_S g(x) f(x) & \qquad\text{for $X$ discrete}.\notag \end{cases} \] One important application of this is the variance, where \(g(X) = (X-\mu)^2\). The variance of a rv, say \(X\), is denoted \(\text{var}[X]\) and is defined as \[ \text{var}[X] = E[(X-\mu)^2] = \begin{cases} \displaystyle\int_S (x-\mu)^2 f(x)\, dx &\qquad\text{for $X$ continuous};\notag\\ ~&\\ \displaystyle\sum_S (x-\mu)^2 f(x) &\qquad\text{for $X$ discrete}.\notag \end{cases} \] The variance is called the second moment about the mean. The standard deviation is the positive square root of the variance, and measures the amount of variation in the rv \(X\).
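The variance definition above can be sketched the same way for a discrete rv: first compute \(\mu = E[X]\), then the probability-weighted sum of squared deviations. This again uses a fair die as a hypothetical example:

```python
# Variance of a discrete rv: var[X] = E[(X - mu)^2] = sum (x - mu)^2 f(x).
# Hypothetical example: a fair six-sided die, f(x) = 1/6 on S = {1, ..., 6}.
S = range(1, 7)
f = {x: 1/6 for x in S}

mu = sum(x * f[x] for x in S)              # first moment: E[X] = 3.5
var = sum((x - mu)**2 * f[x] for x in S)   # second moment about the mean
sd = var ** 0.5                            # positive square root of the variance
print(var)  # 35/12, approximately 2.9167
```

The same three lines implement \(E[g(X)]\) for any \(g\): replace `(x - mu)**2` with the desired function of `x`.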
The covariance between two variables, say \(X\) and \(Z\), is \[\begin{align} \text{Cov}[X, Z] &= E[ (X-E[X]) (Z-E[Z])]\notag\\ &= E[XZ] - E[X]E[Z].\notag \end{align}\] If \(\text{Cov}[X, Z] = 0\), then \(X\) and \(Z\) are said to be uncorrelated.
If \(X\) and \(Z\) are independent, then \(\text{Cov}[X, Z] = 0\).
However, the converse does not hold: if \(\text{Cov}[X,Z] = 0\), then \(X\) and \(Z\) may or may not be independent.
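A standard counterexample (a hypothetical illustration, not from the text) makes this concrete: take \(X\) uniform on \(\{-1, 0, 1\}\) and \(Z = X^2\). Then \(Z\) is a function of \(X\), so the two are clearly not independent, yet their covariance is zero:

```python
# Uncorrelated does not imply independent.
# Hypothetical example: X uniform on {-1, 0, 1}, Z = X**2.
# Cov[X, Z] = E[XZ] - E[X]E[Z] = E[X^3] - E[X]E[X^2] = 0 - 0 = 0.
S = [-1, 0, 1]
f = 1/3                               # uniform pmf on S

EX  = sum(x * f for x in S)           # E[X]  = 0
EZ  = sum(x**2 * f for x in S)        # E[Z]  = E[X^2] = 2/3
EXZ = sum(x * x**2 * f for x in S)    # E[XZ] = E[X^3] = 0

cov = EXZ - EX * EZ
print(cov)  # 0.0
```

Despite the zero covariance, knowing \(X\) determines \(Z\) completely, so the variables are dependent; covariance only detects *linear* association.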