# 31 Stationary Stochastic Processes

- A stochastic process \(X(t)\) is **wide-sense stationary (WSS)** if:
  - The mean function \(\mu_X(t)\) does not depend on \(t\): \(\mu_X(t) = \mu_X\) for all \(t\)
  - The autocovariance function \(C_{X}(t,s)\) satisfies: for all times \(t, s\) and time shifts \(\tau\), \(C_{X}(t,t+\tau)=C_{X}(s,s+\tau)\)

- For a WSS process, the mean function is constant in time, and the autocovariance at two points in time depends only on how far apart the times are, not on what the times are.
- That is, a process is WSS if its ensemble-*average* properties are invariant under time shifts.
- For a WSS stochastic process \(X(t)\) we can define the autocovariance function as a function of the time shift, \(C_{X}(\tau)\): \[ C_{X}(\tau) = \text{Cov}[X(t+\tau),X(t)] \text{ for all }t \]
- Similarly, for a WSS stochastic process \(X(t)\) we can define the autocorrelation function \[ R_{X}(\tau) = \text{E}[X(t+\tau)X(t)] \text{ for all }t \]
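As a quick numerical illustration of these definitions, the sketch below (all variable names are illustrative) estimates \(C_X(\tau)\) for discrete white noise from an ensemble of simulated sample paths, and checks that the estimate depends on the shift \(\tau\) but not on the base time \(t\):

```python
import numpy as np

# A sketch (names illustrative): estimate C_X(tau) = Cov[X(t+tau), X(t)]
# for discrete white noise from an ensemble of sample paths, and check that
# the estimate depends on tau but not on the base time t.
rng = np.random.default_rng(42)
n_paths, n_steps = 100_000, 20
paths = rng.standard_normal((n_paths, n_steps))  # i.i.d. N(0, 1) at each time

def emp_cov(t, tau):
    """Ensemble estimate of Cov[X(t + tau), X(t)]."""
    a, b = paths[:, t + tau], paths[:, t]
    return np.mean((a - a.mean()) * (b - b.mean()))

print(emp_cov(3, 0), emp_cov(10, 0))  # both near C_X(0) = Var = 1
print(emp_cov(3, 5), emp_cov(10, 5))  # both near C_X(5) = 0
```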

**Example 31.1** Determine whether each of the following stochastic processes is WSS.

A symmetric simple random walk (like in the Harry/Tom example).

\(X_n\) is a *discrete-time white noise process*: \(\ldots, X_{-2}, X_{-1}, X_{0}, X_{1}, X_{2}, \ldots\) is a sequence of independent and identically distributed (i.i.d.) random variables (e.g., \(X_n\sim N(0, 1)\) for all \(n\)).

\(X(t) = A \cos(2\pi t)\), where the amplitude \(A\) is a random variable (with non-zero variance).

\(X(t) = \cos(2\pi t + \Theta)\), where the random phase shift \(\Theta\) has a Uniform\((0, 2\pi)\) distribution.

\(X(t) =A \cos(2\pi t+\Theta)\), where \(A\) and \(\Theta\) are independent random variables, and \(\Theta\) has a Uniform\((0, 2\pi)\) distribution.

\(X(t) = s(t) + N(t)\), where \(s(t)\) is a non-random signal (e.g., \(s(t)=3\cos(2\pi t+\pi/2)\)) and \(N(t)\) is Gaussian random noise: at any time \(t\), \(N(t)\) has a Normal (Gaussian) distribution with mean 0 and standard deviation \(\sigma\), and \(N(t)\) and \(N(s)\) are independent for any \(t \neq s\).
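Two of these can be checked by simulation. The sketch below (illustrative, not a proof) shows the random walk's variance growing with \(n\), so it cannot be WSS, while the random-phase cosine's ensemble mean is 0 at every \(t\):

```python
import numpy as np

# A sketch (illustrative simulation, not a proof) for two of the processes:
# the symmetric random walk's variance grows with n, so it cannot be WSS;
# the random-phase cosine has ensemble mean 0 at every t.
rng = np.random.default_rng(1)

# Symmetric simple random walk: X_n = sum of n i.i.d. +/-1 steps, Var[X_n] = n.
steps = rng.choice([-1, 1], size=(50_000, 100))
walk = steps.cumsum(axis=1)
print(walk[:, 9].var(), walk[:, 99].var())  # near 10 and 100: grows with n

# Random-phase cosine: X(t) = cos(2 pi t + Theta), Theta ~ Uniform(0, 2 pi).
theta = rng.uniform(0, 2 * np.pi, size=200_000)
for t in (0.0, 0.3):
    print(np.cos(2 * np.pi * t + theta).mean())  # near 0 for every t
```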

- If \(X(t)\) is WSS, its autocovariance function \(C_{X}(\tau)\) satisfies
  - \(C_{X}(0) = \sigma_X^2 = \text{Var}[X(t)]\) for all \(t\)
  - \(C_{X}(\tau) = C_{X}(-\tau)\); the autocovariance function is symmetric about 0
  - \(|C_{X}(\tau)|\le C_{X}(0)\); the autocovariance function is maximized at \(\tau=0\)
  - If \(X(t)\) is periodic, so is \(C_{X}(\tau)\), and with the same period
  - If \(X(t)\) has no periodic component, then \(C_X(\tau)\to 0\) as \(\tau\to \infty\)[^1]

- If \(X(t)\) is WSS, its autocorrelation function \(R_{X}(\tau)\) satisfies
  - \(R_{X}(0) = \sigma_X^2 +\mu_X^2= \text{E}[X^2(t)]\) for all \(t\)
  - \(R_{X}(\tau) = R_{X}(-\tau)\); the autocorrelation function is symmetric about 0
  - \(|R_{X}(\tau)|\le R_{X}(0)\); the autocorrelation function is maximized at \(\tau=0\)
  - If \(X(t)\) is periodic, so is \(R_{X}(\tau)\), and with the same period
  - If \(X(t)\) has no periodic component, then \(R_X(\tau)\to \mu_X^2\) as \(\tau\to \infty\)[^2]
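For the random-phase cosine \(X(t) = \cos(2\pi t + \Theta)\) from Example 31.1, a product-to-sum calculation gives \(R_X(\tau) = \tfrac{1}{2}\cos(2\pi\tau)\), which makes the listed properties easy to verify; a sketch on a grid (names and grid size illustrative):

```python
import numpy as np

# For X(t) = cos(2 pi t + Theta) with Theta ~ Uniform(0, 2 pi), a
# product-to-sum calculation gives R_X(tau) = (1/2) cos(2 pi tau).
# Checking the listed properties on a grid (names illustrative):
tau = np.linspace(-2, 2, 401)
R = 0.5 * np.cos(2 * np.pi * tau)
i0 = np.argmin(np.abs(tau))                # index where tau = 0

print(R[i0])                               # R_X(0) = 1/2 = E[X^2(t)]
print(np.allclose(R, R[::-1]))             # symmetric: R_X(tau) = R_X(-tau)
print(np.all(np.abs(R) <= R[i0] + 1e-12))  # maximized at tau = 0
# Periodic with the same period (1) as X(t):
print(np.allclose(0.5 * np.cos(2 * np.pi * (tau + 1)), R))
```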

**Example 31.2** Suppose \(X(t)\) is a WSS process with autocorrelation function \[
R_{X}(\tau) = 4 + \frac{25}{1+6\tau^2}
\]

Find \(\text{E}[X^2(t)]\).

Find \(\text{E}[X(t)]\).

Find the autocovariance function.

Find the covariance of \(X(15)\) and \(X(18)\).

Find \(\text{Var}[X(t)]\).

Do we have enough information to compute \(\text{P}(X(15) > 3)\)?
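One way to organize these computations, using \(R_X(0)=\text{E}[X^2(t)]\), \(R_X(\tau)\to\mu_X^2\) as \(\tau\to\infty\) (no periodic component), and \(C_X(\tau)=R_X(\tau)-\mu_X^2\), is sketched below:

```python
# A sketch of reading Example 31.2's quantities off the autocorrelation
# function, using R_X(0) = E[X^2(t)], R_X(tau) -> mu_X^2 as tau -> infinity
# (no periodic component), and C_X(tau) = R_X(tau) - mu_X^2.
def R(tau):
    return 4 + 25 / (1 + 6 * tau**2)

mean_sq = R(0)          # E[X^2(t)] = 4 + 25 = 29
mu_sq = 4               # lim R_X(tau) = 4, so mu_X = +2 or -2 (sign unknown)

def C(tau):             # autocovariance: C_X(tau) = R_X(tau) - mu_X^2
    return R(tau) - mu_sq

cov_15_18 = C(18 - 15)  # Cov[X(15), X(18)] = C_X(3) = 25/55 = 5/11
var = C(0)              # Var[X(t)] = C_X(0) = 25
print(mean_sq, cov_15_18, var)
# P(X(15) > 3) is not determined: WSS fixes the first and second moments
# only, not the distribution of X(15).
```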

- Roughly speaking, a stochastic process is stationary if its statistical properties do not change over time.
- A stochastic process is WSS if its ensemble-*average* properties are invariant under time shifts.
- A stochastic process \(X(t)\) is **strict-sense stationary** if *all* of its statistical properties are invariant to time shifts: for any \(n\), time points \(t_1, \ldots, t_n\), and time shift \(\tau\), the joint distribution of \([X(t_1), \ldots, X(t_n)]\) is the same as the joint distribution of \([X(t_1+\tau), \ldots, X(t_n+\tau)]\).
  - In particular, \(X(t)\) has the same (marginal) distribution at each point in time \(t\).
  - But this property of the marginal distributions alone is not enough to imply that the process is strict-sense stationary.

- If a process is strict-sense stationary then it is WSS, but the converse is not true

## 31.1 Some notes about ergodic processes

- Roughly speaking, an **ergodic process** is one whose statistical (ensemble) properties match the corresponding temporal (time-related) properties.
- Engineers often assume their noise models to be ergodic, so that their statistical properties can be deduced by analyzing just a single sample path of the random process.
- Roughly speaking, an ergodic process is stationary, but a stationary process is not necessarily ergodic
- An example of an ergodic process: The statistical properties of rolling one die 100 times are the same as the statistical properties of rolling 100 dice once each.
- One die, 100 times = evolution of one realization (sample path) of a process over time
- 100 dice, once each = 100 realizations of the process at a single time
- Since time properties match ensemble properties, the process of die-rolling is ergodic.
- Die-rolling is also stationary, since the statistical properties of the dice do not change over time (i.e., as we continue to roll the dice).

- A stationary process which is not ergodic: Let \(X\) be a random variable with some distribution (say N(0, 1)) and define \(X(t)=X\) for all \(t\).
- \(X(t)\) is stationary, since the distribution of \(X\) obviously doesn’t change with time.
- \(X(t)\) is not ergodic: a single realization would be a constant, \(X = x\). This single constant has none of the statistical properties of the random variable.
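A quick simulation of this non-ergodic example (the seed and names are illustrative):

```python
import numpy as np

# A sketch of the non-ergodic example: X(t) = X for all t, with X ~ N(0, 1).
# Each sample path is constant, so its time average is just its own draw.
rng = np.random.default_rng(7)            # seed is illustrative
draws = rng.standard_normal(5)            # one draw of X per realization

# "Time average" each constant path: averaging a constant returns it.
time_avgs = np.array([np.full(1000, x).mean() for x in draws])
print(time_avgs)         # one distinct value per path, not E[X] = 0
print(time_avgs.std())   # the time average is still random across paths
```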

- The *time average* of a function \(x(t)\) is denoted \(\left\langle x(t)\right\rangle\) and defined as \[ \left\langle x(t)\right\rangle = \lim_{T\to \infty} \frac{1}{2T}\int_{-T}^T x(t) dt \]
  - Despite the fact that the notation includes \(t\), \(\left\langle x(t)\right\rangle\) is a number, not a function of \(t\) (since \(t\) has been integrated out).
- A stochastic process \(X(t)\) is *mean ergodic* if its time average and ensemble average are the same: \[ \text{mean ergodic:}\quad \left\langle X(t)\right\rangle = \text{E}[X(t)] \]
  - Since \(\left\langle X(t)\right\rangle\) does not depend on \(t\), a mean ergodic process must have a constant ensemble mean, \(\text{E}[X(t)]=\mu_X\) for all \(t\).
  - In general, \(\left\langle X(t)\right\rangle\) is a random variable, but for a mean ergodic process it is a constant. Any sample path of a mean ergodic process will (eventually, in the limit) produce the same time average.
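For instance, discrete white noise is mean ergodic: the time average over a single long sample path settles at the ensemble mean. A minimal sketch (names illustrative):

```python
import numpy as np

# A sketch: discrete white noise is mean ergodic, so the time average over a
# single long sample path approaches the ensemble mean E[X_n] = 0.
rng = np.random.default_rng(0)
path = rng.standard_normal(1_000_000)   # one realization of the process

time_avg = path.mean()                  # <X(t)> over this single path
print(time_avg)                         # close to the ensemble mean, 0
```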
- The *time autocorrelation* of a function \(x(t)\) is \[ \left\langle x(t)x(t+\tau)\right\rangle = \lim_{T\to \infty} \frac{1}{2T}\int_{-T}^T x(t)x(t+\tau) dt \]
  - Again, despite the fact that the notation includes \(t\), \(\left\langle x(t)x(t+\tau)\right\rangle\) is a function of \(\tau\) only, and not of \(t\) (since \(t\) has been integrated out).
- A stochastic process \(X(t)\) is *autocorrelation ergodic* if \[ \text{autocorrelation ergodic:} \quad \left\langle X(t)X(t+\tau)\right\rangle = \text{E}[X(t)X(t+\tau)] \]
  - Since \(\left\langle X(t)X(t+\tau)\right\rangle\) does not depend on \(t\), an autocorrelation ergodic process must have an autocorrelation function which depends only on the time shift \(\tau\): \(\text{E}[X(t)X(t+\tau)]=R_{X}(\tau)\) for all \(t\).
  - In general, \(\left\langle X(t)X(t+\tau)\right\rangle\) is a random function of \(\tau\), but for an autocorrelation ergodic process it is a deterministic function of \(\tau\). Any sample path of an autocorrelation ergodic process will (eventually, in the limit) produce the same time autocorrelation.
- If a stochastic process is both mean ergodic and autocorrelation ergodic, then it is WSS.
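The random-phase cosine from Example 31.1 illustrates autocorrelation ergodicity: the time autocorrelation computed from a single sample path (one fixed phase) works out to \(\tfrac{1}{2}\cos(2\pi\tau)\) no matter which phase was drawn, matching the ensemble autocorrelation \(R_X(\tau)\). A sketch with the integral approximated by a Riemann sum (grid sizes illustrative):

```python
import numpy as np

# A sketch: for X(t) = cos(2 pi t + theta), the time autocorrelation from
# one sample path (one fixed theta) is (1/2) cos(2 pi tau) for every theta,
# matching the ensemble R_X(tau). Grid sizes are illustrative.
def time_autocorr(theta, tau, T=100.0, dt=0.001):
    t = np.arange(-T, T, dt)
    x = np.cos(2 * np.pi * t + theta)
    x_shift = np.cos(2 * np.pi * (t + tau) + theta)
    return np.mean(x * x_shift)          # approximates (1/2T) * integral

for theta in (0.0, 1.0, 4.0):            # three different sample paths
    print(time_autocorr(theta, 0.25))    # all near 0.5 cos(pi/2) = 0
    print(time_autocorr(theta, 1.0))     # all near 0.5 cos(2 pi) = 0.5
```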

[^1]: Technically, for this property to hold \(X(t)\) must also be *mean ergodic*. Unless stated otherwise, you may assume that our random processes are mean ergodic.

[^2]: Technically, for this property to hold \(X(t)\) must also be *mean ergodic*. Unless stated otherwise, you may assume that our random processes are mean ergodic.