31  Stationary Stochastic Processes

Example 31.1 Determine whether each of the following stochastic processes is wide-sense stationary (WSS). (A simulation sketch for exploring two of these processes numerically appears after the list.)
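As a reminder of the two conditions being checked in each part below, a process \(X(t)\) is WSS when its mean is constant in time and its autocorrelation depends only on the time shift \(\tau\): \[ \text{E}[X(t)] = \mu_X \text{ for all } t, \qquad \text{E}[X(t)X(t+\tau)] = R_{X}(\tau) \text{ for all } t. \]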

  1. A symmetric simple random walk (like in the Harry/Tom example).




  2. \(X_n\) is a discrete-time white noise process: \(\ldots\), \(X_{-2}\), \(X_{-1}\), \(X_{0}\), \(X_{1}\), \(X_{2}\), \(\ldots\) is a sequence of independent and identically distributed (i.i.d.) random variables (e.g., \(X_n\sim N(0, 1)\) for all \(n\)).




  3. \(X(t) = A \cos(2\pi t)\), where the amplitude \(A\) is a random variable (with non-zero variance).




  4. \(X(t) = \cos(2\pi t + \Theta)\), where the random phase shift \(\Theta\) has a Uniform\((0, 2\pi)\) distribution.




  5. \(X(t) =A \cos(2\pi t+\Theta)\), where \(A\) and \(\Theta\) are independent random variables, and \(\Theta\) has a Uniform\((0, 2\pi)\) distribution.




  6. \(X(t) = s(t) + N(t)\), where \(s(t)\) is a non-random signal (e.g., \(s(t)=3\cos(2\pi t+\pi/2)\)) and \(N(t)\) is Gaussian random noise: at any time \(t\), \(N(t)\) has a Normal (Gaussian) distribution with mean 0 and standard deviation \(\sigma\), and \(N(t)\) and \(N(s)\) are independent for any \(t \neq s\).




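The parts above can also be explored numerically. The sketch below is illustrative only: the distribution chosen for the amplitude \(A\), the number of simulated paths, and the grid of time points are my own assumptions, not part of the example. It simulates many realizations of the processes in parts 3 and 4 and estimates the ensemble mean \(\text{E}[X(t)]\) at a few time points; an estimated mean that clearly changes with \(t\) is enough to rule out wide-sense stationarity, while a constant estimated mean by itself is not enough to confirm it.

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths = 100_000                       # number of simulated realizations
t = np.array([0.0, 0.125, 0.25, 0.5])   # a few time points to compare

# Part 3: X(t) = A cos(2*pi*t), with A random.
# The choice A ~ N(1, 1) here is arbitrary (the example only requires Var[A] > 0).
A = rng.normal(loc=1.0, scale=1.0, size=(n_paths, 1))
X3 = A * np.cos(2 * np.pi * t)

# Part 4: X(t) = cos(2*pi*t + Theta), with Theta ~ Uniform(0, 2*pi).
Theta = rng.uniform(0.0, 2 * np.pi, size=(n_paths, 1))
X4 = np.cos(2 * np.pi * t + Theta)

print("t:               ", t)
print("part 3 mean est.:", X3.mean(axis=0).round(3))  # changes with t
print("part 4 mean est.:", X4.mean(axis=0).round(3))  # approximately 0 at every t
```

If the amplitude in part 3 happened to have mean zero, the estimated means alone would not separate the two processes; in that case, comparing estimates of \(\text{E}[X(t)X(t+\tau)]\) at different \(t\) (holding \(\tau\) fixed) would be the natural next check.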
Example 31.2 Suppose \(X(t)\) is a WSS process with autocorrelation function \[ R_{X}(\tau) = 4 + \frac{25}{1+6\tau^2} \] (Some standard identities relating the quantities below are collected after this list.)

  1. Find \(\text{E}[X^2(t)]\).




  2. Find \(\text{E}[X(t)]\).




  3. Find the autocovariance function.




  4. Find the covariance of \(X(15)\) and \(X(18)\).




  5. Find \(\text{Var}[X(t)]\).




  6. Do we have enough information to compute \(\text{P}(X(15) > 3)\)?




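For reference, the following standard identities for a WSS process (writing \(C_X(\tau)\) for the autocovariance function) connect the quantities asked for above. The limit relation additionally assumes that \(X(t)\) and \(X(t+\tau)\) become uncorrelated as \(|\tau| \to \infty\); as the footnotes at the end of this section point out, technically \(X(t)\) must also be mean ergodic for that step. \[ \text{E}[X^2(t)] = R_{X}(0), \qquad \mu_X^2 = \lim_{|\tau|\to\infty} R_{X}(\tau), \qquad C_X(\tau) = R_{X}(\tau) - \mu_X^2, \] \[ \text{Cov}(X(s), X(t)) = C_X(t-s), \qquad \text{Var}[X(t)] = C_X(0). \]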
31.1 Some notes about ergodic processes

  • Roughly speaking, an ergodic process is one whose statistical (ensemble) properties match the corresponding temporal (time-related) properties.
  • Engineers often assume their noise models to be ergodic, so that their statistical properties can be deduced by analyzing just a single sample path of the random process.
  • Roughly speaking, an ergodic process is stationary, but a stationary process is not necessarily ergodic.
  • An example of an ergodic process: The statistical properties of rolling one die 100 times are the same as the statistical properties of rolling 100 dice once each.
    • One die, 100 times = evolution of one realization (sample path) of a process over time
    • 100 dice, once each = 100 realizations of the process at a single time
    • Since time properties match ensemble properties, the process of die-rolling is ergodic.
    • Die-rolling is also stationary, since the statistical properties of the dice do not change over time (i.e., as we continue to roll the dice).
  • A stationary process which is not ergodic: Let \(X\) be a random variable with some distribution (say N(0, 1)) and define \(X(t)=X\) for all \(t\).
    • \(X(t)\) is stationary, since the distribution of \(X\) obviously doesn’t change with time.
    • \(X(t)\) is not ergodic: a single realization is a constant function, \(x(t) = x\) for all \(t\), and this single constant carries none of the statistical properties of the random variable \(X\) (see the simulation sketch after this list).
  • The time average of a function \(x(t)\) is denoted \(\left\langle x(t)\right\rangle\) and defined as \[ \left\langle x(t)\right\rangle = \lim_{T\to \infty} \frac{1}{2T}\int_{-T}^T x(t) dt \]
  • Despite the fact that the notation includes \(t\), \(\left\langle x(t)\right\rangle\) is a number, not a function of \(t\) (since \(t\) has been integrated out)
  • A stochastic process \(X(t)\) is mean ergodic if its time average and ensemble average are the same: \[ \text{mean ergodic:}\quad \left\langle X(t)\right\rangle = \text{E}[X(t)] \]
  • Since \(\left\langle X(t)\right\rangle\) does not depend on \(t\), a mean ergodic process must have a constant ensemble mean, \(\text{E}[X(t)]=\mu_X\) for all \(t\).
  • In general, \(\left\langle X(t)\right\rangle\) is a random variable but for a mean ergodic process it is a constant. Any sample path of a mean ergodic process will (eventually, in the limit) produce the same time average.
  • The time autocorrelation of a function \(x(t)\) is \[ \left\langle x(t)x(t+\tau)\right\rangle = \lim_{T\to \infty} \frac{1}{2T}\int_{-T}^T x(t)x(t+\tau) dt \]
  • Again, despite the fact that the notation includes \(t\), \(\left\langle x(t)x(t+\tau)\right\rangle\) is a function of \(\tau\) only, and not of \(t\) (since \(t\) has been integrated out)
  • A stochastic process \(X(t)\) is autocorrelation ergodic if \[ \text{autocorrelation ergodic:} \quad \left\langle X(t)X(t+\tau)\right\rangle = \text{E}[X(t)X(t+\tau)] \] Since \(\left\langle X(t)X(t+\tau)\right\rangle\) does not depend on \(t\), an autocorrelation ergodic process must have an autocorrelation function which depends only on the time shift \(\tau\), \(\text{E}[X(t)X(t+\tau)]=R_{X}(\tau)\) for all \(t\).
  • In general, \(\left\langle X(t)X(t+\tau)\right\rangle\) is a random function of \(\tau\) but for an autocorrelation ergodic process it is a deterministic function of \(\tau\). Any sample path of an autocorrelation ergodic process will (eventually, in the limit) produce the same time autocorrelation.
  • If a stochastic process is both mean ergodic and autocorrelation ergodic, then it is WSS.
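As a small numerical companion to the bullets above (a sketch only: the averaging window \(T = 500\), the step size, and the number of paths are my own choices), the code below approximates the time average \(\left\langle x(t)\right\rangle\) of individual sample paths for two processes discussed in this section: the random-phase cosine \(X(t) = \cos(2\pi t + \Theta)\), whose time average comes out near 0 (the ensemble mean) on every path, and the constant process \(X(t) = X\) with \(X \sim N(0, 1)\), whose time average is simply the realized value of \(X\) and so differs from path to path.

```python
import numpy as np

rng = np.random.default_rng(1)

def time_average(x_of_t, T=500.0, dt=0.01):
    """Approximate <x(t)> = lim (1/2T) * integral from -T to T of x(t) dt on a finite window."""
    t = np.arange(-T, T, dt)
    return np.mean(x_of_t(t))  # the sample mean is the Riemann-sum approximation of (1/2T) * integral

for path in range(3):
    # Random-phase cosine: each sample path fixes one value of Theta.
    theta = rng.uniform(0.0, 2 * np.pi)
    avg_cos = time_average(lambda t: np.cos(2 * np.pi * t + theta))

    # Constant process X(t) = X: each sample path is a horizontal line at the realized X.
    x = rng.normal()
    avg_const = time_average(lambda t: np.full_like(t, x))

    print(f"path {path}: cosine time avg = {avg_cos:+.4f}, constant time avg = {avg_const:+.4f}")
```

The cosine averages agree across paths (and with the ensemble mean of 0), which is the behavior of a mean ergodic process; the constant-process averages do not, which is why \(X(t) = X\) is stationary but not ergodic.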

  1. Technically, for this property to hold \(X(t)\) must also be mean ergodic. Unless stated otherwise, you may assume that our random processes are mean ergodic.↩︎

  2. Technically, for this property to hold \(X(t)\) must also be mean ergodic. Unless stated otherwise, you may assume that our random processes are mean ergodic.↩︎