Sec 4 Modeling dependence

Dependence is easiest to model in the stationary case.

4.1 Stationary

<Def> stationary: dependence does not change with time

<Def> strictly stationary

(X_{t_1},X_{t_2},\dots,X_{t_k})\overset{d}{=}(X_{t_1+h},X_{t_2+h},\dots,X_{t_k+h})\quad\forall\,t_1,\dots,t_k,\;h,\;k

  • joint probability distribution does not change with time
  • k=1
    identically distributed: X_1\overset{d}{=}X_2\overset{d}{=}X_3\overset{d}{=}\cdots
    • means are all identical if means exist
      (rule out trend, seasonality)
    • variances are all identical if variances exist
      (rule out heteroskedasticity)
  • k=2
    (X_t,X_s)\overset{d}{=}(X_{t+h},X_{s+h})\;\forall\,t,s,h\;\Rightarrow\;Cov(X_t,X_s)=Cov(X_{t+h},X_{s+h})
    if variances exist
  • k\geq3: conditions get increasingly complicated
    • e.g. if \{X_t\} is i.i.d., then \{X_t\} is strictly stationary
  • strictly stationary is a very strong assumption

<Def> weakly stationary
also known as “stationary”, “covariance stationary”, “second order stationary”

  • Var(X_t)<\infty
  • E(X_t) does not depend on t
  • Cov(X_t,X_{t+h}) does not depend on t
    • in general it still depends on h
  • first and second order moment properties do not change with time
    • implies all means, variances, covariances exist
    • implies means are identical / constant
      (rule out trend, seasonality)
    • with h=0:
      Cov(X_t,X_t)=Var(X_t)
      implies the variance is constant
      (rule out heteroskedasticity)

If a time series has a trend, seasonality, or heteroskedasticity, then it is not stationary.

4.2 Weakly Stationary

need to check:
1. Var(X_t)<\infty
2. E(X_t) does not depend on t
3. Cov(X_t,X_{t+h}) does not depend on t

e.g.1: \{X_t\} is strictly stationary and Var(X_t)<\infty \Rightarrow weakly stationary

strictly stationary + Var(X_t)<\infty \Rightarrow weakly stationary

e.g.2:
    \{X_t\} independent,
    X_t\sim N(0,1) for t odd,
    X_t=\pm1 each with prob. 0.5 for t even (i.e. discrete uniform on \{-1,1\})
  1. E(X_t)=0 \text{ for t odd}\\ E(X_t)=1\cdot0.5+(-1)\cdot0.5=0\text{ for t even}\\ \therefore E(X_t)\text{ does not depend on t}

  2. Var(X_t)=1\text{ for t odd}\\ \begin{array}{ll}Var(X_t)&=E(X_t^2)-[E(X_t)]^2\\&=1^2\cdot0.5+(-1)^2\cdot0.5-0^2=1\text{ for t even}\end{array}\\ \therefore Var(X_t)<\infty

  3. Cov(X_t,X_{t+h})=\Big\{\begin{array}{ll} 1, &h=0\\ 0, &h\neq0 \end{array} \text{ does not depend on t}\\ \color{red}{\because\{X_t\}\text{ are indep. and }Var(X_t)<\infty}\\ \therefore\{X_t\}\text{ is weakly stationary}

Note: X_t\sim N(0,1) for odd t but X_t is discrete for even t, so the marginal distributions differ across t. Hence \{X_t\} is weakly stationary but not strictly stationary: weakly stationary \nRightarrow strictly stationary.
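A minimal simulation sketch of e.g.2 (Python with numpy; the variable names, seed, and sample size are illustrative, not from the notes), checking the three weak-stationarity conditions on a long simulated path:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
t = np.arange(1, n + 1)

# X_t ~ N(0,1) for odd t; X_t = ±1 with prob. 0.5 each for even t
x = np.where(t % 2 == 1,
             rng.standard_normal(n),
             rng.choice([-1.0, 1.0], size=n))

print(x.mean())                 # ≈ 0, matches E(X_t) = 0
print(x.var())                  # ≈ 1, matches Var(X_t) = 1 < ∞
print(np.mean(x[:-1] * x[1:]))  # ≈ 0, matches Cov(X_t, X_{t+1}) = 0
```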

4.3 White Noise (WN)

A sequence \{W_t\} is called a white noise process, written W_t\sim WN(0,\sigma^2), if it satisfies

  1. E(W_t)=0
  2. Var(W_t)=\sigma^2\;\;\forall t
  3. Cov(W_t,W_s)=0\;\;\text{if}\;\;t\neq s

Assume the error term is:

  • i.i.d. with normal distribution in regression
  • WN in time series
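As a quick sanity check, here is a simulation sketch (assuming Gaussian noise with \sigma=1; in general WN need not be Gaussian or i.i.d.) verifying the three WN properties via sample moments:

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.normal(loc=0.0, scale=1.0, size=200_000)  # W_t ~ WN(0, 1)

print(w.mean())                        # 1. E(W_t) = 0
print(w.var())                         # 2. Var(W_t) = σ² = 1
for h in (1, 2, 5):                    # 3. Cov(W_t, W_s) = 0 for t ≠ s
    print(h, np.mean(w[:-h] * w[h:]))  #    sample autocovariance ≈ 0
```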

e.g.3:
    Z_t\sim WN(0,1)
    X_t=Z_t-0.5Z_{t-1}\hspace{1cm}\text{MA(1)}

MA: moving average
MA(1): X_t=Z_t+a_1Z_{t-1}
MA(2): X_t=Z_t+a_1Z_{t-1}+a_2Z_{t-2}

  1. \begin{array}{ll} Var(X_t)&=Var(Z_t-0.5Z_{t-1})\\ &=Var(Z_t)+(-0.5)^2Var(Z_{t-1})+2\cdot1\cdot(-0.5)\cdot Cov(Z_t,Z_{t-1}) \color{red}{\because WN\therefore Cov(Z_t,Z_{t-1})=0}\\ &=1+0.25+0=1.25<\infty \end{array}
X,Y: r.v.'s \hspace{1cm} A,B: constants\\ Var(AX+BY)=A^2Var(X)+B^2Var(Y)+2ABCov(X,Y)
  2. \begin{array}{ll} E(X_t)&=E(Z_t-0.5Z_{t-1})\\ &=E(Z_t)-0.5E(Z_{t-1})=0-0=0\text{ does not depend on t} \end{array}

  3. \begin{array}{ll} Cov(X_t,X_{t+h})&=Cov(Z_t-0.5Z_{t-1},Z_{t+h}-0.5Z_{t+h-1})\\ &=\Bigg\{\begin{array}{ll} 1.25, &h=0\\ -0.5, &h=\pm1\\ 0, &o.w. \end{array}\text{ does not depend on t} \end{array}\\ \therefore\{X_t\}\text{ is stationary}
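A simulation sketch of e.g.3 (Python/numpy; names, seed, and sample size are illustrative) comparing sample autocovariances with the values derived above:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
z = rng.standard_normal(n + 1)  # Z_0, Z_1, ..., Z_n ~ WN(0, 1)
x = z[1:] - 0.5 * z[:-1]        # X_t = Z_t - 0.5 Z_{t-1}

def sample_autocov(x, h):
    """Sample Cov(X_t, X_{t+h}), mean-corrected."""
    xc = x - x.mean()
    return np.mean(xc[: len(xc) - h] * xc[h:])

print(sample_autocov(x, 0))  # ≈ 1.25 (= Var(X_t))
print(sample_autocov(x, 1))  # ≈ -0.5
print(sample_autocov(x, 2))  # ≈ 0
```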

4.4 Random Walk Process

<Def> random walk process X_t=X_{t-1}+Z_t, \hspace{1cm}Z_t\sim WN(0, \sigma^2)

  • X_t: price at time t
  • X_{t-1}: price at time t-1
  • Z_t=X_t-X_{t-1}
    • If Z_t>0, price \uparrow
    • If Z_t<0, price \downarrow

Assume X_0=0. \begin{array}{ll} Var(X_t)&=Var(X_{t-1}+Z_t)\\ &=Var[(X_{t-2}+Z_{t-1})+Z_t]\\ &\vdots\\ &=Var(X_0+Z_1+Z_2+\cdots+Z_t)\\ &=Var\left(\sum_{j=1}^{t}Z_j\right)=\sum_{j=1}^{t}Var(Z_j)=t\cdot\sigma^2\text{ depends on t} \color{red}{\;\because Z_j\text{ are uncorrelated (WN)}} \end{array}\\ \therefore\{X_t\}\text{ is NOT stationary}
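A simulation sketch (parameters assumed, e.g. \sigma=1) showing the non-stationarity empirically: across many simulated paths, the variance of X_t grows linearly in t.

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, n_steps, sigma = 50_000, 100, 1.0

z = rng.normal(0.0, sigma, size=(n_paths, n_steps))  # Z_t ~ WN(0, σ²)
x = z.cumsum(axis=1)                                 # X_t = X_{t-1} + Z_t, X_0 = 0

for t in (1, 10, 100):
    print(t, x[:, t - 1].var())  # ≈ t·σ²: Var(X_t) depends on t ⇒ not stationary
```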