8.6 Exercises

  1. Simulate the dynamic linear model assuming x_t \sim N(1, 0.1\sigma^2), w_t \sim N(0, 0.5\sigma^2), \mu_t \sim N(0, \sigma^2), \beta_0 = 1, B_0 = 0.5\sigma^2, \sigma^2 = 0.25, and G_t = 1, for t = 1, \dots, 100. Then, perform the filtering recursion fixing \Sigma = 25 \times 0.25, \Omega_1 = 0.5\Sigma (high signal-to-noise ratio) and \Omega_2 = 0.1\Sigma (low signal-to-noise ratio). Plot and compare the results.
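A minimal R sketch for this exercise, assuming the state-space form of Exercise 2 with G_t = 1 (the function kfilter and all object names are ours, not part of the exercise):

```r
# Simulate the DLM and run the Kalman filter under two signal-to-noise ratios.
set.seed(10)
T <- 100; sig2 <- 0.25
x  <- rnorm(T, mean = 1, sd = sqrt(0.1 * sig2))
w  <- rnorm(T, mean = 0, sd = sqrt(0.5 * sig2))
nu <- rnorm(T, mean = 0, sd = sqrt(sig2))
beta <- cumsum(w) + 1                       # states with beta_0 = 1
y <- beta * x + nu                          # observations

kfilter <- function(y, x, Sigma, Omega, m0, C0) {
  T <- length(y); m <- numeric(T); C <- numeric(T)
  for (t in 1:T) {
    a <- if (t == 1) m0 else m[t - 1]             # predicted state mean
    R <- (if (t == 1) C0 else C[t - 1]) + Omega   # predicted state variance
    Q <- x[t]^2 * R + Sigma                       # one-step-ahead forecast variance
    K <- R * x[t] / Q                             # Kalman gain
    m[t] <- a + K * (y[t] - x[t] * a)             # filtered mean
    C[t] <- R - K * x[t] * R                      # filtered variance
  }
  list(m = m, C = C)
}

Sigma <- 25 * 0.25
f1 <- kfilter(y, x, Sigma, Omega = 0.5 * Sigma, m0 = 1, C0 = 0.5 * sig2)  # high SNR
f2 <- kfilter(y, x, Sigma, Omega = 0.1 * Sigma, m0 = 1, C0 = 0.5 * sig2)  # low SNR
plot(beta, type = "l", xlab = "t", ylab = "state")
lines(f1$m, col = "blue"); lines(f2$m, col = "red")
legend("topleft", c("true state", "high SNR", "low SNR"),
       col = c("black", "blue", "red"), lty = 1)
```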

  2. Simulate the dynamic linear model y_t = \beta_t x_t + \mu_t, \beta_t = \beta_{t-1} + w_t, where x_t \sim N(1, 0.1\sigma^2), w_t \sim N(0, 0.5\sigma^2), \mu_t \sim N(0, \sigma^2), \beta_0 = 0, B_0 = 0.5\sigma^2, and \sigma^2 = 1, for t = 1, \dots, 100. Perform the filtering and smoothing recursions from scratch.
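One possible from-scratch implementation in R of the forward filtering and backward smoothing recursions for this model (object names are ours):

```r
# Forward Kalman filter followed by the backward (RTS-type) smoother.
set.seed(20)
T <- 100; sig2 <- 1
x <- rnorm(T, 1, sqrt(0.1 * sig2))
beta <- cumsum(rnorm(T, 0, sqrt(0.5 * sig2)))         # states with beta_0 = 0
y <- beta * x + rnorm(T, 0, sqrt(sig2))

Sigma <- sig2; Omega <- 0.5 * sig2; m0 <- 0; C0 <- 0.5 * sig2
m <- C <- a <- R <- numeric(T)
for (t in 1:T) {                                      # forward filtering
  a[t] <- if (t == 1) m0 else m[t - 1]
  R[t] <- (if (t == 1) C0 else C[t - 1]) + Omega
  Q <- x[t]^2 * R[t] + Sigma
  K <- R[t] * x[t] / Q
  m[t] <- a[t] + K * (y[t] - x[t] * a[t])
  C[t] <- R[t] - K * x[t] * R[t]
}
ms <- m; Cs <- C                                      # backward smoothing
for (t in (T - 1):1) {
  B <- C[t] / R[t + 1]
  ms[t] <- m[t] + B * (ms[t + 1] - a[t + 1])
  Cs[t] <- C[t] + B^2 * (Cs[t + 1] - R[t + 1])
}
plot(beta, type = "l", xlab = "t", ylab = "state")
lines(m, col = "blue"); lines(ms, col = "red")
legend("topleft", c("true state", "filtered", "smoothed"),
       col = c("black", "blue", "red"), lty = 1)
```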

  3. Simulate the process y_t = \alpha z_t + \beta_t x_t + \boldsymbol{h}^{\top}\boldsymbol{\epsilon}_t, \beta_t = \beta_{t-1} + \boldsymbol{H}^{\top}\boldsymbol{\epsilon}_t, where \boldsymbol{h}^{\top} = [1 \ 0], \boldsymbol{H}^{\top} = [0 \ 1/\tau], \boldsymbol{\epsilon}_t \sim N(\boldsymbol{0}_2, \sigma^2 \boldsymbol{I}_2), x_t \sim N(1, 2\sigma^2), z_t \sim N(0, 2\sigma^2), \alpha = 2, \tau^2 = 5, and \sigma^2 = 0.1, for t = 1, \dots, 200. Assume \pi(\beta_0, \alpha, \sigma^2, \tau^2) = \pi(\beta_0)\pi(\alpha)\pi(\sigma^2)\pi(\tau^2), where \sigma^2 \sim IG(\alpha_0/2, \delta_0/2), \tau^2 \sim G(v_{0}/2, v_{0}/2), \alpha \sim N(a_0, A_0), and \beta_0 \sim N(b_0, B_0) such that \alpha_0 = \delta_0 = 1, v_0 = 5, a_0 = 0, A_0 = 1, b_0 = 0, and B_0 = \sigma^2/\tau^2. Program the MCMC algorithm including the simulation smoother.
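A hedged R sketch that simulates the process of this exercise and outlines, in comments, one possible blocking of the Gibbs sampler; the full conditionals and the FFBS simulation smoother are left to the reader:

```r
# Simulate the process; the MCMC itself is only outlined in comments below.
set.seed(30)
T <- 200; alpha <- 2; tau2 <- 5; sig2 <- 0.1
x <- rnorm(T, 1, sqrt(2 * sig2))
z <- rnorm(T, 0, sqrt(2 * sig2))
eps1 <- rnorm(T, 0, sqrt(sig2))               # observation error h' eps_t
eps2 <- rnorm(T, 0, sqrt(sig2)) / sqrt(tau2)  # state error H' eps_t
beta <- cumsum(eps2)                          # beta_0 taken as the prior mean b_0 = 0
y <- alpha * z + beta * x + eps1

# Gibbs sampler outline (one possible blocking):
#  1. beta_{1:T} | alpha, sigma^2, tau^2, y : simulation smoother (FFBS) on the DLM
#     with y_t - alpha * z_t as the adjusted observation.
#  2. alpha | beta_{1:T}, sigma^2, y : Gaussian update from regressing
#     y_t - beta_t * x_t on z_t.
#  3. sigma^2 | ... : inverse-gamma update from the residuals of both equations.
#  4. tau^2 | ... : update implied by the gamma prior and the state increments.
```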

  4. Show that the posterior distribution of \boldsymbol{\phi} \mid \boldsymbol{\beta}, \sigma^2, \boldsymbol{y}, \boldsymbol{X} in the model y_t = \boldsymbol{x}_t^{\top} \boldsymbol{\beta} + \mu_t where \phi(L) \mu_t = \epsilon_t and \epsilon_t \stackrel{iid}{\sim} N(0, \sigma^2) is N(\boldsymbol{\phi}_n, \boldsymbol{\Phi}_n)\mathbb{1}(\boldsymbol{\phi} \in S_{\boldsymbol{\phi}}), where \boldsymbol{\Phi}_n = (\boldsymbol{\Phi}_0^{-1} + \sigma^{-2} \boldsymbol{U}^{\top} \boldsymbol{U})^{-1}, \boldsymbol{\phi}_n = \boldsymbol{\Phi}_n (\boldsymbol{\Phi}_0^{-1} \boldsymbol{\phi}_0 + \sigma^{-2} \boldsymbol{U}^{\top} \boldsymbol{\mu}), and S_{\boldsymbol{\phi}} is the stationary region of \boldsymbol{\phi}.
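A sketch of the argument, in the exercise's notation: conditional on \boldsymbol{\beta} and \sigma^2, define \mu_t = y_t - \boldsymbol{x}_t^{\top}\boldsymbol{\beta}, stack the lagged errors into \boldsymbol{U} so that \boldsymbol{\mu} = \boldsymbol{U}\boldsymbol{\phi} + \boldsymbol{\epsilon}, and take the prior \boldsymbol{\phi} \sim N(\boldsymbol{\phi}_0, \boldsymbol{\Phi}_0)\mathbb{1}(\boldsymbol{\phi} \in S_{\boldsymbol{\phi}}). Then

\pi(\boldsymbol{\phi} \mid \boldsymbol{\beta}, \sigma^2, \boldsymbol{y}, \boldsymbol{X}) \propto \exp\left\{-\frac{1}{2\sigma^2}(\boldsymbol{\mu} - \boldsymbol{U}\boldsymbol{\phi})^{\top}(\boldsymbol{\mu} - \boldsymbol{U}\boldsymbol{\phi}) - \frac{1}{2}(\boldsymbol{\phi} - \boldsymbol{\phi}_0)^{\top}\boldsymbol{\Phi}_0^{-1}(\boldsymbol{\phi} - \boldsymbol{\phi}_0)\right\}\mathbb{1}(\boldsymbol{\phi} \in S_{\boldsymbol{\phi}}),

and completing the square in \boldsymbol{\phi} delivers the stated \boldsymbol{\phi}_n and \boldsymbol{\Phi}_n.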

  5. Show that in the stationary AR(2) process, y_t = \mu + \phi_1 y_{t-1} + \phi_2 y_{t-2} + \epsilon_t, where \epsilon_t \sim N(0, \sigma^2), \mathbb{E}[y_t] = \frac{\mu}{1 - \phi_1 - \phi_2}, and \text{Var}[y_t] = \frac{\sigma^2(1 - \phi_2)}{1 - \phi_2 - \phi_1^2 - \phi_1^2 \phi_2 - \phi_2^2 + \phi_2^3}.
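A sketch: taking expectations on both sides of the stationary process gives \mathbb{E}[y_t] = \mu + (\phi_1 + \phi_2)\mathbb{E}[y_t], which yields the stated mean. For the variance, write \gamma_k = \text{Cov}(y_t, y_{t-k}) and use the Yule-Walker equations

\gamma_0 = \phi_1\gamma_1 + \phi_2\gamma_2 + \sigma^2, \qquad \gamma_1 = \phi_1\gamma_0 + \phi_2\gamma_1, \qquad \gamma_2 = \phi_1\gamma_1 + \phi_2\gamma_0;

solving the last two equations for \gamma_1 and \gamma_2, substituting into the first, and multiplying through by (1 - \phi_2) gives the stated expression for \text{Var}[y_t] = \gamma_0.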

  6. Program a Hamiltonian Monte Carlo algorithm that takes into account the stationarity restrictions on \phi_1 and \phi_2, choosing \epsilon_0 such that the acceptance rate is near 65%.
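A hedged HMC sketch in R under simplifying assumptions of ours: \mu and \sigma^2 are treated as known, the prior on (\phi_1, \phi_2) is flat on the stationarity region, the conditional likelihood (given the first two observations) is used, and the stationarity indicator enters only through the acceptance step; the step size eps plays the role of \epsilon_0 and is tuned until the printed acceptance rate is near 0.65:

```r
# HMC for (phi_1, phi_2) in an AR(2) model with mu and sigma^2 known.
set.seed(60)
T <- 500; mu <- 1; sig2 <- 1; phi_true <- c(0.5, 0.3)
y <- arima.sim(list(ar = phi_true), n = T, sd = sqrt(sig2)) + mu / (1 - sum(phi_true))

stationary <- function(phi) {
  phi[1] + phi[2] < 1 && phi[2] - phi[1] < 1 && abs(phi[2]) < 1
}
logpost <- function(phi) {                 # flat prior on the stationarity region
  if (!stationary(phi)) return(-Inf)
  e <- y[3:T] - mu - phi[1] * y[2:(T - 1)] - phi[2] * y[1:(T - 2)]
  -sum(e^2) / (2 * sig2)
}
grad <- function(phi) {                    # gradient of the unconstrained log-likelihood
  e <- y[3:T] - mu - phi[1] * y[2:(T - 1)] - phi[2] * y[1:(T - 2)]
  c(sum(e * y[2:(T - 1)]), sum(e * y[1:(T - 2)])) / sig2
}
hmc <- function(S = 5000, eps = 0.005, L = 20, phi = c(0, 0)) {
  draws <- matrix(NA, S, 2); acc <- 0
  for (s in 1:S) {
    p <- rnorm(2); phi_p <- phi
    p_p <- p + 0.5 * eps * grad(phi_p)     # initial half step for the momentum
    for (l in 1:L) {                       # leapfrog trajectory
      phi_p <- phi_p + eps * p_p
      p_p <- p_p + (if (l < L) eps else 0.5 * eps) * grad(phi_p)
    }
    logr <- logpost(phi_p) - logpost(phi) - 0.5 * sum(p_p^2) + 0.5 * sum(p^2)
    if (log(runif(1)) < logr) { phi <- phi_p; acc <- acc + 1 }
    draws[s, ] <- phi
  }
  cat("acceptance rate:", acc / S, "\n")   # tune eps (epsilon_0) until this is near 0.65
  draws
}
draws <- hmc()
```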

  7. Stochastic volatility model

    • Program a sequential importance sampling (SIS) algorithm from scratch for the vanilla stochastic volatility model, setting \mu = -10, \phi = 0.95, \sigma = 0.3, and T = 250. Check what happens to its performance.
    • Modify the sequential Monte Carlo (SMC) algorithm to perform multinomial resampling when the effective sample size falls below 50% of the initial number of particles; a sketch of both variants follows these items.
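A minimal R sketch of both variants, assuming the vanilla SV specification y_t = \exp(h_t/2)\epsilon_t, h_t = \mu + \phi(h_{t-1} - \mu) + \sigma\eta_t with standard normal shocks (the bootstrap proposal and all object names are our choices):

```r
# SIS versus SMC with ESS-triggered multinomial resampling for the SV model.
set.seed(70)
T <- 250; mu <- -10; phi <- 0.95; sigma <- 0.3; N <- 1000
h <- numeric(T); h[1] <- rnorm(1, mu, sigma / sqrt(1 - phi^2))
for (t in 2:T) h[t] <- mu + phi * (h[t - 1] - mu) + sigma * rnorm(1)
y <- exp(h / 2) * rnorm(T)

pf <- function(y, resample = FALSE) {
  T <- length(y)
  part <- rnorm(N, mu, sigma / sqrt(1 - phi^2))   # initial particles from the stationary law
  logw <- rep(0, N); ess <- numeric(T); hhat <- numeric(T)
  for (t in 1:T) {
    part <- mu + phi * (part - mu) + sigma * rnorm(N)          # propagate (bootstrap proposal)
    logw <- logw + dnorm(y[t], 0, exp(part / 2), log = TRUE)   # weight by the likelihood
    w <- exp(logw - max(logw)); w <- w / sum(w)
    ess[t] <- 1 / sum(w^2)                                     # effective sample size
    hhat[t] <- sum(w * part)                                   # filtered log-volatility
    if (resample && ess[t] < 0.5 * N) {                        # multinomial resampling
      part <- part[sample(N, N, replace = TRUE, prob = w)]
      logw <- rep(0, N)
    }
  }
  list(hhat = hhat, ess = ess)
}
sis <- pf(y, resample = FALSE)   # pure SIS: the ESS collapses as t grows
smc <- pf(y, resample = TRUE)
plot(sis$ess, type = "l", xlab = "t", ylab = "ESS", ylim = c(0, N))
lines(smc$ess, col = "red")
```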
  8. Estimate the vanilla stochastic volatility model using the dataset 17ExcRate.csv, provided by Ramírez-Hassan and Frazier (2024), which contains the exchange rate log daily returns for USD/EUR, USD/GBP, and GBP/EUR, spanning one year before and after the WHO declared the COVID-19 pandemic on 11 March 2020.
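A hedged sketch using the stochvol package in R; the column name USDEUR is an assumption of ours, so inspect names(rates) after reading the file:

```r
library(stochvol)                    # MCMC sampler for the vanilla SV model
rates <- read.csv("17ExcRate.csv")
y <- rates$USDEUR                    # hypothetical column: USD/EUR log daily returns
y <- y - mean(y)                     # de-mean the returns (svsample requires nonzero observations)
fit <- svsample(y, draws = 10000, burnin = 2000)
summary(fit)                         # posterior summaries of mu, phi, and sigma
plot(fit)                            # estimated volatility paths and parameter draws
```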

  9. Simulate the VAR(1) process: \begin{bmatrix} y_{1t}\\ y_{2t}\\ y_{3t}\\ \end{bmatrix} = \begin{bmatrix} 2.8\\ 2.2\\ 1.3\\ \end{bmatrix} + \begin{bmatrix} 0.5 & 0 & 0\\ 0.1 & 0.1 & 0.3\\ 0 & 0.2 & 0.3\\ \end{bmatrix} \begin{bmatrix} y_{1t-1}\\ y_{2t-1}\\ y_{3t-1}\\ \end{bmatrix} + \begin{bmatrix} \mu_{1t}\\ \mu_{2t}\\ \mu_{3t}\\ \end{bmatrix}, where the errors are distributed N(\boldsymbol{0}, \boldsymbol{\Sigma}) with \boldsymbol{\Sigma} = \begin{bmatrix} 2.25 & 0 & 0\\ 0 & 1 & 0.5\\ 0 & 0.5 & 0.74\\ \end{bmatrix}.

    • Use vague independent priors setting \boldsymbol{\beta}_0 = \boldsymbol{0}, \boldsymbol{B}_0 = 100\boldsymbol{I}, \boldsymbol{V}_0 = 5\boldsymbol{I}, and \alpha_0 = 3, and estimate a VAR(1) model using the rsurGibbs function from the package bayesm. Then, program the Gibbs sampler from scratch and compare the results; a sketch of the simulation and the rsurGibbs setup follows this item.
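A hedged R sketch of the simulation and the rsurGibbs setup, treating the VAR(1) as a SUR system in which each equation is regressed on an intercept and the three first lags; the mapping of the exercise's (\boldsymbol{V}_0, \alpha_0) to rsurGibbs's (V, nu) arguments is our assumption:

```r
# Simulate the VAR(1) and estimate it equation by equation with bayesm::rsurGibbs.
set.seed(90)
library(bayesm)
T <- 500
b <- c(2.8, 2.2, 1.3)
A <- matrix(c(0.5, 0, 0,
              0.1, 0.1, 0.3,
              0, 0.2, 0.3), nrow = 3, byrow = TRUE)
Sig <- matrix(c(2.25, 0, 0,
                0, 1, 0.5,
                0, 0.5, 0.74), nrow = 3)
U <- matrix(rnorm(T * 3), T, 3) %*% chol(Sig)        # errors with covariance Sigma
Y <- matrix(0, T, 3)
Y[1, ] <- solve(diag(3) - A, b)                      # start at the unconditional mean
for (t in 2:T) Y[t, ] <- b + A %*% Y[t - 1, ] + U[t, ]

X <- cbind(1, Y[1:(T - 1), ])                        # intercept and the three first lags
regdata <- lapply(1:3, function(i) list(y = Y[2:T, i], X = X))
k <- 3 * ncol(X)                                     # total number of coefficients
out <- rsurGibbs(Data = list(regdata = regdata),
                 Prior = list(betabar = rep(0, k), A = diag(k) / 100,
                              nu = 3, V = 5 * diag(3)),
                 Mcmc = list(R = 10000, keep = 1))
summary(out$betadraw)                                # posterior summaries of the coefficients
```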

References

Ramírez-Hassan, Andrés, and David T. Frazier. 2024. “Testing Model Specification in Approximate Bayesian Computation Using Asymptotic Properties.” Journal of Computational and Graphical Statistics 33 (3): 1–14.