# 11 September 22

## 11.1 Announcements

• Assignment #3 is posted and due next week.

## 11.2 Extreme precipitation in Kansas

• What we will need to learn
  • How to use R as a geographic information system
  • New general tools from statistics
    • Gaussian process
    • Metropolis and Metropolis–Hastings algorithms
    • Gibbs sampler
  • How to use the hierarchical modeling framework to describe Kriging
    • Hierarchical Bayesian model vs. “empirical” hierarchical model
  • Specialized language used in spatial statistics (e.g., range, nugget, variogram)

## 11.3 Gibbs sampler

• History
  • Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images (Geman and Geman 1984)
  • Sampling-Based Approaches to Calculating Marginal Densities (Gelfand and Smith 1990)
  • The Gibbs sampler is responsible for the recent “Bayesian revolution”
• Gibbs sampler
  • See pg. 169 in the book.
  • In many cases the distribution we want to sample from is multivariate.
  • If we can sample from the univariate full-conditional distribution for each random variable of a multivariate distribution, then we can use a Gibbs sampler to obtain dependent samples from the multivariate distribution.

### 11.3.1 Gibbs sampler: simple example

Suppose
$$\boldsymbol{\theta}\sim\text{N}\left(\mathbf{0},\begin{bmatrix} 1 & \rho\\ \rho & 1 \end{bmatrix}\right),$$
where $\boldsymbol{\theta}\equiv(\theta_{1},\theta_{2})'$. Suppose we didn’t know how to sample from this bivariate normal distribution directly, but we could obtain the full-conditional distributions $[\theta_{1}|\theta_{2}]$ and $[\theta_{2}|\theta_{1}]$, which are
$$\theta_{1}|\theta_{2}\sim\text{N}(\rho\theta_{2},\,1-\rho^{2}),$$
$$\theta_{2}|\theta_{1}\sim\text{N}(\rho\theta_{1},\,1-\rho^{2}).$$
It turns out that if we iteratively sample from $[\theta_{1}|\theta_{2}]$ and then $[\theta_{2}|\theta_{1}]$ (or the other way around), we obtain samples from $[\boldsymbol{\theta}]$.
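The two-step iteration described above can be sketched in a few lines. This is a minimal illustration in Python (the course uses R, but the logic translates directly); the function name and the choices of `rho`, iteration count, and starting value are mine, not from the notes:

```python
import numpy as np

# Gibbs sampler for a bivariate normal with mean 0, unit variances, and
# correlation rho, using the two full-conditionals from the example:
#   theta1 | theta2 ~ N(rho * theta2, 1 - rho^2)
#   theta2 | theta1 ~ N(rho * theta1, 1 - rho^2)
def gibbs_bvn(rho, n_iter=20000, seed=1):
    rng = np.random.default_rng(seed)
    theta = np.zeros(2)               # arbitrary starting value
    draws = np.empty((n_iter, 2))
    sd = np.sqrt(1 - rho**2)          # conditional standard deviation
    for k in range(n_iter):
        theta[0] = rng.normal(rho * theta[1], sd)  # draw from [theta1 | theta2]
        theta[1] = rng.normal(rho * theta[0], sd)  # draw from [theta2 | theta1]
        draws[k] = theta
    return draws

draws = gibbs_bvn(rho=0.8)
# After discarding burn-in, the sample correlation should be close to rho.
print(np.corrcoef(draws[5000:, 0], draws[5000:, 1])[0, 1])
```

Note that each full scan updates $\theta_{1}$ using the *most recent* $\theta_{2}$ and vice versa; the resulting draws are dependent, but their stationary distribution is the joint $[\boldsymbol{\theta}]$.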

## 11.5 Metropolis-Hastings algorithm

• Understanding the Metropolis-Hastings algorithm (Chib and Greenberg 1995)
• Similar to the Metropolis algorithm, but the proposal distribution does not have to be symmetric.
• Modifications to the Metropolis algorithm
  • $$R=\text{min}\Big(1,\frac{f(\phi^{*})}{f(\phi^{(k)})}\times\frac{[\phi^{(k)}]}{[\phi^{*}]}\Big)$$ where $[\phi^{(k)}]$ and $[\phi^{*}]$ are the pdf/pmf of the proposal distribution evaluated at $\phi^{(k)}$ and $\phi^{*}$, respectively. When the proposal distribution is symmetric, $\frac{[\phi^{(k)}]}{[\phi^{*}]}=1$ and $R$ reduces to the Metropolis acceptance probability.
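To make the proposal-density correction concrete, here is a minimal Python sketch (the course uses R; the logic carries over). The target and proposal are my illustrative choices, not from the notes: a Gamma(2, 1) target, $f(\phi)\propto\phi e^{-\phi}$, with a lognormal random-walk proposal, which is asymmetric and therefore needs the correction ratio:

```python
import numpy as np

# Metropolis-Hastings with an asymmetric (lognormal random-walk) proposal.
# Hypothetical target for illustration: Gamma(shape=2, rate=1),
# f(phi) proportional to phi * exp(-phi) for phi > 0.
def log_target(phi):
    return np.log(phi) - phi          # log f(phi), up to an additive constant

def mh(n_iter=30000, prop_sd=0.8, seed=2):
    rng = np.random.default_rng(seed)
    phi = 1.0                         # starting value
    draws = np.empty(n_iter)
    for k in range(n_iter):
        # Lognormal proposal centered (in log space) at the current value;
        # it keeps phi positive but is not symmetric in phi.
        phi_star = phi * np.exp(rng.normal(0.0, prop_sd))
        # log of f(phi*)/f(phi^(k)) * q(phi^(k)|phi*)/q(phi*|phi^(k));
        # for this lognormal proposal the q-ratio simplifies to phi*/phi
        # (the Gaussian kernel in log space cancels, leaving the 1/phi factors).
        log_R = (log_target(phi_star) - log_target(phi)
                 + np.log(phi_star) - np.log(phi))
        if np.log(rng.uniform()) < min(0.0, log_R):
            phi = phi_star            # accept the proposal
        draws[k] = phi                # otherwise keep the current value
    return draws

draws = mh()
# After burn-in, the sample mean should be near the Gamma(2,1) mean of 2.
print(draws[5000:].mean())
```

Working on the log scale avoids numerical underflow in the density ratios; dropping the correction term here would leave a chain whose stationary distribution is not the target.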