Chapter 5 Introduction to JAGS

5.1 WinBUGS

  • BUGS language developed in 1990s
  • There are point-and-click implementations called WinBUGS and OpenBUGS
  • WinBUGS made more of a splash because it made Bayesian estimation accessible
  • WinBUGS can be slow and fail to converge, so other software packages have been developed

5.1.1 BUGS? JAGS?

  • BUGS was first on the scene, standalone or run through R
  • JAGS originally for BUGS models on Unix, but also developed with R in mind
  • BUGS and JAGS differences tend to be minor (Plummer 2012)
  • In practice, JAGS has largely taken over from BUGS

Figure 5.1: The BUGS logo.

5.1.2 STAN?

  • STAN is newest, developed by Gelman et al.
  • STAN fits models in C++, but can also be run through R
  • STAN differs more from the other two (language, amount of code, and fitting approach), but it also offers some improvements (e.g., diagnostics) for more complex models.

Figure 5.2: The STAN logo.

5.2 JAGS

  1. The software can be called from R
  2. Functionally, JAGS handles the MCMC sampling
  3. All you need to do:
  • Specify the model likelihood
  • Specify statistical distributions
  • JAGS does the rest…
  4. JAGS is not a procedural language like R; you can write the model code in any order

5.2.1 General steps to fitting a model in JAGS

  1. Define model in BUGS language (write in R and export as a text file to be read by JAGS)
  2. Bundle data
  3. Provide initial values for parameters
  4. Set MCMC settings
  • # iterations
  • # to thin
  • # to burn-in
  • # of chains
  5. Define parameters to keep track of (i.e., parameters of interest)
  6. Call JAGS from R to fit the model
  7. Assess convergence

5.2.2 Define the BUGS Model: Types of Nodes

Constants—specified in the data file; fixed value(s)

  • example: number of observations, \(n\)

Stochastic—variables given a distribution

  • example: variable ∼ distribution(parameter 1, parameter 2, . . .)
  • \(y[i] \sim N(\mu,\sigma^2)\) is written as y[i] ~ dnorm(mu,tau)
  • Note that the variance is expressed as a precision (\(\tau\)), where \(\tau = \frac{1}{\sigma^2}\)

Deterministic—logical functions of other nodes

  • example: mu[i] <- alpha + beta * x[i]
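
For example, a minimal JAGS fragment (with hypothetical names y, x, n, alpha, beta, and sigma) that contains all three node types might look like this:

    model {
      for (i in 1:n) {                  # n is a constant supplied with the data
        mu[i] <- alpha + beta * x[i]    # deterministic node
        y[i] ~ dnorm(mu[i], tau)        # stochastic node (likelihood)
      }
      alpha ~ dnorm(0, 0.001)           # stochastic nodes (priors)
      beta ~ dnorm(0, 0.001)
      sigma ~ dunif(0, 100)
      tau <- pow(sigma, -2)             # deterministic node: precision = 1/variance
    }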

5.2.3 Arrays and Indexing

Arrays are indexed by terms within square brackets

  • 1 index: y[i] ~ dnorm(mu, tau)
  • 2 indices: mu.p[i,j] <- psi[i,j] * w[i]
  • 3 indices: mu[i,j,k] <- z[i,k] * gamma[i,j,k]

5.2.4 Repeated Structures

Repeated structures are written with for loops.

Recall that n is a constant and needs to be defined!
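
As a minimal sketch (y, mu, tau, and n are placeholder names), a for loop applies the same stochastic statement to each of the n observations:

    for (i in 1:n) {
      y[i] ~ dnorm(mu, tau)   # the index i loops over the n observations
    }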

5.2.5 Likelihood Specification

Assume the response variable \(Y\) with \(n\) observations is stored in a vector \(y\) with elements \(y_i\). The stochastic part of the model is written as:

\[Y \sim \text{distribution}(\phi)\]

where \(\phi\) is the parameter vector of the assumed distribution. The parameter vector is linked to explanatory variables \(X_1, X_2, \ldots, X_p\) through a link function \(h\):

\[\phi = h(\theta, X_1, X_2, \ldots, X_p)\]

where \(\theta\) is a set of parameters used to specify the link function and the final model structure. In generalized linear models, the link function links the parameters of the assumed distribution with a linear combination of predictor variables.

Family             Default link function
-----------------  -----------------------------------
binomial           logit
gaussian           identity
Gamma              inverse
inverse gaussian   \(1/\mu^2\)
poisson            log
quasi              identity (with variance = constant)
quasibinomial      logit
quasipoisson       log
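
For example, a binomial (Bernoulli) response with a logit link might be sketched in JAGS as follows, where y, x, n, beta0, and beta1 are hypothetical names:

    for (i in 1:n) {
      y[i] ~ dbern(p[i])                    # stochastic node: 0/1 response
      logit(p[i]) <- beta0 + beta1 * x[i]   # logit link to a linear predictor
    }
    beta0 ~ dnorm(0, 0.001)                 # vague priors on the regression parameters
    beta1 ~ dnorm(0, 0.001)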

5.2.7 Simple Example

You would like to run a model to draw inference about the mean. The data include 30 observations of fish length, \(y_i,\ i = 1, \ldots, 30\).

Statistical Model

\[y_i \sim N(\mu,\sigma^2)\]

Priors

\[\mu \sim N(0,100)\] \[\sigma \sim U(0,10)\]
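
In JAGS syntax, this model might be written as the sketch below (here the \(N(0,100)\) prior is read as having variance 100, i.e., precision 0.01, and n = 30 is supplied as a constant with the data):

    model {
      # Priors
      mu ~ dnorm(0, 0.01)      # mean 0, precision 0.01 (variance 100)
      sigma ~ dunif(0, 10)
      tau <- pow(sigma, -2)    # convert standard deviation to precision

      # Likelihood
      for (i in 1:n) {
        y[i] ~ dnorm(mu, tau)
      }
    }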

5.2.8 Derived Quantities

One of the nicest things about a Bayesian analysis is that quantities that are functions of the primary parameters, along with their uncertainty (e.g., standard errors or credible intervals), can easily be obtained from the MCMC posterior samples. We just add a line to the JAGS model that computes the derived quantity of interest, and we directly obtain samples from the posterior distributions of not only the original estimated parameters, but also the derived relationships we seek to quantify. In a frequentist mode of inference, this would require the delta method (or similar approximation procedures), which is more complicated and makes more assumptions. In a Bayesian analysis, estimation error is automatically propagated into functions of parameters.

“Derived quantities: One of the greatest things about a Bayesian analysis.” (Kéry 2010)
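
As a sketch, a derived quantity is just one more deterministic node added to the model block; for example, a (hypothetical) coefficient of variation:

    cv <- sigma / mu    # derived quantity: posterior samples of cv come for free

Adding cv to the vector of monitored parameters then gives its full posterior distribution, with no extra work.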

5.3 Convergence

  • Think about priors: do they need to be more informative?
  • Starting/initial values: make sure they are reasonable
  • Standardize data (predictors): \(z_i = (x_i - \bar{x})/s_x\) (see the R sketch after this list)
    • Slope and intercept parameters tend to be correlated
    • MCMC sampling from tightly correlated distributions is difficult (samplers can get stuck)
    • Standardizing reduces this correlation (via mean-centering) and allows us to set vague priors no matter what the scale of the original data is
    • Consider standardizing by 2 SDs (Gelman and Hill 2006)
  • Visually inspect trace plots and posterior densities
  • Gelman-Rubin statistic (\(\hat{R}\))
    • It compares the within-chain variance to the between-chain variance. Is there a chain effect?
    • Values near 1 indicate likely convergence; \(\hat{R} \leq 1.1\) is considered acceptable by some
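
A minimal R sketch of the standardization mentioned above, assuming a predictor vector x:

    z <- (x - mean(x)) / sd(x)          # mean 0, SD 1
    z2 <- (x - mean(x)) / (2 * sd(x))   # or standardize by 2 SDs (Gelman and Hill 2006)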

5.4 Additional Resources

  • Kéry (2010), Chapter 5
  • Lykou and Ntzoufras (2011)
  • Plummer (2012) (JAGS User Manual)
  • Spiegelhalter et al. (2003) (WinBUGS User Manual)

5.5 JAGS in R: Model of the Mean

This tutorial will work through the code needed to run a simple JAGS model, where the mean and variance are estimated using JAGS. The model is likely not very useful on its own, but the objective is to show the preparation and coding that go into a JAGS model. The first thing we need to do is load the R2jags library.
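
For example:

    library(R2jags)   # interface between R and JAGS; provides the jags() function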

5.5.2 Define Model

We will write the model directly in R using JAGS syntax. Note that the model is entirely character data (as far as R is concerned) and is sunk to an external text file. You can write the model at any point before running it, but I find it useful to code the model first, because several of the other preparations we will make depend on the model and are much easier to do with a model to reference.
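
A sketch of this pattern (the file name model.txt is arbitrary; the model is the model of the mean from above):

    # Write the JAGS model as character data and sink it to a text file
    sink("model.txt")
    cat("
    model {
      # Priors
      mu ~ dnorm(0, 0.01)      # mean 0, precision 0.01 (variance 100)
      sigma ~ dunif(0, 10)
      tau <- pow(sigma, -2)

      # Likelihood
      for (i in 1:n) {
        y[i] ~ dnorm(mu, tau)
      }
    }
    ", fill = TRUE)
    sink()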

5.5.3 Bundle Data

We need to bundle our data into a list that JAGS will be able to read. This is very simple in this exercise and generally remains simple in more complex models. Just make sure the list contains every element the model needs, which may extend beyond the data themselves. Using an equals sign (not the assignment operator), match the names the JAGS model expects on the left side to the input data on the right side.
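
A sketch, assuming the 30 observed lengths are stored in a numeric vector y:

    jags_data <- list(y = y,           # the observed fish lengths
                      n = length(y))   # the constant needed by the for loop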

5.5.4 Initial Values

JAGS will generate random starting values if none are specified, and for simple models this should work. However, it is strongly suggested that you get in the habit of providing starting values, because eventually you will need to start the MCMC chains in a high-probability region. As with the data, list starting values for any parameters in the model.
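
A sketch; writing the initial values as a function means each chain gets its own random starting point (the names must match parameters in the model):

    inits <- function() {
      list(mu = rnorm(1, 0, 1),       # random start for the mean
           sigma = runif(1, 0, 10))   # start inside the prior's support
    }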

5.5.5 MCMC Settings

You can adjust or set the MCMC settings directly in the JAGS command, but I recommend against hardcoding these settings, as you will likely want an easy way to modify them. The simple code below adjusts the settings, so the jags() command itself never needs to be modified.
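
A sketch with hypothetical values:

    ni <- 10000   # number of iterations per chain
    nt <- 1       # thinning rate
    nb <- 1000    # number of burn-in iterations to discard
    nc <- 3       # number of chains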

5.5.6 Parameters to Monitor

JAGS models can get complex and often have many parameters (hundreds or thousands). Here we will monitor both parameters in this model, but in the future you may want to specify only some parameters of interest.
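
For this model:

    params <- c("mu", "sigma")   # names must match nodes in the model file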

5.5.7 Run JAGS model

The command jags() runs the model, with the arguments below. Again, we have specified everything outside this command, which makes things a bit more manageable.
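
A sketch, assuming the objects created in the previous steps (jags_data, inits, params, and the MCMC settings):

    out <- jags(data = jags_data,
                inits = inits,
                parameters.to.save = params,
                model.file = "model.txt",
                n.chains = nc,
                n.iter = ni,
                n.burnin = nb,
                n.thin = nt)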

5.5.8 Assess Convergence and Model Output

We will get into this more later, but the first thing to do is print the model object summary. This will give you a convergence statistic as a place to start. We can also use traceplots and density plots to assess convergence visually.
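
For example, assuming the fitted object is named out:

    print(out)   # the summary table includes the Rhat convergence statistic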

Also, get comfortable working with the JAGS model object, which stores a lot of information. Inspect it here.
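
For example (a minimal sketch):

    str(out$BUGSoutput, max.level = 1)   # the component holding the posterior draws and summaries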

How would you get the posterior mean out of the JAGS model object without using the summary function?

How would you plot the posterior by hand (it need not be pretty)?

How would you make traceplots?

How would you make density plots?

There are a number of packages that can also be used to plot JAGS output and simplify these tasks. However, it is not a bad idea to become comfortable with the JAGS model object, as you may eventually encounter a model whose output is complex enough that a canned diagnostic does not work.

References

Gelman, Andrew, and Jennifer Hill. 2006. Data Analysis Using Regression and Multilevel/Hierarchical Models. Cambridge University Press.

Kéry, Marc. 2010. Introduction to WinBUGS for Ecologists: Bayesian Approach to Regression, ANOVA, Mixed Models and Related Analyses. Academic Press.

Lykou, Anastasia, and Ioannis Ntzoufras. 2011. “WinBUGS: A Tutorial.” Wiley Interdisciplinary Reviews: Computational Statistics 3 (5): 385–96.

Plummer, Martyn. 2012. “JAGS Version 3.3.0 User Manual.” International Agency for Research on Cancer, Lyon, France.

Spiegelhalter, David, Andrew Thomas, Nicky Best, and Dave Lunn. 2003. “WinBUGS User Manual.”