Chapter 29 Generalized Linear Mixed-Effects Model

Example: Consider an experiment designed to evaluate the effectiveness of an anti-fungal chemical on plants. A total of 60 plant leaves were randomly assigned to treatment with 0, 5, 10, 15, 20, or 25 units of the anti-fungal chemical, with 10 plant leaves for each amount of anti-fungal chemical. All leaves were infected with a fungus. Following a two-week period, the leaves were studied under a microscope, and the number of infected cells was counted and recorded for each leaf.

Let \(\ell_i \sim N(0, \sigma_\ell^2)\) denote a random effect for the \(i\)th leaf. Suppose \(\log(\lambda_i) = \beta_0 + \beta_1x_i + \ell_i\) and \(y_i\mid \lambda_i \sim \text{Poisson}(\lambda_i)\). Finally, suppose \(\ell_1, \ldots, \ell_n\) are independent and that \(y_1, \ldots, y_n\) are conditionally independent given \(\lambda_1, \ldots, \lambda_n\).
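To make the model concrete, here is a minimal simulation sketch in R; the parameter values (beta0, beta1, sigma.l) are assumed for illustration only and are not estimates from the experiment.

# Simulate data from the Poisson GLMM above (hypothetical parameter values)
set.seed(1)
x = rep(c(0, 5, 10, 15, 20, 25), each = 10)   # dose for each of 60 leaves
beta0 = 4; beta1 = -0.1; sigma.l = 0.5        # assumed values, for illustration
l = rnorm(60, mean = 0, sd = sigma.l)         # leaf random effects
lambda = exp(beta0 + beta1 * x + l)           # conditional Poisson means
y = rpois(60, lambda)                         # simulated infected-cell counts
leaf = factor(1:60)                           # one random-effect level per leaf
d = data.frame(y, x, leaf)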

Lognormal distribution: if \(\log(v) \sim N(\mu, \sigma^2 )\), then \(v\) is said to have a lognormal distribution. The mean and variance of a lognormal distribution are \(E(v) = \exp(\mu + \sigma^2/2)\), \(Var(v) = \exp(2\mu + 2\sigma^2) - \exp(2\mu + \sigma^2)\).
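As a quick sanity check, a short R sketch (with arbitrary values of \(\mu\) and \(\sigma\)) compares the sample mean and variance of simulated lognormal draws to the formulas above:

# Check the lognormal mean/variance formulas by simulation (arbitrary mu, sigma)
set.seed(2)
mu = 1; sigma = 0.7
v = exp(rnorm(1e6, mu, sigma))
c(mean(v), exp(mu + sigma^2 / 2))                             # E(v)
c(var(v), exp(2 * mu + 2 * sigma^2) - exp(2 * mu + sigma^2))  # Var(v)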

Suppose \(\log(v) \sim N(\mu, \sigma^2)\) and \(u\mid v \sim \text{Poisson}(v)\). Then \(E(u) = E(v) = \exp(\mu + \sigma^2/2)\), \[ \begin{aligned} \operatorname{Var}(u) &=E(\operatorname{Var}(u \mid v))+\operatorname{Var}(E(u \mid v))=E(v)+\operatorname{Var}(v) \\ &=\exp \left(\mu+\sigma^{2} / 2\right)+\exp \left(2 \mu+2 \sigma^{2}\right)-\exp \left(2 \mu+\sigma^{2}\right) \\ &=\exp \left(\mu+\sigma^{2} / 2\right)+\left(\exp \left(\sigma^{2}\right)-1\right) \exp \left(2 \mu+\sigma^{2}\right) \\ &=E(u)+\left(\exp \left(\sigma^{2}\right)-1\right)[E(u)]^{2} \end{aligned} \] It follows that \(E(y_i) = \exp(\beta_0 + \beta_1x_i + \sigma_\ell^2/2)\) and \(Var(y_i) = E(y_i) + (\exp(\sigma_\ell^2) - 1)[E(y_i)]^2\). Moreover, we have \[ f_i(y) = P(y_i = y) = \int_{0}^{\infty} \frac{\lambda^{y} \exp (-\lambda)}{y !} \frac{1}{\lambda \sqrt{2 \pi \sigma_{\ell}^{2}}} \exp \left\{\frac{-\left(\log (\lambda)-x_{i}^{\prime} \boldsymbol{\beta}\right)^{2}}{2 \sigma_{\ell}^{2}}\right\} d \lambda. \] There is no closed-form expression for \(f_i(y)\), so the integral must be approximated by numerical methods. The glmer function in the lme4 package uses the Laplace approximation to approximate the integral, but glmer also permits the use of the more general integral approximation method known as adaptive Gauss-Hermite quadrature.
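For intuition, \(f_i(y)\) can be approximated directly for a single leaf with R's integrate function (this is not what glmer does internally; the values of y, xi, beta0, beta1, and sigma.l below are assumed for illustration):

# Numerically approximate f_i(y) = P(y_i = y) for one leaf (assumed parameter values)
fi = function(y, xi, beta0, beta1, sigma.l) {
  integrand = function(lambda) {
    # Poisson pmf times the lognormal density of lambda, as in the integral above
    dpois(y, lambda) * dlnorm(lambda, meanlog = beta0 + beta1 * xi, sdlog = sigma.l)
  }
  integrate(integrand, lower = 0, upper = Inf)$value
}
fi(y = 30, xi = 10, beta0 = 4, beta1 = -0.1, sigma.l = 0.5)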

library(lme4)
o = glmer(y ~ x + (1 | leaf), data = d, family = poisson(link = "log"))  # default nAGQ = 1 (Laplace); d is the data frame from the sketch above
summary(o)

We can set nAGQ = k to choose the number of quadrature points per axis for adaptive Gauss-Hermite quadrature. The default is nAGQ = 1, which corresponds to the Laplace approximation. Larger values of nAGQ increase accuracy but reduce speed.
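For example, to refit the model above with a 10-point adaptive Gauss-Hermite approximation (possible here because there is a single scalar random effect per leaf):

o10 = update(o, nAGQ = 10)  # 10 quadrature points per axis
summary(o10)                # compare estimates to the Laplace (nAGQ = 1) fit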