Chapter 10 The Aitken Model

  • Orthogonal Matrices: A square matrix \(P\) is said to be orthogonal if and only if \(P'P = I\) (equivalently, \(P^{-1} = P'\), so \(PP' = I\) as well)
  • Spectral Decomposition Theorem: An \(n\times n\) symmetric matrix \(H\) may be decomposed as \(H = P\Lambda P' = \sum_{i=1}^n \lambda_i p_ip_i'\) (see the decomposition sketch after this list)
    • \(P\) is an \(n\times n\) orthogonal matrix whose columns \(p_1,\ldots, p_n\) are the orthonormal eigenvectors of \(H\)
    • \(\Lambda = \text{diag}(\lambda_1, \ldots, \lambda_n)\) is a diagonal matrix whose diagonal entries are the eigenvalues of \(H\)
  • Symmetric Square Root Matrix: If \(H\) is an \(n\times n\) symmetric, NND matrix, then there exists a symmetric, NND matrix \(B = P\Lambda^{1/2}P'\) such that \(BB = H\); we write \(B = H^{1/2}\) (see the square-root sketch after this list).
  • Aitken Model: \(y = X\beta + \epsilon\), \(E(\epsilon) = 0\), \(Var(\epsilon) = \sigma^2 V\), where \(V\) is a known positive definite variance matrix (the GLS sketch after this list works this model through numerically).
    • Premultiplying by \(V^{-1/2}\), the inverse of the symmetric square root of \(V\), gives \(V^{-1/2}y = V^{-1/2}X\beta + V^{-1/2}\epsilon\)
    • With \(z = V^{-1/2}y\), \(W = V^{-1/2}X\) and \(\delta = V^{-1/2}\epsilon\), we have \(z = W\beta + \delta\), \(E(\delta) = 0\) and \(Var(\delta) = V^{-1/2}(\sigma^2V)V^{-1/2} = \sigma^2I\), which is a Gauss-Markov Model.
    • \(\hat z = W(W'W)^-W'z = V^{-1/2}X(X'V^{-1}X)^-X'V^{-1}y\), so that \(\hat y = V^{1/2}\hat z = X(X'V^{-1}X)^-X'V^{-1}y\).
    • If \(C\beta\) is estimable, the Gauss-Markov theorem applied to the transformed model says its BLUE is the OLS estimator from that model: \(C(W'W)^-W'z = C(X'V^{-1}X)^-X'V^{-1}y \equiv C\hat\beta_{V}\), called the Generalized Least Squares (GLS) estimator.
    • \(\hat\beta_V = (X'V^{-1}X)^-X'V^{-1}y\) is a solution to the Aitken Equations \(X'V^{-1}X\beta = X'V^{-1}y\). When \(V\) is diagonal, \(\hat\beta_V\) is called the weighted least squares estimator.
    • An unbiased estimator of \(\sigma^2\) is \[ \frac{z'(I-P_W)z}{n - rank(W)} = \frac{\|(I- V^{-1/2}X(X'V^{-1}X)^-X'V^{-1/2})V^{-1/2}y\|^2}{n - rank(V^{-1/2}X)} = \frac{\|V^{-1/2}(y - X\hat\beta_V)\|^2}{n - r}, \] where \(r = rank(X) = rank(V^{-1/2}X)\) since \(V^{-1/2}\) is nonsingular.
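A quick numerical check of the orthogonality and spectral-decomposition facts above. This is a minimal sketch in NumPy; the matrix \(H\) is an arbitrary symmetric example, not anything from the text.

```python
import numpy as np

# Arbitrary symmetric matrix H (any real symmetric matrix works here).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 2.0]])
H = (A + A.T) / 2  # symmetrize

# eigh handles symmetric matrices: lam holds the eigenvalues,
# and the columns of P are the corresponding orthonormal eigenvectors.
lam, P = np.linalg.eigh(H)
Lambda = np.diag(lam)

print(np.allclose(P.T @ P, np.eye(3)))       # P'P = I, so P is orthogonal
print(np.allclose(P @ Lambda @ P.T, H))      # H = P Lambda P'
print(np.allclose(sum(lam[i] * np.outer(P[:, i], P[:, i]) for i in range(3)), H))
                                             # H = sum_i lambda_i p_i p_i'
```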
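The symmetric square root can be built the same way: the eigenvalues of a NND matrix are nonnegative, so \(\Lambda^{1/2}\) is real. Again a sketch with a made-up \(H\):

```python
import numpy as np

# H = A'A is symmetric and NND for any A (here a random 5 x 3 example).
A = np.random.default_rng(0).normal(size=(5, 3))
H = A.T @ A

lam, P = np.linalg.eigh(H)                      # lam >= 0 because H is NND
lam = np.clip(lam, 0, None)                     # guard against tiny negative round-off
B = P @ np.diag(np.sqrt(lam)) @ P.T             # B = P Lambda^{1/2} P'

print(np.allclose(B, B.T))                      # B is symmetric
print(np.allclose(B @ B, H))                    # BB = H, i.e. B = H^{1/2}
```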
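Finally, a sketch of the GLS computations on simulated data. The dimensions, the AR(1)-style \(V\), and the full-rank \(X\) are illustrative choices, not from the text; because this \(X\) has full column rank, the generalized inverses above become ordinary inverses, so `np.linalg.solve` suffices. The sketch transforms the model with \(V^{-1/2}\), fits the resulting Gauss-Markov model by OLS, and checks the direct Aitken-equation solution, the fitted values, and the unbiased estimator of \(\sigma^2\).

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma2 = 50, 2.0

# Design matrix X with full column rank (intercept plus two covariates).
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta = np.array([1.0, -2.0, 0.5])

# Known positive definite V: an AR(1)-style correlation matrix, chosen for illustration.
rho = 0.6
V = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# Symmetric square root of V and its inverse, via the spectral decomposition.
lam, P = np.linalg.eigh(V)
V_half = P @ np.diag(np.sqrt(lam)) @ P.T        # V^{1/2}
V_ihalf = P @ np.diag(1 / np.sqrt(lam)) @ P.T   # V^{-1/2}

# Simulate y from the Aitken model: Var(eps) = sigma^2 V.
y = X @ beta + V_half @ rng.normal(size=n) * np.sqrt(sigma2)

# Transform to a Gauss-Markov model: z = W beta + delta, Var(delta) = sigma^2 I.
z, W = V_ihalf @ y, V_ihalf @ X

# OLS in the transformed model = GLS in the original model.
beta_V = np.linalg.solve(W.T @ W, W.T @ z)

# Same estimate via the Aitken equations X'V^{-1}X beta = X'V^{-1}y.
Vinv = np.linalg.inv(V)
beta_V_direct = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
print(np.allclose(beta_V, beta_V_direct))       # True

# Fitted values y_hat = X (X'V^{-1}X)^- X'V^{-1} y = X beta_V.
y_hat = X @ beta_V

# Unbiased estimator of sigma^2: ||V^{-1/2}(y - X beta_V)||^2 / (n - r), r = rank(X).
r = np.linalg.matrix_rank(X)
resid = V_ihalf @ (y - y_hat)
sigma2_hat = resid @ resid / (n - r)
print(sigma2_hat)                               # close to sigma2 = 2.0
```

When \(V\) is diagonal, \(V^{-1/2}\) simply rescales the \(i\)th row of \(y\) and \(X\) by \(1/\sqrt{v_{ii}}\), and the same computation gives the weighted least squares estimator.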