Chapter 2 Review of Linear Models

  • General Linear Model (GLM): suppose $y = X\beta + \epsilon$ with $E(\epsilon) = 0$

    • distribution of $y$ is left unspecified
    • $E(y) \in \mathcal{C}(X)$, the column space of $X$
  • Ordinary Least Squares Estimator (OLSE): $\hat{y} = P_X y = X(X'X)^- X'y$

  • Orthogonal projection matrix PX:

    • $P_X$ is symmetric and idempotent
    • $P_X X = X$ and $X'P_X = X'$
    • $\mathrm{rank}(X) = \mathrm{rank}(P_X) = \mathrm{tr}(P_X)$
    • other properties:
      • $X'XA = X'XB \iff XA = XB$
      • $X(X'X)^- X'X = X$ and $X'X(X'X)^- X' = X'$
      • if $A' = A$ and $AGA = A$, then $AG'A = A$
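A minimal numerical check of these projection-matrix properties (the design matrix and data here are made up, and NumPy's pseudoinverse is used as one choice of generalized inverse $(X'X)^-$):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 3))
X = np.column_stack([X, X[:, 0] + X[:, 1]])  # add a dependent column so X'X is singular

XtX_ginv = np.linalg.pinv(X.T @ X)           # one generalized inverse of X'X
P = X @ XtX_ginv @ X.T                       # P_X = X (X'X)^- X'

assert np.allclose(P, P.T)                   # symmetric
assert np.allclose(P @ P, P)                 # idempotent
assert np.allclose(P @ X, X)                 # P_X X = X
assert np.isclose(np.trace(P), np.linalg.matrix_rank(X))  # tr(P_X) = rank(X)

# the OLSE fitted values P_X y agree with a least-squares fit
y = rng.standard_normal(10)
y_hat = P @ y
b, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(y_hat, X @ b)
```

The properties hold for any generalized inverse of $X'X$; the pseudoinverse is used only because NumPy provides it directly.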
  • Estimable: if $C$ is any $q \times p$ matrix, the linear function of $\beta$ given by $C\beta$ is estimable if and only if $C = AX$ for some $q \times n$ matrix $A$

    • the OLSE of an estimable linear function $C\beta$ is $C(X'X)^- X'y$
  • Normal equations: $X'Xb = X'y$

    • the OLSE of an estimable $C\beta$ is $C\hat{\beta}$, where $\hat{\beta}$ is any solution for $b$ in the normal equations
    • $\hat{\beta} = (X'X)^- X'y$ is always a solution to the normal equations, for any choice of $(X'X)^-$
    • if $C\beta$ is estimable, then $C\hat{\beta}$ is the same for all solutions $\hat{\beta}$ to the normal equations, and $C\hat{\beta} = A P_X y$ where $C = AX$
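A sketch (with made-up data) of the invariance property above: two different solutions to the normal equations give the same value of $C\hat{\beta}$ whenever $C\beta$ is estimable.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 12
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
X = np.column_stack([X, X[:, 0] + X[:, 1]])    # rank-deficient design
y = rng.standard_normal(n)

XtX = X.T @ X
G = np.linalg.pinv(XtX)                        # one generalized inverse
b1 = G @ X.T @ y                               # one solution to X'X b = X'y
# any vector of the form b1 + (I - G X'X) z is also a solution
b2 = b1 + (np.eye(3) - G @ XtX) @ rng.standard_normal(3)

assert np.allclose(XtX @ b1, X.T @ y)          # both solve the normal equations
assert np.allclose(XtX @ b2, X.T @ y)

A = rng.standard_normal((2, n))
C = A @ X                                      # C = AX, so C beta is estimable
assert np.allclose(C @ b1, C @ b2)             # C beta-hat is unique
```

Note that the individual components of `b1` and `b2` differ; only estimable functions of them agree.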
  • Gauss-Markov Model (GMM): suppose $y = X\beta + \epsilon$ with $E(\epsilon) = 0$ and $\mathrm{Var}(\epsilon) = \sigma^2 I$

  • Gauss-Markov Theorem: under the GMM, the OLSE of an estimable function $C\beta$ is the best linear unbiased estimator (BLUE) of $C\beta$

    • an unbiased estimator of $\sigma^2$ under the GMM is given by $\hat{\sigma}^2 = \dfrac{y'(I - P_X)y}{n - r}$, where $r = \mathrm{rank}(X)$
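A quick numerical illustration of this variance estimator (simulated data, with the true $\sigma^2$ chosen by us so the estimate can be compared against it):

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma2 = 200, 4.0
X = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])
beta = np.array([1.0, -2.0, 0.5])
y = X @ beta + rng.normal(scale=np.sqrt(sigma2), size=n)

P = X @ np.linalg.pinv(X.T @ X) @ X.T
r = np.linalg.matrix_rank(X)
sigma2_hat = y @ (np.eye(n) - P) @ y / (n - r)   # y'(I - P_X) y / (n - r)
print(sigma2_hat)                                 # should be near the true value 4.0
```

With $n = 200$ the estimate is typically close to the true $\sigma^2 = 4$; unbiasedness is exact in expectation, not in any single sample.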
  • Gauss-Markov Model with Normal Errors (GMMNE): suppose $y = X\beta + \epsilon$ with $\epsilon \sim N(0, \sigma^2 I)$

    • the GMMNE is useful for drawing statistical inferences regarding estimable $C\beta$
    • assume: 1. the GMMNE holds; 2. $C\beta$ is estimable; 3. $\mathrm{rank}(C) = q$ and $d$ is a known $q \times 1$ vector. Then $H_0: C\beta = d$ is a testable hypothesis
  • $C\hat{\beta} \sim N\!\left(C\beta,\ \sigma^2 C(X'X)^- C'\right)$ and $\dfrac{(n-r)\hat{\sigma}^2}{\sigma^2} \sim \chi^2_{n-r}$, and the two are independent
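A simulation sketch (made-up model) of the chi-square result above: repeated draws of $(n-r)\hat{\sigma}^2/\sigma^2$ should have mean close to $n - r$, the mean of a $\chi^2_{n-r}$ distribution.

```python
import numpy as np

rng = np.random.default_rng(5)
n, sigma2 = 30, 2.0
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
P = X @ np.linalg.pinv(X.T @ X) @ X.T
r = np.linalg.matrix_rank(X)

draws = []
for _ in range(2000):
    y = X @ np.array([1.0, 0.5]) + rng.normal(scale=np.sqrt(sigma2), size=n)
    sigma2_hat = y @ (np.eye(n) - P) @ y / (n - r)
    draws.append((n - r) * sigma2_hat / sigma2)

# the chi-square_{n-r} distribution has mean n - r
print(np.mean(draws))
```

A full check would also compare variances (a $\chi^2_{n-r}$ has variance $2(n-r)$) or run a goodness-of-fit test; the mean is shown here only as the simplest diagnostic.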

  • The F test statistic:
    $$F = \frac{(C\hat{\beta} - d)'\,[\widehat{\mathrm{Var}}(C\hat{\beta})]^{-1}\,(C\hat{\beta} - d)}{q} = \frac{(C\hat{\beta} - d)'\,[C(X'X)^- C']^{-1}\,(C\hat{\beta} - d)/q}{\hat{\sigma}^2} \sim F_{q,\,n-r}\!\left(\frac{(C\beta - d)'\,[C(X'X)^- C']^{-1}\,(C\beta - d)}{2\sigma^2}\right)$$
    Under the null hypothesis $H_0: C\beta = d$, the non-negative non-centrality parameter is 0.
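A sketch of computing this F statistic and its p-value on made-up data (SciPy is assumed available for the reference F distribution; the design, coefficients, and hypothesis below are all illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 50
X = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])
y = X @ np.array([1.0, 0.0, 0.0]) + rng.standard_normal(n)  # H0 below is true

XtX_ginv = np.linalg.pinv(X.T @ X)
beta_hat = XtX_ginv @ X.T @ y
r = np.linalg.matrix_rank(X)
sigma2_hat = np.sum((y - X @ beta_hat) ** 2) / (n - r)

C = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])                 # H0: beta_1 = beta_2 = 0
d = np.zeros(2)
q = np.linalg.matrix_rank(C)
diff = C @ beta_hat - d
F = diff @ np.linalg.inv(C @ XtX_ginv @ C.T) @ diff / (q * sigma2_hat)
p = stats.f.sf(F, q, n - r)                     # p-value from central F_{q, n-r}
```

Since the data are generated under $H_0$, the non-centrality parameter is 0 and `p` is uniformly distributed over repeated simulations.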

  • The t test statistic:
    $$t = \frac{c'\hat{\beta} - d}{\sqrt{\widehat{\mathrm{Var}}(c'\hat{\beta})}} = \frac{c'\hat{\beta} - d}{\sqrt{\hat{\sigma}^2\, c'(X'X)^- c}} \sim t_{n-r}\!\left(\frac{c'\beta - d}{\sqrt{\sigma^2\, c'(X'X)^- c}}\right)$$
    Under the null hypothesis $H_0: c'\beta = d$, the non-centrality parameter is 0.
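A sketch of the t statistic on made-up data, also checking the standard identity that for a single estimable function ($q = 1$) the F statistic reduces to $t^2$:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 40
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = X @ np.array([2.0, 1.0]) + rng.standard_normal(n)

XtX_ginv = np.linalg.pinv(X.T @ X)
beta_hat = XtX_ginv @ X.T @ y
r = np.linalg.matrix_rank(X)
sigma2_hat = np.sum((y - X @ beta_hat) ** 2) / (n - r)

c, d = np.array([0.0, 1.0]), 0.0                       # H0: slope = 0
t = (c @ beta_hat - d) / np.sqrt(sigma2_hat * (c @ XtX_ginv @ c))
p = 2 * stats.t.sf(abs(t), df=n - r)                   # two-sided p-value

# with q = 1, the F statistic equals t^2
F = (c @ beta_hat - d) ** 2 / (sigma2_hat * (c @ XtX_ginv @ c))
assert np.isclose(t ** 2, F)
```

Here the true slope is 1, so the non-centrality parameter is nonzero and the test typically rejects $H_0$.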