Chapter 2 Review of Linear Models
General Linear Model (GLM): suppose $y = X\beta + \epsilon$ with $E(\epsilon) = 0$
- the distribution of $y$ is left unspecified
- $E(y) \in C(X)$, the column space of $X$
Ordinary Least Squares Estimator (OLSE): $\hat{y} = P_X y = X(X'X)^- X'y$
Orthogonal projection matrix $P_X = X(X'X)^- X'$:
- $P_X$ is symmetric and idempotent
- $P_X X = X$ and $X' P_X = X'$
- $\operatorname{rank}(X) = \operatorname{rank}(P_X) = \operatorname{tr}(P_X)$
- other properties:
  - $X'XA = X'XB \Leftrightarrow XA = XB$
  - for any $(X'X)^-$: $X(X'X)^- X'X = X$ and $X'X(X'X)^- X' = X'$
  - if $A' = A$ and $AGA = A$, then $AG'A = A$
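These projection-matrix properties can be checked numerically. Below is a minimal sketch with a hypothetical rank-deficient two-group design (intercept plus two group indicators), using the Moore-Penrose pseudoinverse as one choice of $(X'X)^-$:

```python
import numpy as np

# Hypothetical rank-deficient design: intercept + two group indicators, n = 6
X = np.array([[1., 1., 0.],
              [1., 1., 0.],
              [1., 1., 0.],
              [1., 0., 1.],
              [1., 0., 1.],
              [1., 0., 1.]])

# P_X = X (X'X)^- X', with the Moore-Penrose pseudoinverse as the generalized inverse
G = np.linalg.pinv(X.T @ X)
P = X @ G @ X.T

assert np.allclose(P, P.T)           # symmetric
assert np.allclose(P @ P, P)         # idempotent
assert np.allclose(P @ X, X)         # P_X X = X
assert np.allclose(X.T @ P, X.T)     # X' P_X = X'
r = np.linalg.matrix_rank(X)
assert np.isclose(np.trace(P), r)    # rank(X) = rank(P_X) = tr(P_X) = 2 here
```

Even though $X'X$ is singular here (the intercept column is the sum of the two indicators), every choice of generalized inverse yields the same $P_X$.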
Estimable: for any $q \times p$ matrix $C$, the linear function of $\beta$ given by $C\beta$ is estimable if and only if $C = AX$ for some $q \times n$ matrix $A$
- the OLSE of an estimable linear function $C\beta$ is $C(X'X)^- X'y$
Normal equations: $X'Xb = X'y$
- the OLSE of an estimable $C\beta$ is $C\hat{\beta}$, where $\hat{\beta}$ is any solution for $b$ in the normal equations
- $\hat{\beta} = (X'X)^- X'y$ is a solution to the normal equations for any choice of $(X'X)^-$
- if $C\beta$ is estimable, then $C\hat{\beta}$ is the same for all solutions $\hat{\beta}$ to the normal equations, and $C\hat{\beta} = A P_X y$ where $C = AX$
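The invariance of $C\hat{\beta}$ across solutions can be illustrated directly. A sketch, reusing the hypothetical two-group design from above: two distinct solutions to the normal equations agree on estimable functions (the group means) but not on a non-estimable one ($\beta_1$ alone).

```python
import numpy as np

# Hypothetical rank-deficient two-group design, n = 6
X = np.array([[1., 1., 0.],
              [1., 1., 0.],
              [1., 1., 0.],
              [1., 0., 1.],
              [1., 0., 1.],
              [1., 0., 1.]])
rng = np.random.default_rng(0)
y = rng.normal(size=6)

XtX, Xty = X.T @ X, X.T @ y

# One solution to the normal equations, via the Moore-Penrose inverse
b1 = np.linalg.pinv(XtX) @ Xty
# Another solution: shift b1 along the null space of X, spanned by (1, -1, -1)'
b2 = b1 + np.array([1., -1., -1.])
assert np.allclose(XtX @ b2, Xty)    # b2 also solves the normal equations

# Estimable functions (rows lie in the row space of X): the two group means
C = np.array([[1., 1., 0.],
              [1., 0., 1.]])
assert np.allclose(C @ b1, C @ b2)   # C b-hat is invariant to the solution chosen

# A non-estimable function, e.g. beta_1 alone, is NOT invariant
c_bad = np.array([0., 1., 0.])
assert not np.isclose(c_bad @ b1, c_bad @ b2)
```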
Gauss-Markov Model (GMM): suppose $y = X\beta + \epsilon$ with $E(\epsilon) = 0$ and $\operatorname{Var}(\epsilon) = \sigma^2 I$
Gauss-Markov Theorem: the OLSE of an estimable function $C\beta$ is the BLUE (best linear unbiased estimator) of $C\beta$
- an unbiased estimator of $\sigma^2$ under the GMM is $\hat{\sigma}^2 = \dfrac{y'(I - P_X)y}{n - r}$, where $r = \operatorname{rank}(X)$
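Unbiasedness of $\hat{\sigma}^2$ can be checked by Monte Carlo. A sketch under assumed values (hypothetical design, $\beta$, and $\sigma^2 = 4$): the average of $\hat{\sigma}^2$ over many simulated datasets should be close to $\sigma^2$.

```python
import numpy as np

# Hypothetical rank-deficient two-group design, n = 6
X = np.array([[1., 1., 0.],
              [1., 1., 0.],
              [1., 1., 0.],
              [1., 0., 1.],
              [1., 0., 1.],
              [1., 0., 1.]])
n = X.shape[0]
r = np.linalg.matrix_rank(X)
P = X @ np.linalg.pinv(X.T @ X) @ X.T
M = np.eye(n) - P                    # residual projection I - P_X

beta_true, sigma2 = np.array([1., 2., -1.]), 4.0   # assumed true values

rng = np.random.default_rng(1)
reps = 20000
E = rng.normal(scale=np.sqrt(sigma2), size=(reps, n))
Y = X @ beta_true + E                # reps simulated response vectors

# sigma2_hat_i = y_i' (I - P_X) y_i / (n - r) for each replicate
sigma2_hat = np.einsum('ij,jk,ik->i', Y, M, Y) / (n - r)
print(sigma2_hat.mean())             # close to sigma2 = 4.0
```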
Gauss-Markov Model with Normal Errors (GMMNE): suppose $y = X\beta + \epsilon$ with $\epsilon \sim N(0, \sigma^2 I)$
- the GMMNE is useful for drawing statistical inferences regarding an estimable $C\beta$
- assume: 1. the GMMNE holds, 2. $C\beta$ is estimable, 3. $\operatorname{rank}(C) = q$ and $d$ is a known $q \times 1$ vector. Then $H_0: C\beta = d$ is a testable hypothesis
$C\hat{\beta} \sim N\!\left(C\beta,\ \sigma^2 C(X'X)^- C'\right)$ and $\hat{\sigma}^2 \sim \dfrac{\sigma^2}{n-r}\chi^2_{n-r}$ are independent
The F test statistic
$$F = \frac{(C\hat{\beta} - d)'\big[\widehat{\operatorname{Var}}(C\hat{\beta})\big]^{-1}(C\hat{\beta} - d)}{q} = \frac{(C\hat{\beta} - d)'\big[C(X'X)^- C'\big]^{-1}(C\hat{\beta} - d)/q}{\hat{\sigma}^2} \sim F_{q,\,n-r}\!\left(\frac{(C\beta - d)'\big[C(X'X)^- C'\big]^{-1}(C\beta - d)}{2\sigma^2}\right)$$
Under the null hypothesis $H_0: C\beta = d$, the non-negative non-centrality parameter is 0.
The t test statistic (for a single estimable function $c'\beta$, i.e. $q = 1$)
$$t = \frac{c'\hat{\beta} - d}{\sqrt{\widehat{\operatorname{Var}}(c'\hat{\beta})}} = \frac{c'\hat{\beta} - d}{\sqrt{\hat{\sigma}^2\, c'(X'X)^- c}} \sim t_{n-r}\!\left(\frac{c'\beta - d}{\sqrt{\sigma^2\, c'(X'X)^- c}}\right)$$
Under the null hypothesis $H_0: c'\beta = d$, the non-centrality parameter is 0.
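Both test statistics can be computed from the formulas above. A minimal sketch, again using the hypothetical two-group design and testing the estimable contrast "difference of group means = 0" (here $q = 1$, so $F = t^2$ and the two p-values agree):

```python
import numpy as np
from scipy import stats

# Hypothetical rank-deficient two-group design, n = 6
X = np.array([[1., 1., 0.],
              [1., 1., 0.],
              [1., 1., 0.],
              [1., 0., 1.],
              [1., 0., 1.],
              [1., 0., 1.]])
n = X.shape[0]
rng = np.random.default_rng(2)
beta_true = np.array([0., 3., 3.])        # equal group means: H0 is true
y = X @ beta_true + rng.normal(size=n)

G = np.linalg.pinv(X.T @ X)               # one choice of (X'X)^-
beta_hat = G @ X.T @ y
r = np.linalg.matrix_rank(X)
P = X @ G @ X.T
s2 = y @ (np.eye(n) - P) @ y / (n - r)    # sigma^2-hat

# Testable H0: C beta = 0 with C = (0, 1, -1), estimable since it is
# the difference of two rows of X; q = 1
C = np.array([[0., 1., -1.]])
d, q = np.zeros(1), 1
V = C @ G @ C.T
diff = C @ beta_hat - d
F = (diff @ np.linalg.inv(V) @ diff) / q / s2
p_F = stats.f.sf(F, q, n - r)

# Equivalent two-sided t test for the single contrast c'beta = 0
c = C[0]
t = (c @ beta_hat) / np.sqrt(s2 * (c @ G @ c))
p_t = 2 * stats.t.sf(abs(t), n - r)

assert np.isclose(t**2, F)                # with q = 1, F = t^2
print(p_F, p_t)                           # the two p-values coincide
```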