5  Appendix

5.1 Calculate the log-likelihood based on least squares

In linear regression with fixed $x$, the least squares estimator and the maximum likelihood estimator of the coefficients are equivalent, so it is easy to derive the log-likelihood of a fitted linear regression model from its least squares fit. Assume $y_i = \beta_0 + \beta_1 x_i + \epsilon_i$ with $\epsilon_i \overset{iid}{\sim} N(0, \sigma^2)$. The likelihood is

$$L = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(y_i - \beta_0 - \beta_1 x_i)^2}{2\sigma^2}\right),$$

$$\ell = \log L = \sum_{i=1}^{n}\left[-\frac{1}{2}\log(2\pi\sigma^2)\right] - \frac{1}{2\sigma^2}\sum_{i=1}^{n} e_i^2 = -\frac{n}{2}\log(2\pi) - \frac{n}{2}\log(\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n} e_i^2,$$

where $e_i = y_i - \beta_0 - \beta_1 x_i$.
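As a quick numerical check (a minimal sketch with simulated data; the variable names are illustrative, not from the source), the summed normal log-densities of the residuals match the closed form above for any fixed $\sigma^2$:

```r
# Verify the closed-form log-likelihood against summed normal log-densities.
set.seed(1)
x <- rnorm(30)
y <- 1 + 2 * x + rnorm(30)   # simulated data, true beta0 = 1, beta1 = 2
e <- residuals(lm(y ~ x))    # residuals e_i at the least squares fit
n <- length(e)
sigma2 <- 0.9                # any fixed value of sigma^2

sum(dnorm(e, mean = 0, sd = sqrt(sigma2), log = TRUE))
-n / 2 * log(2 * pi) - n / 2 * log(sigma2) - sum(e^2) / (2 * sigma2)
# Both lines print the same value.
```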

The least squares estimator of $\sigma^2$ is $$\hat\sigma^2 = s^2 = \frac{1}{n-2}\sum_{i=1}^{n} e_i^2$$ (under normality the unrestricted maximum likelihood estimator divides by $n$ rather than $n-2$; the derivation below plugs in $s^2$).

Substituting $s^2$ for $\sigma^2$, we thus have

$$\begin{aligned}
\ell &= -\frac{n}{2}\log(2\pi) - \frac{n}{2}\log\left(\frac{1}{n-2}\right) - \frac{n}{2}\log\left(\sum_{i=1}^{n} e_i^2\right) - \frac{n-2}{2} \\
&= -\frac{n}{2}\log(2\pi) + \frac{n}{2}\log(n-2) - \frac{n}{2}\log\left(\sum_{i=1}^{n} e_i^2\right) - \frac{n}{2} + 1 \\
&= -\frac{n}{2}\left[\log(2\pi) - \log(n-2) + \log\left(\sum_{i=1}^{n} e_i^2\right) + 1 - \frac{2}{n}\right].
\end{aligned}$$
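Putting this into code (a minimal sketch; `loglik_from_ls` is a hypothetical helper name and the data are simulated):

```r
# Plug-in log-likelihood computed from least squares residuals,
# with sigma^2 estimated by s^2 = RSS / (n - 2) as derived above.
loglik_from_ls <- function(fit) {
  e <- residuals(fit)
  n <- length(e)
  -n / 2 * (log(2 * pi) - log(n - 2) + log(sum(e^2)) + 1 - 2 / n)
}

set.seed(1)
x <- rnorm(50)
y <- 1 + 2 * x + rnorm(50)
fit <- lm(y ~ x)
loglik_from_ls(fit)
# Note: stats::logLik(fit) plugs in the ML estimator RSS / n instead,
# so its value differs slightly from the s^2-based one above.
as.numeric(logLik(fit))
```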

The corresponding lines in the R sources can be inspected with:

  1. `View(getAnywhere(lrtest.default)$objs[[1]])`, line 3
  2. `View(getAnywhere(logLik.lm)$objs[[1]])`, line 22
  3. `View(getAnywhere(lrtest.default)$objs[[1]])`, lines 94 and 102
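For context, these are the log-likelihood values a likelihood ratio test consumes; a minimal usage sketch (assuming the lmtest package is installed, with simulated data):

```r
library(lmtest)

set.seed(1)
x <- rnorm(50)
y <- 1 + 2 * x + rnorm(50)

fit0 <- lm(y ~ 1)   # null model: intercept only
fit1 <- lm(y ~ x)   # full model
lrtest(fit0, fit1)  # LR statistic = 2 * (logLik(fit1) - logLik(fit0))
```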

5.2 Sum of squares decomposition for the F test

Assume we have $p$ predictors and $n$ observations, with $y = [y_1, \ldots, y_n]^T$, $\beta = [\beta_0, \beta_1, \ldots, \beta_p]^T$, and

$$X = \begin{bmatrix} 1 & x_{11} & x_{12} & \cdots & x_{1p} \\ 1 & x_{21} & x_{22} & \cdots & x_{2p} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_{n1} & x_{n2} & \cdots & x_{np} \end{bmatrix}.$$

Because $\hat\beta$ solves the normal equations,

$$X^T X \hat\beta = X^T y \;\Rightarrow\; X^T (X\hat\beta - y) = 0 \;\Rightarrow\; X^T e = 0,$$

i.e.

$$\begin{bmatrix} 1 & 1 & \cdots & 1 \\ x_{11} & x_{21} & \cdots & x_{n1} \\ x_{12} & x_{22} & \cdots & x_{n2} \\ \vdots & \vdots & \ddots & \vdots \\ x_{1p} & x_{2p} & \cdots & x_{np} \end{bmatrix} \begin{bmatrix} e_1 \\ e_2 \\ \vdots \\ e_n \end{bmatrix} = 0 \;\Rightarrow\; \begin{bmatrix} \sum_{i=1}^n e_i \\ \sum_{i=1}^n x_{i1} e_i \\ \sum_{i=1}^n x_{i2} e_i \\ \vdots \\ \sum_{i=1}^n x_{ip} e_i \end{bmatrix} = 0.$$

Thus the cross term in the expansion $\sum_{i=1}^n (y_i - \bar y)^2 = \sum_{i=1}^n (y_i - \hat y_i)^2 + \sum_{i=1}^n (\hat y_i - \bar y)^2 + 2\sum_{i=1}^n (y_i - \hat y_i)(\hat y_i - \bar y)$ vanishes:

$$2\sum_{i=1}^n (y_i - \hat y_i)(\hat y_i - \bar y) = 2\sum_{i=1}^n e_i (\hat y_i - \bar y) = 2\left(\sum_{i=1}^n \hat y_i e_i - \bar y \sum_{i=1}^n e_i\right) = 2\left(\hat\beta_0 \sum_{i=1}^n e_i + \hat\beta_1 \sum_{i=1}^n x_{i1} e_i + \cdots + \hat\beta_p \sum_{i=1}^n x_{ip} e_i - \bar y \sum_{i=1}^n e_i\right) = 0,$$

so $\mathrm{SST} = \mathrm{SSE} + \mathrm{SSR}$.
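A quick numerical check of the decomposition (a minimal sketch with simulated data and $p = 2$ predictors; names are illustrative):

```r
# Verify X^T e = 0 and SST = SSE + SSR on simulated data.
set.seed(2)
n <- 40
x1 <- rnorm(n)
x2 <- rnorm(n)
y <- 1 + 0.5 * x1 - x2 + rnorm(n)
fit <- lm(y ~ x1 + x2)
yhat <- fitted(fit)
e <- residuals(fit)

c(sum(e), sum(x1 * e), sum(x2 * e))     # all ~ 0 (normal equations)
2 * sum((y - yhat) * (yhat - mean(y)))  # cross term ~ 0
sum((y - mean(y))^2)                    # SST
sum(e^2) + sum((yhat - mean(y))^2)      # SSE + SSR, equal to SST
```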

Reference: Yibi Huang, *Multiple Linear Regression (MLR) Handouts*.