Summary points
The main takeaways from lecture 11 are:
- The least-squares estimates of a set of linear functions of the regression coefficients are just the same linear functions of the least-squares estimates.
- We can make inferences about any linear combination of the regression coefficients using the following result: \(\frac{(\mathbf{b}^T\boldsymbol{\hat{\beta}}-\mathbf{b}^T\boldsymbol{\beta})}{\sqrt{\frac{RSS}{n-p}\mathbf{b}^T(\mathbf{X}^T\mathbf{X})^{-1}\mathbf{b}}} \sim t(n-p)\)
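The \(t\)-result above can be sketched numerically. The following is a minimal illustration using only NumPy/SciPy, with simulated data and an arbitrary choice of \(\mathbf{b}\) (both are assumptions for the example, not from the lecture):

```python
# Sketch: inference for a linear combination b^T beta in least squares.
# Data, true coefficients, and b are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p = 50, 3                          # n observations, p coefficients (incl. intercept)
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y          # least-squares estimates
rss = np.sum((y - X @ beta_hat) ** 2)
s2 = rss / (n - p)                    # RSS / (n - p), the error-variance estimate

b = np.array([0.0, 1.0, 1.0])         # linear combination beta_1 + beta_2
est = b @ beta_hat                    # b^T beta_hat
se = np.sqrt(s2 * b @ XtX_inv @ b)    # denominator of the t statistic

t_stat = est / se                     # tests H0: b^T beta = 0
t_crit = stats.t.ppf(0.975, df=n - p)
ci = (est - t_crit * se, est + t_crit * se)   # 95% confidence interval
print(est, se, t_stat, ci)
```

Note that the standard error combines the residual variance estimate with \(\mathbf{b}^T(\mathbf{X}^T\mathbf{X})^{-1}\mathbf{b}\), exactly as in the formula above.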
- We can use the ANOVA table to test a model with, say, \(k\) explanatory variables against the null model \(E(y_i)=\alpha\) using the following result:
\[F = { MS_\mathrm{model} \over MS_\mathrm{residuals}} \sim F(Df_\mathrm{model}, Df_\mathrm{residuals}). \]
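The overall \(F\)-test above can be sketched directly from the sums of squares. This is a minimal illustration with simulated data (an assumption for the example); here \(Df_\mathrm{model} = k\) and \(Df_\mathrm{residuals} = n - k - 1\):

```python
# Sketch: the ANOVA-table F-test of a model with k explanatory variables
# against the null model E(y_i) = alpha. Data are simulated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, k = 40, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([1.0, 0.8, -1.2]) + rng.normal(scale=0.5, size=n)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta_hat
rss = np.sum((y - fitted) ** 2)            # residual sum of squares
mss = np.sum((fitted - y.mean()) ** 2)     # model sum of squares

df_model, df_resid = k, n - k - 1
F = (mss / df_model) / (rss / df_resid)    # MS_model / MS_residuals
p_value = stats.f.sf(F, df_model, df_resid)
print(F, p_value)
```

A small p-value here is evidence against the null model, i.e. evidence that at least one of the \(k\) explanatory variables is useful.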
- ANOVA is a useful model-comparison tool that we will largely not cover in this course, but we highly recommend reading the following: https://bookdown.org/ndphillips/YaRrr/comparing-regression-models-with-anova.html
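The linked chapter compares nested regression models with R's `anova()`. The same idea, a partial \(F\)-test of a reduced model against a fuller one, can be sketched in Python (the data and models here are illustrative assumptions):

```python
# Sketch: comparing nested regression models via a partial F-test,
# analogous to R's anova(small_model, big_model). Data are simulated;
# x2 is deliberately irrelevant here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 60
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 2.0 + 1.5 * x1 + rng.normal(scale=1.0, size=n)

def rss(X, y):
    """Residual sum of squares from a least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

X_small = np.column_stack([np.ones(n), x1])       # reduced model
X_big = np.column_stack([np.ones(n), x1, x2])     # full model

rss_small, rss_big = rss(X_small, y), rss(X_big, y)
df_diff = X_big.shape[1] - X_small.shape[1]       # extra parameters in the full model
df_resid = n - X_big.shape[1]
F = ((rss_small - rss_big) / df_diff) / (rss_big / df_resid)
p_value = stats.f.sf(F, df_diff, df_resid)
print(F, p_value)   # a large p-value favours the simpler model
```

The test asks whether the drop in RSS from adding the extra variables is larger than chance alone would produce.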