Summary points
In lecture 4, we:
Defined a linear model in vector-matrix notation \(\mathbf{Y} = \mathbf{X}\boldsymbol{\beta}+\boldsymbol{\epsilon}\).
Constructed the formula for the least squares estimators \(\boldsymbol{\hat{\beta}} = (\mathbf{X}^T\mathbf{X})^{-1}\mathbf{X}^T\mathbf{Y}\).
Derived the residual sum of squares \(RSS=\mathbf{Y}^T\mathbf{Y}-\mathbf{Y}^T\mathbf{X}\boldsymbol{\hat{\beta}}\).
These formulae cover all least-squares problems for suitably defined \(\mathbf{X}\) and \(\boldsymbol{\beta}\), so they are worth remembering. To solve a particular problem, we simply need to identify the matrix \(\mathbf{X}\) and the vectors \(\mathbf{Y}\) and \(\boldsymbol{\beta}\), and then apply these two key results. Remember that evaluating the sum of squares at the least squares estimates of \(\boldsymbol{\beta}\) gives the residual sum of squares.
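As a concrete illustration (a minimal sketch, not an example from the lecture), the code below applies both key results to simple linear regression: the design matrix \(\mathbf{X}\) has a column of ones for the intercept and a column of covariate values, and \(\boldsymbol{\hat{\beta}}\) and \(RSS\) are computed directly from the vector-matrix formulas. The data values are made up purely for illustration.

```python
import numpy as np

# Illustrative (assumed) data for simple linear regression y_i = b0 + b1*x_i + eps_i.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Design matrix X: first column of ones (intercept), second column the covariate.
X = np.column_stack([np.ones_like(x), x])

# Least squares estimates: beta_hat = (X^T X)^{-1} X^T Y.
# Solving the normal equations directly is preferred to forming the inverse explicitly.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# Residual sum of squares: RSS = Y^T Y - Y^T X beta_hat.
RSS = Y @ Y - Y @ X @ beta_hat

print("beta_hat:", beta_hat)
print("RSS:", RSS)
```

Any other linear model (polynomial regression, multiple covariates, and so on) is handled the same way: only the construction of \(\mathbf{X}\) changes, while the two formulas stay the same.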