Methods for variable selection

Let us now discuss a few strategies for variable selection. These methods systematically search through the candidate explanatory variables and determine which ones to include in our model.

Selection criterion

We need to define a rule, or selection criterion, that we may use to determine which variables to include. There are several options. Common criteria for variable selection include adjusted R-squared (\(R^2\)(adj)), the Akaike information criterion (AIC), the Bayesian information criterion (BIC), Mallows's \(C_p\) statistic and the PRESS statistic. We have already met adjusted R-squared (\(R^2\)(adj)), so we will only briefly describe the others listed; a small numerical illustration follows the list.

  1. \(R^2\) or \(R^2\)(adj), which we have already discussed.
  2. The Akaike information criterion (AIC), defined as \[\mbox{AIC}= -2l(\boldsymbol{\beta}) + 2p,\] where \(l(\boldsymbol{\beta})\) is the maximised log-likelihood of the model and \(p\) is the number of parameters.
  3. The (Schwarz) Bayesian information criterion (BIC or SBC), defined as \[\mbox{BIC}= -2l(\boldsymbol{\beta}) + p \log(n),\] where \(n\) is the number of observations.
  4. Mallows's \(C_p\), defined as \[C_p= \frac{RSS_{p}}{\hat{\sigma}_{F}^2}-n + 2p,\] where \(RSS_{p}\) is the residual sum of squares of the candidate model with \(p\) parameters and \(\hat{\sigma}_{F}^2\) is the estimate of \(\sigma^2\) from the full model.
  5. The PRESS statistic, defined as \[\mbox{PRESS}= \sum_{i=1}^{n} \left(y_i - \hat{y}_{(i)}\right)^2,\] where \(\hat{y}_{(i)}\) is the fitted value for observation \(i\) from the model fitted with observation \(i\) left out.
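
To make these criteria concrete, here is a minimal sketch in Python, assuming a Gaussian linear model fitted by least squares. The function names (`gaussian_loglik`, `criteria`) and the simulated data are our own, purely for illustration; note also that conventions for counting the parameters \(p\) vary (here \(p\) counts the regression coefficients, including the intercept).

```python
import numpy as np

def gaussian_loglik(y, yhat):
    """Maximised Gaussian log-likelihood, plugging in the MLE sigma^2 = RSS/n."""
    n = len(y)
    rss = np.sum((y - yhat) ** 2)
    return -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)

def criteria(X, y, sigma2_full):
    """AIC, BIC and Mallows's Cp for a least-squares fit of y on the columns of X."""
    n, p = X.shape                          # p = number of coefficients (incl. intercept)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    yhat = X @ beta
    rss = np.sum((y - yhat) ** 2)
    ll = gaussian_loglik(y, yhat)
    aic = -2 * ll + 2 * p                   # AIC = -2l + 2p
    bic = -2 * ll + p * np.log(n)           # BIC = -2l + p log(n)
    cp = rss / sigma2_full - n + 2 * p      # Cp uses sigma^2 from the full model
    return aic, bic, cp

# Simulated data for illustration: x3 is pure noise
rng = np.random.default_rng(1)
n = 100
x1, x2, x3 = rng.normal(size=(3, n))
y = 1 + 2 * x1 - x2 + rng.normal(scale=0.5, size=n)

X_full = np.column_stack([np.ones(n), x1, x2, x3])
beta_f, *_ = np.linalg.lstsq(X_full, y, rcond=None)
rss_f = np.sum((y - X_full @ beta_f) ** 2)
sigma2_full = rss_f / (n - X_full.shape[1])     # unbiased estimate from the full model

X_sub = np.column_stack([np.ones(n), x1, x2])   # drop the noise variable
print("full:", criteria(X_full, y, sigma2_full))
print("sub :", criteria(X_sub, y, sigma2_full))
```

On these simulated data the submodel that drops the noise variable should typically score better (lower) on all three criteria. Note that for the full model itself \(C_p = p\) by construction, so \(C_p\) is informative only for comparing submodels.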

Once we have chosen a selection criterion, we need to choose a method that systematically removes and/or includes variables based on that criterion. The first method we will consider is all-subsets regression.

Search strategy

In this course, we will consider two search strategies.

  1. All-subsets regression, where we search over all possible models; a sketch of this strategy is given below.
  2. Stepwise selection, where we begin with an initial model and systematically add or remove variables one at a time. We will discuss stepwise selection in the next lecture.
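
To illustrate the all-subsets strategy, here is a minimal sketch in Python that enumerates every subset of three candidate variables (the intercept is always kept) and ranks the resulting models by AIC. The helper names (`aic_of_fit`, `all_subsets`) and the simulated data are our own illustration, not part of the course material.

```python
import itertools
import numpy as np

def aic_of_fit(X, y):
    """AIC = -2*loglik + 2p for a Gaussian least-squares fit."""
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    loglik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)
    return -2 * loglik + 2 * p

def all_subsets(X, y, names):
    """Fit every subset of the candidate columns (intercept always included)
    and return the subsets ranked by AIC, smallest first."""
    n = len(y)
    results = []
    for k in range(len(names) + 1):
        for subset in itertools.combinations(range(len(names)), k):
            design = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
            results.append((aic_of_fit(design, y), [names[j] for j in subset]))
    return sorted(results)

# Simulated data for illustration: only x1 and x2 matter
rng = np.random.default_rng(1)
n = 100
X = rng.normal(size=(n, 3))
y = 1 + 2 * X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=n)

for aic, vars_in in all_subsets(X, y, ["x1", "x2", "x3"]):
    print(f"AIC = {aic:7.2f}  variables: {vars_in or ['(intercept only)']}")
```

With three candidate variables there are \(2^3 = 8\) models to fit, and the count doubles with every additional variable. This exponential growth is why all-subsets search becomes impractical for large numbers of candidates, motivating the stepwise methods of the next lecture.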