Chapter 9 M9: Dimension Reduction
What do you do when you just have too many predictors? In this module, we’ll look at the kinds of problems that can show up with high-dimensional data and some ways of reducing the dimension (or the effective dimension) of the dataset. These include creating “combination” variables and dropping predictors from the model entirely.
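As a preview of the “combination variables” idea: principal components regression (one of the methods in Section 6.3) replaces the original predictors with a few linear combinations of them and regresses on those instead. Here’s a minimal numpy sketch; the simulated data and the choice of k = 5 components are made up purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 20                      # invented sizes, just for the demo
X = rng.normal(size=(n, p))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(size=n)

# Center the predictors, then take the top k principal components
# (linear "combination" variables) via the SVD.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5
Z = Xc @ Vt[:k].T                   # n x k matrix of component scores

# Ordinary least squares on the k components instead of all p predictors.
A = np.column_stack([np.ones(n), Z])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(beta.shape)                   # intercept plus one coefficient per component
```

The payoff is that we estimate k + 1 = 6 coefficients instead of p + 1 = 21, at the cost of interpreting components rather than the original variables.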
This module’s reading is all in the textbook! Relevant sections include:
- Chapter 6 intro, “Linear Model Selection and Regularization”
- 6.3 “Dimension Reduction Methods”
  - Skip or skim subsection 6.3.2, “Partial Least Squares”
- Subsection 3.3.3, “Potential Problems” (in Section 3.3, “Other Considerations in the Regression Model”)
  - Focus on sub-subsection 6, “Collinearity,” which discusses multicollinearity and the VIF. You can skim the other parts if you like, as a refresher on the list of regression assumptions and conditions :)
- Section 6.1, “Subset Selection”
- Section 6.4, “Considerations in High Dimensions”
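If you want to see the VIF from Section 3.3.3 in action before reading, here’s a minimal sketch using only numpy. It computes VIF_j = 1 / (1 − R²_j), where R²_j comes from regressing predictor j on the other predictors; the simulated data (with x2 deliberately near-collinear with x1) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.1 * rng.normal(size=n)   # nearly a copy of x1 -> collinearity
x3 = rng.normal(size=n)              # unrelated to the other two
X = np.column_stack([x1, x2, x3])

def vif(X):
    """VIF_j = 1 / (1 - R_j^2), regressing column j on the remaining columns."""
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])  # include an intercept
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1 - resid.var() / y.var()
        out.append(1 / (1 - r2))
    return np.array(out)

print(vif(X))   # x1 and x2 get large VIFs; x3 stays close to 1
```

A common rule of thumb (discussed in the reading) is that a VIF above 5 or 10 signals problematic collinearity; here x1 and x2 blow past that while x3 does not.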