Chapter 6 M6: Regression Alternatives

Having spent a long and interesting semester unpacking regression, we’ll now take a peek at some completely different methods (or at least, methods that look completely different!). We can use these methods to make predictions for either quantitative or categorical responses and, sometimes, to learn more about the system we’re describing.

Of course, these methods all have their own assumptions and conditions – and places where you, the analyst, have to make decisions about what to do!

This module draws on the following textbook sections:

  • 2.1.2 “How do we estimate f?” (in section 2.1 “What Is Statistical Learning?”)
  • 2.1.5 “Regression Versus Classification Problems”
  • 2.2.3 “The Classification Setting”
  • 4.4 introduction “Generative Models for Classification”
  • 4.4.1 “Linear Discriminant Analysis for p=1”
  • 4.4.2 “Linear Discriminant Analysis for p>1”
  • 4.4.3 “Quadratic Discriminant Analysis”
  • Chapter 8 introduction “Tree-Based Methods”
  • 8.1 introduction “The Basics of Decision Trees”
  • 8.1.1 “Regression Trees”
  • 8.1.3 “Trees Versus Linear Models”
  • 8.1.4 “Advantages and Disadvantages of Trees”
  • 8.1.2 “Classification Trees”
  • 8.2 “Bagging, Random Forests, Boosting, and Bayesian Additive Regression Trees”
    • Feel free to skim or skip subsection 8.2.4 “Bayesian Additive Regression Trees” if you like.
    • Do read subsection 8.2.5 “Summary of Tree Ensemble Methods,” even if you skip 8.2.4. You might even want to read this part first to get an overview before you go read the details about each method!