Chapter 5 Kernel regression estimation II

In Chapter 4 we studied the simplest situation for performing nonparametric estimation of the regression function: a single, continuous predictor \(X\) is available for explaining \(Y,\) a numerical response that is implicitly assumed to be continuous.¹⁶² This served to introduce the main concepts without the additional technicalities associated with more complex predictors.

The purpose of this chapter is to extend nonparametric regression when

  1. there are multiple predictors \(X_1,\ldots,X_p,\)
  2. some predictors are possibly non-continuous, i.e., categorical or discrete, and
  3. the response \(Y\) is not continuous.

We concentrate first on the first two points, since the third entails a change of paradigm with respect to the kernel regression estimators studied in Chapter 4.


  162. Local polynomial estimators also “work” with discrete responses, with a varying degree of adequacy. For example, if \(Y\) is binary, then \(\hat m(x;0,h)\in[0,1]\) for any \(x\in\mathbb{R}\) and \(h>0,\) since \(\hat m(x;0,h)\) is an \(x\)-weighted mean (a convex linear combination) of \(Y_1,\ldots,Y_n\in[0,1].\) Hence the regression function \(m:\mathbb{R}\longrightarrow[0,1]\) is always properly estimated. However, the local linear estimator may yield improper estimates of \(m\): \(\hat m(x;1,h)\) is an \(x\)-weighted linear combination of \(Y_1,\ldots,Y_n\in[0,1]\) whose weights may be negative, so the combination is not convex and \(\hat m(x;1,h)\) may “spike” outside \([0,1].\)
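The contrast in the footnote can be checked numerically. The following sketch (a hypothetical illustration, not code from the text: the Gaussian kernel, the simulated binary data, the bandwidth \(h=0.1,\) and all function names are assumptions) evaluates the local constant (Nadaraya–Watson) and local linear fits on a grid that extends beyond the observed predictors, where the local linear fit is most prone to escaping \([0,1]\):

```python
import numpy as np

# Simulated binary regression data with P(Y = 1 | X = x) = x on [0, 1].
rng = np.random.default_rng(42)
n = 50
X = np.sort(rng.uniform(0, 1, n))
Y = (rng.uniform(size=n) < X).astype(float)  # Y_1, ..., Y_n in {0, 1}

def kern(u):
    """Gaussian kernel (any nonnegative kernel gives the same conclusion)."""
    return np.exp(-0.5 * u**2)

def m_hat(x, X, Y, h, p):
    """Local polynomial estimate of degree p in {0, 1} at the point x."""
    w = kern((x - X) / h)
    if p == 0:
        # Local constant: weights are >= 0 and sum to one, so the estimate
        # is a convex combination of the Y_i and stays in [0, 1].
        return np.sum(w * Y) / np.sum(w)
    # Local linear: intercept of the weighted least squares line fitted at x.
    # Its effective weights on the Y_i can be negative.
    Z = np.column_stack([np.ones(n), X - x])
    beta = np.linalg.solve(Z.T @ (w[:, None] * Z), Z.T @ (w * Y))
    return beta[0]

h = 0.1  # arbitrary illustrative bandwidth
grid = np.linspace(-0.2, 1.2, 201)  # extends beyond the observed X's
nw_vals = np.array([m_hat(x, X, Y, h, p=0) for x in grid])
ll_vals = np.array([m_hat(x, X, Y, h, p=1) for x in grid])

print("NW range:", nw_vals.min(), nw_vals.max())  # always within [0, 1]
print("LL range:", ll_vals.min(), ll_vals.max())  # may exit [0, 1]
```

The local constant range is guaranteed to lie in \([0,1]\) for any data and bandwidth; whether the local linear fit actually exits \([0,1]\) depends on the sample and on where it is evaluated, which is why the printed range is inspected rather than asserted.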