17.2 Marginal Effects in Different Contexts

  1. Linear Regression Models

For a simple linear regression:

\[ E[Y|X] = \beta_0 + \beta_1 X, \]

the marginal effect is constant and equal to \(\beta_1\): a one-unit increase in \(X\) changes \(E[Y|X]\) by \(\beta_1\) regardless of the value of \(X\). This makes interpretation straightforward.
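The constancy of the marginal effect can be checked numerically. The sketch below (with hypothetical coefficient values) differentiates the conditional mean by central finite differences and shows the same slope at every point:

```python
# Hypothetical coefficients chosen for illustration
beta0, beta1 = 2.0, 0.5

def E_y(x):
    """Conditional mean E[Y|X=x] under the linear model."""
    return beta0 + beta1 * x

# Numerical marginal effect dE[Y|X]/dX at several values of X
h = 1e-6
for x in [0.0, 1.0, 10.0]:
    me = (E_y(x + h) - E_y(x - h)) / (2 * h)
    print(f"marginal effect at X={x}: {me:.4f}")  # always beta1 = 0.5
```

Because the model is linear, the finite-difference slope equals \(\beta_1\) everywhere, so it does not matter where the effect is evaluated.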

  2. Logit and Probit Models

In logistic regression, the expected value of \(Y\) is modeled as:

\[ E[Y|X] = P(Y=1|X) = \frac{1}{1 + e^{-\beta_0 - \beta_1 X}}. \]

The marginal effect is given by:

\[ \frac{\partial E[Y|X]}{\partial X} = \beta_1 P(Y=1|X) (1 - P(Y=1|X)). \]

Unlike in linear models, the effect varies with \(X\), so it must be evaluated at specific values, typically at the sample means of the covariates or averaged across all observations.
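The formula above can be sketched directly. The snippet below (coefficients are hypothetical, chosen for illustration) evaluates \(P(Y=1|X)\) and the marginal effect \(\beta_1 P(1-P)\) at several values of \(X\); the effect peaks where \(P = 0.5\):

```python
import math

# Hypothetical logit coefficients for illustration
beta0, beta1 = -1.0, 0.8

def p(x):
    """P(Y=1|X=x) under the logit model."""
    return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * x)))

def marginal_effect(x):
    """dE[Y|X]/dX = beta1 * P(Y=1|X) * (1 - P(Y=1|X))."""
    return beta1 * p(x) * (1.0 - p(x))

for x in [0.0, 1.25, 3.0]:
    print(f"X={x}: P={p(x):.3f}, marginal effect={marginal_effect(x):.4f}")
```

Note that \(P(1-P)\) is maximized at \(P = 0.5\) (here, at \(X = 1.25\), where \(\beta_0 + \beta_1 X = 0\)), so the marginal effect is largest near the middle of the probability range and shrinks in the tails.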

  3. Interaction Effects and Nonlinear Terms

When models include interactions (e.g., \(X_1 X_2\)) or transformations (e.g., \(\log(X)\) or \(X^2\)), marginal effects become more complex. For example, in the quadratic specification:

\[ E[Y|X] = \beta_0 + \beta_1 X + \beta_2 X^2, \]

the marginal effect of \(X\) is:

\[ \frac{\partial E[Y|X]}{\partial X} = \beta_1 + 2\beta_2 X. \]

This means the marginal effect depends on the value of \(X\), and it can even change sign as \(X\) crosses the turning point \(-\beta_1 / (2\beta_2)\).
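A minimal sketch of this dependence, again with hypothetical coefficient values: the marginal effect \(\beta_1 + 2\beta_2 X\) is positive below the turning point \(-\beta_1/(2\beta_2)\), zero at it, and negative above it.

```python
# Hypothetical coefficients for a quadratic specification
beta0, beta1, beta2 = 1.0, 2.0, -0.5

def marginal_effect(x):
    """dE[Y|X]/dX = beta1 + 2 * beta2 * X."""
    return beta1 + 2.0 * beta2 * x

turning_point = -beta1 / (2.0 * beta2)  # where the effect changes sign

for x in [0.0, turning_point, 4.0]:
    print(f"X={x}: marginal effect = {marginal_effect(x)}")
```

With \(\beta_1 = 2\) and \(\beta_2 = -0.5\), the turning point is \(X = 2\): below it a one-unit increase in \(X\) raises \(E[Y|X]\), above it the same increase lowers it, which is exactly why a single summary number can be misleading for nonlinear specifications.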