3.6 Three versions of \(b_1\)

We’ve just shown that you can think of \(b_1\) as a weighted average of the response values – weighted by where the \(x\) values are relative to \(\bar{x}\):

\[b_1 = \sum_{i} \left( \frac{x_i-\bar{x}}{S_{xx}} \right) y_i\]

The original version of \(b_1\) that we mentioned looked different:

\[b_1 = r\frac{s_y}{s_x}\]

In that formulation, we can think of \(b_1\) as the correlation between \(X\) and \(Y\), but “scaled to match” the units of \(X\) and \(Y\) by taking their standard deviations into account. (In fact, if \(X\) and \(Y\) are both standardized so that \(s_x = s_y = 1\), the slope is just \(r\).)

There’s a third way to write \(b_1\) as well:

\[b_1 = \frac{S_{xy}}{S_{xx}}\]

This one makes it look like \(b_1\) is related to the covariance of \(X\) and \(Y\), but scaled relative to the amount of variance in \(X\). (In fact, dividing the top and bottom each by \(n-1\) shows that \(b_1\) is exactly the sample covariance of \(X\) and \(Y\) divided by the sample variance of \(X\).)
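If you want to convince yourself that all three versions really give the same number, here’s a quick sketch in Python. (The data values are just made up for illustration, and it assumes you have numpy available; none of this comes from the text.)

```python
import numpy as np

# A tiny made-up dataset, just to check that the three formulas agree
x = np.array([1.0, 2.0, 4.0, 5.0, 7.0])
y = np.array([2.1, 3.9, 6.2, 8.8, 11.5])

x_bar, y_bar = x.mean(), y.mean()
S_xx = np.sum((x - x_bar) ** 2)
S_xy = np.sum((x - x_bar) * (y - y_bar))

# Version 1: weighted combination of the response values
b1_v1 = np.sum(((x - x_bar) / S_xx) * y)

# Version 2: correlation rescaled by the standard deviations
r = np.corrcoef(x, y)[0, 1]
b1_v2 = r * y.std(ddof=1) / x.std(ddof=1)

# Version 3: "covariance over variance" form
b1_v3 = S_xy / S_xx

print(b1_v1, b1_v2, b1_v3)  # all three agree (up to floating-point error)
```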

I know I said I didn’t really care about \(b_0\), but in case you’re wondering (and can’t find this in your old stats notes), one way to write the least-squares intercept \(b_0\) is:

\[ b_0 = \bar{y} - b_1 \bar{x}\]

This is mildly interesting as well: the intercept relates to the actual means of \(X\) and \(Y\), as well as the relationship between them.
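As a quick sanity check, here’s a minimal sketch (same made-up data as above, again assuming numpy) comparing this formula to numpy’s own least-squares fit:

```python
import numpy as np

# Same made-up data as in the sketch above
x = np.array([1.0, 2.0, 4.0, 5.0, 7.0])
y = np.array([2.1, 3.9, 6.2, 8.8, 11.5])

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

# Cross-check against numpy's least-squares line (np.polyfit returns [slope, intercept])
slope, intercept = np.polyfit(x, y, deg=1)
print(b0, intercept)  # intercepts agree
print(b1, slope)      # slopes agree
```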

You may also recognize this as related to the old “point-slope form” for defining a line – what is the point we know the line must pass through?
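One way to see the answer: substituting \(b_0 = \bar{y} - b_1\bar{x}\) into the fitted line and rearranging gives

\[\hat{y} = b_0 + b_1 x = (\bar{y} - b_1 \bar{x}) + b_1 x = \bar{y} + b_1(x - \bar{x}),\]

which is exactly point-slope form: the least-squares line always passes through the point \((\bar{x}, \bar{y})\).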

They’re all true! You can get from any of these formulations to the others with a bit of algebra (sketched below). What they all have in common is the idea that the slope coefficient reflects both the relationship between \(X\) and \(Y\) and the amount of variation, or scale, in \(X\) and \(Y\). (That’s why \(b_1\) changes if you change the units of your variables!) Depending on the situation, you may find it helpful to think about the slope in any of these ways :)
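Here’s one way that algebra goes, using the usual definitions \(S_{xy} = \sum_i (x_i-\bar{x})(y_i-\bar{y})\), \(r = S_{xy}/\sqrt{S_{xx}S_{yy}}\), and \(s_x^2 = S_{xx}/(n-1)\) (with \(s_y^2\) defined the same way). Starting from the covariance-style version:

\[b_1 = \frac{S_{xy}}{S_{xx}} = \frac{\sum_i (x_i-\bar{x})(y_i-\bar{y})}{S_{xx}} = \sum_i \left(\frac{x_i-\bar{x}}{S_{xx}}\right) y_i,\]

since \(\sum_i (x_i-\bar{x})\,\bar{y} = \bar{y}\sum_i (x_i-\bar{x}) = 0\); that recovers the weighted version from the top of this section. And

\[\frac{S_{xy}}{S_{xx}} = \frac{S_{xy}}{\sqrt{S_{xx}S_{yy}}} \cdot \frac{\sqrt{S_{yy}}}{\sqrt{S_{xx}}} = r \cdot \frac{\sqrt{S_{yy}/(n-1)}}{\sqrt{S_{xx}/(n-1)}} = r\,\frac{s_y}{s_x},\]

which recovers the correlation version.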