Appendix
The next theorem is a generalization of Fisher’s Theorem (Theorem 2.2) and a key result in statistical inference.
Theorem 2.6 Let \(\boldsymbol{X}=(X_1,\ldots,X_n)'\) be a vector of iid rv’s distributed as \(\mathcal{N}(0,\sigma^2).\) We define the linear combinations
\[\begin{align*} Z_i=\boldsymbol{c}_i' \boldsymbol{X}, \quad i=1,\ldots,p\leq n-1, \end{align*}\]
where the vectors \(\boldsymbol{c}_i\in \mathbb{R}^n\) are orthonormal, that is
\[\begin{align*} \boldsymbol{c}_i'\boldsymbol{c}_j=\begin{cases} 0 & \text{if}\ i\neq j,\\ 1 & \text{if}\ i=j. \end{cases} \end{align*}\]
Then,
\[\begin{align*} Y=\boldsymbol{X}'\boldsymbol{X}-\sum_{i=1}^p Z_i^2 \end{align*}\]
is independent of \(Z_1,\ldots,Z_p\) and, in addition,
\[\begin{align*} \frac{Y}{\sigma^2}\sim \chi_{n-p}^2. \end{align*}\]
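Before the proof, the theorem can be illustrated numerically. The following is a minimal Monte Carlo sketch in Python (NumPy), where the values of \(n,\) \(p,\) \(\sigma,\) and the choice of orthonormal vectors \(\boldsymbol{c}_i\) are arbitrary, made only for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n, p, sigma = 10, 3, 2.0
reps = 50_000

# Illustrative orthonormal vectors c_1, ..., c_p: the QR factorization of a
# random n x p matrix yields a Q factor with orthonormal columns.
C_p, _ = np.linalg.qr(rng.standard_normal((n, p)))

X = sigma * rng.standard_normal((reps, n))   # each row is a draw of X ~ N_n(0, sigma^2 I_n)
Z = X @ C_p                                   # row r holds (Z_1, ..., Z_p) for draw r
Y = np.sum(X**2, axis=1) - np.sum(Z**2, axis=1)

# Y / sigma^2 should be chi^2_{n-p}: mean n - p = 7 and variance 2(n - p) = 14.
print(np.mean(Y / sigma**2), np.var(Y / sigma**2))

# Independence implies zero correlation between Y and each Z_i.
print(np.corrcoef(np.column_stack([Y, Z]), rowvar=False)[0, 1:])
```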
Proof (Theorem 2.6). Select \(n-p\) vectors \(\boldsymbol{c}_{p+1},\ldots,\boldsymbol{c}_n\) so that \(\{\boldsymbol{c}_1,\ldots,\boldsymbol{c}_n\}\) forms an orthonormal basis of \(\mathbb{R}^n.\) Define the \(n\times n\) matrix whose columns are these vectors,
\[\begin{align*} \boldsymbol{C}=\begin{pmatrix} \boldsymbol{c}_1 & \cdots & \boldsymbol{c}_n \end{pmatrix}, \end{align*}\]
which satisfies \(\boldsymbol{C}'\boldsymbol{C}=\boldsymbol{I}_n,\) where \(\boldsymbol{I}_n\) denotes the identity matrix of size \(n\) (that is, \(\boldsymbol{C}\) is an orthogonal matrix).
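Numerically, such a completion can be obtained from an orthonormal basis of the orthogonal complement of \(\mathrm{span}\{\boldsymbol{c}_1,\ldots,\boldsymbol{c}_p\}.\) A minimal sketch, assuming SciPy is available (its `null_space` returns exactly such a basis):

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)
n, p = 6, 2

# Illustrative orthonormal c_1, ..., c_p.
C_p, _ = np.linalg.qr(rng.standard_normal((n, p)))

# null_space(C_p.T) returns an orthonormal basis of the orthogonal complement
# of span{c_1, ..., c_p}; its columns serve as c_{p+1}, ..., c_n.
C = np.hstack([C_p, null_space(C_p.T)])

print(np.allclose(C.T @ C, np.eye(n)))  # True: C'C = I_n, so C is orthogonal
```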
Define the vector
\[\begin{align*} \boldsymbol{Z}=\boldsymbol{C}'\boldsymbol{X}=(\boldsymbol{c}_1'\boldsymbol{X},\ldots, \boldsymbol{c}_p'\boldsymbol{X},\ldots, \boldsymbol{c}_n'\boldsymbol{X})'=(Z_1,\ldots, Z_p,\ldots, Z_n)'. \end{align*}\]
Then:
\[\begin{align*} \mathbb{E}[\boldsymbol{Z}]=\boldsymbol{0}, \ \mathbb{V}\mathrm{ar}[\boldsymbol{Z}]=\boldsymbol{C}' \mathbb{V}\mathrm{ar}[\boldsymbol{X}] \boldsymbol{C}=\sigma^2 \boldsymbol{C}'\boldsymbol{C} =\sigma^2 \boldsymbol{I}_n. \end{align*}\]
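This covariance identity is easy to check empirically; a quick sketch with arbitrary \(n\) and \(\sigma\):

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma, reps = 5, 1.5, 200_000

# An arbitrary orthogonal matrix C.
C, _ = np.linalg.qr(rng.standard_normal((n, n)))

X = sigma * rng.standard_normal((reps, n))  # rows are draws of X
Z = X @ C                                    # row r is Z' = X'C, that is, Z = C'X

# The empirical covariance of Z should be close to sigma^2 I_5 = 2.25 I_5.
print(np.round(np.cov(Z, rowvar=False), 2))
```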
Therefore, since \(Z_1,\ldots,Z_n\) are jointly normal (\(\boldsymbol{Z}\) is a linear transformation of the normal vector \(\boldsymbol{X}\)) and uncorrelated, they are independent (this is easy to see using the density version of (1.8) and (1.9)). Moreover, solving \(\boldsymbol{Z}=\boldsymbol{C}'\boldsymbol{X}\) for \(\boldsymbol{X}\) gives \(\boldsymbol{X}=\boldsymbol{C}\boldsymbol{Z},\) because \(\boldsymbol{C}\boldsymbol{C}'=\boldsymbol{I}_n.\) Using this and \(\boldsymbol{C}'\boldsymbol{C}=\boldsymbol{I}_n,\) we get
\[\begin{align*} \boldsymbol{X}'\boldsymbol{X}=\boldsymbol{Z}'\boldsymbol{C}'\boldsymbol{C} \boldsymbol{Z}=\boldsymbol{Z}'\boldsymbol{Z}. \end{align*}\]
Substituting this equality into the definition of \(Y,\) it follows that
\[\begin{align*} Y=\boldsymbol{Z}'\boldsymbol{Z}-\sum_{i=1}^p Z_i^2=\sum_{i=1}^n Z_i^2-\sum_{i=1}^p Z_i^2=\sum_{i=p+1}^n Z_i^2. \end{align*}\]
Therefore, \(Y=\sum_{i=p+1}^n Z_i^2\) is a function of \(Z_{p+1},\ldots,Z_n\) alone and hence independent of \(Z_1,\ldots,Z_p.\) Also, since \(Z_i/\sigma\sim \mathcal{N}(0,1)\) for \(i=p+1,\ldots,n,\) Corollary 2.1 gives
\[\begin{align*} \frac{Y}{\sigma^2}=\sum_{i=p+1}^n \frac{Z_i^2}{\sigma^2}\sim \chi_{n-p}^2. \end{align*}\]
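The distributional claim can also be checked with a formal goodness-of-fit test; a minimal sketch using a Kolmogorov–Smirnov test from `scipy.stats`, again with arbitrary parameter values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n, p, sigma, reps = 8, 2, 0.5, 20_000

C_p, _ = np.linalg.qr(rng.standard_normal((n, p)))  # orthonormal columns
X = sigma * rng.standard_normal((reps, n))
Y = np.sum(X**2, axis=1) - np.sum((X @ C_p)**2, axis=1)

# Kolmogorov-Smirnov test of Y / sigma^2 against the chi^2_{n-p} cdf;
# a large p-value is consistent with the claimed distribution.
print(stats.kstest(Y / sigma**2, stats.chi2(df=n - p).cdf))
```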