# Chapter 5 Confidence intervals

## 5.1 The pivotal quantity method

**Definition 5.1 (Confidence interval) **Let \(X\) be a r.v. with induced probability \(\mathbb{P}_\theta\), \(\theta\in \Theta\), where \(\Theta\subset \mathbb{R}\). Let \((X_1,\ldots,X_n)\) be a s.r.s. of \(X\). Let \(T_1=T_1(X_1,\ldots,X_n)\) and \(T_2=T_2(X_1,\ldots,X_n)\) be two unidimensional statistics such that \[\begin{align} \mathbb{P}_{\theta}(T_1\leq \theta\leq T_2)\geq 1-\alpha, \ \forall \theta\in\Theta.\tag{5.1} \end{align}\] Then, the interval \([T_1(x_1,\ldots,x_n),T_2(x_1,\ldots,x_n)]\) obtained for any sample realization \((x_1,\ldots,x_n)\) is referred to as a *confidence interval for \(\theta\) at the confidence level \(1-\alpha\)*. This interval is often denoted by \(\mathrm{CI}_{100(1-\alpha)\%}\).

The value \(\alpha\) is referred to as the *significance level*. \(T_1\) and \(T_2\) are known as the *lower* and the *upper limits of the confidence interval* for \(\theta\), respectively. Sometimes the interest lies in only one of these limits.

**Definition 5.2 (Pivot) **A *pivot* \(Z(\theta)=Z(\theta;X_1,\ldots,X_n)\) is a function of the sample \(X_1,\ldots,X_n\) and the unknown parameter \(\theta\) that is bijective in \(\theta\) and has a completely known probability distribution.

The pivotal quantity method for obtaining a confidence interval consists in, once the significance level \(\alpha\) in (5.1) has been fixed, finding a pivot \(Z(\theta)\) and, using the pivot’s distribution, selecting two constants \(c_1\) and \(c_2\) such that
\[
\mathbb{P}(c_1\leq Z(\theta)\leq c_2)\geq 1-\alpha.
\]
Then, solving for \(\theta\) in the inequalities
\[
Z(\theta)\leq c_2\quad\text{and} \quad Z(\theta)\geq c_1
\]
we obtain equivalent inequalities of the form
\[
\theta\leq T_2\quad\text{and} \quad \theta\geq T_1.
\]
For example, if \(Z\) is increasing, then \(T_1=Z^{-1}(c_1)\) and \(T_2=Z^{-1}(c_2)\), whereas if \(Z\) is decreasing, then \(T_1=Z^{-1}(c_2)\) and \(T_2=Z^{-1}(c_1)\). Then, \([T_1,T_2]\) is a confidence interval for \(\theta\) at level \(1-\alpha\).

Usually, the pivot \(Z(\theta)\) can be constructed from an estimator \(\hat\theta\) of \(\theta\). Assume this estimator has a known distribution, but one that depends on \(\theta\). Then, applying a transformation (that involves \(\theta\)) to \(\hat\theta\), namely \(\hat\theta'\), such that the distribution of \(\hat\theta'\) does not depend on \(\theta\), we find that \(\hat\theta'\) is a pivot for \(\theta\).

**Example 5.1 **Assume that we have a single observation \(X\) of an \(\mathrm{Exp}(1/\theta)\) r.v. Employ \(X\) to construct a confidence interval for \(\theta\) with confidence level \(0.90\).

We have a s.r.s. of size one and we need to find a pivot for \(\theta\), that is, a function of \(X\) and \(\theta\) whose distribution is completely known. The p.d.f. and the m.g.f. of \(X\) are given by

\[ f_X(x)=\frac{1}{\theta}e^{-x/\theta}, \quad x\geq 0, \quad m_X(s)=(1-\theta s)^{-1}. \]

Then, taking \(Z=X/\theta\), the m.g.f. of \(Z\) is given by

\[ m_Z(s)=m_{X/\theta}(s)=m_{X}(s/\theta)=\left(1-\theta \frac{s}{\theta}\right)^{-1}=(1-s)^{-1}. \]

\(m_Z\) does not depend on \(\theta\) and, in addition, is the m.g.f. of an \(\mathrm{Exp}(1)\) r.v. with p.d.f.

\[ f_{Z}(z)=e^{-z}, \quad z\geq 0. \]

Then, we need to find two constants \(c_1\) and \(c_2\) such that

\[ \mathbb{P}(c_1\leq Z\leq c_2)\geq 0.90. \]

We know that

\[\begin{align*} \mathbb{P}(Z\leq c_1)&=\int_{0}^{c_1} e^{-z}\,\mathrm{d}z=1-e^{-c_1},\\ \mathbb{P}(Z\geq c_2)&=\int_{c_2}^{\infty} e^{-z}\,\mathrm{d}z=e^{-c_2}. \end{align*}\]

Splitting the probability \(0.10\) evenly in two, we set

\[ 1-e^{-c_1}=0.05, \quad e^{-c_2}=0.05. \]

Solving for \(c_1\) and \(c_2\), we obtain

\[ c_1=-\log(0.95)=0.051,\quad c_2=-\log(0.05)=2.996. \]

Therefore, it holds that

\[ \mathbb{P}(0.051\leq X/\theta\leq 2.996)=0.9. \]

Solving for \(\theta\) in the inequalities, we have

\[ \theta\leq X/0.051, \quad \theta\geq X/2.996 \]

and therefore

\[ \mathbb{P}(X/2.996\leq \theta\leq X/0.051)=0.9, \]

so the confidence interval for \(\theta\) at confidence level \(0.90\) is

\[ \mathrm{CI}_{0.90}(\theta)=[X/2.996,X/0.051]. \]
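In R, the constants \(c_1\) and \(c_2\) and the resulting interval can be computed as follows (a sketch; the observed value `x` is illustrative):

```
# Z = X / theta ~ Exp(1), so c_1 and c_2 are quantiles of Exp(1)
alpha <- 0.10
c_1 <- qexp(alpha / 2)      # -log(0.95)
c_2 <- qexp(1 - alpha / 2)  # -log(0.05)
round(c(c_1, c_2), 3)
## [1] 0.051 2.996
# CI for theta given an observed x (illustrative value)
x <- 1
c(x / c_2, x / c_1)
```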

## 5.2 Confidence intervals for normal populations

We assume in this section the r.v. \(X\) follows a \(\mathcal{N}(\mu,\sigma^2)\) distribution from which a s.r.s. \((X_1,\ldots,X_n)\) is extracted.

### 5.2.1 Confidence interval for \(\mu\) with known \(\sigma^2\)

Let’s fix the significance level \(\alpha\). To apply the pivotal quantity method, we need an estimator of \(\mu\) with known distribution. For example, \(\hat\mu=\bar{X}\), which verifies \[ \bar{X}\sim\mathcal{N}(\mu,\sigma^2/n). \] A transformation that removes \(\mu\) from the distribution of \(\bar{X}\) is simply \[ Z=\bar{X}-\mu\sim \mathcal{N}(0,\sigma^2/n). \]

However, a more practical pivot is

\[\begin{align} Z=\frac{\bar{X}-\mu}{\sigma/\sqrt{n}}\sim\mathcal{N}(0,1) \tag{5.2} \end{align}\]

since its distribution does not depend on \(\sigma\).

Now we must find the constants \(c_1\) and \(c_2\) such that \[ \mathbb{P}(c_1\leq Z\leq c_2)=1-\alpha. \]

If we split the probability \(\alpha\) evenly between the two tails of the distribution, then \(c_1\) and \(c_2\) are the constants that verify

\[ F_Z(c_1)=\alpha/2\quad \text{and} \quad F_Z(c_2)=1-\alpha/2, \] that is, \(c_1\) is the lower \(\alpha/2\)-quantile of \(\mathcal{N}(0,1)\) and \(c_2\) is the lower \((1-\alpha/2)\)-quantile of the same distribution.

**Definition 5.3 (Critical value) **A *critical value* \(\alpha\) of a r.v. \(X\) is the value of the variable that accumulates \(\alpha\) probability to its right or, in other words, the upper \(\alpha\)-quantile of \(X\). It is denoted by \(x_{\alpha}\) and is such that
\[
\mathbb{P}(X\geq x_{\alpha})=\alpha.
\]

Then, the constants \(c_1\) and \(c_2\) verify \[ \mathbb{P}(Z\geq c_1)=1-\alpha/2\quad \text{and} \quad \mathbb{P}(Z\geq c_2)=\alpha/2, \] that is, \(c_1=z_{1-\alpha/2}\) and \(c_2=z_{\alpha/2}\). Given the symmetry of the standard normal distribution with respect to \(0\), it is verified that \(z_{1-\alpha/2}=-z_{\alpha/2}\), that is, \(c_1=-c_2\). Example 2.7 illustrates how to compute the critical values of a \(\mathcal{N}(0,1)\).

Once \(c_1=-z_{\alpha/2}\) and \(c_2=z_{\alpha/2}\) are obtained, we solve for \(\mu\) in the inequalities inside the probability:

\[\begin{align*} 1-\alpha &=\mathbb{P}\left(-z_{\alpha/2}\leq \frac{\bar{X}-\mu}{\sigma/\sqrt{n}}\leq z_{\alpha/2}\right)\\ &=\mathbb{P}\left(-z_{\alpha/2}\frac{\sigma}{\sqrt{n}}\leq \bar{X}-\mu\leq z_{\alpha/2}\frac{\sigma}{\sqrt{n}}\right)\\ &=\mathbb{P}\left(-\bar{X}-z_{\alpha/2}\frac{\sigma}{\sqrt{n}}\leq -\mu\leq -\bar{X}+z_{\alpha/2}\frac{\sigma}{\sqrt{n}}\right) \\ &=\mathbb{P}\left(\bar{X}-z_{\alpha/2}\frac{\sigma}{\sqrt{n}}\leq \mu\leq \bar{X}+z_{\alpha/2}\frac{\sigma}{\sqrt{n}}\right). \end{align*}\]

Therefore, a confidence interval for \(\mu\) is \[ \mathrm{CI}_{1-\alpha}(\mu)=\left[\bar{X}-z_{\alpha/2}\frac{\sigma}{\sqrt{n}}, \bar{X}+z_{\alpha/2}\frac{\sigma}{\sqrt{n}}\right], \] which we represent compactly as \[ \mathrm{CI}_{1-\alpha}(\mu)=\bar{X}\mp z_{\alpha/2}\frac{\sigma}{\sqrt{n}}. \]

**Example 5.2 **A gunpowder manufacturer developed a new formula that was tested in eight bullets. The resultant initial velocities, measured in feet per second, were

\[\begin{align*} & 3005, \quad 2925, \quad 2935, \quad 2965, \\ & 2995, \quad 3005, \quad 2937, \quad 2905. \end{align*}\]

Assuming that the initial velocities have normal distribution with \(\sigma=39\) feet per second, find a confidence interval at significance level \(\alpha=0.05\) for the mean initial velocity of the bullets that employ the new gunpowder.

We know that \[ X\sim \mathcal{N}(\mu,39^2), \ \text{with unknown $\mu$}. \] On the other hand, from the sample we have that \[ n=8,\quad \bar{X}=2959,\quad \alpha/2=0.025, \quad z_{\alpha/2}=1.96. \] Then, a confidence interval for \(\mu\) at \(0.95\) confidence is \[ \mathrm{CI}_{0.95}(\mu)=\bar{X}\mp z_{\alpha/2} \frac{\sigma}{\sqrt{n}}=2959\mp 1.96\frac{39}{\sqrt{8}}=[2931.97,2986.03]. \]

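In R, the previous computations can be done as follows (a sketch in base R):

```
# Sample of initial velocities
X <- c(3005, 2925, 2935, 2965, 2995, 3005, 2937, 2905)
# n, Xbar, sigma, alpha, z_{alpha/2}
n <- length(X)
X_bar <- mean(X)
sigma <- 39
alpha <- 0.05
z <- qnorm(alpha / 2, lower.tail = FALSE)
# CI
X_bar + c(-1, 1) * z * sigma / sqrt(n)
## [1] 2931.975 2986.025
```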
### 5.2.2 Confidence interval for \(\mu\) with unknown \(\sigma^2\)

In this case (5.2) is not a pivot: besides depending on \(\mu\), it also depends on another unknown parameter, \(\sigma^2\), which makes it impossible to bound \(\mu\) between two computable limits. However, we can estimate \(\sigma^2\) unbiasedly with \(S'^2\) and produce

\[ T=\frac{\bar{X}-\mu}{S'/\sqrt{n}}. \]

\(T\) is a pivot, since Theorem 2.3 provides the \(\mu\)-free distribution of \(T\), \(t_{n-1}\). If we split the significance level \(\alpha\) evenly between the two tails of the distribution, then the constants \(c_1\) and \(c_2\) in \[ \mathbb{P}(c_1\leq T\leq c_2)=1-\alpha \] correspond to the critical values \(c_1=t_{n-1;1-\alpha/2}\) and \(c_2=t_{n-1;\alpha/2}\). Since Student’s \(t\) is also symmetric, \(c_1=-c_2=-t_{n-1;\alpha/2}\), hence \[ 1-\alpha=\mathbb{P}\left(-t_{n-1;\alpha/2}\leq \frac{\bar X-\mu}{S'/\sqrt{n}}\leq t_{n-1;\alpha/2}\right). \] Example 2.10 illustrates how to compute probabilities in a Student’s \(t\) distribution.

Then, solving for \(\mu\) in the inequalities, we get \[ 1-\alpha=\mathbb{P}\left(\bar{X}-t_{n-1;\alpha/2}{S'/\sqrt{n}}\leq\mu\leq \bar X+t_{n-1;\alpha/2}{S'/\sqrt{n}}\right). \] Then, a confidence interval for the mean \(\mu\) at confidence level \(1-\alpha\) is \[ \mathrm{CI}_{1-\alpha}(\mu)=\bar{X}\mp t_{n-1;\alpha/2}\frac{S'}{\sqrt{n}}. \]

**Example 5.3 **Compute the same confidence interval asked for in Example 5.2, but assuming that the variance of the initial velocities is unknown.

If the variance is unknown, then we can estimate it with the quasivariance, \(S'^2=1528.0\), that is, \(S'=39.1\). Then, the confidence interval for \(\mu\) is \[ \mathrm{CI}_{0.95}(\mu)=\bar{X}\mp t_{7;0.025}\frac{S'}{\sqrt{n}}=2959\mp 2.365\frac{39.1}{\sqrt{8}}=[2926.31,2991.69]. \]

In R, the previous computations can be simply done as:
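A sketch in base R, reusing the velocity data from Example 5.2:

```
# Sample of initial velocities
X <- c(3005, 2925, 2935, 2965, 2995, 3005, 2937, 2905)
# n, Xbar, S', alpha, t_{n-1; alpha/2}
n <- length(X)
X_bar <- mean(X)
S_prime <- sd(X)  # sd() computes the quasi-standard deviation S'
alpha <- 0.05
t <- qt(alpha / 2, df = n - 1, lower.tail = FALSE)
# CI
X_bar + c(-1, 1) * t * S_prime / sqrt(n)
```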

### 5.2.3 Confidence interval for \(\sigma^2\)

We already have an unbiased estimator of \(\sigma^2\), the quasivariance \(S'^2\). In addition, by Theorem 2.2 we have a pivot \[ U=\frac{(n-1)S'^2}{\sigma^2}\sim \chi_{n-1}^2. \] Then, we only need to compute the constants \(c_1\) and \(c_2\) such that \[ \mathbb{P}(c_1\leq U\leq c_2)=1-\alpha. \] Splitting the probability \(\alpha\) evenly to both sides of the \(\chi^2\) distribution, we have that \(c_1=\chi_{n-1;1-\alpha/2}^2\) and \(c_2=\chi_{n-1;\alpha/2}^2\). Once the constants are computed, solving for \(\sigma^2\) in the inequalities

\[\begin{align*} 1-\alpha &=\mathbb{P}\left(\chi_{n-1;1-\alpha/2}^2\leq \frac{(n-1)S'^2}{\sigma^2} \leq \chi_{n-1;\alpha/2}^2\right)\\ &=\mathbb{P}\left(\frac{\chi_{n-1;1-\alpha/2}^2}{(n-1)S'^2}\leq \frac{1}{\sigma^2} \leq \frac{\chi_{n-1;\alpha/2}^2}{(n-1)S'^2}\right) \\ &=\mathbb{P}\left(\frac{(n-1)S'^2}{\chi_{n-1;\alpha/2}^2}\leq \sigma^2 \leq \frac{(n-1)S'^2}{\chi_{n-1;1-\alpha/2}^2}\right) \end{align*}\]

yields the confidence interval for \(\sigma^2\):

\[ \mathrm{CI}_{1-\alpha}(\sigma^2)=\left[\frac{(n-1)S'^2}{\chi_{n-1;\alpha/2}^2}, \frac{(n-1)S'^2}{\chi_{n-1;1-\alpha/2}^2}\right]. \] Note that \(\chi_{n-1;1-\alpha/2}^2<\chi_{n-1;\alpha/2}^2\), but in the confidence interval these critical values appear in the denominators in reverse order.

**Example 5.4 **A practitioner wants to verify the variability of the equipment employed for measuring the volume of an audio source. Three independent measurements recorded with this equipment were

\[ 4.1,\quad 5.2, \quad 10.2. \]

Assuming that the measurements have a normal distribution, obtain the confidence interval for \(\sigma^2\) with confidence \(0.90\).

From the three measurements we obtain

\[ S'^2=10.57,\quad \alpha/2=0.05,\quad n=3. \]

Then, the confidence interval for \(\sigma^2\) is

\[ \mathrm{CI}_{0.90}(\sigma^2)=\left[\frac{(n-1)S'^2}{\chi_{2;0.05}^2},\frac{(n-1)S'^2}{\chi_{2;0.95}^2}\right] =\left[\frac{2(10.57)}{5.991},\frac{2(10.57)}{0.103}\right]=[3.53,205.24]. \]

Note that, since \(n\) is small, the lower critical value \(\chi_{2;0.95}^2\) is close to zero, and hence the length of the interval is large, illustrating the little information available.

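In R, the previous computations can be done as follows (a sketch; the small difference with the hand computation is due to the rounding of \(c_1\)):

```
# Measurements
X <- c(4.1, 5.2, 10.2)
# n, S'^2, alpha, c_1 = chi^2_{2;0.95}, c_2 = chi^2_{2;0.05}
n <- length(X)
S2_prime <- var(X)  # var() computes the quasivariance S'^2
alpha <- 0.10
c_1 <- qchisq(1 - alpha / 2, df = n - 1, lower.tail = FALSE)
c_2 <- qchisq(alpha / 2, df = n - 1, lower.tail = FALSE)
# CI
(n - 1) * S2_prime / c(c_2, c_1)
```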
### 5.2.4 Confidence interval for the difference of means with known variances

Assume that we have a population \(X_1\sim \mathcal{N}(\mu_1,\sigma_1^2)\) where the population mean \(\mu_1\) is unknown and the variance \(\sigma_1^2\) is known. Similarly, let \(X_2\sim \mathcal{N}(\mu_2,\sigma_2^2)\) be another population, independent of \(X_1\), with unknown \(\mu_2\) and known \(\sigma_2^2\). Let \((X_{11},\ldots,X_{1n_1})\) and \((X_{21},\ldots,X_{2n_2})\) be two s.r.s.’s of sizes \(n_1\) and \(n_2\) coming from \(X_1\) and \(X_2\), respectively. We wish to construct a confidence interval for the difference \(\theta=\mu_1-\mu_2\).

For that, we consider an estimator of \(\theta\). An unbiased estimator is \[ \hat\theta=\bar{X}_1-\bar{X}_2. \] Since \(\hat\theta\) is a linear combination of independent normal r.v.’s, then it has a normal distribution. Its mean is easily computed as \[ \mathbb{E}[\hat\theta]=\mathbb{E}[\bar{X}_1]- \mathbb{E}[\bar{X}_2]=\mu_1-\mu_2. \] Thanks to the independence between \(X_1\) and \(X_2\), the variance is \[ \mathbb{V}\mathrm{ar}[\hat\theta]=\mathbb{V}\mathrm{ar}[\bar{X}_1]+\mathbb{V}\mathrm{ar}[\bar X_2]=\frac{\sigma_1^2}{n_1}+\frac{\sigma_2^2}{n_2}. \] Then, a possible pivot is \[ Z=\frac{\bar{X}_1-\bar X_2-(\mu_1-\mu_2)}{\sqrt{\frac{\sigma_1^2}{n_1}+\frac{\sigma_2^2}{n_2}}}\sim \mathcal{N}(0,1). \]

Again, if we split evenly the significance level \(\alpha\) between the two tails, then we look for constants \(c_1\) and \(c_2\) such that \[ \mathbb{P}(c_1\leq Z\leq c_2)=1-\alpha, \] that is, \(c_1=-z_{\alpha/2}\) and \(c_2=z_{\alpha/2}\). Solving for \(\mu_1-\mu_2\), we obtain

\[\begin{align*} 1-\alpha &=\mathbb{P}\left(-z_{\alpha/2}\leq \frac{\bar{X}_1-\bar X_2-(\mu_1-\mu_2)}{\sqrt{\frac{\sigma_1^2}{n_1} +\frac{\sigma_2^2}{n_2}}}\leq z_{\alpha/2}\right) \\ &=\mathbb{P}\left(-z_{\alpha/2}\sqrt{\frac{\sigma_1^2}{n_1} +\frac{\sigma_2^2}{n_2}}\leq \bar{X}_1-\bar X_2-(\mu_1-\mu_2)\leq z_{\alpha/2}\sqrt{\frac{\sigma_1^2}{n_1} +\frac{\sigma_2^2}{n_2}}\right)\\ &=\mathbb{P}\left(-(\bar{X}_1-\bar{X}_2)-z_{\alpha/2}\sqrt{\frac{\sigma_1^2}{n_1} +\frac{\sigma_2^2}{n_2}}\leq -(\mu_1-\mu_2)\leq -(\bar{X}_1-\bar{X}_2) +z_{\alpha/2}\sqrt{\frac{\sigma_1^2}{n_1}+\frac{\sigma_2^2}{n_2}}\right)\\ &=\mathbb{P}\left(\bar{X}_1-\bar{X}_2-z_{\alpha/2}\sqrt{\frac{\sigma_1^2}{n_1} +\frac{\sigma_2^2}{n_2}}\leq \mu_1-\mu_2\leq \bar{X}_1-\bar{X}_2 +z_{\alpha/2}\sqrt{\frac{\sigma_1^2}{n_1}+\frac{\sigma_2^2}{n_2}}\right). \end{align*}\]

Therefore, a confidence interval for \(\mu_1-\mu_2\) at the confidence level \(1-\alpha\) is

\[ \mathrm{CI}_{1-\alpha}(\mu_1-\mu_2)=\bar{X}_1-\bar{X}_2 \mp z_{\alpha/2}\sqrt{\frac{\sigma_1^2}{n_1}+\frac{\sigma_2^2}{n_2}}. \]

**Example 5.5 **A new training method for an assembly operation is being tested at a factory. For that purpose, two groups of nine employees were trained during three weeks. One group was trained with the usual procedure and the other with the new method. The assembly time (in minutes) that each employee required after the training period is collected in the table below. Assuming that the assembly times are normally distributed with both variances equal to \(22\) minutes\(^2\), obtain a confidence interval at the \(0.95\) confidence level for the difference of average assembly times for the two kinds of training.

Procedure | Measurements |
---|---|
Standard | \(32 \qquad 37 \qquad 35 \qquad 28 \qquad 41 \qquad 44 \qquad 35 \qquad 31 \qquad 34\) |
New | \(35 \qquad 31 \qquad 29 \qquad 25 \qquad 34 \qquad 40 \qquad 27 \qquad 32 \qquad 31\) |

The average assembly times of the two groups are \[ \bar{X}_1=35.22,\quad \bar{X}_2=31.56. \] Then, a confidence interval for \(\mu_1-\mu_2\) at the \(0.95\) confidence level is \[\begin{align*} \mathrm{CI}_{0.95}(\mu_1-\mu_2)&=(35.22-31.56)\mp 1.96(4.69)\sqrt{1/9+1/9}\\ &=3.66\mp 4.33 =[-0.67,7.99]. \end{align*}\]

In R:

```
# Samples
X_1 <- c(32, 37, 35, 28, 41, 44, 35, 31, 34)
X_2 <- c(35, 31, 29, 25, 34, 40, 27, 32, 31)
# n1, n2, Xbar1, Xbar2, sigma2_1, sigma2_2, alpha, z_{alpha/2}
n_1 <- length(X_1)
n_2 <- length(X_2)
X_bar_1 <- mean(X_1)
X_bar_2 <- mean(X_2)
sigma2_1 <- sigma2_2 <- 22
alpha <- 0.05
z <- qnorm(alpha / 2, lower.tail = FALSE)
# CI
(X_bar_1 - X_bar_2) + c(-1, 1) * z * sqrt(sigma2_1 / n_1 + sigma2_2 / n_2)
## [1] -0.6669768 8.0003101
```

### 5.2.5 Confidence interval for the difference of means, with unknown and equal variances

Assume that we are in the situation of the previous section but now both variances \(\sigma_1^2\) and \(\sigma_2^2\) are equal and *unknown*, that is, \(\sigma_1^2=\sigma_2^2=\sigma^2\) with \(\sigma^2\) unknown.

We want to construct a confidence interval for \(\theta=\mu_1-\mu_2\). As in the previous section, an unbiased estimator is
\[
\hat\theta=\hat\mu_1-\hat\mu_2=\bar{X}_1-\bar{X}_2
\]
whose distribution is
\[
\hat\theta\sim\mathcal{N}\left(\mu_1-\mu_2,\sigma^2\left(\frac{1}{n_1}+\frac{1}{n_2}\right)\right).
\]
However, since \(\sigma^2\) is unknown, then
\[\begin{align}
Z=\frac{\bar{X}_1-\bar
X_2-(\mu_1-\mu_2)}{\sqrt{\sigma^2\left(\frac{1}{n_1}+\frac{1}{n_2}\right)}}\sim
\mathcal{N}(0,1)\tag{5.3}
\end{align}\]
is *not* a pivot. We first need to estimate \(\sigma^2\).

For that, we know that
\[
\frac{(n_1-1)S_1'^2}{\sigma^2}\sim \chi_{n_1-1}^2,\quad
\frac{(n_2-1)S_2'^2}{\sigma^2}\sim \chi_{n_2-1}^2.
\]
Besides, the two samples are independent, so by the additivity property of the \(\chi^2\), we know that
\[
\frac{(n_1-1)S_1'^2}{\sigma^2}+\frac{(n_2-1)S_2'^2}{\sigma^2}\sim
\chi_{n_1+n_2-2}^2.
\]
Taking expectations, we have
\[\begin{align*}
\mathbb{E}\left[\frac{(n_1-1)S_1'^2}{\sigma^2}+\frac{(n_2-1)S_2'^2}{\sigma^2}\right]
&=\frac{1}{\sigma^2}\mathbb{E}\left[(n_1-1)S_1'^2+(n_2-1)S_2'^2\right]\\
&=n_1+n_2-2.
\end{align*}\]
Multiplying both sides by \(\sigma^2\), we obtain
\[
\mathbb{E}\left[(n_1-1)S_1'^2+(n_2-1)S_2'^2\right]=\sigma^2(n_1+n_2-2).
\]
From here we can easily deduce an unbiased estimator for \(\sigma^2\):
\[
\hat\sigma^2=\frac{(n_1-1)S_1'^2+(n_2-1)S_2'^2}{n_1+n_2-2}\triangleq S^2.
\]
Note that \(\hat\sigma^2\) is just a *pooled* sample variance stemming from the two sample quasivariances.

In addition, we know the distribution of \(\hat\sigma^2\), since \[ \frac{(n_1+n_2-2)S^2}{\sigma^2} =\frac{(n_1-1)S_1'^2+(n_2-1)S_2'^2}{\sigma^2}\sim\chi_{n_1+n_2-2}^2. \]

If we replace \(\sigma^2\) by \(S^2\) in (5.3) and apply Theorem 2.3, we obtain the distribution

\[\begin{align*} T &=\frac{\bar{X}_1-\bar{X}_2-(\mu_1-\mu_2)} {\sqrt{S^2\left(\frac{1}{n_1}+\frac{1}{n_2}\right)}} = \frac{{\frac{\bar{X}_1-\bar X_2-(\mu_1-\mu_2)}{\sqrt{\sigma^2\left(\frac{1}{n_1}+\frac{1}{n_2}\right)}}}} {\sqrt{{\frac{(n_1+n_2-2)S^2}{\sigma^2}/(n_1+n_2-2)}}} \\ &\sim \frac{\mathcal{N}(0,1)}{\sqrt{\chi_{n_1+n_2-2}^2/(n_1+n_2-2)}}= t_{n_1+n_2-2}. \end{align*}\]

Then, solving for \(\mu_1-\mu_2\) inside the following probability, we find a confidence interval for \(\mu_1-\mu_2\):

\[\begin{align*} 1-\alpha &=\mathbb{P}\left(-t_{n_1+n_2-2;\alpha/2}\leq \frac{\bar{X}_1-\bar{X}_2-(\mu_1-\mu_2)} {\sqrt{S^2\left(\frac{1}{n_1}+\frac{1}{n_2}\right)}} \leq t_{n_1+n_2-2;\alpha/2} \right)\\ &= \mathbb{P}\left(-t_{n_1+n_2-2;\alpha/2}S\sqrt{\frac{1}{n_1}+\frac{1}{n_2}} \leq \bar{X}_1-\bar{X}_2-(\mu_1-\mu_2) \leq t_{n_1+n_2-2;\alpha/2} S\sqrt{\frac{1}{n_1}+\frac{1}{n_2}} \right)\\ &= \mathbb{P}\left(\bar{X}_1-\bar{X}_2-t_{n_1+n_2-2;\alpha/2}S\sqrt{\frac{1}{n_1}+\frac{1}{n_2}} \leq \mu_1-\mu_2 \leq \bar{X}_1-\bar{X}_2+t_{n_1+n_2-2;\alpha/2} S\sqrt{\frac{1}{n_1}+\frac{1}{n_2}} \right). \end{align*}\]

Therefore, a confidence interval for \(\mu_1-\mu_2\) at confidence level \(1-\alpha\) is \[ \mathrm{CI}_{1-\alpha}(\mu_1-\mu_2) =\bar{X}_1-\bar{X}_2\mp t_{n_1+n_2-2;\alpha/2}S\sqrt{\frac{1}{n_1}+\frac{1}{n_2}}. \]

**Example 5.6 **Compute the same interval asked for in Example 5.5, but now assuming that the assembly variances are unknown and equal for the two training methods.

The sample quasivariances for each of the methods are \[ S_1'^2=195.56/8=24.445,\quad S_2'^2=160.22/8=20.027. \] Therefore, the pooled estimated variance is \[ S^2=\frac{(8)(24.445)+(8)(20.027)}{9+9-2}=22.24, \] and the standard deviation is \(S=4.71\). Then, a confidence interval at confidence level \(0.95\) for the difference of average times is \[\begin{align*} \mathrm{CI}_{0.95}(\mu_1-\mu_2)&=(35.22-31.56)\mp t_{16,0.025}\, 4.71\sqrt{1/9+1/9}\\ &=3.66\mp 4.71=[-1.05,8.37]. \end{align*}\]

In R:

```
# Samples
X_1 <- c(32, 37, 35, 28, 41, 44, 35, 31, 34)
X_2 <- c(35, 31, 29, 25, 34, 40, 27, 32, 31)
# n1, n2, Xbar1, Xbar2, S^2, alpha, z_{alpha/2}
n_1 <- length(X_1)
n_2 <- length(X_2)
X_bar_1 <- mean(X_1)
X_bar_2 <- mean(X_2)
S2_prime_1 <- var(X_1)
S2_prime_2 <- var(X_2)
S <- sqrt(((n_1 - 1) * S2_prime_1 + (n_2 - 1) * S2_prime_2) / (n_1 + n_2 - 2))
alpha <- 0.05
t <- qt(alpha / 2, df = n_1 + n_2 - 2, lower.tail = FALSE)
# CI
(X_bar_1 - X_bar_2) + c(-1, 1) * t * S * sqrt(1 / n_1 + 1 / n_2)
## [1] -1.045706 8.379039
```

### 5.2.6 Confidence interval for the ratio of variances

Let \(X_1\sim\mathcal{N}(\mu_1,\sigma_1^2)\) and \(X_2\sim\mathcal{N}(\mu_2,\sigma_2^2)\) be two independent r.v.’s with \((X_{11},\ldots,X_{1n_1})\) and \((X_{21},\ldots,X_{2n_2})\) two s.r.s.’s of \(X_1\) and \(X_2\), respectively. Neither the means nor the variances are known. We wish to construct a confidence interval for the ratio of variances, \(\theta=\sigma_1^2/\sigma_2^2\).

In order to find a pivot, we need to consider an estimator for \(\theta\). We know that \(S_1'^2\) and \(S_2'^2\) are unbiased estimators for \(\sigma_1^2\) and \(\sigma_2^2\), respectively. Also, since both s.r.s.’s are independent, so are \(S_1'^2\) and \(S_2'^2\). In addition, by Theorem 2.2, we know that

\[ \frac{(n_1-1)S_1'^2}{\sigma_1^2}\sim \chi_{n_1-1}^2, \quad \frac{(n_2-1)S_2'^2}{\sigma_2^2}\sim \chi_{n_2-1}^2. \]

A possible estimator for \(\theta\) is \(\hat\theta=S_1'^2/S_2'^2\), but its distribution is not completely known, since it depends on \(\sigma_1^2\) and \(\sigma_2^2\). However, we do know the distribution of

\[ F=\frac{S_1'^2/\sigma_1^2}{S_2'^2/\sigma_2^2} =\frac{\frac{(n_1-1)S_1'^2}{\sigma_1^2}/(n_1-1)}{\frac{(n_2-1)S_2'^2}{\sigma_2^2}/(n_2-1)} \sim \frac{\chi_{n_1-1}^2/(n_1-1)}{\chi_{n_2-1}^2/(n_2-1)}=\mathcal{F}_{n_1-1,n_2-1}. \]

Therefore, \(F\) is a pivot. Splitting evenly the probability \(\alpha\) between both tails of the \(\mathcal{F}_{n_1-1,n_2-1}\) distribution, and solving for \(\theta\) we get

\[\begin{align*} 1-\alpha &=\mathbb{P}\left(\mathcal{F}_{n_1-1,n_2-1;1-\alpha/2} \leq \frac{S_1'^2/S_2'^2}{\sigma_1^2/\sigma_2^2} \leq \mathcal{F}_{n_1-1,n_2-1;\alpha/2}\right) \\ &=\mathbb{P}\left(\frac{S_2'^2}{S_1'^2}\mathcal{F}_{n_1-1,n_2-1;1-\alpha/2} \leq \frac{1}{\sigma_1^2/\sigma_2^2} \leq \frac{S_2'^2}{S_1'^2}\mathcal{F}_{n_1-1,n_2-1;\alpha/2}\right)\\ &=\mathbb{P}\left(\frac{S_1'^2/S_2'^2}{\mathcal{F}_{n_1-1,n_2-1;\alpha/2}} \leq \frac{\sigma_1^2}{\sigma_2^2} \leq \frac{S_1'^2/S_2'^2}{\mathcal{F}_{n_1-1,n_2-1;1-\alpha/2}}\right). \end{align*}\]

Then, a confidence interval for \(\theta=\sigma_1^2/\sigma_2^2\) is

\[ \mathrm{CI}_{1-\alpha}(\sigma_1^2/\sigma_2^2)=\left[\frac{S_1'^2/S_2'^2}{\mathcal{F}_{n_1-1,n_2-1;\alpha/2}}, \frac{S_1'^2/S_2'^2}{\mathcal{F}_{n_1-1,n_2-1;1-\alpha/2}}\right]. \]

*Remark. * When using the tabulated probabilities for the Snedecor’s \(\mathcal{F}\) distribution, usually the critical values of the distribution are only available for small probabilities \(\alpha\). However, a useful fact is that if \(F\sim \mathcal{F}_{n_1,n_2}\), then \(F'=1/F\sim\mathcal{F}_{n_2,n_1}\). Therefore, if \(\mathcal{F}_{n_1,n_2;\alpha}\) is the critical value \(\alpha\) of \(\mathcal{F}_{n_1,n_2}\), then

\[\begin{align*} \mathbb{P}(\mathcal{F}_{n_1,n_2}>\mathcal{F}_{n_1,n_2;\alpha})=\alpha &\iff \mathbb{P}(1/\mathcal{F}_{n_1,n_2}<1/\mathcal{F}_{n_1,n_2;\alpha})=\alpha\\ &\iff \mathbb{P}(\mathcal{F}_{n_2,n_1}>1/\mathcal{F}_{n_1,n_2;\alpha})=1-\alpha. \end{align*}\]

This means that

\[ \mathcal{F}_{n_2,n_1;1-\alpha}=1/\mathcal{F}_{n_1,n_2;\alpha}. \]

The critical values of the \(\mathcal{F}\) distribution can be easily obtained with the function `qf`, as illustrated in Example 2.11.
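The identity above can be checked numerically with `qf` (a quick sanity check; the degrees of freedom are illustrative):

```
# Check F_{n2, n1; 1 - alpha} = 1 / F_{n1, n2; alpha}
n_1 <- 10
n_2 <- 13
alpha <- 0.025
f_upper <- qf(alpha, df1 = n_1, df2 = n_2, lower.tail = FALSE)      # F_{n1, n2; alpha}
f_lower <- qf(1 - alpha, df1 = n_2, df2 = n_1, lower.tail = FALSE)  # F_{n2, n1; 1 - alpha}
c(f_lower, 1 / f_upper)  # both coincide
```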

**Example 5.7 **Two learning methods are applied for teaching children at school how to read. The results of both methods were compared with a reading test at the end of the learning period. The resulting means and quasivariances of the tests are collected in the table below. Assuming that the results have a normal distribution, we want to obtain a confidence interval with confidence level \(0.95\) for the difference of means.

Statistic | Method 1 | Method 2 |
---|---|---|
\(n_i\) | \(11\) | \(14\) |
\(\bar{X}_i\) | \(64\) | \(69\) |
\(S_i'^2\) | \(52\) | \(71\) |

In the first place, we have to check whether the two unknown variances can be assumed equal, to see if we can apply the confidence interval seen in Section 5.2.5. For that, we compute the confidence interval for the ratio of variances; if that confidence interval contains one, then this indicates that there is no evidence against the assumption of equal variances. If that is the case, we can construct the confidence interval given in Section 5.2.5.

The sample quasivariances are \[ S_1'^2=52,\quad S_2'^2=71. \]

Then, since \(\mathcal{F}_{10,13,0.975}=1/\mathcal{F}_{13,10,0.025}\), the confidence interval at \(0.95\) for the ratio of variances is \[ \left[\frac{52/71}{\mathcal{F}_{10,13,0.025}},\frac{52/71}{\mathcal{F}_{10,13,0.975}}\right] =\left[\frac{0.73}{3.25},\frac{0.73}{1/3.58}\right]=[0.22,2.61]. \] The value one is inside the confidence interval, so the confidence interval does not provide any evidence against the hypothesis of equality of variances. Therefore, we can use the confidence interval for unknown and equal variances.

The estimated variance is \[ S^2=\frac{10(52)+13(71)}{11+14-2}=62.74, \] so \(S=7.92\). Since the critical value is \(t_{23;0.025}=2.069\), the interval is \[ \mathrm{CI}_{0.95}(\mu_1-\mu_2)=(64-69)\mp 2.069\, (7.92)\sqrt{\frac{1}{11}+\frac{1}{14}}=[-11.6,1.6]. \]

In R, the confidence interval for the ratio of variances can be computed as:

```
# n1, n2, S1'^2, S2'^2, alpha, c1, c2
n_1 <- 11
n_2 <- 14
S2_prime_1 <- 52
S2_prime_2 <- 71
alpha <- 0.05
c1 <- qf(1 - alpha / 2, df1 = n_1 - 1, df2 = n_2 - 1, lower.tail = FALSE)
c2 <- qf(alpha / 2, df1 = n_1 - 1, df2 = n_2 - 1, lower.tail = FALSE)
# CI
(S2_prime_1 / S2_prime_2) / c(c2, c1)
## [1] 0.2253751 2.6243088
```
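Continuing in R, the confidence interval for the difference of means can then be computed from the summary statistics (a sketch):

```
# Summary statistics from the table
n_1 <- 11
n_2 <- 14
X_bar_1 <- 64
X_bar_2 <- 69
S2_prime_1 <- 52
S2_prime_2 <- 71
alpha <- 0.05
# Pooled quasivariance and t critical value
S <- sqrt(((n_1 - 1) * S2_prime_1 + (n_2 - 1) * S2_prime_2) / (n_1 + n_2 - 2))
t <- qt(alpha / 2, df = n_1 + n_2 - 2, lower.tail = FALSE)
# CI
(X_bar_1 - X_bar_2) + c(-1, 1) * t * S * sqrt(1 / n_1 + 1 / n_2)
```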

## 5.3 Asymptotic confidence intervals

We assume now that the r.v. \(X\) that represents the population follows a distribution that belongs to a parametric family of distributions \(\{F(\cdot,\theta): \theta\in\Theta\}\), and we want to obtain a confidence interval for \(\theta\). This family does not have to be normal and may be even unknown. In this situation, in certain cases we may have an asymptotic pivot of the form \[ Z(\theta)=\frac{\hat\theta-\theta}{\hat\sigma(\hat\theta)}\stackrel{d}{\longrightarrow}\mathcal{N}(0,1). \] Then, an asymptotic confidence interval for \(\theta\) is given by \[ \mathrm{CI}_{1-\alpha}(\theta)=\hat\theta\mp z_{\alpha/2} \hat\sigma(\hat\theta). \] This confidence interval for \(\theta\) is approximate, becoming more accurate as the sample size \(n\) grows.

### 5.3.1 Asymptotic confidence interval for a mean

Let \(X\) be a r.v. with \(\mathbb{E}[X]=\mu\) and \(\mathbb{V}\mathrm{ar}[X]=\sigma^2\), both being unknown parameters, and let \((X_1,\ldots,X_n)\) be a s.r.s. of \(X\). In Example 3.18 we have seen that

\[ Z(\mu)=\frac{\bar{X}-\mu}{S'/\sqrt{n}}\stackrel{d}{\longrightarrow} \mathcal{N}(0,1). \]

Therefore, an asymptotic confidence interval for \(\mu\) at the confidence level \(1-\alpha\) is \[ \mathrm{CI}_{1-\alpha}(\mu)=\bar{X}\mp z_{\alpha/2}\frac{S'}{\sqrt{n}}. \]

**Example 5.8 **The shopping times of \(n=64\) random customers at a local supermarket are measured. The sample mean and the quasivariance were \(33\) and \(256\) minutes, respectively. Estimate the average shopping time, \(\mu\), of a customer with a confidence level \(0.90\).

Since \(S'=\sqrt{256}=16\), the confidence interval for \(\mu\) at significance level \(\alpha=0.10\) is \[ \mathrm{CI}_{0.90}(\mu)=\bar{X}\mp z_{0.05}\frac{S'}{\sqrt{n}}=33\mp 1.645\frac{16}{\sqrt{64}}=[29.71,36.29]. \]
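In R, from the summary statistics (a sketch; the quasi-standard deviation is \(S'=\sqrt{256}=16\)):

```
# Summary statistics
n <- 64
X_bar <- 33
S_prime <- sqrt(256)  # quasi-standard deviation S'
alpha <- 0.10
z <- qnorm(alpha / 2, lower.tail = FALSE)
# Asymptotic CI
X_bar + c(-1, 1) * z * S_prime / sqrt(n)
```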

**Example 5.9 **Let \((X_1,\ldots,X_n)\) be a s.r.s. of a r.v. with distribution \(\mathrm{Pois}(\lambda)\). Let’s compute an asymptotic confidence interval at significance level \(\alpha\) for \(\lambda\).

In Exercise 3.2 we have seen that

\[ \frac{\bar{X}-\lambda}{\sqrt{\bar{X}/n}}\stackrel{d}{\longrightarrow} \mathcal{N}(0,1). \]

Therefore, the confidence interval is

\[ \mathrm{CI}_{1-\alpha}(\lambda)=\bar{X}\mp z_{\alpha/2} \sqrt{\frac{\bar X}{n}}. \]

### 5.3.2 Asymptotic confidence interval for a proportion

Let \(X_1,\ldots,X_n\) be i.i.d. r.v.’s with \(\mathrm{Ber}(p)\) distribution. We are going to obtain an asymptotic confidence interval at level \(1-\alpha\) for \(p\).

In Example 3.19, we have seen that \[ Z(p)=\frac{\hat p-p}{\sqrt{\hat p(1-\hat p)/n}}\stackrel{d}{\longrightarrow} \mathcal{N}(0,1). \]

Therefore, an asymptotic confidence interval for \(p\) is

\[ \mathrm{CI}_{1-\alpha}(p)=\hat p\mp z_{\alpha/2}\sqrt{\frac{\hat p (1-\hat p)}{n}}. \]

**Example 5.10 **Two brands of refrigerators, A and B, have a warranty of one year. In a random sample of \(n_A=50\) refrigerators of A, \(12\) failed before the end of the warranty period. From the random sample of \(n_B=60\) refrigerators of B, \(12\) failed before the expiration of the warranty. Estimate the difference of the failure proportions during the warranty period at the confidence level \(0.98\).

Let \(p_A\) and \(p_B\) be the proportion of failures for A and B, respectively. By the CLT we know that, for a large sample size, the sample proportions \(\hat p_A\) and \(\hat p_B\) verify \[ \hat p_A\cong\mathcal{N}(p_A, p_A (1-p_A)/n_A),\quad \hat p_B\cong\mathcal{N}(p_B,p_B (1- p_B)/n_B). \]

Then, since the difference of independent normal r.v.’s is also normal, for large \(n_A\) and \(n_B\) it is verified that

\[ \hat p_A-\hat p_B\cong\mathcal{N}\left(p_A-p_B,\frac{p_A (1-p_A)}{n_A}+\frac{p_B (1-p_B)}{n_B}\right). \]

Applying Corollary 3.2 and Theorem 3.6, a pivot is

\[ Z(p_A-p_B)=\frac{(\hat p_A-\hat p_B)-(p_A-p_B)}{\sqrt{{\frac{\hat p_A (1-\hat p_A)}{n_A}+\frac{\hat p_B (1-\hat p_B)}{n_B}}}}\stackrel{d}{\longrightarrow} \mathcal{N}(0,1) \]

so a confidence interval for \(p_A-p_B\) is

\[ \mathrm{CI}_{0.98}(p_A-p_B)=(\hat p_A-\hat p_B)\mp z_{0.01}\sqrt{\frac{\hat p_A (1-\hat p_A)}{n_A}+\frac{\hat p_B (1-\hat p_B)}{n_B}}. \]

The sample proportions for the refrigerators are \(\hat p_A=12/50=0.24\) and \(\hat p_B=12/60=0.20\), so the above confidence interval is

\[\begin{align*} \mathrm{CI}_{0.98}(p_A-p_B)&=(0.24-0.20)\mp 2.33\sqrt{\frac{(0.24)(0.76)}{50}+\frac{(0.20)(0.80)}{60}}\\ &=0.04\mp 0.185=[-0.145,0.225]. \end{align*}\]

This confidence interval contains zero, so it does not provide evidence that the failure probabilities of the refrigerators of the two brands differ during the warranty period.
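In R, the computations above can be sketched as:

```
# Sample sizes and sample proportions
n_A <- 50
n_B <- 60
p_A_hat <- 12 / n_A
p_B_hat <- 12 / n_B
alpha <- 0.02
z <- qnorm(alpha / 2, lower.tail = FALSE)
# Asymptotic CI for p_A - p_B
(p_A_hat - p_B_hat) +
  c(-1, 1) * z * sqrt(p_A_hat * (1 - p_A_hat) / n_A + p_B_hat * (1 - p_B_hat) / n_B)
```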

## Exercises

**Exercise 5.1 **Assume that we have a sample of size \(n=1\) from a \(\mathcal{U}(0,\theta)\) distribution, where \(\theta\) is unknown. Find the lower limit of a confidence interval at \(0.95\) confidence for \(\theta\).

**Exercise 5.2 **With the aim of assigning research grants to master students in scientific masters, the Spanish Ministry of Education is analysing the final grades of students in the scientific BSc’s of the last academic course.

Assuming that the final grades approximately follow a normal distribution, compute the confidence interval for the mean of the final grades if the following s.r.s. of \(15\) students is available: \(6.2\), \(7.3\), \(5.5\), \(6.7\), \(9.0\), \(7.1\), \(5.0\), \(6.3\), \(7.2\), \(7.5\), \(8.0\), \(7.9\), \(6.5\), \(6.1\), \(7.0\).

Compute the same confidence interval if the distribution of final grades is unknown and a s.r.s. of \(50\) students with \(\bar{X}=6.5\) and \(S'=1.3\) is available.
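
For the second part of the exercise, the large-sample normal pivot yields the interval \(\bar{X}\mp z_{\alpha/2}\,S'/\sqrt{n}\). A minimal Python sketch of this computation, assuming a \(0.95\) confidence level (the statement does not fix one):

```python
from math import sqrt
from statistics import NormalDist

# Summary statistics from the s.r.s. of 50 students.
n, x_bar, s_prime = 50, 6.5, 1.3

z = NormalDist().inv_cdf(0.975)  # z_{0.025} for a 95% confidence level
margin = z * s_prime / sqrt(n)
ci = (x_bar - margin, x_bar + margin)
print(ci)
```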

**Exercise 5.3** Assume that the final grades of Exercise 5.2 are distributed as a \(\Gamma(k,\beta)\) with \(k>0\) (known) and \(\beta>0\) (unknown).

- Find the MLE of \(\beta\) and obtain its asymptotic distribution.
- From the obtained asymptotic distribution, find a pivot and construct an asymptotic confidence interval for \(\beta\).
- Estimate the mean final grade and obtain an asymptotic confidence interval at level \(\alpha=0.05\) by using the results in a and b.
- Compute the asymptotic confidence interval obtained in c if we take a s.r.s. of \(50\) students with \(\bar{X}=6.5\) and \(S'=1.3\). Compare the result with the one obtained in part b of Exercise 5.2.
- Construct an exact confidence interval for \(\beta\) from a s.r.s. of size \(n\). Obtain also a confidence interval for the mean final grade and compare the result with the one obtained in d for \(k=5\).

**Exercise 5.4** The oxygen consumption rate is a measure of the physiological activity of runners. Two groups of runners were trained by two methods: one based on continuous training for a certain period of time each day, and another based on intermittent training of the same duration. Samples were taken of the oxygen consumption of the runners trained by both methods, obtaining the following descriptive statistics:

| Continuous training | Intermittent training |
|---|---|
| \(n_1=9\) | \(n_2=7\) |
| \(\bar{X}_1=43.71\) | \(\bar{X}_2=39.63\) |
| \(S_1'^2=5.88\) | \(S_2'^2=7.68\) |

**Exercise 5.5** In order to estimate the variance of the amount of study hours of the students of a master in Statistics, a s.r.s. of \(20\) students is analysed, resulting in a quasistandard deviation of \(2.5\) hours. Determine a confidence interval for the variance at the confidence level \(0.90\), assuming that the weekly study hours follow a normal distribution.
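
A sketch of how such a variance interval can be computed, hard-coding the tabulated chi-squared quantiles \(\chi^2_{19;0.05}\approx 30.144\) and \(\chi^2_{19;0.95}\approx 10.117\) (the Python standard library provides no chi-squared quantile function):

```python
# Summary statistics: n = 20 students, quasistandard deviation 2.5 hours.
n = 20
s_prime_sq = 2.5 ** 2  # quasivariance S'^2

# Tabulated chi-squared quantiles with n - 1 = 19 degrees of freedom.
chi2_upper = 30.144  # chi^2_{19; 0.05}
chi2_lower = 10.117  # chi^2_{19; 0.95}

# CI for sigma^2 at level 0.90: [(n-1) S'^2 / chi2_upper, (n-1) S'^2 / chi2_lower]
ci = ((n - 1) * s_prime_sq / chi2_upper, (n - 1) * s_prime_sq / chi2_lower)
print(ci)
```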

**Exercise 5.6** The management of a medical clinic wants to estimate the average number of days required for the treatment of patients aged between \(25\) and \(34\) years. A s.r.s. of \(500\) patients of the clinic within these ages provided a mean and a quasistandard deviation of \(5.4\) and \(3.1\) days, respectively. Obtain a confidence interval at \(0.95\) confidence for the mean stay time of the patients.

**Exercise 5.7** A poll was conducted in the fall of 1979 by the Presidential Commission on retirement policy in the USA. The poll revealed that a large proportion of citizens were very pessimistic about their retirement prospects. When asked whether they believed that their retirement pension was going to be sufficient, \(62.9\%\) of the \(6100\) interviewed citizens answered negatively. Compute a confidence interval at \(0.95\) confidence for the proportion of citizens who believed that their pension would not be sufficient.

**Exercise 5.8** Assume that \((Y_1,Y_2,Y_3,Y_4)\sim \mathrm{Multin}(n,p_1,p_2,p_3,p_4)\). As in the binomial case, any linear combination of \(Y_1\), \(Y_2\), \(Y_3\), and \(Y_4\) has an approximately normal distribution for large values of \(n\).

- Determine the variance of \(Y_1-Y_2\). (Caution: \(Y_1\) and \(Y_2\) are *not* independent.)
- A study on the attitude of Florida residents towards alligators in urban areas provided the following information: among \(500\) interviewed people, \(6\%\) said that the alligators should be protected, \(16\%\) opined that they should be removed, \(52\%\) that they should be relocated, and \(26\%\) that there should be a regulated commercial exploitation of the alligators. Estimate the difference between the proportion of the population in favour of the protection and the proportion in favour of the removal of the alligators, with a confidence of \(0.95\).
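
The normal approximation invoked in this exercise can be checked empirically. This is a small Monte Carlo sketch with the proportions of the alligator study (the seed and the number of replicates are arbitrary choices):

```python
import random
import statistics

random.seed(1)
n = 500
probs = [0.06, 0.16, 0.52, 0.26]  # protect, remove, relocate, exploit

# Simulate Y_1 - Y_2 over repeated multinomial draws of size n.
diffs = []
for _ in range(2000):
    counts = [0, 0, 0, 0]
    for cat in random.choices(range(4), weights=probs, k=n):
        counts[cat] += 1
    diffs.append(counts[0] - counts[1])

# For an approximately normal r.v., about 68% of the draws should fall
# within one standard deviation of the mean.
mu = statistics.mean(diffs)
sigma = statistics.stdev(diffs)
within_one_sd = sum(abs(d - mu) <= sigma for d in diffs) / len(diffs)
print(round(mu, 1), round(within_one_sd, 3))
```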

**Exercise 5.9** We want to compare the amount of weekly study hours of students graduated from a BSc in Statistics and a BSc in Economics. For that, we obtained a s.r.s. of \(20\) students graduated from the BSc in Statistics, with \(\bar{X}_1=3\) and \(S_1'=2.5\), and another s.r.s. of \(30\) students graduated from the BSc in Economics, with \(\bar{X}_2=2.8\) and \(S_2'=2.7\). Assume that the weekly study hours follow a normal distribution.

- Compute a confidence interval at \(0.95\) confidence for the ratio of variances of the weekly study hours for both types of students.
- Assuming that the variances are equal, obtain a confidence interval at \(0.95\) confidence for the difference of means of weekly study hours of the two types of students.

Therefore, it is key that \(Z\) is bijective in \(\theta\).↩︎