Exercises
Exercise 4.1 The lifetime of a certain type of electrical component follows a \(\Gamma(\alpha,\beta)\) distribution, but the values of the parameters \(\alpha\) and \(\beta\) are unknown. Assume that three of these components are tested independently and the following lifetimes are measured: \(120,\) \(130,\) and \(128\) hours. Compute the moment estimates of \(\alpha\) and \(\beta.\)
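A quick computation of the moment estimates in R, as a minimal sketch assuming the shape-rate parameterization with mean \(\alpha/\beta\) and variance \(\alpha/\beta^2\):

```r
# Moment estimates for a Gamma(alpha, beta) (shape alpha, rate beta),
# matching the sample mean and the uncorrected sample variance
x <- c(120, 130, 128)
m <- mean(x)
v <- mean((x - m)^2) # Second central sample moment
c(alpha = m^2 / v, beta = m / v)
```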
Exercise 4.2 The proportion of impurities in each manufactured unit of a certain kind of chemical product is a rv with pdf
\[\begin{align*} f(x;\theta)=\begin{cases} (\theta+1)x^\theta, & 0<x<1,\\ 0, & \text{otherwise}, \end{cases} \end{align*}\]
where \(\theta>-1.\) Five units of the manufactured product are taken in one day, resulting in the following impurity proportions: \(0.33,\) \(0.51,\) \(0.02,\) \(0.15,\) \(0.12.\)
- Obtain the moment estimator of \(\theta.\)
- Obtain the maximum likelihood estimator of \(\theta.\)
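For the second item, the MLE can be checked numerically; a minimal sketch that maximizes the log-likelihood over a hand-picked interval:

```r
# Log-likelihood of theta for the pdf (theta + 1) * x^theta on (0, 1)
x <- c(0.33, 0.51, 0.02, 0.15, 0.12)
loglik <- function(theta) length(x) * log(theta + 1) + theta * sum(log(x))
optimize(loglik, interval = c(-0.99, 10), maximum = TRUE)$maximum
```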
Exercise 4.3 An industrial product is packaged in batches of \(N\) units each. The number of defective units within each batch is unknown. Since checking whether a unit is defective is expensive, the quality control consists of selecting \(n\) units from the batch and estimating the number of defective units within the batch. The batch is rejected if the estimated number of defective units exceeds \(N/4.\)
- Find the moment estimator of the number of defective units within a batch.
- If \(N=20,\) \(n=5,\) and \(2\) of these \(n\) units are defective, is the batch rejected?
Exercise 4.4 Assume that there are \(3\) balls in an urn: \(\theta\) of them are red and \(3-\theta\) are white. Two balls are drawn with replacement.
- The two balls are red. Obtain the MLE of \(\theta.\)
- If only one of the balls is red, what is now the MLE of \(\theta\)?
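Since \(\theta\in\{0,1,2,3\},\) both answers can be checked by direct enumeration, using that the number of red balls in two draws with replacement is \(\mathrm{Bin}(2,\theta/3)\); a sketch:

```r
theta <- 0:3
# Likelihood of each candidate theta for r red balls out of 2 draws
rbind(theta,
      "L(theta; r = 2)" = dbinom(2, size = 2, prob = theta / 3),
      "L(theta; r = 1)" = dbinom(1, size = 2, prob = theta / 3))
```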
Exercise 4.5 A particular machine may fail because of two mutually exclusive types of failures, A and B. We wish to estimate the probability of each type of failure, knowing that:
- The probability of a type A failure is twice that of a type B failure.
- In \(30\) days we observed \(2\) failures of A, \(3\) failures of B, and \(25\) days without failures.
Compute the moment and maximum likelihood estimators of the probability of each failure.
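A numerical check of the maximum likelihood part, assuming \(P(\text{B})=p\) and \(P(\text{A})=2p\) with \(0<p<1/3\):

```r
# Trinomial log-likelihood: 2 failures of A (prob 2p), 3 of B (prob p),
# and 25 failure-free days (prob 1 - 3p)
loglik <- function(p) 2 * log(2 * p) + 3 * log(p) + 25 * log(1 - 3 * p)
optimize(loglik, interval = c(1e-6, 1 / 3 - 1e-6), maximum = TRUE)$maximum
```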
Exercise 4.6 In the manufacturing process of metallic pieces of three sizes, the proportions of small, normal, and large pieces are \(p_1=0.05,\) \(p_2=0.9,\) and \(p_3=0.05,\) respectively. It is suspected that the machine is not properly adjusted and that the proportions have changed in the following way: \(p_1=0.05+\tau,\) \(p_2=0.9-2\tau,\) and \(p_3=0.05+\tau.\) For checking this suspicion, \(5000\) pieces are analyzed, finding \(278,\) \(4428,\) and \(294\) pieces of small, normal, and large sizes, respectively. Compute the MLE of \(\tau.\)
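The closed-form MLE can be verified numerically by maximizing the multinomial log-likelihood over the \(\tau\)'s that keep all three proportions positive; a sketch:

```r
counts <- c(278, 4428, 294) # Small, normal, and large pieces
loglik <- function(tau) {
  p <- c(0.05 + tau, 0.9 - 2 * tau, 0.05 + tau)
  sum(counts * log(p))
}
optimize(loglik, interval = c(-0.0499, 0.4499), maximum = TRUE)$maximum
```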
Exercise 4.7 Let \(X_i\sim \mathcal{N}(i\theta, 1),\) \(i=1,\ldots,n,\) be independent rv’s.
- Obtain the MLE of \(\theta.\)
- Is it unbiased?
- Obtain its asymptotic variance.
Exercise 4.8 For \(\theta>0,\) let \(X\) be a rv with pdf
\[\begin{align*} f(x;\theta)=\frac{\theta}{x^2},\quad x\geq \theta >0. \end{align*}\]
- Find the MLE of \(\theta.\)
- Try to obtain the moment estimator of \(\theta.\) Is there any problem?
Exercise 4.9 Consider the rv \(X\) with pdf
\[\begin{align*} f(x)=\begin{cases} \frac{3\theta^3}{x^4} & \text{if} \ x\geq \theta, \\ 0 & \text{if} \ x< \theta. \end{cases} \end{align*}\]
- Find the MLE of \(\theta.\)
- Find the moment estimator \(\hat{\theta}_{\mathrm{MM}}.\) Does it always make sense?
- Is \(\hat{\theta}_{\mathrm{MM}}\) unbiased?
- Compute the precision of \(\hat{\theta}_{\mathrm{MM}}.\)
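The behavior of the estimators in this exercise can be explored by simulating from \(f\) via the inverse transform method, since \(F(x)=1-(\theta/x)^3\) for \(x\geq\theta\); a sketch with assumed values:

```r
set.seed(42)
theta <- 2; n <- 1000
x <- theta * runif(n)^(-1 / 3) # F^{-1}(u) = theta * (1 - u)^{-1/3}; 1 - U ~ U
hist(x[x < 10], freq = FALSE, breaks = 50, main = "Simulated sample")
curve(3 * theta^3 / x^4, from = theta, to = 10, add = TRUE, col = 2)
```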
Exercise 4.10 A srs of size \(n\) from the pdf
\[\begin{align*} f(x)=2\theta xe^{-\theta x^2},\quad x>0, \end{align*}\]
where \(\theta>0\) is an unknown parameter, is available.
- Determine the moment estimator of \(\theta.\)
- Determine the MLE of \(\theta\) and find its asymptotic distribution.
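For checking the answers by simulation, note that if \(X\) has the pdf above, then \(X^2\sim\mathrm{Exp}(\theta)\) (change of variables), which yields an immediate sampler; a minimal sketch with an assumed \(\theta\):

```r
theta <- 2; n <- 500
x <- sqrt(rexp(n, rate = theta)) # Valid since X^2 ~ Exp(theta)
mean(x^2) # Should be close to E[X^2] = 1 / theta
```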
Exercise 4.11 Let \((X_1,\ldots,X_n)\) be a srs from the inverse gamma distribution, whose pdf is
\[\begin{align*} f(x;\alpha,\beta)=\frac{\beta^\alpha}{\Gamma(\alpha)}x^{-(\alpha+1)}e^{-\beta/x}, \quad x>0, \ \alpha>0, \ \beta>0. \end{align*}\]
- Obtain \(\hat{\beta}_{\mathrm{MM}}\) assuming \(\alpha\) is known.
- Obtain \(\hat{\beta}_{\mathrm{MLE}}\) assuming \(\alpha\) is known.
- Try to obtain \((\hat{\alpha}_{\mathrm{MM}},\hat{\beta}_{\mathrm{MM}}).\)
Exercise 4.12 Let \((X_1,\ldots,X_n)\) be a srs from the inverse Gaussian distribution, whose pdf is
\[\begin{align*} f(x;\mu,\lambda)=\sqrt{\frac{\lambda}{2\pi x^3}}\exp\left\{-\frac{\lambda(x-\mu)^2}{2\mu^2x}\right\}, \quad x>0, \ \mu>0, \ \lambda>0. \end{align*}\]
Estimate \(\mu\) and \(\lambda\) by:
- Maximum likelihood.
- The method of moments.
Exercise 4.13 Consider the pdf given by
\[\begin{align*} f(x;\theta) = \frac{\theta^2}{1+\theta}(1+x)e^{-\theta x}, \quad x>0,\ \theta>0, \end{align*}\]
and assume a srs \((X_1,\ldots,X_n)\) from it is given.
- Obtain \(\hat{\theta}_\mathrm{MLE}.\)
- Obtain the asymptotic variance of \(\hat{\theta}_\mathrm{MLE}.\)
Exercise 4.14 Consider the following mixture of normal pdf’s:
\[\begin{align} f(x;\boldsymbol{\theta})=w\phi(x;\mu,\sigma^2)+(1-w)\phi(x;-\mu,\sigma^2),\tag{4.11} \end{align}\]
where \(\boldsymbol{\theta}=(\mu,\sigma^2,w)'\in\mathbb{R}\times\mathbb{R}_+\times[0,1].\) Assume there is a srs \((X_1,\ldots,X_n)\) from \(f(\cdot;\boldsymbol{\theta}).\) Then:
- Derive the moment estimators of \(\boldsymbol{\theta}.\) You can use that \(\mathbb{E}[\mathcal{N}(\mu,\sigma^2)^3]=\mu^3+3 \mu \sigma^2\) and \(\mathbb{E}[\mathcal{N}(\mu,\sigma^2)^4]=\mu^4+6 \mu^2 \sigma^2+3 \sigma^4.\)
- Implement the moment estimators in R to verify their correct behavior using a simulated sample from (4.11); a simulation sketch is given after this exercise.
- Try to derive the MLE of \(\boldsymbol{\theta}.\) Is it simpler or harder than deriving the moment estimators?
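For the second item, a minimal sketch of the simulation step, with assumed true values of \((\mu,\sigma^2,w)\): each observation's component is selected by a Bernoulli draw with probability \(w.\)

```r
set.seed(1)
n <- 1e4; mu <- 1; sigma <- 0.5; w <- 0.7 # Assumed true values
z <- rbinom(n, size = 1, prob = w) # Component labels
x <- rnorm(n, mean = ifelse(z == 1, mu, -mu), sd = sigma)
# Sample moments feeding the moment equations
c(mean(x), mean(x^2), mean(x^3))
```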
Exercise 4.15 Derive the Fisher information matrix in Example 4.15.
Exercise 4.16 Derive the Fisher information matrix in Example 4.16.
Exercise 4.17 Consider a srs \((X_1,Y_1),\ldots,(X_n,Y_n)\) from a bivariate random vector distributed as \(\mathcal{N}_2\left(\begin{pmatrix}0\\0\end{pmatrix},\begin{pmatrix}1&\rho\\\rho&1\end{pmatrix}\right).\) Find:
- The MLE of \(\rho\) (at least in an implicit form).
- The asymptotic distribution of \(\hat{\rho}_\mathrm{MLE}.\)
Exercise 4.18 Replicate Figure 4.1 by coding in R the simulation study behind it.
Exercise 4.19 Replicate Figure 4.4 by coding in R the simulation study behind it. Use `rgamma(n = n, shape = k, rate = theta)` to simulate a srs of size \(n\) from a \(\Gamma(k,\theta).\)
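A possible skeleton for the sampling and fitting steps (it is assumed, as suggested by Exercise 4.20, that the lower plots of Figure 4.4 display histograms of the MLEs across the replicates):

```r
library(MASS)
k <- 2; theta <- 3; n <- 200; N <- 500 # Assumed values
mle <- t(replicate(N, fitdistr(rgamma(n = n, shape = k, rate = theta),
                               densfun = "gamma")$estimate))
hist(mle[, "rate"], breaks = 30, freq = FALSE,
     main = expression(hat(theta)[MLE]))
```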
Exercise 4.20 Validate empirically (4.7) in Theorem 4.2 by coding in R a simulation study that delivers a visualization of the MLE’s distribution similar to the lower plots of Figure 4.4, but for the bivariate case. For that, use scatterplots rather than histograms and contourlines of the bivariate standard normal density. Use the \(\Gamma(k,\theta)\) distribution (both parameters unknown) with Fisher information matrix given in Example 4.16.
Some help:
- Compute numerically \((\hat{\theta}_\mathrm{MLE},\hat{k}_\mathrm{MLE})\) using the function `MASS::fitdistr(..., densfun = "gamma")`.
- The computation of \(\psi\) and \(\psi'\) (first and second derivatives of \(\log\Gamma\)) can be done with `numDeriv::grad()` and `numDeriv::hessian()`.
- Draw contours of a bivariate normal using `contour()` and `mvtnorm::dmvnorm()`.
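Putting the last help item into practice, a minimal sketch that overlays the \(\mathcal{N}_2(\boldsymbol{0},\boldsymbol{I}_2)\) contours on a scatterplot (the matrix z below is a placeholder for the standardized MLEs):

```r
library(mvtnorm)
z <- matrix(rnorm(2 * 500), ncol = 2) # Placeholder standardized MLEs
plot(z, pch = 16, cex = 0.5, xlab = expression(z[1]), ylab = expression(z[2]))
grid <- seq(-3.5, 3.5, length.out = 101)
dens <- outer(grid, grid, function(u, v) dmvnorm(cbind(u, v)))
contour(grid, grid, dens, add = TRUE, col = 2)
```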
Exercise 4.21 Perform the analogous experiment to Exercise 4.20 for the \(\mathrm{Beta}(\alpha,\beta)\) distribution (both parameters unknown) with Fisher information matrix given in Example 4.17. Additionally, show the contourplots of several (random) log-likelihood functions, as well as their maxima, in the spirit of the upper plots in Figure 4.4.
Exercise 4.22 Given a srs \((X_1,\ldots,X_n)\) from a \(\Gamma(\theta,3)\) distribution, let us investigate the precision of the estimation of the Fisher information \(\mathcal{I}(\theta)\) using three approximations:
- \(\mathcal{I}(\hat{\theta}_\mathrm{MLE}).\)
- \(\hat{\mathcal{I}}(\theta).\)
- \(\hat{\mathcal{I}}(\hat{\theta}_\mathrm{MLE}).\)
Draw the above estimates as a function of \(n\) to have a plot in the spirit of Figure 4.1, but to estimate \(\mathcal{I}(\theta).\) Use \(\theta=2,5,\) \(N=5\) repetitions, and an increasing sequence of \(n\)’s until \(n=300.\) Importantly, use the same sample to compute the estimators in a–c to allow for a better comparison. Which estimator seems to perform better? Which ones are almost equivalent?
Exercise 4.23 Use simulations to empirically validate the following asymptotic distributions of the MLE for the uniparameter case:
- \(\sqrt{n\mathcal{I}(\theta)}(\hat{\theta}_{\mathrm{MLE}}-\theta)\stackrel{d}{\longrightarrow}\mathcal{N}(0,1).\)
- \(\sqrt{n\hat{\mathcal{I}}(\theta)}(\hat{\theta}_{\mathrm{MLE}}-\theta)\stackrel{d}{\longrightarrow}\mathcal{N}(0,1).\)
- \(\sqrt{n\mathcal{I}(\hat{\theta}_{\mathrm{MLE}})}(\hat{\theta}_{\mathrm{MLE}}-\theta)\stackrel{d}{\longrightarrow}\mathcal{N}(0,1).\)
- \(\sqrt{n\hat{\mathcal{I}}(\hat{\theta}_{\mathrm{MLE}})}(\hat{\theta}_{\mathrm{MLE}}-\theta)\stackrel{d}{\longrightarrow}\mathcal{N}(0,1).\)
Consider the \(\mathrm{Beta}(\alpha,\theta)\) distribution with \(\alpha\) known and \(\theta\) unknown, and Fisher information given in Example 4.17. Then, simulate \(N=1000\) srs’s of size \(n\) from \(\mathrm{Beta}(\alpha,\theta)\) and show the histograms for the random variables in a–d, as well as the standard normal density. Set \(\alpha,\) \(\theta,\) and \(n\) to values of your choice.
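A minimal sketch for case a, assuming (from the Beta Fisher information matrix in Example 4.17) that \(\mathcal{I}(\theta)=\psi'(\theta)-\psi'(\alpha+\theta)\) when \(\alpha\) is known; cases b-d replace this quantity by its plug-in versions:

```r
alpha <- 2; theta <- 3; n <- 100; N <- 1000 # Assumed values
stat <- replicate(N, {
  x <- rbeta(n, shape1 = alpha, shape2 = theta)
  loglik <- function(t) sum(dbeta(x, alpha, t, log = TRUE))
  theta_hat <- optimize(loglik, interval = c(0.01, 20), maximum = TRUE)$maximum
  sqrt(n * (trigamma(theta) - trigamma(alpha + theta))) * (theta_hat - theta)
})
hist(stat, breaks = 30, freq = FALSE, main = "Case a")
curve(dnorm(x), add = TRUE, col = 2)
```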
Exercise 4.24 Use simulations to empirically validate the following asymptotic distributions of the MLE for the multiparameter case:
- \(\sqrt{n}\boldsymbol{R}(\boldsymbol{\theta})'(\hat{\boldsymbol{\theta}}_\mathrm{MLE}-\boldsymbol{\theta})\stackrel{d}{\longrightarrow} \mathcal{N}_2\left(\boldsymbol{0},\boldsymbol{I}_2\right).\)
- \(\sqrt{n}\hat{\boldsymbol{R}}(\boldsymbol{\theta})'(\hat{\boldsymbol{\theta}}_\mathrm{MLE}-\boldsymbol{\theta})\stackrel{d}{\longrightarrow} \mathcal{N}_2\left(\boldsymbol{0},\boldsymbol{I}_2\right).\)
- \(\sqrt{n}\boldsymbol{R}(\hat{\boldsymbol{\theta}}_\mathrm{MLE})'(\hat{\boldsymbol{\theta}}_\mathrm{MLE}-\boldsymbol{\theta})\stackrel{d}{\longrightarrow} \mathcal{N}_2\left(\boldsymbol{0},\boldsymbol{I}_2\right).\)
- \(\sqrt{n}\hat{\boldsymbol{R}}(\hat{\boldsymbol{\theta}}_\mathrm{MLE})'(\hat{\boldsymbol{\theta}}_\mathrm{MLE}-\boldsymbol{\theta})\stackrel{d}{\longrightarrow} \mathcal{N}_2\left(\boldsymbol{0},\boldsymbol{I}_2\right).\)
Consider the \(\mathrm{Beta}(\alpha,\beta)\) distribution with \(\boldsymbol{\theta}=(\alpha,\beta)\) unknown and Fisher information matrix given in Example 4.17. Then, simulate \(N=1000\) srs’s of size \(n\) from \(\mathrm{Beta}(\alpha,\beta)\) and show the scatterplots for the random vectors in a–d, as well as the contourlines of the bivariate standard normal density. Set \(\boldsymbol{\theta}\) and \(n\) to values of your choice. Use `chol()` to compute the Cholesky decomposition in R.
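On the use of `chol()`: R's `chol()` returns the upper triangular factor \(\boldsymbol{U}\) with \(\boldsymbol{U}'\boldsymbol{U}=\boldsymbol{\mathcal{I}},\) so, assuming the factorization \(\boldsymbol{\mathcal{I}}=\boldsymbol{R}\boldsymbol{R}'\) behind a–d, it delivers precisely \(\boldsymbol{R}'=\boldsymbol{U}.\) A minimal sketch with a placeholder matrix:

```r
I_theta <- matrix(c(2, 0.5, 0.5, 1), nrow = 2) # Placeholder; use Example 4.17
Rt <- chol(I_theta) # Upper triangular, so t(Rt) %*% Rt equals I_theta
# Standardization of one estimate ('est' and 'theta' are length-2 vectors):
# z <- sqrt(n) * Rt %*% (est - theta)
```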
Exercise 4.25 The Gumbel distribution \(\mathrm{Gumbel}(\mu,\beta)\) has cdf
\[\begin{align*} F(x;\mu,\beta) = e^{-e^{-(x-\mu)/\beta}}, \quad x\in \mathbb{R}, \end{align*}\]
with \(\mu \in \mathbb{R}\) and \(\beta>0.\) It is used in extreme value theory to model the maximum of a number of samples of various distributions. Let \((X_1,\ldots,X_n)\) be a srs from a \(\mathrm{Gumbel}(\mu,\beta)\) distribution.
- Obtain the system of equations to compute \(\hat{\mu}_\mathrm{MLE}\) and \(\hat{\beta}_\mathrm{MLE}.\)
- Obtain \(\hat{\boldsymbol{\mathcal{I}}}(\mu,\beta).\)
- Derive usable asymptotic distributions for \(\hat{\mu}_\mathrm{MLE}\) and \(\hat{\beta}_\mathrm{MLE}.\)
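A numerical companion for parts a–c, simulating by inverse transform from the stated cdf and maximizing the implied log-likelihood (parameter values are assumed for illustration):

```r
set.seed(1)
mu <- 1; beta <- 2; n <- 200
x <- mu - beta * log(-log(runif(n))) # Inverse transform from F
minus_loglik <- function(par) {
  z <- (x - par[1]) / par[2]
  sum(log(par[2]) + z + exp(-z)) # Since f = (1 / beta) e^{-z - e^{-z}}
}
optim(c(0, 1), minus_loglik, method = "L-BFGS-B",
      lower = c(-Inf, 1e-6))$par
```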