Chapter 4 Homework 1: Properties of a Random Sample: Problems and Solutions

Exercise 4.1 (Casella and Berger 5.3) Let \(X_1,\cdots,X_n\) be i.i.d. random variables with continuous cdf \(F_X\), and \(EX_i=\mu\). Define random variables \(Y_1,\cdots,Y_n\) by \[\begin{equation} Y_i=\left\{ \begin{aligned} & 1 & \quad X_i>\mu \\ & 0 & \quad X_i\leq\mu \end{aligned} \right. \tag{4.1} \end{equation}\] Find the distribution of \(\sum_{i=1}^nY_i\).
Proof. Notice that each \(Y_i\) is a Bernoulli r.v. with probability of success \(p=Pr(X_i>\mu)=1-F_X(\mu)\). Since \(X_1,\cdots,X_n\) are i.i.d. random variables, the \(Y_i\) are also independent. Thus, their sum follows a binomial distribution with parameters \(n\) and \(p=1-F_X(\mu)\), with pmf given in (4.2). \[\begin{equation} f_Y(y)={n \choose y}(1-F_X(\mu))^y(F_X(\mu))^{n-y} \tag{4.2} \end{equation}\] with \(y\in\{0,1,\cdots,n\}\).
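
As a quick sanity check (not part of the textbook solution), the following simulation sketch assumes an Exponential(1) population, so that \(\mu=1\) and \(p=1-F_X(\mu)=e^{-1}\), and compares the empirical distribution of \(\sum_{i=1}^nY_i\) with the binomial pmf in (4.2).

```python
# Monte Carlo sketch for Exercise 4.1 (assumed setup: Exponential(1) population,
# so mu = 1 and p = 1 - F_X(mu) = exp(-1)).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, reps = 10, 100_000
mu, p = 1.0, np.exp(-1.0)

x = rng.exponential(scale=1.0, size=(reps, n))
sums = (x > mu).sum(axis=1)                      # sum of the Y_i in each sample

emp = np.bincount(sums, minlength=n + 1) / reps  # empirical pmf
theo = stats.binom.pmf(np.arange(n + 1), n, p)   # pmf (4.2)
print(np.abs(emp - theo).max())                  # should be near 0
```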

Exercise 4.2 (Casella and Berger 5.4) A generalization of i.i.d. random variables is exchangeable random variables. The random variables \(X_1,\cdots,X_n\) are exchangeable if any permutation of any subset of them of size \(k\,(k\leq n)\) has the same distribution. This exercise is about an example of exchangeable but not i.i.d. random variables. Let \(X_i|P\stackrel{i.i.d.}{\sim}Bernoulli(P)\), \(i=1,\cdots,n\) and let \(P\sim Uniform(0,1)\).

  1. Show that the marginal distribution of any \(k\) of the \(X\)s is the same as \[\begin{equation} P(X_1=x_1,\cdots,X_k=x_k)=\int_0^1p^t(1-p)^{k-t}dp=\frac{t!(k-t)!}{(k+1)!} \tag{4.3} \end{equation}\] where \(t=\sum_{i=1}^kx_i\). Hence the \(X\)s are exchangeable.

  2. Show that, marginally, \[\begin{equation} P(X_1=x_1,\cdots,X_n=x_n)\neq\prod_{i=1}^nP(X_i=x_i) \tag{4.4} \end{equation}\] so the distribution of the \(X\)s is not i.i.d.

Proof. (a) For any \(k\) of the \(X\)s, consider the joint distribution \[\begin{equation} \begin{split} f(x_1,\cdots,x_k)&=\int_0^1f(x_1,\cdots,x_k|p)f(p)dp\\ &=\int_0^1\prod_{i=1}^kp^{x_i}(1-p)^{1-x_i}\cdot 1dp\\ &=\int_0^1p^t(1-p)^{k-t}dp=B(t+1,k-t+1)\\ &=\frac{\Gamma(t+1)\Gamma(k-t+1)}{\Gamma(k+2)}=\frac{t!(k-t)!}{(k+1)!} \end{split} \tag{4.5} \end{equation}\] where \(t=\sum_{i=1}^kx_i\) and \(B(\cdot,\cdot)\) denotes the beta function. Since this probability depends on \((x_1,\cdots,x_k)\) only through \(t\), it is unchanged under any permutation of the indices, so the \(X\)s are exchangeable.

(b) Consider the marginal distribution for each \(X_i\) \[\begin{equation} \begin{split} f(x)&=\int_0^1f(x|p)f(p)dp\\ &=\int_0^1p^x(1-p)^{1-x}\cdot 1dp\\ &=B(x+1,2-x)=\frac{x!(1-x)!}{2}=\frac{1}{2} \end{split} \tag{4.6} \end{equation}\] where \(x=0,1\). Therefore, we have \[\begin{equation} \prod_{i=1}^nP(X_i=x_i)=\frac{1}{2^n}\neq\frac{t^*!(n-t^*)!}{(n+1)!}=P(X_1=x_1,\cdots,X_n=x_n) \tag{4.7} \end{equation}\] where \(t^*=\sum_{i=1}^nx_i\), so the \(X\)s are not i.i.d.
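
The exchangeable-but-not-i.i.d. structure is easy to see numerically. The sketch below (an illustration, not part of the solution) draws \(P\sim Uniform(0,1)\), then \(X_i|P\sim Bernoulli(P)\), and compares the empirical probability of one fixed outcome with (4.3) and with the i.i.d. value \(1/2^n\).

```python
# Simulation sketch for Exercise 4.2 (n = 3 chosen arbitrarily).
import numpy as np
from math import factorial

rng = np.random.default_rng(1)
n, reps = 3, 200_000

p = rng.uniform(size=reps)                                   # P ~ Uniform(0,1)
x = (rng.uniform(size=(reps, n)) < p[:, None]).astype(int)   # X_i | P ~ Bern(P)

target = np.array([1, 1, 0])                                 # a fixed outcome, t = 2
t = int(target.sum())
emp = np.mean((x == target).all(axis=1))
exact = factorial(t) * factorial(n - t) / factorial(n + 1)   # (4.3): 1/12
print(emp, exact, 0.5 ** n)                                  # emp ~ 1/12, not 1/8
```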

Exercise 4.3 (Casella and Berger 5.10) Let \(X_1,\cdots,X_n\) be a random sample from a \(N(\mu,\sigma^2)\) population.

  1. Find expressions for the first four moments, \(\theta_1,\cdots,\theta_4\).

  2. Calculate \(Var(S^2)\).

  3. Use the fact that \(\frac{(n-1)S^2}{\sigma^2}\sim\chi_{n-1}^2\) to calculate \(Var(S^2)\).

Proof. (a) First, since the mgf of \(X_i\) is \(M(t)=exp(\mu t+\frac{\sigma^2t^2}{2})\), we have the following \[\begin{equation} \begin{split} &E(X_i)={M^{\prime}(t)}|_{t=0}=\mu\\ &E(X_i^2)={M^{\prime\prime}(t)}|_{t=0}=\mu^2+\sigma^2\\ &E(X_i^3)={M^{\prime\prime\prime}(t)}|_{t=0}=\mu^3+3\sigma^2\mu\\ &E(X_i^4)={M^{(4)}(t)}|_{t=0}=\mu^4+6\sigma^2\mu^2+3\sigma^4 \end{split} \tag{4.8} \end{equation}\]

For the first moment \[\begin{equation} \theta_1=EX_i=\mu \tag{4.9} \end{equation}\] the second moment \[\begin{equation} \theta_2=E(X_i-\theta_1)^2=\sigma^2 \tag{4.10} \end{equation}\] the third moment \[\begin{equation} \begin{split} \theta_3&=E(X_i-\theta_1)^3\\ &=E(X_i^3)-3\mu E(X_i^2)+3\mu^2E(X_i)-\mu^3=0 \end{split} \tag{4.11} \end{equation}\] and finally the fourth moment \[\begin{equation} \begin{split} \theta_4&=E(X_i-\theta_1)^4\\ &=E(X_i^4)-4\mu E(X_i^3)+6\mu^2E(X_i^2)-4\mu^3E(X_i)+\mu^4=3\sigma^4 \end{split} \tag{4.12} \end{equation}\]

(b) From Exercise 5.8 of Casella and Berger (2002) we have \(Var(S^2)=\frac{1}{n}(\theta_4-\frac{n-3}{n-1}\theta^2_2)\) (the proof is straightforward but tedious and can be done by induction; it may be easiest to simply remember this result). Combining this with (4.10) and (4.12), we have \[\begin{equation} Var(S^2)=\frac{1}{n}(3\sigma^4-\frac{(n-3)\sigma^4}{n-1})=\frac{2\sigma^4}{n-1} \tag{4.13} \end{equation}\]

(c) Using the fact that \(\frac{(n-1)S^2}{\sigma^2}\sim\chi_{n-1}^2\), we have \[\begin{equation} Var(S^2)=\frac{\sigma^4}{(n-1)^2}\times 2(n-1)=\frac{2\sigma^4}{n-1} \tag{4.14} \end{equation}\] which is the same as (4.13). Here we use the fact that if \(Y\sim\chi^2_p\), then \(Var(Y)=2p\). (A quick simulation check of this variance is sketched below.)
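
As a numerical check of (4.13) and (4.14) (not required by the exercise), the following sketch uses arbitrary values \(\mu=2\), \(\sigma=3\), \(n=10\).

```python
# Monte Carlo check of Var(S^2) = 2*sigma^4/(n-1) for a normal sample.
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, reps = 2.0, 3.0, 10, 200_000

x = rng.normal(mu, sigma, size=(reps, n))
s2 = x.var(axis=1, ddof=1)                 # sample variance S^2 of each sample
print(s2.var(), 2 * sigma**4 / (n - 1))    # both should be about 18
```
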
Exercise 4.4 (Casella and Berger 5.13) Let \(X_1,\cdots,X_n\) be i.i.d. \(N(\mu,\sigma^2)\). Find a function of \(S^2\), say \(g(S^2)\), that satisfies \(Eg(S^2)=\sigma\).
Proof. Consider \(g(S^2)=c\sqrt{S^2}\). We can write \(c\sqrt{S^2}=\frac{c\sigma}{\sqrt{n-1}}\sqrt{\frac{(n-1)S^2}{\sigma^2}}\), where the random variable inside the square root, denoted \(X\), follows \(\chi^2_{n-1}\). Therefore, \[\begin{equation} \begin{split} E(\sqrt{X})&=\int_0^{\infty}x^{1/2}\frac{1}{2^{\frac{n-1}{2}}\Gamma(\frac{n-1}{2})}x^{\frac{n-1}{2}-1}e^{-\frac{x}{2}}dx\\ &=\frac{2^{\frac{n}{2}}\Gamma(\frac{n}{2})}{2^{\frac{n-1}{2}}\Gamma(\frac{n-1}{2})}\\ &=\frac{\sqrt{2}\Gamma(\frac{n}{2})}{\Gamma(\frac{n-1}{2})} \end{split} \tag{4.15} \end{equation}\] Therefore, \(E(c\sqrt{S^2})=\frac{c\sigma}{\sqrt{n-1}}\frac{\sqrt{2}\Gamma(\frac{n}{2})}{\Gamma(\frac{n-1}{2})}\). Setting \(E(c\sqrt{S^2})=\sigma\) gives \(c=\frac{\sqrt{n-1}\Gamma(\frac{n-1}{2})}{\sqrt{2}\Gamma(\frac{n}{2})}\). Hence, a function satisfying the requirement is \(g(S^2)=\frac{\sqrt{n-1}\Gamma(\frac{n-1}{2})}{\sqrt{2}\Gamma(\frac{n}{2})}\sqrt{S^2}\).
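
A short simulation sketch (assuming arbitrary values \(\sigma=2\) and \(n=5\)) confirming that \(E[g(S^2)]=\sigma\) for the constant \(c\) derived above.

```python
# Check that E[c * sqrt(S^2)] = sigma for the constant c found in Exercise 4.4.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(3)
mu, sigma, n, reps = 0.0, 2.0, 5, 200_000

# c = sqrt(n-1) * Gamma((n-1)/2) / (sqrt(2) * Gamma(n/2)), on the log scale.
c = np.sqrt(n - 1) * np.exp(gammaln((n - 1) / 2) - gammaln(n / 2)) / np.sqrt(2)

s = rng.normal(mu, sigma, size=(reps, n)).std(axis=1, ddof=1)   # sqrt(S^2)
print((c * s).mean(), sigma)               # both should be about 2
```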

Exercise 4.5 (Casella and Berger 5.15) Show the following

  1. \(\bar{X}_n=\frac{X_{n}+(n-1)\bar{X}_{n-1}}{n}\)

  2. \((n-1)S^2_n=(n-2)S_{n-1}^2+(\frac{n-1}{n})(X_n-\bar{X}_{n-1})^2\)
Proof. By the definition of the sample mean, it is straightforward that \[\begin{equation} \bar{X}_n=\frac{\sum_{i=1}^nX_i}{n}=\frac{X_n+(n-1)\bar{X}_{n-1}}{n} \tag{4.16} \end{equation}\] For (b), we have \[\begin{equation} \begin{split} &(n-2)S_{n-1}^2+(\frac{n-1}{n})(X_n-\bar{X}_{n-1})^2\\ &=\sum_{i=1}^{n-1}(X_i-\bar{X}_{n-1})^2+\frac{n-1}{n}(X_n-\bar{X}_{n-1})^2\\ &=\sum_{i=1}^{n-1}(X_i-\bar{X}_{n}+\bar{X}_{n}-\bar{X}_{n-1})^2+\frac{n-1}{n}(X_n-\bar{X}_{n-1})^2\\ &=\sum_{i=1}^{n-1}(X_i-\bar{X}_{n})^2+(n-1)(\bar{X}_{n}-\bar{X}_{n-1})^2+2(\bar{X}_n-\bar{X}_{n-1})\sum_{i=1}^{n-1}(X_i-\bar{X}_n)\\ &+\frac{n-1}{n}(X_n-\bar{X}_{n-1})^2 \end{split} \tag{4.17} \end{equation}\] To simplify (4.17), we need the following simple results \[\begin{equation} \begin{split} &\bar{X}_{n}-\bar{X}_{n-1}=\frac{X_n-\bar{X}_n}{n-1} \\ &\sum_{i=1}^{n-1}(X_i-\bar{X}_n)=\bar{X}_n-X_n \\ &X_n-\bar{X}_{n-1}=\frac{n(X_n-\bar{X}_n)}{n-1} \end{split} \tag{4.18} \end{equation}\] Substituting (4.18) into (4.17), we obtain \[\begin{equation} \begin{split} &(n-2)S_{n-1}^2+(\frac{n-1}{n})(X_n-\bar{X}_{n-1})^2\\ &=\sum_{i=1}^{n-1}(X_i-\bar{X}_{n})^2+(n-1)\times\frac{(X_n-\bar{X}_n)^2}{(n-1)^2}\\ &+2(\frac{X_n-\bar{X}_n}{n-1})\times(\bar{X}_n-X_n)+\frac{n-1}{n}\times(\frac{n}{n-1})^2\times(X_n-\bar{X}_n)^2\\ &=\sum_{i=1}^{n-1}(X_i-\bar{X}_{n})^2+(\frac{1}{n-1}-\frac{2}{n-1}+\frac{n}{n-1})(X_n-\bar{X}_n)^2\\ &=\sum_{i=1}^{n}(X_i-\bar{X}_{n})^2=(n-1)S^2_n \end{split} \tag{4.19} \end{equation}\]
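
The two updating formulas are easy to verify numerically; the sketch below (an illustration only) checks them on an arbitrary sample.

```python
# Numeric check of the running-mean and running-variance updates in Exercise 4.5.
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=8)
n = len(x)

xbar_n, xbar_nm1 = x.mean(), x[:-1].mean()
s2_n, s2_nm1 = x.var(ddof=1), x[:-1].var(ddof=1)

# (a) mean update
print(np.isclose(xbar_n, (x[-1] + (n - 1) * xbar_nm1) / n))
# (b) variance update
lhs = (n - 1) * s2_n
rhs = (n - 2) * s2_nm1 + (n - 1) / n * (x[-1] - xbar_nm1) ** 2
print(np.isclose(lhs, rhs))
```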

Exercise 4.6 (Casella and Berger 5.16) Let \(X_i\), \(i=1,2,3\), be independent with \(X_i\sim N(i,i^2)\). For each of the following situations, use the \(X_i\)s to construct a statistic with the indicated distribution.

  1. chi squared with 3 degrees of freedom

  2. t distribution with 2 degrees of freedom

  3. F distribution with 1 and 2 degrees of freedom

Proof. (a) By definition, if \(Z_1,\cdots,Z_n\stackrel{i.i.d.}{\sim}N(0,1)\), then \(\sum_{i=1}^nZ_i^2\sim\chi_n^2\). Since \(\frac{X_i-i}{i}\sim N(0,1)\) for \(i=1,2,3\), we have \[\begin{equation} \sum_{i=1}^3(\frac{X_i-i}{i})^2\sim\chi_3^2 \tag{4.20} \end{equation}\]

(b) By definition, if \(U\sim N(0,1)\) and \(V\sim\chi_p^2\) are independent, then \(T=\frac{U}{\sqrt{V/p}}\sim t_p\). Therefore \[\begin{equation} \frac{6\sqrt{2}(X_1-1)}{\sqrt{9(X_2-2)^2+4(X_3-3)^2}}\sim t_2 \end{equation}\]

(c) By definition, if \(U\sim \chi_p^2\) and \(V\sim \chi_q^2\) are independent, then \(\frac{U/p}{V/q}\sim F_{p,q}\). Therefore \[\begin{equation} \frac{72(X_1-1)^2}{9(X_2-2)^2+4(X_3-3)^2} \sim F_{1,2} \tag{4.21} \end{equation}\] (A simulation check of all three constructions is sketched below.)
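
A simulation sketch of the three constructions (not part of the solution): each statistic is compared with its target distribution via a Kolmogorov–Smirnov test; since the distributions are exact, the p-values should look like draws from \(Uniform(0,1)\) rather than being systematically tiny.

```python
# Simulate X_i ~ N(i, i^2), i = 1, 2, 3, and test the constructed statistics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
reps = 100_000
x1 = rng.normal(1, 1, reps)
x2 = rng.normal(2, 2, reps)
x3 = rng.normal(3, 3, reps)

chi3 = (x1 - 1)**2 + ((x2 - 2) / 2)**2 + ((x3 - 3) / 3)**2
t2   = 6 * np.sqrt(2) * (x1 - 1) / np.sqrt(9 * (x2 - 2)**2 + 4 * (x3 - 3)**2)
f12  = 72 * (x1 - 1)**2 / (9 * (x2 - 2)**2 + 4 * (x3 - 3)**2)

print(stats.kstest(chi3, stats.chi2(3).cdf).pvalue)   # ~ chi^2_3
print(stats.kstest(t2,   stats.t(2).cdf).pvalue)      # ~ t_2
print(stats.kstest(f12,  stats.f(1, 2).cdf).pvalue)   # ~ F_{1,2}
```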

Exercise 4.7 (Casella and Berger 5.18) Let \(X\) be a random variable with a t distribution with p degrees of freedom.

  1. Derive the mean and variance of \(X\).

  2. Show that \(X^2\) has an F distribution with 1 and p degrees of freedom.

  3. Let \(f(x|p)\) denote the pdf of \(X\). Show that \[\begin{equation} \lim_{p\to\infty}f(x|p)=\frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}} \tag{4.22} \end{equation}\] at each value of \(x\), \(-\infty<x<\infty\). This suggests that as \(p\to\infty\), \(X\) converges in distribution to a \(N(0,1)\) random variable.

  4. Use the result of parts (a) and (b) to argue that, as \(p\to\infty\), \(X^2\) converges in distribution to a \(\chi_1^2\) random variable.

  5. Guess the distributional limit, as \(p\to\infty\), of \(qF_{q,p}\).

Proof. (a) For the mean, using the definition of the mean we have \[\begin{equation} ET_p=\int_{-\infty}^{+\infty}t\cdot\frac{\Gamma(\frac{p+1}{2})}{\Gamma(\frac{p}{2})}\frac{1}{(p\pi)^{1/2}}\frac{1}{(1+t^2/p)^{(p+1)/2}}dt \tag{4.23} \end{equation}\] Since the integrand of (4.23) is an odd function, the integral is 0 when \(p>1\).

As for the variance, notice that \(T_p=\frac{U}{\sqrt{V/p}}\) with independent \(U\sim N(0,1)\) and \(V\sim\chi_p^2\). Thus, \[\begin{equation} Var(T_p)=E(T_p^2)=pE(U^2)E(V^{-1})=\frac{p}{p-2},\quad \forall p>2 \tag{4.24} \end{equation}\] where we used the fact that the expectation of the inverse of a chi-squared random variable with \(p\) degrees of freedom is \(\frac{1}{p-2}\).

(b) By definition, \(X=\frac{U}{\sqrt{V/p}}\) with independent \(U\sim N(0,1)\) and \(V\sim\chi_p^2\). Therefore, \(X^2=\frac{U^2/1}{V/p}\) is the ratio of independent \(\chi_1^2\) and \(\chi_p^2\) random variables, each divided by its degrees of freedom, so \(X^2\sim F_{1,p}\) by definition.

(c) The pdf of a t-distributed r.v. with \(p\) degrees of freedom is \[\begin{equation} f(x|p)=\frac{\Gamma(\frac{p+1}{2})}{\Gamma(\frac{p}{2})\sqrt{p\pi}}(1+\frac{x^2}{p})^{-\frac{p+1}{2}}\propto(1+\frac{x^2}{p})^{-\frac{p+1}{2}} \tag{4.25} \end{equation}\]

Assume \(\lim_{p\to\infty}f(x|p)= f^*(x)\); then, by the dominated convergence theorem and the fact that \(f(x|p)\) is the pdf of a t-distributed random variable with \(p\) degrees of freedom for each \(p\), we can interchange the limit and the integral, which gives \[\begin{equation} \int_{-\infty}^{\infty}f^*(x)dx=\int_{-\infty}^{\infty}\lim_{p\to\infty}f(x|p)dx=\lim_{p\to\infty}\int_{-\infty}^{\infty}f(x|p)dx=1 \tag{4.26} \end{equation}\] Therefore, \(f^*(x)\) is itself a pdf. Now using (4.25), we have \[\begin{equation} f^*(x)\propto\lim_{p\to\infty}(1+\frac{x^2}{p})^{-\frac{p+1}{2}}=exp(-\frac{x^2}{2}) \tag{4.27} \end{equation}\] Since \(f^*(x)\) is a pdf, the missing normalizing constant must be \(\frac{1}{\sqrt{2\pi}}\), which proves the result.

(d) By part (b), \(X^2\sim F_{1,p}\) is the square of a t-distributed r.v. with \(p\) degrees of freedom, and by part (c) the t distribution converges in distribution to \(N(0,1)\) as \(p\to\infty\). Hence, by the continuous mapping theorem, \(X^2\) converges in distribution to the square of a standard normal r.v., that is, to a \(\chi^2_1\) random variable.

(e) \(qF_{q,p}\) can be written as the sum of \(q\) squared t-distributed random variables with \(p\) degrees of freedom (sharing a common denominator), and by part (d) each such square converges in distribution to \(\chi^2_1\) as \(p\to\infty\). Thus, we guess that \(qF_{q,p}\) converges in distribution to a \(\chi^2_q\) random variable as \(p\to\infty\). (A numerical check of (c) and (e) is sketched below.)
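
A small numerical sketch of parts (c) and (e) (purely illustrative; \(q=3\) is an arbitrary choice): the \(t_p\) density approaches the standard normal density, and the quantiles of \(qF_{q,p}\) approach those of \(\chi^2_q\), as \(p\) grows.

```python
# Numerical illustration of the limits in Exercise 4.7 (c) and (e).
import numpy as np
from scipy import stats

x = np.linspace(-4, 4, 9)
for p in (5, 50, 500):
    # max gap between the t_p pdf and the N(0,1) pdf on a grid
    print(p, np.abs(stats.t(p).pdf(x) - stats.norm.pdf(x)).max())

q, probs = 3, np.array([0.5, 0.9, 0.99])
for p in (5, 50, 500):
    # quantiles of q*F_{q,p} versus chi^2_q quantiles
    print(p, q * stats.f(q, p).ppf(probs), stats.chi2(q).ppf(probs))
```
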
Exercise 4.8 (Casella and Berger 5.21) What is the probability that the larger of two continuous i.i.d. random variables will exceed the population median? Generalize this result to samples of size \(n\).

Proof. Denote the population median by \(m\). For any \(n\in\{2,3,\cdots\}\), let \(X_{(n)}\) denote the largest observation. We have \[\begin{equation} Pr(X_{(n)}>m)=1-Pr(X_{(n)}\leq m)=1-(Pr(X_i\leq m))^n=1-\frac{1}{2^n} \tag{4.28} \end{equation}\]

(Note that \(m\) here is the population median, which is a constant, not the sample median.)
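
A quick Monte Carlo sketch (assuming a standard normal population, whose median is 0) agrees with \(1-1/2^n\).

```python
# Check P(max of n i.i.d. draws exceeds the population median) = 1 - 1/2^n.
import numpy as np

rng = np.random.default_rng(6)
reps = 200_000
for n in (2, 3, 5):
    x = rng.normal(size=(reps, n))
    print(n, (x.max(axis=1) > 0).mean(), 1 - 0.5**n)
```
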
Exercise 4.9 (Casella and Berger 5.22) Let \(X\) and \(Y\) be i.i.d. \(N(0,1)\) random variables and define \(Z=\min(X,Y)\). Prove that \(Z^2\sim\chi_1^2\).

Proof. Consider the distribution of \(Z^2\) directly. For \(z\in\mathbb{R}_+\), \[\begin{equation} \begin{split} Pr(Z^2\leq z)&=Pr(-\sqrt{z}\leq \min(X,Y)\leq\sqrt{z})\\ &=Pr(\min(X,Y)>-\sqrt{z})-Pr(\min(X,Y)>\sqrt{z})\\ &=Pr(X>-\sqrt{z})Pr(Y>-\sqrt{z})-Pr(X>\sqrt{z})Pr(Y>\sqrt{z})\\ &=(F(\sqrt{z}))^2-(1-F(\sqrt{z}))^2=2F(\sqrt{z})-1 \end{split} \tag{4.29} \end{equation}\] where the last line uses the symmetry \(Pr(X>-\sqrt{z})=F(\sqrt{z})\).

Differentiating (4.29) with respect to \(z\) gives \[\begin{equation} f_{Z^2}(z)=2f(\sqrt{z})\times\frac{1}{2}z^{-\frac{1}{2}}=\frac{1}{\sqrt{2\pi}}exp(-\frac{z}{2})z^{-\frac{1}{2}} \tag{4.30} \end{equation}\] which is the \(\chi_1^2\) pdf, as desired. Here \(F(z)\) and \(f(z)\) denote the cdf and pdf of a standard normal random variable, respectively.

Note that one cannot prove this by arguing that \(Z\) itself is a standard normal random variable, because it is not. In particular,

\[\begin{equation} Pr(Z\leq z)=1-Pr(Z>z)=1-(1-F(z))^2 \tag{4.31} \end{equation}\] where \(F(z)\) denotes the cdf of a standard normal r.v. evaluated at \(z\). This gives \(f_Z(z)=2(1-F(z))f(z)\), which is not the standard normal density, so \(Z\) is not standard normal and this route does not work.
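
A simulation sketch (illustration only): the sample mean of \(Z\) is clearly negative, so \(Z\) is not standard normal, yet \(Z^2\) passes a Kolmogorov–Smirnov test against \(\chi^2_1\).

```python
# Simulation check for Exercise 4.9: Z = min(X, Y) with X, Y i.i.d. N(0,1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x, y = rng.normal(size=(2, 200_000))
z = np.minimum(x, y)

print(z.mean())                                       # noticeably below 0
print(stats.kstest(z**2, stats.chi2(1).cdf).pvalue)   # consistent with chi^2_1
```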

Exercise 4.10 (Casella and Berger 5.23) Let \(U_i\), \(i=1,2,\cdots\), be independent \(uniform(0,1)\) random variables, and let \(X\) have distribution \[\begin{equation} P(X=x)=\frac{c}{x!},\quad x=1,2,\cdots \tag{4.32} \end{equation}\] where \(c=\frac{1}{e-1}\). Find the distribution of \(Z=\min\{U_1,\cdots,U_X\}\).

Proof. First consider the distribution of \(Z|X=x\); we have

\[\begin{equation} F_{Z|X=x}(z)=1-(1-z)^x \tag{4.33} \end{equation}\] Hence we can get the pdf as \[\begin{equation} f_{Z|X=x}(z)=x(1-z)^{x-1} \tag{4.34} \end{equation}\] with \(z\in(0,1)\) and \(x=1,2,\cdots\). Then, by the law of total probability, the marginal distribution of \(Z\) can be obtained from the joint distribution by summing out \(X\). Hence the pdf of \(Z\) is \[\begin{equation} \begin{split} f_Z(z)&=\sum_{x=1}^{\infty}x(1-z)^{x-1}\frac{c}{x!}\\ &=c\sum_{x=0}^{\infty}\frac{(1-z)^x}{x!}\\ &=ce^{1-z} \end{split} \tag{4.35} \end{equation}\] Thus, the distribution of \(Z\) is given by \(f_Z(z)=\frac{e^{1-z}}{e-1}\) for \(z\in(0,1)\).
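
A simulation sketch (not part of the solution): since \(P(X=x)=c/x!\), \(x\geq1\), is a Poisson(1) distribution truncated to \(x\geq1\), we can draw \(X\) by rejection and compare the simulated \(Z\) with the cdf implied by (4.35), namely \(F_Z(z)=\frac{e-e^{1-z}}{e-1}\).

```python
# Simulation check for Exercise 4.10.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
reps = 50_000

# P(X = x) = c/x!, x >= 1, is Poisson(1) conditioned on being >= 1.
x = rng.poisson(1.0, size=4 * reps)
x = x[x >= 1][:reps]

z = np.array([rng.uniform(size=k).min() for k in x])   # Z = min(U_1, ..., U_X)

def cdf(t):
    # cdf implied by (4.35)
    return (np.e - np.exp(1 - t)) / (np.e - 1)

print(stats.kstest(z, cdf).pvalue)                     # consistent with f_Z
```
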
Exercise 4.11 (Casella and Berger 5.31) Suppose \(\bar{X}\) is the mean of 100 observations from a population with mean \(\mu\) and variance \(\sigma^2=9\). Find limits between which \(\bar{X}-\mu\) will lie with probability at least 0.9. Use both Chebychev's inequality and the CLT, and comment on the difference.

Proof. For Chebychev's inequality, we only need that \(\bar{X}\) has mean \(\mu\) and variance \(\frac{\sigma^2}{n}=\frac{9}{100}\). Chebychev's inequality then gives \(Pr(|\bar{X}-\mu|\geq\frac{k\sigma}{\sqrt{n}})\leq\frac{1}{k^2}\). To obtain a region with probability at least 0.9 we take \(k=\sqrt{10}\). Hence the limits are \((-\frac{3\sqrt{10}}{10},\frac{3\sqrt{10}}{10})\approx(-0.9487,0.9487)\).

Using the CLT, \(\frac{\bar{X}-\mu}{\frac{\sigma}{\sqrt{n}}}\) has an approximate \(N(0,1)\) distribution. Hence the limits are given by \((0.3\times z_{0.05},\,0.3\times z_{0.95})=(-0.4935,0.4935)\), where \(z_t\) denotes the \(t\)-th quantile of the standard normal distribution, i.e. \(Pr(Y\leq z_t)=t\) for \(Y\) standard normally distributed.

The CLT interval is preferable since it is much shorter: it exploits the approximate normality of \(\bar{X}\), whereas Chebychev's inequality uses only the mean and the variance.
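
For reference, the two half-widths can be computed directly (a small sketch with \(n=100\), \(\sigma^2=9\), coverage 0.9).

```python
# Compare the Chebychev and CLT half-widths from Exercise 4.11.
import numpy as np
from scipy import stats

n, sigma, level = 100, 3.0, 0.9
se = sigma / np.sqrt(n)                        # 0.3

cheb = np.sqrt(1 / (1 - level)) * se           # k = sqrt(10) -> about 0.949
clt = stats.norm.ppf(0.5 + level / 2) * se     # z_{0.95} * 0.3 -> about 0.493
print(cheb, clt)
```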

Exercise 4.12 (Casella and Berger 5.44) Let \(X_i,i=1,2,\cdots\), be independent Bernoulli(\(p\)) random variables and let \(Y_n=\frac{1}{n}\sum_{i=1}^nX_i\).

  1. Show that \(\sqrt{n}(Y_n-p)\to N(0,p(1-p))\) in distribution.

  2. Show that for \(p\neq\frac{1}{2}\), the estimate of variance \(Y_n(1-Y_n)\) satisfies \(\sqrt{n}[Y_n(1-Y_n)-p(1-p)]\to N(0,(1-2p)^2p(1-p))\) in distribution.

  3. Show that for \(p=\frac{1}{2}\), \(n[Y_n(1-Y_n)-\frac{1}{4}]\to -\frac{1}{4}\chi_1^2\) in distribution.

Proof. (a) Each \(X_i\) has mean \(p\) and variance \(p(1-p)\), so \(Y_n\) has expectation \(p\) and variance \(\frac{p(1-p)}{n}\). Thus, by the CLT, \(\frac{Y_n-p}{\sqrt{\frac{p(1-p)}{n}}}\to N(0,1)\) in distribution, that is, \(\sqrt{n}(Y_n-p)\to N(0,p(1-p))\) in distribution.

(b) From (a), apply the Delta method with \(g(\theta)=\theta(1-\theta)\), so that \(g^{\prime}(\theta)=1-2\theta\). When \(g^{\prime}(p)\neq0\), or equivalently \(p\neq\frac{1}{2}\), we obtain \(\sqrt{n}[Y_n(1-Y_n)-p(1-p)]\to N(0,(1-2p)^2p(1-p))\) in distribution.

(c) When \(p=\frac{1}{2}\) we have \(g^{\prime}(\frac{1}{2})=0\) and \(g^{\prime\prime}(\frac{1}{2})=-2\), so the second-order Delta method gives \(n[Y_n(1-Y_n)-\frac{1}{4}]\to \frac{g^{\prime\prime}(\frac{1}{2})}{2}\cdot\frac{1}{4}\chi_1^2=-\frac{1}{4}\chi_1^2\) in distribution, where \(\frac{1}{4}=p(1-p)\) is the asymptotic variance from (a). (A simulation check of (b) and (c) is sketched below.)
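
A simulation sketch of (b) and (c) (illustration only; \(n=2000\) and \(p=0.3\) are arbitrary choices): the variance in the \(p\neq\frac{1}{2}\) case is close to \((1-2p)^2p(1-p)\), and at \(p=\frac{1}{2}\) the quantiles of \(-4n[Y_n(1-Y_n)-\frac{1}{4}]\) match those of \(\chi^2_1\).

```python
# Simulation check of the Delta method results in Exercise 4.12.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
n, reps = 2_000, 100_000

# (b) p != 1/2: first-order Delta method scaling.
p = 0.3
y = rng.binomial(n, p, reps) / n
w = np.sqrt(n) * (y * (1 - y) - p * (1 - p))
print(w.var(), (1 - 2 * p) ** 2 * p * (1 - p))         # close to each other

# (c) p = 1/2: second-order Delta method scaling.
y = rng.binomial(n, 0.5, reps) / n
v = n * (y * (1 - y) - 0.25)                           # ~ -(1/4) chi^2_1
probs = [0.5, 0.9, 0.99]
print(np.quantile(-4 * v, probs), stats.chi2(1).ppf(probs))
```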

References

Casella, George, and Roger Berger. 2002. *Statistical Inference*. 2nd ed. Belmont, CA: Duxbury Resource Center.