1.1 Past FYE Problems
Exercise 1.1 (By Huang, FYE 2020) A machine produces parts that are either defective or not. Let \(\theta\) denote the proportion of defectives among all parts produced by this machine. Suppose that we observe a fixed number \(n\) of such parts and let \(Y\) be the number of defectives among the \(n\) parts observed. We assume that the parts are conditionally independent given \(\theta\). Suppose that the marginal distribution of \(\theta\) follows the beta distribution with parameters \(\alpha\) and \(\beta\).
(i). (30%) Find the conditional distribution of \(\theta|Y\).
(ii). (30%) If \(\alpha=\beta=1\), find the marginal distribution of \(Y\).
(iii). (10%) Find the mean of \(Y\).
(iv). (30%) Find the variance of \(Y\).
Proof. (i). From the problem, we know that \(Y|\theta\sim Bin(n,\theta)\) and \(\theta\sim Beta(\alpha,\beta)\). Therefore, by Bayes' theorem, we have:
\[\begin{equation} p(\theta|Y)\propto \theta^{\alpha+y-1}(1-\theta)^{n-y+\beta-1} \tag{1.1} \end{equation}\]
Noticing that this is the kernel of a beta distribution, we conclude that \(\theta|Y\sim Beta(\alpha+y,\beta+n-y)\).
(ii). The marginal distribution of \(Y\) is
\[\begin{equation} m(y)=\int_0^1 \frac{{n \choose y}\theta^{\alpha+y-1}(1-\theta)^{n-y+\beta-1}}{B(\alpha,\beta)}d\theta=\frac{{n \choose y}B(\alpha+y,\beta+n-y)}{B(\alpha,\beta)} \tag{1.2} \end{equation}\]
which is a beta-binomial distribution. Taking \(\alpha=\beta=1\), we have \[\begin{equation} m(y)=\frac{{n \choose y}B(1+y,1+n-y)}{B(1,1)} \end{equation}\] where the Beta function is \(B(\alpha,\beta)=\frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha+\beta)}\). Substituting in the values, we have \[\begin{equation} \begin{split} & B(1+y,1+n-y)=\frac{y!(n-y)!}{(n+1)!}\\ & B(1,1)=1\\ & {n \choose y}=\frac{n!}{y!(n-y)!} \end{split} \tag{1.3} \end{equation}\] Substituting these back into (1.2) with \(\alpha=\beta=1\), we obtain \[\begin{equation} m(y)=\frac{1}{n+1} \tag{1.4} \end{equation}\] for \(y=0,1,\cdots,n\).
(iii). Since \(m(y)=\frac{1}{n+1}\) for \(y=0,1,\cdots,n\), i.e. \(Y\) is marginally uniform on \(\{0,1,\cdots,n\}\), by definition the mean is
\[\begin{equation}
E(Y)=\sum_{y=0}^n\frac{y}{n+1}=\frac{1}{n+1}\cdot\frac{n(n+1)}{2}=\frac{n}{2}
\tag{1.5}
\end{equation}\]
(iv). Similarly, \(E(Y^2)=\frac{1}{n+1}\sum_{y=0}^n y^2=\frac{1}{n+1}\cdot\frac{n(n+1)(2n+1)}{6}=\frac{n(2n+1)}{6}\), so the variance is
\[\begin{equation}
Var(Y)=E(Y^2)-(E(Y))^2=\frac{n(2n+1)}{6}-\frac{n^2}{4}=\frac{n(n+2)}{12}
\tag{1.6}
\end{equation}\]
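As a sanity check on (1.4)-(1.6) (not part of the original solution), here is a minimal Monte Carlo sketch in Python with numpy; the choice \(n=10\), the seed, and the number of replicates are arbitrary.

```python
# Exercise 1.1 with alpha = beta = 1: the marginal of Y should be uniform
# on {0, ..., n}, with mean n/2 and variance n(n+2)/12.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 10, 200_000

theta = rng.beta(1.0, 1.0, size=reps)   # theta ~ Beta(1, 1) = Uniform(0, 1)
y = rng.binomial(n, theta)              # Y | theta ~ Binomial(n, theta)

print(np.bincount(y, minlength=n + 1) / reps)  # each entry approx 1/11
print(y.mean(), n / 2)                         # approx 5.0
print(y.var(), n * (n + 2) / 12)               # approx 10.0
```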
Exercise 1.2 (By Juhee, FYE 2018) 1. (60%) Let \(X_1, \cdots, X_n\) be an independent and identically distributed (i.i.d.) sample from an exponential distribution with rate parameter \(\lambda\). Let \(Y=X_1+\cdots+X_n\).
(a). (25%) Show that \(Y\) follows a gamma distribution with shape parameter \(n\) and rate parameter \(\lambda\).
(b). (15%) Identify the asymptotic distribution of \(\sqrt{n}\{(Y/n)-(1/\lambda)\}\), and specify its parameters. Justify your answer.
(c). (20%) Consider a random variable \(Z\) with conditional distribution, \(Z|Y=y\), given by a Poisson distribution with mean \(y\). Derive the marginal distribution of \(Z\).
2. (40%) Let \(X_1\), \(X_2\) be i.i.d. exponential random variables with rate parameter \(\lambda\). Let \(U\) be a random variable which is uniformly distributed on \([0,1]\). Suppose that \(U\), \(X_1\), and \(X_2\) are independent. Let \(Z=(X_1+X_2)U\). Prove that \(Z\) is an exponential random variable with rate \(\lambda\).
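No written solution for this exercise appears in this section. As an informal check of part 2, the sketch below (in Python with numpy; \(\lambda=2\), the seed, and the sample size are arbitrary choices) simulates \(Z=(X_1+X_2)U\) and compares its empirical c.d.f. with \(1-e^{-\lambda z}\).

```python
# Part 2 of Exercise 1.2: Z = (X1 + X2) * U should again be Exp(lambda).
import numpy as np

rng = np.random.default_rng(1)
lam, reps = 2.0, 200_000

x1 = rng.exponential(1 / lam, size=reps)  # numpy's scale is the mean 1/lambda
x2 = rng.exponential(1 / lam, size=reps)
u = rng.uniform(0, 1, size=reps)
z = (x1 + x2) * u

# Empirical CDF of Z vs. the Exp(lambda) CDF at a few points
for t in (0.25, 0.5, 1.0, 2.0):
    print(t, (z <= t).mean(), 1 - np.exp(-lam * t))
```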
Exercise 1.3 (By Tatiana, FYE 2015) Nic is vacationing in Monte Carlo. The amount \(X\) (in dollars) he takes to the casino each evening is a random variable with probability density function
\[\begin{equation}
f_X(x)=\left\{\begin{aligned} &ax &\quad 0\leq x\leq 40\\ &0 &\quad o.w. \end{aligned}\right.
\tag{1.8}
\end{equation}\]
At the end of each night, the amount \(Y\) that he has on leaving the casino is uniformly distributed between zero and twice the amount he took in.
(a). (10%) Find the constant \(a\).
(b). (25%) Determine the joint probability density function \(f_{X,Y}(x,y)\). Be sure to indicate what the sample space is.
(c). (25%) What is the probability that on any given night Nic makes a positive profit at the casino? Justify your reasoning.
(d). (40%) Find the probability density function of Nic’s profit on any particular night, \(Z=Y-X\). What is \(E(Z)\)?

Proof. By definition, \(f_{X,Y}(x,y)=f_X(x)f_{Y|X}(y|x)\).
(a). We have \[\begin{equation} 1=\int_0^{40}ax\,dx=800a \tag{1.9} \end{equation}\] so \(a=\frac{1}{800}\) and \(f_X(x)=\frac{x}{800}\), for \(0\leq x\leq 40\).
(b). From the problem statement \(f_{Y|X}(y|x)=\frac{1}{2x}\), for \(y\in[0,2x]\). Therefore, \[\begin{equation} f_{X,Y}(x,y)=\left\{\begin{aligned} &\frac{1}{1600} &\quad 0\leq x\leq 40,\quad 0\leq y\leq 2x\\ &0 &\quad o.w. \end{aligned}\right. \tag{1.10} \end{equation}\]
(c). Nic makes a positive profit if \(Y>X\). This occurs with probability
\[\begin{equation}
P(Y>X)=\int\int_{y>x}f_{X,Y}(x,y)dydx=\int_0^{40}\int_{x}^{2x}\frac{1}{1600}dydx=\frac{1}{2}
\tag{1.11}
\end{equation}\]
We could have also arrived at this answer by realizing that for each possible value of \(X\), there is \(\frac{1}{2}\) probability that \(Y>X\).
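Part (d) is not solved above. The simulation sketch below checks (1.11) and also estimates \(E(Z)\), which should be \(0\) since \(E(Y|X)=X\). \(X\) is drawn by inverse-c.d.f. sampling from \(F_X(x)=x^2/1600\) on \([0,40]\); the seed and sample size are arbitrary.

```python
# Exercise 1.3: P(Y > X) should be 1/2 and E(Z) = E(Y - X) should be 0.
import numpy as np

rng = np.random.default_rng(2)
reps = 200_000

x = 40 * np.sqrt(rng.uniform(size=reps))  # inverse CDF of f_X(x) = x/800
y = rng.uniform(0, 2 * x)                 # Y | X = x ~ Uniform(0, 2x)
z = y - x                                 # Nic's profit

print((y > x).mean())  # approx 1/2, matching (1.11)
print(z.mean())        # approx 0
```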
Exercise 1.4 (By Tatiana, FYE 2015 Retake) Andrew shows up at BE 354 at time zero and spends his time exclusively typing e-mails. The times that his e-mails are sent follow a Poisson process with rate \(\lambda_A\) per hour. Let \(Y_1\) and \(Y_2\) be the times that Andrew’s first and second e-mails were sent.
1. (30%) Find the probability density function of \(Y_1^2\).
2. (20%) Find the conditional expectation of \(Y_2\) given \(Y_1\).
3. (30%) Find the joint p.d.f. of \(Y_1\) and \(Y_2\).
4. (20%) Let \(N\) be the number of e-mails sent by Andrew during the interval \([0,1]\). You are now told that \(\lambda_A\) is actually the realized value of an exponential random variable \(\Lambda\) with p.d.f. given by \(f_{\Lambda}(\lambda)=2e^{-2\lambda}, \lambda\geq 0\). Find \(E(N^2)\).
Proof. 1. Let \(Z=Y_1^2\). We first find the c.d.f. of \(Z\) and then differentiate to find the p.d.f. Since \(Y_1\sim Exp(\lambda_A)\) is nonnegative, \(F_{Y_1}(-\sqrt{z})=0\) for \(z>0\), so \[\begin{equation} F_Z(z)=P(Y_1^2\leq z)=P(-\sqrt{z}\leq Y_1\leq \sqrt{z})=F_{Y_1}(\sqrt{z})=1-e^{-\lambda_A\sqrt{z}} \tag{1.16} \end{equation}\] and \[\begin{equation} f_Z(z)=\frac{dF_Z(z)}{dz}=f_{Y_1}(\sqrt{z})\cdot\frac{1}{2}z^{-1/2}=\frac{\lambda_A}{2\sqrt{z}}e^{-\lambda_A\sqrt{z}} \tag{1.17} \end{equation}\] for \(z>0\).
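A quick numerical check of (1.16)-(1.17); the value \(\lambda_A=1.5\), the seed, and the sample size are arbitrary illustration choices.

```python
# Z = Y_1^2 with Y_1 ~ Exp(lambda_A): F_Z(z) = 1 - exp(-lambda_A * sqrt(z)).
import numpy as np

rng = np.random.default_rng(3)
lam, reps = 1.5, 200_000

y1 = rng.exponential(1 / lam, size=reps)  # first arrival time Y_1
z = y1 ** 2

for t in (0.1, 0.5, 1.0, 4.0):
    print(t, (z <= t).mean(), 1 - np.exp(-lam * np.sqrt(t)))
```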
2. Define \(T_2\) as the second inter-arrival time of the Poisson process, so that \(Y_2=Y_1+T_2\) with \(T_2\sim Exp(\lambda_A)\) independent of \(Y_1\). Then \[\begin{equation} E(Y_2|Y_1)=E(Y_1+T_2|Y_1)=Y_1+E(T_2)=Y_1+\frac{1}{\lambda_A} \end{equation}\]
3. \[\begin{equation} \begin{split} f_{Y_1,Y_2}(y_1,y_2)&=f_{Y_1}(y_1)f_{Y_2|Y_1}(y_2|y_1)=f_{Y_1}(y_1)f_{T_2}(y_2-y_1)\\ &=\lambda_Ae^{-\lambda_Ay_1}\lambda_Ae^{-\lambda_A(y_2-y_1)}=\lambda_A^2e^{-\lambda_Ay_2} \end{split} \tag{1.18} \end{equation}\] for \(y_2\geq y_1\geq 0\).
4. \[\begin{equation} \begin{split} E(N^2)&=E[E(N^2|\Lambda)]=E[Var(N|\Lambda)+(E(N|\Lambda))^2]=E(\Lambda+\Lambda^2)\\ &=E(\Lambda)+Var(\Lambda)+(E(\Lambda))^2\\ &=\frac{1}{2}+\frac{2}{2^2}=1 \end{split} \tag{1.19} \end{equation}\]
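And a direct simulation check of (1.19), sampling the hierarchy \(\Lambda\sim Exp(2)\), \(N|\Lambda\sim Poisson(\Lambda)\) (seed and sample size arbitrary):

```python
# E(N^2) should equal 1 under the stated hierarchy.
import numpy as np

rng = np.random.default_rng(4)
reps = 500_000

lam = rng.exponential(1 / 2, size=reps)  # Lambda ~ Exp(rate 2), mean 1/2
n_emails = rng.poisson(lam)              # N | Lambda ~ Poisson(Lambda)

print((n_emails ** 2).mean())            # approx 1
```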
Exercise 1.5 (By Raquel, FYE 2014) Let \(X_1,X_2,X_3\) be independent and identically distributed exponential random variables with density \[\begin{equation} f_X(x)=\left\{\begin{aligned} &\beta e^{-\beta x} & x>0\\ & 0 & o.w. \end{aligned} \right. \tag{1.20} \end{equation}\] Let \(Y_i=X_1+\cdots+X_i\) for \(i=1,2,3\).
1. (30%) Find the joint p.d.f. of \(Y_1\) and \(Y_2\).
2. (20%) Find the p.d.f. of \(Y_2\).
3. (30%) Find the joint p.d.f. of \(Y_1,Y_2\), and \(Y_3\).
4. (20%) Show that the p.d.f. of \(Y_3\) is \[\begin{equation} f_{Y_3}(y_3)=\left\{\begin{aligned} &\beta^3\frac{y_3^2}{2} e^{-\beta y_3} & y_3>0\\ & 0 & o.w. \end{aligned} \right. \tag{1.21} \end{equation}\] In other words, show that \(Y_3\) is a Gamma random variable with parameters \(3\) and \(\beta\).
Proof. 1. The joint p.d.f. of \(Y_1\) and \(Y_2\) is given by \(f_{Y_1,Y_2}(y_1,y_2)=f_{X_1,X_2}(s_1,s_2)|J|\) with \(s_1(Y_1,Y_2)=X_1\), \(s_2(Y_1,Y_2)=X_2\) and \(J\) the Jacobian of the transformation.
In this case \(X_1=s_1(Y_1,Y_2)=Y_1\), \(X_2=s_2(Y_1,Y_2)=Y_2-Y_1\) and \[\begin{equation} J=det\begin{pmatrix} 1 & 0 \\ -1 & 1 \end{pmatrix}=1 \tag{1.22} \end{equation}\] Then, \[\begin{equation} f_{Y_1,Y_2}(y_1,y_2)=\left\{\begin{aligned} &\beta^2 e^{-\beta y_2} & y_2>y_1>0\\ & 0 & o.w. \end{aligned} \right. \tag{1.23} \end{equation}\]
2. The marginal p.d.f. of \(Y_2\) is \[\begin{equation} f_{Y_2}(y_2)=\int_0^{y_2}f_{Y_1,Y_2}(y_1,y_2)dy_1=\int_{0}^{y_2}\beta^2e^{-\beta y_2}dy_1=\beta^2e^{-\beta y_2}y_2 \tag{1.24} \end{equation}\] for \(y_2>0\) and \(f_{Y_2}(y_2)=0\) otherwise.
3. Similarly, the joint p.d.f. of \(Y_1,Y_2,Y_3\) is given by \(f_{Y_1,Y_2,Y_3}(y_1,y_2,y_3)=f_{X_1,X_2,X_3}(s_1,s_2,s_3)|J|\) with \(X_1=s_1(Y_1,Y_2,Y_3)=Y_1\), \(X_2=s_2(Y_1,Y_2,Y_3)=Y_2-Y_1\) and \(s_3(Y_1,Y_2,Y_3)=Y_3-Y_2\) and \[\begin{equation} J=det\begin{pmatrix} 1 & 0 & 0 \\ -1 & 1 & 0\\ 0 & -1 & 1 \end{pmatrix}=1 \tag{1.25} \end{equation}\] Then, \[\begin{equation} f_{Y_1,Y_2,Y_3}(y_1,y_2,y_3)=f_{X_1,X_2,X_3}(s_1,s_2,s_3)\times 1=\beta^3e^{-\beta y_3} \tag{1.26} \end{equation}\] for \(y_3>y_2>y_1>0\) and \(f_{Y_1,Y_2,Y_3}(y_1,y_2,y_3)=0\) otherwise.
4. We begin by integrating out the first variable:
\[\begin{equation}
f_{Y_2,Y_3}(y_2,y_3)=\int_{0}^{y_2}\beta^3e^{-\beta y_3}dy_1=\beta^3y_2e^{-\beta y_3}
\tag{1.27}
\end{equation}\]
The next integral then gives
\[\begin{equation} f_{Y_3}(y_3)=\int_{0}^{y_3}\beta^3y_2e^{-\beta y_3}dy_2=\beta^3\frac{y_3^2}{2}e^{-\beta y_3} \tag{1.28} \end{equation}\]
for \(y_3>0\) and 0 otherwise.
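A minimal simulation check of (1.28), i.e. that \(Y_3\sim Gamma(3,\beta)\); \(\beta=2\), the seed, and the sample size are arbitrary, and the Erlang form of the Gamma c.d.f. with integer shape is used for comparison.

```python
# Y_3 = X_1 + X_2 + X_3 should be Gamma(3, beta), with
# CDF 1 - exp(-beta*t) * (1 + beta*t + (beta*t)^2 / 2).
import numpy as np

rng = np.random.default_rng(5)
beta, reps = 2.0, 500_000

y3 = rng.exponential(1 / beta, size=(reps, 3)).sum(axis=1)

print(y3.mean(), 3 / beta)        # approx 1.5
print(y3.var(), 3 / beta ** 2)    # approx 0.75
t = 1.0
print((y3 <= t).mean(),
      1 - np.exp(-beta * t) * (1 + beta * t + (beta * t) ** 2 / 2))
```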
Exercise 1.6 (By Raquel, FYE 2014 Retake) Consider an electronic system comprised of three components denoted as \(C_1, C_2\), and \(C_3\). Let \(X_i\) be a random variable denoting the lifetime of component \(C_i\), for \(i=1,2,3\). Assume \(X_1, X_2\), and \(X_3\) are i.i.d. random variables each with p.d.f. \[\begin{equation} f(x)=\left\{\begin{aligned} & e^{-x} &\quad x>0 \\ & 0 & \quad o.w. \end{aligned}\right. \tag{1.29} \end{equation}\] Suppose that the system will operate as long as both component \(C_1\) and at least one of the components \(C_2\) and \(C_3\) operate. Let \(Z\) be the random variable that denotes the lifetime of the system.
1. (60%) Find the c.d.f. of \(Z\).
2. (40%) Obtain \(E(Z^2)\).
Proof. 1. \[\begin{equation} \begin{split} F(z)&=Pr(Z\leq z)=1-Pr(Z>z)=1-Pr((X_1>z)\cap((X_2>z)\cup(X_3>z)))\\ &=1-(Pr(X_1>z)(1-Pr(X_2\leq z)Pr(X_3\leq z)))=1-(e^{-z}(1-(1-e^{-z})^2))\\ &=1-2e^{-2z}+e^{-3z} \end{split} \tag{1.30} \end{equation}\] for \(z>0\).
2. The p.d.f. of \(Z\) is \(f(z)=4e^{-2z}-3e^{-3z}\) for \(z>0\). Therefore, \[\begin{equation} E(Z^2)=\int_0^{\infty}z^2(4e^{-2z}-3e^{-3z})dz=\frac{7}{9} \tag{1.31} \end{equation}\] This can be computed using the following: if \(X\sim Exp(\beta)\), \(E(X^2)=(E(X))^2+Var(X)=\frac{2}{\beta^2}\).
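The whole derivation can be checked by simulating the system lifetime \(Z=\min(X_1,\max(X_2,X_3))\) directly (seed and sample size arbitrary):

```python
# F(z) should match 1 - 2 exp(-2z) + exp(-3z), and E(Z^2) should be 7/9.
import numpy as np

rng = np.random.default_rng(6)
reps = 500_000

x = rng.exponential(1.0, size=(reps, 3))  # X1, X2, X3 ~ Exp(1)
z = np.minimum(x[:, 0], np.maximum(x[:, 1], x[:, 2]))

t = 0.5
print((z <= t).mean(), 1 - 2 * np.exp(-2 * t) + np.exp(-3 * t))
print((z ** 2).mean(), 7 / 9)             # approx 0.7778
```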
Exercise 1.7 (By Raquel, FYE 2013) An experiment consists of measuring the reaction time \(X\) (in seconds) to a certain stimulus. The stimulus is such that the reaction time cannot be less than 1 second and more than 3 seconds. The probability density function for \(X\) is given by \[\begin{equation} f(x)=\left\{\begin{aligned} &\frac{3}{2}x^{-2} & \quad 1\leq x\leq 3\\ & 0 & o.w.\end{aligned}\right. \tag{1.32} \end{equation}\]
1. (15%) Compute the expected reaction time.
2. (15%) Obtain the median reaction time.
3. (40%) If an individual takes more than 1.5 seconds to react, a light comes on and stays on either until one further second has elapsed or until the person reacts (whichever happens first). Let \(T\) denote the amount of time the light remains lit. Find the expectation of the random variable \(T\).
4. (30%) Suppose that the reaction time to the stimulus is measured for each of 10 individuals. Assume that reaction times for the individuals are independent, and that the probability density function of the reaction time is the same for all 10 individuals, given by \(f(x)\) above. Provide the distribution, including the values of its parameters, for the random variable \(Y\) defined as the number of individuals with reaction time less than 2.5 seconds.
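No written solution for this exercise appears in this section. As a numerical sketch (not an official answer key), note that \(F_X(x)=\frac{3}{2}(1-\frac{1}{x})\) on \([1,3]\) gives \(E(X)=\frac{3}{2}\ln 3\), median \(\frac{3}{2}\), and \(P(X<2.5)=\frac{9}{10}\), so \(Y\sim Binomial(10,0.9)\) in part 4. The code below samples \(X\) by inverting \(F_X\); the seed and sample size are arbitrary.

```python
# Inverse-CDF sampling: u = (3/2)(1 - 1/x)  =>  x = 3 / (3 - 2u).
import numpy as np

rng = np.random.default_rng(7)
reps = 500_000

u = rng.uniform(size=reps)
x = 3 / (3 - 2 * u)               # reaction times on [1, 3]

print(x.mean(), 1.5 * np.log(3))  # approx 1.648
print(np.median(x), 1.5)          # median = 3/2
print((x < 2.5).mean(), 0.9)      # success probability for part 4
```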
Exercise 1.8 (By Raquel, FYE 2012) Let \(X_1\) and \(X_2\) be two independent random variables each with p.d.f. \(f_X(x)=e^{-x},x>0\) and \(f_X(x)=0,x\leq 0\). Let \(Z=X_1-X_2\) and \(W=X_1/X_2\).
1. (25%) Show that the joint p.d.f. of \(Y=X_1\) and \(Z\) is given by: \[\begin{equation} g(y,z)=\exp(-2y+z) \tag{1.33} \end{equation}\] and find the region where this p.d.f. is defined.
2. (15%) Prove that the conditional p.d.f. of \(X_1\) given \(Z=0\) is: \[\begin{equation} h(x_1|0)=\left\{\begin{aligned} &2e^{-2x_1} & \quad x_1>0\\ & 0 & o.w.\end{aligned}\right. \tag{1.34} \end{equation}\]
3. (25%) Show that the joint p.d.f. of \(Y=X_1\) and \(W\) is given by: \[\begin{equation} g(y,w)=y\exp(-y(1+1/w))/w^2 \tag{1.35} \end{equation}\] and find the region where this p.d.f. is defined.
4. (15%) Prove that the conditional p.d.f. of \(X_1\) given \(W=1\) is: \[\begin{equation} h(x_1|1)=\left\{\begin{aligned} &4x_1e^{-2x_1} & \quad x_1>0\\ & 0 & o.w.\end{aligned}\right. \tag{1.36} \end{equation}\]
5. Note that \(\{Z=0\}=\{W=1\}\) but the conditional distribution of \(X_1\) given \(Z=0\) is not the same as the conditional distribution of \(X_1\) given \(W=1\). Can you comment on the results in Part 2 and Part 4?
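The contrast in this last part is the Borel-Kolmogorov paradox: conditioning on the same null event through different random variables gives different conditional laws. One way to see it numerically (a sketch, with arbitrary \(\varepsilon\), seed, and sample size) is to approximate the conditioning events by \(\varepsilon\)-bands: under (1.34) \(E(X_1|Z=0)=\frac{1}{2}\), while under (1.36) \(X_1|W=1\) is \(Gamma(2,2)\) with mean \(1\).

```python
# Epsilon-band approximation of conditioning on Z = 0 versus W = 1.
import numpy as np

rng = np.random.default_rng(8)
reps, eps = 2_000_000, 0.01

x1 = rng.exponential(1.0, size=reps)
x2 = rng.exponential(1.0, size=reps)

near_z = np.abs(x1 - x2) < eps       # |Z - 0| < eps
near_w = np.abs(x1 / x2 - 1) < eps   # |W - 1| < eps

print(x1[near_z].mean())  # approx 1/2, matching (1.34)
print(x1[near_w].mean())  # approx 1, matching (1.36)
```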
Exercise 1.9 (By Raquel, FYE 2012 Retake) Suppose that \(X_1\) and \(X_2\) are continuous i.i.d. random variables and that each of them has the uniform distribution on the \((0,1)\) interval. Let \(Y_1=X_1\), \(Y_2=X_1+X_2\), \(Z=X_1-X_2\), and \(U=X_1^2\).
1. (35%) Find the p.d.f. of \(Y_2\).
2. (30%) Find the p.d.f. of \(Z\).
3. (20%) Find the p.d.f. of \(U\).
4. (15%) Find \(E(Y_1Y_2)\).
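No written solution for this exercise appears in this section; as an informal sketch (standard facts, with arbitrary seed and sample size): \(Y_2\) is triangular on \([0,2]\), \(Z\) is triangular on \([-1,1]\), \(U\) has p.d.f. \(\frac{1}{2\sqrt{u}}\) on \((0,1)\), and \(E(Y_1Y_2)=E(X_1^2)+E(X_1)E(X_2)=\frac{1}{3}+\frac{1}{4}=\frac{7}{12}\). The code checks each claim at a single point.

```python
# Spot checks of the four parts of Exercise 1.9.
import numpy as np

rng = np.random.default_rng(9)
reps = 500_000

x1 = rng.uniform(size=reps)
x2 = rng.uniform(size=reps)
y2, z, u = x1 + x2, x1 - x2, x1 ** 2

print((y2 <= 0.5).mean(), 0.5 ** 2 / 2)   # triangular CDF t^2/2 on [0, 1]
print((z <= -0.5).mean(), 0.5 ** 2 / 2)   # P(Z <= -1/2) = 1/8 by symmetry
print((u <= 0.25).mean(), np.sqrt(0.25))  # P(U <= u) = sqrt(u)
print((x1 * y2).mean(), 7 / 12)           # E(Y1 Y2) approx 0.5833
```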