## Exercises

### Probability review

**Exercise 1.1** Let \((\Omega,\mathcal{A},\mathbb{P})\) be a probability space. Prove that, for each \(B\in\mathcal{A}\) with \(\mathbb{P}(B)>0,\) \(\mathbb{P}(\cdot|B):\mathcal{A}\rightarrow[0,1]\) is a probability function on \((\Omega,\mathcal{A}),\) and hence \((\Omega,\mathcal{A},\mathbb{P}(\cdot|B))\) is a probability space.

**Exercise 1.2** Prove that if \(A_1,\ldots,A_k\) are independent events, then

\[\begin{align*} \mathbb{P}\left(\bigcup_{i=1}^kA_i\right)=1-\prod_{i=1}^k(1-\mathbb{P}(A_i)). \end{align*}\]

**Exercise 1.3** Prove the law of total probability by expanding \(\mathbb{P}(B)=\mathbb{P}\big(B\cap\big(\cup_{i=1}^k A_i\big)\big)\) (why?).

**Exercise 1.4** Prove Bayes’ theorem from the law of total probability and the definition of conditional probability.

**Exercise 1.5** Consider the discrete sample space \(\Omega=\{a,b,c,d\}\) and the mapping \(X: \Omega\rightarrow \mathbb{R}\) such that \(X(a)=X(b)=0\) and \(X(c)=X(d)=1.\) Consider the \(\sigma\)-algebra generated by the sets \(\{a\}\) and \(\{c,d\}.\) Prove that \(X\) is a rv for this \(\sigma\)-algebra.

**Exercise 1.6** Consider the \(\sigma\)-algebra generated by the subsets \(\{a\}\) and \(\{c\}\) for \(\Omega=\{a,b,c,d\}.\)

- Prove that the mapping in Exercise 1.5 is not a rv for this \(\sigma\)-algebra.
- Define a mapping that is a rv for this \(\sigma\)-algebra.

**Exercise 1.7** Consider the experiment consisting of tossing a coin twice.

- Provide the sample space.
- For the \(\sigma\)-algebra \(\mathcal{P}(\Omega),\) consider the rv \(X=\) “Number of heads in two tosses”. Provide the range and the probability function induced by \(X.\)

**Exercise 1.8** For the experiment of Exercise 1.7, consider the rv \(Y=\) “Difference between the number of heads and the number of tails”. Obtain its range and its induced probability function.

**Exercise 1.9** A die is rolled twice. Obtain the sample space and the pmf of the following rv’s:

- \(X_1=\) “Sum of the resulting numbers”.
- \(X_2=\) “Absolute value of difference of the resulting numbers”.
- \(X_3=\) “Maximum of the resulting numbers”.
- \(X_4=\) “Minimum of the resulting numbers”.

For each rv, indicate the pre-image of the set \([1.5,3.5]\) and its induced probability.
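The four pmfs and the pre-image probabilities can be cross-checked by brute-force enumeration of the \(36\) equiprobable outcomes. A Python sketch (a check, not a substitute for the derivation):

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# All 36 equiprobable outcomes of two die rolls
outcomes = list(product(range(1, 7), repeat=2))
rvs = {
    "X1": lambda a, b: a + b,       # sum
    "X2": lambda a, b: abs(a - b),  # absolute difference
    "X3": lambda a, b: max(a, b),   # maximum
    "X4": lambda a, b: min(a, b),   # minimum
}
probs = {}
for name, g in rvs.items():
    pmf = Counter(g(a, b) for a, b in outcomes)
    # All four rvs are integer-valued, so [1.5, 3.5] only captures {2, 3}
    probs[name] = Fraction(sum(c for v, c in pmf.items() if 1.5 <= v <= 3.5), 36)
    print(name, dict(sorted(pmf.items())), probs[name])
```

Exact fractions (via `fractions.Fraction`) avoid any floating-point ambiguity when comparing with the hand computation.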

### Random variables

**Exercise 1.10** Assume that the lifespan (in hours) of a fluorescent tube is represented by the continuous rv \(X\) with pdf

\[\begin{align*} f(x)=\begin{cases} c/x^2, & x>100,\\ 0, & x\leq 100. \end{cases} \end{align*}\]

Compute:

- The value of \(c.\)
- The cdf of \(X.\)
- The probability that a tube lasts more than \(500\) hours.

**Exercise 1.11** When storing flour in bags of \(100\) kg, a random error \(X\) is made in the measurement of the weight of the bags. The pdf of the error \(X\) is given by

\[\begin{align*} f(x)=\begin{cases} k(1-x^2), & -1<x<1,\\ 0, & \text{otherwise}. \end{cases} \end{align*}\]

- Compute the probability that a bag weighs more than \(99.5\) kg.
- What is the percentage of bags with a weight between \(99.8\) and \(100.2\) kg?
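A numerical cross-check with the composite midpoint rule, assuming the bag weight is \(100+X\) (so, e.g., a weight above \(99.5\) kg corresponds to \(X>-0.5\)):

```python
def mid(f, a, b, n=100_000):
    # composite midpoint rule on [a, b]
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

k = 1 / mid(lambda x: 1 - x**2, -1, 1)   # normalization: k * integral = 1
f = lambda x: k * (1 - x**2)
p_heavy = mid(f, -0.5, 1)   # P(weight > 99.5) = P(X > -0.5)
p_band = mid(f, -0.2, 0.2)  # P(99.8 < weight < 100.2) = P(-0.2 < X < 0.2)
print(k, p_heavy, p_band)
```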

**Exercise 1.12** A rv takes only the values \(1\) and \(3,\) each with non-zero probability. If its expectation is \(8/3,\) find the probabilities of these two values.

**Exercise 1.13** The random number of calls received in a call center during a time interval of \(h\) minutes, \(X_h,\) has pmf

\[\begin{align*} \mathbb{P}(X_h=n)=\frac{(5h)^n}{n!}e^{-5h}, \quad n=0,1,2,\ldots. \end{align*}\]

- Find the average number of calls received in half an hour.
- How long must the time interval be so that, on average, \(100\) calls are received?
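The first part can be verified numerically: summing \(n\,\mathbb{P}(X_h=n)\) with the pmf evaluated in log-space avoids overflow in \((5h)^n/n!\). A Python sketch:

```python
import math

def pois_mean(lam, n_max=2_000):
    # E[X_h] = sum over n of n * P(X_h = n), with the pmf evaluated in
    # log-space to avoid overflow of lam^n / n!
    return sum(n * math.exp(n * math.log(lam) - lam - math.lgamma(n + 1))
               for n in range(1, n_max))

half_hour_mean = pois_mean(5 * 30)  # h = 30 minutes
print(half_hour_mean)
```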

**Exercise 1.14** Let \(X\) be a rv following a Poisson distribution (see Example 1.14). Compute the expectation and variance of the new rv

\[\begin{align*} Y=\begin{cases} 1 & \mathrm{if}\ X=0,\\ 0 & \mathrm{if}\ X\neq 0. \end{cases} \end{align*}\]

**Exercise 1.15** Consider a rv \(X\) with pdf

\[\begin{align*} f(x)=\begin{cases} 6x(1-x), & 0<x<1,\\ 0, & \mathrm{otherwise}. \end{cases} \end{align*}\]

Compute:

- \(\mathbb{E}[X]\) and \(\mathbb{V}\mathrm{ar}[X].\)
- \(\mathbb{P}\left(|X-\mathbb{E}[X]|<\sqrt{\mathbb{V}\mathrm{ar}[X]}\right).\)
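Both items can be cross-checked numerically with the midpoint rule:

```python
def mid(f, a, b, n=200_000):
    # composite midpoint rule on [a, b]
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

f = lambda x: 6 * x * (1 - x)
mean = mid(lambda x: x * f(x), 0, 1)
var = mid(lambda x: (x - mean) ** 2 * f(x), 0, 1)
sd = var ** 0.5
p = mid(f, mean - sd, mean + sd)  # P(|X - E[X]| < sd)
print(mean, var, p)
```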

### Moment generating function

**Exercise 1.16** For each of the pmf’s given below, find the mgf and, using it, obtain the expectation and variance of the corresponding rv.

- \(\mathbb{P}(X=1)=p,\) \(\mathbb{P}(X=0)=1-p,\) where \(p\in(0,1).\)
- \(\mathbb{P}(X=n)=(1-p)^n p,\) \(n=0,1,2,\ldots,\) where \(p\in(0,1).\)
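The mgf route can be sanity-checked numerically: finite differences of \(M\) at \(t=0\) approximate \(M'(0)=\mathbb{E}[X]\) and \(M''(0)=\mathbb{E}[X^2].\) A Python sketch with the closed-form mgfs stated so the check is self-contained (the first rv is the Bernoulli, the second the geometric number of failures):

```python
import math

def moments_from_mgf(M, h=1e-4):
    m1 = (M(h) - M(-h)) / (2 * h)            # central difference ~ M'(0)
    m2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2  # ~ M''(0)
    return m1, m2 - m1**2                    # (mean, variance)

p = 0.3
bern = lambda t: 1 - p + p * math.exp(t)          # Bernoulli(p) mgf
geom = lambda t: p / (1 - (1 - p) * math.exp(t))  # geometric (failures) mgf
bern_m = moments_from_mgf(bern)
geom_m = moments_from_mgf(geom)
print(bern_m)  # ~ (p, p(1 - p))
print(geom_m)  # ~ ((1 - p)/p, (1 - p)/p^2)
```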

**Exercise 1.17** Compute the mgf of \(\mathcal{U}(0,1).\) Then, use the result to obtain:

- The mgf of \(\mathcal{U}(-1,1).\)
- The mgf of \(\sum_{i=1}^n X_i,\) where \(X_i\sim\mathcal{U}(-1/i,1/i)\) are independent.
- \(\mathbb{E}[\mathcal{U}(-1,1)^k],\) \(k\geq 1.\) (You can verify the mgf approach by computing \(\mathbb{E}[\mathcal{U}(-1,1)^k]\) directly.)

**Exercise 1.18** Compute the mgf of the following rv’s:

- \(\mathcal{N}(\mu,\sigma^2).\)
- \(\mathrm{Exp}(\lambda).\)
- \(X\sim f(\cdot;\theta)\) given in (1.13).

**Exercise 1.19** Compute the mgf of \(X\sim \mathrm{Pois}(\lambda).\)

**Exercise 1.20 (Poisson additive property)** Prove the additive property of the Poisson distribution, that is, prove that if \(X_i,\) \(i=1,\ldots,n,\) are independent rv’s with respective distributions \(\mathrm{Pois}(\lambda_i),\) \(i=1,\ldots,n,\) then

\[\begin{align*} \sum_{i=1}^n X_i\sim \mathrm{Pois}\left(\sum_{i=1}^n\lambda_i\right). \end{align*}\]
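The property can be corroborated empirically before proving it: if the claim holds, the sum of independent \(\mathrm{Pois}(1.5)\) and \(\mathrm{Pois}(2.5)\) draws must have mean and variance close to \(4.\) A Python sketch using Knuth's multiplication method to sample Poisson draws:

```python
import math
import random

def rpois(lam, rng):
    # Knuth's method: count uniforms whose running product stays above e^{-lam}
    limit, k, prod = math.exp(-lam), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

rng = random.Random(42)
n = 100_000
s = [rpois(1.5, rng) + rpois(2.5, rng) for _ in range(n)]
mean = sum(s) / n
var = sum((x - mean) ** 2 for x in s) / n
print(mean, var)  # both near 4 if the sum is Pois(1.5 + 2.5)
```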

**Exercise 1.21 (Gamma additive property)** Prove the additive property of the gamma distribution, that is, prove that if \(X_i,\) \(i=1,\ldots,n,\) are independent rv’s with respective distributions \(\Gamma(\alpha_i,\beta),\) \(i=1,\ldots,n,\) then

\[\begin{align*} \sum_{i=1}^n X_i\sim \Gamma\left(\sum_{i=1}^n\alpha_i,\beta\right). \end{align*}\]
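An analogous empirical check, assuming the rate parametrization in which \(\Gamma(\alpha,\beta)\) has mean \(\alpha/\beta\) and \(\Gamma(1,\beta)=\mathrm{Exp}(\beta)\):

```python
import math
import random

rng = random.Random(7)
beta, n = 2.0, 100_000
# Gamma(1, beta) = Exp(beta), sampled by inverse transform: -log(1 - U) / beta
s = [sum(-math.log(1 - rng.random()) / beta for _ in range(3))
     for _ in range(n)]
mean = sum(s) / n
var = sum((x - mean) ** 2 for x in s) / n
print(mean, var)  # near 3/beta and 3/beta^2 if the sum is Gamma(3, beta)
```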

### Random vectors

**Exercise 1.22** Conclude Example 1.27 by computing the cdf for all \((x_1,x_2)'\in\mathbb{R}^2.\) Split \(\mathbb{R}^2\) into key regions to do so.

**Exercise 1.23** Consider the random vector \((X,Y)'\) with joint pdf

\[\begin{align*} f(x,y)=e^{-x}, \quad x>0,\ 0<y<x. \end{align*}\]

Compute \(\mathbb{E}[X],\) \(\mathbb{E}[Y],\) and \(\mathbb{C}\mathrm{ov}[X,Y].\)

**Exercise 1.24** Obtain the marginal pdf and cdf of \(X_2\) in Example 1.26.

**Exercise 1.25** Obtain the marginal cdf and pmf of \(X_1\) in Example 1.27.

**Exercise 1.26** Consider the joint pdf of the random vector \((X,Y)'\):

\[\begin{align*} f(x,y)=\begin{cases} cx^{-2}y^{-3},&x>50,\ y>10,\\ 0,&\text{otherwise.} \end{cases} \end{align*}\]

- Find \(c\) such that \(f\) is a pdf.
- Find the marginal pdf of \(X.\)
- Find the marginal cdf of \(X.\)
- Compute \(\mathbb{P}(X>100,Y>10).\)
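Since the pdf factorizes, the last item can be cross-checked by Monte Carlo with inverse transforms of the two marginal cdfs. The sketch below presumes the solutions to the previous items, namely \(c=10000=50\cdot 200,\) \(F_X(x)=1-50/x,\) and \(F_Y(y)=1-100/y^2\):

```python
import random

rng = random.Random(3)
n = 200_000
hits = 0
for _ in range(n):
    x = 50 / (1 - rng.random())         # F_X^{-1}(u), pdf 50 x^{-2} on x > 50
    y = 10 / (1 - rng.random()) ** 0.5  # F_Y^{-1}(u), pdf 200 y^{-3} on y > 10
    hits += (x > 100) and (y > 10)
p_hat = hits / n
print(p_hat)  # Y > 10 always, so this estimates P(X > 100)
```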

**Exercise 1.27** Consider the joint pdf of the random vector \((X,Y)'\):

\[\begin{align*} f(x,y)=\begin{cases} K(x^2+y^2),&2\leq x\leq 3,\ 2\leq y\leq 3,\\ 0,&\text{otherwise.} \end{cases} \end{align*}\]

- Find the value of \(K\) such that \(f\) is a pdf.
- Find the marginal pdf of \(X.\)
- Compute \(\mathbb{P}(Y>2.5).\)

**Exercise 1.28** Obtain the pdfs of \(Y|X=x\) and \(X|Y=y\) in Exercise 1.26 by applying (1.4). Check that each conditional pdf integrates to one for each value of the conditioning variable.

**Exercise 1.30** Compute the conditional cdfs of \(Y|X=x\) and \(X|Y=y\) in Exercise 1.26. Do they equal \(y\mapsto F(x,y)/F_X(x)\) and \(x\mapsto F(x,y)/F_Y(y)\)?

**Exercise 1.31** Compute the variance-covariance matrix of the random vector with pdf given in Exercise 1.27.

**Exercise 1.32** Prove that \(\mathbb{E}[\mathrm{Exp}(\lambda)]=1/\lambda\) and \(\mathbb{V}\mathrm{ar}[\mathrm{Exp}(\lambda)]=1/\lambda^2.\)

### Transformations of random vectors

**Exercise 1.33** Let \(\boldsymbol{X}\sim f_{\boldsymbol{X}}\) in \(\mathbb{R}^p.\) Show that:

- For \(a\neq0,\) \(a\boldsymbol{X}\sim f_{\boldsymbol{X}}(\cdot/a)/|a|^p.\)
- For a \(p\times p\) rotation matrix \(\boldsymbol{R},\) \(\boldsymbol{R}\boldsymbol{X}\sim f_{\boldsymbol{X}}\left(\boldsymbol{R}' \cdot \right).\)

**Exercise 1.34** Consider the random vector \((X,Y)'\) with pdf

\[\begin{align} f(x,y)=\begin{cases} K(1 - x^2-y^2),& (x,y)'\in S, \\ 0,&\text{otherwise.} \end{cases} \tag{1.12} \end{align}\]

Obtain:

- The support \(S\) and \(K>0.\)
- The pdf of \(X+Y.\)
- The pdf of \(XY.\)

**Exercise 1.35** Let \(X_1,\ldots,X_n\sim\Gamma(1,\theta)\) be independent. Using only Corollary 1.2, compute the pdf of:

- \(X_1+X_2.\)
- \(X_1+X_2+X_3.\)
- \(X_1+\cdots+X_n.\)

**Exercise 1.36** Corroborate the usefulness of Proposition 1.10 by computing the following expectations as (1) \(\mathbb{E}[g(X)]=\int g(x)f_X(x)\,\mathrm{d}x\) and (2) \(\mathbb{E}[Y]=\int yf_Y(y)\,\mathrm{d}y,\) where \(Y=g(X).\)

- \(X\sim \mathcal{U}(0,1)\) and \(g(x)=x^2.\)
- \(X\sim \mathrm{Exp}(1)\) and \(g(x)=e^x.\)

**Exercise 1.37** Consider \(X\sim \mathcal{U}(-2,2)\) and \(Y=g(X),\) with \(g\) as in Example 1.33. Plot the density function \(f_Y\) and explain intuitively why it has such a “Sauron-inspired” shape. You can validate the form of \(f_Y\) by simulating random values of \(Y\) and drawing their histogram.

**Exercise 1.38** Let \(X\sim \mathcal{N}(0,1).\) Compute the pdf of \(X^2.\)
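Whatever pdf the change of variables yields, it must be consistent with a simulation: squared standard normal draws have mean \(1\) and variance \(2.\) A Python sketch of this check:

```python
import random

rng = random.Random(11)
n = 200_000
s = [rng.gauss(0, 1) ** 2 for _ in range(n)]
mean = sum(s) / n
var = sum((x - mean) ** 2 for x in s) / n
print(mean, var)  # near 1 and 2
```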

**Exercise 1.39** Let \((X,Y)'\sim \mathcal{N}_2(\boldsymbol{0},\boldsymbol{I}_2).\) Compute the pdf of \(X^2+Y^2.\) Then, compute the pdf of \(-2\log U,\) where \(U\sim\mathcal{U}(0,1).\) Is there any connection between both pdfs?
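A simulation hints at the connection: if the two pdfs coincide, the empirical moments of both samples should agree. A Python sketch:

```python
import math
import random

rng = random.Random(5)
n = 200_000
a = [rng.gauss(0, 1) ** 2 + rng.gauss(0, 1) ** 2 for _ in range(n)]
b = [-2 * math.log(1 - rng.random()) for _ in range(n)]
mean_a, mean_b = sum(a) / n, sum(b) / n
print(mean_a, mean_b)  # both near 2 if the two pdfs coincide
```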

**Exercise 1.40** Validate the densities of \(X+Y\) and \(XY\) from Exercise 1.34 with a simulation. To do so:

- Compute the pdf of \(R=\sqrt{1-\sqrt{U}}\) with \(U\sim\mathcal{U}(0,1).\)
- Show that \((X,Y)'=(R\cos\Theta,R\sin\Theta)',\) with \(\Theta\sim\mathcal{U}(0,2\pi)\) independent from \(R,\) actually follows the pdf (1.12).
- Simulate \(X+Y\) and draw its histogram. Overlay \(f_{X+Y}.\)
- Simulate \(XY\) and draw its histogram. Overlay \(f_{XY}.\)
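The first two bullets set up the sampler; a minimal Python sketch of the pipeline follows (moment checks instead of histograms: by symmetry \(\mathbb{E}[X+Y]=\mathbb{E}[XY]=0,\) and \(\mathbb{E}[X^2+Y^2]=\mathbb{E}[R^2]=1/3\)):

```python
import math
import random

rng = random.Random(13)
n = 200_000
sums, prods, r2 = [], [], []
for _ in range(n):
    r = math.sqrt(1 - math.sqrt(rng.random()))  # R = sqrt(1 - sqrt(U))
    theta = rng.uniform(0, 2 * math.pi)         # independent uniform angle
    x, y = r * math.cos(theta), r * math.sin(theta)
    sums.append(x + y)
    prods.append(x * y)
    r2.append(x * x + y * y)
mean_sum, mean_prod, mean_r2 = (sum(v) / n for v in (sums, prods, r2))
print(mean_sum, mean_prod, mean_r2)
```

Overlaying \(f_{X+Y}\) and \(f_{XY}\) on histograms of `sums` and `prods` completes the validation asked for in the last two bullets.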

**Exercise 1.41** Consider the random variable \(X\) with pdf

\[\begin{align} f(x;\theta)=\frac{\theta}{\pi(\theta^2+x^2)},\quad x\in\mathbb{R},\ \theta>0.\tag{1.13} \end{align}\]

Obtain the pdf of \(Y=X\mod 2\pi.\)