Exercises
Probability review
Exercise 1.1 Let (Ω,A,P) be a probability space. Prove that, for each B∈A with P(B)>0, P(⋅|B):A→[0,1] is a probability function on (Ω,A), and hence (Ω,A,P(⋅|B)) is a probability space.
Exercise 1.2 Prove that if A1,…,Ak are independent events, then
\begin{align*} \mathbb{P}\left(\bigcup_{i=1}^k A_i\right)=1-\prod_{i=1}^k\left(1-\mathbb{P}(A_i)\right). \end{align*}
Exercise 1.3 Prove the law of total probability by expanding \mathbb{P}(B)=\mathbb{P}\left(B\cap\left(\cup_{i=1}^k A_i\right)\right) (why?).
Exercise 1.4 Prove Bayes’ theorem from the law of total probability and the definition of conditional probability.
Exercise 1.5 Consider the discrete sample space Ω={a,b,c,d} and the mapping X:Ω→R such that X(a)=X(b)=0 and X(c)=X(d)=1. Consider the σ-algebra generated by the sets {a} and {c,d}. Prove that X is a rv for this σ-algebra.
Exercise 1.6 Consider the σ-algebra generated by the subsets {a} and {c} for Ω={a,b,c,d}.
- Prove that the mapping in Exercise 1.5 is not a rv for this σ-algebra.
- Define a mapping that is a rv for this σ-algebra.
Exercise 1.7 Consider the experiment of tossing a coin twice.
- Provide the sample space.
- For the σ-algebra P(Ω), consider the rv X= “Number of heads in two tosses”. Provide the range and the probability function induced by X.
Exercise 1.8 For the experiment of Exercise 1.7, consider the rv Y= “Difference between the number of heads and the number of tails”. Obtain its range and its induced probability function.
Exercise 1.9 A die is rolled twice. Obtain the sample space and the pmf of the following rv’s:
- X1= “Sum of the resulting numbers”.
- X2= “Absolute value of difference of the resulting numbers”.
- X3= “Maximum of the resulting numbers”.
- X4= “Minimum of the resulting numbers”.
Indicate, for each rv, the pre-image of the set [1.5,3.5] and its induced probability.
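Since the sample space has only 36 equiprobable outcomes, the four pmfs and the pre-images can be cross-checked by direct enumeration. A minimal Python sketch (the dictionary labels are just informal names for the four rv’s):

```python
from itertools import product
from collections import Counter

# Enumerate the 36 equiprobable outcomes of two die rolls.
outcomes = list(product(range(1, 7), repeat=2))

rvs = {
    "X1 (sum)": lambda a, b: a + b,
    "X2 (|difference|)": lambda a, b: abs(a - b),
    "X3 (maximum)": lambda a, b: max(a, b),
    "X4 (minimum)": lambda a, b: min(a, b),
}

for name, rv in rvs.items():
    values = [rv(a, b) for a, b in outcomes]
    pmf = {v: c / 36 for v, c in sorted(Counter(values).items())}
    # Pre-image of [1.5, 3.5]: outcomes whose image lies in that interval.
    preimage = [w for w in outcomes if 1.5 <= rv(*w) <= 3.5]
    print(name, pmf, "P(pre-image of [1.5, 3.5]) =", len(preimage) / 36)
```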
Random variables
Exercise 1.10 Assume that the lifespan (in hours) of a fluorescent tube is represented by the continuous rv X with pdf
\begin{align*} f(x)=\begin{cases} c/x^2, & x>100,\\ 0, & x\leq 100. \end{cases} \end{align*}
Compute:
- The value of c.
- The cdf of X.
- The probability that a tube lasts more than 500 hours.
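The three answers can be sanity-checked numerically, for instance in Python with scipy.integrate.quad (a sketch for checking, not a replacement for the closed-form derivations):

```python
import numpy as np
from scipy.integrate import quad

# The pdf must integrate to one: c * int_100^inf x^(-2) dx = 1, so
# c = 1 / int_100^inf x^(-2) dx. Evaluate the integral numerically.
integral, _ = quad(lambda x: x**-2, 100, np.inf)
c = 1 / integral
print("c =", c)  # compare with your closed-form value

# With that c, check the tail probability P(X > 500) numerically.
p_tail, _ = quad(lambda x: c / x**2, 500, np.inf)
print("P(X > 500) =", p_tail)
```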
Exercise 1.11 When flour is stored in 100 kg bags, a random error X (in kg) is made in the measurement of the weight of each bag. The pdf of the error X is given by
\begin{align*} f(x)=\begin{cases} k(1-x^2), & -1<x<1,\\ 0, & \text{otherwise.} \end{cases} \end{align*}
- Compute the probability that a bag weighs more than 99.5 kg.
- What is the percentage of bags with a weight between 99.8 and 100.2 kg?
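One possible numerical cross-check, assuming Python with scipy is available, noting that the bag weighs 100 + X kg:

```python
import numpy as np
from scipy.integrate import quad

# Normalize: k = 1 / int_{-1}^{1} (1 - x^2) dx.
k = 1 / quad(lambda x: 1 - x**2, -1, 1)[0]

# The bag weighs 100 + X kg, so "more than 99.5 kg" means X > -0.5
# and "between 99.8 and 100.2 kg" means -0.2 < X < 0.2.
f = lambda x: k * (1 - x**2)
print("P(weight > 99.5) =", quad(f, -0.5, 1)[0])
print("P(99.8 < weight < 100.2) =", quad(f, -0.2, 0.2)[0])
```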
Exercise 1.12 A rv takes only the values 1 and 3, each with non-zero probability. If its expectation is 8/3, find the probabilities of these two values.
Exercise 1.13 The number of calls received in a call center during a time interval of h minutes, X_h, has pmf
\begin{align*} \mathbb{P}(X_h=n)=\frac{(5h)^n}{n!}e^{-5h},\quad n=0,1,2,\ldots \end{align*}
- Find the average number of calls received in half an hour.
- How much time has to pass so that the expected number of received calls is 100?
Exercise 1.14 Let X be a rv following a Poisson distribution (see Example 1.14). Compute the expectation and variance of the new rv
\begin{align*} Y=\begin{cases} 1, & \text{if } X=0,\\ 0, & \text{if } X\neq 0. \end{cases} \end{align*}
Exercise 1.15 Consider a rv with density function
\begin{align*} f(x)=\begin{cases} 6x(1-x), & 0<x<1,\\ 0, & \text{otherwise.} \end{cases} \end{align*}
Compute:
- \mathbb{E}[X] and \mathrm{Var}[X].
- \mathbb{P}\left(|X-\mathbb{E}[X]|<\sqrt{\mathrm{Var}[X]}\right).
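A way of cross-checking both answers: f is the \mathrm{Beta}(2,2) density, so the quantities can be approximated by Monte Carlo, for instance in Python with numpy:

```python
import numpy as np

# f(x) = 6x(1-x) on (0,1) is the Beta(2, 2) density, so draws are direct.
rng = np.random.default_rng(42)
x = rng.beta(2, 2, size=10**6)

mean, var = x.mean(), x.var()
print("E[X]   ≈", mean)    # compare with your closed-form expectation
print("Var[X] ≈", var)     # compare with your closed-form variance
print("P(|X - E[X]| < sd) ≈", np.mean(np.abs(x - mean) < np.sqrt(var)))
```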
Moment generating function
Exercise 1.16 For each of the pmf’s given below, find the mgf and, using it, obtain the expectation and variance of the corresponding rv.
- \mathbb{P}(X=1)=p, \mathbb{P}(X=0)=1-p, where p\in(0,1).
- \mathbb{P}(X=n)=(1-p)^n p, n=0,1,2,\ldots, where p\in(0,1).
Exercise 1.17 Compute the mgf of \mathcal{U}(0,1). Then, use the result to obtain:
- The mgf of \mathcal{U}(-1,1).
- The mgf of \sum_{i=1}^n X_i, where the X_i\sim\mathcal{U}(-1/i,1/i) are independent.
- \mathbb{E}[\mathcal{U}(-1,1)^k], k\geq 1. (You can verify the mgf approach by computing \mathbb{E}[\mathcal{U}(-1,1)^k] directly.)
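The mgf you derive can be cross-checked against Monte Carlo estimates of \mathbb{E}[e^{tX}] at a few values of t; a sketch in Python with numpy:

```python
import numpy as np

# Monte Carlo values of M(t) = E[e^{tX}] for X ~ U(0,1), to compare
# against the closed-form mgf you derive.
rng = np.random.default_rng(1)
x = rng.uniform(0, 1, size=10**6)
for t in (-2.0, 0.5, 1.0, 3.0):
    print(f"t = {t:+.1f}: M(t) ≈ {np.mean(np.exp(t * x)):.4f}")
```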
Exercise 1.18 Compute the mgf of the following rv’s:
- \mathcal{N}(\mu,\sigma^2).
- \mathrm{Exp}(\lambda).
- X\sim f(\cdot;\theta) given in (1.13).
Exercise 1.19 Compute the mgf of X\sim\mathrm{Pois}(\lambda).
Exercise 1.20 (Poisson additive property) Prove the additive property of the Poisson distribution, that is, prove that if X_i, i=1,\ldots,n, are independent rv’s with respective distributions \mathrm{Pois}(\lambda_i), i=1,\ldots,n, then
\begin{align*} \sum_{i=1}^n X_i\sim\mathrm{Pois}\left(\sum_{i=1}^n\lambda_i\right). \end{align*}
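Before proving the property, it can be checked empirically; a Python sketch with numpy/scipy, comparing the empirical pmf of the sum with the claimed Poisson pmf:

```python
import numpy as np
from scipy import stats

# Empirical check: sum independent Pois(lambda_i) draws and compare the
# empirical pmf of the sum with the pmf of Pois(lambda_1 + ... + lambda_n).
rng = np.random.default_rng(7)
lambdas = np.array([0.5, 1.0, 2.5])
sums = rng.poisson(lam=lambdas, size=(10**6, lambdas.size)).sum(axis=1)

for n in range(8):
    emp = (sums == n).mean()
    the = stats.poisson.pmf(n, mu=lambdas.sum())
    print(f"P(sum = {n}): empirical {emp:.4f} vs theoretical {the:.4f}")
```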
Exercise 1.21 (Gamma additive property) Prove the additive property of the gamma distribution, that is, prove that if X_i, i=1,\ldots,n, are independent rv’s with respective distributions \Gamma(\alpha_i,\beta), i=1,\ldots,n, then
\begin{align*} \sum_{i=1}^n X_i\sim\Gamma\left(\sum_{i=1}^n\alpha_i,\beta\right). \end{align*}
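An analogous empirical check is possible via a Kolmogorov-Smirnov test. Caveat: the sketch below assumes \beta is a rate parameter, which corresponds to scale = 1/\beta in numpy/scipy; if the text’s \beta is a scale, use scale = \beta instead.

```python
import numpy as np
from scipy import stats

# Empirical check via Kolmogorov-Smirnov. Assumes Gamma(alpha, beta) with
# beta a *rate*, i.e. numpy/scipy scale = 1/beta; if beta is a scale
# parameter, replace 1/beta by beta below.
rng = np.random.default_rng(7)
alphas, beta = np.array([0.5, 1.0, 2.0]), 1.5
draws = rng.gamma(shape=alphas, scale=1 / beta, size=(10**5, alphas.size))
sums = draws.sum(axis=1)
print(stats.kstest(sums, "gamma", args=(alphas.sum(), 0, 1 / beta)))
```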
Random vectors
Exercise 1.22 Conclude Example 1.27 by computing the cdf for all (x_1,x_2)'\in\mathbb{R}^2. Split \mathbb{R}^2 into key regions to do so.
Exercise 1.23 Consider the random vector (X,Y)′ with joint pdf
\begin{align*} f(x,y)=e^{-x},\quad x>0,\ 0<y<x. \end{align*}
Compute \mathbb{E}[X], \mathbb{E}[Y], and \mathrm{Cov}[X,Y].
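The three quantities can be verified by numerical integration over the triangular support, for instance with scipy’s dblquad, whose inner limits may depend on the outer variable:

```python
import numpy as np
from scipy.integrate import dblquad

# Expectations over the triangular support x > 0, 0 < y < x. Note that
# dblquad integrates f(y, x), with the inner (y) limits depending on x.
def E(h):
    return dblquad(lambda y, x: h(x, y) * np.exp(-x),
                   0, np.inf, 0, lambda x: x)[0]

EX, EY, EXY = E(lambda x, y: x), E(lambda x, y: y), E(lambda x, y: x * y)
print("E[X] =", EX, " E[Y] =", EY, " Cov[X,Y] =", EXY - EX * EY)
```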
Exercise 1.24 Obtain the marginal pdf and cdf of X_2 in Example 1.26.
Exercise 1.25 Obtain the marginal cdf and pmf of X_1 in Example 1.27.
Exercise 1.26 Consider the joint pdf of the random vector (X,Y)′:
\begin{align*} f(x,y)=\begin{cases} c\,x^{-2}y^{-3}, & x>50,\ y>10,\\ 0, & \text{otherwise.} \end{cases} \end{align*}
- Find c such that f is a pdf.
- Find the marginal pdf of X.
- Find the marginal cdf of X.
- Compute \mathbb{P}(X>100,\ Y>10).
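As in the univariate exercises, the normalizing constant and the probability can be cross-checked numerically; a sketch with scipy’s dblquad:

```python
import numpy as np
from scipy.integrate import dblquad

# Normalize: c = 1 / (double integral of x^(-2) y^(-3) over x>50, y>10).
# dblquad integrates f(y, x): x runs in the outer limits, y in the inner.
integral, _ = dblquad(lambda y, x: x**-2 * y**-3, 50, np.inf, 10, np.inf)
c = 1 / integral
print("c =", c)  # compare with your closed-form value

# Check P(X > 100, Y > 10) with the normalized pdf.
p, _ = dblquad(lambda y, x: c * x**-2 * y**-3, 100, np.inf, 10, np.inf)
print("P(X > 100, Y > 10) =", p)
```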
Exercise 1.27 Consider the joint pdf of the random vector (X,Y)′:
\begin{align*} f(x,y)=\begin{cases} K(x^2+y^2), & 2\leq x\leq 3,\ 2\leq y\leq 3,\\ 0, & \text{otherwise.} \end{cases} \end{align*}
- Find the value of K such that f is a pdf.
- Find the marginal pdf of X.
- Compute \mathbb{P}(Y>2.5).
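The same kind of numerical cross-check applies on the square support; for instance:

```python
import numpy as np
from scipy.integrate import dblquad

# Normalize: K = 1 / (double integral of x^2 + y^2 over [2,3] x [2,3]).
K = 1 / dblquad(lambda y, x: x**2 + y**2, 2, 3, 2, 3)[0]
print("K =", K)  # compare with your closed-form value

# P(Y > 2.5): integrate the normalized pdf over 2 <= x <= 3, 2.5 <= y <= 3.
p = dblquad(lambda y, x: K * (x**2 + y**2), 2, 3, 2.5, 3)[0]
print("P(Y > 2.5) =", p)
```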
Exercise 1.28 Obtain the pdfs of Y|X=x and X|Y=y in Exercise 1.26 by applying (1.4). Check that each conditional pdf integrates to one for every value of the conditioning variable.
Exercise 1.30 Compute the conditional cdfs of Y|X=x and X|Y=y in Exercise 1.26. Do they equal y\mapsto F(x,y)/F_X(x) and x\mapsto F(x,y)/F_Y(y)?
Exercise 1.31 Compute the variance-covariance matrix of the random vector with pdf given in Exercise 1.27.
Exercise 1.32 Prove that \mathbb{E}[\mathrm{Exp}(\lambda)]=1/\lambda and \mathrm{Var}[\mathrm{Exp}(\lambda)]=1/\lambda^2.
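Both identities can be checked numerically for any fixed \lambda before proving them in general; a short Python sketch:

```python
import numpy as np
from scipy.integrate import quad

# Numerical check of both identities for one (arbitrary) positive rate.
lam = 2.0
f = lambda x: lam * np.exp(-lam * x)
m1 = quad(lambda x: x * f(x), 0, np.inf)[0]
m2 = quad(lambda x: x**2 * f(x), 0, np.inf)[0]
print("E[X] =", m1, "vs 1/lambda =", 1 / lam)
print("Var[X] =", m2 - m1**2, "vs 1/lambda^2 =", 1 / lam**2)
```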
Transformations of random vectors
Exercise 1.33 Let \boldsymbol{X}\sim f_{\boldsymbol{X}} in \mathbb{R}^p. Show that:
- For a\neq0, a\boldsymbol{X}\sim f_{\boldsymbol{X}}(\cdot/a)/|a|^p.
- For a p\times p rotation matrix \boldsymbol{R}, \boldsymbol{R}\boldsymbol{X}\sim f_{\boldsymbol{X}}\left(\boldsymbol{R}' \cdot \right).
Exercise 1.34 Consider the random vector (X,Y)' with pdf
\begin{align} f(x,y)=\begin{cases} K(1 - x^2-y^2),& (x,y)'\in S, \\ 0,&\text{otherwise.} \end{cases} \tag{1.12} \end{align}
Obtain:
- The support S and K>0.
- The pdf of X+Y.
- The pdf of XY.
Exercise 1.35 Let X_1,\ldots,X_n\sim\Gamma(1,\theta) be independent. Using only Corollary 1.2, compute the pdf of:
- X_1+X_2.
- X_1+X_2+X_3.
- X_1+\cdots+X_n.
Exercise 1.36 Corroborate the usefulness of Proposition 1.10 by computing the following expectations in two ways: (1) as \mathbb{E}[g(X)]=\int g(x)f_X(x)\,\mathrm{d}x and (2) as \mathbb{E}[Y]=\int yf_Y(y)\,\mathrm{d}y, where Y=g(X).
- X\sim \mathcal{U}(0,1) and g(x)=x^2.
- X\sim \mathrm{Exp}(1) and g(x)=e^x.
Exercise 1.37 Consider X\sim \mathcal{U}(-2,2) and Y=g(X), with g as in Example 1.33. Plot the density function f_Y and explain intuitively why it has such a “Sauron-inspired” shape. You can validate the form of f_Y by simulating random values of Y and drawing their histogram.
Exercise 1.38 Let X\sim \mathcal{N}(0,1). Compute the pdf of X^2.
Exercise 1.39 Let (X,Y)'\sim \mathcal{N}_2(\boldsymbol{0},\boldsymbol{I}_2). Compute the pdf of X^2+Y^2. Then, compute the pdf of -2\log U, where U\sim\mathcal{U}(0,1). Is there any connection between both pdfs?
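The connection can be guessed empirically before deriving it, by simulating both quantities and comparing their samples, e.g., with a two-sample Kolmogorov-Smirnov test (Python with numpy/scipy assumed):

```python
import numpy as np
from scipy import stats

# Compare the samples of X^2 + Y^2, (X, Y)' ~ N_2(0, I_2), and of
# -2 log(U), U ~ U(0,1), via a two-sample Kolmogorov-Smirnov test.
rng = np.random.default_rng(0)
n = 10**5
sq_sum = rng.standard_normal(n)**2 + rng.standard_normal(n)**2
neg_log = -2 * np.log(rng.uniform(0, 1, size=n))
# A large p-value is consistent with equality in distribution.
print(stats.ks_2samp(sq_sum, neg_log))
```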
Exercise 1.40 Validate the densities of X+Y and XY from Exercise 1.34 with a simulation, as sketched after this list. To do so:
- Compute the pdf of R=\sqrt{1-\sqrt{U}} with U\sim\mathcal{U}(0,1).
- Show that (X,Y)'=(R\cos\Theta,R\sin\Theta)', with \Theta\sim\mathcal{U}(0,2\pi) independent from R, actually follows the pdf (1.12).
- Simulate X+Y and draw its histogram. Overlay f_{X+Y}.
- Simulate XY and draw its histogram. Overlay f_{XY}.
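A possible sketch for the simulation steps, in Python with numpy/matplotlib (the first two steps are analytical and left to the reader; overlay your derived f_{X+Y} and f_{XY} on the histograms):

```python
import numpy as np
import matplotlib.pyplot as plt

# Simulate (X, Y)' ~ (1.12) via the polar construction of the exercise:
# R = sqrt(1 - sqrt(U)) with U ~ U(0,1), Theta ~ U(0, 2*pi) independent.
rng = np.random.default_rng(3)
n = 10**5
r = np.sqrt(1 - np.sqrt(rng.uniform(0, 1, size=n)))
theta = rng.uniform(0, 2 * np.pi, size=n)
x, y = r * np.cos(theta), r * np.sin(theta)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.hist(x + y, bins=100, density=True)  # overlay your derived f_{X+Y}
ax1.set_title("X + Y")
ax2.hist(x * y, bins=100, density=True)  # overlay your derived f_{XY}
ax2.set_title("XY")
plt.show()
```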
Exercise 1.41 Consider the random variable X with pdf
\begin{align} f(x;\theta)=\frac{\theta}{\pi(\theta^2+x^2)},\quad x\in\mathbb{R},\ \theta>0.\tag{1.13} \end{align}
Obtain the pdf of Y=X\mod 2\pi.
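Since (1.13) is the Cauchy density with scale \theta, the shape of f_Y can be explored by simulation before deriving it; a Python sketch with numpy/matplotlib:

```python
import numpy as np
import matplotlib.pyplot as plt

# (1.13) is the Cauchy density with scale theta, so X can be simulated by
# scaling a standard Cauchy draw. Overlay your derived f_Y on the histogram.
rng = np.random.default_rng(11)
theta = 0.5
x = theta * rng.standard_cauchy(size=10**6)
y = np.mod(x, 2 * np.pi)  # Y = X mod 2*pi, with values in [0, 2*pi)
plt.hist(y, bins=100, density=True)
plt.show()
```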