Joint c.d.f. and p.d.f.
The joint (cumulative) probability distribution function (joint c.d.f.) of $X$ and $Y$ is defined by
$$F_{X,Y}(x,y) = P(\{\omega : X(\omega) \leq x \text{ and } Y(\omega) \leq y\}) = P(X \leq x, Y \leq y),$$
where $x, y \in \mathbb{R}$.
Two r.v.'s $X$ and $Y$ are said to be jointly continuous if there exists a function $f_{X,Y}(x,y) \geq 0$ such that, for every "nice" set $C \subseteq \mathbb{R}^2$,
$$P((X,Y) \in C) = \iint_C f_{X,Y}(x,y) \, dx \, dy.$$
The function $f_{X,Y}$ is called the joint probability density function (joint p.d.f.) of $X$ and $Y$.
If $X$ and $Y$ are jointly continuous, then
$$F_{X,Y}(x,y) = P(X \leq x, Y \leq y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f_{X,Y}(u,v) \, du \, dv.$$
Hence we differentiate the c.d.f. $F_{X,Y}(x,y)$ with respect to both $x$ and $y$ to obtain the p.d.f.
$$f_{X,Y}(x,y) = \frac{\partial^2}{\partial x \, \partial y} F_{X,Y}(x,y).$$
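To see this relation in action, here is a short symbolic check. The particular c.d.f. $F_{X,Y}(x,y) = (1-e^{-x})(1-e^{-y})$ for $x, y > 0$ is an illustrative choice, not one taken from the notes, and SymPy is assumed to be available.

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

# An illustrative joint c.d.f. (that of two independent Exp(1) variables).
F = (1 - sp.exp(-x)) * (1 - sp.exp(-y))

# Joint p.d.f. as the mixed second partial derivative of the c.d.f.
f = sp.simplify(sp.diff(F, x, y))
print(f)  # exp(-x - y), i.e. e^{-(x+y)}

# Integrating the p.d.f. back over [0, x] x [0, y] (the density vanishes
# for negative arguments) recovers the c.d.f.
u, v = sp.symbols('u v', positive=True)
F_back = sp.integrate(f.subs({x: u, y: v}), (u, 0, x), (v, 0, y))
print(sp.simplify(F_back - F))  # 0
```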
Example 1. Suppose that
$$f_{X,Y}(x,y) = \begin{cases} 24x(1-x-y) & \text{if } x, y \geq 0 \text{ and } x + y \leq 1, \\ 0 & \text{otherwise.} \end{cases}$$
- Find $P(X > Y)$;
- Find $P(X > \frac{1}{2})$.
- Let $C = \{(x,y) : x > y\}$ and write $A = \{(x,y) : f_{X,Y}(x,y) > 0\}$. Then
$$C \cap A = \{(x,y) : x > 0, \, y > 0, \, x + y < 1, \, x > y\}.$$
Therefore
$$\begin{aligned} P(X>Y) = P((X,Y) \in C) &= \iint_C f_{X,Y}(x,y) \, dx \, dy = \iint_{C \cap A} 24x(1-x-y) \, dx \, dy \\ &= \int_0^{1/2} \int_y^{1-y} 24x(1-x-y) \, dx \, dy = \int_0^{1/2} \left[12x^2 - 8x^3 - 12yx^2\right]_y^{1-y} dy \\ &= \int_0^{1/2} \left(4 - 12y + 16y^3\right) dy = \left[4y - 6y^2 + 4y^4\right]_0^{1/2} = 2 - \frac{3}{2} + \frac{1}{4} = \frac{3}{4}. \end{aligned}$$
- Let $D = \{(x,y) : x > 1/2\}$; then
$$D \cap A = \{(x,y) : x > 1/2, \, y > 0, \, x + y < 1\}.$$
Therefore
$$\begin{aligned} P(X > 1/2) = P((X,Y) \in D) &= \iint_D f_{X,Y}(x,y) \, dx \, dy = \iint_{D \cap A} 24x(1-x-y) \, dx \, dy \\ &= \int_{1/2}^1 \int_0^{1-x} 24x(1-x-y) \, dy \, dx = \int_{1/2}^1 \left[24xy\left(1 - x - \tfrac{1}{2}y\right)\right]_0^{1-x} dx \\ &= \int_{1/2}^1 12x(1-x)^2 \, dx = \left[6x^2 - 8x^3 + 3x^4\right]_{1/2}^1 \\ &= (6 - 8 + 3) - \left(\frac{3}{2} - 1 + \frac{3}{16}\right) = \frac{5}{16}. \end{aligned}$$
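As a sanity check on these two answers, the probabilities can be estimated by simulation. The sketch below (assuming NumPy) samples from $f_{X,Y}$ by rejection from the unit square, using the bound $24x(1-x-y) \leq 6$ on the support.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x, y):
    """Joint p.d.f. 24x(1-x-y) on the triangle x, y >= 0, x + y <= 1."""
    inside = (x >= 0) & (y >= 0) & (x + y <= 1)
    return np.where(inside, 24 * x * (1 - x - y), 0.0)

# Rejection sampling: propose uniformly on the unit square and accept
# with probability f(x, y) / 6, since f <= 24x(1-x) <= 6 on the support.
n = 10**6
x, y = rng.uniform(size=n), rng.uniform(size=n)
keep = rng.uniform(size=n) * 6.0 <= f(x, y)
xs, ys = x[keep], y[keep]

print(np.mean(xs > ys))   # ~0.75,   P(X > Y)   = 3/4
print(np.mean(xs > 0.5))  # ~0.3125, P(X > 1/2) = 5/16
```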
Marginal c.d.f. and p.d.f.
There are many situations with bivariate distributions where we are interested in only one of the random variables. For example, we might have the joint distribution of height and weight of individuals but only be interested in their weight. The distribution of one variable on its own is known as its marginal distribution.
Suppose that the joint c.d.f. of $X$ and $Y$ is given by $F_{X,Y}$; then the c.d.f. of $X$ can be obtained from $F_{X,Y}$, since
$$F_X(x) = P(X \leq x) = P(X \leq x, Y < \infty) = \lim_{y \to \infty} F_{X,Y}(x,y).$$
$F_X$ is called the marginal distribution (marginal c.d.f.) of $X$.
If $f_{X,Y}$ is the joint p.d.f. of $X$ and $Y$, then the marginal probability density function (marginal p.d.f.) of $X$ is given by
$$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y) \, dy.$$
Consider Example 1. Find the marginal p.d.f. and c.d.f. of $Y$.
$$f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y) \, dx = \begin{cases} \int_0^{1-y} 24x(1-x-y) \, dx = 4(1-y)^3 & 0 \leq y \leq 1, \\ 0 & \text{otherwise.} \end{cases}$$
Hence,
$$F_Y(y) = \begin{cases} 0, & y < 0, \\ \int_0^y 4(1-u)^3 \, du = 1 - (1-y)^4, & 0 \leq y \leq 1, \\ 1, & y > 1. \end{cases}$$
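The marginal p.d.f. and c.d.f. of $Y$ can be reproduced symbolically; a minimal sketch, assuming SymPy:

```python
import sympy as sp

x, y, u = sp.symbols('x y u', nonnegative=True)

# Joint p.d.f. from Example 1 on the triangle x, y >= 0, x + y <= 1.
f_xy = 24 * x * (1 - x - y)

# Marginal p.d.f. of Y: integrate out x over 0 <= x <= 1 - y.
f_y = sp.factor(sp.integrate(f_xy, (x, 0, 1 - y)))
print(f_y)  # equivalent to 4*(1 - y)**3

# Marginal c.d.f. of Y for 0 <= y <= 1.
F_y = sp.expand(sp.integrate(f_y.subs(y, u), (u, 0, y)))
print(sp.factor(1 - F_y))  # equivalent to (1 - y)**4, so F_Y(y) = 1 - (1-y)^4
```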
Exercise 1. Find the p.d.f. of $Z = X/Y$, where
$$f_{X,Y}(x,y) = \begin{cases} e^{-(x+y)} & 0 < x, y < \infty, \\ 0 & \text{otherwise.} \end{cases}$$
Attempt Exercise 1 and then watch Video 14 for the solutions.
Video 14: Ratio of Exponentials
Alternatively, the solutions are available:
Solution to Exercise 1
Clearly, $Z > 0$. For $z > 0$,
$$\begin{aligned} F_Z(z) = P(Z \leq z) = P(X/Y \leq z) &= \iint_{\{(x,y) : x/y \leq z\}} f_{X,Y}(x,y) \, dx \, dy \\ &= \int_0^{\infty} \int_0^{yz} e^{-(x+y)} \, dx \, dy = \int_0^{\infty} \left(e^{-y} - e^{-y(1+z)}\right) dy \\ &= 1 - \frac{1}{1+z}, \end{aligned}$$
and so
$$f_Z(z) = \frac{dF_Z(z)}{dz} = \begin{cases} \frac{1}{(1+z)^2}, & z > 0, \\ 0, & z \leq 0. \end{cases}$$
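A quick Monte Carlo check of this result, assuming NumPy; $X$ and $Y$ are drawn as independent ${\rm Exp}(1)$ variables, which matches the factorization of $f_{X,Y}$ noted in the next subsection:

```python
import numpy as np

rng = np.random.default_rng(0)

# X, Y independent Exp(1); Z = X / Y.
n = 10**6
z = rng.exponential(size=n) / rng.exponential(size=n)

# Compare the empirical c.d.f. with F_Z(z) = 1 - 1/(1 + z).
for t in (0.5, 1.0, 2.0, 5.0):
    print(t, np.mean(z <= t), 1 - 1 / (1 + t))
```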
Note that we can extend the notion of joint and marginal distributions to random variables $X_1, X_2, \ldots, X_n$ in a similar fashion.
Independent random variables
Random variables $X$ and $Y$ are said to be independent if, for all $x, y \in \mathbb{R}$,
$$P(X \leq x, Y \leq y) = P(X \leq x) P(Y \leq y),$$
that is, for all $x, y \in \mathbb{R}$, $F_{X,Y}(x,y) = F_X(x) F_Y(y)$.
If $X$ and $Y$ are discrete random variables with joint p.m.f. $p_{X,Y}(x,y)$ and marginal p.m.f.'s $p_X(x)$ and $p_Y(y)$, respectively, then $X$ and $Y$ are independent if and only if, for all $x, y \in \mathbb{R}$,
$$p_{X,Y}(x,y) = p_X(x) \, p_Y(y).$$
If $X$ and $Y$ are continuous random variables with joint p.d.f. $f_{X,Y}(x,y)$ and marginal p.d.f.'s $f_X(x)$ and $f_Y(y)$, respectively, then $X$ and $Y$ are independent if and only if, for all $x, y \in \mathbb{R}$,
$$f_{X,Y}(x,y) = f_X(x) \, f_Y(y).$$
For example, in Exercise 1, $X$ and $Y$ have joint probability density function
$$f_{X,Y}(x,y) = \exp(-\{x+y\}) = \exp(-x) \exp(-y) = f_X(x) f_Y(y), \qquad (x, y > 0),$$
where both $X$ and $Y$ are distributed according to ${\rm Exp}(1)$. Thus the distribution of $Z$ given in Exercise 1 is that of the ratio of two independent exponential random variables with mean 1.
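The factorization criterion is straightforward to check symbolically. The sketch below (assuming SymPy) confirms the factorization for the density of Exercise 1 and, by way of contrast, shows that the density of Example 1 does not factorize into the product of its marginals, so there $X$ and $Y$ are not independent:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

# Exercise 1: f(x, y) = exp(-(x + y)) for x, y > 0.
f = sp.exp(-(x + y))
f_x = sp.integrate(f, (y, 0, sp.oo))  # marginal of X: exp(-x)
f_y = sp.integrate(f, (x, 0, sp.oo))  # marginal of Y: exp(-y)
print(sp.simplify(f - f_x * f_y))     # 0, so X and Y are independent

# Example 1: g(x, y) = 24x(1 - x - y) on the triangle x + y <= 1.
g = 24 * x * (1 - x - y)
g_x = sp.integrate(g, (y, 0, 1 - x))  # marginal of X: 12x(1-x)^2
g_y = sp.integrate(g, (x, 0, 1 - y))  # marginal of Y: 4(1-y)^3
print(sp.simplify(g - g_x * g_y))     # nonzero, so not independent
```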
Note that we can easily extend the notion of independent random variables to random variables $X_1, X_2, \ldots, X_n$.
Definition 6. The random variables $X_1, X_2, \ldots, X_n$ are said to be independent and identically distributed (i.i.d.) if:
- $X_1, X_2, \ldots, X_n$ are independent;
- $X_1, X_2, \ldots, X_n$ all have the same distribution, that is, $X_i \sim F$ for all $i = 1, \ldots, n$.
Definition 6 extends the notion of i.i.d. given at the start of Section 5.4.2 for discrete random variables.
The random variables $X_1, X_2, \ldots, X_n$ are said to be a random sample if they are i.i.d.
Suppose $X_1, X_2, \ldots, X_n$ are a random sample from the Poisson distribution with mean $\lambda$. Find the joint p.m.f. of $X_1, X_2, \ldots, X_n$.

If $X_i \sim {\rm Po}(\lambda)$, then its p.m.f. is given by
$$P(X_i = x_i) = p_{X_i}(x_i) = \begin{cases} \frac{e^{-\lambda} \lambda^{x_i}}{x_i!} & \text{if } x_i = 0, 1, 2, \ldots, \\ 0 & \text{otherwise.} \end{cases}$$
Since $X_1, X_2, \ldots, X_n$ are independent, their joint p.m.f. is given by
$$p_{X_1, X_2, \ldots, X_n}(x_1, x_2, \ldots, x_n) = \prod_{i=1}^n p_{X_i}(x_i) = \begin{cases} \prod_{i=1}^n \frac{e^{-\lambda} \lambda^{x_i}}{x_i!} = \frac{e^{-n\lambda} \lambda^{\sum_{i=1}^n x_i}}{\prod_{i=1}^n x_i!} & \text{if } x_1, \ldots, x_n \in \{0, 1, 2, \ldots\}, \\ 0 & \text{otherwise.} \end{cases}$$
The joint p.m.f. of $X = (X_1, X_2, \ldots, X_n)$ tells us how likely we are to observe $x = (x_1, x_2, \ldots, x_n)$ given $\lambda$. This can be used either:
- To compute $P(X = x)$ when $\lambda$ is known;
- Or, more commonly in statistics, to assess what is a good estimate of $\lambda$ given $x$ in situations where $\lambda$ is unknown (see the sketch below).
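As a minimal numerical illustration of the second use, the sketch below evaluates the joint log-p.m.f. of an observed sample over a grid of candidate values of $\lambda$ and picks the maximizer. The data and grid are invented for illustration; SciPy's `poisson.logpmf` supplies the per-observation terms.

```python
import numpy as np
from scipy.stats import poisson

# Hypothetical observed sample (invented for illustration).
x = np.array([3, 1, 4, 2, 2, 5, 3, 0, 2, 3])

# Joint log-p.m.f. log p(x_1, ..., x_n; lam) = sum_i log p(x_i; lam),
# using the independence of the X_i.
def joint_log_pmf(lam):
    return poisson.logpmf(x, lam).sum()

# Evaluate over a grid of candidate values and take the best one.
grid = np.linspace(0.1, 10.0, 1000)
best = grid[np.argmax([joint_log_pmf(lam) for lam in grid])]
print(best)  # close to x.mean() = 2.5, the sample mean
```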
Student Exercise
Attempt the exercise below.
Question.
A theory of chemical reactions suggests that the variation in the quantities $X$ and $Y$ of two products $C_1$ and $C_2$ of a certain reaction is described by the joint probability density function
$$f_{X,Y}(x,y) = \frac{2}{(1+x+y)^3}, \qquad x \geq 0, \; y \geq 0.$$
On the basis of this theory, answer the following questions.
- What is the probability that at least one unit of each product is produced?
- Determine the probability that the quantity of $C_1$ produced is less than half that of $C_2$.
- Find the c.d.f. of the total quantity of $C_1$ and $C_2$.
Solution to Question.
- The required probability is
$$\begin{aligned} P(X \geq 1, Y \geq 1) &= \int_1^{\infty} \int_1^{\infty} \frac{2}{(1+x+y)^3} \, dy \, dx = \int_1^{\infty} \left[-\frac{1}{(1+x+y)^2}\right]_{y=1}^{\infty} dx \\ &= \int_1^{\infty} \frac{1}{(2+x)^2} \, dx = \left[-\frac{1}{2+x}\right]_{x=1}^{\infty} = \frac{1}{3}. \end{aligned}$$
- The required probability is
$$\begin{aligned} P\left(X \leq \tfrac{1}{2}Y\right) &= \int_0^{\infty} \int_0^{y/2} \frac{2}{(1+x+y)^3} \, dx \, dy = \int_0^{\infty} \left[-\frac{1}{(1+x+y)^2}\right]_{x=0}^{y/2} dy \\ &= \int_0^{\infty} \left(\frac{1}{(1+y)^2} - \frac{1}{(1+3y/2)^2}\right) dy = \left[-\frac{1}{1+y} + \frac{2/3}{1+3y/2}\right]_{y=0}^{\infty} \\ &= 1 - \frac{2}{3} = \frac{1}{3}. \end{aligned}$$
- Since both $X$ and $Y$ are non-negative random variables, $X + Y$ is non-negative, so $P(X + Y \leq z) = 0$ for $z < 0$. For $z \geq 0$,
$$\begin{aligned} P(X + Y \leq z) &= \int_0^z \int_0^{z-y} \frac{2}{(1+x+y)^3} \, dx \, dy = \int_0^z \left[-\frac{1}{(1+x+y)^2}\right]_{x=0}^{z-y} dy \\ &= \int_0^z \left(\frac{1}{(1+y)^2} - \frac{1}{(1+z)^2}\right) dy = \left[-\frac{1}{1+y} - \frac{y}{(1+z)^2}\right]_{y=0}^{z} \\ &= -\frac{1}{1+z} - \frac{z}{(1+z)^2} + 1 = \left(\frac{z}{1+z}\right)^2. \end{aligned}$$
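All three answers can be verified symbolically; a short check, assuming SymPy:

```python
import sympy as sp

x, y, z = sp.symbols('x y z', positive=True)
f = 2 / (1 + x + y)**3

# (a) P(X >= 1, Y >= 1) = 1/3
print(sp.integrate(f, (y, 1, sp.oo), (x, 1, sp.oo)))

# (b) P(X <= Y/2) = 1/3
print(sp.integrate(f, (x, 0, y / 2), (y, 0, sp.oo)))

# (c) P(X + Y <= z) = (z/(1+z))^2 for z >= 0
F = sp.integrate(f, (x, 0, z - y), (y, 0, z))
print(sp.simplify(F - (z / (1 + z))**2))  # 0
```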