Chapter 2 Solutions
Tomoki Okuno
Note
• Not all solutions are provided: exercises that are too simple or not very important to me are skipped.
• The text in red consists of reminders to myself; please ignore it.
2 Multivariate Distributions
2.1 Distributions of Two Random Variables
2.1.1. Let f(x1, x2) = 4x1x2, 0 < x1 < 1, 0 < x2 < 1, zero elsewhere, be the pdf of X1 and X2. Find
P(0 < X1 < 1/2, 1/4 < X2 < 1), P(X1 = X2), P(X1 < X2), and P(X1 ≤ X2).
Solution.
P(0 < X1 < 1/2, 1/4 < X2 < 1) = ∫_{1/4}^1 ∫_0^{1/2} 4x1x2 dx1 dx2 = · · · = 15/64.
P(X1 = X2) = 0, since the event lies on a line segment, which has zero area.
P(X1 < X2) = ∫_0^1 ∫_0^{x2} 4x1x2 dx1 dx2 = ∫_0^1 [2x1^2 x2]_{x1=0}^{x1=x2} dx2 = ∫_0^1 2x2^3 dx2 = 1/2.
P(X1 ≤ X2) = P(X1 < X2) + P(X1 = X2) = P(X1 < X2) = 1/2.
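These values are easy to confirm numerically. Below is a minimal Monte Carlo sketch (not part of the original solution; seed and sample size are arbitrary). Since 4x1x2 = (2x1)(2x2), X1 and X2 are independent, each with density 2x on (0, 1), so each can be sampled as √U with U ∼ U(0, 1).

```python
# Monte Carlo check of 2.1.1: each Xi has density 2x on (0,1),
# so Xi can be sampled as sqrt(U) with U ~ Uniform(0,1).
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
x1, x2 = np.sqrt(rng.random(n)), np.sqrt(rng.random(n))

print(np.mean((x1 < 0.5) & (0.25 < x2) & (x2 < 1)))  # ~15/64 = 0.234375
print(np.mean(x1 < x2))                              # ~1/2
```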
2.1.7. Let f(x, y) = e^{−x−y}, 0 < x < ∞, 0 < y < ∞, zero elsewhere, be the pdf of X and Y . Then if
Z = X + Y , compute P(Z ≤ 0), P(Z ≤ 6), and, more generally, P(Z ≤ z), for 0 < z < ∞. What is the pdf
of Z?
Solution.
Compute the general probability first: for 0 < z < ∞,
F(z) = P(Z ≤ z) = P(X + Y ≤ z) = P(Y ≤ −X + z)
     = ∫_0^z ∫_0^{z−x} e^{−x−y} dy dx = ∫_0^z (e^{−x} − e^{−z}) dx = 1 − e^{−z} − ze^{−z}.
Hence, P(Z ≤ 0) = 0, P(Z ≤ 6) = 1 − 7e^{−6}, and f(z) = F′(z) = ze^{−z}, 0 < z < ∞, zero elsewhere.
2.1.8. Let X and Y have the pdf f (x, y) = 1, 0 < x < 1, 0 < y < 1, zero elsewhere. Find the cdf and pdf of
the product Z = XY .
Solution.
If z ≤ 0, then F (z) = P (Z ≤ z) = 0 because Z > 0.
For 0 < z < 1,
F(z) = P(Z ≤ z) = P(Y ≤ z/X) = ∫_0^z ∫_0^1 dy dx + ∫_z^1 ∫_0^{z/x} dy dx = z − z log z,
and F(z) = 1 for z ≥ 1. Differentiating gives the pdf f(z) = F′(z) = − log z, 0 < z < 1, zero elsewhere.
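A quick empirical check of this cdf (a sketch; nothing is assumed here beyond the uniform model in the problem):

```python
# Compare the empirical cdf of Z = XY, for X, Y iid U(0,1),
# with the closed form F(z) = z - z*log(z) from 2.1.8.
import numpy as np

rng = np.random.default_rng(1)
n = 10**6
z = rng.random(n) * rng.random(n)

for z0 in (0.1, 0.5, 0.9):
    print(np.mean(z <= z0), z0 - z0 * np.log(z0))
```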
2.1.11. Let X1 and X2 have the joint pdf f(x1, x2) = 15x1^2 x2, 0 < x1 < x2 < 1, zero elsewhere. Find the
marginal pdfs and compute P(X1 + X2 ≤ 1).
Solution.
fX1(x1) = ∫_{x1}^1 15x1^2 x2 dx2 = 15x1^2(1 − x1^2)/2, 0 < x1 < 1,
fX2(x2) = ∫_0^{x2} 15x1^2 x2 dx1 = 5x2^4, 0 < x2 < 1,
P(X1 + X2 ≤ 1) = ∫_0^{1/2} ∫_{x1}^{1−x1} 15x1^2 x2 dx2 dx1 = · · · = 5/64.
2.1.13. Let X1, X2 be two random variables with the joint pmf p(x1, x2) = (x1 + x2)/12, for x1 = 1, 2, x2 =
1, 2, zero elsewhere. Compute E(X1), E(X1^2), E(X2), E(X2^2), and E(X1X2). Is E(X1X2) = E(X1)E(X2)?
Find E(2X1 − 6X2^2 + 7X1X2).
Solution.
First, find the marginal pmfs:
pX1(x1) = ∑_{x2=1}^2 (x1 + x2)/12 = (x1 + 1)/12 + (x1 + 2)/12 = (2x1 + 3)/12,  pX2(x2) = (2x2 + 3)/12.
Hence
E(X1) = ∑_{x1=1}^2 x1 pX1(x1) = pX1(1) + 2pX1(2) = 5/12 + 14/12 = 19/12,
E(X1^2) = pX1(1) + 2^2 pX1(2) = 33/12,
E(X2) = E(X1) = 19/12,  E(X2^2) = E(X1^2) = 33/12.
Also, use the joint pmf to obtain
E(X1X2) = ∑_{x1} ∑_{x2} x1x2 p(x1, x2) = p(1, 1) + 2p(2, 1) + 2p(1, 2) + 4p(2, 2) = 5/2 ≠ E(X1)E(X2) = 361/144.
Therefore,
E(2X1 − 6X2^2 + 7X1X2) = 2(19/12) − 6(33/12) + 7(5/2) = 25/6.
2.1.15. Let X1, X2 be two random variables with joint pmf p(x1, x2) = (1/2)^{x1+x2}, for 1 ≤ xi < ∞,
i = 1, 2, where x1 and x2 are integers, zero elsewhere. Determine the joint mgf of X1, X2. Show that
M(t1, t2) = M(t1, 0)M(0, t2).
Solution.
pX1(x1) = ∑_{x2=1}^∞ (1/2)^{x1+x2} = (1/2)^{x1+1}/(1 − 1/2) = (1/2)^{x1},  and similarly pX2(x2) = (1/2)^{x2}.
MX1(t) = ∑_{x1=1}^∞ (e^t/2)^{x1} = (e^t/2)/(1 − e^t/2) = e^t/(2 − e^t) = MX2(t), t < log 2.
M(t1, t2) = ∑_{x1=1}^∞ ∑_{x2=1}^∞ e^{t1x1+t2x2} (1/2)^{x1+x2} = [∑_{x1=1}^∞ (e^{t1}/2)^{x1}][∑_{x2=1}^∞ (e^{t2}/2)^{x2}]
          = [e^{t1}/(2 − e^{t1})][e^{t2}/(2 − e^{t2})] = MX1(t1)MX2(t2), ti < log 2, i = 1, 2.
Since M(t1, 0) = MX1(t1) and M(0, t2) = MX2(t2), it follows that M(t1, t2) = M(t1, 0)M(0, t2).
2.2.6. Suppose X1 and X2 have the joint pdf f (x1 , x2 ) = e−(x1 +x2 ) , 0 < xi < ∞, i = 1, 2, zero elsewhere.
(a) Use formula (2.2.5) to find the pdf of Y1 = X1 + X2 .
Solution.
Since the support of (Y1, Y2) is 0 < y1 − y2 < ∞, 0 < y2 < ∞ ⇒ 0 < y2 < y1 < ∞,
fY1(y1) = ∫_{−∞}^∞ fX1,X2(y1 − y2, y2) dy2 = ∫_0^{y1} e^{−y1} dy2 = y1 e^{−y1}, y1 > 0, zero elsewhere.
2.2.7. Use the formula (2.2.5) to find the pdf of Y1 = X1 + X2 , where X1 and X2 have the joint pdf
fX1 ,X2 (x1 , x2 ) = 2e−(x1 +x2 ) , 0 < x1 < x2 < ∞, zero elsewhere.
Solution.
Since the support of (Y1, Y2) is 0 < y1 − y2 < y2, 0 < y2 < ∞ ⇒ 0 < y1/2 < y2 < y1 < ∞,
fY1(y1) = ∫_{−∞}^∞ fX1,X2(y1 − y2, y2) dy2 = ∫_{y1/2}^{y1} 2e^{−y1} dy2 = y1 e^{−y1}, y1 > 0, zero elsewhere.
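Both 2.2.6 and 2.2.7 give fY1(y1) = y1 e^{−y1}, the Γ(2, 1) density. A short simulation sketch: for 2.2.7, (X1, X2) is distributed as the (min, max) of two iid Exp(1) variables, so Y1 = X1 + X2 is simply their sum, just as in 2.2.6.

```python
# Histogram check that X1 + X2 has density y*exp(-y) (Gamma(2,1)).
import numpy as np

rng = np.random.default_rng(2)
n = 10**6
y1 = rng.exponential(size=n) + rng.exponential(size=n)

hist, edges = np.histogram(y1, bins=50, range=(0, 8), density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - mid * np.exp(-mid))))  # max deviation is small
```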
Solution.
Here X1 and X2 have the joint pdf f(x1, x2) = e^{−(x1+x2)}, 0 < xi < ∞, i = 1, 2, and W = w1X1 + w2X2 with weights w1, w2 > 0. Let Z = w1X1 − w2X2. The map (x1, x2) ↦ (w, z) is one-to-one with inverse
x1 = (w + z)/(2w1),  x2 = (w − z)/(2w2).
Then the Jacobian is given by
J = det [ ∂x1/∂w  ∂x1/∂z ; ∂x2/∂w  ∂x2/∂z ] = det [ 1/(2w1)  1/(2w1) ; 1/(2w2)  −1/(2w2) ] = −1/(2w1w2).
Hence the joint pdf of W and Z is
fW,Z(w, z) = f((w + z)/(2w1), (w − z)/(2w2)) |J| = [1/(2w1w2)] e^{−(w+z)/(2w1)} e^{−(w−z)/(2w2)}
           = [1/(2w1w2)] e^{−(w1+w2)w/(2w1w2)} e^{(w1−w2)z/(2w1w2)}.
The support is
(w + z)/(2w1) > 0, (w − z)/(2w2) > 0 ⇒ w > 0, −w < z < w.
Hence, for w1 ≠ w2, the marginal pdf of W is
fW(w) = [1/(2w1w2)] e^{−(w1+w2)w/(2w1w2)} ∫_{−w}^w e^{(w1−w2)z/(2w1w2)} dz
      = [1/(w1 − w2)] e^{−(w1+w2)w/(2w1w2)} [e^{(w1−w2)z/(2w1w2)}]_{z=−w}^{z=w}
      = [1/(w1 − w2)] e^{−(w1+w2)w/(2w1w2)} (e^{(w1−w2)w/(2w1w2)} − e^{−(w1−w2)w/(2w1w2)})
      = [1/(w1 − w2)] (e^{−w/w1} − e^{−w/w2}), w > 0.
If w1 = w2, put h = w1 − w2 and let h → 0; by L'Hôpital's rule (differentiating numerator and denominator with respect to h, with w2 = w1 − h),
fW(w) = lim_{h→0} [e^{−w/w1} − e^{−w/(w1−h)}]/h
      = lim_{h→0} [0 + {w/(w1 − h)^2} e^{−w/(w1−h)}]/1
      = (w/w1^2) e^{−w/w1}, w > 0,
a Γ(2, w1)-type density.
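The w1 ≠ w2 formula is easy to check by simulation. A sketch under the setup above, with the particular weights w1 = 1, w2 = 2 chosen only for illustration:

```python
# Check f_W(w) = (exp(-w/w1) - exp(-w/w2)) / (w1 - w2)
# for W = w1*X1 + w2*X2 with X1, X2 iid Exp(1).
import numpy as np

rng = np.random.default_rng(3)
w1, w2, n = 1.0, 2.0, 10**6
w = w1 * rng.exponential(size=n) + w2 * rng.exponential(size=n)

hist, edges = np.histogram(w, bins=60, range=(0, 15), density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
f = (np.exp(-mid / w1) - np.exp(-mid / w2)) / (w1 - w2)
print(np.max(np.abs(hist - f)))  # max deviation is small
```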
Let X and Y have the joint pdf f(x, y) = 2/(1 + x + y)^3, 0 < x < ∞, 0 < y < ∞, zero elsewhere.
(a) Compute the marginal pdf of X and the conditional pdf of Y , given X = x.
Solution.
f(x) = ∫_0^∞ [2/(1 + x + y)^3] dy = [−1/(1 + x + y)^2]_{y=0}^{y=∞} = 1/(1 + x)^2, 0 < x < ∞,
f(y|x) = f(x, y)/f(x) = 2(1 + x)^2/(1 + x + y)^3, 0 < y < ∞, for each fixed 0 < x < ∞,
zero elsewhere.
(b) For a fixed X = x, compute E(1 + x + Y |x) and use the result to compute E(Y |x).
Solution.
E(1 + x + Y |x) = ∫_0^∞ (1 + x + y) · [2(1 + x)^2/(1 + x + y)^3] dy = ∫_0^∞ [2(1 + x)^2/(1 + x + y)^2] dy
              = [−2(1 + x)^2/(1 + x + y)]_{y=0}^{y=∞} = 2(1 + x).
Since E(1 + x + Y |x) = (1 + x) + E(Y |x), it follows that E(Y |x) = 2(1 + x) − (1 + x) = 1 + x.
2.3.8. Let X and Y have the joint pdf f (x, y) = 2 exp{−(x + y)}, 0 < x < y < ∞, zero elsewhere. Find the
conditional mean E(Y |x) of Y , given X = x.
Solution.
f(x) = ∫_x^∞ 2 exp{−(x + y)} dy = 2e^{−2x}, x > 0 ⇒ f2|1(y|x) = f(x, y)/f(x) = e^{x−y}, 0 < x < y < ∞.
Hence, substituting t = y − x,
E(Y |x) = ∫_x^∞ y e^{x−y} dy = ∫_0^∞ (x + t) e^{−t} dt = x + 1, x > 0.
2.3.10. Let X1 and X2 have the joint pmf p(x1, x2) described as follows:

(x1, x2)   | (0, 0) | (0, 1) | (1, 0) | (1, 1) | (2, 0) | (2, 1)
p(x1, x2)  |  1/18  |  3/18  |  4/18  |  3/18  |  6/18  |  1/18

and p(x1, x2) is equal to zero elsewhere. Find the two marginal probability mass functions and the two
conditional means.
Hint: Write the probabilities in a rectangular array.
Solution.
pX1(x1) = 4/18, 7/18, 7/18 for x1 = 0, 1, 2, respectively;  pX2(x2) = 11/18, 7/18 for x2 = 0, 1, respectively.
The conditional means follow from p(x1|x2) = p(x1, x2)/pX2(x2) and p(x2|x1) = p(x1, x2)/pX1(x1):
E(X1|X2 = 0) = (0 · 1 + 1 · 4 + 2 · 6)/11 = 16/11,  E(X1|X2 = 1) = (0 · 3 + 1 · 3 + 2 · 1)/7 = 5/7,
E(X2|X1 = 0) = 3/4,  E(X2|X1 = 1) = 3/7,  E(X2|X1 = 2) = 1/7.
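The bookkeeping is easy to verify with a few array operations (a sketch; the array p encodes the table from the problem):

```python
# Marginals and conditional means for the pmf in 2.3.10.
import numpy as np

p = np.array([[1, 3],
              [4, 3],
              [6, 1]]) / 18.0   # rows: x1 = 0,1,2; columns: x2 = 0,1
x1 = np.array([0, 1, 2])
x2 = np.array([0, 1])

p1 = p.sum(axis=1)   # marginal of X1: [4/18, 7/18, 7/18]
p2 = p.sum(axis=0)   # marginal of X2: [11/18, 7/18]
print(p1, p2)
print(x1 @ p / p2)   # E(X1|X2=0), E(X1|X2=1): [16/11, 5/7]
print(p @ x2 / p1)   # E(X2|X1=0), E(X2|X1=1), E(X2|X1=2): [3/4, 3/7, 1/7]
```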
2.3.11. Let us choose at random a point from the interval (0, 1) and let the random variable X1 be equal to
the number that corresponds to that point. Then choose a point at random from the interval (0, x1 ), where
x1 is the experimental value of X1 ; and let the random variable X2 be equal to the number that corresponds
to this point.
(a) Make assumptions about the marginal pdf f1 (x1 ) and the conditional pdf f2|1 (x2 |x1 ).
Solution.
Assume that X1 ∼ U(0, 1) and X2|X1 = x1 ∼ U(0, x1):
f1(x1) = I(0 < x1 < 1),  f2|1(x2|x1) = (1/x1) I(0 < x2 < x1).
Then the joint pdf is f(x1, x2) = 1/x1, 0 < x2 < x1 < 1, the marginal pdf of X2 is f2(x2) = ∫_{x2}^1 (1/x1) dx1 = − log x2, 0 < x2 < 1, and f1|2(x1|x2) = −1/(x1 log x2). Hence,
E(X1|X2 = x2) = −∫_{x2}^1 (1/ log x2) dx1 = (1 − x2)/ log(1/x2), 0 < x2 < 1.
2.3.12. Let f (x) and F (x) denote, respectively, the pdf and the cdf of the random variable X. The conditional
pdf of X, given X > x0 , x0 a fixed number, is defined by f (x|X > x0 ) = f (x)/[1 − F (x0 )], x0 < x, zero
elsewhere. This kind of conditional pdf finds application in a problem of time until death, given survival
until time x0 .
(a) Show that f (x|X > x0 ) is a pdf.
Solution.
Since f(x) ≥ 0 and F(x0) < 1, f(x|X > x0) = f(x)/[1 − F(x0)] ≥ 0. Also,
∫_{x0}^∞ f(x|X > x0) dx = [1/(1 − F(x0))] ∫_{x0}^∞ f(x) dx = [F(x)]_{x0}^∞/[1 − F(x0)] = 1 since F(∞) = 1.
(b) Let f (x) = e−x , 0 < x < ∞, and zero elsewhere. Compute P (X > 2|X > 1).
Solution.
Since F(x) = 1 − e^{−x}, x > 0, we have f(x|X > 1) = f(x)/[1 − F(1)] = e^{−x+1}, x > 1. Hence,
P(X > 2|X > 1) = ∫_2^∞ f(x|X > 1) dx = ∫_2^∞ e^{−x+1} dx = [−e^{−x+1}]_2^∞ = e^{−1}.
2.4 Independent Random Variables
2.4.1. Show that the random variables X1 and X2 with joint pdf
f(x1, x2) = 12x1x2(1 − x2) for 0 < x1 < 1, 0 < x2 < 1, and 0 elsewhere
are independent.
Solution.
The support is rectangular (a product space), and f(x1, x2) can be written as the product of a nonnegative
function of x1 and a nonnegative function of x2: f(x1, x2) ≡ g(x1)h(x2), where g(x1) = 12x1 I(0 < x1 < 1)
and h(x2) = x2(1 − x2) I(0 < x2 < 1). Thus, X1 and X2 are independent.
Alternatively, f(x1, x2) = f1(x1)f2(x2), where f1(x1) = 2x1 and f2(x2) = 6x2(1 − x2) are the marginal pdfs
of X1 and X2.
2.4.2. If the random variables X1 and X2 have the joint pdf f (x1 , x2 ) = 2e−x1 −x2 , 0 < x1 < x2 , 0 < x2 < ∞,
zero elsewhere, show that X1 and X2 are dependent.
Solution.
Although the joint pdf can be written as a product of a nonnegative function of x1 and a nonnegative function
of x2, the support 0 < x1 < x2 < ∞ is not a product space, so X1 and X2 are dependent.
2.4.3. Let p(x1, x2) = 1/16, x1 = 1, 2, 3, 4, and x2 = 1, 2, 3, 4, zero elsewhere, be the joint pmf of X1 and X2.
Show that X1 and X2 are independent.
Solution.
The marginal pmfs of X1 and X2 are p1(x1) = p2(x2) = 1/4. So p(x1, x2) = p1(x1)p2(x2) for all (x1, x2) in the
(rectangular) support, and hence X1 and X2 are independent.
2.4.4. Find P(0 < X1 < 1/3, 0 < X2 < 1/3) if the random variables X1 and X2 have the joint pdf f(x1, x2) =
4x1(1 − x2), 0 < x1 < 1, 0 < x2 < 1, zero elsewhere.
Solution.
Since f1(x1) = 2x1, 0 < x1 < 1, f2(x2) = 2(1 − x2), 0 < x2 < 1, and X1 and X2 are independent,
P(0 < X1 < 1/3, 0 < X2 < 1/3) = P(0 < X1 < 1/3) P(0 < X2 < 1/3)
= (∫_0^{1/3} 2x1 dx1)(∫_0^{1/3} 2(1 − x2) dx2) = (1/9)(5/9) = 5/81.
2.4.5. Find the probability of the union of the events a < X1 < b, −∞ < X2 < ∞, and −∞ < X1 < ∞,
c < X2 < d if X1 and X2 are two independent variables with P(a < X1 < b) = 2/3 and P(c < X2 < d) = 5/8.
Solution.
P({a < X1 < b, −∞ < X2 < ∞} ∪ {−∞ < X1 < ∞, c < X2 < d})
= P({a < X1 < b} ∪ {c < X2 < d})
= P(a < X1 < b) + P(c < X2 < d) − P({a < X1 < b} ∩ {c < X2 < d})
= P(a < X1 < b) + P(c < X2 < d) − P(a < X1 < b)P(c < X2 < d)
= 2/3 + 5/8 − (2/3)(5/8) = 7/8.
2.4.8. Let X and Y have the joint pdf f (x, y) = 3x, 0 < y < x < 1, zero elsewhere. Are X and Y
independent? If not, find E(X|y).
Solution.
X and Y are not independent because the support 0 < y < x < 1 is not rectangular (not a product space).
First find f(y): f(y) = ∫_y^1 3x dx = 3(1 − y^2)/2, 0 < y < 1, zero elsewhere. Hence
E(X|y) = ∫_{−∞}^∞ x [f(x, y)/f(y)] dx = ∫_y^1 [2x^2/(1 − y^2)] dx = 2(1 − y^3)/[3(1 − y^2)] = 2(1 + y + y^2)/[3(1 + y)], 0 < y < 1.
2.4.10. Let X and Y be random variables with the space consisting of the four points (0, 0), (1, 1), (1, 0),
(1, −1). Assign positive probabilities to these four points so that the correlation coefficient is equal to zero.
Are X and Y independent?
Solution.
Assign probability 1/4 to each of the four points, as shown below:

x \ y  |  −1  |  0  |  1  | pX(x)
0      |  0   | 1/4 |  0  |  1/4
1      | 1/4  | 1/4 | 1/4 |  3/4
pY(y)  | 1/4  | 1/2 | 1/4 |

Then the correlation coefficient is ρ = 0 because E(XY) = (1)(−1)(1/4) + (1)(1)(1/4) = 0 and E(X)E(Y) = (3/4)(0) = 0, so Cov(X, Y) = 0.
However, P(X = 1, Y = 1) = 1/4 ≠ 3/16 = pX(1)pY(1), meaning that X and Y are not independent.
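A small numerical check of both claims (a sketch using the table above):

```python
# Zero correlation but dependence for the four-point pmf in 2.4.10.
import numpy as np

pts = np.array([(0, 0), (1, 1), (1, 0), (1, -1)])
p = np.full(4, 0.25)
x, y = pts[:, 0], pts[:, 1]

cov = p @ (x * y) - (p @ x) * (p @ y)
print(cov)  # 0.0, hence rho = 0
print(p[(x == 1) & (y == 1)].sum(), (p @ (x == 1)) * (p @ (y == 1)))  # 1/4 vs 3/16
```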
2.4.11. Two line segments, each of length two units, are placed along the x-axis. The midpoint of the first
is between x = 0 and x = 14 and that of the second is between x = 6 and x = 20. Assuming independence
and uniform distributions for these midpoints, find the probability that the line segments overlap.
Solution.
Each segment extends one unit on either side of its midpoint, so the segments overlap if and only if
|X1 − X2| < 2. Since X1 ∼ U(0, 14) and X2 ∼ U(6, 20) are independent, the joint pdf of X1 and X2 is
f(x1, x2) = 1/14^2 on (0, 14) × (6, 20). The band |x1 − x2| < 2 meets this rectangle in a region of area
∫_6^{12} 4 dx2 + ∫_{12}^{16} (16 − x2) dx2 = 24 + 8 = 32,
so the desired probability is P(|X1 − X2| < 2) = 32/196 = 8/49.
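A Monte Carlo check of the answer 8/49 ≈ 0.1633 (a sketch):

```python
# Probability that two length-2 segments with uniform midpoints overlap.
import numpy as np

rng = np.random.default_rng(4)
n = 10**6
x1 = rng.uniform(0, 14, n)
x2 = rng.uniform(6, 20, n)
print(np.mean(np.abs(x1 - x2) < 2), 8 / 49)
```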
2.4.12. Cast a fair die and let X = 0 if 1, 2, or 3 spots appear, let X = 1 if 4 or 5 spots appear, and let
X = 2 if 6 spots appear. Do this two independent times, obtaining X1 and X2 . Calculate P (|X1 − X2 | = 1).
Solution.
Note P(X = 0) = 1/2, P(X = 1) = 1/3, P(X = 2) = 1/6. Then |X1 − X2| = 1 when (X1, X2) = (0, 1), (1, 0),
(1, 2), (2, 1), with probabilities 1/6, 1/6, 1/18, and 1/18, respectively. Hence the desired probability is
2(1/6 + 1/18) = 4/9.
2.4.13. For X1 and X2 in Example 2.4.6, show that the mgf of Y = X1 + X2 is e^{2t}/(2 − e^t)^2, t < log 2, and
then compute the mean and variance of Y .
Solution.
Setting t = t1 = t2 in the joint mgf,
MY(t) = MX1,X2(t, t) = [e^t/(2 − e^t)]^2 = e^{2t}/(2 − e^t)^2, t < log 2.
Let ψ(t) = log MY(t) = 2t − 2 log(2 − e^t). Then
E(Y) = ψ′(0) = [2 + 2e^t/(2 − e^t)]_{t=0} = 4,
Var(Y) = ψ″(0) = [4e^t/(2 − e^t)^2]_{t=0} = 4.
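A simulation check of E(Y) = Var(Y) = 4 (a sketch): the marginal mgf e^t/(2 − e^t) used above is that of the pmf (1/2)^x, x = 1, 2, . . ., i.e., a geometric distribution with p = 1/2 in numpy's parameterization.

```python
# Mean and variance of Y = X1 + X2 for X1, X2 iid geometric(1/2) on {1,2,...}.
import numpy as np

rng = np.random.default_rng(5)
n = 10**6
y = rng.geometric(0.5, n) + rng.geometric(0.5, n)
print(y.mean(), y.var())  # both ~4
```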
2.5.3. Let X and Y have the joint pdf f(x, y) = 2, 0 < x < y < 1, zero elsewhere, so that f(y|x) = 1/(1 − x),
x < y < 1, and f(x|y) = 1/y, 0 < x < y. Hence,
E(Y |X = x) = ∫_x^1 [y/(1 − x)] dy = (1 + x)/2, 0 < x < 1,
E(X|Y = y) = ∫_0^y (x/y) dx = y/2, 0 < y < 1.
2.5.4. Show that the variance of the conditional distribution of Y , given X = x, in Exercise 2.5.3, is
(1 − x)^2/12, 0 < x < 1, and that the variance of the conditional distribution of X, given Y = y, is y^2/12,
0 < y < 1.
Solution.
E(Y^2|X = x) = ∫_x^1 [y^2/(1 − x)] dy = (1 + x + x^2)/3, 0 < x < 1,
E(X^2|Y = y) = ∫_0^y (x^2/y) dx = y^2/3, 0 < y < 1.
Hence,
Var(Y |X = x) = E(Y^2|X = x) − [E(Y |X = x)]^2 = (1 + x + x^2)/3 − (1 + x)^2/4 = (1 − x)^2/12, 0 < x < 1,
Var(X|Y = y) = E(X^2|Y = y) − [E(X|Y = y)]^2 = y^2/3 − y^2/4 = y^2/12, 0 < y < 1.
2.5.5. Verify the results of equations (2.5.11) of this section.
Solution. See Exercise 2.5.8; working with ψ(t1, t2) makes these computations easier.
2.5.6. Let X and Y have the joint pdf f (x, y) = 1, −x < y < x, 0 < x < 1, zero elsewhere. Show that, on
the set of positive probability density, the graph of E(Y |x) is a straight line, whereas that of E(X|y) is not
a straight line.
Solution.
Find the marginal pdfs of X and Y first:
f(x) = ∫_{−x}^x dy = 2x, 0 < x < 1,
f(y) = ∫_y^1 dx = 1 − y for 0 < y < 1,  f(y) = ∫_{−y}^1 dx = 1 + y for −1 < y ≤ 0.
Hence,
E(Y |x) = ∫_{−x}^x y [f(x, y)/f(x)] dy = ∫_{−x}^x [y/(2x)] dy = 0, 0 < x < 1,
E(X|y) = ∫ x [f(x, y)/f(y)] dx = ∫_y^1 [x/(1 − y)] dx = (1 + y)/2 for 0 < y < 1,
E(X|y) = ∫_{−y}^1 [x/(1 + y)] dx = (1 − y)/2 for −1 < y ≤ 0,
which means that the graph of E(Y |x) is a straight line, whereas that of E(X|y) is not a straight line.
2.5.8. Let ψ(t1, t2) = log M(t1, t2), where M(t1, t2) is the mgf of X and Y . Show that
∂ψ(0, 0)/∂ti,  ∂^2ψ(0, 0)/∂ti^2,  i = 1, 2,
and
∂^2ψ(0, 0)/∂t1∂t2
yield the means, the variances, and the covariance of the two random variables. Use this result to find the
means, the variances, and the covariance of X and Y of Example 2.5.6.
Solution.
Note that M(0, 0) = E(1) = 1. When i = 1,
∂ψ(0, 0)/∂t1 = [∂M(0, 0)/∂t1]/M(0, 0) = ∫_{−∞}^∞ ∫_{−∞}^∞ x f(x, y) dy dx = ∫_{−∞}^∞ x f1(x) dx = E(X),
∂^2ψ(0, 0)/∂t1^2 = {M(0, 0) ∂^2M(0, 0)/∂t1^2 − [∂M(0, 0)/∂t1]^2}/M(0, 0)^2 = E(X^2) − [E(X)]^2 = Var(X).
The case i = 2 is the same. And
∂^2ψ(0, 0)/∂t1∂t2 = ∂/∂t2 {[∂M(0, 0)/∂t1]/M(0, 0)}
= {[∂^2M(0, 0)/∂t1∂t2]M(0, 0) − [∂M(0, 0)/∂t1][∂M(0, 0)/∂t2]}/M(0, 0)^2
= E(XY) − E(X)E(Y) = Cov(X, Y).
Hence, for Example 2.5.6,
ψ(t1, t2) = log M(t1, t2) = − log(1 − t1 − t2) − log(1 − t2),
∂ψ/∂t1 = 1/(1 − t1 − t2),  ∂ψ/∂t2 = 1/(1 − t1 − t2) + 1/(1 − t2),
∂^2ψ/∂t1^2 = 1/(1 − t1 − t2)^2,  ∂^2ψ/∂t2^2 = 1/(1 − t1 − t2)^2 + 1/(1 − t2)^2,
∂^2ψ/∂t1∂t2 = 1/(1 − t1 − t2)^2.
Therefore,
µ1 = E(X) = ∂ψ(0, 0)/∂t1 = 1,  µ2 = E(Y) = ∂ψ(0, 0)/∂t2 = 1 + 1 = 2,
σ1^2 = Var(X) = ∂^2ψ(0, 0)/∂t1^2 = 1,  σ2^2 = Var(Y) = ∂^2ψ(0, 0)/∂t2^2 = 1 + 1 = 2,
E[(X − µ1)(Y − µ2)] = Cov(X, Y) = ∂^2ψ(0, 0)/∂t1∂t2 = 1.
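The derivatives for Example 2.5.6 can be reproduced symbolically (a sketch using sympy):

```python
# Moments of Example 2.5.6 from psi(t1,t2) = log M(t1,t2).
import sympy as sp

t1, t2 = sp.symbols('t1 t2')
psi = -sp.log(1 - t1 - t2) - sp.log(1 - t2)

at0 = {t1: 0, t2: 0}
print(sp.diff(psi, t1).subs(at0), sp.diff(psi, t2).subs(at0))        # means: 1, 2
print(sp.diff(psi, t1, 2).subs(at0), sp.diff(psi, t2, 2).subs(at0))  # variances: 1, 2
print(sp.diff(psi, t1, t2).subs(at0))                                # covariance: 1
```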
2.5.9. Let X and Y have the joint pmf p(x, y) = 1/7 at the points (0, 0), (1, 0), (0, 1), (1, 1), (2, 1), (1, 2),
(2, 2), zero elsewhere. Find the correlation coefficient ρ.
Solution.
E(X) = E(Y) = (0 + 1 + 0 + 1 + 2 + 1 + 2)/7 = 1,  E(X^2) = E(Y^2) = (0 + 1 + 0 + 1 + 4 + 1 + 4)/7 = 11/7
⇒ σX^2 = σY^2 = 11/7 − 1^2 = 4/7,  E(XY) = (1 + 2 + 2 + 4)/7 = 9/7.
Hence,
ρ = [E(XY) − E(X)E(Y)]/(σX σY) = (2/7)/(4/7) = 1/2.
2.5.11. Let σ1^2 = σ2^2 = σ^2 be the common variance of X1 and X2 and let ρ be the correlation coefficient of
X1 and X2. Show for k > 0 that
P[|(X1 − µ1) + (X2 − µ2)| ≥ kσ] ≤ 2(1 + ρ)/k^2.
Solution.
Apply Markov's inequality to the nonnegative random variable [(X1 − µ1) + (X2 − µ2)]^2:
P[|(X1 − µ1) + (X2 − µ2)| ≥ kσ] = P{[(X1 − µ1) + (X2 − µ2)]^2 ≥ k^2σ^2}
≤ E{[(X1 − µ1) + (X2 − µ2)]^2}/(k^2σ^2)
= [E(X1 − µ1)^2 + E(X2 − µ2)^2 + 2E(X1 − µ1)(X2 − µ2)]/(k^2σ^2)
= (σ^2 + σ^2 + 2ρσ^2)/(k^2σ^2) = 2(1 + ρ)/k^2,
since E(X1 − µ1)(X2 − µ2) = ρσ^2.
2.6.1. Let X, Y , and Z have the joint pdf f(x, y, z) = 2(x + y + z)/3, 0 < x < 1, 0 < y < 1, 0 < z < 1, zero elsewhere.
(b) Compute P(0 < X < 1/2, 0 < Y < 1/2, 0 < Z < 1/2) and P(0 < X < 1/2) = P(0 < Y < 1/2) = P(0 < Z < 1/2).
Solution. Skipped. We can solve part (c) without computing them.
(c) Are X, Y , and Z independent?
Solution. No; f(x, y, z) ≠ f(x)f(y)f(z), although the support is a product space.
(d) Compute E(X^2YZ + 3XY^4Z^2).
Solution. Skipped.
(e) Determine the cdf of X, Y , and Z.
Solution.
Since f(x) = ∫_0^1 ∫_0^1 [2(x + y + z)/3] dy dz = 2(x + 1)/3, 0 < x < 1,
FX(x) = 0 for x ≤ 0,  FX(x) = ∫_0^x [2(t + 1)/3] dt = [(x + 1)^2 − 1]/3 = (x^2 + 2x)/3 for 0 < x < 1,  FX(x) = 1 for x ≥ 1.
Similarly,
FY(y) = 0 for y ≤ 0,  (y^2 + 2y)/3 for 0 < y < 1,  1 for y ≥ 1;
FZ(z) = 0 for z ≤ 0,  (z^2 + 2z)/3 for 0 < z < 1,  1 for z ≥ 1.
(f) Find the conditional distribution of X and Y , given Z = z, and evaluate E(X + Y |z).
Solution.
Since f(z) = 2(z + 1)/3,
f(x, y|z) = f(x, y, z)/f(z) = (x + y + z)/(z + 1), 0 < x < 1, 0 < y < 1.
Hence,
E(X + Y |z) = ∫_0^1 ∫_0^1 (x + y)(x + y + z)/(z + 1) dy dx
= [1/(z + 1)] ∫_0^1 ∫_0^1 [(x + y)^2 + z(x + y)] dy dx
= [1/(z + 1)] ∫_0^1 [(x + y)^3/3 + z(x + y)^2/2]_{y=0}^{y=1} dx
= [1/(z + 1)] ∫_0^1 [(x + 1)^3/3 + z(x + 1)^2/2 − x^3/3 − zx^2/2] dx
= [1/(z + 1)] [(x + 1)^4/12 + z(x + 1)^3/6 − x^4/12 − zx^3/6]_0^1
= (z + 7/6)/(z + 1) = (6z + 7)/[6(z + 1)], 0 < z < 1.
(g) Determine the conditional distribution of X, given Y = y and Z = z, and compute E(X|y, z).
Solution.
f(y, z) = ∫_0^1 [2(x + y + z)/3] dx = (2y + 2z + 1)/3,
f(x|y, z) = f(x, y, z)/f(y, z) = 2(x + y + z)/(2y + 2z + 1).
Hence,
E(X|y, z) = ∫_0^1 x · [2(x + y + z)/(2y + 2z + 1)] dx = [2x^3/3 + x^2(y + z)]_0^1/(2y + 2z + 1)
          = (3y + 3z + 2)/[3(2y + 2z + 1)], 0 < y, z < 1.
2.6.2. Let f (x1 , x2 , x3 ) = exp[−(x1 + x2 + x3 )], 0 < x1 < ∞, 0 < x2 < ∞, 0 < x3 < ∞, zero elsewhere, be
the joint pdf of X1 , X2 , X3 .
(a) Compute P (X1 < X2 < X3 ) and P (X1 = X2 < X3 ).
Solution.
P(X1 < X2 < X3) = ∫_0^∞ ∫_0^{x3} ∫_0^{x2} e^{−x1−x2−x3} dx1 dx2 dx3
= ∫_0^∞ ∫_0^{x3} (1 − e^{−x2}) e^{−x2−x3} dx2 dx3 = ∫_0^∞ (1/2 − e^{−x3} + e^{−2x3}/2) e^{−x3} dx3 = 1/2 − 1/2 + 1/6 = 1/6.
(Alternatively, by symmetry each of the 3! orderings of X1, X2, X3 is equally likely, giving 1/6.)
P(X1 = X2 < X3) = 0, since the event lies in the plane x1 = x2, which has probability zero for continuous random variables.
(b) Determine the joint mgf of X1 , X2 , and X3 . Are these random variables independent?
Solution.
M(t1, t2, t3) = ∫_0^∞ ∫_0^∞ ∫_0^∞ e^{t1x1+t2x2+t3x3} e^{−(x1+x2+x3)} dx1 dx2 dx3
= [∫_0^∞ e^{−(1−t1)x1} dx1][∫_0^∞ e^{−(1−t2)x2} dx2][∫_0^∞ e^{−(1−t3)x3} dx3]
= 1/[(1 − t1)(1 − t2)(1 − t3)], ti < 1, i = 1, 2, 3,
= MX1(t1)MX2(t2)MX3(t3),
which shows that these three random variables are independent.
2.6.7. Prove Corollary 2.6.1: Suppose X1, X2, . . . , Xn are iid random variables with the common mgf M(t),
for −h < t < h, where h > 0. Let T = ∑_{i=1}^n Xi. Then T has the mgf MT(t) = [M(t)]^n, −h < t < h.
Solution.
MT(t) = E[e^{t ∑_{i=1}^n Xi}] = ∏_{i=1}^n E(e^{tXi})  (the Xi's are independent)
      = [E(e^{tX})]^n  (the Xi's are identically distributed)
      = [M(t)]^n.
2.6.9. Let X1 , X2 , X3 be iid with common pdf f (x) = exp(−x), 0 < x < ∞, zero elsewhere. Evaluate:
(a) P (X1 < X2 |X1 < 2X2 ).
Solution.
Since {X1 < X2} ⊆ {X1 < 2X2},
P(X1 < X2 | X1 < 2X2) = P(X1 < X2, X1 < 2X2)/P(X1 < 2X2) = P(X1 < X2)/P(X1 < 2X2).
For the numerator,
P(X1 < X2) = ∫_0^∞ ∫_{x1}^∞ e^{−x1−x2} dx2 dx1 = ∫_0^∞ e^{−2x1} dx1 = 1/2.
For the denominator,
P(X1 < 2X2) = ∫_0^∞ ∫_{x1/2}^∞ e^{−x1−x2} dx2 dx1 = ∫_0^∞ e^{−3x1/2} dx1 = 2/3.
Hence P(X1 < X2 | X1 < 2X2) = (1/2)/(2/3) = 3/4.
(b) P(X1 < X2 < X3 | X3 < 1).
Solution.
By symmetry, P(X1 < X2 < X3 < 1) = P(X1 < 1, X2 < 1, X3 < 1)/3! = (1 − e^{−1})^3/6. Hence
P(X1 < X2 < X3 | X3 < 1) = P(X1 < X2 < X3 < 1)/P(X3 < 1) = (1 − 3e^{−1} + 3e^{−2} − e^{−3})/[6(1 − e^{−1})] ≈ 0.0666.
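A Monte Carlo check of both parts (a sketch):

```python
# 2.6.9: part (a) should be (1/2)/(2/3) = 3/4; part (b) about 0.0666.
import numpy as np

rng = np.random.default_rng(6)
x1, x2, x3 = rng.exponential(size=(3, 10**6))

given_a = x1 < 2 * x2
print(np.mean(x1[given_a] < x2[given_a]))                           # ~0.75

given_b = x3 < 1
print(np.mean((x1 < x2) & (x2 < x3) & given_b) / np.mean(given_b))  # ~0.0666
```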
Hence, Var(Y ) = 25 ⇒ k = 7.
2.8.6. Determine the mean and variance of the sample mean X̄ = 5^{−1} ∑_{i=1}^5 Xi, where X1, . . . , X5 is a random
sample from a distribution having pdf f(x) = 4x^3, 0 < x < 1, zero elsewhere.
Solution.
E(X) = ∫_0^1 x(4x^3) dx = 4/5,  E(X^2) = ∫_0^1 x^2(4x^3) dx = 2/3 ⇒ Var(X) = 2/3 − (4/5)^2 = 2/75.
Hence,
E(X̄) = E(X) = 4/5 = 0.8,  Var(X̄) = Var(X)/5 = 2/375 ≈ 0.00533.
2.8.7. Let X and Y be random variables with µ1 = 1, µ2 = 4, σ1^2 = 4, σ2^2 = 6, ρ = 1/2. Find the mean and
variance of the random variable Z = 3X − 2Y .
Solution.
E(Z) = 3µ1 − 2µ2 = 3 − 8 = −5,
Var(Z) = 9σ1^2 + 4σ2^2 − 12ρσ1σ2 = 9(4) + 4(6) − 12(1/2)(2)(√6) = 60 − 12√6 ≈ 30.6.
2.8.8. Let X and Y be independent random variables with means µ1 , µ2 and variances σ12 , σ22 . Determine
the correlation coefficient of X and Z = X − Y in terms of µ1 , µ2 , σ12 , σ22 .
Solution.
Since X and Y are independent, Cov(X, Z) = Cov(X, X − Y) = Var(X) = σ1^2 and Var(Z) = σ1^2 + σ2^2. Hence
ρ = Cov(X, Z)/√[Var(X)Var(Z)] = σ1^2/√[σ1^2(σ1^2 + σ2^2)] = σ1/√(σ1^2 + σ2^2).
2.8.10. Determine the correlation coefficient of the random variables X and Y if var(X) = 4, var(Y ) = 2,
and var(X + 2Y ) = 15.
Solution.
15 = Var(X + 2Y) = Var(X) + 4Var(Y) + 4Cov(X, Y) = 4 + 4(2) + 4ρ√4√2 = 12 + 8√2 ρ.
Hence, ρ = 3/(8√2) ≈ 0.265.
2.8.11. Let X and Y be random variables with means µ1 , µ2 ; variances σ12 , σ22 ; and correlation coefficient
ρ. Show that the correlation coefficient of W = aX + b, a > 0, and Z = cY + d, c > 0, is ρ.
Solution.
Since a > 0 and c > 0, Cov(W, Z) = Cov(aX + b, cY + d) = ac Cov(X, Y) = acρσ1σ2, σW = aσ1, and σZ = cσ2.
Hence
ρW,Z = acρσ1σ2/[(aσ1)(cσ2)] = ρ.
2.8.13. Let X1 and X2 be independent random variables with nonzero variances. Find the correlation
coefficient of Y = X1 X2 and X1 in terms of the means and variances of X1 and X2 .
Solution.
Let µ1, µ2 and σ1^2, σ2^2 denote the means and variances of X1 and X2, respectively. Since the two random
variables are independent,
Var(Y) = Var(X1X2) = E(X1^2X2^2) − [E(X1X2)]^2 = E(X1^2)E(X2^2) − [E(X1)]^2[E(X2)]^2
       = (µ1^2 + σ1^2)(µ2^2 + σ2^2) − µ1^2µ2^2 = µ1^2σ2^2 + σ1^2µ2^2 + σ1^2σ2^2,
Cov(Y, X1) = E(X1^2X2) − E(X1X2)E(X1) = E(X1^2)E(X2) − [E(X1)]^2E(X2) = (µ1^2 + σ1^2)µ2 − µ1^2µ2 = σ1^2µ2.
Hence,
ρ = Cov(Y, X1)/√[Var(Y)Var(X1)] = σ1^2µ2/[σ1√(µ1^2σ2^2 + σ1^2µ2^2 + σ1^2σ2^2)] = σ1µ2/√(µ1^2σ2^2 + σ1^2µ2^2 + σ1^2σ2^2).
2.8.15. Let X1 , X2 , and X3 be random variables with equal variances but with correlation coefficients
ρ12 = 0.3, ρ13 = 0.5, and ρ23 = 0.2. Find the correlation coefficient of the linear functions Y = X1 + X2 and
Z = X2 + X3 .
Solution.
Let σ^2 denote the common variance of X1, X2, and X3. Then
Cov(Y, Z) = Cov(X1, X2) + Cov(X1, X3) + Var(X2) + Cov(X2, X3) = (0.3 + 0.5 + 1 + 0.2)σ^2 = 2σ^2,
Var(Y) = 2σ^2 + 2ρ12σ^2 = 2.6σ^2,  Var(Z) = 2σ^2 + 2ρ23σ^2 = 2.4σ^2,
so
ρ = Cov(Y, Z)/√[Var(Y)Var(Z)] = 2σ^2/√(2.6 · 2.4 σ^4) = 2/√6.24 ≈ 0.801.
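A simulation sketch (assuming, for concreteness, that (X1, X2, X3) is multivariate normal with unit variances and the stated correlations; the answer depends only on second moments):

```python
# Correlation of Y = X1 + X2 and Z = X2 + X3 under the stated correlations.
import numpy as np

rng = np.random.default_rng(8)
cov = [[1.0, 0.3, 0.5],
       [0.3, 1.0, 0.2],
       [0.5, 0.2, 1.0]]
x = rng.multivariate_normal(np.zeros(3), cov, size=10**6)

y, z = x[:, 0] + x[:, 1], x[:, 1] + x[:, 2]
print(np.corrcoef(y, z)[0, 1], 2 / np.sqrt(2.6 * 2.4))  # both ~0.801
```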
2.8.17. Let X and Y have the parameters µ1 , µ2 , σ12 , σ22 , and ρ. Show that the correlation coefficient of X
and [Y − ρ(σ2 /σ1 )X] is zero.
Solution.
Cov(X, Y − ρ(σ2/σ1)X) = Cov(X, Y) − ρ(σ2/σ1)Var(X) = ρσ1σ2 − ρ(σ2/σ1)σ1^2 = 0,
so the correlation coefficient, which is this covariance divided by the product of the standard deviations, is zero.