
Exercises in Introduction to Mathematical Statistics (Ch. 2)

Tomoki Okuno

September 14, 2022

Note
• Not all solutions are provided: exercises that are too simple or of little interest to me are skipped.
• Text in red consists of notes to myself; please ignore it.

2 Multivariate Distributions
2.1 Distributions of Two Random Variables
2.1.1. Let f(x1, x2) = 4x1x2, 0 < x1 < 1, 0 < x2 < 1, zero elsewhere, be the pdf of X1 and X2. Find
P(0 < X1 < 1/2, 1/4 < X2 < 1), P(X1 = X2), P(X1 < X2), and P(X1 ≤ X2).
Solution.

P(0 < X1 < 1/2, 1/4 < X2 < 1) = ∫_{1/4}^1 ∫_0^{1/2} 4x1x2 dx1 dx2 = ··· = 15/64.

P(X1 = X2) = 0 since the event {X1 = X2} is a line segment, which has zero area.

P(X1 < X2) = ∫_0^1 ∫_0^{x2} 4x1x2 dx1 dx2 = ∫_0^1 [2x1²x2]_{x1=0}^{x1=x2} dx2 = ∫_0^1 2x2³ dx2 = 1/2.

P(X1 ≤ X2) = P(X1 < X2) + P(X1 = X2) = P(X1 < X2) = 1/2.
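As a quick numerical sanity check (not part of the original solution), the first and third probabilities above can be approximated with a midpoint Riemann sum:

```python
# Midpoint Riemann sum over a rectangle; a rough numerical check,
# not part of the textbook solution.
def riemann_2d(f, ax, bx, ay, by, n=400):
    hx, hy = (bx - ax) / n, (by - ay) / n
    return sum(
        f(ax + (i + 0.5) * hx, ay + (j + 0.5) * hy)
        for i in range(n) for j in range(n)
    ) * hx * hy

pdf = lambda x1, x2: 4 * x1 * x2  # joint pdf of Exercise 2.1.1

p_rect = riemann_2d(pdf, 0, 0.5, 0.25, 1)                          # near 15/64
p_less = riemann_2d(lambda a, b: pdf(a, b) * (a < b), 0, 1, 0, 1)  # near 1/2
```

Both sums agree with the closed-form answers 15/64 ≈ 0.2344 and 1/2 to well within the discretization error.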

2.1.2. Let A1 = {(x, y) : x ≤ 2, y ≤ 4}, A2 = {(x, y) : x ≤ 2, y ≤ 1}, A3 = {(x, y) : x ≤ 0, y ≤ 4}, and


A4 = {(x, y) : x ≤ 0, y ≤ 1} be subsets of the space A of two random variables X and Y , which is the
entire two-dimensional plane. If P(A1) = 7/8, P(A2) = 4/8, P(A3) = 3/8, and P(A4) = 2/8, find P(A5), where
A5 = {(x, y) : 0 < x ≤ 2, 1 < y ≤ 4}.
Solution. P(A5) = P(A1) − P(A2) − P(A3) + P(A4) = 2/8.
2.1.3. Let F (x, y) be the distribution function of X and Y . For all real constants a < b, c < d, show that

P (a < X ≤ b, c < Y ≤ d) = F (b, d) − F (b, c) − F (a, d) + F (a, c).

Solution.

P (a < X ≤ b, c < Y ≤ d) = P (X ≤ b, c < Y ≤ d) − P (X ≤ a, c < Y ≤ d)


= P (X ≤ b, Y ≤ d) − P (X ≤ b, Y ≤ c) − P (X ≤ a, Y ≤ d) + P (X ≤ a, Y ≤ c)
= F (b, d) − F (b, c) − F (a, d) + F (a, c).

2.1.7. Let f(x, y) = e^{−x−y}, 0 < x < ∞, 0 < y < ∞, zero elsewhere, be the pdf of X and Y. If
Z = X + Y, compute P(Z ≤ 0), P(Z ≤ 6), and, more generally, P(Z ≤ z) for 0 < z < ∞. What is the pdf
of Z?

Solution.
Compute the general probability first. For 0 < z < ∞,

F(z) = P(Z ≤ z) = P(X + Y ≤ z) = P(Y ≤ z − X)
= ∫_0^z ∫_0^{z−x} e^{−x−y} dy dx = ∫_0^z (e^{−x} − e^{−z}) dx = 1 − e^{−z} − ze^{−z}.

Hence, P(Z ≤ 0) = 0, P(Z ≤ 6) = 1 − 7e^{−6}, and f(z) = F′(z) = ze^{−z}, 0 < z < ∞, zero elsewhere.
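A Monte Carlo sketch (not part of the original solution) that checks P(Z ≤ 6) = 1 − 7e^{−6} by simulating the two exponentials directly:

```python
import math
import random

# Monte Carlo check of P(Z <= 6) for Z = X + Y with X, Y iid Exp(1);
# a sketch, not part of the textbook solution.
random.seed(0)
N = 200_000
hits = sum(
    random.expovariate(1.0) + random.expovariate(1.0) <= 6
    for _ in range(N)
)
estimate = hits / N
exact = 1 - 7 * math.exp(-6)  # about 0.9826
```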
2.1.8. Let X and Y have the pdf f (x, y) = 1, 0 < x < 1, 0 < y < 1, zero elsewhere. Find the cdf and pdf of
the product Z = XY .
Solution.
If z ≤ 0, then F(z) = P(Z ≤ z) = 0 because Z > 0. For 0 < z < 1,

F(z) = P(Z ≤ z) = P(Y ≤ z/X) = ∫_0^z ∫_0^1 dy dx + ∫_z^1 ∫_0^{z/x} dy dx = z − z log z,

and F(z) = 1 for z ≥ 1. Hence, the pdf of Z is

f_Z(z) = F′(z) = −log z, 0 < z < 1,

zero elsewhere.
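The cdf can be checked by simulating the product of two uniforms (a sketch, not part of the original solution; the evaluation point z = 0.5 is an arbitrary choice):

```python
import math
import random

# Monte Carlo check of F(z) = z - z log z for Z = XY, X, Y iid U(0,1);
# a sketch, not part of the textbook solution.
random.seed(1)
N = 200_000
z = 0.5
hits = sum(random.random() * random.random() <= z for _ in range(N))
estimate = hits / N
exact = z - z * math.log(z)  # about 0.8466
```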
2.1.11. Let X1 and X2 have the joint pdf f (x1 , x2 ) = 15x21 x2 , 0 < x1 < x2 < 1, zero elsewhere. Find the
marginal pdfs and compute P (X1 + X2 ≤ 1).
Solution.

f_{X1}(x1) = ∫_{x1}^1 15x1²x2 dx2 = 15x1²(1 − x1²)/2, 0 < x1 < 1,

f_{X2}(x2) = ∫_0^{x2} 15x1²x2 dx1 = 5x2⁴, 0 < x2 < 1,

P(X1 + X2 ≤ 1) = ∫_0^{1/2} (∫_{x1}^{1−x1} 15x1²x2 dx2) dx1 = ··· = 5/64.

2.1.13. Let X1, X2 be two random variables with the joint pmf p(x1, x2) = (x1 + x2)/12, for x1 = 1, 2, x2 =
1, 2, zero elsewhere. Compute E(X1), E(X1²), E(X2), E(X2²), and E(X1X2). Is E(X1X2) = E(X1)E(X2)?
Find E(2X1 − 6X2² + 7X1X2).
Solution.
First, find the marginal pmfs:

p_{X1}(x1) = Σ_{x2=1}^{2} (x1 + x2)/12 = (x1 + 1)/12 + (x1 + 2)/12 = (2x1 + 3)/12, and similarly p_{X2}(x2) = (2x2 + 3)/12.

Hence,

E(X1) = Σ_{x1=1}^{2} x1 p_{X1}(x1) = p_{X1}(1) + 2p_{X1}(2) = 5/12 + 14/12 = 19/12,

E(X1²) = p_{X1}(1) + 2²p_{X1}(2) = 33/12,

E(X2) = E(X1) = 19/12, E(X2²) = E(X1²) = 33/12.
Also, using the joint pmf,

E(X1X2) = Σ_{x1} Σ_{x2} x1x2 p(x1, x2) = p(1, 1) + 2p(2, 1) + 2p(1, 2) + 4p(2, 2) = 5/2 ≠ E(X1)E(X2).

Therefore,

E(2X1 − 6X2² + 7X1X2) = 2(19/12) − 6(33/12) + 7(5/2) = 25/6.
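Since the support is finite, every expectation above can be verified by exact enumeration (a sketch, not part of the original solution):

```python
from fractions import Fraction

# Exact enumeration of the expectations in Exercise 2.1.13
# using the pmf p(x1, x2) = (x1 + x2)/12 on {1, 2} x {1, 2}.
p = {(x1, x2): Fraction(x1 + x2, 12) for x1 in (1, 2) for x2 in (1, 2)}

E = lambda g: sum(g(x1, x2) * q for (x1, x2), q in p.items())

EX1 = E(lambda a, b: a)                              # 19/12
EX1X2 = E(lambda a, b: a * b)                        # 5/2
Eg = E(lambda a, b: 2 * a - 6 * b**2 + 7 * a * b)    # 25/6
```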

2.1.15. Let X1, X2 be two random variables with joint pmf p(x1, x2) = (1/2)^{x1+x2}, for 1 ≤ xi < ∞,
i = 1, 2, where X1 and X2 are integers, zero elsewhere. Determine the joint mgf of X1, X2. Show that
M(t1, t2) = M(t1, 0)M(0, t2).
Solution.

p_{X1}(x1) = Σ_{x2=1}^{∞} (1/2)^{x1+x2} = (1/2)^{x1+1}/(1 − 1/2) = (1/2)^{x1}, and similarly p_{X2}(x2) = (1/2)^{x2}.

M_{X1}(t) = Σ_{x1=1}^{∞} (e^t/2)^{x1} = (e^t/2)/(1 − e^t/2) = e^t/(2 − e^t) = M_{X2}(t), t < log 2,

M(t1, t2) = Σ_{x1=1}^{∞} Σ_{x2=1}^{∞} e^{t1x1+t2x2}(1/2)^{x1+x2} = Σ_{x1=1}^{∞} (e^{t1}/2)^{x1} Σ_{x2=1}^{∞} (e^{t2}/2)^{x2}
= M_{X1}(t1)M_{X2}(t2) = M(t1, 0)M(0, t2).
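The factorization can be checked numerically with truncated series (a sketch, not part of the original solution; the evaluation points t1 = 0.1, t2 = 0.2 are arbitrary choices inside t < log 2):

```python
import math

# Truncated-series check that M(t1, t2) = M(t1, 0) * M(0, t2) for the
# pmf p(x1, x2) = (1/2)^(x1+x2), x1, x2 = 1, 2, ...; terms decay
# geometrically, so a cutoff of 200 is far more than enough.
def M(t1, t2, cutoff=200):
    return sum(
        math.exp(t1 * x1 + t2 * x2) * 0.5 ** (x1 + x2)
        for x1 in range(1, cutoff)
        for x2 in range(1, cutoff)
    )

lhs = M(0.1, 0.2)
rhs = M(0.1, 0) * M(0, 0.2)
closed_form = math.exp(0.1) / (2 - math.exp(0.1))  # e^t/(2 - e^t) at t = 0.1
```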

2.2 Transformations: Bivariate Random Variables


2.2.1. If p(x1, x2) = (2/3)^{x1+x2}(1/3)^{2−x1−x2}, (x1, x2) = (0, 0), (0, 1), (1, 0), (1, 1), zero elsewhere, is the joint pmf
of X1 and X2, find the joint pmf of Y1 = X1 − X2 and Y2 = X1 + X2.
Solution.
The support of (Y1, Y2) is (y1, y2) = (0, 0), (−1, 1), (1, 1), (0, 2). Since the one-to-one inverse functions are
x1 = (y1 + y2)/2 and x2 = (y2 − y1)/2, and x1 + x2 = y2,

p_{Y1,Y2}(y1, y2) = p((y1 + y2)/2, (y2 − y1)/2) = (2/3)^{y2}(1/3)^{2−y2},

zero outside the support.
2.2.5. Let X1 and X2 be continuous random variables with the joint pdf fX1 ,X2 (x1 , x2 ), −∞ < xi < ∞,
i = 1, 2. Let Y1 = X1 + X2 and Y2 = X2 .
(a) Find the joint pdf fY1 ,Y2 .
Solution.
The inverse functions are x1 = y1 − y2 , x2 = y2 and then the Jacobian J = 1. Hence
fY1 ,Y2 (y1 , y2 ) = fX1 ,X2 (y1 − y2 , y2 )|J| = fX1 ,X2 (y1 − y2 , y2 ).

(b) Show that


f_{Y1}(y1) = ∫_{−∞}^{∞} f_{X1,X2}(y1 − y2, y2) dy2, (2.2.5)

which is sometimes called the convolution formula.


Solution.
The support is −∞ < y1 − y2 < ∞, −∞ < y2 < ∞, i.e., −∞ < yi < ∞, i = 1, 2, which gives (2.2.5).

2.2.6. Suppose X1 and X2 have the joint pdf f (x1 , x2 ) = e−(x1 +x2 ) , 0 < xi < ∞, i = 1, 2, zero elsewhere.
(a) Use formula (2.2.5) to find the pdf of Y1 = X1 + X2 .
Solution.
Since the support of (Y1, Y2) is 0 < y1 − y2 < ∞, 0 < y2 < ∞, i.e., 0 < y2 < y1 < ∞,

f_{Y1}(y1) = ∫_{−∞}^{∞} f_{X1,X2}(y1 − y2, y2) dy2 = ∫_0^{y1} e^{−y1} dy2 = y1e^{−y1}, y1 > 0.

(b) Find the mgf of Y1.

Solution.

M(t) = ∫_0^{∞} y1 e^{−(1−t)y1} dy1 = Γ(2)(1/(1 − t))² = 1/(1 − t)², t < 1.

2.2.7. Use the formula (2.2.5) to find the pdf of Y1 = X1 + X2 , where X1 and X2 have the joint pdf
fX1 ,X2 (x1 , x2 ) = 2e−(x1 +x2 ) , 0 < x1 < x2 < ∞, zero elsewhere.
Solution.
Since the support of (Y1, Y2) is 0 < y1 − y2 < y2 < ∞, i.e., 0 < y1/2 < y2 < y1 < ∞,

f_{Y1}(y1) = ∫_{−∞}^{∞} f_{X1,X2}(y1 − y2, y2) dy2 = ∫_{y1/2}^{y1} 2e^{−y1} dy2 = y1e^{−y1}, y1 > 0,

which is a Γ(2, 1) pdf.


2.2.8. Suppose X1 and X2 have the joint pdf

f(x1, x2) = e^{−x1}e^{−x2}, x1 > 0, x2 > 0,

zero elsewhere.

For constants w1 > 0 and w2 > 0, let W = w1 X1 + w2 X2 .


(a) Show that the pdf of W is

f_W(w) = (1/(w1 − w2))(e^{−w/w1} − e^{−w/w2}), w > 0,

zero elsewhere.

Solution.
Let Z = w1X1 − w2X2. The transformation is one-to-one, with inverse

x1 = (w + z)/(2w1), x2 = (w − z)/(2w2).

The Jacobian is

J = | ∂x1/∂w  ∂x1/∂z ; ∂x2/∂w  ∂x2/∂z | = | 1/(2w1)  1/(2w1) ; 1/(2w2)  −1/(2w2) | = −1/(2w1w2).

Hence the joint pdf of W and Z is

f_{W,Z}(w, z) = f((w + z)/(2w1), (w − z)/(2w2))|J| = (1/(2w1w2)) e^{−(w+z)/(2w1)} e^{−(w−z)/(2w2)}
= (1/(2w1w2)) exp{−[(w1 + w2)/(2w1w2)]w} exp{[(w1 − w2)/(2w1w2)]z}.

The support is

(w + z)/(2w1) > 0, (w − z)/(2w2) > 0 ⇒ w > 0, −w < z < w.

Hence the marginal pdf of W is

f_W(w) = (1/(2w1w2)) exp{−[(w1 + w2)/(2w1w2)]w} ∫_{−w}^{w} exp{[(w1 − w2)/(2w1w2)]z} dz
= (1/(w1 − w2)) exp{−[(w1 + w2)/(2w1w2)]w} (exp{[(w1 − w2)/(2w1w2)]w} − exp{−[(w1 − w2)/(2w1w2)]w})
= (1/(w1 − w2))(e^{−w/w1} − e^{−w/w2}), w > 0.

(b) Verify that fW (w) > 0 for w > 0.


Solution.
If w1 > w2, then w1 − w2 > 0 and e^{−w/w1} − e^{−w/w2} > 0, because g(x) = e^{−a/x} is increasing in x for a > 0.
If w1 < w2, then w1 − w2 < 0 and e^{−w/w1} − e^{−w/w2} < 0. Hence, f_W(w) > 0 for w > 0.
(c) Note that the pdf f_W(w) has an indeterminate form when w1 = w2. Rewrite f_W(w) using h defined
as w1 − w2 = h. Then use l'Hôpital's rule to show that when w1 = w2, the pdf is given by f_W(w) =
(w/w1²) exp{−w/w1} for w > 0 and zero elsewhere.
Solution.
Write h = w1 − w2, so w2 = w1 − h and w1 = w2 corresponds to h → 0. Then

lim_{h→0} f_W(w) = lim_{h→0} [e^{−w/w1} − e^{−w/(w1−h)}]/h
= lim_{h→0} [d/dh (e^{−w/w1} − e^{−w/(w1−h)})]/(dh/dh)
= lim_{h→0} [0 + {w/(w1 − h)²} e^{−w/(w1−h)}]
= (w/w1²) e^{−w/w1}.
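The limit can be checked numerically: for nearly equal weights the two-exponential formula is very close to (w/w1²)e^{−w/w1}. A sketch, not part of the original solution; w = 3 and w1 = 2 are arbitrary choices.

```python
import math

# Numerical check of the l'Hopital limit in part (c): as w2 -> w1,
# f_W approaches (w/w1^2) e^{-w/w1}. Not part of the textbook solution.
def f_W(w, w1, w2):
    return (math.exp(-w / w1) - math.exp(-w / w2)) / (w1 - w2)

w, w1 = 3.0, 2.0
near = f_W(w, w1, w1 - 1e-7)                # w2 very close to w1
limit = (w / w1**2) * math.exp(-w / w1)     # the l'Hopital limit
```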

2.3 Conditional Distributions and Expectations


2.3.5. Let X1 and X2 be two random variables such that the conditional distributions and means exist.
Show that:
(a) E(X1 + X2 |X2 ) = E(X1 |X2 ) + X2 .
Solution.
Consider X2 = x2 (a fixed number) first.

E(X1 + X2 |X2 = x2 ) = E(X1 |X2 = x2 ) + x2 ⇒ E(X1 + X2 |X2 ) = E(X1 |X2 ) + X2 .

(b) E(u(X2 )|X2 ) = u(X2 ).


Solution. E(u(X2 )|X2 = x2 ) = E(u(x2 )) = u(x2 ) ⇒ E(u(X2 )|X2 ) = u(X2 ).
2.3.6. Let the joint pdf of X and Y be given by

f(x, y) = 2/(1 + x + y)³, 0 < x < ∞, 0 < y < ∞,

zero elsewhere.

(a) Compute the marginal pdf of X and the conditional pdf of Y , given X = x.
Solution.

f(x) = ∫_0^{∞} 2/(1 + x + y)³ dy = [−1/(1 + x + y)²]_{y=0}^{∞} = 1/(1 + x)², 0 < x < ∞,

f(y|x) = f(x, y)/f(x) = 2(1 + x)²/(1 + x + y)³, 0 < x < ∞, 0 < y < ∞,

zero elsewhere.
(b) For a fixed X = x, compute E(1 + x + Y |x) and use the result to compute E(Y |x).
Solution.

E(1 + x + Y|x) = ∫_0^{∞} (1 + x + y) · 2(1 + x)²/(1 + x + y)³ dy = ∫_0^{∞} 2(1 + x)²/(1 + x + y)² dy
= [−2(1 + x)²/(1 + x + y)]_{y=0}^{∞} = 2(1 + x).

Since E(1 + x + Y|x) = 1 + x + E(Y|x), E(Y|x) = 1 + x.


2.3.7. Suppose X1 and X2 are discrete random variables which have the joint pmf p(x1 , x2 ) = (3x1 + x2 )/24,
(x1 , x2 ) = (1, 1), (1, 2), (2, 1), (2, 2), zero elsewhere. Find the conditional mean E(X2 |x1 ), when x1 = 1.
Solution.
Since p_{X1}(1) = p(1, 1) + p(1, 2) = 4/24 + 5/24 = 9/24,

E(X2|x1 = 1) = Σ_{x2=1}^{2} x2 p(x2|x1 = 1) = [p(1, 1) + 2p(1, 2)]/p_{X1}(1) = (4/24 + 10/24)/(9/24) = 14/9.
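The conditioning step is easy to get wrong here, so an exact enumeration check is worthwhile (a sketch, not part of the original solution):

```python
from fractions import Fraction

# Exact check of E(X2 | X1 = 1) for p(x1, x2) = (3 x1 + x2)/24,
# (x1, x2) in {1, 2} x {1, 2}. Not part of the textbook solution.
p = {(x1, x2): Fraction(3 * x1 + x2, 24) for x1 in (1, 2) for x2 in (1, 2)}

pX1_1 = p[(1, 1)] + p[(1, 2)]                       # marginal P(X1 = 1)
cond_mean = (1 * p[(1, 1)] + 2 * p[(1, 2)]) / pX1_1
```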

2.3.8. Let X and Y have the joint pdf f (x, y) = 2 exp{−(x + y)}, 0 < x < y < ∞, zero elsewhere. Find the
conditional mean E(Y |x) of Y , given X = x.
Solution.

f(x) = ∫_x^{∞} 2 exp{−(x + y)} dy = 2e^{−2x} ⇒ f_{2|1}(y|x) = f(x, y)/f(x) = e^{x−y}, 0 < x < y < ∞.

Hence,

E(Y|x) = ∫_x^{∞} y e^{x−y} dy = ∫_0^{∞} (x + t)e^{−t} dt = x + 1, x > 0.

2.3.10. Let X1 and X2 have the joint pmf p(x1, x2) described as follows:

(x1, x2):   (0, 0)  (0, 1)  (1, 0)  (1, 1)  (2, 0)  (2, 1)
p(x1, x2):   1/18    3/18    4/18    3/18    6/18    1/18

and p(x1 , x2 ) is equal to zero elsewhere. Find the two marginal probability mass functions and the two
conditional means.
Hint: Write the probabilities in a rectangular array.
Solution.

p_{X1}(0) = 4/18, p_{X1}(1) = 7/18, p_{X1}(2) = 7/18; p_{X2}(0) = 11/18, p_{X2}(1) = 7/18.

E(X1|X2 = 0) = (4/18 + 2 · 6/18)/(11/18) = 16/11, E(X1|X2 = 1) = (3/18 + 2 · 1/18)/(7/18) = 5/7,

E(X2|X1 = 0) = (3/18)/(4/18) = 3/4, E(X2|X1 = 1) = (3/18)/(7/18) = 3/7, E(X2|X1 = 2) = (1/18)/(7/18) = 1/7.
2.3.11. Let us choose at random a point from the interval (0, 1) and let the random variable X1 be equal to
the number that corresponds to that point. Then choose a point at random from the interval (0, x1 ), where
x1 is the experimental value of X1 ; and let the random variable X2 be equal to the number that corresponds
to this point.
(a) Make assumptions about the marginal pdf f1 (x1 ) and the conditional pdf f2|1 (x2 |x1 ).
Solution.
Assume that X1 ∼ U(0, 1) and X2|X1 = x1 ∼ U(0, x1):

f(x1) = I(0 < x1 < 1), f(x2|x1) = (1/x1) I(0 < x2 < x1).

(b) Compute P (X1 + X2 ≥ 1).


Solution.
By (a), f_{1,2}(x1, x2) = f(x2|x1)f(x1) = 1/x1, 0 < x2 < x1 < 1. Hence,

P(X1 + X2 ≥ 1) = P(X2 ≥ 1 − X1) = ∫_{1/2}^1 ∫_{1−x1}^{x1} (1/x1) dx2 dx1 = ∫_{1/2}^1 (2 − 1/x1) dx1 = 1 − log 2.
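The two-stage sampling scheme is straightforward to simulate, which gives a Monte Carlo check of P(X1 + X2 ≥ 1) = 1 − log 2 (a sketch, not part of the original solution):

```python
import math
import random

# Monte Carlo check under the model X1 ~ U(0,1), X2 | X1 = x1 ~ U(0, x1).
# Not part of the textbook solution.
random.seed(2)
N = 200_000
hits = 0
for _ in range(N):
    x1 = random.random()
    x2 = random.uniform(0, x1)
    hits += (x1 + x2 >= 1)
estimate = hits / N
exact = 1 - math.log(2)  # about 0.3069
```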

(c) Find the conditional mean E(X1 |x2 )


Solution.
Find f(x2) to get f(x1|x2):

f(x2) = ∫_{x2}^1 (1/x1) dx1 = −log x2, 0 < x2 < 1 ⇒ f(x1|x2) = f(x1, x2)/f(x2) = −1/(x1 log x2), 0 < x2 < x1 < 1.

Hence,

E(X1|X2 = x2) = ∫_{x2}^1 −1/(log x2) dx1 = (1 − x2)/log(1/x2), 0 < x2 < 1.

2.3.12. Let f (x) and F (x) denote, respectively, the pdf and the cdf of the random variable X. The conditional
pdf of X, given X > x0 , x0 a fixed number, is defined by f (x|X > x0 ) = f (x)/[1 − F (x0 )], x0 < x, zero
elsewhere. This kind of conditional pdf finds application in a problem of time until death, given survival
until time x0 .
(a) Show that f (x|X > x0 ) is a pdf.
Solution.
Since f(x) ≥ 0 and F(x0) < 1, f(x|X > x0) = f(x)/[1 − F(x0)] ≥ 0. Also,

∫_{x0}^{∞} f(x|X > x0) dx = ∫_{x0}^{∞} f(x)/[1 − F(x0)] dx = [F(x)]_{x0}^{∞}/[1 − F(x0)] = 1 since F(∞) = 1.

(b) Let f (x) = e−x , 0 < x < ∞, and zero elsewhere. Compute P (X > 2|X > 1).
Solution.
Since F(x) = 1 − e^{−x}, x > 0, f(x|X > 1) = f(x)/[1 − F(1)] = e^{−x+1}, x > 1. Hence,

P(X > 2|X > 1) = ∫_2^{∞} f(x|X > 1) dx = ∫_2^{∞} e^{−x+1} dx = [−e^{−x+1}]_2^{∞} = e^{−1}.
2.4 Independent Random Variables
2.4.1. Show that the random variables X1 and X2 with joint pdf
f(x1, x2) = 12x1x2(1 − x2), 0 < x1 < 1, 0 < x2 < 1, and zero elsewhere,

are independent.
Solution.
The support is rectangular (a product space). And f (x1 , x2 ) can be written as a product of a nonnegative
function of x1 and a nonnegative function of x2 : f (x1 , x2 ) ≡ g(x1 )h(x2 ), where g(x1 ) = 12x1 I(0 < x1 < 1)
and h(x2 ) = x2 (1 − x2 )I(0 < x2 < 1). Thus, X1 and X2 are independent.
Another solution is f (x1 , x2 ) = f (x1 )f (x2 ), where f (x1 ) = 2x1 and f (x2 ) = 6x2 (1 − x2 ) are marginal pdfs
of X1 and X2 .
2.4.2. If the random variables X1 and X2 have the joint pdf f (x1 , x2 ) = 2e−x1 −x2 , 0 < x1 < x2 , 0 < x2 < ∞,
zero elsewhere, show that X1 and X2 are dependent.
Solution.
Although the joint pdf can be expressed by a product of two nonnegative functions of x1 and x2 , respectively,
0 < x1 < x2 < ∞ is not a product space, which implies that X1 and X2 are dependent.
2.4.3. Let p(x1, x2) = 1/16, x1 = 1, 2, 3, 4, and x2 = 1, 2, 3, 4, zero elsewhere, be the joint pmf of X1 and X2.
Show that X1 and X2 are independent.
Solution.
The marginal pmfs of X1 and X2 are p(x1) = p(x2) = 1/4. So p(x1, x2) = p(x1)p(x2) for all (x1, x2) and the support is
rectangular, which shows that X1 and X2 are independent.
2.4.4. Find P(0 < X1 < 1/3, 0 < X2 < 1/3) if the random variables X1 and X2 have the joint pdf f(x1, x2) =
4x1(1 − x2), 0 < x1 < 1, 0 < x2 < 1, zero elsewhere.
Solution.
Since f(x1) = 2x1, 0 < x1 < 1, and f(x2) = 2(1 − x2), 0 < x2 < 1, and X1 and X2 are independent,

P(0 < X1 < 1/3, 0 < X2 < 1/3) = P(0 < X1 < 1/3) P(0 < X2 < 1/3)
= (∫_0^{1/3} 2x1 dx1)(∫_0^{1/3} 2(1 − x2) dx2)
= (1/9)(5/9) = 5/81.

2.4.5. Find the probability of the union of the events a < X1 < b, −∞ < X2 < ∞, and −∞ < X1 < ∞,
c < X2 < d if X1 and X2 are two independent variables with P(a < X1 < b) = 2/3 and P(c < X2 < d) = 5/8.
Solution.

P({a < X1 < b, −∞ < X2 < ∞} ∪ {−∞ < X1 < ∞, c < X2 < d})
= P({a < X1 < b} ∪ {c < X2 < d})
= P(a < X1 < b) + P(c < X2 < d) − P({a < X1 < b} ∩ {c < X2 < d})
= P(a < X1 < b) + P(c < X2 < d) − P(a < X1 < b)P(c < X2 < d)
= 2/3 + 5/8 − (2/3)(5/8) = 7/8.

2.4.8. Let X and Y have the joint pdf f (x, y) = 3x, 0 < y < x < 1, zero elsewhere. Are X and Y
independent? If not, find E(X|y).
Solution.
X and Y are not independent because the support 0 < y < x < 1 is not rectangular (not a product space).
So find f(y) first: f(y) = ∫_y^1 3x dx = 3(1 − y²)/2, 0 < y < 1, zero elsewhere. Hence

E(X|y) = ∫_{−∞}^{∞} x f(x, y)/f(y) dx = ∫_y^1 2x²/(1 − y²) dx = 2(1 − y³)/(3(1 − y²)) = 2(1 + y + y²)/(3(1 + y)), 0 < y < 1.

2.4.10. Let X and Y be random variables with the space consisting of the four points (0, 0), (1, 1), (1, 0),
(1, −1). Assign positive probabilities to these four points so that the correlation coefficient is equal to zero.
Are X and Y independent?
Solution.
Assign probability 1/4 to each of the four points, as shown below:

x \ y:   −1    0     1   | p_X(x)
0:        0   1/4    0   |  1/4
1:       1/4  1/4   1/4  |  3/4
p_Y(y):  1/4  1/2   1/4  |

Then the correlation coefficient is ρ = 0 because

E(X) = 3/4, E(Y) = 0, E(XY) = −1/4 + 1/4 = 0 ⇒ E(XY) − E(X)E(Y) = 0.

However, P(X = 1, Y = 1) = 1/4 ≠ 3/16 = p_X(1)p_Y(1), meaning that X and Y are not independent.
2.4.11. Two line segments, each of length two units, are placed along the x-axis. The midpoint of the first
is between x = 0 and x = 14 and that of the second is between x = 6 and x = 20. Assuming independence
and uniform distributions for these midpoints, find the probability that the line segments overlap.
Solution.
Since X1 ∼ U(0, 14) and X2 ∼ U(6, 20) are independent, the joint pdf of X1 and X2 is f(x1, x2) = 1/14² on
(0, 14) × (6, 20). The segments overlap if and only if the midpoints are within two units of each other, i.e.,
|X1 − X2| < 2. The part of the rectangle where |x1 − x2| < 2 has area

∫_4^8 (x1 − 4) dx1 + ∫_8^{14} 4 dx1 = 8 + 24 = 32,

so the desired probability is 32/14² = 8/49.
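A Monte Carlo sketch (not part of the original solution) confirming the overlap probability 8/49:

```python
import random

# Monte Carlo check: segments of length 2 with midpoints X1 ~ U(0, 14)
# and X2 ~ U(6, 20) overlap iff |X1 - X2| < 2. Not part of the
# textbook solution.
random.seed(3)
N = 200_000
hits = sum(
    abs(random.uniform(0, 14) - random.uniform(6, 20)) < 2
    for _ in range(N)
)
estimate = hits / N
exact = 8 / 49  # about 0.1633
```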

2.4.12. Cast a fair die and let X = 0 if 1, 2, or 3 spots appear, let X = 1 if 4 or 5 spots appear, and let
X = 2 if 6 spots appear. Do this two independent times, obtaining X1 and X2 . Calculate P (|X1 − X2 | = 1).
Solution.
|X1 − X2 | = 1 when (X1 , X2 ) = (0, 1), (1, 0), (1, 2), (2, 1) with probabilities of 1/6, 1/6, 1/18, and 1/18,
respectively. Hence the desired probability is 2(1/6 + 1/18) = 4/9.
2.4.13. For X1 and X2 in Example 2.4.6, show that the mgf of Y = X1 + X2 is e^{2t}/(2 − e^t)², t < log 2, and
then compute the mean and variance of Y.
Solution.
Setting t = t1 = t2,

M_Y(t) = M_{X1,X2}(t, t) = (e^t/(2 − e^t))² = e^{2t}/(2 − e^t)², t < log 2.

Let ψ(t) = log M_Y(t) = 2t − 2 log(2 − e^t). Then

E(Y) = ψ′(0) = [2 + 2e^t/(2 − e^t)]_{t=0} = 4,

Var(Y) = ψ″(0) = [4e^t/(2 − e^t)²]_{t=0} = 4.
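The moments can be cross-checked directly from the pmf of Example 2.4.6, p(x) = (1/2)^x, x = 1, 2, ..., by truncating the series (a sketch, not part of the original solution):

```python
# Check E(Y) = Var(Y) = 4 for Y = X1 + X2 with X1, X2 iid having
# pmf (1/2)^x, x = 1, 2, ...; truncating the series at 200 terms is far
# beyond double precision. Not part of the textbook solution.
mean_x = sum(x * 0.5**x for x in range(1, 200))        # E(X) = 2
second_x = sum(x * x * 0.5**x for x in range(1, 200))  # E(X^2) = 6
var_x = second_x - mean_x**2                           # Var(X) = 2

EY = 2 * mean_x   # means add
VarY = 2 * var_x  # variances add (independence)
```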

2.5 The Correlation Coefficient


2.5.1. Let the random variables X and Y have the joint pmf
(a) p(x, y) = 1/3, (x, y) = (0, 0), (1, 1), (2, 2), zero elsewhere.
(b) p(x, y) = 1/3, (x, y) = (0, 2), (1, 1), (2, 0), zero elsewhere.
(c) p(x, y) = 1/3, (x, y) = (0, 0), (1, 1), (2, 0), zero elsewhere.
In each case compute the correlation coefficient of X and Y.
Solution.
For (a) and (b), the scatter plots clearly show that ρ = 1 and ρ = −1, respectively.
For (c), since E(X) = 1, E(Y) = 1/3, and E(XY) = 1/3, Cov(X, Y) = E(XY) − E(X)E(Y) = 0. Thus, ρ = 0.
2.5.3. Let f(x, y) = 2, 0 < x < y, 0 < y < 1, zero elsewhere, be the joint pdf of X and Y. Show that the
conditional means are, respectively, (1 + x)/2, 0 < x < 1, and y/2, 0 < y < 1. Show that the correlation
coefficient of X and Y is ρ = 1/2.
Solution.
Find the marginal pdfs of X and Y first:

f(x) = ∫_x^1 2 dy = 2(1 − x), 0 < x < 1, f(y) = ∫_0^y 2 dx = 2y, 0 < y < 1.

Hence,

E(Y|X = x) = ∫ y f(y|x) dy = ∫_x^1 y/(1 − x) dy = (1 + x)/2, 0 < x < 1,

E(X|Y = y) = ∫ x f(x|y) dx = ∫_0^y x/y dx = y/2, 0 < y < 1.

For the correlation coefficient, E(X) = 1/3, E(Y) = 2/3, Var(X) = Var(Y) = 1/18, and E(XY) = ∫_0^1 ∫_0^y 2xy dx dy = 1/4, so Cov(X, Y) = 1/4 − 2/9 = 1/36 and ρ = (1/36)/(1/18) = 1/2.

2.5.4. Show that the variance of the conditional distribution of Y, given X = x, in Exercise 2.5.3, is
(1 − x)²/12, 0 < x < 1, and that the variance of the conditional distribution of X, given Y = y, is y²/12,
0 < y < 1.
Solution.

E(Y²|X = x) = ∫_x^1 y²/(1 − x) dy = (1 + x + x²)/3, 0 < x < 1,

E(X²|Y = y) = ∫_0^y x²/y dx = y²/3, 0 < y < 1.

Hence,

Var(Y|X = x) = E(Y²|X = x) − [E(Y|X = x)]² = (1 + x + x²)/3 − (1 + x)²/4 = (1 − x)²/12, 0 < x < 1,

Var(X|Y = y) = E(X²|Y = y) − [E(X|Y = y)]² = y²/3 − y²/4 = y²/12, 0 < y < 1.

2.5.5. Verify the results of equations (2.5.11) of this section.
Solution. See Exercise 2.5.8; using ψ(t1, t2) makes the computation easier.
2.5.6. Let X and Y have the joint pdf f (x, y) = 1, −x < y < x, 0 < x < 1, zero elsewhere. Show that, on
the set of positive probability density, the graph of E(Y |x) is a straight line, whereas that of E(X|y) is not
a straight line.
Solution.
Find the marginal pdfs of X and Y first:

f(x) = ∫_{−x}^{x} dy = 2x, 0 < x < 1,

f(y) = ∫_y^1 dx = 1 − y for 0 < y < 1, and f(y) = ∫_{−y}^1 dx = 1 + y for −1 < y ≤ 0.

Hence,

E(Y|x) = ∫ y f(y|x) dy = ∫_{−x}^{x} y/(2x) dy = 0, 0 < x < 1,

E(X|y) = ∫_y^1 x/(1 − y) dx = (1 + y)/2 for 0 < y < 1, and E(X|y) = ∫_{−y}^1 x/(1 + y) dx = (1 − y)/2 for −1 < y ≤ 0,

which means that the graph of E(Y|x) is a straight line (identically zero), whereas that of E(X|y) is not a straight line.
2.5.8. Let ψ(t1 , t2 ) = log M (t1 , t2 ), where M (t1 , t2 ) is the mgf of X and Y . Show that
∂ψ(0, 0)/∂ti, ∂²ψ(0, 0)/∂ti², i = 1, 2, and ∂²ψ(0, 0)/∂t1∂t2
yield the means, the variances, and the covariance of the two random variables. Use this result to find the
means, the variances, and the covariance of X and Y of Example 2.5.6.
Solution.
Note that M(0, 0) = E(1) = 1. When i = 1,

∂ψ(0, 0)/∂t1 = [∂M(0, 0)/∂t1]/M(0, 0) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x f(x, y) dy dx = E(X),

∂²ψ(0, 0)/∂t1² = {M(0, 0) ∂²M(0, 0)/∂t1² − [∂M(0, 0)/∂t1]²}/M(0, 0)² = E(X²) − [E(X)]² = Var(X).

Same for i = 2. And

∂²ψ(0, 0)/∂t1∂t2 = {[∂²M(0, 0)/∂t1∂t2]M(0, 0) − [∂M(0, 0)/∂t1][∂M(0, 0)/∂t2]}/M(0, 0)²
= E(XY) − E(X)E(Y) = Cov(X, Y).

Hence, for Example 2.5.6,

ψ(t1, t2) = log M(t1, t2) = −log(1 − t1 − t2) − log(1 − t2),

∂ψ/∂t1 = 1/(1 − t1 − t2), ∂ψ/∂t2 = 1/(1 − t1 − t2) + 1/(1 − t2),

∂²ψ/∂t1² = 1/(1 − t1 − t2)², ∂²ψ/∂t2² = 1/(1 − t1 − t2)² + 1/(1 − t2)²,

∂²ψ/∂t1∂t2 = 1/(1 − t1 − t2)².

Therefore,

μ1 = E(X) = ∂ψ(0, 0)/∂t1 = 1, μ2 = E(Y) = ∂ψ(0, 0)/∂t2 = 2,

σ1² = Var(X) = ∂²ψ(0, 0)/∂t1² = 1, σ2² = Var(Y) = ∂²ψ(0, 0)/∂t2² = 2,

E[(X − μ1)(Y − μ2)] = Cov(X, Y) = ∂²ψ(0, 0)/∂t1∂t2 = 1.
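The derivative values can be verified with central finite differences of ψ at the origin (a sketch, not part of the original solution):

```python
import math

# Finite-difference check that the derivatives of
# psi(t1, t2) = -log(1 - t1 - t2) - log(1 - t2) at (0, 0)
# reproduce the moments found above. Not part of the textbook solution.
def psi(t1, t2):
    return -math.log(1 - t1 - t2) - math.log(1 - t2)

h = 1e-4
mu1 = (psi(h, 0) - psi(-h, 0)) / (2 * h)                     # E(X) = 1
mu2 = (psi(0, h) - psi(0, -h)) / (2 * h)                     # E(Y) = 2
var1 = (psi(h, 0) - 2 * psi(0, 0) + psi(-h, 0)) / h**2       # Var(X) = 1
var2 = (psi(0, h) - 2 * psi(0, 0) + psi(0, -h)) / h**2       # Var(Y) = 2
cov = (psi(h, h) - psi(h, -h) - psi(-h, h) + psi(-h, -h)) / (4 * h**2)
```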

2.5.9. Let X and Y have the joint pmf p(x, y) = 1/7, (x, y) = (0, 0), (1, 0), (0, 1), (1, 1), (2, 1), (1, 2), (2, 2), zero elsewhere.
Find the correlation coefficient ρ.
Solution.

E(X) = E(Y) = (1 + 1 + 2 + 1 + 2)/7 = 1, E(X²) = E(Y²) = (1 + 1 + 4 + 1 + 4)/7 = 11/7

⇒ σ_X² = σ_Y² = 11/7 − 1 = 4/7, E(XY) = (1 + 2 + 2 + 4)/7 = 9/7.

Hence,

ρ = [E(XY) − E(X)E(Y)]/(σ_X σ_Y) = (2/7)/(4/7) = 1/2.
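Enumerating the seven equally likely points confirms ρ = 1/2 (a sketch, not part of the original solution):

```python
import math

# Exact enumeration of rho for the seven equally likely points
# in Exercise 2.5.9. Not part of the textbook solution.
pts = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 1), (1, 2), (2, 2)]
n = len(pts)
EX = sum(x for x, y in pts) / n
EY = sum(y for x, y in pts) / n
EXY = sum(x * y for x, y in pts) / n
VX = sum(x * x for x, y in pts) / n - EX**2
VY = sum(y * y for x, y in pts) / n - EY**2
rho = (EXY - EX * EY) / math.sqrt(VX * VY)
```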

2.5.11. Let σ1² = σ2² = σ² be the common variance of X1 and X2 and let ρ be the correlation coefficient of
X1 and X2. Show for k > 0 that

P[|(X1 − μ1) + (X2 − μ2)| ≥ kσ] ≤ 2(1 + ρ)/k².

Solution.
Let W = (X1 − μ1) + (X2 − μ2). Then E(W) = 0 and

Var(W) = Var(X1) + Var(X2) + 2Cov(X1, X2) = σ² + σ² + 2ρσ² = 2(1 + ρ)σ²,

since Cov(X1, X2) = ρσ1σ2 = ρσ². By Chebyshev's inequality,

P[|(X1 − μ1) + (X2 − μ2)| ≥ kσ] = P(|W − E(W)| ≥ kσ) ≤ Var(W)/(kσ)² = 2(1 + ρ)/k².

2.6 Extension to Several Random Variables


2.6.1. Let X, Y, Z have joint pdf f (x, y, z) = 2(x + y + z)/3, 0 < x < 1, 0 < y < 1, 0 < z < 1, zero elsewhere.
(a) Find the marginal probability density functions of X, Y , and Z.
Solution.

f_X(x) = ∫_0^1 ∫_0^1 2(x + y + z)/3 dz dy = ··· = 2(x + 1)/3, 0 < x < 1.

Similarly,

f_Y(y) = 2(y + 1)/3, 0 < y < 1, f_Z(z) = 2(z + 1)/3, 0 < z < 1.

(b) Compute P(0 < X < 1/2, 0 < Y < 1/2, 0 < Z < 1/2) and P(0 < X < 1/2) = P(0 < Y < 1/2) = P(0 < Z < 1/2).
Solution. Skipped. We can solve part (c) without computing them.
(c) Are X, Y , and Z independent?
Solution. No; f(x, y, z) ≠ f_X(x)f_Y(y)f_Z(z), although the support is a product space.
(d) Compute E(X²Y Z + 3XY⁴Z²).
Solution. Skipped.
(e) Determine the cdf of X, Y , and Z.
Solution.

F_X(x) = 0 for x ≤ 0, F_X(x) = ∫_0^x 2(t + 1)/3 dt = (x² + 2x)/3 for 0 < x < 1, and F_X(x) = 1 for x ≥ 1.

Similarly,

F_Y(y) = (y² + 2y)/3 for 0 < y < 1, F_Z(z) = (z² + 2z)/3 for 0 < z < 1 (each 0 below 0 and 1 above 1).
(f ) Find the conditional distribution of X and Y , given Z = z, and evaluate E(X + Y |z).
Solution.

f(x, y|z) = f(x, y, z)/f_Z(z) = (x + y + z)/(z + 1), 0 < x < 1, 0 < y < 1.

Hence,

E(X + Y|z) = ∫_0^1 ∫_0^1 (x + y)(x + y + z)/(z + 1) dy dx
= (1/(z + 1)) ∫_0^1 ∫_0^1 [(x + y)² + z(x + y)] dy dx
= (1/(z + 1)) ∫_0^1 [(x + y)³/3 + z(x + y)²/2]_{y=0}^{y=1} dx
= (1/(z + 1)) ∫_0^1 [(x + 1)³/3 + z(x + 1)²/2 − x³/3 − zx²/2] dx
= (1/(z + 1)) [(x + 1)⁴/12 + z(x + 1)³/6 − x⁴/12 − zx³/6]_0^1
= (z + 7/6)/(z + 1) = (6z + 7)/(6(z + 1)), 0 < z < 1.

(g) Determine the conditional distribution of X, given Y = y and Z = z, and compute E(X|y, z).
Solution.

f(y, z) = ∫_0^1 2(x + y + z)/3 dx = (2y + 2z + 1)/3,

f(x|y, z) = f(x, y, z)/f(y, z) = 2(x + y + z)/(2y + 2z + 1), 0 < x < 1.

Hence,

E(X|y, z) = ∫_0^1 x · 2(x + y + z)/(2y + 2z + 1) dx = [(2x³/3 + x²(y + z))/(2y + 2z + 1)]_0^1
= (3y + 3z + 2)/(3(2y + 2z + 1)), 0 < y, z < 1.
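The conditional-mean formula from part (f) can be checked numerically with a midpoint Riemann sum over the unit square (a sketch, not part of the original solution; z = 0.5 is an arbitrary choice):

```python
# Midpoint Riemann check of E(X + Y | z) = (6z + 7)/(6(z + 1)) at z = 0.5,
# using the conditional pdf (x + y + z)/(z + 1) on the unit square.
# Not part of the textbook solution.
def riemann_2d(f, n=400):
    h = 1.0 / n
    return sum(
        f((i + 0.5) * h, (j + 0.5) * h)
        for i in range(n) for j in range(n)
    ) * h * h

z = 0.5
numeric = riemann_2d(lambda x, y: (x + y) * (x + y + z) / (z + 1))
exact = (6 * z + 7) / (6 * (z + 1))  # 10/9
```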

2.6.2. Let f (x1 , x2 , x3 ) = exp[−(x1 + x2 + x3 )], 0 < x1 < ∞, 0 < x2 < ∞, 0 < x3 < ∞, zero elsewhere, be
the joint pdf of X1 , X2 , X3 .
(a) Compute P (X1 < X2 < X3 ) and P (X1 = X2 < X3 ).
Solution.

P(X1 < X2 < X3) = ∫_0^{∞} ∫_0^{x3} ∫_0^{x2} e^{−x1−x2−x3} dx1 dx2 dx3
= ∫_0^{∞} ∫_0^{x3} [e^{−x2−x3} − e^{−2x2−x3}] dx2 dx3
= ∫_0^{∞} [(e^{−x3} − e^{−2x3}) − (e^{−x3}/2 − e^{−3x3}/2)] dx3
= (1 − 1/2) − (1/2 − 1/6) = 1/6,

P(X1 = X2 < X3) = 0, since the region {x1 = x2 < x3} has zero volume.
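Since the three variables are iid, each of the 3! orderings is equally likely, so P(X1 < X2 < X3) = 1/6; a Monte Carlo sketch (not part of the original solution) confirms this:

```python
import random

# Monte Carlo check that P(X1 < X2 < X3) = 1/6 for three iid Exp(1)
# variables; by symmetry each of the 3! orderings is equally likely.
# Not part of the textbook solution.
random.seed(4)
N = 120_000
hits = 0
for _ in range(N):
    x1, x2, x3 = (random.expovariate(1.0) for _ in range(3))
    hits += (x1 < x2 < x3)
estimate = hits / N
```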

(b) Determine the joint mgf of X1 , X2 , and X3 . Are these random variables independent?
Solution.

M(t1, t2, t3) = ∫_0^{∞} ∫_0^{∞} ∫_0^{∞} e^{t1x1+t2x2+t3x3} e^{−x1−x2−x3} dx1 dx2 dx3
= ∫_0^{∞} e^{−(1−t1)x1} dx1 ∫_0^{∞} e^{−(1−t2)x2} dx2 ∫_0^{∞} e^{−(1−t3)x3} dx3
= 1/[(1 − t1)(1 − t2)(1 − t3)], t1 < 1, t2 < 1, t3 < 1
= M_{X1}(t1)M_{X2}(t2)M_{X3}(t3),

which shows that these three random variables are independent.
2.6.7. Prove Corollary 2.6.1: Suppose X1, X2, ..., Xn are iid random variables with the common mgf M(t),
for −h < t < h, where h > 0. Let T = Σ_{i=1}^{n} Xi. Then T has the mgf given by

M_T(t) = [M(t)]^n, −h < t < h.

Solution.

M_T(t) = E[e^{t Σ_{i=1}^{n} Xi}] = ∏_{i=1}^{n} E(e^{tXi}) (the Xi are independent)
= [E(e^{tX})]^n (the Xi are identically distributed)
= [M(t)]^n.

2.6.9. Let X1 , X2 , X3 be iid with common pdf f (x) = exp(−x), 0 < x < ∞, zero elsewhere. Evaluate:
(a) P (X1 < X2 |X1 < 2X2 ).
Solution.

P(X1 < X2|X1 < 2X2) = P(X1 < X2, X1 < 2X2)/P(X1 < 2X2) = P(X1 < X2)/P(X1 < 2X2).

For the numerator,

P(X1 < X2) = ∫_0^{∞} ∫_{x1}^{∞} e^{−x1−x2} dx2 dx1 = ∫_0^{∞} e^{−2x1} dx1 = 1/2.

For the denominator,

P(X1 < 2X2) = ∫_0^{∞} ∫_{x1/2}^{∞} e^{−x1−x2} dx2 dx1 = ∫_0^{∞} e^{−3x1/2} dx1 = 2/3.

Thus, P(X1 < X2|X1 < 2X2) = (1/2)/(2/3) = 3/4.
(b) P(X1 < X2 < X3|X3 < 1).
Solution.

P(X1 < X2 < X3|X3 < 1) = P(X1 < X2 < X3 < 1)/P(X3 < 1).

For the numerator,

P(X1 < X2 < X3 < 1) = ∫_0^1 ∫_0^{x3} ∫_0^{x2} e^{−x1−x2−x3} dx1 dx2 dx3
= ∫_0^1 ∫_0^{x3} [e^{−x2−x3} − e^{−2x2−x3}] dx2 dx3
= ∫_0^1 [e^{−x3}/2 − e^{−2x3} + e^{−3x3}/2] dx3
= (1 − e^{−1})/2 − (1 − e^{−2})/2 + (1 − e^{−3})/6
= (1 − 3e^{−1} + 3e^{−2} − e^{−3})/6.

For the denominator,

P(X3 < 1) = ∫_0^1 e^{−x3} dx3 = 1 − e^{−1}.

Hence,

P(X1 < X2 < X3|X3 < 1) = (1 − 3e^{−1} + 3e^{−2} − e^{−3})/(6(1 − e^{−1})) ≈ 0.0666.

2.7 Transformations for Several Random Variables

Skipped: this section is just an extension of the two-variable case.

2.8 Linear Combinations of Random Variables


2.8.3. Let X1 and X2 be two independent random variables so that the variances of X1 and X2 are σ1² = k
and σ2² = 2, respectively. Given that the variance of Y = 3X2 − X1 is 25, find k.
Solution.

Var(Y) = 3²Var(X2) + Var(X1) (X1, X2 are independent)
= 9σ2² + σ1² = 18 + k.

Hence, Var(Y) = 25 ⇒ k = 7.
2.8.6. Determine the mean and variance of the sample mean X̄ = 5^{−1} Σ_{i=1}^{5} Xi, where X1, ..., X5 is a random
sample from a distribution having pdf f(x) = 4x³, 0 < x < 1, zero elsewhere.
Solution.

E(X) = ∫_0^1 x(4x³) dx = 4/5, E(X²) = ∫_0^1 x²(4x³) dx = 2/3 ⇒ Var(X) = 2/3 − 16/25 = 2/75.

Hence,

E(X̄) = E(X) = 4/5 = 0.8, Var(X̄) = Var(X)/5 = 2/375 ≈ 0.00533.

2.8.7. Let X and Y be random variables with μ1 = 1, μ2 = 4, σ1² = 4, σ2² = 6, ρ = 1/2. Find the mean and
variance of the random variable Z = 3X − 2Y.
Solution.

E(Z) = 3E(X) − 2E(Y) = 3μ1 − 2μ2 = −5,

Var(Z) = 3²Var(X) + 2²Var(Y) − 2(3)(2)Cov(X, Y)
= 9σ1² + 4σ2² − 12ρσ1σ2
= 60 − 12√6 ≈ 30.6.

2.8.8. Let X and Y be independent random variables with means μ1, μ2 and variances σ1², σ2². Determine
the correlation coefficient of X and Z = X − Y in terms of μ1, μ2, σ1², σ2².
Solution.
Since X and Y are independent,

Var(Z) = Var(X) + Var(Y) = σ1² + σ2²,

Cov(X, Z) = Cov(X, X − Y) = Var(X) − Cov(X, Y) = σ1².

Hence, the correlation coefficient is

ρ = Cov(X, Z)/√(Var(X)Var(Z)) = σ1²/√(σ1²(σ1² + σ2²)) = σ1/√(σ1² + σ2²).

2.8.10. Determine the correlation coefficient of the random variables X and Y if var(X) = 4, var(Y ) = 2,
and var(X + 2Y ) = 15.
Solution.

15 = Var(X + 2Y) = Var(X) + 4Var(Y) + 4Cov(X, Y) = 4 + 4(2) + 4ρ(2)(√2) = 12 + 8√2 ρ.

Hence, ρ = 3/(8√2) ≈ 0.265.
2.8.11. Let X and Y be random variables with means μ1, μ2; variances σ1², σ2²; and correlation coefficient
ρ. Show that the correlation coefficient of W = aX + b, a > 0, and Z = cY + d, c > 0, is ρ.
Solution.

Var(W) = a²Var(X) = a²σ1², Var(Z) = c²Var(Y) = c²σ2², Cov(W, Z) = ac Cov(X, Y) = acρσ1σ2.

Hence, Corr(W, Z) = Cov(W, Z)/√(Var(W)Var(Z)) = acρσ1σ2/(ac σ1σ2) = ρ because a > 0 and c > 0.

2.8.13. Let X1 and X2 be independent random variables with nonzero variances. Find the correlation
coefficient of Y = X1 X2 and X1 in terms of the means and variances of X1 and X2 .
Solution.
Let μ1, μ2 and σ1², σ2² denote the means and the variances of X1 and X2, respectively. Since the two random
variables are independent,

Var(Y) = Var(X1X2)
= E(X1²X2²) − [E(X1X2)]²
= E(X1²)E(X2²) − [E(X1)]²[E(X2)]²
= (μ1² + σ1²)(μ2² + σ2²) − μ1²μ2²
= μ1²σ2² + σ1²μ2² + σ1²σ2²,

Cov(Y, X1) = Cov(X1X2, X1)
= E(X1²X2) − E(X1X2)E(X1)
= E(X1²)E(X2) − [E(X1)]²E(X2)
= (μ1² + σ1²)μ2 − μ1²μ2
= σ1²μ2.

Hence,

ρ = Cov(Y, X1)/√(Var(Y)Var(X1)) = σ1²μ2/(σ1√(μ1²σ2² + σ1²μ2² + σ1²σ2²)) = σ1μ2/√(μ1²σ2² + σ1²μ2² + σ1²σ2²).
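The formula only uses the first two moments, so it can be checked by simulation with any convenient independent distributions; the normals below are an illustrative choice, not part of the original solution. With μ1 = 1, σ1 = 1, μ2 = 2, σ2 = 2, the formula gives ρ = 2/√12 ≈ 0.577.

```python
import math
import random

# Monte Carlo check of Corr(X1 X2, X1) for independent X1 ~ N(1, 1)
# and X2 ~ N(2, 4); the normal distributions are assumptions for the
# illustration only. Not part of the textbook solution.
random.seed(5)
N = 200_000
xs, ys = [], []
for _ in range(N):
    x1 = random.gauss(1, 1)   # mu1 = 1, sigma1 = 1
    x2 = random.gauss(2, 2)   # mu2 = 2, sigma2 = 2
    xs.append(x1)
    ys.append(x1 * x2)

mx = sum(xs) / N
my = sum(ys) / N
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / N
vx = sum((a - mx) ** 2 for a in xs) / N
vy = sum((b - my) ** 2 for b in ys) / N
rho_mc = cov / math.sqrt(vx * vy)
rho_exact = 1 * 2 / math.sqrt(1 * 4 + 1 * 4 + 1 * 4)  # sigma1*mu2 / sqrt(...)
```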

2.8.15. Let X1 , X2 , and X3 be random variables with equal variances but with correlation coefficients
ρ12 = 0.3, ρ13 = 0.5, and ρ23 = 0.2. Find the correlation coefficient of the linear functions Y = X1 + X2 and
Z = X2 + X3 .
Solution.
Let σ² denote the common variance of X1, X2, and X3. Then

Var(Y) = Var(X1) + Var(X2) + 2Cov(X1, X2) = 2σ²(1 + ρ12) = 2.6σ²,

Var(Z) = Var(X2) + Var(X3) + 2Cov(X2, X3) = 2σ²(1 + ρ23) = 2.4σ²,

Cov(Y, Z) = Cov(X1 + X2, X2 + X3) = σ²(ρ12 + ρ13 + 1 + ρ23) = 2σ².

Therefore, the correlation coefficient, ρ, is

ρ = Cov(Y, Z)/√(Var(Y)Var(Z)) = 2σ²/√(2.6(2.4)σ⁴) = 2/√6.24 ≈ 0.801.

2.8.17. Let X and Y have the parameters μ1, μ2, σ1², σ2², and ρ. Show that the correlation coefficient of X
and [Y − ρ(σ2/σ1)X] is zero.
Solution.

Cov(X, Y − ρ(σ2 /σ1 )X) = Cov(X, Y ) − ρ(σ2 /σ1 )Var(X) = ρσ1 σ2 − ρ(σ2 /σ1 )σ12 = 0.
