Fall 2009
Solutions to Exam 1
1. Convergence. In each of the following four parts, you are asked a question about the convergence
of a sequence of random variables. If you say yes, provide a proof and the limiting random variable.
If you say no, disprove or provide a counterexample.
(a) Let $A_1, A_2, \ldots$ be a sequence of independent events such that $P(A_n) \to 1$ as $n \to \infty$. Now define a sequence of random variables $X_n = \mathbb{1}_{A_n}$, $n = 1, 2, \ldots$. Does $X_n$ converge in probability as $n \to \infty$?

Ans: We can guess that $X_n \xrightarrow{p.} 1$. To prove this, consider $P\{|X_n - 1| \geq \epsilon\}$. Clearly $P\{|X_n - 1| \geq \epsilon\} = 0$ for all $n$ if $\epsilon > 1$, since $|X_n - 1|$ cannot exceed 1. Thus it remains to show that this probability converges to 0 for $0 < \epsilon \leq 1$. For $0 < \epsilon \leq 1$,
$$P\{|X_n - 1| \geq \epsilon\} = P(A_n^c) = 1 - P(A_n) \to 0$$
as $n \to \infty$.
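As an added numerical illustration (not part of the original solution), the sketch below assumes the specific example $P(A_n) = 1 - 1/n$, which satisfies $P(A_n) \to 1$, and estimates $P\{|X_n - 1| \geq \epsilon\}$ by Monte Carlo; the probability sequence, trial count, and the helper name estimate_tail are illustrative choices, not part of the exam.

    # Monte Carlo sketch (illustrative): X_n = 1{A_n} with the assumed choice P(A_n) = 1 - 1/n.
    import random

    def estimate_tail(n, eps=0.5, trials=100_000):
        # Estimate P{|X_n - 1| >= eps}, where X_n = 1 with probability 1 - 1/n and 0 otherwise.
        p_an = 1.0 - 1.0 / n
        exceed = sum(1 for _ in range(trials)
                     if abs((1 if random.random() < p_an else 0) - 1) >= eps)
        return exceed / trials

    for n in (10, 100, 1000):
        # The estimates shrink roughly like P(A_n^c) = 1/n, matching the proof above.
        print(n, estimate_tail(n))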
(b) Suppose $X_n \xrightarrow{m.s.} X$ as $n \to \infty$ and $E[X_n^4] < \infty$ for all $n$. Does $X_n^2$ necessarily converge in mean square as $n \to \infty$?

Ans: No. Consider $\Omega = [0, 1]$ with the uniform probability measure, and let $X_n = n\,\mathbb{1}_{[0, 1/n^4]}$. Then $E[X_n^4] = 1 < \infty$ for all $n$, and $X_n \xrightarrow{m.s.} X$ with $X = 0$ a.s., but
$$E[X_n^2 X_{n-1}^2] = n^2 (n-1)^2 / n^4 \to 1 \neq E[X^2 \cdot X^2] = 0.$$
Thus, by the Cauchy criterion, $X_n^2$ does not converge in the m.s. sense.
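The moments used in this counterexample are simple enough to evaluate exactly; the sketch below (an added illustration, not part of the graded solution) tabulates $E[X_n^4]$ and $E[X_n^2 X_{n-1}^2]$ from the closed-form expression $E[X_n^2 X_m^2] = n^2 m^2 \min(n^{-4}, m^{-4})$.

    def cross_moment(n, m):
        # E[X_n^2 X_m^2] for X_k = k * 1{[0, 1/k^4]} on ([0, 1], uniform):
        # the product equals n^2 m^2 on the overlap [0, min(1/n^4, 1/m^4)] and 0 elsewhere.
        return (n ** 2) * (m ** 2) * min(n ** -4, m ** -4)

    for n in (10, 100, 1000):
        # E[X_n^4] = 1 and E[X_n^2 X_{n-1}^2] -> 1, yet the m.s. limit of X_n is 0,
        # so E[X_n^2 X_m^2] cannot converge to E[X^2 X^2] = 0: no m.s. convergence of X_n^2.
        print(n, cross_moment(n, n), cross_moment(n, n - 1))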
2. Let $X_1, X_2, \ldots$ be i.i.d. random variables with $P\{X_n = 0\} = \frac{3}{4}$ and $P\{X_n = 1\} = \frac{1}{4}$, and let $S_n = \sum_{i=1}^{n} X_i$.
(b) Use the Central Limit Theorem to find an approximation for $P\{S_{100} \geq 50\}$ in terms of the $Q(\cdot)$ function.
Ans: $\mu = E[X_n] = \frac{1}{4}$ and $\sigma^2 = \mathrm{Var}(X_n) = E[X_n^2] - \mu^2 = \frac{1}{4} - \frac{1}{16} = \frac{3}{16}$. Thus, by the Central Limit Theorem, $(S_{100} - 100\mu)/(10\sigma)$ is approximately $N(0, 1)$. Therefore,
$$P\{S_{100} \geq 50\} = P\left\{\frac{S_{100} - 100\mu}{10\sigma} \geq \frac{50 - 100\mu}{10\sigma}\right\} \approx Q\left(\frac{50 - 100\mu}{10\sigma}\right) = Q\left(\frac{25}{10\sqrt{3}/4}\right) = Q\left(\frac{10}{\sqrt{3}}\right).$$
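As a sanity check on the approximation (an added illustration, not part of the exam solution), the sketch below compares $Q(10/\sqrt{3})$ with the exact tail of $S_{100} \sim \mathrm{Binomial}(100, 1/4)$ using only the Python standard library.

    import math

    def q(x):
        # Standard normal tail Q(x) = P{N(0,1) >= x}, via the complementary error function.
        return 0.5 * math.erfc(x / math.sqrt(2.0))

    # Exact tail of S_100 ~ Binomial(100, 1/4).
    exact = sum(math.comb(100, k) * 0.25 ** k * 0.75 ** (100 - k) for k in range(50, 101))

    print("CLT approximation Q(10/sqrt(3)):", q(10.0 / math.sqrt(3.0)))
    print("Exact binomial tail P{S_100 >= 50}:", exact)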
(c) Use the Chernoff bound to show that $P\{S_{100} \geq 50\} \leq \left(\frac{3}{4}\right)^{50}$.

Ans: By the Chernoff bound,
$$P\{S_{100} \geq 50\} = P\left\{\frac{S_{100}}{100} \geq \frac{1}{2}\right\} \leq e^{-100\,\ell(0.5)},$$
where $\ell(a) = \max_{\theta} \left[\theta a - \ln E[e^{\theta X_1}]\right]$. With $E[e^{\theta X_1}] = \frac{3}{4} + \frac{1}{4} e^{\theta}$, the maximizing $\theta$ satisfies
$$\frac{e^{\theta}}{3 + e^{\theta}} = 0.5 \quad \Longrightarrow \quad \theta = \ln 3.$$
Thus $\ell(0.5) = 0.5 \ln 3 - \ln(3/2) = 0.5 \ln 4 - 0.5 \ln 3$, and the upper bound follows, since $e^{-100\,\ell(0.5)} = e^{-50 \ln(4/3)} = \left(\frac{3}{4}\right)^{50}$.
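An added numeric check of the bound (not part of the solution): the sketch below maximizes $\theta a - \ln E[e^{\theta X_1}]$ over a grid of $\theta$ values (the grid is an arbitrary illustrative choice) and confirms that $e^{-100\,\ell(0.5)}$ matches $(3/4)^{50}$.

    import math

    def psi(theta):
        # Log moment generating function of X_1 ~ Bernoulli(1/4).
        return math.log(0.75 + 0.25 * math.exp(theta))

    # Grid search for l(0.5) = max_theta [0.5*theta - psi(theta)]; the maximum is at theta = ln 3.
    ell = max(0.5 * t - psi(t) for t in (i * 0.001 for i in range(5000)))

    print("l(0.5) from grid search:", ell)
    print("closed form 0.5 ln 4 - 0.5 ln 3:", 0.5 * math.log(4.0) - 0.5 * math.log(3.0))
    print("Chernoff bound exp(-100 l(0.5)):", math.exp(-100.0 * ell))
    print("(3/4)^50:", 0.75 ** 50)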
3. Let $X$ and $Y$ have joint pdf $f_{X,Y}(x, y) = 6x\,\mathbb{1}_{\{x \geq 0,\; y \geq 0,\; x + y \leq 1\}}$.

(a) Find $E[X|Y]$.

Ans: For $0 \leq y \leq 1$,
$$f_Y(y) = \int_0^{1-y} 6x\, dx = 3(1-y)^2,$$
and $f_Y(y) = 0$ otherwise. Thus, for $0 \leq y \leq 1$,
$$f_{X|Y}(x|y) = \frac{f_{X,Y}(x, y)}{f_Y(y)} = \frac{2x}{(1-y)^2}\,\mathbb{1}_{\{0 \leq x \leq 1-y\}}.$$
Therefore, for $0 \leq y \leq 1$,
$$E[X|Y = y] = \int_0^{1-y} x f_{X|Y}(x|y)\, dx = \frac{2}{(1-y)^2} \cdot \frac{(1-y)^3}{3} = \frac{2}{3}(1-y),$$
and $E[X|Y] = \frac{2}{3}(1 - Y)$.
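To double-check the conditional mean (an added illustration, not part of the graded solution), the sketch below numerically integrates $x\, f_{X|Y}(x|y)$ over $[0, 1-y]$ for a few values of $y$ and compares the result with $\frac{2}{3}(1-y)$; the step count is an arbitrary choice.

    def cond_mean_numeric(y, steps=100_000):
        # Midpoint-rule integral of x * f_{X|Y}(x|y) = 2x^2/(1-y)^2 over [0, 1-y].
        h = (1.0 - y) / steps
        total = 0.0
        for i in range(steps):
            x = (i + 0.5) * h
            total += 2.0 * x * x / (1.0 - y) ** 2 * h
        return total

    for y in (0.0, 0.25, 0.5, 0.9):
        # Numerical value vs. the closed form 2(1-y)/3 derived above.
        print(y, cond_mean_numeric(y), 2.0 * (1.0 - y) / 3.0)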
(b) Find the MSE achieved by $E[X|Y]$, i.e., find the minimum MSE.
Ans: It is easy to see that $f_X(x) = 6x(1-x)\,\mathbb{1}_{\{0 \leq x \leq 1\}}$. Thus, the minimum MSE is given by
$$E[X^2] - E\!\left[(E[X|Y])^2\right] = \int_0^1 6x^3(1-x)\, dx - \frac{4}{9}\int_0^1 3(1-y)^4\, dy = \frac{3}{10} - \frac{4}{15} = \frac{1}{30}.$$
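The value $\frac{1}{30}$ can also be checked by Monte Carlo (again purely illustrative and not part of the solution): sample $(X, Y)$ from $f_{X,Y}(x, y) = 6x$ on the triangle by rejection and average the squared error of the estimate $\frac{2}{3}(1 - Y)$; the proposal distribution and trial count below are illustrative choices.

    import random

    def sample_xy():
        # Rejection sampling from f_{X,Y}(x, y) = 6x on {x >= 0, y >= 0, x + y <= 1},
        # using a uniform proposal on the unit square and the bound f <= 6.
        while True:
            x, y, u = random.random(), random.random(), random.random()
            if x + y <= 1.0 and u <= x:   # accept with probability f(x, y) / 6 = x on the triangle
                return x, y

    trials = 200_000
    err = 0.0
    for _ in range(trials):
        x, y = sample_xy()
        err += (x - 2.0 * (1.0 - y) / 3.0) ** 2   # squared error of E[X | Y]

    print("Monte Carlo MMSE:", err / trials, " exact value:", 1.0 / 30.0)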
4. (14 pts) Suppose $X, Y_1, Y_2$ are zero-mean jointly Gaussian with covariance matrix
$$\mathrm{Cov}\begin{pmatrix} X \\ Y_1 \\ Y_2 \end{pmatrix} = \begin{pmatrix} 4 & -1 & -1 \\ -1 & 1 & 0 \\ -1 & 0 & 1 \end{pmatrix}.$$

(a) Find $P\{Y_1 + Y_2 - X \geq 10\}$ in terms of the $Q(\cdot)$ function.

Ans: Let $W = Y_1 + Y_2 - X$. Then $W$ is Gaussian with $E[W] = 0$ and
$$\mathrm{Var}(W) = E[W^2] = E[Y_1^2] + E[Y_2^2] + E[X^2] + 2E[Y_1 Y_2] - 2E[X Y_1] - 2E[X Y_2] = 1 + 1 + 4 + 0 + 2 + 2 = 10.$$
Therefore $P\{Y_1 + Y_2 - X \geq 10\} = P\{W \geq 10\} = Q\left(\frac{10}{\sqrt{10}}\right) = Q(\sqrt{10})$.
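A quick linear-algebra check of $\mathrm{Var}(W)$ (an added illustration, assuming the covariance matrix above): with $W = c^{\mathsf T}(X, Y_1, Y_2)^{\mathsf T}$ and $c = (-1, 1, 1)^{\mathsf T}$, $\mathrm{Var}(W) = c^{\mathsf T} \Sigma c$.

    import math

    # Covariance matrix of (X, Y1, Y2) and the coefficient vector of W = Y1 + Y2 - X.
    sigma = [[4, -1, -1],
             [-1, 1, 0],
             [-1, 0, 1]]
    c = [-1, 1, 1]

    # Var(W) = c' Sigma c.
    var_w = sum(c[i] * sigma[i][j] * c[j] for i in range(3) for j in range(3))

    def q(x):
        # Standard normal tail Q(x).
        return 0.5 * math.erfc(x / math.sqrt(2.0))

    print("Var(W) =", var_w)                                     # expected: 10
    print("P{W >= 10} = Q(10/sqrt(10)) =", q(10.0 / math.sqrt(var_w)))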
(b) Find $P(\{X \geq 2\} \mid \{Y_1 + Y_2 = 0\})$ in terms of the $Q(\cdot)$ function.

Ans: Let $V = E[X \mid Y_1 + Y_2] = \frac{\mathrm{Cov}(X, Y_1 + Y_2)}{\mathrm{Var}(Y_1 + Y_2)}(Y_1 + Y_2) = -(Y_1 + Y_2)$. Since the conditional variance is $\mathrm{Var}(X) - \frac{\mathrm{Cov}(X, Y_1 + Y_2)^2}{\mathrm{Var}(Y_1 + Y_2)} = 4 - \frac{4}{2} = 2$, we have $f_{X|V}(x|v) \sim N(v, 2)$. Thus
$$P(\{X \geq 2\} \mid \{Y_1 + Y_2 = 0\}) = P(\{X \geq 2\} \mid \{V = 0\}) = Q\left(\frac{2}{\sqrt{2}}\right) = Q(\sqrt{2}).$$
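Similarly (an added sketch, not part of the solution), the conditional mean and variance of $X$ given $Y_1 + Y_2$ follow from the standard Gaussian conditioning formulas and reproduce $Q(\sqrt{2})$.

    import math

    var_x = 4.0
    cov_xs = (-1.0) + (-1.0)         # Cov(X, Y1 + Y2)
    var_s = 1.0 + 1.0 + 2.0 * 0.0    # Var(Y1 + Y2)

    cond_mean = (cov_xs / var_s) * 0.0        # E[X | Y1 + Y2 = 0]
    cond_var = var_x - cov_xs ** 2 / var_s    # Var(X | Y1 + Y2) = 2

    def q(x):
        # Standard normal tail Q(x).
        return 0.5 * math.erfc(x / math.sqrt(2.0))

    # P{X >= 2 | Y1 + Y2 = 0} = Q((2 - 0)/sqrt(2)) = Q(sqrt(2)).
    print(q((2.0 - cond_mean) / math.sqrt(cond_var)))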
$E[X|Z] = E[X] = 0$.
© V. V. Veeravalli, 2009