Conditioning A Random Variable On An Event
ECE 3077 Notes by M. Davenport, J. Romberg and C. Rozell. Last updated 0:01, June 24, 2014
Suppose we observe the light bulb at time t0 and it has burned out.
What is the conditional pdf for T?
Here, the event A is
A = {T ≤ t0},
and so
fT|A(t) = fT(t)/P(A) = λe^{−λt}/(1 − e^{−λt0}) for 0 ≤ t ≤ t0, and 0 otherwise.
Now suppose that we observe the light bulb at time t0 and it has not
burned out yet. What is the conditional pdf for T conditioned on
A = {T ≥ t0}?
fT|A(t) = λe^{−λt}/e^{−λt0} = λe^{−λ(t−t0)} for t ≥ t0, and 0 otherwise.

In other words, given that the bulb has survived to time t0, the remaining
lifetime T − t0 has exactly the same exponential distribution as a fresh
bulb: the exponential distribution is memoryless.
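The memoryless property above can be checked with a quick simulation. The following is a sketch with assumed values λ = 1 and t0 = 2 (these specific numbers are my choice, not part of the example):

```python
import random

# Memorylessness check: conditioned on T >= t0, the remaining lifetime
# T - t0 should again be exponential with rate lam.
# Assumed values: lam = 1, t0 = 2.
random.seed(7)
lam, t0 = 1.0, 2.0
remaining = [t - t0
             for t in (random.expovariate(lam) for _ in range(500_000))
             if t >= t0]
# Compare the conditional mean of T - t0 to the unconditional mean 1/lam.
cond_mean = sum(remaining) / len(remaining)
print(cond_mean)  # close to 1/lam = 1.0
```

About e^{−λt0} ≈ 13.5% of the samples survive past t0, and their shifted mean matches 1/λ, consistent with the pdf derived above.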
Total probability theorem for pdfs
If A1, . . . , An are events that partition the sample space Ω, meaning that
Ai ∩ Aj = ∅ for i ≠ j and A1 ∪ · · · ∪ An = Ω, then we can break apart
the pdf fX(x) for a random variable X as

fX(x) = Σ_{i=1}^{n} P(Ai) fX|Ai(x).
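This theorem can be sanity-checked numerically. The sketch below uses an assumed two-component model (component probabilities 0.3 and 0.7, exponential rates 2 and 1/2 — all my choices for illustration):

```python
import math
import random

# Toy model: a part comes from machine A1 (prob 0.3) with lifetime
# X ~ Exp(rate 2), or machine A2 (prob 0.7) with X ~ Exp(rate 1/2).
p = [0.3, 0.7]
rates = [2.0, 0.5]

def f_mixture(x):
    # Total probability theorem: f_X(x) = sum_i P(A_i) f_{X|A_i}(x)
    return sum(pi * r * math.exp(-r * x) for pi, r in zip(p, rates))

# Sample A_i first, then X given A_i, and compare the empirical density
# in a small bin to f_X at the bin center.
random.seed(0)
n, x0, h = 200_000, 1.0, 0.05
count = 0
for _ in range(n):
    rate = rates[0] if random.random() < p[0] else rates[1]
    if x0 <= random.expovariate(rate) < x0 + h:
        count += 1
empirical = count / (n * h)
print(empirical, f_mixture(x0 + h / 2))  # the two numbers should be close
```

The two-stage sampling (pick Ai, then draw X from fX|Ai) produces samples whose density is exactly the mixture on the right-hand side of the theorem.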
Exercise:
Suppose that Sublime Doughnuts makes a fresh batch once every
hour starting at 6am. You enter the store between 8:30am and
10:15am, with your arrival time being a uniform random variable
over this interval. What is the pdf for how old the doughnuts are?
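One way to check your answer to this exercise is a quick Monte Carlo sketch of the arrival process (the simulation setup below, measuring time in minutes after 6am, is my own framing, not part of the exercise):

```python
import random

# Doughnut-age simulation: batches are baked on the hour starting at 6am,
# so a batch appears at minutes 0, 60, 120, ... after 6am. The arrival
# time is uniform on [8:30, 10:15], i.e. uniform on [150, 255] minutes.
random.seed(6)
ages = []
for _ in range(100_000):
    arrival = random.uniform(150, 255)  # minutes after 6am
    ages.append(arrival % 60)           # minutes since the last batch
avg_age = sum(ages) / len(ages)
print(avg_age)  # sample mean of the doughnut age, in minutes
```

A histogram of `ages` should match the pdf you derive with the total probability theorem (condition on which sub-interval of [8:30, 10:15] you arrive in).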
Conditioning one random variable on another
Let X, Y be continuous random variables with joint pdf fX,Y (x, y).
For any y with fY (y) > 0, we can define the conditional pdf:
fX|Y(x|y) = fX,Y(x,y)/fY(y).
The conditional pdf is a valid pdf which reflects how our knowledge
of X changes given a certain observation Y = y. For any fixed value
of y, this is just a function of x, but it can be a different function
for different values of y.
Example. (Figure: a pdf plotted over x = 1, . . . , 4, and the resulting
conditional pdf, which takes the value 1/2.)
Example. Uniform pdf on a disc
You throw a dart at a circular target of radius r. We will assume
you always hit the target, and each point of impact (x, y) is equally
likely, so that
fX,Y(x,y) = 1/(πr²) if x² + y² ≤ r², and 0 otherwise.
Now we have

fX|Y(x|y) = fX,Y(x,y)/fY(y),

or equivalently

fX,Y(x,y) = fX|Y(x|y) fY(y).
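The conditional pdf on the disc can be checked by simulation: conditioned on Y = y, X should be uniform on [−w, w] with w = √(r² − y²). Here is a sketch with assumed values r = 1 and a conditioning slice near y0 = 0.5:

```python
import math
import random

# Sample uniformly on the disc by rejection from the bounding square,
# keeping only points whose Y falls in a thin band around y0. The
# surviving X values approximate the conditional distribution X | Y = y0.
random.seed(1)
r, y0, band = 1.0, 0.5, 0.01
xs = []
while len(xs) < 5_000:
    x, y = random.uniform(-r, r), random.uniform(-r, r)
    if x * x + y * y <= r * r and abs(y - y0) < band:
        xs.append(x)
w = math.sqrt(r * r - y0 * y0)
mean = sum(xs) / len(xs)
var = sum(x * x for x in xs) / len(xs) - mean ** 2
print(mean, var, w * w / 3)  # uniform on [-w, w]: mean 0, variance w^2/3
```

The empirical mean near 0 and variance near w²/3 are the signatures of a uniform conditional distribution on the chord through height y0.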
Sometimes, it is more natural to build up a joint model using this
factorization, as the next example illustrates.
Exercise:
The speed of a typical vehicle on I-285 can be modeled as an exponentially
distributed random variable X with mean 65 miles per hour. Suppose that
we (or a police officer) measure the speed Y of a randomly chosen vehicle
using a radar gun, but our measurement has an error which is modeled as a
normal random variable with zero mean and standard deviation equal to one
tenth of the vehicle’s speed. What is the joint pdf of X and Y?
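The factorization fX,Y(x,y) = fX(x) fY|X(y|x) also says how to simulate from this model: draw X first, then draw Y given X. A sketch (the function name and the trial count are my choices):

```python
import random

# Two-stage simulation of the radar model: X ~ exponential with mean 65,
# and given X = x, Y ~ Normal(x, (x/10)^2).
random.seed(2)

def sample_speed_and_reading():
    x = random.expovariate(1 / 65)  # true speed, mean 65
    y = random.gauss(x, x / 10)     # radar reading given the true speed
    return x, y

pairs = [sample_speed_and_reading() for _ in range(100_000)]
avg_x = sum(x for x, _ in pairs) / len(pairs)
avg_y = sum(y for _, y in pairs) / len(pairs)
print(avg_x, avg_y)  # both should be near 65
```

Note that the simulation never needs the joint pdf explicitly; the conditional factorization is enough, which is exactly the point of the passage above.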
Conditional expectation
Once we have the conditional density defined, the definition of conditional
expectation is straightforward.
• If A is an event with P(A) > 0, then

E[X|A] = ∫_{−∞}^{∞} x fX|A(x) dx.
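As a concrete check of this definition, we can integrate numerically against the conditional pdf from the light bulb example, with assumed values λ = 2 and t0 = 1.5 (memorylessness suggests the answer t0 + 1/λ):

```python
import math

# Conditional pdf from the light bulb example: given A = {T >= t0},
# f_{T|A}(t) = lam * exp(-lam * (t - t0)) for t >= t0, else 0.
lam, t0 = 2.0, 1.5

def f_cond(t):
    return lam * math.exp(-lam * (t - t0)) if t >= t0 else 0.0

# Simple Riemann sum for E[T | A] = integral of t * f_{T|A}(t) dt,
# truncated far enough out that the tail is negligible.
dt = 1e-4
expectation = sum(t * f_cond(t) * dt
                  for t in (t0 + i * dt for i in range(int(30 / dt))))
print(expectation)  # close to t0 + 1/lam = 2.0
```
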
• The definition of conditional expectation and the total expectation
theorem extend to arbitrary functions g(X, Y) of the random variables
X, Y as well:

E[g(X, Y)|Y = y] = ∫ g(x, y) fX|Y(x|y) dx

and

E[g(X, Y)] = ∫ E[g(X, Y)|Y = y] fY(y) dy,

or equivalently

E[g(X, Y)] = ∫ E[g(X, Y)|X = x] fX(x) dx.
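Both routes to E[g(X, Y)] can be compared on a toy model (my assumption, chosen so the conditional expectation is easy): Y ~ Uniform(0, 1), X given Y = y is Uniform(0, y), and g(X, Y) = X + Y.

```python
import random

# Route 1: direct Monte Carlo estimate of E[X + Y].
random.seed(3)
n = 200_000
total = 0.0
for _ in range(n):
    y = random.random()
    x = random.uniform(0, y)
    total += x + y
direct = total / n

# Route 2: the conditional formula. E[X + Y | Y = y] = y/2 + y = 3y/2,
# then average over f_Y (uniform on [0, 1]) with a Riemann sum.
dy = 1e-4
via_conditioning = sum((3 * y / 2) * dy
                       for y in (i * dy for i in range(int(1 / dy))))
print(direct, via_conditioning)  # both close to 3/4
```

The agreement of the two numbers is exactly the total expectation theorem in action.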
Exercise:
Suppose that the random variable X has the piecewise constant pdf
fX(x) = 2/3 for 0 ≤ x ≤ 1, 1/3 for 1 < x ≤ 2, and 0 otherwise.
Iterated expectation
Here is an identity which seems a little weird at first, but is actually
very useful:
E[X] = E[E[X|Y ]]
This is called the law of iterated expectation, or double expectation.
Don’t worry if that expression looks confusing the first time you see it;
everybody thinks that. Hopefully the explanation below will help you make
some sense of it.
Let’s see where the law of iterated expectation comes from. By now, we
are comfortable with the notion of conditional expectation; if X and Y
are related random variables, then observing Y = y may change the
distribution, and hence the expectation, of X. Thus E[X|Y = y] is a
number that depends on y. If we view it as a function of the random
variable Y (before Y is observed), we get a new random variable, written
E[X|Y ]. Taking its expectation averages E[X|Y = y] against fY(y), which
by the total expectation theorem gives back E[X].
Example. Suppose that a coin is potentially biased and that the
probability of heads, denoted by P is itself random with a uniform
distribution over [0, 1]. We toss the coin n times, and let X be the
number of heads obtained. Then for any fixed value of P = p,

E[X|P = p] = np,

and so, viewed as a random variable,

E[X|P ] = nP.

By the law of iterated expectation,

E[X] = E[E[X|P ]] = E[nP ] = n E[P ] = n/2.
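A short simulation agrees with this calculation (n = 20 and the trial count are arbitrary choices of mine):

```python
import random

# Biased-coin example: P ~ Uniform[0, 1], then X counts heads in n tosses
# of a coin with bias P. Iterated expectation predicts E[X] = n/2.
random.seed(4)
n, trials = 20, 50_000
total_heads = 0
for _ in range(trials):
    p = random.random()  # draw the random bias P
    total_heads += sum(random.random() < p for _ in range(n))  # X | P = p
print(total_heads / trials)  # close to n/2 = 10
```
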
Exercise:
You are holding a stick of length ℓ. You choose a point uniformly at
random along the length of the stick and break it, keeping the piece in
your left hand. You then repeat this process, breaking the (now smaller)
stick at a randomly chosen location and again keeping the piece in your
left hand.
What is the expected length of the piece you are left with after breaking
twice?
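After working the exercise by hand (iterated expectation makes it quick), you can check your answer against a simulation sketch like this one, with the assumed value ℓ = 1:

```python
import random

# Stick-breaking simulation: each break keeps a Uniform(0, 1) fraction of
# the current stick, so the final length is length * U1 * U2.
random.seed(5)
length = 1.0
trials = 200_000
total = 0.0
for _ in range(trials):
    total += length * random.random() * random.random()
print(total / trials)  # empirical expected length after two breaks
```
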