
Slide set 5: Random variables II

∙ Continuous random variables: pdf


∙ Popular continuous r.v.s
∙ Mean and variance of continuous r.v.s
∙ General random variables: cdf
∙ Functions of a random variable
∙ Summary of random variables I, II

© Copyright  Abbas El Gamal


Continuous random variables

∙ A continuous r.v. takes a continuum of values each with probability zero


∙ Examples:
▸ A random number between 0 and 1
▸ Voltage across a hot resistor; current through a photodetector
▸ Phase of a sinusoidal power source
▸ Dow Jones Industrial Average, . . .

∙ How do we find the probability of an event involving a continuous r.v.?


For discrete r.v., sum the pmf over points in the event to find its probability
For continuous r.v., integrate probability density over event to find its probability
Analogous to mass density in physics (integrate mass density to get the mass)

 / 
Probability density function (pdf)

∙ A continuous r.v. X is specified by a probability density function (pdf),
  which is a function fX(x) such that the probability of any event A is

      P{X ∈ A} = ∫_A fX(x) dx

∙ For example, for A = (a, b],

      P{X ∈ (a, b]} = ∫_a^b fX(x) dx,

  the area under fX(x) between a and b

∙ Properties of fX(x):
  1. fX(x) ≥ 0
  2. ∫_{−∞}^{∞} fX(x) dx = 1

∙ Notation: X ∼ fX(x) means that X has pdf fX(x)

∙ Warning: fX(x) is not a probability measure; it can be > 1
∙ For small Δx, P{X ∈ (x, x + Δx]} ≈ fX(x) Δx, so fX(x) measures probability per unit length near x
Popular continuous r.v.s

∙ Uniform: X ∼ Unif[a, b] for b > a has the pdf

      fX(x) = 1/(b − a)   for a ≤ x ≤ b,
              0           otherwise

  (the pdf is flat at height 1/(b − a) over [a, b])

∙ Uniform r.v. models, for example,
  ▸ Finite precision computation error (roundoff error)
  ▸ Unif[0, 1] is a perfect random number generator
Popular continuous r.v.s

∙ Exponential: X ∼ Exp(λ) for λ > 0 has the pdf

      fX(x) = λ e^{−λx}   for x ≥ 0,
              0           otherwise

∙ Exponential r.v. is used to model, for example,
  ▸ Lifetime of a particle, a light bulb, . . .
  ▸ Service time in a queue
  ▸ Interarrival time in queues (time between consecutive packets, customer arrivals)
Example

∙ Let X ∼ Exp(λ) be the customer service time at a post office (in minutes).
  The person ahead of you has been served for over a minutes. What is
  the probability that you will wait over t more minutes before being served?

∙ We want to find P{X > a + t | X > a}
  By the definition of conditional probability,

      P{X > a + t | X > a} = P{X > a + t, X > a} / P{X > a}
                           = P{X > a + t} / P{X > a},    since {X > a + t} ⊆ {X > a}
                           = (∫_{a+t}^{∞} λ e^{−λx} dx) / (∫_a^{∞} λ e^{−λx} dx)
                           = e^{−λ(a+t)} / e^{−λa} = e^{−λt} = P{X > t}

  Hence, the probability of waiting > t more minutes given that the service has already
  lasted > a minutes is equal to the unconditional probability of waiting > t minutes!
Exponential is memoryless

∙ The above example illustrates the memoryless property of the exponential r.v.

∙ In general, a r.v. X ≥ 0 is memoryless if for every 0 < a < ξ,

      P{X > ξ | X > a} = P{X > ξ − a}

∙ Let's show that X ∼ Exp(λ) is memoryless:

      P{X > ξ | X > a} = P{X > ξ, X > a} / P{X > a}
                       = P{X > ξ} / P{X > a}
                       = (∫_ξ^{∞} λ e^{−λx} dx) / (∫_a^{∞} λ e^{−λx} dx)
                       = e^{−λξ} / e^{−λa}
                       = e^{−λ(ξ−a)} = P{X > ξ − a}

  (Graphically: the conditional tail P{X > x | X > a} = e^{−λ(x−a)} is the tail
  P{X > x} = e^{−λx} shifted to start at x = a)
Mean and variance of continuous r.v.s

∙ Consider a continuous r.v. X ∼ fX(x). The mean of X is defined as

      E(X) = ∫_{−∞}^{∞} x fX(x) dx

  This has the interpretation of the center of mass for a mass density

∙ The second moment (average power) is defined as

      E(X²) = ∫_{−∞}^{∞} x² fX(x) dx

∙ The variance is defined as

      Var(X) = ∫_{−∞}^{∞} (x − E(X))² fX(x) dx = E(X²) − (E(X))²
Mean and variance of continuous r.v.s

∙ Uniform: X ∼ Unif[a, b] has mean

      E(X) = ∫_a^b x/(b − a) dx = (a + b)/2

∙ The second moment is

      E(X²) = ∫_a^b x²/(b − a) dx = (b³ − a³)/(3(b − a))

∙ The variance is

      Var(X) = E(X²) − (E(X))² = (b − a)²/12

∙ In particular, for X ∼ Unif[0, 1],

      E(X) = 1/2,   E(X²) = 1/3,   Var(X) = 1/12
Mean and variance of continuous r.v.s

∙ Exponential: X ∼ Exp(λ) has mean

      E(X) = ∫_0^{∞} x λ e^{−λx} dx
           = ∫_0^{∞} x d(−e^{−λx})
           = (−x e^{−λx}) |_0^{∞} + ∫_0^{∞} e^{−λx} dx      (integration by parts; see Some math)
           = 0 − (1/λ) e^{−λx} |_0^{∞} = 1/λ

∙ The second moment and variance are

      E(X²) = 2/λ²,      Var(X) = E(X²) − (E(X))² = 1/λ²
Another very popular continuous r.v.

∙ Gaussian: X ∼ N(μ, σ²) has pdf

      fX(x) = (1/√(2πσ²)) e^{−(x − μ)²/(2σ²)}   for −∞ < x < ∞,

  where μ is the mean and σ² is the variance
  (the pdf is the familiar bell curve, centered at μ)
  [Johann Carl Friedrich Gauss (1777–1855)]

∙ Gaussian r.v.s are encountered frequently in nature, e.g.,
  thermal and shot noise in electronic devices are Gaussian.
  The Gaussian is also used in modeling various social, biological, and other phenomena
Summary of discrete and continuous r.v.s

∙ Discrete r.v. X specified by a pmf: pX(x) ≥ 0, Σ_{x∈𝒳} pX(x) = 1

∙ Continuous r.v. X specified by a pdf: fX(x) ≥ 0, ∫_{−∞}^{∞} fX(x) dx = 1

∙ Mean, second moment, and variance:
  ▸ For a discrete r.v.:

      E(X) = Σ_{x∈𝒳} x pX(x),
      E(X²) = Σ_{x∈𝒳} x² pX(x),
      Var(X) = Σ_{x∈𝒳} (x − E(X))² pX(x) = E(X²) − [E(X)]²

  ▸ For a continuous r.v.:

      E(X) = ∫_{−∞}^{∞} x fX(x) dx,
      E(X²) = ∫_{−∞}^{∞} x² fX(x) dx,
      Var(X) = ∫_{−∞}^{∞} (x − E(X))² fX(x) dx = E(X²) − [E(X)]²
Summary of popular r.v.s

  Random variable   pmf/pdf                                                    Mean        Variance

  Bern(p)           pX(1) = p, pX(0) = 1 − p                                   p           p(1 − p)
  Geom(p)           pX(k) = p(1 − p)^{k−1}, k = 1, 2, . . .                    1/p         (1 − p)/p²
  Binom(n, p)       pX(k) = (n choose k) p^k (1 − p)^{n−k}, k = 0, . . . , n   np          np(1 − p)
  Poisson(λ)        pX(k) = (λ^k/k!) e^{−λ}, k = 0, 1, . . .                   λ           λ
  Unif[a, b]        fX(x) = 1/(b − a), x ∈ [a, b]                              (a + b)/2   (b − a)²/12
  Exp(λ)            fX(x) = λ e^{−λx}, x ≥ 0                                   1/λ         1/λ²
  N(μ, σ²)          fX(x) = (1/√(2πσ²)) e^{−(x−μ)²/(2σ²)}                      μ           σ²
Mixed random variables

∙ Many real-world r.v.s are mixed, i.e., have discrete and continuous components
∙ Example: A packet arrives at a router in a communication network.
If the input buffer is empty (probability p), the packet is serviced immediately.
Otherwise the packet must wait for a random continuous amount of time
Define the r.v. X to be the packet service time
X is neither discrete nor continuous; how do we specify it?
∙ We can use the cumulative distribution function (cdf ) to specify any r.v.

 / 
Cumulative distribution function (cdf )

∙ The cumulative distribution function (cdf ) of a r.v. X is defined as


FX (x) = P{X ≤ x} for x ∈ (−∞, ∞)

∙ Like the pmf, the cdf is the probability of an event, hence 0 ≤ FX(x) ≤ 1
∙ The normalization axiom implies that

      FX(∞) = 1   and   FX(−∞) = 0

∙ FX(x) is monotonically nondecreasing, i.e., if b > a then FX(b) ≥ FX(a)
∙ The probability of any event can be computed from the cdf, e.g.,

      P{X ∈ (a, b]} = P{a < X ≤ b}
                    = P{X ≤ b} − P{X ≤ a}      (additivity)
                    = FX(b) − FX(a)

∙ The probability of any point a is P{X = a} = FX(a) − FX(a⁻)
Cumulative distribution function (cdf )

∙ If a r.v. is discrete, its cdf consists of a set of steps (one jump at each point
  of positive pmf)

∙ If a r.v. X is continuous with pdf fX(x), then its cdf is

      FX(x) = P{X ≤ x} = ∫_{−∞}^{x} fX(α) dα

  Hence, the cdf of a continuous r.v. X is continuous

∙ In fact, the precise way to define a continuous r.v. is that its cdf is continuous
Cumulative distribution function (cdf)

∙ Further, if FX(x) is differentiable (almost everywhere), then

      fX(x) = dFX(x)/dx = lim_{Δx→0} (FX(x + Δx) − FX(x))/Δx
            = lim_{Δx→0} P{x < X ≤ x + Δx}/Δx

∙ The cdf of a mixed r.v. has the general form of a nondecreasing function with jumps
  (the discrete part) and continuously increasing segments (the continuous part)
Cdf of popular continuous r.v.s

∙ Uniform: X ∼ Unif[a, b]

      FX(x) = 0                                      if x < a
              ∫_a^x 1/(b − a) dα = (x − a)/(b − a)   if a ≤ x ≤ b
              1                                      if x ≥ b

  (the pdf is flat at 1/(b − a) on [a, b]; the cdf ramps linearly from 0 at a to 1 at b)
Cdf of popular continuous r.v.s

∙ Exponential: X ∼ Exp(λ)

      FX(x) = 1 − e^{−λx}   for x ≥ 0,      FX(x) = 0   for x < 0

  (the pdf starts at height λ and decays; the cdf rises from 0 toward 1)
Cdf of popular continuous r.v.s

∙ Gaussian: X ∼ N(μ, σ²)
  There is no closed form for the cdf; it is found by numerical integration
∙ As we will soon see, we only need to compute the cdf of the standard normal N(0, 1),

      Φ(x) = ∫_{−∞}^{x} (1/√(2π)) e^{−ξ²/2} dξ

∙ In some applications, the tail function Q(x) = 1 − Φ(x), x ≥ 0, is used
  (Q(x) is the area under the N(0, 1) pdf to the right of x)
Cdf of a mixed r.v.

∙ Example: Let X be the service time of a packet at a router
  If the buffer is empty (with probability p), the packet is serviced immediately
  If it is not empty, the service time is modeled as an Exp(λ) r.v.
  The cdf of X is

      FX(x) = 0                          if x < 0
              p                          if x = 0
              p + (1 − p)(1 − e^{−λx})   if x > 0

  (the cdf jumps from 0 to p at x = 0, then increases continuously toward 1)
Recap
∙ Discrete r.v. X specified by a pmf pX(x) ≥ 0, Σ_{x∈𝒳} pX(x) = 1
  ▸ To find the probability of an event, sum the pmf over the points in the event

∙ Continuous r.v. X specified by a pdf fX(x) ≥ 0, ∫_{−∞}^{∞} fX(x) dx = 1
  ▸ To find the probability of an event, integrate the pdf over the event

∙ Any r.v. can be specified by a cdf FX(x) = P{X ≤ x} for −∞ < x < ∞
  ▸ FX(x) ≥ 0
  ▸ FX(∞) = 1 and FX(−∞) = 0
  ▸ FX(x) is monotonically nondecreasing
  ▸ P{X = a} = FX(a) − FX(a⁻)

∙ For a continuous r.v. X with pdf fX(x),   FX(x) = ∫_{−∞}^{x} fX(ξ) dξ

∙ If FX(x) is differentiable (almost everywhere),   fX(x) = dFX(x)/dx
Functions of a random variable

∙ Given a r.v. X with known distribution (pmf, pdf, cdf) and a function y = g(x),
  we wish to find the distribution of Y = g(X)

      X → g(·) → Y

∙ Examples:
  ▸ X is the input voltage to a circuit, Y is its output voltage
  ▸ X is the input to a signal processor, Y is its output signal
  ▸ X is sunlight, Y is the output power of a photovoltaic system
  ▸ X is wind speed, Y is the output power of a wind generator

∙ The function Y is a r.v. over the same sample space as X, i.e., Y(ω) = g(X(ω))
∙ However, we don't assume knowledge of the underlying probability model
  and wish to specify Y directly from the pmf, pdf, or cdf of X
Functions of a discrete random variable

∙ First assume that X is a discrete r.v. with pmf pX(x);
  then Y is also discrete and is specified by a pmf pY(y)

∙ We can derive pY(y) directly from pX(x):
  the probability of {Y = y} is the probability of its inverse image under g(x), i.e.,

      pY(y) = Σ_{x_i : g(x_i) = y} pX(x_i)

∙ This is exactly how we specified a r.v. given a probability model:
  we can treat the range of X as a sample space and pX(x) as a probability measure over it,
  and define the r.v. Y as a function over that range
Example
∙ Let X be a discrete r.v. taking four values x1 < x2 < x3 < x4 with pmf
  pX(x1), pX(x2), pX(x3), pX(x4), and define the function

      g(x) = 1   if x ≥ x3,
             0   otherwise

  Find the pmf of Y = g(X)
∙ Under g(x), the values x1, x2 map to y = 0 and the values x3, x4 map to y = 1
  pY(y) is the probability of the inverse image of y under g:

      pY(1) = Σ_{x : g(x) = 1} pX(x) = pX(x3) + pX(x4)
      pY(0) = 1 − pY(1) = pX(x1) + pX(x2)
Derived densities

∙ Let X be a continuous r.v. with pdf fX (x) and Y = g(X) be a function of X.


We wish to find the pdf of Y, fY (y)
∙ Recall the derived pmf approach: given pX(x) and Y = g(X), the pmf of Y is

      pY(y) = Σ_{x : g(x) = y} pX(x),

  i.e., pY(y) is the sum of pX(x) over all x that yield g(x) = y
∙ This procedure does not immediately extend to deriving a pdf,
since the probability of each point is zero
∙ But the general approach extends nicely to cdfs

 / 
Derived densities

∙ To find the pdf of Y = g(X) from the pdf of X, we first find the cdf of Y as

      FY(y) = P{g(X) ≤ y} = ∫_{x : g(x) ≤ y} fX(x) dx,

  that is, the probability of the inverse image of {Y ≤ y} under g

∙ We then differentiate to obtain the pdf

      fY(y) = dFY(y)/dy

∙ The hard part is typically getting the limits on the integral correct;
  often they are obvious, but sometimes they are more subtle
Linear function

∙ Let X ∼ fX(x) and Y = aX + b for some a > 0 and b

∙ To find the pdf of Y, we first find its cdf:

      FY(y) = P{Y ≤ y} = P{aX + b ≤ y} = P{X ≤ (y − b)/a} = FX((y − b)/a)

  Thus, differentiating with respect to y,

      fY(y) = dFY(y)/dy = (1/a) fX((y − b)/a)

∙ For general a ≠ 0:   fY(y) = (1/|a|) fX((y − b)/a)
Example

∙ Let X ∼ Exp(λ), i.e., fX(x) = λ e^{−λx} for x ≥ 0, and let Y = aX + b

∙ Let's use the formula we derived:

      fY(y) = (1/|a|) fX((y − b)/a)
            = (λ/|a|) e^{−λ(y − b)/a}   if (y − b)/a ≥ 0,
              0                          otherwise

∙ Note: substituting without specifying the domain of fY gives an incorrect answer!
Example

∙ Let X ∼ N(μ, σ²), i.e., fX(x) = (1/√(2πσ²)) e^{−(x − μ)²/(2σ²)}, and let Y = aX + b

∙ Again, let's use the formula for the derived density of a linear function:

      fY(y) = (1/|a|) fX((y − b)/a)
            = (1/(|a|√(2πσ²))) e^{−((y − b)/a − μ)²/(2σ²)}
            = (1/√(2π(aσ)²)) e^{−(y − b − aμ)²/(2a²σ²)}   for −∞ < y < ∞

  Therefore, Y ∼ N(aμ + b, a²σ²), that is, a linear function of a Gaussian r.v.
  yields a Gaussian r.v. – a very important and useful fact
Computing Gaussian cdf

∙ The fact that a linear function of a Gaussian is Gaussian can be used to compute
  the cdf of any Gaussian using the cdf of the standard normal X ∼ N(0, 1)
∙ To compute the cdf of Y ∼ N(μY, σY²), we express it as Y = σY X + μY to obtain

      FY(y) = P{Y ≤ y} = P{σY X + μY ≤ y} = P{X ≤ (y − μY)/σY} = FX((y − μY)/σY)

∙ This is exactly how it's computed in Python:
  scipy.stats.norm.cdf(y, loc=mu, scale=sigma), where loc = μ and scale = σ
∙ Example: Let Y ∼ N(2, 4); find P{1.5 < Y < 3}
∙ First we write: P{1.5 < Y < 3} = P{Y < 3} − P{Y ≤ 1.5} = FY(3) − FY(1.5)
  Now we compute:
  FY(3):   scipy.stats.norm.cdf(3, loc=2, scale=2)
  FY(1.5): scipy.stats.norm.cdf(1.5, loc=2, scale=2)

  And we obtain: P{1.5 < Y < 3} = FY(3) − FY(1.5) ≈ 0.6915 − 0.4013 = 0.2902
Derived densities – recap

∙ To find the pdf of Y = g(X) from the pdf of X, we first find the cdf of Y as

      FY(y) = P{g(X) ≤ y} = ∫_{x : g(x) ≤ y} fX(x) dx,

  that is, the probability of the inverse image of {Y ≤ y} under g

∙ We then differentiate to obtain the pdf

      fY(y) = dFY(y)/dy

∙ Linear function: Y = aX + b,   fY(y) = (1/|a|) fX((y − b)/a)
A nonlinear function

∙ John is driving a distance of d miles at a constant speed S ∼ Unif[a, b] miles/hr
  (0 < a < b). What is the pdf of the duration of the trip?

∙ The duration of the trip is the r.v. T = d/S, which takes values in [d/b, d/a]

  To find fT(t), we first find FT(t)
  Note that {T ≤ t} = {S ≥ d/t}, thus

      FT(t) = ∫_{d/t}^{∞} fS(s) ds = 0                     if t ≤ d/b
                                     (b − d/t)/(b − a)     if d/b < t ≤ d/a
                                     1                     if t > d/a

  Differentiating, we obtain

      fT(t) = d/((b − a) t²)   if d/b ≤ t ≤ d/a,
              0                otherwise
Amplifier
∙ Consider an amplifier with input-output characteristic

      g(x) = −a   for x < −1,
             ax   for −1 ≤ x ≤ 1,
             a    for x > 1,

  i.e., a linear amplifier with gain a that saturates at ±a
  Let the input be a r.v. X with the Laplace pdf

      fX(x) = (1/2) e^{−|x|},   x ∈ (−∞, ∞)

  Find the cdf FY(y) of the output Y = g(X)

∙ Consider the possible values of y:
  ▸ For y < −a,       FY(y) = 0
  ▸ For y = −a,       FY(y) = P{X ≤ −1} = ∫_{−∞}^{−1} (1/2) e^{x} dx = (1/2) e^{−1}
  ▸ For −a < y < 0,   y = g(x) gives x = y/a, and
                      FY(y) = ∫_{−∞}^{y/a} (1/2) e^{x} dx = (1/2) e^{y/a}
  ▸ For 0 ≤ y < a,    FY(y) = 1/2 + ∫_{0}^{y/a} (1/2) e^{−x} dx = 1 − (1/2) e^{−y/a}
  ▸ For y ≥ a,        FY(y) = 1

  So FY jumps from 0 to (1/2)e^{−1} at y = −a, increases continuously on (−a, a),
  and jumps from 1 − (1/2)e^{−1} to 1 at y = a
Summary of random variables

∙ Classification of r.v.s:
  ▸ Discrete: specified by pmf
  ▸ Continuous: specified by pdf
  ▸ Mixed (in fact, any r.v.): specified by cdf
∙ Popular r.v.s: Bern(p), Geom(p), Binom(n, p), Poisson(λ); Unif[a, b], Exp(λ), N(μ, σ²)
  ▸ Binom(n, λ/n) → Poisson(λ) as n → ∞
  ▸ Geom(p) and Exp(λ) are memoryless

∙ Mean of a r.v.: measures the average outcome

∙ Standard deviation of a r.v.: measures spread around the mean (randomness)
∙ Derived density: find the cdf of the function, take derivatives, watch out for limits!
  If X ∼ N(μ, σ²) and Y = aX + b, then Y ∼ N(aμ + b, a²σ²)
  This is used to compute the cdf of Y ∼ N(μY, σY²) from the cdf of N(0, 1)
