Discrete Random Variables and Probability Distributions

1. A discrete random variable is a variable that can take on only a countable number of possible values. The probability distribution or probability mass function (pmf) of a discrete random variable gives the probability of each possible value.
2. The expected value or mean of a discrete random variable is the sum of each possible value multiplied by its probability. It represents the long-run average value of the random variable.
3. A Bernoulli random variable is a random variable that can take on only the values 0 and 1. It is used to model experiments with two possible outcomes, such as the success or failure of a trial.


Chapter 3: Discrete Random Variables and Probability Distributions

3.1 Random Variables

For a given sample space S of some experiment, a random variable is any rule that associates a number with each outcome in S. We use X, Y, ... to denote random variables and x, y, ... to represent particular values of a random variable.

Bernoulli Random Variable

Any random variable whose only possible values are 0 and 1 is called a Bernoulli random variable.

Types of Random Variables

A discrete random variable is an rv whose possible values either constitute a finite set or else can be listed in an infinite sequence. A random variable is continuous if its set of possible values consists of an entire interval on the number line.

3.2 Probability Distributions for Discrete Random Variables

The probability distribution or probability mass function (pmf) of a discrete rv is defined for every number x by

p(x) = P(all s ∈ S : X(s) = x)

Parameter of a Probability Distribution

Suppose that p(x) depends on a quantity that can be assigned any one of a number of possible values, with each different value determining a different probability distribution. Such a quantity is called a parameter of the distribution. The collection of all distributions for all different values of the parameter is called a family of distributions.
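As an illustration of a parametric family, here is a minimal Python sketch (not from the slides; the helper names are illustrative) of the Bernoulli family indexed by p, together with a check of the two pmf requirements:

    # Sketch: a pmf stored as {value: probability}; each value of the
    # parameter p picks out one member of the Bernoulli family.

    def bernoulli_pmf(p):
        """Return the pmf of a Bernoulli(p) random variable."""
        if not 0 <= p <= 1:
            raise ValueError("p must lie in [0, 1]")
        return {0: 1 - p, 1: p}

    def is_valid_pmf(pmf, tol=1e-12):
        """Check that probabilities are nonnegative and sum to 1."""
        return all(prob >= 0 for prob in pmf.values()) and abs(sum(pmf.values()) - 1) < tol

    for p in (0.1, 0.5, 0.9):
        print(p, bernoulli_pmf(p), is_valid_pmf(bernoulli_pmf(p)))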

Cumulative Distribution Function

The cumulative distribution function (cdf) F(x) of a discrete rv X with pmf p(x) is defined for every number x by

F(x) = P(X ≤ x) = Σ_{y: y ≤ x} p(y)

For any number x, F(x) is the probability that the observed value of X will be at most x.

Proposition

For any two numbers a and b with a ≤ b,

P(a ≤ X ≤ b) = F(b) − F(a−)

where "a−" represents the largest possible X value that is strictly less than a.

Note: For integers, P(a ≤ X ≤ b) = F(b) − F(a − 1).

Ex. Probability Distribution for the Random Variable X

A probability distribution for a random variable X:

x         −8    −3    −1     0     1     4     6
P(X = x)  0.13  0.15  0.17  0.20  0.15  0.11  0.09

Find
a. P(X ≤ 0) = 0.65
b. P(−3 ≤ X ≤ 1) = 0.67
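A minimal sketch (assumed helper names, standard library only) that builds the cdf from this pmf and checks the two probabilities above:

    # Sketch: F(x) = P(X <= x) for the example pmf, then
    # P(X <= 0) and P(-3 <= X <= 1) = F(1) - F(-3-).
    pmf = {-8: 0.13, -3: 0.15, -1: 0.17, 0: 0.20, 1: 0.15, 4: 0.11, 6: 0.09}

    def cdf(x, pmf=pmf):
        """F(x) = sum of p(y) over all possible values y <= x."""
        return sum(p for y, p in pmf.items() if y <= x)

    print(round(cdf(0), 2))                    # 0.65
    # F(a-) is the cumulative probability of all values strictly below a = -3
    # (here only x = -8 contributes).
    print(round(cdf(1) - cdf(-3 - 1e-9), 2))   # 0.67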

3.3 Expected Values of Discrete Random Variables

The Expected Value of X

Let X be a discrete rv with set of possible values D and pmf p(x). The expected value or mean value of X, denoted E(X) or μ_X, is

E(X) = μ_X = Σ_{x ∈ D} x · p(x)

Ex. Use the data below to find the expected number of credit cards that a student will possess. Let x = number of credit cards.

x    P(X = x)
0    0.08
1    0.28
2    0.38
3    0.16
4    0.06
5    0.03
6    0.01

E(X) = x_1 p_1 + x_2 p_2 + ... + x_n p_n
     = 0(.08) + 1(.28) + 2(.38) + 3(.16) + 4(.06) + 5(.03) + 6(.01)
     = 1.97, or about 2 credit cards

The Expected Value of a Function

If the rv X has the set of possible values D and pmf p(x), then the expected value of any function h(X), denoted E[h(X)] or μ_{h(X)}, is

E[h(X)] = Σ_{x ∈ D} h(x) · p(x)

Rules of the Expected Value

E(aX + b) = a · E(X) + b

This leads to the following:
1. For any constant a, E(aX) = a · E(X).
2. For any constant b, E(X + b) = E(X) + b.
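A short sketch (illustrative names, standard library only) that reproduces the credit-card calculation and checks the rule E(aX + b) = aE(X) + b for one arbitrary choice of a and b:

    # Sketch: E[h(X)] = sum of h(x) * p(x) for the credit-card pmf.
    pmf = {0: 0.08, 1: 0.28, 2: 0.38, 3: 0.16, 4: 0.06, 5: 0.03, 6: 0.01}

    def expected_value(pmf, h=lambda x: x):
        """E[h(X)] = sum of h(x) * p(x) over the possible values x."""
        return sum(h(x) * p for x, p in pmf.items())

    mu = expected_value(pmf)
    print(round(mu, 2))                                         # 1.97, about 2 cards

    a, b = 3, 5                                                 # arbitrary constants
    print(round(expected_value(pmf, lambda x: a * x + b), 2))   # 10.91
    print(round(a * mu + b, 2))                                 # 10.91, same value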
The Variance and Standard Deviation

Let X have pmf p(x) and expected value μ. Then the variance of X, denoted V(X) (or σ_X^2, or just σ^2), is

V(X) = Σ_{x ∈ D} (x − μ)^2 · p(x) = E[(X − μ)^2]
     = p_1(x_1 − μ)^2 + p_2(x_2 − μ)^2 + ... + p_n(x_n − μ)^2

The standard deviation (SD) of X is

σ_X = √(σ_X^2) = √V(X)

Ex. The quiz scores for a particular student are given below:
22, 25, 20, 18, 12, 20, 24, 20, 20, 25, 24, 25, 18
Find the variance and standard deviation.

Value         12    18    20    22    24    25
Frequency      1     2     4     1     2     3
Probability   .08   .15   .31   .08   .15   .23

μ = 21

V(X) = .08(12 − 21)^2 + .15(18 − 21)^2 + .31(20 − 21)^2 + .08(22 − 21)^2 + .15(24 − 21)^2 + .23(25 − 21)^2
V(X) = 13.25
σ = √V(X) = √13.25 ≈ 3.64

Shortcut Formula for Variance

V(X) = σ^2 = [Σ_{x ∈ D} x^2 · p(x)] − μ^2 = E(X^2) − [E(X)]^2
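A sketch (assumed names, standard library only) computing V(X) for the quiz-score pmf both from the definition and with the shortcut formula:

    import math

    # Sketch: variance of the quiz-score distribution, two equivalent ways.
    pmf = {12: 0.08, 18: 0.15, 20: 0.31, 22: 0.08, 24: 0.15, 25: 0.23}

    mu = sum(x * p for x, p in pmf.items())     # about 21 (20.97 with the rounded probabilities)

    var_definition = sum((x - mu) ** 2 * p for x, p in pmf.items())    # E[(X - mu)^2]
    var_shortcut = sum(x ** 2 * p for x, p in pmf.items()) - mu ** 2   # E(X^2) - mu^2

    print(round(var_definition, 2), round(var_shortcut, 2))    # 13.25  13.25
    print(round(math.sqrt(var_definition), 2))                 # 3.64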

Rules of Variance

V(aX + b) = σ^2_{aX+b} = a^2 · σ_X^2   and   σ_{aX+b} = |a| · σ_X

This leads to the following:
1. σ^2_{aX} = a^2 · σ_X^2,   σ_{aX} = |a| · σ_X
2. σ^2_{X+b} = σ_X^2

3.4 The Binomial Probability Distribution

Binomial Experiment

An experiment for which the following four conditions are satisfied is called a binomial experiment:
1. The experiment consists of a sequence of n trials, where n is fixed in advance of the experiment.
2. The trials are identical, and each trial can result in one of the same two possible outcomes, which are denoted success (S) or failure (F).
3. The trials are independent.
4. The probability of success is constant from trial to trial; it is denoted by p.

Binomial Experiment

Suppose each trial of an experiment can result in S or F, but the sampling is without replacement from a population of size N. If the sample size n is at most 5% of the population size, the experiment can be analyzed as though it were exactly a binomial experiment.

Binomial Random Variable

Given a binomial experiment consisting of n trials, the binomial random variable X associated with this experiment is defined as

X = the number of S's among the n trials

Notation for the pmf of a Binomial rv

Because the pmf of a binomial rv X depends on the two parameters n and p, we denote the pmf by b(x; n, p).

Computation of a Binomial pmf

b(x; n, p) = C(n, x) p^x (1 − p)^(n−x)   for x = 0, 1, 2, ..., n
b(x; n, p) = 0   otherwise

where C(n, x) = n! / [x!(n − x)!] is the binomial coefficient.
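A minimal sketch of the binomial pmf b(x; n, p) using the standard library (the function name is illustrative):

    from math import comb

    def binom_pmf(x, n, p):
        """b(x; n, p) = C(n, x) * p**x * (1 - p)**(n - x) for x = 0, 1, ..., n."""
        if x < 0 or x > n:
            return 0.0
        return comb(n, x) * p**x * (1 - p)**(n - x)

    # The pmf values over x = 0, ..., n sum to 1 (up to rounding error).
    n, p = 4, 0.25
    print([round(binom_pmf(x, n, p), 4) for x in range(n + 1)])
    print(sum(binom_pmf(x, n, p) for x in range(n + 1)))   # 1.0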

Ex. A card is drawn from a standard 52-card deck. If drawing a club is considered a success, find the probability of
a. exactly one success in 4 draws (with replacement).
   p = 1/4; q = 1 − 1/4 = 3/4
   b(1; 4, 1/4) = C(4, 1)(1/4)^1 (3/4)^3 ≈ 0.422
b. no successes in 5 draws (with replacement).
   b(0; 5, 1/4) = C(5, 0)(1/4)^0 (3/4)^5 ≈ 0.237

Notation for the cdf

For X ~ Bin(n, p), the cdf will be denoted by

P(X ≤ x) = B(x; n, p) = Σ_{y=0}^{x} b(y; n, p),   x = 0, 1, 2, ..., n

Mean and Variance

If X ~ Bin(n, p), then E(X) = np, V(X) = np(1 − p) = npq, and σ_X = √(npq), where q = 1 − p.

Ex. 5 cards are drawn, with replacement, from a standard 52-card deck. If drawing a club is considered a success, find the mean, variance, and standard deviation of X (where X is the number of successes).
   p = 1/4; q = 1 − 1/4 = 3/4
   μ = np = 5(1/4) = 1.25
   V(X) = npq = 5(1/4)(3/4) = 0.9375
   σ_X = √(npq) = √0.9375 ≈ 0.968
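The same numbers can be checked with a short self-contained sketch (standard library only; names are illustrative):

    from math import comb, sqrt

    def binom_pmf(x, n, p):
        """Binomial pmf b(x; n, p)."""
        return comb(n, x) * p**x * (1 - p)**(n - x)

    p = 1 / 4                                    # probability of drawing a club
    print(round(binom_pmf(1, 4, p), 3))          # 0.422, exactly one club in 4 draws
    print(round(binom_pmf(0, 5, p), 3))          # 0.237, no clubs in 5 draws

    # Mean, variance, and standard deviation for n = 5 draws.
    n, q = 5, 1 - p
    print(n * p, n * p * q, round(sqrt(n * p * q), 3))   # 1.25  0.9375  0.968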
Ex. If the probability of a student successfully passing this course (C or better) is 0.82, find the probability that, given 8 students,
a. all 8 pass.
   C(8, 8)(0.82)^8 (0.18)^0 ≈ 0.2044
b. none pass.
   C(8, 0)(0.82)^0 (0.18)^8 ≈ 0.0000011
c. at least 6 pass.
   C(8, 6)(0.82)^6 (0.18)^2 + C(8, 7)(0.82)^7 (0.18)^1 + C(8, 8)(0.82)^8 (0.18)^0
   ≈ 0.2758 + 0.3590 + 0.2044 = 0.8392

3.5 Hypergeometric and Negative Binomial Distributions

The Hypergeometric Distribution

The three assumptions that lead to a hypergeometric distribution:
1. The population or set to be sampled consists of N individuals, objects, or elements (a finite population).
2. Each individual can be characterized as a success (S) or failure (F), and there are M successes in the population.
3. A sample of n individuals is selected without replacement in such a way that each subset of size n is equally likely to be chosen.

Hypergeometric Distribution

If X is the number of S's in a completely random sample of size n drawn from a population consisting of M S's and (N − M) F's, then the probability distribution of X, called the hypergeometric distribution, is given by

P(X = x) = h(x; n, M, N) = [C(M, x) · C(N − M, n − x)] / C(N, n)

for max(0, n − N + M) ≤ x ≤ min(n, M).

Hypergeometric Mean and Variance

E(X) = n · (M/N)
V(X) = [(N − n)/(N − 1)] · n · (M/N) · (1 − M/N)

The Negative Binomial Distribution

The negative binomial rv and distribution are based on an experiment satisfying the following four conditions:
1. The experiment consists of a sequence of independent trials.
2. Each trial can result in a success (S) or a failure (F).
3. The probability of success is constant from trial to trial, so P(S on trial i) = p for i = 1, 2, 3, ...
4. The experiment continues until a total of r successes have been observed, where r is a specified positive integer.
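Before turning to the negative binomial pmf, here is a minimal sketch of the hypergeometric pmf and moments above (standard library only; the population sizes are made up for illustration):

    from math import comb

    def hypergeom_pmf(x, n, M, N):
        """h(x; n, M, N) = C(M, x) * C(N - M, n - x) / C(N, n)."""
        if x < max(0, n - N + M) or x > min(n, M):
            return 0.0
        return comb(M, x) * comb(N - M, n - x) / comb(N, n)

    # Illustrative population: N = 20 items, M = 7 successes, sample of n = 5.
    n, M, N = 5, 7, 20
    mean = sum(x * hypergeom_pmf(x, n, M, N) for x in range(n + 1))
    var = sum(x**2 * hypergeom_pmf(x, n, M, N) for x in range(n + 1)) - mean**2

    print(round(mean, 4), round(n * M / N, 4))                                       # both 1.75
    print(round(var, 4), round((N - n) / (N - 1) * n * (M / N) * (1 - M / N), 4))    # both agree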

pmf of a Negative Binomial

The pmf of the negative binomial rv X with parameters r = number of S's and p = P(S) is

nb(x; r, p) = C(x + r − 1, r − 1) p^r (1 − p)^x,   x = 0, 1, 2, ...

Negative Binomial Mean and Variance

E(X) = r(1 − p)/p,   V(X) = r(1 − p)/p^2

3.6 The Poisson Probability Distribution

A random variable X is said to have a Poisson distribution with parameter λ (λ > 0) if the pmf of X is

p(x; λ) = e^(−λ) λ^x / x!,   x = 0, 1, 2, ...
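A brief sketch of both pmfs just defined (standard library; names and parameter values are illustrative):

    from math import comb, exp, factorial

    def neg_binom_pmf(x, r, p):
        """nb(x; r, p) = C(x + r - 1, r - 1) * p**r * (1 - p)**x, x = 0, 1, 2, ..."""
        return comb(x + r - 1, r - 1) * p**r * (1 - p)**x

    def poisson_pmf(x, lam):
        """p(x; lambda) = e**(-lambda) * lambda**x / x!, x = 0, 1, 2, ..."""
        return exp(-lam) * lam**x / factorial(x)

    # Sanity checks: the negative binomial mean r(1-p)/p and one Poisson value.
    r, p = 3, 0.4
    mean = sum(x * neg_binom_pmf(x, r, p) for x in range(500))   # truncated sum
    print(round(mean, 3), r * (1 - p) / p)                       # both 4.5
    print(round(poisson_pmf(2, 1.5), 4))                         # 0.2510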

The Poisson Distribution as a Limit

Suppose that in the binomial pmf b(x; n, p), we let n → ∞ and p → 0 in such a way that np approaches a value λ > 0. Then b(x; n, p) → p(x; λ).

Poisson Distribution Mean and Variance

If X has a Poisson distribution with parameter λ, then

E(X) = V(X) = λ

Poisson Process

Three assumptions:
1. There exists a parameter α > 0 such that for any short time interval of length Δt, the probability that exactly one event is received is α·Δt + o(Δt).
2. The probability of more than one event during Δt is o(Δt).
3. The number of events during the time interval Δt is independent of the number that occurred prior to this time interval.

Under these assumptions, P_k(t) = e^(−αt)·(αt)^k / k!, so the number of pulses (events) during a time interval of length t is a Poisson rv with parameter λ = αt. The expected number of pulses (events) during any such time interval is αt, so the expected number during a unit time interval is α.
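A small sketch (illustrative values) of the limit statement above: binomial probabilities with large n, small p, and np = λ held fixed approach the Poisson pmf:

    from math import comb, exp, factorial

    def binom_pmf(x, n, p):
        return comb(n, x) * p**x * (1 - p)**(n - x)

    def poisson_pmf(x, lam):
        return exp(-lam) * lam**x / factorial(x)

    lam = 2.0
    for n in (10, 100, 1000):
        p = lam / n                      # keep np = lambda fixed while n grows
        print(n, round(binom_pmf(3, n, p), 5), round(poisson_pmf(3, lam), 5))
    # As n increases, b(3; n, lambda/n) approaches p(3; lambda) = 0.18045.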
