UNIT 2 Rejinpaul
UNIT SYLLABUS
UNIT II TWO-DIMENSIONAL RANDOM VARIABLES
JOINT DISTRIBUTIONS
&
MARGINAL AND CONDITIONAL DISTRIBUTIONS
Examples :
1. Signal transmission : X represents high-quality signals and Y low-quality signals.
Similarly,
the conditional PMF of X, given Y = y, is given by
P_X|Y(x|y) = P[X = x, Y = y] / P[Y = y] = p_XY(x, y) / p_Y(y), provided p_Y(y) > 0.
CONDITIONAL MEANS
(For discrete random variables)
Similarly,
the conditional expected value of X, given Y = y, is given by
E[X | Y = y] = Σ_x x p_X|Y(x|y).
Example :
The joint probability mass function of (X , Y ) is given by
p(x, y) = k(2x + 3y), x = 0, 1, 2; y = 1, 2, 3. Find all the
marginal and conditional probability distributions.
Solution :
The joint probability mass function of (X, Y) is given below:

            Y = 1   Y = 2   Y = 3
  X = 0      3k      6k      9k
  X = 1      5k      8k     11k
  X = 2      7k     10k     13k

Since the total probability is 1, the sum of all the entries gives 72k = 1, i.e., k = 1/72.
P[X = 0] = 3k + 6k + 9k = 18k = 18/72 = 1/4
P[X = 1] = 5k + 8k + 11k = 24k = 24/72 = 1/3
P[X = 2] = 7k + 10k + 13k = 30k = 30/72 = 5/12
P[Y = 1] = 3k + 5k + 7k = 15k = 15/72 = 5/24
P[Y = 2] = 6k + 8k + 10k = 24k = 24/72 = 1/3
P[Y = 3] = 9k + 11k + 13k = 33k = 33/72 = 11/24
P[X = 0 | Y = 1] = P[X = 0, Y = 1] / P[Y = 1] = 3k/15k = 3/15 = 1/5
P[X = 1 | Y = 1] = P[X = 1, Y = 1] / P[Y = 1] = 5k/15k = 5/15 = 1/3
P[X = 2 | Y = 1] = P[X = 2, Y = 1] / P[Y = 1] = 7k/15k = 7/15
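As a quick check, the marginal and conditional probabilities above can be reproduced with a short Python sketch; the pmf p(x, y) = (2x + 3y)/72 and the value k = 1/72 come straight from the example.

```python
from fractions import Fraction

# Joint pmf p(x, y) = (2x + 3y)/72 for x in {0,1,2}, y in {1,2,3}
p = {(x, y): Fraction(2 * x + 3 * y, 72) for x in range(3) for y in range(1, 4)}
assert sum(p.values()) == 1  # total probability 1 confirms k = 1/72

# Marginal distributions of X and Y
pX = {x: sum(p[(x, y)] for y in range(1, 4)) for x in range(3)}
pY = {y: sum(p[(x, y)] for x in range(3)) for y in range(1, 4)}

# Conditional distribution of X given Y = 1
pX_given_Y1 = {x: p[(x, 1)] / pY[1] for x in range(3)}

print(pX[1])           # 1/3
print(pY[1])           # 5/24
print(pX_given_Y1[0])  # 1/5
```

Using `Fraction` keeps all arithmetic exact, so the printed values match the hand computation digit for digit.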
Example :
The joint p.m.f. of X and Y is

  p(x, y)    Y = 0   Y = 1   Y = 2
  X = 0       0.10    0.04    0.02
  X = 1       0.08    0.20    0.06
  X = 2       0.06    0.14    0.30
Solution :
The marginal p.m.f.'s of X and Y are given by

  p(x, y)          Y = 0   Y = 1   Y = 2   p(x) = P(X = x)
  X = 0             0.10    0.04    0.02        0.16
  X = 1             0.08    0.20    0.06        0.34
  X = 2             0.06    0.14    0.30        0.50
  p(y) = P(Y = y)   0.24    0.38    0.38         1

For example, p(x = 0, y = 0) = 0.1.
The marginal density function of X is denoted by f_X(x) and is defined as
f_X(x) = ∫ f_XY(x, y) dy, the integral taken over all y.
Similarly,
the marginal density function of Y is denoted by f_Y(y) and is defined as
f_Y(y) = ∫ f_XY(x, y) dx, the integral taken over all x.
Similarly,
the conditional PDF of X, given Y = y, is given by
f_X|Y(x|y) = f_XY(x, y) / f_Y(y), provided f_Y(y) > 0.
Example :
Given f_XY(x, y) = { c x(x − y), 0 < x < 2, −x < y < x
                   { 0,          otherwise.
(i) Evaluate c. (ii) Find f_X(x). (iii) Find f_Y|X(y|x). (iv) Find f_Y(y).

Solution :
Since f(x, y) is the joint p.d.f., we have ∫∫ f(x, y) dx dy = 1.

∫_0^2 ∫_{−x}^{x} c(x² − xy) dy dx = 1
c ∫_0^2 [ x²(y) − x(y²/2) ]_{−x}^{x} dx = 1
c ∫_0^2 [ x²(x − (−x)) − (x/2)(x² − x²) ] dx = 1
c ∫_0^2 2x³ dx = 1
(c/2)[x⁴]_0^2 = 1  ⇒  (c/2)(16 − 0) = 1  ⇒  8c = 1  ⇒  c = 1/8.

f_X(x) = ∫ f(x, y) dy = (1/8) ∫_{−x}^{x} (x² − xy) dy
       = (1/8)[ x²(y) − x(y²/2) ]_{−x}^{x}
       = (1/8)[ x²(x − (−x)) − (x/2)(x² − x²) ]
       = (1/8)[ x²(2x) − 0 ]
       = x³/4,  0 < x < 2.

f_Y|X(y|x) = f(x, y) / f_X(x) = ( (1/8) x(x − y) ) / ( x³/4 )
           = 4x(x − y) / (8x³)
           = (x − y) / (2x²),  −x < y < x.
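The values c = 1/8 and f_X(1) = 1³/4 = 0.25 can be sanity-checked numerically; the midpoint Riemann sum below is only a sketch, using the standard library alone.

```python
# Numerical check of the example: f(x, y) = (1/8) x (x - y) on 0 < x < 2, -x < y < x.
# Crude midpoint Riemann sum over the triangular region.
n = 400
hx = 2.0 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * hx          # midpoint in x
    hy = 2 * x / n              # the y-range (-x, x) depends on x
    for j in range(n):
        y = -x + (j + 0.5) * hy  # midpoint in y
        total += 0.125 * x * (x - y) * hx * hy
print(round(total, 3))  # ≈ 1.0, consistent with c = 1/8

# Marginal f_X at x = 1: integrate over y in (-1, 1); theory gives 1/4
x = 1.0
hy = 2 * x / n
fx = sum(0.125 * x * (x - (-x + (j + 0.5) * hy)) * hy for j in range(n))
print(round(fx, 3))  # ≈ 0.25
```

The midpoint rule is exact for integrands linear in y, so the marginal check agrees with x³/4 up to floating-point rounding.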
Example :
Given the joint pdf of (X, Y) as f(x, y) = { 8xy, 0 < x < y < 1; 0, otherwise }. Find the
marginal and conditional probability density functions of X and Y.
Are X and Y independent?

Solution :
The marginal density of X is
f_X(x) = ∫ f(x, y) dy = ∫_x^1 8xy dy = 8x [y²/2]_x^1 = 4x(1 − x²),  0 < x < 1.
The marginal density of Y is
f_Y(y) = ∫ f(x, y) dx = ∫_0^y 8xy dx = 8y [x²/2]_0^y = 4y(y² − 0) = 4y³,  0 < y < 1.
The conditional densities are
f_X|Y(x|y) = f(x, y)/f_Y(y) = 8xy/(4y³) = 2x/y²,  0 < x < y,
f_Y|X(y|x) = f(x, y)/f_X(x) = 8xy/(4x(1 − x²)) = 2y/(1 − x²),  x < y < 1.
Since f_X(x)·f_Y(y) = 4x(1 − x²)·4y³ ≠ 8xy = f_XY(x, y), X and Y are not independent.
MARGINAL CDFs
The marginal distribution of X with respect to the joint CDF F_XY(x, y) is
F_X(x) = P[X ≤ x] = P[X ≤ x, Y < ∞]
       = Σ_y P[X ≤ x, Y = y], for discrete random variables,
       = ∫_{−∞}^{x} ∫_{−∞}^{∞} f_XY(x, y) dy dx, for continuous random variables,
and the marginal distribution of Y with respect to the joint CDF F_XY(x, y) is
F_Y(y) = P[Y ≤ y] = P[X < ∞, Y ≤ y].
COVARIANCE
AND
CORRELATION COEFFICIENT
COVARIANCE
COVARIANCE (Definition)
The covariance of X and Y, denoted by Cov(X, Y) or σ_XY, is defined by
Cov(X, Y) = E{[X − E(X)][Y − E(Y)]} = E[XY] − E[X]E[Y].
Note :
(i ) If Cov ( X ,Y ) = 0, we define the two random variables
to be uncorrelated.
Result 1 :
Let X and Y be any two random variables and a, b be constants.
Prove that Cov (a X , bY ) = ab cov ( X ,Y )
Result 2 :
Let X and Y be any two random variables and a,b be constants.
Prove that Cov(X + a, Y + b) = Cov(X, Y)
Result 3 :
Show that Cov²(X, Y) ≤ Var(X)·Var(Y).
CORRELATION COEFFICIENT
Result 1 :
1. Prove that the correlation coefficient lies between −1 and 1,
i.e., −1 ≤ ρ_XY ≤ 1, or |ρ_XY| ≤ 1, or |C_XY| ≤ σ_X σ_Y.
Result 2 :
Prove that two independent variables are uncorrelated.
Result 3 :
Prove that correlation coefficient is independent of change of
origin and scale.
Example :
If X₁ has mean 4 and variance 9 while X₂ has mean −2 and variance 5, and the two
variables are independent, find Var(2X₁ + X₂ − 5).

Solution :
Given E[X₁] = 4, Var[X₁] = 9, E[X₂] = −2, Var[X₂] = 5.
Since X₁ and X₂ are independent, Cov(X₁, X₂) = 0, and the constant −5 does not affect the variance, so
Var(2X₁ + X₂ − 5) = 2² Var(X₁) + Var(X₂) = 4(9) + 5 = 41.
Example :
Find the correlation coefficient of (X, Y), whose joint p.m.f. is given in the table below.

Solution :
The marginal p.m.f.'s of X and Y are given by

  p(x, y)      X = −1   X = 1   P(Y = y)
  Y = 0          1/8     3/8      4/8
  Y = 1          2/8     2/8      4/8
  P(X = x)       3/8     5/8       1
18 December 2014, Two Dimensional Random Variables, by Dr M Radhakrishnan
E[X] = Σ_i x_i p(x_i) = (−1)(3/8) + (1)(5/8) = 2/8 = 1/4
E[X²] = Σ_i x_i² p(x_i) = (−1)²(3/8) + (1)²(5/8) = 1
E[Y] = Σ_i y_i p(y_i) = (0)(4/8) + (1)(4/8) = 1/2
E[Y²] = Σ_i y_i² p(y_i) = (0)²(4/8) + (1)²(4/8) = 1/2
E[XY] = Σ_i Σ_j x_i y_j p(x_i, y_j)
      = (−1)(0)(1/8) + (1)(0)(3/8) + (−1)(1)(2/8) + (1)(1)(2/8) = 0

σ_X² = E[X²] − (E[X])² = 1 − 4/64 = 15/16, so σ_X = √15/4
σ_Y² = E[Y²] − (E[Y])² = 1/2 − 1/4 = 1/4, so σ_Y = 1/2

Hence ρ(X, Y) = Cov(X, Y)/(σ_X σ_Y) = (E[XY] − E[X]E[Y])/(σ_X σ_Y)
             = (0 − (1/4)(1/2)) / ((√15/4)(1/2)) = −1/√15 = −0.2582.
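The moment calculations above can be verified in a few lines of Python; the pmf entries are read directly off the table.

```python
from math import sqrt

# Joint pmf from the table: keys are (x, y) with x in {-1, 1}, y in {0, 1}
p = {(-1, 0): 1/8, (1, 0): 3/8, (-1, 1): 2/8, (1, 1): 2/8}

EX  = sum(x * pr for (x, y), pr in p.items())
EY  = sum(y * pr for (x, y), pr in p.items())
EX2 = sum(x**2 * pr for (x, y), pr in p.items())
EY2 = sum(y**2 * pr for (x, y), pr in p.items())
EXY = sum(x * y * pr for (x, y), pr in p.items())

var_x, var_y = EX2 - EX**2, EY2 - EY**2
rho = (EXY - EX * EY) / sqrt(var_x * var_y)
print(round(rho, 4))  # -0.2582
```

The computed value agrees with the hand result ρ = −1/√15.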
Example :
Suppose that the two-dimensional RV (X, Y) has the joint p.d.f.
f(x, y) = { x + y, 0 < x < 1, 0 < y < 1; 0, otherwise }.
Obtain the correlation coefficient between X and Y.

Solution :
The marginal density function of X is given by
f_X(x) = ∫_0^1 f_XY(x, y) dy = ∫_0^1 (x + y) dy = [xy + y²/2]_0^1 = x + 1/2.

E[X] = ∫ x f(x) dx = ∫_0^1 (x² + x/2) dx = [x³/3 + x²/4]_0^1 = 7/12.
Similarly, E[Y] = ∫ y f(y) dy = ∫_0^1 (y² + y/2) dy = [y³/3 + y²/4]_0^1 = 7/12.
E[X²] = ∫ x² f(x) dx = ∫_0^1 (x³ + x²/2) dx = [x⁴/4 + x³/6]_0^1 = 5/12.
E[Y²] = ∫ y² f(y) dy = ∫_0^1 (y³ + y²/2) dy = [y⁴/4 + y³/6]_0^1 = 5/12.

E[XY] = ∫_0^1 ∫_0^1 xy(x + y) dx dy
      = ∫_0^1 y [x³/3 + x²y/2]_0^1 dy
      = ∫_0^1 (y/3 + y²/2) dy
      = 1/6 + 1/6 = 1/3.

Var(X) = E[X²] − (E[X])² = 5/12 − 49/144 = 11/144, so σ_X = √11/12.
Var(Y) = E[Y²] − (E[Y])² = 5/12 − 49/144 = 11/144, so σ_Y = √11/12.

Hence ρ(X, Y) = Cov(X, Y)/(σ_X σ_Y) = (E[XY] − E[X]E[Y])/(σ_X σ_Y)
             = (1/3 − 49/144)/(11/144) = (48/144 − 49/144)/(11/144) = −1/11.
LINEAR REGRESSION
LINES OF REGRESSION
Note
Both the lines of regression pass through (x̄, ȳ).
REGRESSION COEFFICIENTS
Regression coefficient of y on x: b_yx = r (σ_y / σ_x).
Regression coefficient of x on y: b_xy = r (σ_x / σ_y).
PROPERTIES OF REGRESSION
Result :
The acute angle θ between the two lines of regression is given by
tan θ = [(1 − r²)/r] · [σ_x σ_y / (σ_x² + σ_y²)].
Example :
If the equations of the two lines of regression of y on x and x on y are,
respectively, 7x − 16y + 9 = 0 and 5y − 4x − 3 = 0, calculate the
coefficient of correlation, x̄ and ȳ.

Solution :
Since both regression lines pass through (x̄, ȳ), we get
7x̄ − 16ȳ = −9 and 4x̄ − 5ȳ = −3.
Solving, we get x̄ = −3/29 and ȳ = 15/29.
From 7x − 16y + 9 = 0, b_yx = 7/16; from 5y − 4x − 3 = 0, b_xy = 5/4.
Hence r² = b_yx·b_xy = 35/64, and since both coefficients are positive, r = √(35/64) ≈ 0.74.
Example
From the following data, find
(i) the two regression equations,
(ii) the coefficient of correlation between the marks in
Economics and Statistics and
(iii) the most likely marks in statistics when marks in
Economics are 30.
Marks in Economics (x):  25  28  35  32  31  36  29  38  34  32
Marks in Statistics (y): 43  46  49  41  36  32  31  30  33  39

Solution :
x̄ = Σx/n = 320/10 = 32;  ȳ = Σy/n = 380/10 = 38
b_YX = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)² = −93/140 = −0.6643
b_XY = Σ(x − x̄)(y − ȳ) / Σ(y − ȳ)² = −93/398 = −0.2337

(i) The regression equation of Y on X is y − 38 = −0.6643(x − 32), i.e., y = −0.6643x + 59.26;
    the regression equation of X on Y is x − 32 = −0.2337(y − 38), i.e., x = −0.2337y + 40.88.
(ii) r = −√(b_YX · b_XY) = −√(0.6643 × 0.2337) = −0.394
     (negative, since both regression coefficients are negative).
(iii) When x = 30, y = −0.6643(30) + 59.26 = 39.33 ≈ 39.
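The regression coefficients, correlation coefficient, and the prediction at x = 30 can be reproduced with a short script; the data are the marks listed in the table.

```python
from math import sqrt, copysign

# Marks in Economics (x) and Statistics (y)
x = [25, 28, 35, 32, 31, 36, 29, 38, 34, 32]
y = [43, 46, 49, 41, 36, 32, 31, 30, 33, 39]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n

sxy = sum((a - xbar) * (b - ybar) for a, b in zip(x, y))
sxx = sum((a - xbar) ** 2 for a in x)
syy = sum((b - ybar) ** 2 for b in y)

b_yx = sxy / sxx                     # regression coefficient of y on x
b_xy = sxy / syy                     # regression coefficient of x on y
r = copysign(sqrt(b_yx * b_xy), b_yx)  # r takes the sign of the coefficients

print(round(b_yx, 4))  # -0.6643
print(round(b_xy, 4))  # -0.2337
print(round(r, 3))     # -0.394

# Most likely marks in Statistics when marks in Economics are 30
y_at_30 = ybar + b_yx * (30 - xbar)
print(round(y_at_30, 2))  # 39.33
```

The signed intermediate sums are Σ(x − x̄)(y − ȳ) = −93, Σ(x − x̄)² = 140, and Σ(y − ȳ)² = 398, matching the hand computation.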
Note
A linear regression line has an equation of the form Y = a + bX, where X is the
explanatory variable and Y is the dependent variable. The slope of the line is b,
and a is the intercept (the value of Y when X = 0).
Example :
Let (X, Y) have the joint p.d.f. given by
f(x, y) = { 1/(2x), −x < y < x, 0 < x < 1; 0, otherwise }.
Show that the regression of Y on X is linear but the regression of X on Y is not linear.

Solution :
The marginal p.d.f. of X is
f_X(x) = ∫_{−x}^{x} (1/(2x)) dy = 1,  0 < x < 1.
The conditional p.d.f. of Y given X = x is f(y|x) = f(x, y)/f_X(x) = 1/(2x), −x < y < x, so
E[Y | X = x] = ∫_{−x}^{x} y (1/(2x)) dy = (1/(2x)) [y²/2]_{−x}^{x} = (1/(2x)) (x²/2 − x²/2) = 0,
which is of the form a + bx (with a = b = 0); hence the regression of Y on X is linear.
The marginal p.d.f. of Y is
f_Y(y) = ∫_{|y|}^{1} (1/(2x)) dx = (1/2) ln(1/|y|),  −1 < y < 1,
so E[X | Y = y] = ∫_{|y|}^{1} x · f(x, y)/f_Y(y) dx = (1/ln(1/|y|)) ∫_{|y|}^{1} dx = (1 − |y|)/ln(1/|y|),
which is not a linear function of y; hence the regression of X on Y is not linear.
TRANSFORMATION OF RANDOM VARIABLES
Result
Assume that U = g(X, Y), and we are required to find the
p.d.f of U. We can use the above transformation method
by defining an auxiliary function W = X or Y so we can
obtain the joint PDF f UW ( u, w ) of U and W.
Then we obtain the required marginal PDF f_U(u) as follows :
f_U(u) = ∫ f_UW(u, w) dw.
Example :
If X and Y are independent RVs with pdfs e^{−x}, x ≥ 0, and e^{−y}, y ≥ 0,
respectively, find the density functions of U = X/(X + Y) and V = X + Y.
Are U and V independent?

Solution :
Since X and Y are independent, f_XY(x, y) = f_X(x) f_Y(y) = e^{−(x+y)}.
Solving the equations u = x/(x + y) and v = x + y, we get x = uv and y = v(1 − u).

J = ∂(x, y)/∂(u, v) = | ∂x/∂u  ∂x/∂v ; ∂y/∂u  ∂y/∂v |
                    = | v  u ; −v  (1 − u) | = v(1 − u) + uv = v.

f_UV(u, v) = f_XY(x, y) |J| = e^{−(x+y)} v = v e^{−v},  0 ≤ u ≤ 1, v ≥ 0.

f_U(u) = ∫_0^∞ v e^{−v} dv = 1,  0 ≤ u ≤ 1.
f_V(v) = ∫_0^1 v e^{−v} du = v e^{−v} (1 − 0) = v e^{−v},  v ≥ 0.

Now f_U(u) f_V(v) = (1) v e^{−v} = v e^{−v} = f_UV(u, v).
Therefore, U and V are independent RVs.
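The conclusion can also be checked by simulation. The sketch below draws X, Y from Exp(1), forms U and V, and checks that U behaves like Uniform(0, 1) (mean 1/2), that V behaves like the density v e^{−v} (mean 2), and that the sample covariance of U and V is near zero; the sample size N = 200,000 is an arbitrary choice.

```python
import random

# Monte Carlo sketch: X, Y independent Exp(1); U = X/(X+Y), V = X+Y
random.seed(1)
N = 200_000
us, vs = [], []
for _ in range(N):
    x, y = random.expovariate(1.0), random.expovariate(1.0)
    us.append(x / (x + y))
    vs.append(x + y)

mean_u = sum(us) / N  # U ~ Uniform(0,1): mean about 0.5
mean_v = sum(vs) / N  # V has density v e^{-v}: mean about 2
print(round(mean_u, 2), round(mean_v, 2))

# Sample covariance of U and V; near 0, consistent with independence
cov_uv = sum((u - mean_u) * (v - mean_v) for u, v in zip(us, vs)) / N
print(abs(cov_uv) < 0.02)
```

A zero covariance alone does not prove independence, but combined with the factorization f_UV(u, v) = f_U(u) f_V(v) derived above, the simulation is a useful cross-check.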
r = √(x² + y²) and θ = tan⁻¹(y/x) are the transformations from Cartesians to polars.
Therefore, the inverse transformations are given by x = r cos θ and y = r sin θ, and

J = ∂(x, y)/∂(r, θ) = | ∂x/∂r  ∂x/∂θ ; ∂y/∂r  ∂y/∂θ |
                    = | cos θ  −r sin θ ; sin θ  r cos θ | = r(cos²θ + sin²θ) = r.
f_R(r) = ∫_0^{2π} (r/(2πσ²)) e^{−r²/2σ²} dθ = (r/(2πσ²)) e^{−r²/2σ²} (θ)_0^{2π}
       = (r/σ²) e^{−r²/2σ²},  r ≥ 0.

f_Θ(θ) = ∫_0^∞ (r/(2πσ²)) e^{−r²/2σ²} dr.

Putting t = r²/(2σ²), so that dt = (r/σ²) dr, we get

f_Θ(θ) = (1/(2π)) ∫_0^∞ e^{−t} dt = (1/(2π)) [ e^{−t}/(−1) ]_0^∞ = 1/(2π),  0 ≤ θ ≤ 2π.
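The Rayleigh density for R and the uniform density for Θ can be checked by simulating two independent N(0, σ²) variables and transforming to polars; σ = 2 below is an arbitrary choice. The Rayleigh mean is σ√(π/2) and the uniform Θ on (0, 2π) has mean π.

```python
import math
import random

# Simulation sketch: X, Y ~ N(0, sigma^2) independent;
# R = sqrt(X^2 + Y^2), Theta = polar angle folded into [0, 2*pi)
random.seed(7)
sigma, N = 2.0, 200_000
rs, thetas = [], []
for _ in range(N):
    x, y = random.gauss(0, sigma), random.gauss(0, sigma)
    rs.append(math.hypot(x, y))
    thetas.append(math.atan2(y, x) % (2 * math.pi))

mean_r = sum(rs) / N          # Rayleigh mean: sigma * sqrt(pi/2) ≈ 2.51
mean_theta = sum(thetas) / N  # Uniform on (0, 2*pi): mean pi ≈ 3.14
print(round(mean_r, 2), round(mean_theta, 2))
```

Both sample means land close to the theoretical values, consistent with R being Rayleigh and Θ being uniform.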