F_Regression
Regression Models
• Simple regression – one independent variable:
Y = β0 + β1X1
• Multiple regression – more than one independent variable.
• Multiple linear regression – refers to a regression model with
more than one independent variable:
Y = β0 + β1X1 + β2X2 + ... + βkXk
• Nonlinear regression, e.g.
Y = β0 + β1X1 + β2X1²
Linear Regression
The null and alternative hypotheses for the SLR model can be
stated as follows:
H0: There is no relationship between X and Y
HA: There is a relationship between X and Y
Simple Linear Regression
• Managerial decisions often are based on the relationship between two or
more variables.
• Regression analysis can be used to develop an equation showing how the
variables are related.
• The variable being predicted is called the dependent variable and is denoted
by y.
• The variables being used to predict the value of the dependent variable are
called the independent variables and are denoted by x.
Simple Linear Regression
• Simple linear regression involves one independent variable and one
dependent variable.
• The relationship between the two variables is approximated by a
straight line.
• Regression analysis involving two or more independent variables is
called multiple regression.
Simple Linear Regression Model
• The equation that describes how y is related to x and an error term is called
the regression model.
• The simple linear regression model is:
y = β0 + β1x + ε
where:
β0 and β1 are called parameters of the model,
ε is a random variable called the error term.
Simple Linear Regression Equation
• The simple linear regression equation is:
E(y) = β0 + β1x
• Graph of the regression equation is a straight line.
• β0 is the y-intercept of the regression line.
• β1 is the slope of the regression line.
• E(y) is the expected value of y for a given x value.
Simple Linear Regression Equation
• Positive Linear Relationship
[Graph: regression line rising in the (x, E(y)) plane from intercept β0; slope β1 is positive]
Simple Linear Regression Equation
• Negative Linear Relationship
[Graph: regression line falling in the (x, E(y)) plane from intercept β0; slope β1 is negative]
Simple Linear Regression Equation
• No Relationship
[Graph: horizontal regression line in the (x, E(y)) plane; slope β1 is 0]
Estimated Simple Linear Regression Equation
• The estimated simple linear regression equation
ŷ = b0 + b1x
• The graph is called the estimated regression line.
• b0 is the y-intercept of the line.
• b1 is the slope of the line.
• ŷ is the estimated value of y for a given x value.
Estimation Process
Regression Model: y = β0 + β1x + ε
Regression Equation: E(y) = β0 + β1x
Unknown Parameters: β0, β1
Sample Data: (x1, y1), ..., (xn, yn)
Sample Statistics: b0, b1
Estimated Regression Equation: ŷ = b0 + b1x
b0 and b1 provide estimates of β0 and β1.
Least Squares Method
• Least Squares Criterion
min Σ(yi − ŷi)²
where:
yi = observed value of the dependent variable for the ith observation
ŷi = estimated value of the dependent variable for the ith observation
Least Squares Method
• Slope for the Estimated Regression Equation
b1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)²
where:
xi = value of independent variable for the ith observation
yi = value of dependent variable for the ith observation
x̄ = mean value of the independent variable
ȳ = mean value of the dependent variable
Least Squares Method
• y-Intercept for the Estimated Regression Equation
b0 = ȳ − b1x̄
Simple Linear Regression
• Example: Reed Auto Sales
Reed Auto periodically has a special week-long sale. As part of
the advertising campaign Reed runs one or more television
commercials during the weekend preceding the sale. Data from a
sample of 5 previous sales are shown on the next slide.
Simple Linear Regression
• Example: Reed Auto Sales
Number of TV Ads (x)    Number of Cars Sold (y)
1                       14
3                       24
2                       18
1                       17
3                       27
Σx = 10                 Σy = 100
x̄ = 2                   ȳ = 20
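The least squares formulas can be applied to this sample in a short script (a minimal sketch in Python; variable names are illustrative):

```python
# Reed Auto sample data from the slide
x = [1, 3, 2, 1, 3]       # number of TV ads
y = [14, 24, 18, 17, 27]  # number of cars sold

n = len(x)
x_bar = sum(x) / n        # mean of x = 2
y_bar = sum(y) / n        # mean of y = 20

# Least squares slope: b1 = sum((xi - x_bar)(yi - y_bar)) / sum((xi - x_bar)^2)
num = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
den = sum((xi - x_bar) ** 2 for xi in x)
b1 = num / den            # 20 / 4 = 5.0

# Least squares intercept: b0 = y_bar - b1 * x_bar
b0 = y_bar - b1 * x_bar   # 20 - 5(2) = 10.0

print(f"y-hat = {b0} + {b1}x")
```

This reproduces the slope and intercept computed on the next slides.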
Estimated Regression Equation
• Slope for the Estimated Regression Equation
b1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)² = 20/4 = 5
• y-Intercept for the Estimated Regression Equation
b0 = ȳ − b1x̄ = 20 − 5(2) = 10
• Estimated Regression Equation
ŷ = 10 + 5x
Relationship Among SST, SSR, SSE
SST = SSR + SSE
where:
SST = total sum of squares
SSR = sum of squares due to regression
SSE = sum of squares due to error
Coefficient of Determination
• The coefficient of determination is:
r2 = SSR/SST
where:
SSR = sum of squares due to regression
SST = total sum of squares
Coefficient of Determination
r² = SSR/SST = 100/114 = .8772
[Scatter plot of cars sold vs. TV ads with fitted line ŷ = 5x + 10, R² = .8772]
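The sums of squares behind this r² can be verified directly (a sketch in Python using the Reed Auto data):

```python
x = [1, 3, 2, 1, 3]
y = [14, 24, 18, 17, 27]
b0, b1 = 10, 5                       # from the least squares fit
y_bar = sum(y) / len(y)              # 20

y_hat = [b0 + b1 * xi for xi in x]   # predicted values

sst = sum((yi - y_bar) ** 2 for yi in y)               # total sum of squares = 114
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)           # regression sum of squares = 100
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # error sum of squares = 14

r2 = ssr / sst
print(round(r2, 4))  # 0.8772
```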
Sample Correlation Coefficient
rxy = (sign of b1)√r²
where:
b1 = the slope of the estimated regression equation ŷ = b0 + b1x
Sample Correlation Coefficient
rxy = (sign of b1)√r² = +√.8772
rxy = +.9366
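The sign transfer from b1 can be done with `math.copysign` (a minimal sketch in Python):

```python
import math

b1 = 5          # slope of the estimated regression equation (positive)
r2 = 100 / 114  # coefficient of determination from the Reed Auto example

# Sample correlation coefficient: (sign of b1) * sqrt(r^2)
r_xy = math.copysign(math.sqrt(r2), b1)
print(round(r_xy, 4))  # 0.9366
```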
Assumptions About the Error Term
1. The error ε is a random variable with mean of zero.
2. The variance of ε, denoted by σ², is the same for all values of the
independent variable.
3. The values of ε are independent.
4. The error ε is a normally distributed random variable.
Testing for Significance
• To test for a significant regression relationship, we must conduct a
hypothesis test to determine whether the value of β1 is zero.
• Two tests are commonly used: the t test and the F test.
• Both the t test and the F test require an estimate of σ², the variance of ε in the
regression model.
Testing for Significance
• An Estimate of σ²
The mean square error (MSE) provides the estimate of σ², and the notation
s² is also used.
s² = MSE = SSE/(n − 2)
• An Estimate of σ
s = √MSE = √(SSE/(n − 2))
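For the Reed Auto example, these estimates work out as follows (a sketch in Python):

```python
import math

sse, n = 14, 5       # SSE and sample size from the Reed Auto example

mse = sse / (n - 2)  # s^2 = MSE = SSE/(n - 2)
s = math.sqrt(mse)   # standard error of estimate

print(round(mse, 4), round(s, 4))  # 4.6667 2.1602
```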
Testing for Significance: t Test
• Hypotheses
H0: β1 = 0
Ha: β1 ≠ 0
• Test Statistic
t = b1/s_b1    where    s_b1 = s/√Σ(xi − x̄)²
Testing for Significance: t Test
• Rejection Rule
Reject H0 if p-value < α
or t ≤ −tα/2 or t ≥ tα/2
where:
tα/2 is based on a t distribution
with n − 2 degrees of freedom
Testing for Significance: t Test
1. Determine the hypotheses. H0: β1 = 0
Ha: β1 ≠ 0
2. Specify the level of significance α.
3. Select the test statistic. t = b1/s_b1
Confidence Interval for β1
• The form of a confidence interval for β1 is:
b1 ± tα/2 s_b1
where
b1 is the point estimator,
tα/2 s_b1 is the margin of error, and
tα/2 is the t value providing an area of
α/2 in the upper tail of a t distribution
with n − 2 degrees of freedom
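For the Reed Auto example this interval can be computed directly (a sketch in Python; the critical value 3.182 is the tabulated t value for α/2 = .025 with 3 degrees of freedom):

```python
b1, s_b1 = 5, 1.0801   # slope and its standard error (Reed Auto)
t_025 = 3.182          # t value, area .025 in the upper tail, 3 d.f. (t table)

margin = t_025 * s_b1  # margin of error
lo, hi = b1 - margin, b1 + margin

print(round(lo, 2), round(hi, 2))  # 1.56 8.44
```

Because 0 falls outside (1.56, 8.44), the interval leads to rejecting H0: β1 = 0.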
Confidence Interval for 1
• Rejection Rule
Reject H0 if 0 is not included in the confidence interval for β1.
• Conclusion
0 is not included in the confidence interval. Reject H0.
Testing for Significance: F Test
• Hypotheses
H0: β1 = 0
Ha: β1 ≠ 0
• Test Statistic
F = MSR/MSE
Testing for Significance: F Test
• Rejection Rule
Reject H0 if
p-value < α
or F ≥ Fα
where:
Fα is based on an F distribution with
1 degree of freedom in the numerator and
n − 2 degrees of freedom in the denominator
Testing for Significance: F Test
1. Determine the hypotheses. H0: β1 = 0
Ha: β1 ≠ 0
F = 17.44 provides an area of .025 in the upper tail of the F distribution
with 1 and 3 degrees of freedom. Thus, the p-value corresponding to the test
statistic F = 21.43 is less than .025. Hence, we reject H0.
The statistical evidence is sufficient to conclude that we have a significant
relationship between the number of TV ads aired and the number of cars
sold.
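The F statistic for the Reed Auto example follows directly from the sums of squares (a sketch in Python; the critical value 17.44 is the tabulated F value quoted above):

```python
ssr, sse, n = 100, 14, 5  # sums of squares and sample size (Reed Auto)

msr = ssr / 1             # mean square regression (1 numerator d.f.)
mse = sse / (n - 2)       # mean square error (n - 2 = 3 denominator d.f.)
F = msr / mse

print(round(F, 2))        # 21.43
# Since 21.43 > 17.44 (F value for area .025, d.f. 1 and 3, from an F table),
# the p-value is below .025 and H0: beta1 = 0 is rejected.
```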
Some Cautions about the
Interpretation of Significance Tests
• Rejecting H0: β1 = 0 and concluding that the relationship between x and y
is significant does not enable us to conclude that a cause-and-effect
relationship is present between x and y.
• Interpretation of β0 and β1 in Y = β0 + β1X
The above measures and tests are essential, but not exhaustive.
Coefficient of Determination (R-Square or R2)
Yi = β0 + β1Xi + εi
Variation in Y = Variation in Y explained by the model + Variation in Y not
explained by the model
In the absence of a predictive model for Yi, users will use the mean value of Y. Thus, the
total variation is measured as the difference between Yi and the mean value of Y (i.e., Yi − Ȳ).
Description of total variation, explained variation and unexplained variation:
• Total variation: the difference between the actual value Yi and the mean
value of Y (Yi − Ȳ).
• Variation explained by the model: the difference between the predicted
value Ŷi and the mean value of Y (Ŷi − Ȳ).
• Variation not explained by the model: the difference between the actual
value and the predicted value of Yi (Yi − Ŷi), the error in prediction.
The relationship between the total variation, explained variation and the unexplained variation is
given as follows:
(Yi − Ȳ) = (Ŷi − Ȳ) + (Yi − Ŷi)
Total variation in Y = Variation in Y explained by the model + Variation in Y not explained by the model
It can be proved mathematically that the sum of squares of total variation is equal to the sum of
squares of explained variation plus the sum of squares of unexplained variation:
Σi=1..n (Yi − Ȳ)² = Σi=1..n (Ŷi − Ȳ)² + Σi=1..n (Yi − Ŷi)²
SST = SSR + SSE
where SST is the sum of squares of total variation, SSR is the sum of squares of variation explained
by the regression model and SSE is the sum of squares of errors or unexplained variation.
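This decomposition can be checked numerically on the earlier Reed Auto data (a sketch in Python):

```python
x = [1, 3, 2, 1, 3]
y = [14, 24, 18, 17, 27]
y_bar = sum(y) / len(y)            # mean of Y
y_hat = [10 + 5 * xi for xi in x]  # fitted values from the estimated equation

sst = sum((yi - y_bar) ** 2 for yi in y)               # total variation
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)           # explained variation
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # unexplained variation

print(sst, ssr + sse)  # 114.0 114.0 -- SST = SSR + SSE
```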
Coefficient of Determination or R-Square
The coefficient of determination (R²) is given by
R² = Explained variation / Total variation = SSR/SST
= Σ(Ŷi − Ȳ)² / Σ(Yi − Ȳ)²
2004 1 2
2005 6 2
2006 12 2
2007 58 2
2008 145 11
2009 360 21
2010 608 31
2011 845 40
2012 1056 51
Hypothesis Test for Regression Co-efficient (t-Test)
In the equation above, Se is the standard error of estimate (or standard error of the
residuals) that measures the accuracy of prediction and is given by
Se = √( Σi=1..n (Yi − Ŷi)² / (n − 2) ) = √( Σi=1..n εi² / (n − 2) )
The standard error of the estimate of β1 is
Se(β̂1) = Se / √( Σ (Xi − X̄)² )    (1)
The null and alternative hypotheses for the SLR model can be
stated as follows:
H0: There is no relationship between X and Y
HA: There is a relationship between X and Y
• β1 = 0 would imply that there is no linear relationship between
the response variable Y and the explanatory variable X. Thus, the
null and alternative hypotheses can be restated as follows:
H0: β1 = 0
HA: β1 ≠ 0
• The corresponding t-statistic is given as
t = (β̂1 − β1) / Se(β̂1) = (β̂1 − 0) / Se(β̂1) = β̂1 / Se(β̂1)
Test for Overall Model: Analysis of Variance (F-test)