Chapter 10 Time Series Regression Models

This document discusses techniques for dealing with autocorrelation in time series regression models. It covers detecting autocorrelation using the Durbin-Watson test and comparing the test statistic to bounds tables. It also discusses some methods for eliminating autocorrelation such as adding lagged dependent or independent variables to the regression model.

11/16/2023

Chapter 10. REGRESSION WITH TIME SERIES DATA

Nguyen VP Nguyen, Ph.D.


Department of Industrial & Systems Engineering, HCMUT
Email: nguyennvp@hcmut.edu.vn

TIME SERIES DATA AND THE PROBLEM OF AUTOCORRELATION

• All standard regression models assume that the errors, ε, are independent (uncorrelated) random variables. That means:
 Values of the response variable Y can be related to the values of the predictor variables, the X's, but not to one another.
 Different values of the response variable Y are not related to one another (Yt is not related to Yt-1, Yt-2, and so on).
 The usual inferences depend on this assumption of independence.
• With time series data:
 The assumption of independence seldom holds.
 Autocorrelation can occur because the effect of a predictor variable on the response is distributed over time.
 Demand data in different years are autocorrelated; they are not independent (Yt is related to Yt-1, Yt-2, and so on).

TIME SERIES DATA AND THE PROBLEM OF AUTOCORRELATION

• For example, the new car price in the current year is related to (correlated with) the price in the previous year, and perhaps the price two years ago, and so forth.
• From a forecasting perspective, autocorrelation is not all bad:
 Previous Y's can be used to predict future Y's; to do this, we need to "fix up" the standard regression model.
• Ways to modify the model to accommodate the autocorrelation:
 Change the mix of predictor variables and/or the form of the regression function (often does not help).
 Change the nature of the error term ε (the typical approach).

First-order serial correlation

• The error term in the current time period is directly related to the error term in the previous time period:

εt = ρ εt-1 + νt

where the νt are independent random errors. The magnitude of the autocorrelation coefficient ρ, where -1 < ρ < 1, indicates the strength of the serial correlation: the level of one error term εt-1 directly affects the level of the next error term εt.
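As an illustrative sketch (not from the slides), first-order serially correlated errors can be simulated directly from the defining recursion; the ρ, n, and seed values below are arbitrary choices:

```python
import numpy as np

def simulate_ar1_errors(rho, n, sigma=1.0, seed=0):
    """Simulate first-order serially correlated errors:
    eps[t] = rho * eps[t-1] + nu[t], with nu[t] ~ N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    nu = rng.normal(0.0, sigma, n)
    eps = np.zeros(n)
    eps[0] = nu[0]
    for t in range(1, n):
        eps[t] = rho * eps[t - 1] + nu[t]
    return eps

eps = simulate_ar1_errors(rho=0.9, n=500)
# The lag-1 sample autocorrelation should be close to rho.
lag1 = np.corrcoef(eps[:-1], eps[1:])[0, 1]
```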


• The true relation between Y and X, indicated by the solid line in the figure, is increasing over time.
• If the first Y value is above the true regression line, then the next several Y values are likely to be above the line because of the positive autocorrelation.
• So there may be a long sequence of Y's above the true regression line (positive errors followed by positive errors) and, similarly, a sequence of Y's below the line (negative errors followed by negative errors).

Strong autocorrelation

• Strong autocorrelation can make two unrelated variables appear to be related.
 These two series were formed in such a way that the first series Yt is not related to the second series Xt.
 Yet the two time series appear to move together.
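The phenomenon on this slide can be reproduced with a small simulation (my own illustrative sketch, not the slide's data): regressing one random walk on an independent random walk typically yields a deceptively good-looking fit, while the Durbin-Watson statistic of the residuals exposes the problem:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
# Two *independent* random walks: each is strongly autocorrelated,
# but there is no true relationship between them.
y = np.cumsum(rng.normal(size=n))
x = np.cumsum(rng.normal(size=n))

# Ordinary least squares of y on x (with intercept).
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
r2 = 1.0 - resid.var() / y.var()  # often deceptively far from zero

# Durbin-Watson statistic of the residuals: a value near 0 signals
# strong positive serial correlation, flagging a spurious regression.
dw = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)
```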


Strong autocorrelation

• Each sequence of observations is highly autocorrelated.

Strong autocorrelation

• It might be possible to relate the Yt series to the Xt series using a simple linear regression model.


Strong autocorrelation

• The residuals would reveal problems with this regression analysis: the assumption of independent errors is violated.

Strong autocorrelation

• The apparent statistical significance is produced by the strong autocorrelation, not by a true relationship between Yt and Xt.


Technical problems with autocorrelated data


• If we do not fix the problem of autocorrelated time series errors, we can reach faulty or unjustified conclusions and obtain spurious regressions:
 The standard error of the estimate (SEreg) can seriously underestimate the variability of the error terms.
 The usual inferences based on the t and F statistics are no longer strictly applicable.
 The standard errors of the regression coefficients (SEbi) underestimate the variability of the estimated regression coefficients; spurious regressions can result.
• If regression models are used with autocorrelated (time series) data, it is especially important to examine the residuals.

DETECTING AUTOCORRELATION BY DW TEST

• Method 1: The Durbin-Watson (DW) statistic is computed directly from the residuals and is useful for detecting first-order serial correlation (lag 1 autocorrelation).

The Durbin-Watson statistic:

DW = Σ (et − et-1)² / Σ et²

with the numerator summed over t = 2, …, n and the denominator over t = 1, …, n.

• Small values of the DW statistic are consistent with positive serial correlation; values near 2 are consistent with independent errors.
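A minimal implementation of the DW formula above (an illustrative sketch, not code from the chapter):

```python
import numpy as np

def durbin_watson(residuals):
    """DW = sum_{t=2..n} (e_t - e_{t-1})^2 / sum_{t=1..n} e_t^2.
    Ranges over [0, 4]; values near 2 suggest no lag-1 autocorrelation,
    small values suggest positive serial correlation."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

rng = np.random.default_rng(1)
dw_white = durbin_watson(rng.normal(size=1000))            # near 2
dw_walk = durbin_watson(np.cumsum(rng.normal(size=1000)))  # near 0
```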


DETECTING AUTOCORRELATION BY DW TEST

• Method 2: Compare the calculated value of the Durbin-Watson statistic with lower (dL) and upper (dU) bounds from the Durbin-Watson bounds tables.
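The bounds comparison can be expressed as a small decision function (my own sketch); the dL/dU values in the checks below are the ones quoted in the chapter's examples:

```python
def dw_decision(dw, d_lower, d_upper):
    """Bounds test of H0: no positive first-order serial correlation,
    against the positive-correlation alternative."""
    if dw < d_lower:
        return "reject H0: positive serial correlation"
    if dw > d_upper:
        return "fail to reject H0"
    return "inconclusive"

# Reynolds-style bounds (n=21, k=1, 1%): dL=0.97, dU=1.16.
# Any DW below dL rejects H0 (the 0.50 here is illustrative).
reynolds = dw_decision(0.50, 0.97, 1.16)
# Novak with added variable (n=17, k=2, 1%): DW=1.98 vs dL=0.77, dU=1.25.
novak = dw_decision(1.98, 0.77, 1.25)
# Sears differenced model (n=21, k=1, 5%): DW=1.27 vs dL=1.22, dU=1.42.
sears = dw_decision(1.27, 1.22, 1.42)
```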


DETECTING AUTOCORRELATION BY DW TEST


Durbin-Watson Test Bounds


DETECTING AUTOCORRELATION BY DW TEST

• Example 2: Projecting future sales for Reynolds Metals Company.
 n = 21, k = 1, α = 1%: dL = 0.97, dU = 1.16.
 DW < dL, so reject H0: the errors are positively correlated.
 The regression model should be modified before it is used for forecasting.


SOLUTIONS TO AUTOCORRELATION PROBLEMS

• After autocorrelation has been discovered in a regression of time series data, it is necessary to remove it or model it.
• Step 1: Evaluate the model specification.
 Is the functional form correct?
 Were any important variables omitted?
 Are there effects that might have some pattern over time that could have introduced autocorrelation into the errors?


Techniques for eliminating autocorrelation

• Step 2 (Approach 1): Add to the regression function an omitted variable X that explains the association in the response from one period to the next (Yt with Yt-1, Yt-2, and so on).
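One common version of this approach is to add last period's response Yt-1 as a predictor. The sketch below (my own illustration, with made-up data) fits such a model by ordinary least squares and recovers the lag coefficient:

```python
import numpy as np

def fit_with_lagged_y(y, x):
    """OLS of Y_t on an intercept, X_t, and the lagged response Y_{t-1}.
    Returns [intercept, b_x, b_ylag]."""
    y = np.asarray(y, dtype=float)
    x = np.asarray(x, dtype=float)
    Y = y[1:]
    X = np.column_stack([np.ones(len(Y)), x[1:], y[:-1]])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return beta

# Made-up data in which Y truly depends on X_t and Y_{t-1}.
rng = np.random.default_rng(5)
n = 400
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 1.0 + 0.5 * x[t] + 0.6 * y[t - 1] + rng.normal(0.0, 0.1)

beta = fit_with_lagged_y(y, x)  # should roughly recover [1.0, 0.5, 0.6]
```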


Techniques for eliminating autocorrelation

• Example 3: Projecting future sales for the Novak Corporation.
 n = 17, k = 1, α = 1%: dL = 0.87, dU = 1.10.
 DW < dL, so reject H0: the errors are positively correlated.
 A key variable that accounts for the remaining association in sales from one year to the next may be missing from the model.


Techniques for eliminating autocorrelation

 With the added variable: n = 17, k = 2, α = 1%: dL = 0.77, dU = 1.25.
 DW = 1.98 > dU = 1.25, so fail to reject H0: no evidence of first-order serial correlation.
 The fitted function Ŷ = -0.014 + 0.03X1 - 0.35X2 can be used to predict Novak sales.


Techniques for eliminating autocorrelation

• Step 2 (Approach 2): Differencing.
 For highly autocorrelated data, modeling changes rather than levels can often eliminate the serial correlation.
 The regression model is specified in terms of changes (generalized differences) rather than levels:

Y′t = Yt − ρYt-1,  X′t = Xt − ρXt-1

where the "prime" indicates the generalized differences (ρ = 1 gives ordinary first differences).
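A generalized-difference transform is one line of code; the demonstration below (an illustrative sketch with simulated data) shows that differencing with the true ρ removes the lag-1 autocorrelation:

```python
import numpy as np

def generalized_differences(z, rho):
    """z'_t = z_t - rho * z_{t-1}, for t = 2..n (drops the first value).
    rho = 1 gives ordinary first differences."""
    z = np.asarray(z, dtype=float)
    return z[1:] - rho * z[:-1]

# Simulated AR(1) errors with rho = 0.8 (illustrative data).
rng = np.random.default_rng(2)
n = 2000
nu = rng.normal(size=n)
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.8 * eps[t - 1] + nu[t]

def lag1_corr(z):
    return np.corrcoef(z[:-1], z[1:])[0, 1]

before = lag1_corr(eps)                               # near 0.8
after = lag1_corr(generalized_differences(eps, 0.8))  # near 0
```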


Techniques for eliminating autocorrelation

• Example 4: Forecasting Sears Roebuck sales (in thousands of dollars).
 Relate sales to disposable income using a log-linear regression model, in order to estimate the income elasticity of sales.

The log-linear regression model:

ln(Yt) = β0 + β1 ln(Xt) + εt

where Yt is sales, Xt is disposable income, and the slope β1 is the income elasticity.
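To make the elasticity interpretation concrete, the sketch below fits a log-linear model to made-up data (not the Sears data) in which the true elasticity is 1.3:

```python
import numpy as np

# Made-up sales/income data with a true income elasticity of 1.3:
# sales = 2 * income^1.3 * multiplicative noise.
rng = np.random.default_rng(3)
income = np.linspace(100.0, 500.0, 50)
sales = 2.0 * income ** 1.3 * np.exp(rng.normal(0.0, 0.02, 50))

# Fit ln(sales) = b0 + b1 * ln(income) by OLS; b1 is the elasticity.
X = np.column_stack([np.ones(50), np.log(income)])
b, *_ = np.linalg.lstsq(X, np.log(sales), rcond=None)
elasticity = b[1]  # should be close to 1.3
```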

Techniques for eliminating autocorrelation

 For the levels regression: n = 21, k = 1, α = 1%: dL = 0.97, dU = 1.16.
 The DW statistic of 0.49 is smaller than dL = 0.97, so the correlation between successive errors is positive and large (close to 1).
 Model the changes (differences) in the logarithms of sales and income instead.


Techniques for eliminating autocorrelation

 For the differenced regression: n = 21, k = 1, α = 5%: dL = 1.22, dU = 1.42.
 dL = 1.22 < DW = 1.27 < dU = 1.42, so the test for positive serial correlation is inconclusive.
 Check the residual autocorrelations.


Techniques for eliminating autocorrelation

 All the residual autocorrelations rk lie well within their two-standard-error limits (±2/√n) for the first few lags.
 We can conclude that the serial correlation has been eliminated.
 The fitted equation can now be used for forecasting.
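The residual-autocorrelation check can be coded directly; this sketch (my own, not the chapter's) computes rk for the first 10 lags along with the ±2/√n limits:

```python
import numpy as np

def acf(x, nlags):
    """Sample autocorrelations r_1 .. r_nlags of a series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.sum(x ** 2)
    return np.array([np.sum(x[k:] * x[:-k]) / denom
                     for k in range(1, nlags + 1)])

# For residuals that behave like white noise, roughly 95% of the r_k
# should fall inside +/- 2/sqrt(n).
rng = np.random.default_rng(7)
resid = rng.normal(size=400)   # stand-in for regression residuals
r = acf(resid, 10)
limit = 2.0 / np.sqrt(len(resid))
```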


Techniques for eliminating autocorrelation


a log linear regression model:

The forecast for Sears sales in 1997 was obtained by :

taking antilogs:
30

15
