
Time Series Analysis

Fundamental Concepts

Dr. Riadh Aloui

Tunis Business School

October 2022

Dr. Riadh Aloui (Institute) Time Series - Lecture 3 10/22 1 / 21


Stochastic Process

A sequence of random variables indexed by time.

$\{Y_t : t = 0, 1, 2, \ldots\}$

Alternative definition: a probability distribution over a space of paths.


If the joint distributions of the $Y_t$'s are multivariate normal
distributions, then the first and second moments, $E(X)$ and $E(X^2)$,
completely determine all the joint distributions.




Means, Variances, and Covariances
A reminder

Let X have probability density function $f(x)$ and let the pair $(X, Y)$
have joint probability density function $f(x, y)$.

1. The mean or expected value of X:
   $\mu_X = E(X) = \int_{-\infty}^{\infty} x f(x)\,dx$
2. The variance: $\sigma_X^2 = Var(X) = E[(X - E(X))^2]$. Exercise: find a more
   suitable form for $Var(X)$.
3. The covariance: $Cov(X, Y) = E[(X - \mu_X)(Y - \mu_Y)]$
4. The correlation: $Corr(X, Y) = \dfrac{Cov(X, Y)}{\sqrt{Var(X)\,Var(Y)}}$

See Appendix A, p. 24 for more details.

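These definitions are easy to verify numerically. The sketch below is an added illustration (not from the lecture; the distribution parameters are arbitrary), and it also previews one answer to the exercise above: the "more suitable form" $Var(X) = E(X^2) - [E(X)]^2$.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two linearly related samples, purely for illustration.
x = rng.normal(loc=2.0, scale=1.5, size=100_000)
y = 0.5 * x + rng.normal(size=100_000)

mean_x = x.mean()                                  # estimates E(X)
var_x = np.mean((x - mean_x) ** 2)                 # E[(X - E(X))^2]
var_x_alt = np.mean(x ** 2) - mean_x ** 2          # E(X^2) - [E(X)]^2
cov_xy = np.mean((x - mean_x) * (y - y.mean()))    # E[(X - mu_X)(Y - mu_Y)]
var_y = np.mean((y - y.mean()) ** 2)
corr_xy = cov_xy / np.sqrt(var_x * var_y)          # always lies in [-1, 1]
```

The two variance formulas agree to floating-point precision, and the sample moments land close to the population values $E(X) = 2$, $Var(X) = 2.25$.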



Autocovariance, Autocorrelation

For a stochastic process $\{Y_t : t = 0, 1, 2, \ldots\}$:

1. The mean function: $\mu_t = E(Y_t)$ for $t = 0, 1, 2, \ldots$
2. The autocovariance function (ACVF):
   $\gamma_{t,s} = Cov(Y_t, Y_s) = E[(Y_t - \mu_t)(Y_s - \mu_s)] = E(Y_t Y_s) - \mu_t \mu_s$
3. The autocorrelation function (ACF):
   $\rho_{t,s} = Corr(Y_t, Y_s) = \dfrac{Cov(Y_t, Y_s)}{\sqrt{Var(Y_t)\,Var(Y_s)}} = \dfrac{\gamma_{t,s}}{\sqrt{\gamma_{t,t}\,\gamma_{s,s}}}$

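For a single observed series, the sample versions of these functions can be computed as in the sketch below (an added illustration; `sample_acf` is a hypothetical helper name, and the estimator shown is the usual one with the full-sample mean and a $1/n$ normalization at every lag):

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample autocorrelations rho_k = gamma_k / gamma_0 for lags 0..max_lag."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    y_centered = y - y.mean()
    # Sample autocovariances gamma_0, ..., gamma_max_lag.
    gamma = np.array([np.sum(y_centered[k:] * y_centered[:n - k]) / n
                      for k in range(max_lag + 1)])
    return gamma / gamma[0]

rng = np.random.default_rng(0)
acf = sample_acf(rng.normal(size=5000), max_lag=5)
print(acf[0])  # 1.0 -- the lag-0 autocorrelation is always exactly 1
```

For an i.i.d. series like this one, the lag-1 to lag-5 sample autocorrelations are all close to zero.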


Autocovariance

Lack of independence between adjacent values $Y_t$ and $Y_s$ in a time
series can be numerically assessed.
Autocovariance function: $\gamma_{t,s} = Cov(Y_t, Y_s)$
Note that $\gamma_{t,s} = \gamma_{s,t}$ for all time points s and t.
The autocovariance measures the linear dependence between two
points on the same series observed at different times.
Very smooth series exhibit autocovariance functions that stay large
even when t and s are far apart, whereas choppy series tend to
have autocovariance functions that are nearly zero for large
separations.



Autocorrelation

The ACF measures the linear predictability of $Y_t$ using only $Y_s$:

$-1 \le \rho_{t,s} \le 1$

If we can predict $Y_t$ perfectly from $Y_s$ through a linear relationship,
then the ACF will be either $+1$ or $-1$.

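The extreme case is easy to demonstrate (an added example; the coefficients $-3$ and $2$ are arbitrary): an exact decreasing linear relationship gives a correlation of exactly $-1$.

```python
import numpy as np

x = np.random.default_rng(9).normal(size=1000)
y = -3.0 * x + 2.0                 # Y is an exact linear function of X
rho = np.corrcoef(x, y)[0, 1]      # sample correlation coefficient
print(round(rho, 6))  # -1.0: perfect negative linear dependence
```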


The Random Walk

Let $Z_t$ (as always throughout these lectures) be a sequence of i.i.d.
random variables with zero mean and variance $\sigma_Z^2$. A process $Y_t$ is
said to be a random walk if

$Y_t = Y_{t-1} + Z_t$

with initial condition $Y_1 = Z_1$.

Example: share prices on successive days, option pricing...

share price on day t = share price on day $(t-1)$ + random error

Find the mean, variance, autocovariance and autocorrelation
functions of $Y_t$.
Compute $\rho_{2,3}$, $\rho_{14,15}$, $\rho_{1,30}$.




The Random Walk

$\mu_t = 0$ for all t.
$Var(Y_t) = t\sigma_Z^2$, which changes with t.
$\gamma_{t,s} = Cov(Y_t, Y_s) = t\sigma_Z^2$ for $1 \le t \le s$.
$\rho_{t,s} = \dfrac{\gamma_{t,s}}{\sqrt{\gamma_{t,t}\,\gamma_{s,s}}} = \sqrt{\dfrac{t}{s}}$ for $1 \le t \le s$.

What about the first difference?

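These moment formulas can be checked by Monte Carlo simulation, as in the added sketch below (the path count, horizon, and the choice $\sigma_Z = 2$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 2.0
n_steps, n_paths = 200, 20_000

# Y_t = Z_1 + ... + Z_t: each path is a cumulative sum of i.i.d. noise.
z = rng.normal(scale=sigma, size=(n_paths, n_steps))
y = np.cumsum(z, axis=1)

t = 100
var_t = y[:, t - 1].var()                        # theory: t * sigma^2 = 400
rho_est = np.corrcoef(y[:, 49], y[:, 99])[0, 1]  # estimates rho_{50,100}
# Theory: rho_{50,100} = sqrt(50/100), about 0.707.
```

The simulated variance at $t = 100$ lands near $400$, and the cross-time correlation near $\sqrt{1/2}$, matching $Var(Y_t) = t\sigma_Z^2$ and $\rho_{t,s} = \sqrt{t/s}$.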


Simple Moving Average

A simple moving average of span N assigns weight $\frac{1}{N}$ to each of the
most recent N observations $Y_t, Y_{t-1}, \ldots, Y_{t-N+1}$, and weight zero
to all other observations.
Useful to overlay a smoothed version of the original data and to help
reveal patterns.



Simple Moving Average

We consider a simple moving average of span N = 3:

$Y_t = \dfrac{Z_t + Z_{t-1} + Z_{t-2}}{3}$

Find the mean, variance, autocovariance and autocorrelation
functions of $Y_t$.

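One way to check an answer to this exercise numerically (an added sketch; the sample size is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
z = rng.normal(size=200_000)

# Span-3 moving average: Y_t = (Z_t + Z_{t-1} + Z_{t-2}) / 3
y = (z[2:] + z[1:-1] + z[:-2]) / 3

yc = y - y.mean()
n = len(yc)
gamma = [np.sum(yc[k:] * yc[:n - k]) / n for k in range(4)]
rho = np.array(gamma) / gamma[0]
# Theory: rho_1 = 2/3, rho_2 = 1/3, and rho_k = 0 for k >= 3,
# since Y_t and Y_{t+k} share no common Z terms once k >= 3.
```

The sample autocorrelations come out very close to $2/3$, $1/3$, and $0$ at lags 1, 2, and 3 respectively.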


Stationarity

In time series analysis our goal is to predict a series that typically is
not deterministic but contains a random component. If this random
component is stationary, then we can develop powerful techniques to
forecast its future values.
The basic idea of stationarity is that the probability laws that govern
the behavior of the process do not change over time.



Which of these TS are stationary?

[Figure: nine time series panels]
(a) Google stock price for 200 consecutive days; (b) daily change in the
Google stock price for 200 consecutive days; (c) annual number of strikes
in the US; (d) monthly sales of new one-family houses sold in the US;
(e) annual price of a dozen eggs in the US; (f) monthly total of pigs
slaughtered in Victoria, Australia; (g) annual total of lynx trapped in
the McKenzie River district of north-west Canada; (h) monthly Australian
beer production; (i) monthly Australian electricity production.
Stationarity

A process $Y_t$ is said to be strictly stationary if the joint distribution of
$Y_{t_1}, Y_{t_2}, \ldots, Y_{t_n}$ is the same as the joint distribution of
$Y_{t_1-k}, Y_{t_2-k}, \ldots, Y_{t_n-k}$ for all $t_1, t_2, \ldots, t_n$ and all choices of time lag k.
In other words, shifting the time origin by an amount k has no effect
on the joint distributions, which must therefore depend only on the
intervals between $t_1, t_2, \ldots, t_n$.



Second-order Stationarity

In practice it is often useful to define stationarity in a less restricted
way.
A process is called weakly stationary (or second-order stationary) if
its variance is finite, its mean is constant, and its autocovariance
$\gamma_{s,t}$ depends on s and t only through their difference $|s - t|$.
From now on, we will use the term stationary to mean weakly
stationary.



Remarks on Stationarity

Strict stationarity does not assume finite variance, thus strict
stationarity does NOT necessarily imply weak stationarity.
A process such as an i.i.d. Cauchy sequence is strictly stationary but
not weakly stationary (the second moment of the process is not finite).
Weak stationarity usually does not imply strict stationarity, as higher
moments of the process may depend on time t.
If a time series $Y_t$ is Gaussian (i.e. the finite-dimensional
distributions of $Y_t$ are all multivariate Gaussian), then weak
stationarity also implies strict stationarity. This is because a
multivariate Gaussian distribution is fully characterized by its first
two moments.



Remarks on Stationarity

Time series with trends, or with seasonality, are not stationary: the
trend and seasonality will affect the value of the time series at
different times.
A time series with cyclic behaviour (but with no trend or seasonality)
is stationary. This is because the cycles are not of a fixed length, so
before we observe the series we cannot be sure where the peaks and
troughs of the cycles will be.



Autocorrelation for Stationary Time Series

Since $\gamma_{s,t}$ depends on s and t only through their difference $|s - t|$,
we can rewrite the notation as $s = t + k$, where k represents the time shift.
Autocovariance function of a stationary time series:

$\gamma_{t+k,t} = Cov(Y_{t+k}, Y_t) = Cov(Y_k, Y_0) = \gamma_{k,0} = \gamma_k$

Autocorrelation function of a stationary time series:

$\rho_k = \dfrac{\gamma_{t+k,t}}{\sqrt{\gamma_{t+k,t+k}\,\gamma_{t,t}}} = \dfrac{\gamma_k}{\gamma_0}$




Some basic models
White Noise

Many useful processes can be constructed from white noise.

A white noise time series $Z_t$ is a sequence of independent, identically
distributed random variables with zero mean and variance $\sigma_Z^2$ for all t.
If the $Z_t$ are normally (Gaussian) distributed, the series is known as
Gaussian white noise.



Some basic models
White Noise, Random Walk with drift, MA(1) and AR(1)

White noise: $Z_t$
Random walk with drift: $x_t = \delta + x_{t-1} + Z_t$
Moving average MA(1): $x_t = \beta_1 Z_{t-1} + Z_t$
Autoregressive AR(1): $x_t = \alpha_1 x_{t-1} + Z_t$

What is the mean function? What is the autocovariance function? Is
the process weakly stationary?

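All four processes can be simulated directly from one white-noise sequence. The sketch below is an added illustration; the parameter values $\delta = 0.1$, $\beta_1 = 0.8$, $\alpha_1 = 0.6$ are arbitrary choices (with $|\alpha_1| < 1$ so the AR(1) recursion does not explode):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
z = rng.normal(size=n)                  # white noise Z_t
delta, beta1, alpha1 = 0.1, 0.8, 0.6    # illustrative parameter values

# Random walk with drift: x_t = delta + x_{t-1} + Z_t
rw_drift = np.cumsum(delta + z)

# MA(1): x_t = beta_1 * Z_{t-1} + Z_t  (first value uses Z_1 only)
ma1 = np.empty(n)
ma1[0] = z[0]
ma1[1:] = beta1 * z[:-1] + z[1:]

# AR(1): x_t = alpha_1 * x_{t-1} + Z_t, built recursively
ar1 = np.empty(n)
ar1[0] = z[0]
for t in range(1, n):
    ar1[t] = alpha1 * ar1[t - 1] + z[t]
```

Plotting the four paths side by side makes the contrast visible: the random walk with drift trends away from zero, while the MA(1) and AR(1) paths fluctuate around a constant mean.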



Differencing

In Slide 12, note that the Google stock price was non-stationary in
panel (a), but the daily changes were stationary in panel (b).
Differencing can help stabilise the mean of a time series by removing
changes in the level of a time series, and therefore eliminating (or
reducing) trend and seasonality.

$y'_t = y_t - y_{t-1}$

Transformations such as logarithms can also help to stabilise the
variance of a time series.

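Differencing is one line in NumPy. As an added sketch: differencing a simulated random walk recovers the underlying noise sequence, which is stationary.

```python
import numpy as np

rng = np.random.default_rng(5)
z = rng.normal(size=10_000)
y = np.cumsum(z)                   # a random walk: non-stationary

y_diff = np.diff(y)                # y'_t = y_t - y_{t-1}
# For a pure random walk, the first difference is the noise itself.
print(np.allclose(y_diff, z[1:]))  # True
```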
