LECTURE NOTES 1
TIME SERIES
• Whether we wish to predict the trend in
financial markets or electricity consumption,
time is an important factor that must be
considered in our models.
• A time series is simply a series of data points
ordered in time. In a time series, time is often
the independent variable and the goal is
usually to make a forecast for the future.
DEFINITION
• A time series is a set of observations generated sequentially in time. The observations are therefore dependent on each other, which means that we do NOT have a random sample.
• We assume that observations are equally spaced in time.
• We also assume that observations closer in time tend to have stronger dependence.
Modern approach to time series
• The modern approach treats a time series as a realization of a stochastic process.
• A stochastic process {Yt, t ∈ T} is a collection of random variables, i.e., a process that develops in time according to probabilistic laws.
• The theory of stochastic processes gives us a formal way to look at time series variables.
DEFINITION
• A discrete time series is one in which the set T0 of times at which observations are made is a discrete set. A continuous time series is obtained when observations are recorded continuously over some time interval.
EXAMPLES
• Data in business, economics, engineering, the environment, medicine, the earth sciences, and other areas of scientific investigation are often collected in the form of time series.
• Hourly temperature readings
• Daily stock prices
• Weekly traffic volume
• Annual growth rate
• Seasonal ice cream consumption
• Electrical signals
• A company X has been keeping a record of monthly shampoo sales for the past 3 years. Company X wants to forecast shampoo sales for the next 4 months so that the organization can manage the gap between demand and supply. Our main job here is simply to predict shampoo sales for the next 4 months.
• The dataset comprises only two columns: the month and the shampoo sales in that month.
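The notes do not reproduce the actual dataset or prescribe a forecasting method, so the sketch below fabricates an upward-trending monthly series and applies a simple "drift" baseline forecast; all numbers here are illustrative assumptions:

```python
import numpy as np

# Hypothetical monthly shampoo sales for 36 months (the real data are not
# given in the notes, so we fabricate a trending series plus noise).
rng = np.random.default_rng(0)
months = np.arange(36)
sales = 100 + 5 * months + rng.normal(0, 10, size=36)

# Naive drift forecast: extend the average month-over-month change.
drift = (sales[-1] - sales[0]) / (len(sales) - 1)
forecast = sales[-1] + drift * np.arange(1, 5)  # the next 4 months
print(forecast)
```

A serious forecast would model trend and seasonality explicitly; such methods appear later in the course.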
EXAMPLES
(Figures not reproduced: the original slides here showed plots of example time series such as those listed above.)
OBJECTIVES OF TIME SERIES ANALYSIS
• Understanding the dynamic or time-dependent structure of the observations of a single series (univariate analysis)
• Forecasting of future observations
• Ascertaining the leading, lagging and feedback
relationships among several series
(multivariate analysis)
STEPS IN TIME SERIES ANALYSIS
• Model Identification
– Time series plot of the series
– Check for the existence of a trend or seasonality
– Check for sharp changes in behavior
– Check for possible outliers
• Remove the trend and the seasonal component to get stationary residuals.
• Estimation (e.g., by maximum likelihood)
• Diagnostic Checking
– Normality of error terms
– Independence of error terms
– Constant error variance (homoscedasticity)
• Forecasting
– Exponential smoothing methods
– Minimum MSE forecasting
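The detrending and diagnostic steps above can be sketched numerically. The example below uses simulated data and first differencing, one of several possible ways to remove a linear trend:

```python
import numpy as np

# Simulated series with a linear trend plus i.i.d. noise.
rng = np.random.default_rng(1)
t = np.arange(200)
y = 2.0 + 0.5 * t + rng.normal(0, 1, size=200)

# Remove the trend by first differencing: dy_t = y_t - y_{t-1}.
dy = np.diff(y)

# Diagnostic check: after detrending, the mean should be stable over time.
first_half, second_half = dy[:100], dy[100:]
print(first_half.mean(), second_half.mean())  # both near the slope 0.5
```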
CHARACTERISTICS OF A SERIES
• For a time series {Yt, t = 0, ±1, ±2, …},
THE MEAN FUNCTION:
μt = E(Yt), which exists iff E|Yt| < ∞.
This is the expected value of the process at time t.
CHARACTERISTICS OF A SERIES
• THE AUTOCOVARIANCE FUNCTION:
γ(t, s) = Cov(Yt, Ys) = E[(Yt − μt)(Ys − μs)]
= E(YtYs) − μt·μs;  t, s = 0, ±1, ±2, …
This is the covariance between the values at times t and s of the stochastic process Yt.
A RULE ON THE COVARIANCE
• If c1, c2, …, cm and d1, d2, …, dn are constants and t1, t2, …, tm and s1, s2, …, sn are time points, then

Cov( Σ_{i=1}^m c_i Y_{t_i} , Σ_{j=1}^n d_j Y_{s_j} ) = Σ_{i=1}^m Σ_{j=1}^n c_i d_j Cov(Y_{t_i}, Y_{s_j})

and, as a special case,

Var( Σ_{i=1}^m c_i Y_{t_i} ) = Σ_{i=1}^m c_i² Var(Y_{t_i}) + 2 Σ_{i=2}^m Σ_{j=1}^{i−1} c_i c_j Cov(Y_{t_i}, Y_{t_j})
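The variance rule can be checked numerically. The sketch below simulates draws from an assumed 3-dimensional normal distribution (the covariance matrix and weights are illustrative choices) and compares the direct variance of a linear combination with the expanded double-sum formula:

```python
import numpy as np

rng = np.random.default_rng(2)
cov = [[1.0, 0.5, 0.2],
       [0.5, 1.0, 0.5],
       [0.2, 0.5, 1.0]]
Y = rng.multivariate_normal(mean=[0.0, 0.0, 0.0], cov=cov, size=200_000)
c = np.array([1.0, -2.0, 0.5])

# Left side: Var(sum_i c_i Y_i) estimated directly from the draws.
lhs = np.var(Y @ c)

# Right side: sum_i c_i^2 Var(Y_i) + 2 sum_{i>j} c_i c_j Cov(Y_i, Y_j),
# using the sample covariance matrix S.
S = np.cov(Y, rowvar=False)
rhs = sum(c[i] ** 2 * S[i, i] for i in range(3))
rhs += 2 * sum(c[i] * c[j] * S[i, j] for i in range(3) for j in range(i))
print(lhs, rhs)  # both close to the exact value c' Sigma c = 2.45
```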
JOINT PDF OF A TIME SERIES
• Remember that
F_{X1}(x1): the marginal cdf
f_{X1}(x1): the marginal pdf
F_{X1,X2,…,Xn}(x1, x2, …, xn): the joint cdf
f_{X1,X2,…,Xn}(x1, x2, …, xn): the joint pdf
JOINT PDF OF A TIME SERIES
• For the observed time series, say we have two time points, t and s.
• The marginal pdfs: f_{Yt}(yt) and f_{Ys}(ys)
• The joint pdf: f_{Yt,Ys}(yt, ys) ≠ f_{Yt}(yt) · f_{Ys}(ys), in general, because the observations are dependent.
JOINT PDF OF A TIME SERIES
• Since we have only one observation for each r.v. Yt, inference is too complicated if the distributions (or moments) change for all t (i.e., change over time). So, we need a simplification.
JOINT PDF OF A TIME SERIES
• To be able to identify the structure of the
series, we need the joint pdf of Y1, Y2,…, Yn.
However, we have only one sample. That is,
one observation from each random variable.
Therefore, it is very difficult to identify the
joint distribution. Hence, we need an
assumption to simplify our problem. This
simplifying assumption is known as
STATIONARITY.
STATIONARITY
• The most vital and common assumption in
time series analysis.
• The basic idea of stationarity is that the
probability laws governing the process do not
change with time.
• The process is in statistical equilibrium.
TYPES OF STATIONARITY
• STRICT (STRONG OR COMPLETE) STATIONARY PROCESS: Consider a finite set of r.v.s Y_{t1}, Y_{t2}, …, Y_{tn} from a stochastic process {Y(ω, t); t = 0, ±1, ±2, …}.
• The n-dimensional distribution function is defined by

F_{Y_{t1},…,Y_{tn}}(y1, …, yn) = P{ω : Y_{t1} ≤ y1, …, Y_{tn} ≤ yn}

where yi, i = 1, 2, …, n are any real numbers.
STRONG STATIONARITY
• A process is said to be first-order stationary in distribution if its one-dimensional distribution function is time-invariant, i.e.,

F_{Y_{t1}}(y1) = F_{Y_{t1+k}}(y1) for any t1 and k.
STRONG STATIONARITY
• n-th order stationarity in distribution for every n = strong stationarity
STRONG STATIONARITY
• So, for a strong stationary process:
i) f_{Y_{t1},…,Y_{tn}}(y1, …, yn) = f_{Y_{t1+k},…,Y_{tn+k}}(y1, …, yn), ∀t, k
ii) E(Yt) = E(Yt+k) = μ, ∀t, k: the mean of the series is constant over time.
iii) Var(Yt) = Var(Yt+k) = σ², ∀t, k: the variance of the series is constant over time (homoscedasticity).
WEAK STATIONARITY
• A process {Yt} is weakly stationary if, for all t and k:
E(Yt) = μ, ∀t
Var(Yt) = σ² < ∞, ∀t
Cov(Yt, Yt+k) = γk, ∀t
Corr(Yt, Yt+k) = ρk, ∀t
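These population quantities have natural sample analogues. One standard estimator of the autocovariance at lag k (dividing by n; conventions vary) is sketched below and checked on i.i.d. noise, for which γ0 = σ² and γk = 0 for k > 0:

```python
import numpy as np

def sample_autocov(y, k):
    """Sample autocovariance at lag k: (1/n) * sum (y_t - ybar)(y_{t+k} - ybar)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    ybar = y.mean()
    return np.sum((y[:n - k] - ybar) * (y[k:] - ybar)) / n

rng = np.random.default_rng(3)
e = rng.normal(0, 1, size=100_000)  # i.i.d. noise: gamma_0 = 1, gamma_k = 0 for k > 0
print(sample_autocov(e, 0), sample_autocov(e, 1))
```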
EXAMPLE
• Consider a time series {Yt} where Yt = et and et ~ i.i.d. N(0, σ²). Is the process stationary?
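Here the mean and variance of Yt do not involve t, so the process is stationary. A quick simulation (with σ = 2 chosen for illustration) estimates the moments at each time point across many independent realizations:

```python
import numpy as np

rng = np.random.default_rng(4)
sigma = 2.0
# 2000 independent realizations of the process, each 50 time points long.
Y = rng.normal(0, sigma, size=(50, 2000))

means = Y.mean(axis=1)  # estimate of E(Y_t) at each t: all near 0
varis = Y.var(axis=1)   # estimate of Var(Y_t) at each t: all near sigma^2 = 4
print(means.min(), means.max(), varis.mean())
```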
EXAMPLE
• MOVING AVERAGE: Suppose that {Yt} is constructed as
Yt = (et + et−1) / 2
where et ~ i.i.d. (0, σ²). Is the process {Yt} stationary?
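For this moving average one can compute γ0 = σ²/2, γ1 = σ²/4, and γk = 0 for k ≥ 2, none of which depend on t, so the process is stationary. A simulation check (with σ = 1 assumed):

```python
import numpy as np

rng = np.random.default_rng(5)
e = rng.normal(0, 1.0, size=200_001)  # sigma = 1
Y = (e[1:] + e[:-1]) / 2              # Y_t = (e_t + e_{t-1}) / 2

n = len(Y)
Yc = Y - Y.mean()
g0 = np.mean(Yc * Yc)              # expect sigma^2 / 2 = 0.50
g1 = np.sum(Yc[:-1] * Yc[1:]) / n  # expect sigma^2 / 4 = 0.25
g2 = np.sum(Yc[:-2] * Yc[2:]) / n  # expect 0
print(g0, g1, g2)
```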
EXAMPLE
• RANDOM WALK:
Yt = e1 + e2 + … + et
where et ~ i.i.d. (0, σ²). Is the process {Yt} stationary?
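Here Var(Yt) = t·σ² grows with t, so the random walk is not stationary. A simulation over many realizations (σ = 1 assumed) makes this visible:

```python
import numpy as np

rng = np.random.default_rng(6)
e = rng.normal(0, 1.0, size=(5000, 400))  # 5000 realizations, sigma = 1
Y = e.cumsum(axis=1)                      # Y_t = e_1 + e_2 + ... + e_t

v100 = Y[:, 99].var()   # Var(Y_100): expect about 100
v400 = Y[:, 399].var()  # Var(Y_400): expect about 400
print(v100, v400)
```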
EXAMPLE
• Suppose that the time series has the form
Yt = a + bt + et
where a and b are constants and {et} is a weakly stationary process with mean 0 and autocovariance function γk. Is {Yt} stationary?
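The answer is no: E(Yt) = a + bt depends on t. A simulation with illustrative values a = 3, b = 0.5 and i.i.d. errors (one particular weakly stationary choice for {et}) shows the time-varying mean:

```python
import numpy as np

rng = np.random.default_rng(7)
a, b = 3.0, 0.5
t = np.arange(1, 101)
e = rng.normal(0, 1.0, size=(5000, 100))  # i.i.d. errors with mean 0
Y = a + b * t + e                         # Y_t = a + b*t + e_t

m10 = Y[:, 9].mean()   # estimate of E(Y_10) = 3 + 0.5*10 = 8
m90 = Y[:, 89].mean()  # estimate of E(Y_90) = 3 + 0.5*90 = 48
print(m10, m90)        # the mean clearly depends on t: not stationary
```

Subtracting the deterministic trend a + bt recovers the weakly stationary series {et}.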
EXAMPLE
Yt = (−1)^t · et
STRONG VERSUS WEAK STATIONARITY
• Strict stationarity means that the joint distribution depends only on the time differences (the lags), not on the time points (t1, …, tk) themselves.
• Finite variance is not assumed in the definition of strong stationarity; therefore, strict stationarity does not necessarily imply weak stationarity. For example, an i.i.d. Cauchy process is strictly stationary but not weakly stationary.
• A nonlinear function of a strictly stationary process is still strictly stationary, but this is not true for weakly stationary processes. For example, the square of a covariance stationary process may not have finite variance.
• Weak stationarity usually does not imply strict stationarity, as higher moments of the process may depend on time t.
STRONG VERSUS WEAK STATIONARITY
• If the process {Xt} is a Gaussian time series, which means that the finite-dimensional distributions of {Xt} are all multivariate normal, then weak stationarity also implies strict stationarity. This is because a multivariate normal distribution is fully characterized by its first two moments.
STRONG VERSUS WEAK STATIONARITY
• For example, white noise is weakly stationary but may not be strictly stationary, whereas Gaussian white noise is strictly stationary. Also, general white noise only implies uncorrelatedness, while Gaussian white noise also implies independence, because for a Gaussian process uncorrelatedness implies independence. Therefore, Gaussian white noise is just i.i.d. N(0, σ²).
STATIONARITY AND NONSTATIONARITY
• Stationary and nonstationary processes are very different in their properties, and they require different inference procedures. We will discuss this in detail throughout this course.