time-series-cheatsheet-en
5. Repeat from step 2. The algorithm ends when the estimated parameters vary very little between iterations.
If not solved, look for high dependence in the series.

[Durbin-Watson decision regions on (0, 4): Rej. H0, AR + (0 to dL); inconclusive (dL to dU); Accept H0, no AR (dU to 4−dU); inconclusive (4−dU to 4−dL); Rej. H0, AR − (4−dL to 4)]

[PACF correlogram plot omitted]
Exponential smoothing

ft = αyt + (1 − α)ft−1

where 0 < α < 1 is the smoothing parameter.

– Durbin's h (endogenous regressors):

h = ρ̂ · √(T / (1 − T·υ))

where υ is the estimated variance of the coefficient associated with the endogenous variable.
* H1 : Autocorrelation of order one, AR(1).
Conclusions differ between autocorrelation processes.
– Breusch-Godfrey test (endogenous regressors): it can detect MA(q) and AR(p) processes (εt is white noise):
* MA(q): ut = εt − m1 ut−1 − · · · − mq ut−q
* AR(p): ut = ρ1 ut−1 + · · · + ρp ut−p + εt

Predictions

Two types of predictions:
Of the mean value of y for a specific value of x.
Of an individual value of y for a specific value of x.
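The smoothing recursion ft = αyt + (1 − α)ft−1 above can be sketched in a few lines of Python. The function name and the choice of seeding the recursion with the first observation are my own:

```python
# Simple exponential smoothing: f_t = alpha*y_t + (1 - alpha)*f_{t-1}.
# Seeding with f_1 = y_1 is one common convention (an assumption here).
def exp_smooth(y, alpha):
    assert 0 < alpha < 1, "alpha must be a smoothing parameter in (0, 1)"
    f = [y[0]]                                  # seed the recursion
    for t in range(1, len(y)):
        f.append(alpha * y[t] + (1 - alpha) * f[-1])
    return f

series = [10.0, 12.0, 11.0, 13.0, 12.5]
print(exp_smooth(series, alpha=0.5))
```

Larger α weights recent observations more heavily; smaller α produces a smoother forecast series.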
TS-25.01-EN - github.com/marcelomijas/econometrics-cheatsheet - CC-BY-4.0 license
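The AR(p) stability condition discussed under Weak dependence, that every root z of 1 − ρ1 z − · · · − ρp z^p = 0 lies outside the unit circle, can be checked numerically. A minimal numpy sketch (the function name is my own):

```python
import numpy as np

# Stability check for an AR(p) process with coefficients rho = [rho_1, ..., rho_p]:
# all roots z of 1 - rho_1 z - ... - rho_p z^p = 0 must satisfy |z| > 1.
def ar_is_stable(rho):
    # np.roots expects coefficients from the highest power down:
    # [-rho_p, ..., -rho_1, 1]
    coeffs = [-r for r in reversed(rho)] + [1]
    return bool(np.all(np.abs(np.roots(coeffs)) > 1))

print(ar_is_stable([0.5]))   # AR(1) with |rho_1| < 1: stable
print(ar_is_stable([1.0]))   # random walk, root on the unit circle: not stable
```

For AR(1) this reproduces the |ρ1| < 1 condition, since the single root is z = 1/ρ1.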
Stationarity

Stationarity allows one to correctly identify relations – ones that stay unchanged with time – between variables.
Stationary process (strict stationarity) - if any collection of random variables is taken and shifted h periods (time changes), the joint probability distribution stays unchanged.
Non-stationary process - for example, a series with trend, where at least the mean changes with time.
Covariance stationary process - a weaker form of stationarity:
– E(xt) is constant. – Var(xt) is constant.
– For any t, h ≥ 1, Cov(xt, xt+h) depends only on h, not on t.

Weak dependence

Weak dependence replaces the random sampling assumption for time series.
A stationary process {xt} is weakly dependent when xt and xt+h are almost independent as h increases without limit.
A covariance stationary process is weakly dependent if the correlation between xt and xt+h tends to 0 fast enough when h → ∞ (they are not asymptotically correlated).
Weakly dependent processes are known as integrated of order zero, I(0). Some examples:
Moving average - {xt} is a moving average of order q, MA(q):
xt = et + m1 et−1 + · · · + mq et−q
where {et : t = 0, 1, . . . , T} is an i.i.d. sequence with zero mean and σe² variance.
Autoregressive process - {xt} is an autoregressive process of order p, AR(p):
xt = ρ1 xt−1 + · · · + ρp xt−p + et
where {et : t = 1, 2, . . . , T} is an i.i.d. sequence with zero mean and σe² variance.
Stability condition: if every root z of 1 − ρ1 z − · · · − ρp z^p = 0 satisfies |z| > 1, then {xt} is a stable AR(p) process that is weakly dependent. For AR(1), the condition is |ρ1| < 1.
ARMA process - a combination of AR(p) and MA(q); {xt} is an ARMA(p, q):
xt = et + m1 et−1 + · · · + mq et−q + ρ1 xt−1 + · · · + ρp xt−p

Unit roots

A process is I(d), that is, integrated of order d, if applying differences d times makes the process stationary.
When d ≥ 1, the process is called a unit root process, or it is said to have a unit root. A process has a unit root when the stability condition is not met (there are roots on the unit circle).

Strong dependence

Most of the time, economic series are strongly dependent (or highly persistent). Some examples of unit root I(1) processes:
Random walk - an AR(1) process with ρ1 = 1:
yt = yt−1 + et
where {et : t = 1, 2, . . . , T} is an i.i.d. sequence with zero mean and σe² variance.
Random walk with a drift - an AR(1) process with ρ1 = 1 and a constant:
yt = β0 + yt−1 + et
where {et : t = 1, 2, . . . , T} is an i.i.d. sequence with zero mean and σe² variance.

Unit root tests

Test             H0           Reject H0
ADF              I(1)         tau < critical value
KPSS             I(0) level   mu > critical value
                 I(0) trend   tau > critical value
Phillips-Perron  I(1)         Z-tau < critical value
Zivot-Andrews    I(1)         tau < critical value

From unit root to weak dependence

Integrated of order one, I(1), means that the first difference of the process is weakly dependent or I(0) (and usually, stationary). For example, let {yt} be a random walk:
∆yt = yt − yt−1 = et
where {et} = {∆yt} is i.i.d.

[Plot of yt and ∆yt omitted]

Note: The first difference of a series removes its trend. Logarithms of a series stabilize its variance.

From unit root to percentage change

When an I(1) series is strictly positive, it is usually converted to logarithms before taking the first difference, to obtain the (approx.) percentage change of the series:
∆log(yt) = log(yt) − log(yt−1) ≈ (yt − yt−1) / yt−1

Cointegration

Two series are cointegrated when both are I(1), but a linear combination of them is I(0). In that case, the regression of one series on the other is not spurious, but expresses something about the long-term relation. Variables are called cointegrated if they have a common stochastic trend.
For example, {xt} and {yt} are I(1), but yt − βxt = ut where {ut} is I(0) (β is the cointegrating parameter).

Cointegration test

Following the example above:
1. Estimate yt = α + βxt + εt and obtain ε̂t.
2. Perform an ADF test on ε̂t with a modified distribution.
The result of this test is equivalent to:
H0 : β = 0 (no cointegration)
H1 : β ̸= 0 (cointegration)
If test statistic > critical value, reject H0.

Heteroscedasticity on time series

The assumption affected is t4, which leads OLS to not be efficient.
Use tests like Breusch-Pagan or White's, where H0: No heteroscedasticity. It is important for these tests to work that there is no autocorrelation.

ARCH

An autoregressive conditional heteroscedasticity (ARCH) model analyzes a form of dynamic heteroscedasticity, where the error variance follows an AR(p) process.
Given the model yt = β0 + β1 zt + ut, where there is AR(1) and heteroscedasticity:
E(u²t | ut−1) = α0 + α1 u²t−1

GARCH

A generalized autoregressive conditional heteroscedasticity (GARCH) model is similar to ARCH, but in this case the error variance follows an ARMA(p, q) process.
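A quick numpy simulation illustrates two points above: differencing a random walk recovers an I(0) series, and the log first difference of a strictly positive series approximates its percentage change. The series construction is a made-up example:

```python
import numpy as np

rng = np.random.default_rng(0)
e = rng.normal(0.0, 1.0, size=500)   # i.i.d. shocks with zero mean
y = np.cumsum(e)                     # random walk y_t = y_{t-1} + e_t: an I(1) process

dy = np.diff(y)                      # first difference: Delta y_t = e_t, which is I(0)

# For a strictly positive I(1) series, the log first difference
# approximates the percentage change: Delta log(p_t) ~ (p_t - p_{t-1}) / p_{t-1}
p = np.exp(0.01 * y)                 # hypothetical strictly positive series
dlog = np.diff(np.log(p))
pct = np.diff(p) / p[:-1]
print(np.max(np.abs(dlog - pct)))    # the two measures nearly coincide
```

The approximation is good precisely because the period-to-period log changes are small here; for large changes the two measures diverge.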
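Step 1 of the cointegration test above can be sketched with plain numpy OLS; step 2 needs ADF critical values adjusted for estimated residuals, so it is only indicated in a comment. The data-generating process below is a constructed example:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two cointegrated I(1) series: x is a random walk and
# y = 2x + stationary noise, so y - 2x is I(0).
x = np.cumsum(rng.normal(size=300))
y = 2.0 * x + rng.normal(scale=0.5, size=300)

# Step 1: estimate y_t = alpha + beta*x_t + eps_t by OLS and keep residuals.
X = np.column_stack([np.ones_like(x), x])
alpha_hat, beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - alpha_hat - beta_hat * x

print(beta_hat)   # close to the true cointegrating parameter, 2
# Step 2 (not shown): run an ADF test on `resid`, comparing the statistic
# against Engle-Granger critical values rather than the standard ADF ones.
```

The residuals here estimate ut = yt − βxt, the I(0) linear combination whose stationarity the second step tests.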
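The ARCH specification above, where the conditional variance of ut is an AR(1) in u²t−1, can be simulated to see its implications. Parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
a0, a1 = 0.2, 0.5                        # hypothetical ARCH(1) parameters, a1 < 1
T = 5000

u = np.zeros(T)
for t in range(1, T):
    sigma2_t = a0 + a1 * u[t - 1] ** 2   # E(u_t^2 | u_{t-1}) = a0 + a1*u_{t-1}^2
    u[t] = np.sqrt(sigma2_t) * rng.normal()

# The unconditional variance of a stable ARCH(1) process is a0 / (1 - a1):
print(np.var(u), a0 / (1 - a1))
```

The simulated errors show volatility clustering (large shocks raise next period's variance) even though their unconditional variance is constant.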