Chapter 12 Time Series Analysis

12.1 Stochastic processes

A stochastic process is a family of random variables {X_t, t ∈ T}.

Example 1 {S_t, t = 0, 1, 2, ...}, where S_t = Σ_{i=0}^t X_i and X_i ∼ iid(0, σ²). S_t has a different distribution at each point t.

12.2 Stationarity and strict stationarity

If {X_t, t ∈ T} is a stochastic process such that Var(X_t) < ∞ for each t ∈ T, the autocovariance function γ_x(·, ·) of {X_t} is defined by

γ_x(r, s) = Cov(X_r, X_s) = E(X_r − EX_r)(X_s − EX_s).

Because Var(X_t) < ∞ for each t ∈ T,

γ_x(r, s) ≤ [E(X_r − EX_r)²]^{1/2} [E(X_s − EX_s)²]^{1/2} < ∞

by the Cauchy–Schwarz inequality. The autocorrelation function ρ_x(r, s) is defined by

ρ_x(r, s) = γ_x(r, s) / [γ_x(r, r) γ_x(s, s)]^{1/2}.

Example 2 Let X_t = e_t + θe_{t−1}, e_t ∼ iid(0, σ²). Then

γ_x(t + h, t) = Cov(X_{t+h}, X_t) = (1 + θ²)σ² for h = 0; θσ² for h = ±1; 0 for |h| > 1,

ρ_x(t + h, t) = 1 for h = 0; θ/(1 + θ²) for h = ±1; 0 for |h| > 1.

The time series {X_t, t ∈ Z} with index set Z = {0, ±1, ±2, ...} is said to be (weakly) stationary if

1. E|X_t|² < ∞ for all t ∈ Z,
2. EX_t = m for all t ∈ Z,
3. γ_x(r, s) = γ_x(r + t, s + t) for all r, s, t ∈ Z.
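The formulas of Example 2 can be checked by simulation. The following is a minimal sketch (assuming numpy is available; the values θ = 0.5, σ = 1 and the helper `sample_gamma` are illustrative choices, not part of the text): it simulates the MA(1) process and compares sample autocovariances with the theoretical γ_x(h).

```python
import numpy as np

# Simulate X_t = e_t + theta * e_{t-1} (Example 2) and compare sample
# autocovariances with the theoretical values
#   gamma(0) = (1 + theta^2) sigma^2,  gamma(1) = theta sigma^2,
#   gamma(h) = 0 for |h| > 1.
rng = np.random.default_rng(0)
theta, sigma, T = 0.5, 1.0, 200_000   # hypothetical parameter choices

e = rng.normal(0.0, sigma, T + 1)
x = e[1:] + theta * e[:-1]            # X_t = e_t + theta * e_{t-1}

def sample_gamma(x, h):
    """Sample autocovariance at lag h (the process has mean zero)."""
    return np.mean(x[h:] * x[: len(x) - h])

print(sample_gamma(x, 0))   # ~ (1 + theta^2) * sigma^2 = 1.25
print(sample_gamma(x, 1))   # ~ theta * sigma^2 = 0.5
print(sample_gamma(x, 5))   # ~ 0, since |h| > 1
```

With T = 200 000 draws the sample autocovariances land close to the theoretical values, and lags beyond 1 are near zero, as the display above predicts.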
Remark 1 If {X_t, t ∈ Z} is stationary, then γ_x(r, s) = γ_x(r − s, s − s) = γ_x(r − s, 0). Hence we may define the autocovariance function of a stationary process as a function of just one variable, the difference of the two time indices. That is, instead of γ_x(r, s), we may write γ_x(r − s) = γ_x(h). To be more precise, γ_x(h) = Cov(X_{t+h}, X_t). In the same way, ρ_x(h) = γ_x(h)/γ_x(0).

Example 3 X_t = e_t + θe_{t−1}, e_t ∼ iid(0, σ²). X_t is stationary.

Example 4 X_t = X_{t−1} + e_t, e_t ∼ iid(0, σ²). Then X_t = Σ_{i=1}^t e_i + X_0. X_t is not stationary, since Var(X_t) = tσ² (assuming X_0 = 0).

Example 5 X_t ∼ N(0, σ_t²). X_t is not stationary, since its variance changes with t.

The time series {X_t, t ∈ Z} is said to be strictly stationary if the joint distributions of (X_{t_1}, ..., X_{t_k})′ and (X_{t_1+h}, ..., X_{t_k+h})′ are the same for all positive integers k and for all t_1, ..., t_k, h ∈ Z.

12.3 Autoregressive processes

y_t = α_1 y_{t−1} + ... + α_p y_{t−p} + e_t : AR(p) process,

where Ee_t = 0 and Ee_t e_s = σ² for t = s, 0 for t ≠ s.
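The claim Var(X_t) = tσ² in Example 4 is easy to verify by Monte Carlo. A minimal sketch (assuming numpy; the replication count and the lags checked are illustrative):

```python
import numpy as np

# Example 4: the random walk X_t = X_{t-1} + e_t with X_0 = 0 is not
# stationary -- its variance grows linearly, Var(X_t) = t * sigma^2.
rng = np.random.default_rng(1)
sigma, T, reps = 1.0, 50, 100_000     # hypothetical simulation sizes

e = rng.normal(0.0, sigma, (reps, T))
x = np.cumsum(e, axis=1)              # row r holds one path X_1, ..., X_T

for t in (10, 25, 50):
    print(t, x[:, t - 1].var())       # ~ t * sigma^2
```

Across 100 000 replicated paths, the sample variance at time t tracks tσ², confirming that no single variance describes the process at all dates.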
Or, using the lag operator, we may write

(1 − α_1 L − ... − α_p L^p) y_t = e_t (where L^p y_t = y_{t−p}).

Asymptotic theory of the AR(1) model

y_t = α y_{t−1} + e_t, |α| < 1,

α̂_OLS = Σ_{t=2}^T y_{t−1} y_t / Σ_{t=2}^T y²_{t−1}.

We have

1. α̂_OLS →p α,
2. √T (α̂ − α) →d N(0, 1 − α²).

Proof.

1. α̂ − α = Σ y_{t−1} e_t / Σ y²_{t−1}. We may express y_{t−1} = Σ_{i=0}^∞ α^i e_{t−1−i}. Then, using Chebyshev's inequality, we may obtain

(1/T) Σ y_{t−1} e_t →p 0 and (1/T) Σ y²_{t−1} →p σ²/(1 − α²).

2. By the central limit theorem,

(1/√T) Σ y_{t−1} e_t →d N(0, σ² plim (1/T) Σ y²_{t−1}) = N(0, σ⁴/(1 − α²)).

Hence

√T (α̂ − α) = [(1/√T) Σ y_{t−1} e_t] / [(1/T) Σ y²_{t−1}] →d N(0, 1 − α²).
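The two asymptotic claims for the stationary AR(1) can be illustrated numerically. A sketch (assuming numpy; α = 0.6 and the sample sizes are illustrative choices): it estimates α by OLS over many replications and checks that the mean of α̂ is near α and that T·Var(α̂) is near 1 − α².

```python
import numpy as np

# Monte Carlo check: for y_t = alpha*y_{t-1} + e_t with |alpha| < 1, the OLS
# estimator is consistent and sqrt(T)*(alpha_hat - alpha) has asymptotic
# variance 1 - alpha^2.
rng = np.random.default_rng(2)
alpha, T, reps = 0.6, 1000, 1000      # hypothetical simulation sizes

def ar1_ols(alpha, T, rng):
    """Simulate one AR(1) path and return the OLS slope estimate."""
    e = rng.normal(size=T)
    y = np.empty(T)
    y[0] = e[0]
    for t in range(1, T):
        y[t] = alpha * y[t - 1] + e[t]
    return np.sum(y[:-1] * y[1:]) / np.sum(y[:-1] ** 2)

est = np.array([ar1_ols(alpha, T, rng) for _ in range(reps)])
print(est.mean())      # ~ alpha = 0.6
print(T * est.var())   # ~ 1 - alpha^2 = 0.64
```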
When α = 1, we have

1. α̂ →p 1. But
2. √T (α̂ − 1) does not converge in distribution to N(0, 1 − α²). In fact, T (α̂ − 1) →d a non-normal random variable (see Fuller (1976), "Introduction to Statistical Time Series"). As a result, for a t-test of H_0 : α = 1, the statistic

t = (α̂ − 1) / [σ̂² (Σ y²_{t−1})^{−1}]^{1/2}

does not converge in distribution to N(0, 1).

Least squares estimation of AR(p) processes

y_t = α_1 y_{t−1} + ... + α_p y_{t−p} + e_t, t = p + 1, ..., T,

i.e.,

y_{p+1} = α_1 y_p + ... + α_p y_1 + e_{p+1}
...
y_T = α_1 y_{T−1} + ... + α_p y_{T−p} + e_T,

or y = Xα + e, where y = (y_{p+1}, ..., y_T)′, α = (α_1, ..., α_p)′, e = (e_{p+1}, ..., e_T)′, and X is the (T − p) × p matrix whose rows are (y_p, ..., y_1), ..., (y_{T−1}, ..., y_{T−p}). Then

α̂_OLS = (X′X)^{−1} X′y, σ̂² = (y − Xα̂)′(y − Xα̂) / (T − p).

We can show that

1. α̂ →p α,
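The stacked regression y = Xα + e translates directly into code. A minimal sketch (assuming numpy; the AR(2) coefficients α = (0.5, −0.3)′, which are stationary, and the sample size are hypothetical): it simulates an AR(p) path, builds X with rows (y_{t−1}, ..., y_{t−p}), and computes α̂_OLS and σ̂² exactly as in the displays above.

```python
import numpy as np

# Least squares for an AR(p) process written as y = X * alpha + e.
rng = np.random.default_rng(3)
alpha_true = np.array([0.5, -0.3])    # hypothetical stationary AR(2)
p, T = len(alpha_true), 5000

# Simulate y_t = alpha_1 y_{t-1} + ... + alpha_p y_{t-p} + e_t.
e = rng.normal(size=T)
y = np.zeros(T)
for t in range(p, T):
    y[t] = alpha_true @ y[t - p : t][::-1] + e[t]

# Row for date t of X is (y_{t-1}, ..., y_{t-p}); column j holds y_{t-1-j}.
X = np.column_stack([y[p - 1 - j : T - 1 - j] for j in range(p)])
yy = y[p:]

alpha_hat = np.linalg.solve(X.T @ X, X.T @ yy)   # (X'X)^{-1} X'y
resid = yy - X @ alpha_hat
sigma2_hat = resid @ resid / (T - p)             # (y - X a)'(y - X a)/(T - p)

print(alpha_hat)     # ~ [0.5, -0.3]
print(sigma2_hat)    # ~ 1.0, the innovation variance
```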
2. √T (α̂ − α) →d N(0, Σ), where Σ = σ² Γ^{−1}, Γ is the p × p matrix with (i, j) entry γ_{|i−j|}, and γ_h = E y_{t+h} y_t, if the process y_t is stationary.

Another way to state that y_t is stationary is that all roots of the characteristic equation

1 − α_1 Z − α_2 Z² − ... − α_p Z^p = 0

lie outside the unit circle.

Example 6 Consider the AR(2) process y_t − y_{t−1} + 0.16 y_{t−2} = e_t. The characteristic equation for this is 1 − Z + 0.16 Z² = (1 − 0.8Z)(1 − 0.2Z) = 0, which gives Z = 1/0.8, 1/0.2. Hence, y_t is stationary. We may also express (1 − 0.8L)(1 − 0.2L) y_t = e_t, which gives (1 − 0.8L) y_t = Σ_{i=0}^∞ (0.2)^i e_{t−i} = u_t and y_t = Σ_{i=0}^∞ (0.8)^i u_{t−i}. (The impact of events that happened long ago is negligible.)
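The root condition of Example 6 can be checked mechanically. A minimal sketch (assuming numpy, whose `np.roots` expects coefficients ordered from the highest power down):

```python
import numpy as np

# Example 6: check stationarity of y_t - y_{t-1} + 0.16 y_{t-2} = e_t by
# solving the characteristic equation 1 - Z + 0.16 Z^2 = 0.
roots = np.roots([0.16, -1.0, 1.0])       # coefficients of 0.16 Z^2 - Z + 1

print(roots)                              # 1/0.2 = 5 and 1/0.8 = 1.25
print(all(abs(z) > 1 for z in roots))     # True: all roots outside unit circle
```

Both roots exceed one in modulus, matching the factorization (1 − 0.8Z)(1 − 0.2Z) in the example, so the process is stationary.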