  1. Lecture 18. Time Series. Nan Ye, School of Mathematics and Physics, University of Queensland

  2. [Figure] Johnson & Johnson quarterly earnings per share (1960-I to 1984-IV).

  3. [Figure] Yearly average global temperature deviations (1880-2009).

  4. [Figure] Speech recording of the syllable aaa...hhh.

  5. [Figure] Returns from the NYSE from 2 Feb 1984 to 31 Dec 1991.

  6. This Lecture
     • Nature of time series data
     • Time series modelling
     • Time series decomposition
     • Stationarity
     • Time domain models

  7. Nature of Time Series Data
     Characteristics
     • A time series is often viewed as a collection of random variables {X_t} indexed by time.
     • In a time series, measurements at adjacent time points are usually correlated.
     • Compared to other types of correlated data, such as clustered or longitudinal data, observations in a time series may explicitly depend on previous observations and/or errors.

  8. Probabilistic description
     • We can describe a time series using the distribution of the random variables {X_t}.
     • Frequently, we look at some summary statistics:
         Mean function: μ_X(t) = E(X_t)
         Autocovariance function: γ_X(s, t) = cov(X_s, X_t)
         Autocorrelation function (ACF): ρ_X(s, t) = γ_X(s, t) / √(γ_X(s, s) γ_X(t, t))
     • We often drop the subscript X from μ_X, γ_X and ρ_X when there is no ambiguity.
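
     As a minimal illustration of these summary statistics (not from the lecture), the sample versions can be computed with numpy; the estimator below divides by n, the conventional choice, and a single sample mean is used for the whole series, which implicitly assumes stationarity (discussed later).

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelation rho(h) for lags h = 0, ..., max_lag."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()                                  # estimate of mu
    # Sample autocovariance gamma(h), with the conventional 1/n factor.
    gamma = [np.sum((x[:n - h] - xbar) * (x[h:] - xbar)) / n
             for h in range(max_lag + 1)]
    return np.array(gamma) / gamma[0]                # rho(h) = gamma(h) / gamma(0)
```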

  9. Time Series Modelling
     Chasing after stationarity
     • The objective of time series modelling is to develop a compact representation of the time series, to facilitate tasks including interpretation, prediction, control, hypothesis testing and simulation.
     • Some form of time invariance is required to find regularity in data and extrapolate into the future.
     • Stationarity is a basic form of time invariance, and much of time series modelling is about transforming a time series so that the transformed series is stationary.

  10. Exploratory data analysis
     • Plotting the time series should be the first step before any formal modelling attempt.
     • This is useful for identifying important features and choosing an appropriate model.
     • For example, use the plots to look for trends, the presence of seasonal components, or outliers.

  11. Modelling paradigms
     • There are two main modelling paradigms.
     • The time domain approach views a time series as the description of the evolution of an entity, and focuses on capturing the dependence of the current value on history.
     • The frequency domain approach views a time series as the superposition (addition) of periodic variations.

  12. Time Series Decomposition
     An additive decomposition
     • A classical decomposition of a time series {X_t} is
         X_t = T_t + S_t + E_t,
       where T_t is the trend component, S_t is the seasonal component (recurring variation with a fixed period), and E_t is the error component.
     • The trend component and seasonal component can be deterministic or stochastic.
     • Sometimes, a cyclical component (recurring variation with a non-fixed period) is included in the systematic component.

  13. A multiplicative decomposition
     • A common multiplicative decomposition is X_t = T_t S_t E_t.
     • This is equivalent to first converting X_t to the log scale and then performing an additive decomposition:
         ln X_t = ln T_t + ln S_t + ln E_t.
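
     A minimal sketch of both decompositions using statsmodels' classical seasonal_decompose (a common implementation, not necessarily the one used in this course); the synthetic quarterly series here is purely illustrative.

```python
import numpy as np
from statsmodels.tsa.seasonal import seasonal_decompose

# Illustrative quarterly series: linear trend + fixed seasonal pattern + noise.
rng = np.random.default_rng(0)
n = 80                                            # 20 years of quarterly data
x = (0.1 * np.arange(n)                           # trend T_t
     + np.tile([1.0, -0.5, 0.3, -0.8], n // 4)    # seasonal S_t, period 4
     + rng.normal(scale=0.2, size=n))             # error E_t

# Additive decomposition X_t = T_t + S_t + E_t.
res = seasonal_decompose(x, model="additive", period=4)
# res.trend, res.seasonal, res.resid hold the estimated components
# (the trend is a centred moving average, so a few values at each end are NaN).

# For a positive-valued series, model="multiplicative" fits X_t = T_t S_t E_t,
# which matches an additive decomposition of ln X_t.
```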

  14. Stationarity
     Strict stationarity
     • A time series {X_t} is strictly stationary if its probabilistic behaviour is invariant to time shifts.
     • To be precise, for any k, any time points t_1, ..., t_k, any x_1, ..., x_k, and any time shift h, we have
         P(X_{t_1} ≤ x_1, ..., X_{t_k} ≤ x_k) = P(X_{t_1 + h} ≤ x_1, ..., X_{t_k + h} ≤ x_k).

  15. • Strict stationarity implies that the mean function μ(t) = E(X_t) and the autocovariance function γ(t, t + h) = cov(X_t, X_{t+h}) do not depend on t.
      • Strict stationarity is often too much to ask for, and usually not necessary for learning a model.

  16. Stationarity
     • A time series {X_t} is said to be (weakly) stationary if μ(t) and γ(t, t + h) do not depend on t.
     • The autocovariance and autocorrelation functions of a stationary time series can be more compactly written as
         γ(h) = γ(t, t + h),
         ρ(h) = ρ(t, t + h) = γ(h) / γ(0).

  17. • Randomness in a stationary time series is sufficiently constrained for finding regularity in the data.
      • A stationary time series has a trivial systematic component (a constant mean).
      • Stationary time series are used as an important building block for the random component of more sophisticated models.

  18. Time Domain Models
     White noise processes
     • A white noise process {w_t} is a collection of uncorrelated random variables with mean 0 and finite variance σ².
     • This is often denoted as w_t ∼ WN(0, σ²).
     • The mean, autocovariance and autocorrelation functions are
         μ(t) = 0,
         γ(t, t + h) = cov(w_t, w_{t+h}) = σ² if h = 0, and 0 if h ≠ 0,
         ρ(t, t + h) = 1 if h = 0, and 0 if h ≠ 0.
     • White noise processes are thus stationary, and they serve as an important building block for more sophisticated time series models.

  19. [Figure] An example white noise series, WN(0, 1).
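
     A minimal sketch (assuming numpy) of how such a series can be simulated: i.i.d. N(0, 1) draws are, in particular, uncorrelated with mean 0, so they form a WN(0, 1) process, and the sample autocovariances behave as the formulas above predict.

```python
import numpy as np

rng = np.random.default_rng(42)
w = rng.normal(size=200)             # 200 draws of w_t ~ WN(0, 1)

# Sample autocovariance at a few lags: ~1 at h = 0, near 0 otherwise.
for h in (0, 1, 5):
    gamma_h = np.sum((w[:200 - h] - w.mean()) * (w[h:] - w.mean())) / 200
    print(h, round(gamma_h, 3))
```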

  20. Random Walk
     • Consider the random walk X_t = Σ_{i=1}^{t} w_i, where w_t ∼ WN(0, σ²).
     • For h ≥ 0, the mean, autocovariance, and autocorrelation functions are
         μ(t) = 0,
         γ(t, t + h) = t σ²,
         ρ(t, t + h) = t / √(t(t + h)) = √(t / (t + h)).
     • Since these depend on t, {X_t} is not stationary.

  21. [Figure] Three random walk series from the same model.
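
     A minimal sketch (assuming numpy) of how such paths can be generated: a random walk is just the cumulative sum of white noise, and var(X_t) = t σ² explains why independently generated paths spread apart over time.

```python
import numpy as np

rng = np.random.default_rng(0)
steps = rng.normal(size=(3, 200))    # three independent WN(0, 1) series
walks = np.cumsum(steps, axis=1)     # X_t = w_1 + ... + w_t along each row
print(walks[:, -1])                  # endpoints can differ widely across paths
```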

  22. Moving average process
     • {X_t} is a moving average process of order 1, or MA(1), if
         X_t = w_t + θ w_{t-1},
       where w_t ∼ WN(0, σ²).
     • The mean, autocovariance, and autocorrelation functions are
         μ(t) = 0,
         γ(t, t + h) = σ²(1 + θ²) if h = 0; σ²θ if h = ±1; 0 otherwise,
         ρ(t, t + h) = 1 if h = 0; θ/(1 + θ²) if h = ±1; 0 otherwise.
     • These do not depend on t, so MA(1) is stationary.

  23. [Figure] Two MA(1) series from the same model (θ = 0.9).
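
     A minimal sketch (assuming numpy) of simulating an MA(1) series with θ = 0.9, the value used in the figure; the lag-1 sample autocorrelation should be close to θ/(1 + θ²) ≈ 0.497.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n = 0.9, 200
w = rng.normal(size=n + 1)           # one extra draw so w_{t-1} exists at t = 1
x = w[1:] + theta * w[:-1]           # X_t = w_t + theta * w_{t-1}

# Lag-1 sample autocorrelation, expected near theta / (1 + theta^2).
r1 = np.sum((x[:-1] - x.mean()) * (x[1:] - x.mean())) / np.sum((x - x.mean()) ** 2)
print(r1)
```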

  24. Autoregressive process
     • {X_t} is an autoregressive process of order 1, or AR(1), if
         X_t = φ X_{t-1} + w_t,
       where w_t ∼ WN(0, σ²).
     • When the AR(1) process is stationary, the mean, autocovariance and autocorrelation functions are
         μ(t) = 0,
         γ(t, t + h) = φ^|h| σ² / (1 − φ²),
         ρ(t, t + h) = φ^|h|.

  25. [Figure] Two AR(1) series from the same model (φ = 0.9).
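
     A minimal sketch (assuming numpy) of simulating an AR(1) series with φ = 0.9, as in the figure; a burn-in period lets the effect of the arbitrary start X_0 = 0 die out, so the kept sample is approximately from the stationary distribution.

```python
import numpy as np

rng = np.random.default_rng(2)
phi, n, burn = 0.9, 200, 100
w = rng.normal(size=n + burn)
x = np.zeros(n + burn)
for t in range(1, n + burn):
    x[t] = phi * x[t - 1] + w[t]     # X_t = phi * X_{t-1} + w_t
x = x[burn:]                         # discard the burn-in
# The sample ACF of x should decay roughly like phi**h.
```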

  26. Linear processes
     • A linear process {X_t} is a linear combination of white noise variates w_t, that is,
         X_t = μ + Σ_{i=−∞}^{+∞} ψ_i w_{t−i},
       where w_t ∼ WN(0, σ²).
     • The mean and autocovariance functions are
         μ(t) = μ,
         γ(t, t + h) = σ² Σ_{i=−∞}^{∞} ψ_i ψ_{i+h}.

  27. • White noise is a linear process with μ = 0 and ψ_0 = 1, ψ_i = 0 for i ≠ 0.
      • MA(1) is a linear process with μ = 0, ψ_0 = 1, ψ_1 = θ, and ψ_i = 0 otherwise.
      • AR(1) is a linear process with μ = 0, ψ_i = φ^i for i ≥ 0, and ψ_i = 0 otherwise.
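
     A quick numerical check (not from the slides) of the formula γ(t, t + h) = σ² Σ_i ψ_i ψ_{i+h}, using the truncated AR(1) weights ψ_i = φ^i; the sums should match the closed form φ^|h| σ² / (1 − φ²) from the AR(1) slide.

```python
import numpy as np

phi, sigma2, m = 0.9, 1.0, 200
psi = phi ** np.arange(m)            # truncated linear-process weights psi_i
for h in (0, 1, 2):
    gamma_h = sigma2 * np.sum(psi[:m - h] * psi[h:])
    closed = phi ** h * sigma2 / (1 - phi ** 2)
    print(h, gamma_h, closed)        # agree up to truncation error
```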

  28. ARMA
     • {X_t} is ARMA(p, q) if it is stationary and
         X_t = φ_1 X_{t−1} + ... + φ_p X_{t−p} + w_t + θ_1 w_{t−1} + ... + θ_q w_{t−q},
       where φ_p ≠ 0, θ_q ≠ 0 and w_t ∼ WN(0, σ²).
     • p and q are called the autoregressive and the moving average orders, respectively.
     • AR(1) = ARMA(1, 0), and MA(1) = ARMA(0, 1).
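
     A minimal sketch of simulating an ARMA(p, q) series, assuming statsmodels is available; note that ArmaProcess expects lag-polynomial coefficients, so the AR side is [1, −φ_1, ..., −φ_p] and the MA side is [1, θ_1, ..., θ_q]. The orders and parameter values here are illustrative.

```python
from statsmodels.tsa.arima_process import ArmaProcess

phi, theta = 0.5, 0.4
proc = ArmaProcess(ar=[1, -phi], ma=[1, theta])   # ARMA(1, 1)
print(proc.isstationary)                          # True when |phi| < 1
x = proc.generate_sample(nsample=200)             # one simulated path
```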
