MCMC analysis of classical time series algorithms


  1. MCMC analysis of classical time series algorithms. Isambi Sailon, mbalawata@yahoo.com, Lappeenranta University of Technology, Lappeenranta, 19.03.2009.

  2. Outline
     1 Introduction
     2 Theoretical background: Box-Jenkins models; Box-Jenkins model identification; Box-Jenkins model estimation
     3 Practical results: Series generation; Box-Jenkins recursive estimation; Regression estimation with new noise values; Matlab garchfit estimation
     4 Conclusion

  3. Introduction. The aim of this work is to combine the modern MCMC approach with classical time series algorithms in order to analyse the predictive distributions of the estimated parameters. There are three main phases to be analysed: phase one deals with time series analysis, phase two with MCMC, and phase three with the MCMC analysis of classical time series algorithms.

  4. Box-Jenkins models. Mathematical models typically used for accurate short-term forecasts of 'well-behaved' data (data showing predictable, repetitive cycles and patterns); they find the best fit of a time series to its own past values in order to make forecasts. A time series is a sequence of observations taken on a regular time basis, e.g. hourly, daily, monthly or annually. Classical time series analysis covers fitting autoregressive (AR) and moving average (MA) models.

  5. AR(r): x_t = C + φ_1 x_{t−1} + φ_2 x_{t−2} + ... + φ_r x_{t−r} + u_t, where C is a constant and φ_1, ..., φ_r are the autoregressive parameters.
     MA(m): x_t = C + ψ_1 u_{t−1} + ψ_2 u_{t−2} + ... + ψ_m u_{t−m} + u_t, where C is a constant and ψ_1, ..., ψ_m are the moving average parameters.
     ARMA(r, m): x_t = C + φ_1 x_{t−1} + ... + φ_r x_{t−r} + ψ_1 u_{t−1} + ... + ψ_m u_{t−m} + u_t, where u_t ∼ N(0, σ²) is white noise and x_t is the time series.
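As an illustration of these model forms, a minimal Python sketch that simulates an ARMA(r, m) series directly from the recursion above (the orders, parameter values and series length are arbitrary choices for the example, not values from the presentation):

    import numpy as np

    def simulate_arma(phi, psi, C=0.0, sigma=1.0, n=500, seed=0):
        """Simulate x_t = C + sum(phi_i * x_{t-i}) + sum(psi_j * u_{t-j}) + u_t."""
        rng = np.random.default_rng(seed)
        r, m = len(phi), len(psi)
        burn = max(r, m)
        u = rng.normal(0.0, sigma, size=n + burn)   # white noise u_t ~ N(0, sigma^2)
        x = np.zeros(n + burn)
        for t in range(burn, len(x)):
            ar_part = sum(phi[i] * x[t - 1 - i] for i in range(r))
            ma_part = sum(psi[j] * u[t - 1 - j] for j in range(m))
            x[t] = C + ar_part + ma_part + u[t]
        return x[burn:]   # drop the zero-initialised start-up values

    # Example: a stationary ARMA(2, 1) series
    series = simulate_arma(phi=[0.5, -0.3], psi=[0.4], n=500)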

  6. Autocorrelation (AC). A type of correlation that portrays the dependence between consecutive observations of a time series; autocorrelations are statistical measures that indicate how a time series is related to itself over time. For example, the autocorrelation at lag 1 is the correlation between the original series and the same series shifted by one period. r_k = E[(x_t − µ)(x_{t+k} − µ)] / σ²_x, where k is the specified lag number, x_t are the series observations, µ is the series mean value and σ²_x is the variance.
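A minimal sketch of the corresponding sample estimate of r_k, assuming numpy (the function name and demo series are illustrative, not from the presentation):

    import numpy as np

    def sample_autocorrelation(x, k):
        """Sample estimate of r_k = E[(x_t - mu)(x_{t+k} - mu)] / sigma_x^2."""
        x = np.asarray(x, dtype=float)
        mu = x.mean()
        var = np.mean((x - mu) ** 2)                   # sigma_x^2
        cov = np.mean((x[:-k] - mu) * (x[k:] - mu))    # lag-k autocovariance
        return cov / var

    # Lag-1 autocorrelation of an arbitrary demo series
    x = np.cumsum(np.random.default_rng(1).normal(size=200)) * 0.1
    print(sample_autocorrelation(x, 1))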

  7. The ACF first tests whether adjacent observations are autocorrelated, that is, whether there is correlation between observations 1 and 2, 2 and 3, 3 and 4, etc. This is known as lag-one autocorrelation, since one of each pair of tested observations lags the other by one period or sample. Similarly, it tests at other lags; for instance, the autocorrelation at lag four tests whether observations 1 and 5, 2 and 6, ..., 19 and 23, etc. are correlated.

  8. Estimates at longer lags have been shown to be statistically unreliable (Box and Jenkins, 1970). In some cases, the effect of autocorrelation at smaller lags will influence the estimate of autocorrelation at longer lags. For instance, a strong lag-one autocorrelation would cause observation 5 to influence observation 6, and observation 6 to influence observation 7. This results in an apparent correlation between observations 5 and 7, even though no direct correlation exists.

  9. Partial autocorrelation (PAC). The partial autocorrelation function (PACF) removes the effect of shorter-lag autocorrelation from the correlation estimate at longer lags. PAC is similar to AC, except that when calculating it, the autocorrelations of all the elements within the lag are partialled out: r_kk = (r_k − r²_{k−1}) / (1 − r²_{k−1}). PAC measures the correlation between an observation k periods ago and the current observation, after controlling for the observations at intermediate lags (all lags < k), that is, the correlation between y_t and y_{t−k} after removing the effects of y_{t−k+1}, y_{t−k+2}, ..., y_{t−1}.
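A minimal Python sketch of this idea, using the simplified formula from the slide (library routines such as statsmodels compute the PACF with the Durbin-Levinson recursion or OLS instead; the helper names and demo series here are illustrative assumptions):

    import numpy as np

    def sample_autocorrelation(x, k):
        """Sample lag-k autocorrelation r_k."""
        x = np.asarray(x, dtype=float)
        mu, var = x.mean(), x.var()
        return np.mean((x[:-k] - mu) * (x[k:] - mu)) / var

    def partial_autocorrelation(x, k):
        """Simplified PACF estimate from the slide: r_kk = (r_k - r_{k-1}^2) / (1 - r_{k-1}^2)."""
        r_k = sample_autocorrelation(x, k)
        if k == 1:
            return r_k                     # at lag 1 the PACF equals the ACF
        r_prev = sample_autocorrelation(x, k - 1)
        return (r_k - r_prev ** 2) / (1.0 - r_prev ** 2)

    rng = np.random.default_rng(2)
    x = np.convolve(rng.normal(size=300), [1.0, 0.6], mode="valid")  # MA(1)-like demo series
    print(partial_autocorrelation(x, 2))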

  10. Stages in building a Box-Jenkins time series model. 1. Model identification: this involves determining the order of the model required to capture the dynamic features of the data. Graphical procedures are used to determine the most appropriate specification (making sure that the variables are stationary, identifying seasonality, and using plots of the autocorrelation and partial autocorrelation functions).
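For example, a minimal sketch of this graphical identification step, assuming statsmodels and matplotlib are available (the demo series is arbitrary):

    import numpy as np
    import matplotlib.pyplot as plt
    from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

    rng = np.random.default_rng(0)
    # Arbitrary AR(1)-like demo series
    x = np.zeros(300)
    for t in range(1, 300):
        x[t] = 0.7 * x[t - 1] + rng.normal()

    fig, axes = plt.subplots(2, 1, figsize=(8, 6))
    plot_acf(x, lags=20, ax=axes[0])    # slow decay suggests an AR component
    plot_pacf(x, lags=20, ax=axes[1])   # cut-off after lag r suggests AR(r)
    plt.tight_layout()
    plt.show()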

  11. 2. Model estimation: this involves estimation of the parameters of the model specified in the identification step. The most common methods used are maximum likelihood estimation and non-linear least-squares estimation, depending on the model.
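As an illustration, a minimal sketch of maximum likelihood estimation of an ARMA model with statsmodels (the order and the demo series are arbitrary assumptions; statsmodels' ARIMA class fits an ARMA(r, m) model as ARIMA(r, 0, m)):

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    # Arbitrary ARMA(1, 1)-like demo series
    u = rng.normal(size=500)
    x = np.zeros(500)
    for t in range(1, 500):
        x[t] = 0.6 * x[t - 1] + u[t] + 0.3 * u[t - 1]

    # Fit ARMA(1, 1) by (Gaussian) maximum likelihood
    result = ARIMA(x, order=(1, 0, 1)).fit()
    print(result.params)     # constant, AR and MA coefficients, noise variance
    print(result.summary())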

  12. 3. Model validation: this involves model checking, that is, determining whether the model specified and estimated is adequate. Model validation thus deals with testing whether the estimated model conforms to the specifications of a stationary univariate process. It can be done by overfitting and residual diagnosis.
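For the residual-diagnosis part, a minimal sketch assuming statsmodels is available (the model order, demo series and lag choice are arbitrary; the Ljung-Box test is one common residual check, not necessarily the one used in the presentation):

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.stats.diagnostic import acorr_ljungbox

    rng = np.random.default_rng(0)
    x = np.cumsum(rng.normal(size=500)) * 0.05 + rng.normal(size=500)  # arbitrary demo series

    result = ARIMA(x, order=(1, 0, 1)).fit()

    # Residuals of an adequate model should behave like white noise,
    # i.e. show no significant autocorrelation at any lag.
    lb = acorr_ljungbox(result.resid, lags=[10])
    print(lb)   # large p-values -> no evidence of remaining autocorrelation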

  13. Model identification. In a Box-Jenkins model one has to determine whether the series is stationary and whether there is any significant seasonality that needs to be modelled.

  14. A stationary process has the property that the mean, variance and autocorrelation structure do not change over time. Stationarity can be assessed with a run sequence plot (a graph that displays the observed data in time sequence; a graphical data analysis technique for preliminary scanning of the data). Apart from the run sequence plot, the condition for stationarity of an AR model is that the roots of the characteristic equation all lie outside the unit circle. For example, x_t = 3x_{t−1} − 2.75x_{t−2} + 0.75x_{t−3} + u_t is not stationary because, of its roots 1, 2/3 and 2, only one lies outside the unit circle.

  15. Expressing x_t = 3x_{t−1} − 2.75x_{t−2} + 0.75x_{t−3} + u_t in lag operator notation gives x_t = 3Lx_t − 2.75L²x_t + 0.75L³x_t + u_t, i.e. (1 − 3L + 2.75L² − 0.75L³)x_t = u_t. The characteristic equation is 1 − 3z + 2.75z² − 0.75z³ = 0; solving for z, the solution set is {1, 2/3, 2}.
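A quick numerical check of this example, assuming numpy (np.roots expects the coefficients ordered from the highest power of z down):

    import numpy as np

    # Characteristic equation 1 - 3z + 2.75z^2 - 0.75z^3 = 0
    roots = np.roots([-0.75, 2.75, -3.0, 1.0])
    print(roots)                      # approximately [2.0, 1.0, 0.6667]

    # Stationarity of the AR model requires ALL roots to lie outside the unit circle.
    print(np.all(np.abs(roots) > 1))  # False -> the example process is not stationary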

  16. Seasonality means periodic fluctuations, that is, a certain basic pattern tends to be repeated at regular seasonal intervals. It can be assessed by a run sequence plot, a seasonal subseries plot, multiple box plots, or an autocorrelation plot.
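For instance, a minimal sketch of the "multiple box plots" check with pandas and matplotlib (the monthly demo data, seasonal pattern and column names are arbitrary assumptions for the example):

    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    dates = pd.date_range("2000-01-01", periods=120, freq="MS")   # 10 years of monthly data
    seasonal = 10 * np.sin(2 * np.pi * dates.month / 12)          # built-in annual cycle
    df = pd.DataFrame({"value": seasonal + rng.normal(0, 2, size=120),
                       "month": dates.month})

    # One box per calendar month: clearly different medians indicate seasonality
    df.boxplot(column="value", by="month")
    plt.suptitle("")
    plt.title("Monthly distribution of the series")
    plt.show()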
