Introduction to Bayesian Estimation

  1. Introduction to Bayesian Estimation. Wouter J. Den Haan, London School of Economics. © 2011 by Wouter J. Den Haan, May 31, 2015.

  2. Overview
• Maximum Likelihood
• A very useful tool: the Kalman filter
• Estimating DSGEs
• Maximum Likelihood & DSGEs
  • formulating the likelihood
  • singularity when the number of shocks is less than the number of observables
• Bayesian estimation
• Tools: Metropolis-Hastings

  3. Standard Maximum Likelihood problem
Theory: $y_t = a_0 + a_1 x_t + \varepsilon_t$, $\varepsilon_t \sim N(0, \sigma^2)$, $x_t$ exogenous.
Data: $\{ y_t, x_t \}_{t=1}^{T}$

  4. ML estimator
$$\max_{a_0, a_1, \sigma} \prod_{t=1}^{T} p(\varepsilon_t)$$
where
$$\varepsilon_t = y_t - a_0 - a_1 x_t, \qquad p(\varepsilon_t) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left( \frac{-\varepsilon_t^2}{2\sigma^2} \right)$$

  5. ML estimator
$$\max_{a_0, a_1, \sigma} \prod_{t=1}^{T} \frac{1}{\sigma\sqrt{2\pi}} \exp\left( \frac{-(y_t - a_0 - a_1 x_t)^2}{2\sigma^2} \right)$$
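
As a concrete illustration of slides 4-5 (not part of the original slides), here is a minimal Python sketch that maximizes this Gaussian log-likelihood numerically; the simulated data, starting values, and the choice of scipy's BFGS optimizer are illustrative assumptions.

```python
# Minimal sketch: maximize the Gaussian log-likelihood for
# y_t = a0 + a1*x_t + eps_t,  eps_t ~ N(0, sigma^2).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
T = 200
x = rng.normal(size=T)                         # exogenous regressor
y = 1.0 + 0.5 * x + 0.3 * rng.normal(size=T)   # simulated data (true a0=1, a1=0.5, sigma=0.3)

def neg_log_likelihood(theta):
    a0, a1, log_sigma = theta                  # parameterize log(sigma) so sigma stays positive
    sigma = np.exp(log_sigma)
    eps = y - a0 - a1 * x
    # log of prod_t (1/(sigma*sqrt(2*pi))) * exp(-eps_t^2 / (2*sigma^2))
    ll = -0.5 * T * np.log(2 * np.pi) - T * np.log(sigma) - np.sum(eps**2) / (2 * sigma**2)
    return -ll

res = minimize(neg_log_likelihood, x0=np.zeros(3), method="BFGS")
a0_hat, a1_hat, sigma_hat = res.x[0], res.x[1], np.exp(res.x[2])
print(a0_hat, a1_hat, sigma_hat)
```

Maximizing the log of the product (rather than the product itself) is numerically equivalent and avoids underflow.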

  6. Rudolf E. Kálmán, born in Budapest, Hungary, on May 19, 1930

  7. Kalman filter
• Linear projection
• Linear projection with orthogonal regressors
• Kalman filter
The slides for the Kalman filter are based on Ljungqvist and Sargent's textbook.

  8. Linear projection
• $y$: $n_y \times 1$ vector of random variables
• $x$: $n_x \times 1$ vector of random variables
• First and second moments exist:
$$E y = \mu_y, \quad E x = \mu_x, \quad \tilde{y} = y - \mu_y, \quad \tilde{x} = x - \mu_x,$$
$$E \tilde{x}\tilde{x}' = \Sigma_{xx}, \quad E \tilde{y}\tilde{y}' = \Sigma_{yy}, \quad E \tilde{y}\tilde{x}' = \Sigma_{yx}$$

  9. Definition of linear projection
The linear projection of $y$ on $x$ is the function
$$\hat{E}[y|x] = a + Bx,$$
where $a$ and $B$ are chosen to minimize
$$E\, \mathrm{trace}\left[ (y - a - Bx)(y - a - Bx)' \right]$$

  10. Formula for linear projection
The linear projection of $y$ on $x$ is given by
$$\hat{E}[y|x] = \mu_y + \Sigma_{yx} \Sigma_{xx}^{-1} (x - \mu_x)$$
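
The projection formula can be evaluated directly from moments. A minimal sketch (my own illustration, not from the slides), using simulated data and treating sample moments as estimates of the population moments:

```python
# Minimal sketch: E_hat[y|x] = mu_y + Sigma_yx * Sigma_xx^{-1} * (x - mu_x)
import numpy as np

rng = np.random.default_rng(1)
n = 1000
X = rng.normal(size=(n, 3))                                          # draws of x (n_x = 3)
Y = X @ np.array([[0.5], [-1.0], [2.0]]) + rng.normal(size=(n, 1))   # draws of y (n_y = 1)

mu_x, mu_y = X.mean(axis=0), Y.mean(axis=0)
Sigma_xx = np.cov(X, rowvar=False)                # estimate of E[(x-mu_x)(x-mu_x)']
Sigma_yx = np.cov(Y, X, rowvar=False)[:1, 1:]     # estimate of E[(y-mu_y)(x-mu_x)']

def linear_projection(x_point):
    """E_hat[y | x] evaluated at x_point."""
    return mu_y + Sigma_yx @ np.linalg.solve(Sigma_xx, x_point - mu_x)

print(linear_projection(np.array([1.0, 0.0, -1.0])))
```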

  11. Difference with linear regression problem
• True model:
$$y = \bar{B}x + \bar{D}z + \varepsilon, \quad E x = E z = E \varepsilon = 0, \quad E[\varepsilon | x, z] = 0, \quad E[z|x] \neq 0$$
$\bar{B}$ measures the effect of $x$ on $y$ keeping all else (also $z$ and $\varepsilon$) constant.
• Particular regression model: $y = \bar{B}x + u$

  12. Difference with linear regression problem
Comments:
• Least-squares estimate $\neq \bar{B}$
• Projection: $\hat{E}[y|x] = \bar{B}x + \bar{D}\,\hat{E}[z|x]$
• Projection is well defined, but the linear projection can include more than the direct effect.

  13. Message:
• You can always define the linear projection
• You don't have to worry about the properties of the error term.

  14. Linear projection with orthogonal regressors
• $x = [x_1, x_2]$ and suppose that $\Sigma_{x_1 x_2} = 0$
• $x_1$ and $x_2$ could be vectors
$$\hat{E}[y|x] = \mu_y + \Sigma_{yx} \Sigma_{xx}^{-1} (x - \mu_x)
= \mu_y + \begin{bmatrix} \Sigma_{y x_1} & \Sigma_{y x_2} \end{bmatrix} \begin{bmatrix} \Sigma_{x_1 x_1}^{-1} & 0 \\ 0 & \Sigma_{x_2 x_2}^{-1} \end{bmatrix} (x - \mu_x)
= \mu_y + \Sigma_{y x_1} \Sigma_{x_1 x_1}^{-1} (x_1 - \mu_{x_1}) + \Sigma_{y x_2} \Sigma_{x_2 x_2}^{-1} (x_2 - \mu_{x_2})$$
Thus
$$\hat{E}[y|x] = \hat{E}[y|x_1] + \hat{E}[y|x_2] - \mu_y \qquad (1)$$
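
A quick numerical check of equation (1) (my own illustration): with population moments chosen so that $\Sigma_{x_1 x_2} = 0$, the joint projection equals the sum of the two individual projections minus $\mu_y$. All numbers below are made up for illustration.

```python
# Minimal sketch: verify equation (1) with hypothetical population moments.
import numpy as np

mu_y = np.array([2.0])
mu_x1, mu_x2 = np.array([0.5]), np.array([-1.0])
Sigma_x1x1 = np.array([[2.0]]); Sigma_x2x2 = np.array([[3.0]])
Sigma_yx1 = np.array([[0.8]]);  Sigma_yx2 = np.array([[-0.4]])

x1, x2 = np.array([1.2]), np.array([0.3])

# Projections on x1 and x2 separately
proj_x1 = mu_y + Sigma_yx1 @ np.linalg.solve(Sigma_x1x1, x1 - mu_x1)
proj_x2 = mu_y + Sigma_yx2 @ np.linalg.solve(Sigma_x2x2, x2 - mu_x2)

# Joint projection with block-diagonal Sigma_xx (Sigma_x1x2 = 0)
mu_x = np.concatenate([mu_x1, mu_x2])
Sigma_xx = np.block([[Sigma_x1x1, np.zeros((1, 1))],
                     [np.zeros((1, 1)), Sigma_x2x2]])
Sigma_yx = np.hstack([Sigma_yx1, Sigma_yx2])
proj_joint = mu_y + Sigma_yx @ np.linalg.solve(Sigma_xx, np.concatenate([x1, x2]) - mu_x)

assert np.allclose(proj_joint, proj_x1 + proj_x2 - mu_y)   # equation (1)
```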

  15. Time Series Model
$$x_{t+1} = A x_t + G w_{1,t+1}$$
$$y_t = C x_t + w_{2,t}$$
$$E w_{1,t} = E w_{2,t} = 0$$
$$E \begin{bmatrix} w_{1,t+1} \\ w_{2,t} \end{bmatrix} \begin{bmatrix} w_{1,t+1} \\ w_{2,t} \end{bmatrix}' = \begin{bmatrix} V_1 & V_3 \\ V_3' & V_2 \end{bmatrix}$$

  16. Time Series Model
• $y_t$ is observed, but $x_t$ is not
• the coefficients are known (could even be time-varying)
• Initial condition: $x_1$ is a random variable (mean $\mu_{x_1}$ & covariance matrix $\Sigma_1$); it is not unusual that $\tilde{x}_1$ is simply set equal to $\mu_{x_1}$
• $w_{1,t+1}$ and $w_{2,t}$ are serially uncorrelated and orthogonal to $x_1$
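
For concreteness, a minimal sketch (not from the slides) of simulating data from this state-space model; the one-dimensional matrices are illustrative, and for simplicity the shocks are drawn independently here (i.e. $V_3 = 0$, an assumption made in this sketch only).

```python
# Minimal sketch: simulate x_{t+1} = A x_t + G w_{1,t+1},  y_t = C x_t + w_{2,t}.
import numpy as np

rng = np.random.default_rng(2)
A = np.array([[0.9]]); G = np.array([[1.0]]); C = np.array([[1.0]])
V1 = np.array([[0.04]])   # var(w1)
V2 = np.array([[0.01]])   # var(w2)

T = 100
x = np.zeros((T + 1, 1)); y = np.zeros((T, 1))
x[0] = rng.multivariate_normal(mean=[0.0], cov=[[0.2]])   # x_1 drawn from its distribution
for t in range(T):
    y[t] = C @ x[t] + rng.multivariate_normal([0.0], V2)        # observable
    x[t + 1] = A @ x[t] + G @ rng.multivariate_normal([0.0], V1)  # unobserved state
```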

  17. Objective
The objective is to calculate
$$\hat{E}_t x_{t+1} \equiv \hat{E}[x_{t+1} | y_t, y_{t-1}, \cdots, y_1, \tilde{x}_1] = \hat{E}[x_{t+1} | Y_t, \tilde{x}_1],$$
where $\tilde{x}_1$ is an initial estimate of $x_1$.
Trick: get a recursive formulation.

  18. Orthogonalization of the information set
• Let $\hat{y}_t = y_t - \hat{E}[y_t | \hat{y}_{t-1}, \hat{y}_{t-2}, \cdots, \hat{y}_1, \tilde{x}_1]$
• $\hat{Y}_t = \{ \hat{y}_t, \hat{y}_{t-1}, \cdots, \hat{y}_1 \}$
• space spanned by $\{ \tilde{x}_1, \hat{Y}_t \}$ = space spanned by $\{ \tilde{x}_1, Y_t \}$
• That is, anything that can be expressed as a linear combination of elements in $\{ \tilde{x}_1, \hat{Y}_t \}$ can be expressed as a linear combination of elements in $\{ \tilde{x}_1, Y_t \}$.

  19. Orthogonalization of the information set
• Then
$$\hat{E}[y_{t+1} | Y_t, \tilde{x}_1] = \hat{E}[y_{t+1} | \hat{Y}_t, \tilde{x}_1] = C \hat{E}[x_{t+1} | \hat{Y}_t, \tilde{x}_1] \qquad (2)$$

  20. Derivation of the Kalman filter
From (1) we get
$$\hat{E}[x_{t+1} | \hat{Y}_t, \tilde{x}_1] = \hat{E}[x_{t+1} | \hat{y}_t] + \hat{E}[x_{t+1} | \hat{Y}_{t-1}, \tilde{x}_1] - E x_{t+1} \qquad (3)$$
The first term in (3) is a standard linear projection:
$$\hat{E}[x_{t+1} | \hat{y}_t] = E x_{t+1} + \mathrm{cov}(x_{t+1}, \hat{y}_t) \left[ \mathrm{cov}(\hat{y}_t, \hat{y}_t) \right]^{-1} (\hat{y}_t - E \hat{y}_t)
= E x_{t+1} + \mathrm{cov}(x_{t+1}, \hat{y}_t) \left[ \mathrm{cov}(\hat{y}_t, \hat{y}_t) \right]^{-1} \hat{y}_t$$

  21. Some algebra
• Similar to the definition of $\hat{y}_t$, let
$$\hat{x}_{t+1} = x_{t+1} - \hat{E}[x_{t+1} | \hat{y}_t, \hat{y}_{t-1}, \cdots, \hat{y}_1, \tilde{x}_1] = x_{t+1} - \hat{E}_t x_{t+1}$$
• Let $\Sigma_{\hat{x},t} = E \hat{x}_t \hat{x}_t'$. Then
$$\mathrm{cov}(x_{t+1}, \hat{y}_t) = A \Sigma_{\hat{x},t} C' + G V_3, \qquad \mathrm{cov}(\hat{y}_t, \hat{y}_t) = C \Sigma_{\hat{x},t} C' + V_2$$
• To go from the unconditional covariance, $\mathrm{cov}(\cdot)$, to the conditional $\Sigma_{\hat{x},t}$ requires some algebra (see the appendix of Ljungqvist-Sargent for details)

  22. Using the derived expressions
$$\hat{E}[x_{t+1} | \hat{y}_t] = E x_{t+1} + \mathrm{cov}(x_{t+1}, \hat{y}_t) \left[ \mathrm{cov}(\hat{y}_t, \hat{y}_t) \right]^{-1} \hat{y}_t
= E x_{t+1} + \left( A \Sigma_{\hat{x},t} C' + G V_3 \right) \left( C \Sigma_{\hat{x},t} C' + V_2 \right)^{-1} \hat{y}_t \qquad (4)$$

  23. Derivation Kalman filter
• Now get an expression for the second term in (3).
• From $x_{t+1} = A x_t + G w_{1,t+1}$, we get
$$\hat{E}[x_{t+1} | \hat{Y}_{t-1}, \tilde{x}_1] = A \hat{E}[x_t | \hat{Y}_{t-1}, \tilde{x}_1] = A \hat{E}_{t-1} x_t \qquad (5)$$

  24. Using (4) and (5) in (3) gives the recursive expression
$$\hat{E}_t x_{t+1} = A \hat{E}_{t-1} x_t + K_t \hat{y}_t$$
where
$$K_t = \left( A \Sigma_{\hat{x},t} C' + G V_3 \right) \left( C \Sigma_{\hat{x},t} C' + V_2 \right)^{-1}$$

  25. Prediction for observable
From $y_{t+1} = C x_{t+1} + w_{2,t+1}$ we get
$$\hat{E}[y_{t+1} | Y_t, \tilde{x}_1] = C \hat{E}_t x_{t+1}$$
Thus
$$\hat{y}_{t+1} = y_{t+1} - C \hat{E}_t x_{t+1}$$

  26. Updating the covariance matrix
• We still need an equation to update $\Sigma_{\hat{x},t}$. This is actually not that hard. The result is
$$\Sigma_{\hat{x},t+1} = A \Sigma_{\hat{x},t} A' + G V_1 G' - K_t \left( A \Sigma_{\hat{x},t} C' + G V_3 \right)'$$
• The expression is deterministic and does not depend on particular realizations. That is, the precision only depends on the coefficients of the time series model.
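
Putting slides 24-26 together, here is a minimal sketch of the full recursion in the slides' notation. The function name and the way the inputs are passed are my own choices; data such as the series simulated after slide 16 could be fed in as `y`.

```python
# Minimal sketch: Kalman filter recursion with innovation y_hat_t, gain K_t,
# state prediction E_t x_{t+1}, and covariance Sigma_{x_hat,t+1}.
import numpy as np

def kalman_filter(y, A, G, C, V1, V2, V3, x_tilde_1, Sigma_1):
    T = y.shape[0]                   # y: (T, n_y) array of observations
    x_pred = x_tilde_1.copy()        # E_{t-1} x_t, initialized at the initial estimate x_tilde_1
    Sigma = Sigma_1.copy()           # Sigma_{x_hat, t}
    x_preds, innovations = [], []
    for t in range(T):
        y_hat = y[t] - C @ x_pred                            # innovation: y_t - C * E_{t-1} x_t
        S = C @ Sigma @ C.T + V2                             # cov(y_hat_t, y_hat_t)
        K = (A @ Sigma @ C.T + G @ V3) @ np.linalg.inv(S)    # Kalman gain K_t
        x_new = A @ x_pred + K @ y_hat                       # E_t x_{t+1} = A E_{t-1} x_t + K_t y_hat_t
        Sigma = (A @ Sigma @ A.T + G @ V1 @ G.T
                 - K @ (A @ Sigma @ C.T + G @ V3).T)         # covariance update
        x_pred = x_new
        x_preds.append(x_pred); innovations.append(y_hat)
    return np.array(x_preds), np.array(innovations)
```

Each pass through the loop applies equations (4)-(5) and the covariance update in the order derived above; the innovations it returns are exactly the $\hat{y}_t$ that enter the likelihood when the filter is used for estimation.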

  27. Applications Kalman filter
• signal extraction problems
• GPS, computer vision applications, missiles
• prediction
• simple alternative to calculating inverse policy functions (see below)

  28. Estimating DSGE models
• Forget the Kalman filter for now; we will not use it for a while
• What is next?
  • Specify the neoclassical model that will be used as an example
  • Specify the linearized version
  • Specify the estimation problem
  • Maximum Likelihood estimation
  • Explain why the Kalman filter is useful
  • Bayesian estimation
  • MCMC, a necessary tool to do Bayesian estimation

