Forecasting (PowerPoint PPT Presentation)


SLIDE 1

Forecasting

  • General problem: predict x_{n+m} given x_n, x_{n-1}, . . . , x_1.
  • General solution: the (conditional) distribution of x_{n+m} given x_n, x_{n-1}, . . . , x_1.
  • In particular, the conditional mean is the best predictor (i.e. it attains the minimum mean squared prediction error).
  • Special case: if {x_t} is Gaussian, the conditional distribution is also Gaussian, with a conditional mean that is a linear function of x_n, x_{n-1}, . . . , x_1 and a conditional variance that does not depend on x_n, x_{n-1}, . . . , x_1.

SLIDE 2

Linear Forecasting

  • What if {x_t} is not Gaussian?
  • Use the best linear predictor, written x^n_{n+m} (the superscript n indicates prediction from the n observations x_n, x_{n-1}, . . . , x_1).
  • Not the best possible predictor, but computable.
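A small Monte Carlo sketch of the gap (the model y = x^2 + noise and all numbers here are illustrative, not from the slides): when x is uniform on (-1, 1), cov(x, x^2) = 0, so the best linear predictor of y is just the constant E[y] = 1/3, while the conditional mean E[y|x] = x^2 does strictly better.

```python
import random

random.seed(0)
sigma = 0.1          # noise standard deviation (illustrative)
n = 200_000

se_cond = se_lin = 0.0
for _ in range(n):
    x = random.uniform(-1.0, 1.0)
    y = x * x + random.gauss(0.0, sigma)
    se_cond += (y - x * x) ** 2      # conditional mean E[y|x] = x^2
    se_lin += (y - 1.0 / 3.0) ** 2   # best linear predictor: constant E[y] = 1/3

mse_cond = se_cond / n   # close to sigma^2 = 0.01
mse_lin = se_lin / n     # close to sigma^2 + Var(x^2) = 0.01 + 4/45
```

The linear predictor pays an extra Var(x^2) = 4/45 in mean squared error; computable, but not best.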

SLIDE 3

One-step Prediction

  • The hard way: suppose

        x^n_{n+1} = φ_{n,1} x_n + φ_{n,2} x_{n-1} + · · · + φ_{n,n} x_1.

  • Choose φ_{n,1}, φ_{n,2}, . . . , φ_{n,n} to minimize the mean squared prediction error E[x_{n+1} - x^n_{n+1}]^2.
  • Differentiate and equate to zero: n linear equations in the n unknowns.
  • Solve recursively (in n) using the Durbin-Levinson algorithm. Incidentally, the PACF at lag n is φ_{n,n}.
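The recursion can be sketched as follows (a minimal pure-Python version; the function name is mine, not from the slides). Assuming the autocovariances γ(0), . . . , γ(n) are given, order m coefficients come from order m-1 via φ_{m,m} = [γ(m) - Σ_{k} φ_{m-1,k} γ(m-k)] / v_{m-1}, where v_m is the one-step mean squared prediction error at order m:

```python
def durbin_levinson(gamma):
    """Durbin-Levinson recursion.

    gamma: autocovariances gamma[0], ..., gamma[n] (needs at least two).
    Returns (coeffs, v): coeffs[m-1] = [phi_{m,1}, ..., phi_{m,m}] are the
    best linear one-step predictor coefficients based on m past values,
    and v[m-1] is the corresponding mean squared prediction error.
    The PACF at lag m is coeffs[m-1][-1] = phi_{m,m}.
    """
    n = len(gamma) - 1
    coeffs = [[gamma[1] / gamma[0]]]
    v = [gamma[0] * (1.0 - coeffs[0][0] ** 2)]
    for m in range(2, n + 1):
        prev = coeffs[-1]
        num = gamma[m] - sum(prev[k] * gamma[m - 1 - k] for k in range(m - 1))
        phi_mm = num / v[-1]
        coeffs.append([prev[k] - phi_mm * prev[m - 2 - k] for k in range(m - 1)]
                      + [phi_mm])
        v.append(v[-1] * (1.0 - phi_mm ** 2))
    return coeffs, v

# Sanity check on an AR(1) with coefficient 0.5 and sigma_w^2 = 1:
# gamma(h) = 0.5**h / (1 - 0.25); the PACF is 0.5 at lag 1 and 0 beyond,
# and the one-step MSE settles at sigma_w^2 = 1.
gamma = [0.5 ** h / 0.75 for h in range(5)]
coeffs, mse = durbin_levinson(gamma)
```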

SLIDE 4

One-step Prediction for an ARMA Model

  • The easy way: suppose we can write

        x_{n+1} = (some linear combination of x_n, x_{n-1}, . . . , x_1) + (something uncorrelated with x_n, x_{n-1}, . . . , x_1).

  • Then the first part is the best linear predictor, and the second part is the prediction error.
  • E.g. AR(p) with p ≤ n:

        x_{n+1} = φ_1 x_n + φ_2 x_{n-1} + · · · + φ_p x_{n+1-p} + w_{n+1},

    where φ_1 x_n + · · · + φ_p x_{n+1-p} is the first part and w_{n+1} is the second part.
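A quick simulation check of the AR(p) case (an illustrative AR(2) with parameters of my choosing): since the predictor φ_1 x_n + φ_2 x_{n-1} leaves exactly w_{n+1} as its error, the empirical mean squared prediction error should estimate σ_w^2.

```python
import random

# Illustrative stationary AR(2): x_{t+1} = phi1*x_t + phi2*x_{t-1} + w_{t+1}.
random.seed(0)
phi1, phi2, sigma_w = 0.6, 0.2, 1.0
n = 100_000

x_prev, x_curr = 0.0, 0.0
sq_err = 0.0
for _ in range(n):
    pred = phi1 * x_curr + phi2 * x_prev   # best linear one-step predictor
    w = random.gauss(0.0, sigma_w)
    x_next = pred + w                      # the AR(2) recursion itself
    sq_err += (x_next - pred) ** 2         # prediction error is exactly w
    x_prev, x_curr = x_curr, x_next

mse = sq_err / n   # should be close to sigma_w^2 = 1
```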

SLIDE 5

General ARMA case

  • Now

        x_{n+1} = φ_1 x_n + φ_2 x_{n-1} + · · · + φ_p x_{n+1-p}
                + θ_1 w_n + θ_2 w_{n-1} + · · · + θ_q w_{n+1-q} + w_{n+1}.

  • The first part on the right-hand side is a linear combination of x_n, x_{n-1}, . . . , x_1.
  • The last part, w_{n+1}, is uncorrelated with x_n, x_{n-1}, . . . , x_1.

SLIDE 6
  • Middle part? If the model is invertible, w_t is a linear combination of x_t, x_{t-1}, . . . , so if n is large, we can truncate the sum at x_1, and w_n, w_{n-1}, . . . , w_{n+1-q} are all (approximately) linear combinations of x_n, x_{n-1}, . . . , x_1.
  • So the middle part is also approximately a linear combination of x_n, x_{n-1}, . . . , x_1, whence

        x^n_{n+1} = φ_1 x_n + φ_2 x_{n-1} + · · · + φ_p x_{n+1-p}
                  + θ_1 w_n + θ_2 w_{n-1} + · · · + θ_q w_{n+1-q},

    and w_{n+1} is the prediction error, x_{n+1} - x^n_{n+1}.
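The truncation idea can be sketched for an ARMA(1,1) (function and parameter names are mine, not from the slides): invert the recursion w_t = x_t - φ x_{t-1} - θ w_{t-1}, pretending the pre-sample values x_0 and w_0 are zero, then form x^n_{n+1} = φ x_n + θ w_n. For an invertible model the truncation error dies off geometrically in θ.

```python
import random

def arma11_one_step(x, phi, theta):
    """Truncated one-step predictor x^n_{n+1} for an invertible ARMA(1,1).

    Recovers approximate innovations via w_t = x_t - phi*x_{t-1} - theta*w_{t-1},
    treating the pre-sample values x_0 and w_0 as zero (the truncation).
    """
    w = x[0]  # w_1 = x_1 under the truncation
    for t in range(1, len(x)):
        w = x[t] - phi * x[t - 1] - theta * w
    return phi * x[-1] + theta * w

# Sanity check on simulated data that really does start from x_0 = w_0 = 0,
# so the recursion recovers the innovations exactly and the truncated
# predictor matches phi*x_n + theta*w_n.
random.seed(1)
phi, theta = 0.5, 0.4   # illustrative parameters
w_true = [random.gauss(0.0, 1.0) for _ in range(50)]
x = [w_true[0]]
for t in range(1, 50):
    x.append(phi * x[-1] + theta * w_true[t - 1] + w_true[t])

pred = arma11_one_step(x, phi, theta)
exact = phi * x[-1] + theta * w_true[-1]
```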

SLIDE 7

Multi-step Prediction

  • The easy way: build on one-step prediction. E.g. two-step:

        x_{n+2} = φ_1 x_{n+1} + φ_2 x_n + · · · + φ_p x_{n+2-p}
                + θ_1 w_{n+1} + θ_2 w_n + · · · + θ_q w_{n+2-q} + w_{n+2}.

  • Replace x_{n+1} by x^n_{n+1} + w_{n+1}:

        x_{n+2} = φ_1 x^n_{n+1} + φ_2 x_n + · · · + φ_p x_{n+2-p}
                + θ_2 w_n + · · · + θ_q w_{n+2-q} + w_{n+2} + (φ_1 + θ_1) w_{n+1}.

SLIDE 8
  • The first two parts are again (approximately) linear combinations of x_n, x_{n-1}, . . . , x_1, and the last is uncorrelated with x_n, x_{n-1}, . . . , x_1. So

        x^n_{n+2} = φ_1 x^n_{n+1} + φ_2 x_n + · · · + φ_p x_{n+2-p}
                  + θ_2 w_n + · · · + θ_q w_{n+2-q},

    and the prediction error is x_{n+2} - x^n_{n+2} = w_{n+2} + (φ_1 + θ_1) w_{n+1}.
  • Note that the mean squared prediction error is

        σ_w^2 [1 + (φ_1 + θ_1)^2] ≥ σ_w^2.

    Mean squared prediction error increases as we predict further into the future.
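More generally, the m-step mean squared prediction error can be written σ_w^2 Σ_{j=0}^{m-1} ψ_j^2, where ψ_j are the causal (MA(∞)) weights and ψ_1 = φ_1 + θ_1, so m = 2 recovers the formula on this slide. A sketch for the ARMA(1,1) case, where ψ_0 = 1 and ψ_j = φ^{j-1}(φ + θ) for j ≥ 1 (function names and parameter values are mine):

```python
def psi_weights(phi, theta, m):
    """First m psi-weights of a causal ARMA(1,1):
    psi_0 = 1, psi_1 = phi + theta, psi_j = phi * psi_{j-1} for j >= 2."""
    psi = [1.0]
    for j in range(1, m):
        psi.append(phi * psi[-1] + (theta if j == 1 else 0.0))
    return psi

def mstep_mse(phi, theta, sigma2_w, m):
    """m-step mean squared prediction error: sigma_w^2 * sum of psi_j^2."""
    return sigma2_w * sum(p * p for p in psi_weights(phi, theta, m))

# With phi_1 = 0.5, theta_1 = 0.3 and sigma_w^2 = 1, the two-step MSE is
# 1 + (phi_1 + theta_1)^2 = 1.64, and the MSE is nondecreasing in the horizon.
mses = [mstep_mse(0.5, 0.3, 1.0, m) for m in range(1, 6)]
```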

SLIDE 9

Forecasting with proc arima

  • E.g. the fishery recruitment data.
  • proc arima program and output.
  • Note that predictions approach the series mean, and “std errors” approach the series standard deviation.
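That behaviour is easy to see analytically in the simplest case. For a mean-zero AR(1), the m-step forecast is x^n_{n+m} = φ^m x_n → 0 (the series mean), and the m-step MSE is σ_w^2 Σ_{j=0}^{m-1} φ^{2j} → σ_w^2 / (1 - φ^2), the series variance. A numerical sketch (parameter values are mine, not the recruitment fit):

```python
# Illustrative mean-zero AR(1): current state x_n and parameters chosen ad hoc.
phi, sigma2_w, x_n = 0.8, 1.0, 2.5

horizons = range(1, 25)
forecasts = [phi ** m * x_n for m in horizons]            # -> series mean (0)
mses = [sigma2_w * sum(phi ** (2 * j) for j in range(m))  # -> series variance
        for m in horizons]
series_var = sigma2_w / (1.0 - phi ** 2)
```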

SLIDE 10
  • The “autocorrelation test for residuals” is borderline, largely because of residual autocorrelations at lags 12, 24, . . . .
  • Spectrum analysis shows that these are caused by seasonal means, which can be removed: proc arima program and output.