Extensions of Saddlepoint-Based Bootstrap Inference With Application to the First Order Moving Average Model (PowerPoint presentation)



SLIDE 1

Extensions of Saddlepoint-Based Bootstrap Inference With Application to the First Order Moving Average Model

(SPBB Inference for the MA(1))

Alex Trindade, Dept. of Mathematics & Statistics, Texas Tech University
Rob Paige, Missouri University of Science and Technology
Indika Wickramasinghe, Eastern New Mexico University

August 2013

alex.trindade@ttu.edu

SLIDE 2

Outline

1. Overview of SPBB inference: Saddlepoint-Based Bootstrap
  • An approximate parametric bootstrap for a scalar parameter θ
2. Motivating Application: MA(1) Model
  • Estimators as roots of quadratic estimating equations (QEEs)
  • SPBB 95% CI coverages & lengths better than asymptotic
3. Extension 1: Non-Monotone QEEs
  • Problem: non-monotonicity invalidates SPBB
  • Solution: double-SPA & importance sampling
4. Extension 2: Non-Gaussian QEEs
  • Problem: need QEEs with a tractable MGF...
  • Solution: elliptically contoured distributions, and some tricks...
  • Example: SPBB for the Laplace MA(1)

SLIDE 3

Saddlepoint-Based Bootstrap (SPBB) Inference

Pioneered by Paige, Trindade, & Fernando (SJS, 2009). SPBB is an approximate percentile parametric bootstrap that replaces (slow) Monte Carlo simulation with a (fast) saddlepoint approximation (SPA), for estimators that are roots of a QEE (quadratic estimating equation):

  • enjoys near-exact performance;
  • orders of magnitude faster than the bootstrap;
  • may be the only alternative to the bootstrap if no exact or asymptotic procedures exist.

Idea:

  • relate the distribution of the root of the QEE Ψ(θ) to that of the estimator θ̂;
  • under normality of the data, the MGF of the QEE has a closed form;
  • use it to saddlepoint-approximate the distribution of the estimator (PDF or CDF);
  • the CDF can be pivoted to get a CI... numerically!
  • leads to second-order accurate CIs; coverage error is O(n⁻¹).

SLIDE 4

SPBB: Key Steps (skip...)

Estimator θ̂ of θ0 solves the QEE Ψ(θ) = x⊺A_θ x = 0.

Assume: x ∼ N(μ, Σ) ⟹ closed form for the MGF of the QEE.

QEE monotone (e.g., decreasing) in θ implies:

  F_θ̂(t) = P(θ̂ ≤ t) = P(Ψ(t) ≤ 0) = F_Ψ(t)(0)

Nuisance parameter λ: substitute the conditional MLE, λ̂_θ.

Now: accurately approximate the distribution of θ̂ via SPA:

  F_θ̂(t; θ0, λ0) ≈ F̂_θ̂(t; θ0, λ̂_θ0) = F̂_Ψ(t)(0; θ0, λ̂_θ0)

CI (θ_L, θ_U) produced by pivoting the SPA of the CDF:

  F̂_Ψ(θ̂_obs)(0; θ_L, λ̂_θL) = 1 − α/2,  F̂_Ψ(θ̂_obs)(0; θ_U, λ̂_θU) = α/2
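The pivoting step can be sketched numerically. As a stand-in for the SPA CDF F̂_Ψ(θ̂_obs)(0; θ, λ̂_θ) (whose construction is beyond a short sketch), the example below pivots a tractable normal CDF with SciPy's `brentq` root-finder; `cdf_hat`, `t_obs`, and `se` are illustrative names, not from the slides.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def pivot_ci(cdf_hat, bracket, alpha=0.05):
    """Pivot a CDF that is decreasing in theta into a (1 - alpha) CI:
    solve cdf_hat(theta_L) = 1 - alpha/2 and cdf_hat(theta_U) = alpha/2."""
    lo, hi = bracket
    theta_L = brentq(lambda th: cdf_hat(th) - (1 - alpha / 2), lo, hi)
    theta_U = brentq(lambda th: cdf_hat(th) - alpha / 2, lo, hi)
    return theta_L, theta_U

# Stand-in CDF: theta_hat ~ N(theta, se^2) with observed value t_obs,
# so F(t_obs; theta) = Phi((t_obs - theta)/se), decreasing in theta.
t_obs, se = 0.30, 0.10
cdf_hat = lambda th: norm.cdf((t_obs - th) / se)

ci_lo, ci_hi = pivot_ci(cdf_hat, bracket=(-2.0, 2.0))
# Recovers the familiar interval t_obs -/+ 1.96 * se
```

Because the stand-in CDF is the normal one, the pivoted endpoints coincide with the textbook interval, which makes the mechanics easy to verify before swapping in an intractable CDF.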

SLIDE 5

SPBB: An Approximate Parametric Bootstrap

Target: F_θ̂(θ̂_obs), where θ̂ solves Ψ(θ) = 0. Intractable! (And the bootstrap is too expensive...)

  • Ψ(θ) monotone ⟹ F_θ̂(θ̂_obs) = F_Ψ(θ̂_obs)(0);
  • SPA via the MGF of Ψ(θ) gives F̂_Ψ(θ̂_obs)(0);
  • pivot to obtain (θ_L, θ_U).
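For contrast, here is a minimal sketch of the (slow) Monte Carlo parametric bootstrap that SPBB replaces, estimating F_θ̂(θ̂_obs) by simulating MA(1) series at a trial θ; `mome_ma1` and `boot_cdf` are hypothetical helper names, and the closed-form MOME is used for speed.

```python
import numpy as np

def mome_ma1(x):
    """Closed-form MOME for MA(1): solves rho(1) = theta/(1 + theta^2)."""
    x = x - x.mean()
    r1 = (x[:-1] * x[1:]).sum() / (x * x).sum()
    if abs(r1) >= 0.5:                 # no root in (-1, 1): boundary estimate
        return float(np.sign(r1))
    if r1 == 0.0:
        return 0.0
    return (1 - np.sqrt(1 - 4 * r1**2)) / (2 * r1)

def boot_cdf(theta, t_obs, n=20, B=4000, seed=0):
    """Monte Carlo estimate of F_thetahat(t_obs) when the true parameter is theta."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((B, n + 1))
    x = z[:, 1:] + theta * z[:, :-1]   # B simulated MA(1) series
    ests = np.array([mome_ma1(row) for row in x])
    return float((ests <= t_obs).mean())

c_low, c_high = boot_cdf(0.1, t_obs=0.4), boot_cdf(0.7, t_obs=0.4)
```

The estimated CDF at θ̂_obs = 0.4 decreases as the trial θ grows, which is exactly the monotonicity the pivoting step exploits; the cost is B full simulations per trial θ, per pivot iteration.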

SLIDE 6

The MA(1): World’s Simplest Model?

Model: X_t = θ0 Z_{t−1} + Z_t, Z_t ∼ iid (0, σ²), |θ0| ≤ 1.

Uses:

  • special case of more general ARMA models;
  • perhaps most useful in testing whether data have been over-differenced: differencing white noise X_t = Z_t gives an MA(1) with θ0 = −1, since Y_t ≡ X_t − X_{t−1} = Z_t − Z_{t−1};
  • connection with unit-root tests in econometrics (Tanaka, 1990; Davis et al., 1995; Davis & Dunsmuir, 1996; Davis & Song, 2011).

Inference: complicated...

  • common estimators (MOME, LSE, MLE) have mixed distributions: point masses at ±1 and a continuous part over (−1, 1);
  • LSE & MLE are roots of polynomials of degree ≈ 2n.
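The over-differencing point is easy to check by simulation: an MA(1) with θ0 = −1 has lag-1 autocorrelation ρ(1) = θ0/(1 + θ0²) = −1/2, so differenced white noise should show a sample lag-1 autocorrelation near −0.5.

```python
import numpy as np

rng = np.random.default_rng(42)
z = rng.standard_normal(100_000)             # white noise: X_t = Z_t
y = np.diff(z)                               # Y_t = X_t - X_{t-1} = Z_t - Z_{t-1}

y = y - y.mean()
r1 = (y[:-1] * y[1:]).sum() / (y * y).sum()  # sample lag-1 autocorrelation
# MA(1) with theta0 = -1 has rho(1) = theta0 / (1 + theta0^2) = -1/2
```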

SLIDE 7

Unification of Parameter Estimators (New)

Theorem

For |θ| < 1, the MOME, LSE, and MLE are all roots of a QEE, Ψ(θ) = x⊺A_θ x, where the symmetric matrix A_θ in each case is:

  • MOME: A_θ = (1 + θ²)J_n − 2θI_n, and the QEE is monotone in θ;
  • LSE: A_θ = Ω_θ⁻¹[J_n + 2θI_n]Ω_θ⁻¹;
  • MLE: A_θ = function(θ, I_n, J_n, Ω_θ⁻¹).
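The MOME case can be checked numerically. The slides do not define J_n; the sketch below assumes J_n is the symmetric matrix with ones on the first off-diagonals (so x⊺J_n x = 2∑ x_t x_{t+1}), under which the root of Ψ(θ) coincides with the closed-form method-of-moments estimator.

```python
import numpy as np
from scipy.optimize import brentq

def quad_form_psi(theta, x):
    """Psi(theta) = x^T A_theta x with A_theta = (1 + theta^2) J_n - 2 theta I_n.
    Assumption: J_n has ones on the first off-diagonals, zeros elsewhere."""
    n = len(x)
    J = np.eye(n, k=1) + np.eye(n, k=-1)
    A = (1 + theta**2) * J - 2 * theta * np.eye(n)
    return x @ A @ x

rng = np.random.default_rng(1)
n, theta0 = 200, 0.4
z = rng.standard_normal(n + 1)
x = z[1:] + theta0 * z[:-1]                  # simulated MA(1) sample

# Unique root of the QEE in (-1, 1) while |r1| < 1/2
theta_hat = brentq(lambda th: quad_form_psi(th, x), -0.999, 0.999)

# Closed-form MOME from rho(1) = theta / (1 + theta^2)
r1 = (x[:-1] * x[1:]).sum() / (x * x).sum()
theta_mome = (1 - np.sqrt(1 - 4 * r1**2)) / (2 * r1)
```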

SLIDE 8

SPA densities of estimators: MOME, LSE, MLE, AN

[Figure: SPA densities of the MOME, CLSE, and MLE, with the asymptotic normal (AN) density, for n ∈ {10, 20} and θ ∈ {0.4, 0.8}.]

SLIDE 9

95% CI Coverages & Lengths for MOME (Gaussian Noise)

Settings        Coverage Probability        Average Length
 n    θ0       SPBB    Boot    AN          SPBB    Boot    AN
10    0.4      0.940   0.432   0.997       1.484   1.438   0.561
10    0.8      0.948   0.358   0.259       1.336   1.653   1.300
20    0.4      0.953   0.717   1.000       1.095   1.560   0.334
20    0.8      0.960   0.524   0.693       1.005   1.692   1.616

SLIDE 10

Extension 1: Non-Monotone Estimating Equations

Monotonicity of the QEE is key (Daniels, 1983); SPA for the PDF of θ̂:

  f̂_θ̂(t) = f̂_Ψ(t)(0) J_D(t),  J_D(t) = −(1/ŝ) ∂K_Ψ(t)(ŝ)/∂t

Skovgaard (1990) & Spady (1991) give an expression for f̂_θ̂ whose Jacobian does not require monotonicity of Ψ(t) in t:

  f̂_θ̂(t) = f̂_Ψ(t)(0) J_S(t),  J_S(t) = E[ |Ψ̇(t)| | Ψ(t) = 0 ]

But: J_S(t) is an intractable conditional expectation...

SLIDE 11

Solution: Double-SPA & Importance Sampling

Define: f(z) ≡ f̂_{Ψ̇(t)|Ψ(t)}(z|0).

Algorithm

For a sufficiently large integer m, instrumental density g(z) ∼ t₃, and grid of values t ∈ [−1, 1], do:

  • draw an iid sample z₁, ..., z_m from g(z);
  • for i = 1, ..., m, obtain the double-SPA to f(z_i) (Butler, 2007);
  • form the importance sampling approximation to J_S(t) as

      Ĵ_S(t) = [∑ᵢ₌₁ᵐ |z_i| f(z_i)/g(z_i)] / [∑ᵢ₌₁ᵐ f(z_i)/g(z_i)];

  • obtain f̂_Ψ(t)(0) and set f̂_θ̂(t) = f̂_Ψ(t)(0) Ĵ_S(t).
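The self-normalized importance sampling step above can be sketched with a tractable stand-in for the double-SPA density f(z): take f to be the N(0,1) density, so the target E[|Z|] = √(2/π) is known exactly and only the weighting mechanics are being exercised.

```python
import numpy as np
from scipy.stats import norm, t

rng = np.random.default_rng(7)
m = 200_000
z = t.rvs(df=3, size=m, random_state=rng)    # iid draws from instrumental g ~ t_3

# Stand-in for the double-SPA conditional density f(z): here f = N(0,1),
# so the target J = E[|Z|] = sqrt(2/pi) ~ 0.7979 is known.
w = norm.pdf(z) / t.pdf(z, df=3)             # importance weights f(z_i)/g(z_i)
J_hat = float(np.sum(np.abs(z) * w) / np.sum(w))
```

A heavy-tailed instrumental like t₃ keeps the weights f/g bounded, which is what makes the self-normalized estimate stable.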

SLIDE 12

Example: Density of MLE (Gaussian Noise)

[Figure: empirical density (histogram) of the MLE with the Skovgaard (solid) and Daniels (dashed) SPAs, for n ∈ {10, 20} and θ ∈ {0.4, 0.8}.]

SLIDE 13

Extension 2: Non-Gaussian QEEs

General problem with SPBB: need QEEs with a tractable MGF... One solution: elliptically contoured (EC) distributions. Provost and Cheong (2002): the PDF of y ∼ EC_n(μ, Σ, φ) can be expressed as an infinite mixture of normals,

  f(y) = ∫₀^∞ w(t) φ_n(y; μ, Σ/t) dt,

where φ_n(y; μ, Σ/t) is the PDF of the n-dim N_n(μ, Σ/t) and w(t) is an appropriate weighting function integrating to 1 over ℝ⁺.
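The mixture representation can be verified numerically for one EC member, the multivariate t, whose weight w(t) is a Gamma(ν/2, rate ν/2) density (a standard fact about the t family, not stated on the slide); the specific μ, Σ, and evaluation point below are illustrative.

```python
import numpy as np
from scipy import integrate
from scipy.stats import gamma, multivariate_normal, multivariate_t

nu = 5.0
mu = np.array([0.0, 1.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
y = np.array([0.3, 0.2])

def mixture_pdf(y):
    """f(y) = int_0^inf w(t) * phi_n(y; mu, Sigma/t) dt, with the Gamma weight
    w(t) = Gamma(nu/2, rate nu/2) that yields a multivariate t_nu density."""
    integrand = lambda t: (gamma.pdf(t, a=nu / 2, scale=2 / nu)
                           * multivariate_normal.pdf(y, mean=mu, cov=Sigma / t))
    val, _ = integrate.quad(integrand, 0, np.inf)
    return val

direct = multivariate_t.pdf(y, loc=mu, shape=Sigma, df=nu)
# mixture_pdf(y) matches the multivariate t density evaluated directly
```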

SLIDE 14

Relate MGFs of QEE: Gaussian vs. Elliptically Contoured

Let M_N(s; μ, Σ) be the MGF of Ψ(θ) ≡ y⊺A_θ y when y ∼ N_n(μ, Σ); then M_N(s; μ, Σ) = ...well known form...

Let M_EC(s; μ, Σ) be the MGF of Ψ(θ) when y ∼ EC_n(μ, Σ, φ) with weighting function w(t).

Theorem

With the above definitions:

  M_EC(s; μ, Σ) = ∫₀^∞ w(t) M_N(s; μ, Σ/t) dt

Still have the problems of: finding members of the EC family with known w(t), and performing a 1-dim integral.
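For the centered case, the "well known form" is M_N(s; 0, Σ) = |I − 2sAΣ|^(−1/2), valid while 1 − 2sλᵢ > 0 for all eigenvalues λᵢ of AΣ. A sketch checking this against the equivalent eigenvalue product and a Monte Carlo average (A and Σ below are arbitrary illustrative matrices, not the MA(1) ones):

```python
import numpy as np

A = np.array([[1.0, 0.3, 0.0],
              [0.3, 0.6, 0.2],
              [0.0, 0.2, 0.4]])              # illustrative symmetric A_theta
Sigma = np.array([[1.0, 0.4, 0.0],
                  [0.4, 1.0, 0.3],
                  [0.0, 0.3, 1.0]])          # illustrative covariance
s = 0.05

# Determinant form of the MGF (mu = 0 case)
M_det = np.linalg.det(np.eye(3) - 2 * s * A @ Sigma) ** -0.5

# Eigenvalue form: Q is distributed as sum_i lambda_i * chi^2_1
lam = np.linalg.eigvals(A @ Sigma).real      # real, since A Sigma is similar
M_eig = np.prod(1 - 2 * s * lam) ** -0.5     # to Sigma^(1/2) A Sigma^(1/2)

# Monte Carlo check of E[exp(s * y' A y)] for y ~ N(0, Sigma)
rng = np.random.default_rng(3)
Y = rng.standard_normal((500_000, 3)) @ np.linalg.cholesky(Sigma).T
Q = np.einsum('ij,jk,ik->i', Y, A, Y)
M_mc = float(np.exp(s * Q).mean())
```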

SLIDE 15

Example: SPBB for Laplace MA(1)

Example

Have QEE Ψ(θ) = x⊺A_θ x with x ∼ multivariate Laplace. Yields MGF

  M_Ψ(θ)(s) = [Γ(n/2) / (Γ(n) 2^((5+n)/2) √π)] ∫ e^(−q(t)) dt,  (*)

  q(t) = 1/(8t) + (3/2) log(t) + (1/2) log[p(t)].

With Σ0 = (4n + 4)⁻¹ Ω_θ0,

  p(t) = |t I_n − 2s Σ0 A_θ| = tⁿ − tr(2s Σ0 A_θ) tⁿ⁻¹ + O(s²).

Now Laplace-approximate the integral in (*).
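The last step uses the standard Laplace approximation ∫ e^(−q(t)) dt ≈ e^(−q(t̂)) √(2π/q″(t̂)) at an interior minimizer t̂ of q. A sketch on a toy exponent q(t) = t − n log t (whose integral over (0, ∞) is Γ(n+1) = n!), not the MA(1) exponent above:

```python
import numpy as np
from scipy import integrate

# Laplace approximation of int_0^inf exp(-q(t)) dt around the minimizer t_hat.
# Toy exponent: q(t) = t - n*log(t), so the exact integral is Gamma(n+1) = n!.
n = 20.0
q = lambda t: t - n * np.log(t)
t_hat = n                                    # q'(t) = 1 - n/t = 0  =>  t_hat = n
qpp = n / t_hat**2                           # q''(t) = n/t^2

laplace = np.exp(-q(t_hat)) * np.sqrt(2 * np.pi / qpp)
exact, _ = integrate.quad(lambda t: np.exp(-q(t)), 0, np.inf)
# Laplace recovers Stirling's formula; relative error is about 1/(12n), ~0.4% here
```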

SLIDE 16

SPBB approx PDFs of MOME in Laplace MA(1)

[Figure: SPBB approximate PDFs of the MOME under the Laplace MA(1), for n ∈ {5, 10} and θ ∈ {0.4, 0.8}.]

SLIDE 17

Key References

Butler, R.W. (2007), Saddlepoint Approximations with Applications, New York: Cambridge University Press.

Cryer, J.D., and Ledolter, J. (1981), "Small-Sample Properties of the Maximum Likelihood Estimator in the First-Order Moving Average Model", Biometrika, 68, 691–694.

Daniels, H.E. (1983), "Saddlepoint approximations for estimating equations", Biometrika, 70, 89–96.

Paige, R.L., Trindade, A.A., and Fernando, P.H. (2009), "Saddlepoint-based bootstrap inference for quadratic estimating equations", Scand. J. Stat., 36, 98–111.

Provost, S.B., and Cheong, Y-H. (2002), "The Distribution of Hermitian Quadratic Forms in Elliptically Contoured Random Vectors", J. Statist. Plan. & Inf., 102, 303–316.

Skovgaard, I.M. (1990), "On the density of minimum contrast estimators", Ann. Statist., 18, 779–789.
