

  1. Extensions of Saddlepoint-Based Bootstrap Inference With Application to the First Order Moving Average Model (SPBB Inference for the MA(1)). Alex Trindade, Dept. of Mathematics & Statistics, Texas Tech University; Rob Paige, Missouri University of Science and Technology; Indika Wickramasinghe, Eastern New Mexico University. August 2013. alex.trindade@ttu.edu

  2. Outline
  - Overview of SPBB inference: the Saddlepoint-Based Bootstrap
    - An approximate parametric bootstrap for a scalar parameter θ
  - Motivating Application: the MA(1) Model
    - Estimators as roots of quadratic estimating equations (QEEs)
    - SPBB 95% CI coverages and lengths better than asymptotic
  - Extension 1: Non-Monotone QEEs
    - Problem: non-monotonicity invalidates SPBB
    - Solution: double-SPA and importance sampling
  - Extension 2: Non-Gaussian QEEs
    - Problem: need QEEs with a tractable MGF...
    - Solution: elliptically contoured distributions, and some tricks...
    - Example: SPBB for the Laplace MA(1)

  3. Saddlepoint-Based Bootstrap (SPBB) Inference
  Pioneered by Paige, Trindade, & Fernando (SJS, 2009):
  - SPBB is an approximate percentile parametric bootstrap: replace (slow) MC simulation with a (fast) saddlepoint approximation (SPA);
  - estimators are roots of a QEE (quadratic estimating equation);
  - enjoys near-exact performance; orders of magnitude faster than the bootstrap;
  - may be the only alternative to the bootstrap when no exact or asymptotic procedures exist.
  Idea: relate the distribution of the root of the QEE Ψ(θ) to that of the estimator θ̂; under normality of the data there is a closed form for the MGF of the QEE; use it to saddlepoint-approximate the distribution of the estimator (PDF or CDF); the CDF can then be pivoted to get a CI... numerically! This leads to second-order accurate CIs, with coverage error O(n⁻¹).

  4. SPBB: Key Steps (skip...)
  - Estimator θ̂ of θ₀ solves the QEE Ψ(θ) = xᵀ A_θ x = 0.
  - Assume x ∼ N(µ, Σ), which gives a closed form for the MGF of the QEE.
  - QEE monotone (e.g., decreasing) in θ implies: F_θ̂(t) = P(θ̂ ≤ t) = P(Ψ(t) ≤ 0) = F_{Ψ(t)}(0).
  - Nuisance parameter λ: substitute the conditional MLE, λ̂_θ.
  - Now accurately approximate the distribution of θ̂ via the SPA: F_θ̂(t; θ₀, λ₀) ≈ F̂_θ̂(t; θ₀, λ̂_{θ₀}) = F̂_{Ψ(t)}(0; θ₀, λ̂_{θ₀}).
  - A CI (θ_L, θ_U) is produced by pivoting the SPA of the CDF: F̂_{Ψ(θ̂_obs)}(0; θ_L, λ̂_{θ_L}) = 1 − α/2 and F̂_{Ψ(θ̂_obs)}(0; θ_U, λ̂_{θ_U}) = α/2.
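  A minimal sketch of the pivoting step in Python. It assumes a user-supplied callable spa_cdf(theta) returning the saddlepoint-approximated value F̂_{Ψ(θ̂_obs)}(0; θ, λ̂_θ); the function name, the bracketing endpoints, and the assumption that this SPA CDF decreases in θ across the brackets are illustrative choices, not details from the talk.

```python
import numpy as np
from scipy.optimize import brentq

def spbb_ci(spa_cdf, theta_obs, alpha=0.05, lo=-0.999, hi=0.999):
    """Pivot a saddlepoint-approximated CDF to get an SPBB confidence interval.

    spa_cdf(theta) should return F_hat_{Psi(theta_obs)}(0; theta, lambda_hat_theta),
    i.e. the SPA to P(Psi(theta_obs) <= 0) when the true parameter is theta
    (nuisance parameters profiled out). Brackets (lo, hi) are heuristic and
    assume the SPA CDF is monotone decreasing in theta over them.
    """
    # Lower endpoint: SPA CDF equals 1 - alpha/2
    theta_L = brentq(lambda th: spa_cdf(th) - (1 - alpha / 2), lo, theta_obs)
    # Upper endpoint: SPA CDF equals alpha/2
    theta_U = brentq(lambda th: spa_cdf(th) - alpha / 2, theta_obs, hi)
    return theta_L, theta_U
```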

  5. SPBB: An Approximate Parametric Bootstrap
  [Flow diagram: θ̂ solves Ψ(θ) = 0; the exact CDF F_θ̂(θ̂_obs) is intractable (and the bootstrap too expensive); since Ψ(θ) is monotone, F_θ̂(θ̂_obs) = F_{Ψ(θ̂_obs)}(0), which is saddlepoint-approximated via the MGF of Ψ(θ) and then pivoted to give the CI (θ_L, θ_U).]

  6. The MA(1): World's Simplest Model?
  Model: Z_t ∼ iid(0, σ²), X_t = θ₀ Z_{t−1} + Z_t, |θ₀| ≤ 1.
  Uses:
  - special case of more general ARMA models;
  - perhaps most useful for testing whether data have been over-differenced: differencing white noise X_t = Z_t gives an MA(1) with θ₀ = −1, since Y_t ≡ X_t − X_{t−1} = Z_t − Z_{t−1};
  - connection with unit-root tests in econometrics (Tanaka, 1990; Davis et al., 1995; Davis & Dunsmuir, 1996; Davis & Song, 2011).
  Inference: complicated...
  - common estimators (MOME, LSE, MLE) have mixed distributions: point masses at ±1 and a continuous part over (−1, 1);
  - the LSE and MLE are roots of polynomials of degree ≈ 2n.
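  A short Python sketch of the model and its moment estimator, for orientation. It assumes Gaussian noise for the simulation and uses the uncentered lag-1 autocorrelation (appropriate for the zero-mean model above); the closed-form root below is the standard inversion of ρ(1) = θ/(1+θ²), and the function names are illustrative.

```python
import numpy as np

def simulate_ma1(n, theta0, sigma=1.0, rng=None):
    """Simulate X_t = theta0 * Z_{t-1} + Z_t with iid N(0, sigma^2) noise."""
    rng = np.random.default_rng(rng)
    z = rng.normal(scale=sigma, size=n + 1)
    return theta0 * z[:-1] + z[1:]

def mome_ma1(x):
    """Method-of-moments estimator: solve rho(1) = theta / (1 + theta^2)
    for theta, using the lag-1 sample autocorrelation; the estimate is set
    to +/-1 when no real root exists inside (-1, 1)."""
    x = np.asarray(x)
    r1 = np.sum(x[:-1] * x[1:]) / np.sum(x**2)
    if r1 == 0:
        return 0.0
    if abs(r1) >= 0.5:                       # no real root inside (-1, 1)
        return float(np.sign(r1))
    return (1 - np.sqrt(1 - 4 * r1**2)) / (2 * r1)
```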

  7. Unification of Parameter Estimators (New)
  Theorem. For |θ| < 1, the MOME, LSE, and MLE are all roots of a QEE, Ψ(θ) = xᵀ A_θ x, where the symmetric matrix A_θ in each case is:
  - MOME: A_θ = (1 + θ²) J_n − 2θ I_n, and the QEE is monotone in θ;
  - LSE: A_θ = Ω_θ⁻¹ [J_n + 2θ I_n] Ω_θ⁻¹;
  - MLE: A_θ = function(θ, I_n, J_n, Ω_θ⁻¹).
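  A sketch of the MOME quadratic form in Python. It assumes J_n is the symmetric matrix with ones on the first sub- and super-diagonals (so xᵀ J_n x = 2 Σ x_t x_{t+1}); that convention is an assumption on my part, chosen because it reproduces the usual MA(1) moment equation (1+θ²) γ̂(1) = θ γ̂(0).

```python
import numpy as np

def A_mome(theta, n):
    """MOME quadratic-form matrix A_theta = (1 + theta^2) J_n - 2*theta*I_n.

    J_n is taken here to be the symmetric matrix with ones on the first sub-
    and super-diagonals (assumption), so x' J_n x = 2 * sum_t x_t x_{t+1}.
    """
    J = np.eye(n, k=1) + np.eye(n, k=-1)
    return (1 + theta**2) * J - 2 * theta * np.eye(n)

def qee_mome(theta, x):
    """Evaluate Psi(theta) = x' A_theta x; the MOME is a root of this in theta."""
    x = np.asarray(x)
    return x @ A_mome(theta, len(x)) @ x
```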

  8. SPA densities of estimators: MOME, LSE, MLE, AN
  [Figure: SPA densities of the MOME, CLSE, and MLE compared with the asymptotic normal (AN) density, for n = 10, 20 and θ = 0.4, 0.8, plotted over θ ∈ (−1, 1).]

  9. 95% CI Coverages & Lengths for MOME (Gaussian Noise)

                     Coverage Probability        Average Length
   n    θ₀          SPBB    Boot    AN          SPBB    Boot    AN
   10   0.4         0.940   0.432   0.997       1.484   1.438   0.561
   10   0.8         0.948   0.358   0.259       1.336   1.653   1.300
   20   0.4         0.953   0.717   1.000       1.095   1.560   0.334
   20   0.8         0.960   0.524   0.693       1.005   1.692   1.616

  10. Extension 1: Non-Monotone Estimating Equations
  Monotonicity of the QEE is key (Daniels, 1983); SPA for the PDF of θ̂:
  f̂_θ̂(t) = f̂_{Ψ(t)}(0) · J_D(t),   J_D(t) = −(1/ŝ) ∂K_{Ψ(t)}(ŝ)/∂t.
  Skovgaard (1990) & Spady (1991) give an expression for f̂_θ̂ whose Jacobian does not require monotonicity of Ψ(t) in t:
  f̂_θ̂(t) = f̂_{Ψ(t)}(0) · J_S(t),   J_S(t) = E[ |Ψ̇(t)| | Ψ(t) = 0 ].
  But: J_S(t) is an intractable conditional expectation...
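  A numerical sketch of the two ingredients above, under stated assumptions: the CGF K_{Ψ(t)}(s) and its first two derivatives in s are supplied as callables (placeholders, not from the talk), the saddlepoint bracket is heuristic, and the t-derivative in J_D is taken by central differences.

```python
import numpy as np
from scipy.optimize import brentq

def spa_density_at_zero(K, Kp, Kpp, s_lo=-10.0, s_hi=10.0):
    """SPA to f_{Psi(t)}(0): solve K'(s_hat) = 0 (saddlepoint for the point 0),
    then f_hat = exp(K(s_hat)) / sqrt(2*pi*K''(s_hat)).
    K, Kp, Kpp are callables for the CGF of Psi(t) and its s-derivatives;
    the bracket (s_lo, s_hi) is assumed to contain the saddlepoint."""
    s_hat = brentq(Kp, s_lo, s_hi)
    return np.exp(K(s_hat)) / np.sqrt(2 * np.pi * Kpp(s_hat)), s_hat

def daniels_jacobian(K_ts, t, s_hat, dt=1e-4):
    """J_D(t) = -(1/s_hat) * dK_{Psi(t)}(s_hat)/dt, with the t-derivative
    approximated by central differences; K_ts(t, s) is the CGF as a
    function of both arguments (placeholder)."""
    return -(K_ts(t + dt, s_hat) - K_ts(t - dt, s_hat)) / (2 * dt * s_hat)
```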

  11. Solution: Double-SPA & Importance Sampling
  Define f(z) ≡ f_{Ψ̇(t) | Ψ(t)}(z | 0).
  Algorithm. For a sufficiently large integer m, instrumental density g(z) ∼ t₃, and a grid of values t ∈ [−1, 1], do:
  - draw an iid sample z₁, ..., z_m from g(z);
  - for i = 1, ..., m, obtain the double-SPA to f(z_i) (Butler, 2007);
  - form the importance-sampling approximation to J_S(t) as Ĵ_S(t) = [Σᵢ |z_i| f(z_i)/g(z_i)] / [Σᵢ f(z_i)/g(z_i)];
  - obtain f̂_{Ψ(t)}(0) and set f̂_θ̂(t) = f̂_{Ψ(t)}(0) Ĵ_S(t).
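  A minimal sketch of the importance-sampling step for a fixed t. The double-SPA density is passed in as a callable placeholder (its construction, per Butler 2007, is not shown here); the instrumental density is Student-t with 3 df as on the slide, and the sample size m is an arbitrary default.

```python
import numpy as np
from scipy import stats

def js_importance(double_spa_density, m=2000, rng=None):
    """Self-normalized importance-sampling approximation to
    J_S(t) = E[|Psi_dot(t)| | Psi(t) = 0].

    double_spa_density(z) stands in for the double-SPA to
    f(z) = f_{Psi_dot(t)|Psi(t)}(z|0) at a fixed t (assumption: supplied by caller).
    """
    rng = np.random.default_rng(rng)
    g = stats.t(df=3)                                   # instrumental density g ~ t_3
    z = g.rvs(size=m, random_state=rng)                 # iid draws z_1, ..., z_m
    w = np.array([double_spa_density(zi) for zi in z]) / g.pdf(z)   # weights f(z_i)/g(z_i)
    return np.sum(np.abs(z) * w) / np.sum(w)            # J_S_hat(t)
```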

  12. Example: Density of MLE (Gaussian Noise)
  [Figure: Skovgaard (solid) and Daniels (dashed) SPA densities of the MLE overlaid on the empirical density (histogram), for n = 10, 20 and θ = 0.4, 0.8.]

  13. Extension 2: Non-Gaussian QEEs
  General problem with SPBB: need QEEs with a tractable MGF...
  One solution: elliptically contoured (EC) distributions. Provost and Cheong (2002): the pdf of y ∼ EC_n(µ, Σ, φ) can be expressed as an infinite mixture of normals,
  f(y) = ∫₀^∞ w(t) φ_n(y; µ, Σ/t) dt,
  where φ_n(y; µ, Σ/t) is the PDF of the n-dimensional N_n(µ, Σ/t), and w(t) is an appropriate weighting function integrating to 1 over ℝ⁺.

  14. Relate MGFs of QEE: Gaussian vs. Elliptically Contoured
  Let M_N(s; µ, Σ) be the MGF of Ψ(θ) ≡ yᵀ A_θ y when y ∼ N_n(µ, Σ); then M_N(s; µ, Σ) has a well-known closed form. Let M_EC(s; µ, Σ) be the MGF of Ψ(θ) when y ∼ EC_n(µ, Σ, φ) with weighting function w(t).
  Theorem. With the above definitions,
  M_EC(s; µ, Σ) = ∫₀^∞ w(t) M_N(s; µ, Σ/t) dt.
  Still have the problems of: finding members of the EC family with known w(t), and performing a one-dimensional integral.
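  A sketch of the mixture relation in Python, assuming w(t) is supplied as a callable for whichever EC member is in play and that s lies inside the convergence region; the closed form used for M_N is the standard Gaussian quadratic-form MGF, and the plain quadrature here is only a stand-in for whatever integration scheme one would actually use.

```python
import numpy as np
from scipy.integrate import quad

def mgf_quadform_normal(s, A, mu, Sigma):
    """MGF of Psi = y'Ay for y ~ N_n(mu, Sigma): the standard closed form
    |I - 2sA Sigma|^{-1/2} * exp{ s * mu' A (I - 2s Sigma A)^{-1} mu }."""
    n = len(mu)
    I = np.eye(n)
    det = np.linalg.det(I - 2 * s * A @ Sigma)
    if det <= 0:                         # s outside the convergence region
        return np.nan
    quad_term = s * mu @ A @ np.linalg.solve(I - 2 * s * Sigma @ A, mu)
    return det ** -0.5 * np.exp(quad_term)

def mgf_quadform_ec(s, A, mu, Sigma, w, upper=np.inf):
    """MGF under an elliptically contoured law, via the mixture
    M_EC(s) = int_0^inf w(t) * M_N(s; mu, Sigma/t) dt (numerical quadrature)."""
    integrand = lambda t: w(t) * mgf_quadform_normal(s, A, mu, Sigma / t)
    val, _ = quad(integrand, 0, upper)
    return val
```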

  15. Example: SPBB for Laplace MA(1)
  Example. Have QEE Ψ(θ) = xᵀ A_θ x with x ∼ multivariate Laplace. This yields the MGF
  M_{Ψ(θ)}(s) = [Γ(n/2) / (Γ(n) 2^{(5+n)/2} √π)] ∫₀^∞ e^{−q(t)} dt,   (*)
  where q(t) = 1/(8t) + (3/2) log(t) + (1/2) log[p(t)], and, with Σ₀ = (4n + 4)⁻¹ Ω_{θ₀},
  p(t) = |t I_n − 2s Σ₀ A_θ| = tⁿ − tr(2s Σ₀ A_θ) t^{n−1} + O(s²).
  Now Laplace-approximate the integral in (*).
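  A minimal sketch of the final step, the Laplace approximation to ∫₀^∞ e^{−q(t)} dt: locate the minimizer t̂ of q and replace the integral by e^{−q(t̂)} √(2π / q″(t̂)). The function q is passed in as a callable; the search bracket and finite-difference step are heuristics of mine, not values from the slides.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def laplace_approx_integral(q, bracket=(1e-6, 1.0, 1e3), h=1e-5):
    """Laplace approximation to int_0^inf exp(-q(t)) dt.

    Finds the minimizer t_hat of q (bracket assumed to contain it), then
    returns exp(-q(t_hat)) * sqrt(2*pi / q''(t_hat)), with q''(t_hat)
    estimated by central differences."""
    res = minimize_scalar(q, bracket=bracket)
    t_hat = res.x
    q2 = (q(t_hat + h) - 2 * q(t_hat) + q(t_hat - h)) / h**2   # numerical q''(t_hat)
    return np.exp(-q(t_hat)) * np.sqrt(2 * np.pi / q2)
```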
