When uniform weak convergence fails: Empirical processes for dependence functions and residuals via epi- and hypographs. Axel Bücher, Johan Segers and Stanislav Volgushev. Université catholique de Louvain and Ruhr-Universität Bochum.


  1. When uniform weak convergence fails: Empirical processes for dependence functions and residuals via epi- and hypographs
  Axel Bücher, Johan Segers and Stanislav Volgushev
  Université catholique de Louvain and Ruhr-Universität Bochum
  Van Dantzig Seminar, Mathematical Institute, Leiden University, 11 Apr 2014


  7. Motivation
  Uniform convergence of bounded functions: strong implications vs. restricted applicability.
  ◮ Implies pointwise, continuous and L^p convergence ...
  ◮ Well-developed weak convergence theory: a great success story in mathematical statistics [Van der Vaart and Wellner (1996): Weak Convergence and Empirical Processes]
  ◮ Many applications through the continuous mapping theorem and the functional delta method
  ◮ But: continuous functions cannot converge to jump functions
  [Figure: continuous functions plotted against a jump function on [0, 1]]
  ◮ Questions: Weaker metric? Weak convergence theory? Applications?

  8. Empirical processes via epi- and hypographs
  The empirical copula process
  Weak convergence with respect to the uniform metric
  Non-smooth copulas: when weak convergence fails
  The hypi-semimetric and weak convergence
  Applications



  13. Copulas
  ◮ A d-variate copula C is a d-variate distribution function with uniform(0, 1) margins.
  ◮ Sklar's (1959) theorem: If F is a d-variate distribution function with margins F_1, ..., F_d, then there exists a copula C such that
      F(x_1, ..., x_d) = C(F_1(x_1), ..., F_d(x_d))
  ◮ Moreover, if the margins are continuous, then C is unique and is given by the distribution function of (F_1(X_1), ..., F_d(X_d)), with (X_1, ..., X_d) ~ F:
      C(u_1, ..., u_d) = P[F_1(X_1) ≤ u_1, ..., F_d(X_d) ≤ u_d]
                       = P[X_1 ≤ F_1^-(u_1), ..., X_d ≤ F_d^-(u_d)]
                       = F(F_1^-(u_1), ..., F_d^-(u_d))
  with F_j^-(u) = inf{x : F_j(x) ≥ u} the generalized inverse (quantile function).
  ◮ Usage: modelling the dependence between the components X_1, ..., X_d, irrespective of their marginal distributions.
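The probability integral transform behind Sklar's theorem can be checked numerically. The following sketch (not part of the slides; numpy only) uses a Marshall-Olkin-type common-shock model, chosen solely because its margins are known in closed form: X_j = min(Z_j, Z) with Z, Z_1, Z_2 i.i.d. Exp(1), so X_j ~ Exp(2) and F_j(x) = 1 − exp(−2x). By the theorem, (F_1(X_1), F_2(X_2)) is distributed according to the copula C, so each transformed component should be uniform(0, 1).

```python
import numpy as np

# Sketch: verify that F_j(X_j) has uniform(0, 1) margins, as claimed by
# Sklar's theorem for continuous margins.  Model (our choice, for
# illustration only): X_j = min(Z_j, Z) with Z, Z_1, Z_2 i.i.d. Exp(1),
# hence X_j ~ Exp(2) with cdf F_j(x) = 1 - exp(-2x).
rng = np.random.default_rng(0)
n = 200_000
z = rng.exponential(size=n)                              # common shock
x = np.minimum(rng.exponential(size=(n, 2)), z[:, None]) # X_j = min(Z_j, Z)

u = 1.0 - np.exp(-2.0 * x)  # probability integral transform, componentwise

# uniform(0, 1) has mean 1/2 and variance 1/12
print(u[:, 0].mean(), u[:, 0].var())
```

The two components of u are dependent (they share the common shock Z); only their margins are uniform, which is exactly the separation of margins and dependence that copulas formalize.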


  17. The empirical copula
  ◮ Situation: (X_i)_{i=1,...,n} i.i.d. random vectors, X_i ~ F = C(F_1, ..., F_d), with continuous marginals F_j.
  [hence C(u) = F{F_1^-(u_1), ..., F_d^-(u_d)} with the generalized inverse F_j^-(u) = inf{x : F_j(x) ≥ u}]
  ◮ Goal: Estimate C nonparametrically.
  ◮ Simple plug-in estimation: the empirical cdfs
      F_n(x) := (1/n) Σ_{i=1}^n I(X_{i1} ≤ x_1, ..., X_{id} ≤ x_d),   F_{nj}(x_j) := (1/n) Σ_{i=1}^n I(X_{ij} ≤ x_j)
  yield the empirical copula
      C_n(u) = F_n{F_{n1}^-(u_1), ..., F_{nd}^-(u_d)}
             = n^{-1} Σ_{i=1}^n I{X_{i1} ≤ F_{n1}^-(u_1), ..., X_{id} ≤ F_{nd}^-(u_d)}
             = n^{-1} Σ_{i=1}^n I{Û_{i1} ≤ u_1, ..., Û_{id} ≤ u_d} + O(n^{-1})
  [where Û_{ij} = rank(X_{ij})/n are 'pseudo-observations' of C (rescaled ranks)]
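The rank-based formula above translates directly into code. A minimal sketch (not from the slides; bivariate case, no ties assumed, numpy only) computing C_n(u) via the pseudo-observations Û_{ij} = rank(X_{ij})/n:

```python
import numpy as np

def pseudo_observations(x):
    """Rescaled ranks rank(X_ij)/n, computed columnwise (assumes no ties)."""
    n = x.shape[0]
    ranks = np.argsort(np.argsort(x, axis=0), axis=0) + 1  # ranks in 1..n
    return ranks / n

def empirical_copula(x, u):
    """C_n(u) = (1/n) * #{i : U-hat_i1 <= u_1, ..., U-hat_id <= u_d}."""
    uhat = pseudo_observations(x)
    return np.mean(np.all(uhat <= np.asarray(u), axis=1))

rng = np.random.default_rng(1)
x = rng.standard_normal((1000, 2))          # independent components
# For the independence copula, C(u, v) = u * v, so C_n(0.5, 0.5) ~ 0.25
print(empirical_copula(x, (0.5, 0.5)))
```

Note that C_n depends on the data only through the ranks, which is why the estimator is invariant under strictly increasing transformations of the marginals.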


  21. The empirical copula process
  u ↦ ℂ_n(u) = √n {C_n(u) − C(u)} ∈ ℓ^∞([0, 1]^d) is called the empirical copula process.
  [ℓ^∞([0, 1]^d) is the space of bounded functions on [0, 1]^d.]
  Many applications.
  ◮ Testing for structural assumptions. Example: symmetry [Genest, Nešlehová, Quessy (2012)]. Null hypothesis: C(u, v) = C(v, u) for all u, v.
      T_n = n ∫∫ {C_n(u, v) − C_n(v, u)}² du dv  =_(H_0)  ∫∫ {ℂ_n(u, v) − ℂ_n(v, u)}² du dv
  ◮ Minimum-distance estimators of parametric copulas [Tsukahara (2005)]. {C_θ | θ ∈ Θ} a class of parametric candidate models. Estimator:
      θ̂ := argmin_θ ∫∫ {C_θ(u, v) − C_n(u, v)}² du dv
  ◮ Goodness-of-fit tests, asymptotics of estimators for the Pickands dependence function, ...
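The symmetry statistic T_n on this slide is straightforward to approximate numerically. A sketch (our own illustration, not the procedure of Genest, Nešlehová and Quessy; the grid size m and midpoint rule are arbitrary choices) that replaces the double integral by a Riemann sum over an m × m grid:

```python
import numpy as np

def pseudo_observations(x):
    """Rescaled ranks rank(X_ij)/n, columnwise (assumes no ties)."""
    n = x.shape[0]
    return (np.argsort(np.argsort(x, axis=0), axis=0) + 1) / n

def symmetry_statistic(x, m=50):
    """T_n = n * integral of {C_n(u,v) - C_n(v,u)}^2, midpoint rule on an
    m x m grid (the grid approximation is our choice, for illustration)."""
    n = x.shape[0]
    uhat = pseudo_observations(x)
    grid = (np.arange(1, m + 1) - 0.5) / m          # midpoints of [0, 1]
    uu, vv = np.meshgrid(grid, grid, indexing="ij")

    def cn(a, b):
        # empirical copula on the whole grid, broadcasting over observations
        return np.mean(
            (uhat[:, 0][:, None, None] <= a) & (uhat[:, 1][:, None, None] <= b),
            axis=0,
        )

    diff = cn(uu, vv) - cn(vv, uu)
    return n * np.mean(diff ** 2)                    # Riemann approximation

rng = np.random.default_rng(2)
x = rng.standard_normal((500, 2))   # exchangeable model: H_0 holds
print(symmetry_statistic(x))
```

Sanity check on the construction: if the sample itself is made exactly exchangeable by stacking each observation with its coordinate swap, the set of pseudo-observations is swap-invariant, so C_n(u, v) = C_n(v, u) everywhere and T_n vanishes exactly.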
