Assessing the dependence of high-dimensional time series via sample autocovariances and correlations


  1. Assessing the dependence of high-dimensional time series via sample autocovariances and correlations. Johannes Heiny, Ruhr University Bochum, Germany. Joint work with Thomas Mikosch (Copenhagen), Richard Davis (Columbia), and Jianfeng Yao (HKU). KIAS, Random Matrices and Related Topics, May 9, 2019.

  2. Motivation: S&P 500 Index. [Scatter plot of estimated upper tail index against lower tail index.] Figure: Estimated tail indices of log-returns of 478 time series in the S&P 500 index.
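A minimal sketch of the kind of tail-index estimation behind this figure. The slide does not state which estimator was used; the Hill estimator below and the simulated Pareto data are illustrative assumptions only.

    import numpy as np

    def hill_tail_index(x, k):
        # Hill estimator of the tail index from the k largest order statistics of x.
        order = np.sort(x)[::-1]            # descending order statistics
        logs = np.log(order[:k + 1])        # k+1 largest observations (assumed positive)
        return 1.0 / np.mean(logs[:k] - logs[k])

    # Simulated Pareto(3) data standing in for (absolute) log-returns.
    rng = np.random.default_rng(0)
    sample = rng.pareto(3.0, size=2000) + 1.0
    print(hill_tail_index(sample, k=200))   # estimate should be close to 3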

  3. Setup. Data matrix X = X_p: a p × n matrix with iid centered columns, X = (X_{it})_{i=1,...,p; t=1,...,n}. Sample covariance matrix S = (1/n) X X′. Ordered eigenvalues of S: λ_1(S) ≥ λ_2(S) ≥ ⋯ ≥ λ_p(S). Applications: Principal Component Analysis, Linear Regression, ...
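A short numerical sketch of this setup, with simulated Gaussian data standing in for X:

    import numpy as np

    rng = np.random.default_rng(1)
    p, n = 100, 400
    X = rng.standard_normal((p, n))                  # placeholder p x n data matrix
    S = X @ X.T / n                                  # sample covariance matrix S = (1/n) X X'
    lam = np.sort(np.linalg.eigvalsh(S))[::-1]       # lambda_1(S) >= ... >= lambda_p(S)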

  4. Sample Correlation Matrix. Sample correlation matrix R with entries R_{ij} = S_{ij} / √(S_{ii} S_{jj}), i, j = 1,...,p, and eigenvalues λ_1(R) ≥ λ_2(R) ≥ ⋯ ≥ λ_p(R).
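A sketch of how R is obtained from S, using the same kind of placeholder data as above:

    import numpy as np

    rng = np.random.default_rng(1)
    p, n = 100, 400
    X = rng.standard_normal((p, n))
    S = X @ X.T / n
    d = np.sqrt(np.diag(S))
    R = S / np.outer(d, d)                           # R_ij = S_ij / sqrt(S_ii S_jj)
    lam_R = np.sort(np.linalg.eigvalsh(R))[::-1]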

  5. The Model. Data structure: X_p = A_p Z_p, where A_p is a deterministic p × p matrix such that (‖A_p‖) is bounded, and Z_p = (Z_{it})_{i=1,...,p; t=1,...,n} has iid, centered entries with unit variance (if finite).

  6. The Model. Data structure: X_p = A_p Z_p, where A_p is a deterministic p × p matrix such that (‖A_p‖) is bounded, and Z_p = (Z_{it})_{i=1,...,p; t=1,...,n} has iid, centered entries with unit variance (if finite). Population covariance matrix Σ = A A′. Population correlation matrix Γ = (diag(Σ))^{−1/2} Σ (diag(Σ))^{−1/2}. Note: E[S] = Σ but E[R_{ij}] = Γ_{ij} + O(n^{−1}).
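A sketch of the model; the banded matrix below is an arbitrary bounded-norm choice for A_p, not one taken from the slides:

    import numpy as np

    rng = np.random.default_rng(2)
    p, n = 100, 400
    A = np.eye(p) + 0.3 * np.diag(np.ones(p - 1), k=1)   # illustrative A_p with bounded norm
    Z = rng.standard_normal((p, n))                      # iid centered entries, unit variance
    X = A @ Z

    Sigma = A @ A.T                                      # population covariance
    dinv = 1.0 / np.sqrt(np.diag(Sigma))
    Gamma = Sigma * np.outer(dinv, dinv)                 # population correlation
    S = X @ X.T / n                                      # E[S] = Sigma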

  7. The Model. Covariance matrix: sample S, population Σ. Correlation matrix: sample R, population Γ.

  8. The Model. Covariance matrix: sample S, population Σ. Correlation matrix: sample R, population Γ. Growth regime: n = n_p → ∞ and p/n → γ ∈ [0, ∞) as p → ∞. High dimension: lim_{p→∞} p/n ∈ (0, ∞). Moderate dimension: lim_{p→∞} p/n = 0.

  9. Main Result. Approximation under finite fourth moment: Assume X = AZ and E[Z_{11}^4] < ∞. Then, as p → ∞, √(n/p) ‖diag(S) − diag(Σ)‖ → 0 a.s. Approximation under infinite fourth moment: Assume X = Z and E[Z_{11}^4] = ∞. Then, as p → ∞, ‖S − diag(S)‖ / c_{np} → 0 in probability.

  10. Main result. Assume X = AZ and E[Z_{11}^4] < ∞. Then, as p → ∞, √(n/p) ‖diag(S) − diag(Σ)‖ → 0 a.s., and √(n/p) ‖(diag(S))^{−1/2} − (diag(Σ))^{−1/2}‖ → 0 a.s.
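A quick numerical check of the first statement for a single simulated draw (with A = I, so Σ = I); this only illustrates the claim, it does not prove almost sure convergence:

    import numpy as np

    rng = np.random.default_rng(3)
    p, n = 500, 2000
    Z = rng.standard_normal((p, n))                      # finite fourth moment
    S = Z @ Z.T / n
    # Operator norm of the diagonal matrix diag(S) - diag(Sigma) is the largest |S_ii - 1|.
    print(np.sqrt(n / p) * np.max(np.abs(np.diag(S) - 1.0)))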

  11. Main result. Assume X = AZ and E[Z_{11}^4] < ∞. Then, as p → ∞, √(n/p) ‖diag(S) − diag(Σ)‖ → 0 a.s., and √(n/p) ‖(diag(S))^{−1/2} − (diag(Σ))^{−1/2}‖ → 0 a.s. Relevance: Note that R = (diag(S))^{−1/2} S (diag(S))^{−1/2}. Moreover, S = (1/n) X X′ and R = Y Y′, where Y = (Y_{ij})_{p×n} with Y_{ij} = X_{ij} / √(Σ_{t=1}^n X_{it}^2). In general, any two entries of Y are dependent.
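A sketch verifying the self-normalization identity R = Y Y′ numerically:

    import numpy as np

    rng = np.random.default_rng(4)
    p, n = 50, 200
    X = rng.standard_normal((p, n))
    S = X @ X.T / n
    d = np.sqrt(np.diag(S))
    R = S / np.outer(d, d)

    Y = X / np.sqrt(np.sum(X**2, axis=1, keepdims=True))  # Y_ij = X_ij / sqrt(sum_t X_it^2)
    print(np.max(np.abs(R - Y @ Y.T)))                     # agreement up to rounding error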

  12. A Comparison Under Finite Fourth Moment. Approximation of the sample correlation matrix: Assume X = AZ and E[Z_{11}^4] < ∞. Then √(n/p) ‖R − S_Q‖ → 0 a.s., where S_Q = (diag(Σ))^{−1/2} S (diag(Σ))^{−1/2}.

  13. A Comparison Under Finite Fourth Moment. Approximation of the sample correlation matrix: Assume X = AZ and E[Z_{11}^4] < ∞. Then √(n/p) ‖R − S_Q‖ → 0 a.s., where S_Q = (diag(Σ))^{−1/2} S (diag(Σ))^{−1/2}. Spectrum comparison: an application of Weyl's inequality yields √(n/p) max_{i=1,...,p} |λ_i(R) − λ_i(S_Q)| ≤ √(n/p) ‖R − S_Q‖ → 0 a.s.

  14. A Comparison Under Finite Fourth Moment. Approximation of the sample correlation matrix: Assume X = AZ and E[Z_{11}^4] < ∞. Then √(n/p) ‖R − S_Q‖ → 0 a.s., where S_Q = (diag(Σ))^{−1/2} S (diag(Σ))^{−1/2}. Spectrum comparison: an application of Weyl's inequality yields √(n/p) max_{i=1,...,p} |λ_i(R) − λ_i(S_Q)| ≤ √(n/p) ‖R − S_Q‖ → 0 a.s. Operator norm consistent estimation: ‖R − Γ‖ = O(√(p/n)) a.s.
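A sketch of the spectrum comparison in the simplest case Σ = I (so S_Q = S and Γ = I); the dimensions are illustrative only:

    import numpy as np

    rng = np.random.default_rng(5)
    p, n = 200, 1000
    X = rng.standard_normal((p, n))
    S = X @ X.T / n                                    # here S_Q = S since diag(Sigma) = I
    d = np.sqrt(np.diag(S))
    R = S / np.outer(d, d)

    lam_R = np.sort(np.linalg.eigvalsh(R))[::-1]
    lam_SQ = np.sort(np.linalg.eigvalsh(S))[::-1]
    print(np.sqrt(n / p) * np.max(np.abs(lam_R - lam_SQ)))    # small, as in the Weyl bound
    print(np.linalg.norm(R - np.eye(p), 2) / np.sqrt(p / n))  # ||R - Gamma|| / sqrt(p/n) stays bounded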

  15. Notation. Empirical spectral distribution of a p × p matrix C with real eigenvalues λ_1(C),...,λ_p(C): F_C(x) = (1/p) Σ_{i=1}^p 1{λ_i(C) ≤ x}, x ∈ ℝ. Stieltjes transform: s_C(z) = ∫_ℝ 1/(x − z) dF_C(x) = (1/p) tr((C − zI)^{−1}), z ∈ ℂ^+. Limiting spectral distribution: weak convergence of (F_{C_p}) to a distribution function F a.s.
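Direct numerical versions of these two definitions (a sketch; the matrix below is a simulated sample covariance matrix):

    import numpy as np

    def esd(C, x):
        # F_C(x): fraction of eigenvalues of C that are <= x.
        return np.mean(np.linalg.eigvalsh(C) <= x)

    def stieltjes(C, z):
        # s_C(z) = (1/p) tr (C - zI)^{-1}, for z in the upper half-plane.
        p = C.shape[0]
        return np.trace(np.linalg.inv(C - z * np.eye(p))) / p

    rng = np.random.default_rng(6)
    p, n = 100, 400
    X = rng.standard_normal((p, n))
    S = X @ X.T / n
    print(esd(S, 1.0), stieltjes(S, 1.0 + 0.1j))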

  16. Limiting Spectral Distribution of R. Assume X = AZ, E[Z_{11}^4] < ∞ and that F_Γ converges to a probability distribution H. (1) If p/n → γ ∈ (0, ∞), then F_R converges weakly to a distribution function F_{γ,H} whose Stieltjes transform s satisfies s(z) = ∫ dH(t) / (t(1 − γ − γ z s(z)) − z), z ∈ ℂ^+. (2) If p/n → 0, then F_{√(n/p)(R − Γ)} converges weakly to a distribution function F whose Stieltjes transform s satisfies s(z) = −∫ dH(t) / (z + t s̃(z)), z ∈ ℂ^+, where s̃ is the unique solution to s̃(z) = −∫ t dH(t) / (z + t s̃(z)), z ∈ ℂ^+.
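A sketch of how the equation in case (1) can be solved numerically when H = δ_1 (i.e. Γ = I). Plain fixed-point iteration is an assumption of convenience here, not the speakers' method; the result is compared with the empirical Stieltjes transform of a simulated sample covariance matrix.

    import numpy as np

    def mp_stieltjes(z, gamma, iters=500):
        # Iterate s <- 1 / (1 - gamma - gamma*z*s - z), the equation with H = delta_1.
        s = -1.0 / z
        for _ in range(iters):
            s = 1.0 / (1.0 - gamma - gamma * z * s - z)
        return s

    gamma, z = 0.5, 1.0 + 1.0j
    p, n = 1000, 2000
    rng = np.random.default_rng(7)
    X = rng.standard_normal((p, n))
    S = X @ X.T / n
    emp = np.trace(np.linalg.inv(S - z * np.eye(p))) / p   # empirical Stieltjes transform
    print(mp_stieltjes(z, gamma), emp)                     # the two values should be close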

  17. Special Case A = I. Simplified assumptions: 1. iid, symmetric entries X_{it} =_d X. 2. Growth regime: lim_{p→∞} p/n = γ ∈ [0, 1].

  18. Marčenko–Pastur and Semicircle Law. The Marčenko–Pastur law F_γ has density f_γ(x) = (1/(2πxγ)) √((b − x)(x − a)) if x ∈ [a, b], and 0 otherwise, where a = (1 − √γ)^2 and b = (1 + √γ)^2.

  19. Marčenko–Pastur and Semicircle Law. The Marčenko–Pastur law F_γ has density f_γ(x) = (1/(2πxγ)) √((b − x)(x − a)) if x ∈ [a, b], and 0 otherwise, where a = (1 − √γ)^2 and b = (1 + √γ)^2. Semicircle law SC.
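Plain implementations of the two densities, e.g. for overlaying on histograms of sample eigenvalues (a sketch; the semicircle law is taken in its standard form on [−2, 2]):

    import numpy as np

    def mp_density(x, gamma):
        # Marchenko-Pastur density f_gamma on [a, b], for gamma in (0, 1].
        a, b = (1 - np.sqrt(gamma)) ** 2, (1 + np.sqrt(gamma)) ** 2
        x = np.asarray(x, dtype=float)
        out = np.zeros_like(x)
        m = (x >= a) & (x <= b)
        out[m] = np.sqrt((b - x[m]) * (x[m] - a)) / (2 * np.pi * gamma * x[m])
        return out

    def semicircle_density(x):
        # Standard semicircle density on [-2, 2].
        x = np.asarray(x, dtype=float)
        out = np.zeros_like(x)
        m = np.abs(x) <= 2
        out[m] = np.sqrt(4 - x[m] ** 2) / (2 * np.pi)
        return out

    print(mp_density([1.0], 0.5), semicircle_density([0.0]))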
