

1. Potential PCA Interpretation Problems for Volatility Smile Dynamics. Robert Tompkins, Dimitri Reiswich. July 16th, 2010. Analysis, Stochastics, and Applications: A Conference in Honour of Walter Schachermayer.

2. Introduction. The goal of PCA is to reduce the dimensionality of multiple correlated random variables to a parsimonious set of uncorrelated components. Suppose that the correlated random variables are summarized in an n × 1 vector x with covariance matrix Σ. Initially, PCA determines a new random variable y_1, a linear combination of the components of x weighted by the components of a vector γ_1 ∈ R^{n×1}:

\[
y_1 = \gamma_1^T x = \sum_{i=1}^{n} \gamma_{1i}\, x_i
\]

The vector γ_1 is chosen such that y_1 has maximum variance Var(y_1) and γ_1 has length 1. In the next step, a new variable y_2 = γ_2^T x is determined which is uncorrelated with y_1 and has maximum variance. The k-th derived variable y_k = γ_k^T x is called the k-th principal component (PC). We hope that most of the variation in x will be accounted for by m PCs, where m ≪ n.
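As an editorial aside, here is a minimal numerical sketch of this construction in Python (simulated data, not the authors' dataset): the loading vectors γ_k are the eigenvectors of the sample covariance matrix, the variance of each projected component recovers the corresponding eigenvalue, and distinct components are uncorrelated.

```python
import numpy as np

# Minimal PCA sketch on simulated data (illustrative only, not the
# authors' dataset). The loading vector gamma_k is the eigenvector of
# the sample covariance matrix Sigma for its k-th largest eigenvalue.
rng = np.random.default_rng(0)
common = rng.standard_normal((500, 1))              # shared "level" factor
X = common + 0.3 * rng.standard_normal((500, 5))    # 5 correlated variables

Sigma = np.cov(X, rowvar=False)                     # 5 x 5 covariance matrix
lam, gamma = np.linalg.eigh(Sigma)                  # ascending eigenvalues
order = np.argsort(lam)[::-1]                       # re-sort: largest first
lam, gamma = lam[order], gamma[:, order]

Y = X @ gamma                                       # columns are y_k = gamma_k^T x
print(np.allclose(np.var(Y, axis=0, ddof=1), lam))  # True: Var(y_k) = lambda_k
print(np.round(np.cov(Y, rowvar=False)[0, 1], 8))   # ~0: PCs are uncorrelated
```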

3. Introduction. The maximization problem we have to solve at stage k is

\[
\max_{\|\gamma_k\| = 1} \operatorname{Var}(y_k) \quad \text{subject to} \quad \operatorname{Cov}(y_k, y_{k-1}) = \operatorname{Cov}(\gamma_k^T x, \gamma_{k-1}^T x) = 0.
\]

This problem can be solved by choosing γ_k as the eigenvector corresponding to the k-th largest eigenvalue λ_k of Σ, the covariance matrix of x. We then have Var(y_k) = λ_k. The explained variance associated with the k-th PC can be expressed as

\[
\frac{\lambda_k}{\sum_{i=1}^{n} \lambda_i}.
\]

The empirical financial literature shows that most of the variance is explained by 3 factors; see, for example, Alexander (2001), Alexander (2003), Cont and da Fonseca (2002), Daglish et al. (2007), Fengler et al. (2003), Schmidt et al. (2002), Skiadopoulos et al. (2000), Zhu and Avellaneda (1997), Pérignon et al. (2007).
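Continuing the sketch, the explained variance ratios follow directly from the eigenvalues; on data with strong common variation the first few ratios dominate, mirroring the three-factor findings cited above (again simulated data, purely illustrative):

```python
import numpy as np

# Explained variance ratio lambda_k / sum_i lambda_i of each PC,
# on simulated correlated data (illustrative only).
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5)).cumsum(axis=1)    # correlated columns
lam = np.sort(np.linalg.eigvalsh(np.cov(X, rowvar=False)))[::-1]
explained = lam / lam.sum()
print(np.round(explained, 3))                       # ratios, largest first
print("first 3 PCs explain:", round(float(explained[:3].sum()), 3))
```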

4. Introduction. [Figure: four panels of PCA loadings plotted against the loading index. Level (upper left), Skew (upper right), Twist (lower left), Curvature (lower right) patterns in PCA.]

5. Potential Interpretation Problems. A well-known theorem stated by Perron and Frobenius reveals a priori information about the structure and shape of the eigenvectors (see Meyer (2000, Chapter 8), Lord and Pelsser (2007)).

Theorem (Frobenius-Perron). For an n × n strictly positive matrix Σ, the eigenvector associated with the largest eigenvalue is strictly positive.

Lord and Pelsser (2007) provide another theorem based on the work of Gantmacher and Krein (1960).

Theorem (Sign Change Pattern). Assume Σ is a valid n × n covariance matrix that is strictly totally positive of order k. Then λ_1 > λ_2 > ... > λ_k, and the eigenvector corresponding to eigenvalue λ_j has j − 1 changes of sign for j = 1, ..., k (zero terms in the eigenvector are discarded).
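A small numerical check of both statements, using an assumed exponential correlation matrix ρ^|i−j| as the test case (a standard example of a totally positive structure; this matrix is the editor's, not from the talk):

```python
import numpy as np

# Check both theorems on an exponential correlation matrix rho^|i-j|
# (editor's example of a totally positive correlation structure).
n, rho = 5, 0.9
idx = np.arange(n)
Sigma = rho ** np.abs(idx[:, None] - idx[None, :])

lam, vecs = np.linalg.eigh(Sigma)
order = np.argsort(lam)[::-1]
lam, vecs = lam[order], vecs[:, order]

for j in range(n):
    v = vecs[:, j]
    v = v if v[0] > 0 else -v                      # fix the sign convention
    changes = int(np.sum(np.diff(np.sign(v)) != 0))
    print(f"PC{j + 1}: {changes} sign change(s)")
# PC1 has no sign changes (Frobenius-Perron: strictly positive eigenvector);
# PC j has j - 1 sign changes (sign change pattern theorem).
```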

6. Potential Interpretation Problems. [Figure repeated: Level (upper left), Skew (upper right), Twist (lower left), Curvature (lower right) patterns in PCA, matching the sign structure predicted by the preceding theorems.]

7. Potential Interpretation Problems. Consider the following empirical correlation matrix of implied volatilities of the Euro vs. US Dollar across five moneyness levels (in delta terms) for 1 month maturity options, using Bloomberg data from 03.10.2003 to 21.01.2009:

\[
\begin{pmatrix}
1.000 & 0.968 & 0.953 & 0.927 & 0.898 \\
0.968 & 1.000 & 0.989 & 0.968 & 0.923 \\
0.953 & 0.989 & 1.000 & 0.991 & 0.951 \\
0.927 & 0.968 & 0.991 & 1.000 & 0.966 \\
0.898 & 0.923 & 0.951 & 0.966 & 1.000
\end{pmatrix}
\]

This matrix is, to close approximation, a representative of the class of bisymmetric matrices.
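A direct check on the quoted matrix (the loadings should reproduce the level, skew, and curvature shapes of the earlier figure, though the exact values below are the editor's computation, not taken from the slides):

```python
import numpy as np

# PCA of the quoted EUR/USD 1M implied volatility correlation matrix.
C = np.array([
    [1.000, 0.968, 0.953, 0.927, 0.898],
    [0.968, 1.000, 0.989, 0.968, 0.923],
    [0.953, 0.989, 1.000, 0.991, 0.951],
    [0.927, 0.968, 0.991, 1.000, 0.966],
    [0.898, 0.923, 0.951, 0.966, 1.000],
])

lam, vecs = np.linalg.eigh(C)
order = np.argsort(lam)[::-1]
lam, vecs = lam[order], vecs[:, order]

print(np.round(lam / lam.sum(), 4))   # explained variance ratios
print(np.round(vecs[:, :3], 3))       # first three loading vectors
```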

8. Potential Interpretation Problems.

Definition. Let J ∈ R^{n×n} be the matrix which has ones on its anti-diagonal and zeros everywhere else:

\[
J = \begin{pmatrix}
0 & 0 & \cdots & 0 & 1 \\
0 & 0 & \cdots & 1 & 0 \\
\vdots & \vdots & & \vdots & \vdots \\
0 & 1 & \cdots & 0 & 0 \\
1 & 0 & \cdots & 0 & 0
\end{pmatrix} \qquad (1)
\]

A bisymmetric matrix A ∈ R^{n×n} is a matrix which is symmetric with respect to both of its diagonals and thus fulfills the condition JAJ = A.
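In code, J is simply the identity matrix flipped left-to-right; a sketch of the bisymmetry check on a small hand-made example (the editor's, not from the slides):

```python
import numpy as np

# The matrix J of equation (1) is the identity flipped left-to-right.
n = 4
J = np.fliplr(np.eye(n)).astype(int)
print(J)

# Bisymmetry check J A J == A on a small hand-made 3 x 3 example:
A = np.array([[2.0, 1.0, 0.5],
              [1.0, 3.0, 1.0],
              [0.5, 1.0, 2.0]])
J3 = np.fliplr(np.eye(3))
print(np.allclose(J3 @ A @ J3, A))    # True: A is bisymmetric
```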

9. Potential Interpretation Problems. The matrix J can be used to define symmetric and skew-symmetric vectors, which will be important objects in the following analyses.

Definition. A vector γ_s ∈ R^n is called symmetric if Jγ_s = γ_s. A vector γ_ss ∈ R^n is called skew-symmetric if Jγ_ss = −γ_ss.

Examples of these classes are

\[
\gamma_s = \begin{pmatrix} 3 \\ 3 \end{pmatrix}, \quad
\gamma_s = \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix}, \quad
\gamma_{ss} = \begin{pmatrix} 1 \\ -1 \end{pmatrix}, \quad
\gamma_{ss} = \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}.
\]
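A one-line intuition, verified in code: J reverses the order of a vector's entries, so symmetric vectors are palindromes and skew-symmetric vectors are anti-palindromes.

```python
import numpy as np

# Symmetric vectors are palindromes, skew-symmetric ones anti-palindromes.
J3 = np.fliplr(np.eye(3))
g_s = np.array([1.0, 2.0, 1.0])       # symmetric:      J g_s  ==  g_s
g_ss = np.array([1.0, 0.0, -1.0])     # skew-symmetric: J g_ss == -g_ss
print(np.allclose(J3 @ g_s, g_s), np.allclose(J3 @ g_ss, -g_ss))  # True True
```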

10. Potential Interpretation Problems. It can be shown that the eigenvectors of bisymmetric matrices are either symmetric or skew-symmetric, see Cantoni and Butler (1976). We summarize this in the following theorem, distinguishing between a square matrix with an odd or even number of rows.

Theorem. Suppose A ∈ R^{n×n} is bisymmetric and n is even. Then matrix A has n/2 skew-symmetric and n/2 symmetric orthonormal eigenvectors.

Let ⌈x⌉ denote the smallest integer ≥ x and ⌊x⌋ the largest integer ≤ x, and define

\[
u := \lceil n/2 \rceil \quad (2) \qquad \qquad l := \lfloor n/2 \rfloor \quad (3)
\]

to be the upper and lower integers of n/2, respectively. Suppose A ∈ R^{n×n} is bisymmetric and n is odd. Then matrix A has l skew-symmetric and u symmetric orthonormal eigenvectors.
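A numerical illustration of the theorem, using the editor's own construction of an exactly bisymmetric matrix (a random symmetric matrix averaged with JAJ):

```python
import numpy as np

# Classify the eigenvectors of a bisymmetric matrix as symmetric or
# skew-symmetric (editor's construction, generic simple eigenvalues).
n = 5
J = np.fliplr(np.eye(n))
rng = np.random.default_rng(1)
M = rng.standard_normal((n, n))
A = M + M.T                           # symmetric
A = 0.5 * (A + J @ A @ J)             # enforce J A J = A

lam, vecs = np.linalg.eigh(A)
n_sym = sum(np.allclose(J @ v, v) for v in vecs.T)
n_skew = sum(np.allclose(J @ v, -v) for v in vecs.T)
print(n_sym, "symmetric,", n_skew, "skew-symmetric")
# For n = 5 the theorem predicts u = ceil(5/2) = 3 symmetric and
# l = floor(5/2) = 2 skew-symmetric orthonormal eigenvectors.
```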
