

  1. MLCC 2015: Dimensionality Reduction and PCA
  Lorenzo Rosasco, UNIGE-MIT-IIT
  June 25, 2015

  2. Outline
  ◮ PCA & Reconstruction
  ◮ PCA and Maximum Variance
  ◮ PCA and Associated Eigenproblem
  ◮ Beyond the First Principal Component
  ◮ PCA and Singular Value Decomposition
  ◮ Kernel PCA

  3. Dimensionality Reduction
  In many practical applications it is of interest to reduce the dimensionality of the data:
  ◮ data visualization
  ◮ data exploration: investigating the "effective" dimensionality of the data

  4. Dimensionality Reduction (cont.)
  The problem of dimensionality reduction can be seen as the problem of defining a map
  $$M : X = \mathbb{R}^D \to \mathbb{R}^k, \qquad k \ll D,$$
  according to some suitable criterion.

  5. Dimensionality Reduction (cont.)
  The problem of dimensionality reduction can be seen as the problem of defining a map
  $$M : X = \mathbb{R}^D \to \mathbb{R}^k, \qquad k \ll D,$$
  according to some suitable criterion.
  In the following, data reconstruction will be our guiding principle.
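As a rough NumPy illustration (not part of the slides; shapes and names are my own), a linear map of this kind is just a $k \times D$ matrix applied to each data point; PCA, introduced next, is one particular way of choosing that matrix:

    import numpy as np

    # Illustrative sketch: a linear dimensionality reduction map M : R^D -> R^k
    # is a k x D matrix applied to each data point.
    D, k, n = 10, 2, 5
    rng = np.random.default_rng(0)

    X = rng.standard_normal((n, D))   # n points in R^D, one per row
    W = rng.standard_normal((k, D))   # an (arbitrary, for now) linear map

    Z = X @ W.T                       # reduced data: n points in R^k
    print(Z.shape)                    # (5, 2)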

  6. Principal Component Analysis
  PCA is arguably the most popular dimensionality reduction procedure.

  7. Principal Component Analysis
  PCA is arguably the most popular dimensionality reduction procedure.
  It is a data driven procedure that, given an unsupervised sample $S = (x_1, \dots, x_n)$, derives a dimensionality reduction defined by a linear map $M$.

  8. Principal Component Analysis
  PCA is arguably the most popular dimensionality reduction procedure.
  It is a data driven procedure that, given an unsupervised sample $S = (x_1, \dots, x_n)$, derives a dimensionality reduction defined by a linear map $M$.
  PCA can be derived from several perspectives; here we give a geometric derivation.

  9. Dimensionality Reduction by Reconstruction
  Recall that, if $w \in \mathbb{R}^D$ with $\|w\| = 1$, then $(w^T x)\,w$ is the orthogonal projection of $x$ on $w$.

  10. Dimensionality Reduction by Reconstruction
  Recall that, if $w \in \mathbb{R}^D$ with $\|w\| = 1$, then $(w^T x)\,w$ is the orthogonal projection of $x$ on $w$.
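A minimal NumPy check of this fact (illustrative, not from the slides): the residual of the projection is orthogonal to $w$.

    import numpy as np

    rng = np.random.default_rng(0)
    D = 5
    x = rng.standard_normal(D)
    w = rng.standard_normal(D)
    w = w / np.linalg.norm(w)         # make sure ||w|| = 1

    proj = (w @ x) * w                # orthogonal projection of x on w

    # the residual x - proj is orthogonal to w (up to rounding)
    assert abs(w @ (x - proj)) < 1e-12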

  11. Dimensionality Reduction by Reconstruction (cont.)
  First, consider $k = 1$. The associated reconstruction error is
  $$\|x - (w^T x)\,w\|^2$$
  (that is, how much we lose by projecting $x$ along the direction $w$).

  12. Dimensionality Reduction by Reconstruction (cont.)
  First, consider $k = 1$. The associated reconstruction error is
  $$\|x - (w^T x)\,w\|^2$$
  (that is, how much we lose by projecting $x$ along the direction $w$).
  Problem: find the direction $p$ allowing the best reconstruction of the training set.

  13. Dimensionality Reduction by Reconstruction (cont.)
  Let $S^{D-1} = \{ w \in \mathbb{R}^D \mid \|w\| = 1 \}$ be the unit sphere in $D$ dimensions. Consider the empirical reconstruction minimization problem
  $$\min_{w \in S^{D-1}} \frac{1}{n} \sum_{i=1}^n \|x_i - (w^T x_i)\,w\|^2.$$
  The solution $p$ of the above problem is called the first principal component of the data.
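A small NumPy sketch of this objective (illustrative; it assumes the data are stored as the rows of a matrix X):

    import numpy as np

    def reconstruction_error(X, w):
        """Empirical reconstruction error (1/n) sum_i ||x_i - (w^T x_i) w||^2,
        for a unit vector w and a data matrix X of shape (n, D)."""
        proj = np.outer(X @ w, w)                 # i-th row is (w^T x_i) w
        return np.mean(np.sum((X - proj) ** 2, axis=1))

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 5))
    w = rng.standard_normal(5)
    w /= np.linalg.norm(w)
    print(reconstruction_error(X, w))

The first principal component is the unit vector minimizing this quantity over all directions.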

  14. An Equivalent Formulation
  A direct computation shows that
  $$\|x_i - (w^T x_i)\,w\|^2 = \|x_i\|^2 - (w^T x_i)^2.$$

  15. An Equivalent Formulation
  A direct computation shows that
  $$\|x_i - (w^T x_i)\,w\|^2 = \|x_i\|^2 - (w^T x_i)^2.$$
  Then, the problem
  $$\min_{w \in S^{D-1}} \frac{1}{n} \sum_{i=1}^n \|x_i - (w^T x_i)\,w\|^2$$
  is equivalent to
  $$\max_{w \in S^{D-1}} \frac{1}{n} \sum_{i=1}^n (w^T x_i)^2.$$
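A quick numerical check of this equivalence on random data (illustrative sketch, not from the slides):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 5))
    w = rng.standard_normal(5)
    w /= np.linalg.norm(w)

    proj = np.outer(X @ w, w)
    recon = np.mean(np.sum((X - proj) ** 2, axis=1))   # (1/n) sum ||x_i - (w^T x_i) w||^2
    norms = np.mean(np.sum(X ** 2, axis=1))            # (1/n) sum ||x_i||^2 (independent of w)
    scores = np.mean((X @ w) ** 2)                     # (1/n) sum (w^T x_i)^2

    # ||x_i - (w^T x_i) w||^2 = ||x_i||^2 - (w^T x_i)^2, so minimizing the
    # reconstruction error is the same as maximizing the squared projections.
    assert np.isclose(recon, norms - scores)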

  16. Outline
  ◮ PCA & Reconstruction
  ◮ PCA and Maximum Variance
  ◮ PCA and Associated Eigenproblem
  ◮ Beyond the First Principal Component
  ◮ PCA and Singular Value Decomposition
  ◮ Kernel PCA

  17. Reconstruction and Variance
  Assume the data to be centered, $\bar{x} = \frac{1}{n}\sum_{i=1}^n x_i = 0$; then we can interpret the term $(w^T x)^2$ as the variance of $x$ in the direction $w$.

  18. Reconstruction and Variance
  Assume the data to be centered, $\bar{x} = \frac{1}{n}\sum_{i=1}^n x_i = 0$; then we can interpret the term $(w^T x)^2$ as the variance of $x$ in the direction $w$.
  The first PC can thus be seen as the direction along which the data have maximum variance:
  $$\max_{w \in S^{D-1}} \frac{1}{n} \sum_{i=1}^n (w^T x_i)^2.$$

  19. Centering
  If the data are not centered, we should consider
  $$\max_{w \in S^{D-1}} \frac{1}{n} \sum_{i=1}^n \big(w^T (x_i - \bar{x})\big)^2,$$
  equivalent to
  $$\max_{w \in S^{D-1}} \frac{1}{n} \sum_{i=1}^n (w^T x_i^c)^2$$
  with $x^c = x - \bar{x}$.

  20. Centering and Reconstruction
  If we consider the effect of centering on reconstruction, it is easy to see that we get
  $$\min_{w \in S^{D-1},\, b \in \mathbb{R}^D} \frac{1}{n} \sum_{i=1}^n \big\|x_i - \big((w^T (x_i - b))\,w + b\big)\big\|^2,$$
  where $(w^T (x_i - b))\,w + b$ is an affine (rather than an orthogonal) projection.
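A small NumPy check of this (illustrative; here the offset b is simply fixed to the sample mean rather than optimized over): the affine reconstruction of the raw data has exactly the same error as the orthogonal reconstruction of the centered data.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 5)) + 3.0    # deliberately non-centered data
    w = rng.standard_normal(5)
    w /= np.linalg.norm(w)

    b = X.mean(axis=0)                         # take the offset b to be the sample mean
    Xc = X - b                                 # centered data x_i^c = x_i - b

    # affine reconstruction (w^T (x_i - b)) w + b of the original points ...
    affine = np.outer(Xc @ w, w) + b
    err_affine = np.mean(np.sum((X - affine) ** 2, axis=1))

    # ... has the same error as the orthogonal reconstruction of the centered points
    linear = np.outer(Xc @ w, w)
    err_centered = np.mean(np.sum((Xc - linear) ** 2, axis=1))

    assert np.isclose(err_affine, err_centered)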

  21. Outline
  ◮ PCA & Reconstruction
  ◮ PCA and Maximum Variance
  ◮ PCA and Associated Eigenproblem
  ◮ Beyond the First Principal Component
  ◮ PCA and Singular Value Decomposition
  ◮ Kernel PCA

  22. PCA as an Eigenproblem
  A further manipulation shows that PCA corresponds to an eigenvalue problem.

  23. PCA as an Eigenproblem
  A further manipulation shows that PCA corresponds to an eigenvalue problem. Using the symmetry of the inner product,
  $$\frac{1}{n} \sum_{i=1}^n (w^T x_i)^2 = \frac{1}{n} \sum_{i=1}^n w^T x_i\, w^T x_i = \frac{1}{n} \sum_{i=1}^n w^T x_i\, x_i^T w = w^T \Big(\frac{1}{n} \sum_{i=1}^n x_i x_i^T\Big) w.$$

  24. PCA as an Eigenproblem
  A further manipulation shows that PCA corresponds to an eigenvalue problem. Using the symmetry of the inner product,
  $$\frac{1}{n} \sum_{i=1}^n (w^T x_i)^2 = \frac{1}{n} \sum_{i=1}^n w^T x_i\, w^T x_i = \frac{1}{n} \sum_{i=1}^n w^T x_i\, x_i^T w = w^T \Big(\frac{1}{n} \sum_{i=1}^n x_i x_i^T\Big) w.$$
  Then, we can consider the problem
  $$\max_{w \in S^{D-1}} w^T C_n w, \qquad C_n = \frac{1}{n} \sum_{i=1}^n x_i x_i^T.$$
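A one-line numerical check of this identity (illustrative sketch, not from the slides):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 5))
    X = X - X.mean(axis=0)                     # center the data

    C_n = (X.T @ X) / X.shape[0]               # C_n = (1/n) sum_i x_i x_i^T

    w = rng.standard_normal(5)
    w /= np.linalg.norm(w)

    # the variance-style objective (1/n) sum_i (w^T x_i)^2 equals w^T C_n w
    assert np.isclose(np.mean((X @ w) ** 2), w @ C_n @ w)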

  25. PCA as an Eigenproblem (cont.)
  We make two observations:
  ◮ The ("covariance") matrix $C_n = \frac{1}{n} \sum_{i=1}^n x_i x_i^T = \frac{1}{n} X_n^T X_n$ (where $X_n$ is the $n \times D$ data matrix with rows $x_i^T$) is symmetric and positive semi-definite.

  26. PCA as an Eigenproblem (cont.)
  We make two observations:
  ◮ The ("covariance") matrix $C_n = \frac{1}{n} \sum_{i=1}^n x_i x_i^T = \frac{1}{n} X_n^T X_n$ (where $X_n$ is the $n \times D$ data matrix with rows $x_i^T$) is symmetric and positive semi-definite.
  ◮ The objective function of PCA can be written as
  $$\frac{w^T C_n w}{w^T w},$$
  the so-called Rayleigh quotient.

  27. PCA as an Eigenproblem (cont.)
  We make two observations:
  ◮ The ("covariance") matrix $C_n = \frac{1}{n} \sum_{i=1}^n x_i x_i^T = \frac{1}{n} X_n^T X_n$ (where $X_n$ is the $n \times D$ data matrix with rows $x_i^T$) is symmetric and positive semi-definite.
  ◮ The objective function of PCA can be written as
  $$\frac{w^T C_n w}{w^T w},$$
  the so-called Rayleigh quotient.
  Note that, if $C_n u = \lambda u$, then $\frac{u^T C_n u}{u^T u} = \lambda$ (in particular $u^T C_n u = \lambda$ when $u$ is normalized).

  28. PCA as an Eigenproblem (cont.)
  We make two observations:
  ◮ The ("covariance") matrix $C_n = \frac{1}{n} \sum_{i=1}^n x_i x_i^T = \frac{1}{n} X_n^T X_n$ (where $X_n$ is the $n \times D$ data matrix with rows $x_i^T$) is symmetric and positive semi-definite.
  ◮ The objective function of PCA can be written as
  $$\frac{w^T C_n w}{w^T w},$$
  the so-called Rayleigh quotient.
  Note that, if $C_n u = \lambda u$, then $\frac{u^T C_n u}{u^T u} = \lambda$ (in particular $u^T C_n u = \lambda$ when $u$ is normalized).
  Indeed, it is possible to show that the Rayleigh quotient achieves its maximum at an eigenvector corresponding to the maximum eigenvalue of $C_n$.

  29. PCA as an Eigenproblem (cont.)
  Computing the first principal component of the data reduces to computing the largest eigenvalue of the covariance matrix and the corresponding eigenvector:
  $$C_n u = \lambda u, \qquad C_n = \frac{1}{n} X_n^T X_n.$$
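In code this amounts to a single symmetric eigendecomposition; a minimal NumPy sketch (illustrative, not the course's official code):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 5))
    X = X - X.mean(axis=0)

    C_n = (X.T @ X) / X.shape[0]

    # np.linalg.eigh returns the eigenvalues of a symmetric matrix in ascending order
    eigvals, eigvecs = np.linalg.eigh(C_n)
    p = eigvecs[:, -1]                         # eigenvector of the largest eigenvalue

    # p attains at least the variance of any other unit direction we try
    for _ in range(1000):
        w = rng.standard_normal(5)
        w /= np.linalg.norm(w)
        assert np.mean((X @ w) ** 2) <= np.mean((X @ p) ** 2) + 1e-10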

  30. Outline
  ◮ PCA & Reconstruction
  ◮ PCA and Maximum Variance
  ◮ PCA and Associated Eigenproblem
  ◮ Beyond the First Principal Component
  ◮ PCA and Singular Value Decomposition
  ◮ Kernel PCA

  31. Beyond the First Principal Component
  We discuss how to consider more than one principal component ($k > 1$):
  $$M : X = \mathbb{R}^D \to \mathbb{R}^k, \qquad k \ll D.$$
  The idea is simply to iterate the previous reasoning.

  32. Residual Reconstruction
  The idea is to consider the one dimensional projection that can best reconstruct the residuals
  $$r_i = x_i - (p^T x_i)\,p, \qquad i = 1, \dots, n.$$

  33. Residual Reconstruction
  The idea is to consider the one dimensional projection that can best reconstruct the residuals
  $$r_i = x_i - (p^T x_i)\,p, \qquad i = 1, \dots, n.$$
  An associated minimization problem is given by
  $$\min_{w \in S^{D-1},\, w \perp p} \frac{1}{n} \sum_{i=1}^n \|r_i - (w^T r_i)\,w\|^2$$
  (note the constraint $w \perp p$).

  34. Residual Reconstruction (cont.)
  Note that, for all $i = 1, \dots, n$,
  $$\|r_i - (w^T r_i)\,w\|^2 = \|r_i\|^2 - (w^T r_i)^2 = \|r_i\|^2 - (w^T x_i)^2,$$
  since $w \perp p$.

  35. Residual Reconstruction (cont.)
  Note that, for all $i = 1, \dots, n$,
  $$\|r_i - (w^T r_i)\,w\|^2 = \|r_i\|^2 - (w^T r_i)^2 = \|r_i\|^2 - (w^T x_i)^2,$$
  since $w \perp p$.
  Then, we can consider the following equivalent problem:
  $$\max_{w \in S^{D-1},\, w \perp p} \frac{1}{n} \sum_{i=1}^n (w^T x_i)^2 = w^T C_n w.$$

  36. PCA as an Eigenproblem
  $$\max_{w \in S^{D-1},\, w \perp p} \frac{1}{n} \sum_{i=1}^n (w^T x_i)^2 = w^T C_n w.$$
  Again, we have to maximize the Rayleigh quotient of the covariance matrix, now with the extra constraint $w \perp p$.

  37. PCA as an Eigenproblem
  $$\max_{w \in S^{D-1},\, w \perp p} \frac{1}{n} \sum_{i=1}^n (w^T x_i)^2 = w^T C_n w.$$
  Again, we have to maximize the Rayleigh quotient of the covariance matrix, now with the extra constraint $w \perp p$.
  Similarly to before, it can be proved that the solution of the above problem is given by the second eigenvector of $C_n$ (and the corresponding eigenvalue).
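A small NumPy sketch of this fact (illustrative): removing the first principal component from the data and taking the top eigenvector of the residual covariance recovers the second eigenvector of $C_n$.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 5))
    X = X - X.mean(axis=0)

    C_n = (X.T @ X) / X.shape[0]
    eigvals, eigvecs = np.linalg.eigh(C_n)     # eigenvalues in ascending order
    p1 = eigvecs[:, -1]                        # first principal component
    p2 = eigvecs[:, -2]                        # second principal component

    # residuals after removing the first component
    R = X - np.outer(X @ p1, p1)
    C_res = (R.T @ R) / R.shape[0]

    # the top eigenvector of the residual covariance matches p2 (up to sign),
    # and p2 is orthogonal to p1, as the constrained problem requires
    _, vecs_res = np.linalg.eigh(C_res)
    assert np.isclose(abs(vecs_res[:, -1] @ p2), 1.0)
    assert abs(p1 @ p2) < 1e-12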
