Local Fisher Discriminant Analysis


  1. ICML2006, Pittsburgh, USA, June 25-29, 2006. Local Fisher Discriminant Analysis for Supervised Dimensionality Reduction. Masashi Sugiyama, Tokyo Institute of Technology, Japan

  2. Dimensionality Reduction
     • High-dimensional data is not easy to handle: we need to reduce the dimensionality.
     • We focus on:
     • Linear dimensionality reduction: $z = T^\top x$, where $T$ is a $d \times r$ transformation matrix ($r < d$).
     • Supervised dimensionality reduction: $T$ is learned from labeled samples $\{(x_i, y_i)\}_{i=1}^{n}$.

  3. Within-Class Multimodality
     One of the classes has several modes: Class 1 (blue) vs. Class 2 (red).
     • Medical checkup: hormone imbalance (high/low) vs. normal
     • Digit recognition: even (0,2,4,6,8) vs. odd (1,3,5,7,9)
     • Multi-class classification: one vs. rest

  4. Goal of This Research
     We want to embed multimodal data so that:
     • Between-class separability is maximized
     • Within-class multimodality is preserved
     [Figure: three embeddings of classes A, B, C. FDA: separable but within-class multimodality lost. LPP: within-class multimodality preserved but non-separable. LFDA: separable and within-class multimodality preserved.]

  5. Fisher Discriminant Analysis (FDA), Fisher (1936)
     • Within-class scatter matrix: $S^{(w)} = \sum_{c=1}^{C} \sum_{i: y_i = c} (x_i - \mu_c)(x_i - \mu_c)^\top$
     • Between-class scatter matrix: $S^{(b)} = \sum_{c=1}^{C} n_c (\mu_c - \mu)(\mu_c - \mu)^\top$
     • FDA criterion: $T_{\mathrm{FDA}} = \arg\max_{T} \, \mathrm{tr}\bigl( (T^\top S^{(w)} T)^{-1} T^\top S^{(b)} T \bigr)$
     • Within-class scatter is made small
     • Between-class scatter is made large
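For concreteness, here is a minimal numpy/scipy sketch of FDA under these definitions (the function and variable names are mine, not from the slides). It builds both scatter matrices and maximizes the criterion via the generalized eigenvalue problem $S^{(b)} v = \lambda S^{(w)} v$:

```python
import numpy as np
from scipy.linalg import eigh  # solves generalized symmetric eigenproblems

def fda(X, y, r):
    """Return a (d, r) FDA transformation matrix T for X: (n, d), labels y."""
    d = X.shape[1]
    mu = X.mean(axis=0)
    S_w = np.zeros((d, d))
    S_b = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mu_c = Xc.mean(axis=0)
        S_w += (Xc - mu_c).T @ (Xc - mu_c)               # within-class scatter
        S_b += len(Xc) * np.outer(mu_c - mu, mu_c - mu)  # between-class scatter
    # The trace-ratio criterion is maximized by the leading generalized
    # eigenvectors; eigh returns eigenvalues in ascending order.
    lam, V = eigh(S_b, S_w)
    return V[:, ::-1][:, :r]

# Usage sketch: Z = X @ fda(X, y, r) gives the r-dimensional embedding.
```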

  6. Interpretation of FDA
     Pairwise expressions ($n_c$: number of samples in class $c$, $n$: total number of samples):
     • $S^{(w)} = \frac{1}{2} \sum_{i,j=1}^{n} W^{(w)}_{i,j} (x_i - x_j)(x_i - x_j)^\top$, where $W^{(w)}_{i,j} = 1/n_c$ if $y_i = y_j = c$, and $0$ otherwise
     • $S^{(b)} = \frac{1}{2} \sum_{i,j=1}^{n} W^{(b)}_{i,j} (x_i - x_j)(x_i - x_j)^\top$, where $W^{(b)}_{i,j} = 1/n - 1/n_c$ if $y_i = y_j = c$, and $1/n$ otherwise
     • Samples in the same class are made close
     • Samples in different classes are made apart
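This pairwise form is what LFDA localizes later, so a small sketch may help (names are mine). It uses the identity $\frac{1}{2}\sum_{i,j} W_{i,j}(x_i - x_j)(x_i - x_j)^\top = X^\top (D - W) X$, where $D$ is the diagonal matrix of row sums of $W$ and $X$ is an $(n, d)$ data matrix:

```python
import numpy as np

def pairwise_scatters(X, y):
    """FDA scatter matrices built from the pairwise weight matrices."""
    n = len(X)
    W_w = np.zeros((n, n))
    W_b = np.full((n, n), 1.0 / n)   # different-class pairs: weight 1/n
    for c in np.unique(y):
        idx = np.flatnonzero(y == c)
        n_c = len(idx)
        W_w[np.ix_(idx, idx)] = 1.0 / n_c            # same class: 1/n_c
        W_b[np.ix_(idx, idx)] = 1.0 / n - 1.0 / n_c  # same class: 1/n - 1/n_c
    def scatter(W):
        return X.T @ (np.diag(W.sum(axis=1)) - W) @ X
    return scatter(W_w), scatter(W_b)  # equal to the direct definitions above
```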

  7. Examples of FDA
     [Figure: FDA embeddings of three 2-D data sets (simple, label-mixed cluster, multimodal), with same-class pairs pulled close and different-class pairs pushed apart.]
     FDA does not take within-class multimodality into account.
     NOTE: FDA can extract only $C-1$ features, since $\mathrm{rank}(S^{(b)}) \le C-1$ ($C$: number of classes).

  8. Locality Preserving Projection (LPP), He & Niyogi (NIPS2003)
     • Locality matrix: $X L X^\top$, where $L = D - A$ and $D_{i,i} = \sum_{j} A_{i,j}$
     • Affinity matrix: e.g., $A_{i,j} = \exp(-\|x_i - x_j\|^2 / \sigma^2)$
     • LPP criterion: $T_{\mathrm{LPP}} = \arg\min_{T} \, \mathrm{tr}(T^\top X L X^\top T)$ subject to $T^\top X D X^\top T = I$
     • Nearby samples in the original space are made close
     • The constraint is to avoid the trivial solution $T = 0$
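A minimal sketch of LPP under the Gaussian affinity above (names are mine; here X is stored as (n, d), so the slide's $X L X^\top$ becomes $X^\top L X$):

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import pdist, squareform

def lpp(X, r, sigma=1.0):
    """Return a (d, r) LPP transformation matrix for X: (n, d)."""
    A = np.exp(-squareform(pdist(X, "sqeuclidean")) / sigma**2)  # affinity
    D = np.diag(A.sum(axis=1))
    L = D - A                                        # graph Laplacian
    # Keeping nearby points close means minimizing tr(T^T X^T L X T), so we
    # take the generalized eigenvectors with the *smallest* eigenvalues;
    # the D-orthogonality constraint rules out the trivial solution T = 0.
    lam, V = eigh(X.T @ L @ X, X.T @ D @ X)
    return V[:, :r]
```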

  9. Examples of LPP
     [Figure: LPP embeddings of the same three data sets (simple, label-mixed cluster, multimodal), with nearby samples kept close.]
     LPP does not take between-class separability into account (it is unsupervised).

  10. Our Approach
      We combine FDA and LPP:
      • Nearby samples in the same class are made close
      • Far-apart samples in the same class are not made close ("don't care")
      • Samples in different classes are made apart

  11. Local Fisher Discriminant Analysis (LFDA)
      • Local within-class scatter matrix: $\tilde{S}^{(w)} = \frac{1}{2} \sum_{i,j=1}^{n} \tilde{W}^{(w)}_{i,j} (x_i - x_j)(x_i - x_j)^\top$, where $\tilde{W}^{(w)}_{i,j} = A_{i,j}/n_c$ if $y_i = y_j = c$, and $0$ otherwise
      • Local between-class scatter matrix: $\tilde{S}^{(b)} = \frac{1}{2} \sum_{i,j=1}^{n} \tilde{W}^{(b)}_{i,j} (x_i - x_j)(x_i - x_j)^\top$, where $\tilde{W}^{(b)}_{i,j} = A_{i,j}(1/n - 1/n_c)$ if $y_i = y_j = c$, and $1/n$ otherwise
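In code, this only changes the same-class weights of the pairwise sketch above by the affinity $A$ (a sketch with my own names; $A$ can come from the local scaling method mentioned in the conclusions, sketched after slide 23):

```python
import numpy as np

def local_scatters(X, y, A):
    """Local within/between-class scatter matrices for X: (n, d)."""
    n = len(X)
    W_w = np.zeros((n, n))
    W_b = np.full((n, n), 1.0 / n)        # different-class pairs: 1/n
    for c in np.unique(y):
        idx = np.flatnonzero(y == c)
        n_c = len(idx)
        block = np.ix_(idx, idx)
        # Only *nearby* same-class pairs (large A_ij) are pulled together,
        # so separate clusters within a class are not collapsed.
        W_w[block] = A[block] / n_c
        W_b[block] = A[block] * (1.0 / n - 1.0 / n_c)
    def scatter(W):
        return X.T @ (np.diag(W.sum(axis=1)) - W) @ X
    return scatter(W_w), scatter(W_b)
```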

  12. How to Obtain the Solution
      Since LFDA has a similar form to FDA, the solution can be obtained just by solving a generalized eigenvalue problem: $\tilde{S}^{(b)} \varphi = \lambda \tilde{S}^{(w)} \varphi$, with $T_{\mathrm{LFDA}}$ given by the leading eigenvectors.
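A sketch of that final step (the small ridge is my own numerical safeguard, not part of the slide's formulation):

```python
import numpy as np
from scipy.linalg import eigh

def lfda_solution(S_lw, S_lb, r):
    """Top-r generalized eigenvectors of S_lb v = lam S_lw v."""
    d = len(S_lw)
    lam, V = eigh(S_lb, S_lw + 1e-9 * np.eye(d))  # ridge for stability
    return V[:, ::-1][:, :r]   # eigh sorts ascending; keep the largest r

# Usage sketch: T = lfda_solution(*local_scatters(X, y, A), r); Z = X @ T
```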

  13. Examples of LFDA
      [Figure: LFDA embeddings of the same three data sets (simple, label-mixed cluster, multimodal).]
      LFDA works well in all three cases!
      Note: usually $\mathrm{rank}(\tilde{S}^{(b)}) > C-1$, so LFDA can extract more than $C-1$ features (cf. FDA).

  14. Neighborhood Component Analysis (NCA), Goldberger, Roweis, Hinton & Salakhutdinov (NIPS2004)
      • Minimizes the leave-one-out error of a stochastic k-nearest-neighbor classifier
      • The obtained embedding is separable
      • NCA involves non-convex optimization: there are local optima
      • No analytic solution is available: slow iterative algorithm
      • LFDA has an analytic form of the global solution
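For comparison (this postdates the original slides), scikit-learn ships an NCA implementation; even its interface reflects the iterative, gradient-based optimization described above:

```python
from sklearn.neighbors import NeighborhoodComponentsAnalysis

# NCA has no closed-form solution: fit() runs a gradient-based optimizer
# that may converge to a local optimum, unlike LFDA's analytic solution.
nca = NeighborhoodComponentsAnalysis(n_components=2, random_state=0)
# Z = nca.fit_transform(X, y)   # X: (n, d) features, y: class labels
```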

  15. Maximally Collapsing Metric Learning (MCML), Globerson & Roweis (NIPS2005)
      • The idea is similar to FDA:
      • Samples in the same class are made close (collapsed to "one point")
      • Samples in different classes are made apart
      • MCML involves non-convex optimization
      • A nice convex approximation exists, but it yields a non-global solution
      • No analytic solution is available: slow iterative algorithm

  16. Simulations
      Visualization of UCI data sets:
      • Letter recognition (D=16)
      • Segment (D=18)
      • Thyroid disease (D=5)
      • Iris (D=4)
      We extract 3 classes from each original data set and merge 2 of them, yielding Class 1 (blue) vs. Class 2 (red).

  17. Summary of Simulation Results
      Data sets: Letter (Lett), Segment (Segm), Thyroid (Thyr), Iris.
      • FDA: separable but no multimodality preserved
      • LPP: multimodality preserved but no label-separability
      • LFDA: separable and multimodality preserved
      • NCA: slow, local optima
      • MCML: slow, no multimodality preserved

  18. Letter Recognition
      [Figure: 2-D embeddings by FDA, LPP, LFDA, NCA, and MCML; letter classes A, C, B, plotted as blue vs. red.]

  19. Segment
      [Figure: 2-D embeddings by FDA, LPP, LFDA, NCA, and MCML; classes Brickface, Sky, Foliage, plotted as blue vs. red.]

  20. Thyroid Disease
      [Figure: 2-D embeddings by FDA, LPP, LFDA, NCA, and MCML; classes Hyper, Hypo, Normal, plotted as blue vs. red.]

  21. Iris
      [Figure: 2-D embeddings by FDA, LPP, LFDA, NCA, and MCML; classes Setosa, Virginica, Versicolour, plotted as blue vs. red.]

  22. Kernelization
      • LFDA can be non-linearized by the kernel trick, as has been done for:
      • FDA: kernel FDA, Mika et al. (NNSP1999)
      • LPP: Laplacian eigenmaps, Belkin & Niyogi (NIPS2001)
      • MCML: kernel MCML, Globerson & Roweis (NIPS2005)
      • NCA: not available yet?

  23. Conclusions
      • LFDA effectively combines FDA and LPP.
      • LFDA is suitable for embedding multimodal data.
      • Like FDA, LFDA has an analytic optimal solution and is thus computationally efficient.
      • Like LPP, LFDA needs a pre-specified affinity matrix.
      • We used the local scaling method of Zelnik-Manor & Perona (NIPS2004) for computing the affinity, which does not include any tuning parameter.
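A sketch of that affinity computation (names are mine; $k = 7$ is the heuristic suggested by Zelnik-Manor & Perona, which is what leaves no tuning parameter):

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def local_scaling_affinity(X, k=7):
    """A_ij = exp(-||x_i - x_j||^2 / (sigma_i * sigma_j)) for X: (n, d),
    where sigma_i is the distance from x_i to its k-th nearest neighbor."""
    dist = squareform(pdist(X))           # (n, n) Euclidean distances
    sigma = np.sort(dist, axis=1)[:, k]   # column 0 is the self-distance 0
    return np.exp(-dist**2 / np.outer(sigma, sigma))
```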
