Label Embedding Based on Multi-Scale Locality Preservation


  1. Label Embedding Based on Multi-Scale Locality Preservation
     Cheng-Lun Peng, An Tao, Xin Geng
     Reporter: Cheng-Lun Peng
     Date: July 17, 2018

  2. Outline
     1 Background
     2 Proposed Method: MSLP
     3 Experiment
     4 Conclusion

  3. 1 Background: LE & LDL
     • Label Embedding (LE): a learning strategy
       - Usual steps: encoding process (encoder) → learning process (predictor) → decoding process (decoder)
     • Label Distribution Learning (LDL): a learning paradigm
       [Figure: single-label learning tags an instance with one label; multi-label learning tags it with a 0/1 label vector; label distribution learning assigns each label a description degree, e.g., (0.3, 0.65, 0.05), summing to 1.]
     • Our work: propose a specially designed LE method named MSLP for LDL, which is the first attempt to apply LE in LDL.

  4. 1 Background: The Meaning of Our Work
     • Why apply LE in LDL
       - The labels in LDL may encounter problems (e.g., redundancy, noise, ...).
       - Effective exploitation of the label correlations is crucial for the success of LDL.
       - LE has advantages in addressing problematic labels and capturing latent correlations between labels.
     • What are the challenges of applying LE in LDL
       - No LE method has been proposed for LDL yet; most existing LE methods are designed for SLL and MLL, i.e., they focus on binary (0/1) labels.
       - Two main issues:
         a) How to exploit the information of label distributions efficiently.
         b) How to design a decoder that restricts the recovered label vector to satisfy the constraints of a label distribution.

  5. 1 Background: Symbol Definition
     • D = {(x_i, d_i)}, i = 1, ..., n : the dataset
     • x_i : the i-th instance
     • d_i : the i-th label vector (a label distribution)
     • z_i : the i-th embedded label vector

  6. 2 MSLP: Motivation
     • Locality-preserving embedding for the label space: inspired by Laplacian Eigenmaps [Belkin and Niyogi, 2002], MSLP aims to make data points with similar label distributions close to each other in the embedding space, by finding the l nearest neighbors of each data point in the label space among the given point set (sketched below).
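
As a hedged sketch of this criterion, the standard Laplacian-eigenmaps form reads as follows; W_ij is a hypothetical binary neighborhood weight, and z_i denotes the embedded label vector of the i-th point:

```latex
% Locality preservation in the label space: embedded points z_i, z_j are
% pulled together whenever d_j is among the l label-space nearest
% neighbors of d_i.
\min_{Z}\ \sum_{i,j} \lVert z_i - z_j \rVert^2 \, W_{ij},
\qquad
W_{ij} =
\begin{cases}
1, & d_j \in \mathrm{kNN}_l(d_i)\ \text{in the label space},\\[2pt]
0, & \text{otherwise.}
\end{cases}
```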

  7. 2 MSLP: Explicit Assumption
     • Explicit assumption: assume an explicit linear mapping from the features to the embedded labels, with L2 regularization (sketched below).
     • Advantages:
       - Makes the process of label embedding feature-aware.
       - Omits the additional learning process from the features to the embedded labels after completing the embedding.
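
A sketch of what this assumption amounts to, assuming the linear encoder z_i = V^T x_i and a hypothetical trade-off parameter λ for the L2 regularizer:

```latex
% Plugging the explicit linear encoder into the locality objective,
% with a Frobenius-norm (L2) regularizer on the projection matrix V:
\min_{V}\ \sum_{i,j} \lVert V^\top x_i - V^\top x_j \rVert^2 \, W_{ij}
\;+\; \lambda \lVert V \rVert_F^2
```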

  8. 2 MSLP: Explicit Assumption
     • Problem of the explicit linear assumption: the solution for V tends to be dominated by data pairs with large feature distances, i.e., pairs that are very close in the label space but far away from each other in the feature space.

  9. 2 MSLP: Restriction
     • Multi-scale locality preservation restriction: the l nearest neighbors of a data point in the label space should be found within its αl nearest neighbors in the feature space (see the sketch after this slide).
     • That is, by using different locality granularities in the label space and the feature space, the locality information of data points in both spaces is integrated.
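
A minimal Python sketch of this restriction, assuming binary weights; build_weights is a hypothetical helper name, and the paper's actual weighting scheme may differ:

```python
import numpy as np
from scipy.spatial.distance import cdist

def build_weights(X, D, l=5, alpha=3):
    """Multi-scale neighborhood restriction (sketch): the l label-space
    neighbors of each point are searched only within its alpha*l
    feature-space neighbors. X: (n, d) features, D: (n, c) label
    distributions. Returns a symmetric binary weight matrix W."""
    n = X.shape[0]
    feat_dist = cdist(X, X)   # pairwise distances in feature space
    lab_dist = cdist(D, D)    # pairwise distances in label space
    W = np.zeros((n, n))
    for i in range(n):
        # alpha*l nearest feature-space neighbors (index 0 is i itself)
        cand = np.argsort(feat_dist[i])[1:alpha * l + 1]
        # among those candidates, keep the l closest in label space
        keep = cand[np.argsort(lab_dist[i, cand])[:l]]
        W[i, keep] = 1.0
    return np.maximum(W, W.T)  # symmetrize
```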

  10. 2 MSLP: Robust to Noise
      • Smoothness assumption [Chapelle et al., 2006]: neighboring data points in the feature space are more likely to share similar labels.
      • Hetero-neighbors: data pairs that are very close in the feature space but far away from each other in the label space.

  11. 2 MSLP: Objective
      • The objective of MSLP (a plausible reconstruction is sketched below).
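
The objective appeared only as an image on the slide; a plausible reconstruction, assuming the LPP-style constrained form that matches the generalized eigen-decomposition described two slides later (L = B - W is the graph Laplacian, B is the diagonal degree matrix with B_ii = Σ_j W_ij, and X holds instances as columns):

```latex
% Hypothetical reconstruction of the MSLP objective in LPP-style form:
% minimize the locality-preserving cost plus L2 regularization, with a
% scale constraint that removes the trivial all-zero solution.
\min_{V}\ \operatorname{tr}\!\bigl( V^\top ( X L X^\top + \lambda I ) V \bigr)
\quad \text{s.t.} \quad V^\top X B X^\top V = I
```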

  12. 2 MSLP: Solution

  13. 2 MSLP: Solution
      Applying the Lagrangian method, the problem can be transformed into a generalized eigen-decomposition problem. The optimal V consists of the m normalized eigenvectors corresponding to the m smallest eigenvalues (a solver sketch follows).
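
A Python sketch of such a solver, assuming the LPP-style reconstruction above; solve_projection is a hypothetical name, and scipy.linalg.eigh returns eigenvalues in ascending order, so the first m columns are the ones kept:

```python
import numpy as np
from scipy.linalg import eigh

def solve_projection(X, W, m, lam=1e-3):
    """Solve min tr(V' (X L X' + lam I) V) s.t. V' X B X' V = I via a
    generalized eigen-decomposition. X: (d, n) with instances as columns,
    W: (n, n) weight matrix. Returns the (d, m) projection matrix V."""
    B = np.diag(W.sum(axis=1))                    # degree matrix
    L = B - W                                     # graph Laplacian
    A = X @ L @ X.T + lam * np.eye(X.shape[0])
    C = X @ B @ X.T + 1e-8 * np.eye(X.shape[0])   # jitter keeps C positive definite
    vals, vecs = eigh(A, C)                       # eigenvalues in ascending order
    return vecs[:, :m]                            # eigenvectors of the m smallest
```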

  14. 2 MSLP: Decoder
      • Testing phase (a decoder sketch follows).
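
The decoder formula was also an image; the sketch below substitutes a simple nearest-neighbor decoder that at least honors the constraints of a label distribution (non-negativity, summing to one). It is an assumption for illustration, not the paper's decoder:

```python
import numpy as np

def decode(x_test, V, Z_train, D_train, k=5):
    """Hypothetical decoding sketch: embed the test instance with the
    learned projection V, average the label distributions of its k
    nearest training points in the embedding space, and renormalize.
    V: (d, m), Z_train: (n, m) embedded training labels, D_train: (n, c)."""
    z = V.T @ x_test                              # embed the test instance
    dist = np.linalg.norm(Z_train - z, axis=1)    # distances in embedding space
    nn = np.argsort(dist)[:k]                     # k nearest training points
    d_hat = D_train[nn].mean(axis=0)              # average their distributions
    d_hat = np.clip(d_hat, 0.0, None)             # enforce non-negativity
    return d_hat / d_hat.sum()                    # enforce sum-to-one
```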

  15. 3 Experiment: Configuration
      • Compared methods
        - Eight popular LDL methods: IIS-LDL, CPNN, BFGS-LDL, LDSVR, AA-BP, AA-KNN, PT-SVM, PT-Bayes.
        - Four typical feature embedding (FE) methods: CCA, NPE, PCA, LPP (the linear version of Laplacian Eigenmaps). The compared FE methods are allowed to be extended to their kernel versions with the RBF kernel, which gives them a full chance to beat MSLP.
      • Widely used metrics in LDL (implemented in the sketch after this slide)
        - Four distance metrics: Chebyshev, Clark, Kullback-Leibler, Canberra.
        - Two similarity metrics: Cosine and Intersection.
      • Other settings
        - The embedding ratio of the dimensionality ranges over {10%, 20%, ..., 100%}.
        - Each method is run with its best tuned parameters.
        - 10-fold cross validation.
        - Pairwise t-tests at the 90% significance level.
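
The six evaluation measures have standard definitions in the LDL literature; a compact reference implementation for a single pair of label distributions:

```python
import numpy as np

def ldl_metrics(d, d_hat, eps=1e-12):
    """Standard LDL measures for a ground-truth distribution d and a
    predicted distribution d_hat (both non-negative, summing to 1).
    eps guards against division by zero and log(0)."""
    d = np.asarray(d, dtype=float) + eps
    d_hat = np.asarray(d_hat, dtype=float) + eps
    return {
        # distance metrics (lower is better)
        "Chebyshev": np.max(np.abs(d - d_hat)),
        "Clark": np.sqrt(np.sum((d - d_hat) ** 2 / (d + d_hat) ** 2)),
        "Canberra": np.sum(np.abs(d - d_hat) / (d + d_hat)),
        "Kullback-Leibler": np.sum(d * np.log(d / d_hat)),
        # similarity metrics (higher is better)
        "Cosine": d @ d_hat / (np.linalg.norm(d) * np.linalg.norm(d_hat)),
        "Intersection": np.sum(np.minimum(d, d_hat)),
    }
```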

  16. 3 Experiment: Datasets

      Dataset       #Samples  #Labels  #Features  Domain
      s-JAFFE       213       6        243        facial expression recognition
      s-BU-3DFE     2500      6        243        facial expression recognition
      SCUT-FBP      1500      5        300        facial beauty sense
      M²B           1240      5        250        facial beauty sense
      Nature_Scene  2000      9        294        natural scene annotation

  17. 3 Experiment: Visualization
      • Different colors are used to display images according to the highest description degree among the basic emotions.

  18. 3 Experiment: Quantitative Results

  19. 3 Experiment: Quantitative Results
      Across all metrics, MSLP ranks 1st in 93.3% of cases.

  20. 4 Conclusion
      • Conclusion
        - The first attempt to apply LE in LDL.
        - MSLP is insensitive to the presence of hetero-neighbors and integrates the locality structure of points in both spaces at different granularities.
        - Experiments reveal the effectiveness of MSLP in gathering points with similar label distributions in the embedding space.
      • Future work
        - Explore whether there are better ways to utilize the structure information described by the label distributions.
        - Extend MSLP to other learning paradigms (e.g., multi-output regression) that have numerical labels.

  21. Thank you Q & A
