Maximum Margin based Semi-supervised Spectral Kernel Learning
Zenglin Xu, Jianke Zhu, Michael R. Lyu and Irwin King


1. Maximum Margin based Semi-supervised Spectral Kernel Learning
Zenglin Xu, Jianke Zhu, Michael R. Lyu and Irwin King
Department of Computer Science and Engineering, The Chinese University of Hong Kong
International Joint Conference on Neural Networks 2007

2. Outline
   1. Motivation
      - Kernel Learning
      - Spectral Kernel Learning Approaches
   2. A Framework of Spectral Kernel Learning
      - Theoretical Foundation
      - Semi-supervised Spectral Kernel Learning Framework
      - Maximum Margin Based Spectral Kernel Learning
   3. Experiment and Discussion
   4. Conclusion and Future work

3. Let's Start from the Kernel Trick

4. Kernel Learning
Different kernel functions define different implicit mappings (linear kernel, RBF kernel, etc.).
How do we find an appropriate kernel? This leads to the kernel learning task.
Definition: Kernel learning works by embedding data from the input space into a Hilbert space, and then searching for relations among the embedded data points to maximize a performance measure.
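As a small illustration of the point above (not from the slides; the helper names, the toy data, and the gamma value are arbitrary assumptions), the following Python sketch computes the kernel matrices induced by a linear kernel and an RBF kernel on the same data; each corresponds to a different implicit mapping.

```python
import numpy as np

def linear_kernel(X):
    # k(x, z) = <x, z>; the implicit feature mapping is the identity.
    return X @ X.T

def rbf_kernel(X, gamma=1.0):
    # k(x, z) = exp(-gamma * ||x - z||^2); the implicit mapping is infinite-dimensional.
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

# Toy data: 10 points in R^3; both calls return a 10 x 10 kernel (Gram) matrix.
X = np.random.randn(10, 3)
K_linear = linear_kernel(X)
K_rbf = rbf_kernel(X, gamma=0.5)
```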

5. Semi-supervised Kernel Learning
We design a kernel using both:
- the label information of the labeled data
- the unlabeled data

6. Spectral Kernel Learning
Given an input kernel K, a spectral kernel is obtained by adjusting the spectrum of K:

$$\bar{K} = \sum_{i=1}^{n} g(\mu_i)\, \phi_i \phi_i^{\top}, \qquad (1)$$

where g(·) is a transformation function of the spectrum of a kernel matrix, and (µ_i, φ_i) denotes the i-th eigenvalue and eigenvector.
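A minimal NumPy sketch of Eq. (1), assuming a symmetric positive semi-definite input kernel; the function name, the toy RBF input kernel, and the particular choice of g (a diffusion-style transform, anticipating the table on slide 16) are illustrative assumptions, not part of the original slides.

```python
import numpy as np

def spectral_kernel(K, g):
    # Eq. (1): K_bar = sum_i g(mu_i) * phi_i * phi_i^T for a symmetric kernel K.
    mu, Phi = np.linalg.eigh(K)       # eigenvalues mu and eigenvectors (columns of Phi)
    g_mu = g(mu)                      # transform the spectrum elementwise
    return (Phi * g_mu) @ Phi.T       # reassemble the kernel with the new spectrum

# Toy usage: an RBF input kernel and a diffusion-style transform g(mu) = exp(-(sigma^2/2) * mu).
X = np.random.randn(30, 2)
K = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1) / 2.0)
K_bar = spectral_kernel(K, g=lambda mu: np.exp(-0.5 * 1.0**2 * mu))
```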

7. Typical Approaches in Spectral Kernel Learning
- Diffusion kernels [Kondor and Lafferty, 02]
- Regularization on graphs [Smola and Kondor, 03]
- Non-parametric spectral kernel learning [Zhu et al., 03]
- Fast decay spectral kernel [Hoi et al., 06]

8. The Property and Limitation of Previous Approaches
Property: Distances on the graph can give a useful, more global sense of similarity between objects.
Limitation: The kernel design process does not involve the bias or the decision boundary of a kernel-based learning algorithm.

9. Why is the Bias Important?
Different kernel methods utilize different prior knowledge in order to derive the separating hyperplane:
- SVM maximizes the margin between two classes of data in the kernel-induced feature space.
- Kernel Fisher Discriminant Analysis (KFDA) maximizes the between-class covariance while minimizing the within-class covariance.
- The Minimax Probability Machine (MPM) finds a hyperplane in the feature space that minimizes the maximum of the Mahalanobis distances to the two classes.

10. Our Supplement to Spectral Kernel Methods
This motivates us to design spectral kernel learning algorithms that:
- keep the properties of spectral kernels
- incorporate the decision boundary of a kernel-based classifier

11. Our Contributions
- We generalize the previous work in spectral kernel learning to a spectral kernel learning framework.
- We incorporate the decision boundary of a classifier into the spectral kernel learning process.

12. An Illustration
Figure: The decision boundaries on Relevance and Twocircles.
- The black (dark) line: regular RBF kernel
- The magenta (dotted) line: spectral kernel optimizing the kernel target alignment [Hoi et al., 06]
- The cyan (dashed) line: proposed spectral kernel attained by maximizing the margin

13. The Framework
- Theoretical foundation
- Semi-supervised spectral kernel learning framework
- Maximum-margin based spectral kernel learning

14. Spectral Kernel Design Rule
We consider the following regularized linear prediction method on the Reproducing Kernel Hilbert Space (RKHS) H:

$$\hat{f} = \arg\inf_{f \in \mathcal{H}} \; \frac{1}{\ell} \sum_{i=1}^{\ell} L(f(x_i), y_i) + r \|f\|_{\mathcal{H}}^{2}, \qquad (2)$$

where r is a regularization coefficient, ℓ is the number of labeled data points, and L is a loss function. Based on the Representer Theorem, we have

$$\hat{f} = \arg\inf_{f \in \mathbb{R}^{n}} \; \frac{1}{\ell} \sum_{i=1}^{\ell} L(f_i, y_i) + r f^{\top} K^{-1} f. \qquad (3)$$
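To make Eq. (3) concrete, the sketch below solves it for the special case of a squared loss L(f_i, y_i) = (f_i - y_i)^2 (an assumption; the slides do not fix L). Setting the gradient to zero gives the linear system (K S^T S + r ℓ I) f = K S^T y, where S is the selector matrix of the labeled rows; all names and the toy data are illustrative.

```python
import numpy as np

def regularized_prediction(K, y_labeled, labeled_idx, r):
    # Minimize (1/l) * sum_i (f_i - y_i)^2 + r * f^T K^{-1} f over f in R^n (Eq. 3, squared loss).
    n = K.shape[0]
    l = len(labeled_idx)
    S = np.zeros((l, n))
    S[np.arange(l), labeled_idx] = 1.0            # S picks out the labeled entries of f
    # Zero gradient:  S^T S f + r*l*K^{-1} f = S^T y  =>  (K S^T S + r*l*I) f = K S^T y,
    # which avoids forming K^{-1} explicitly.
    A = K @ S.T @ S + r * l * np.eye(n)
    return np.linalg.solve(A, K @ (S.T @ y_labeled))

# Toy usage with a random positive definite kernel and 5 labeled points out of 20.
rng = np.random.default_rng(0)
Z = rng.standard_normal((20, 20))
K = Z @ Z.T + 1e-3 * np.eye(20)
labeled_idx = np.arange(5)
y_labeled = rng.choice([-1.0, 1.0], size=5)
f_hat = regularized_prediction(K, y_labeled, labeled_idx, r=0.1)
```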

15. Spectral Kernel Design Rule
The previous formulation is equivalent to a supervised learning model. A way of unsupervised kernel design is to replace the kernel matrix K with $\bar{K}$, i.e.,

$$\bar{K} = \sum_{i=1}^{n} g(\mu_i)\, \phi_i \phi_i^{\top}. \qquad (4)$$
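Putting the two previous sketches together (a hypothetical usage example that reuses the spectral_kernel and regularized_prediction helpers defined above, so it is not standalone), the unsupervised kernel design step is literally a substitution of $\bar{K}$ for K before training:

```python
# K is built on ALL points (labeled + unlabeled), so the unlabeled data shape the
# spectrum mu_i; only the labeled targets enter the loss in Eq. (3).
K_bar = spectral_kernel(K, g=lambda mu: np.exp(-0.5 * mu))   # Eq. (4)
f_hat = regularized_prediction(K_bar, y_labeled, labeled_idx, r=0.1)
```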

16. Spectral Kernel Design Rule
Depending on different forms of g(·), different kernel matrices can be learned.

Table: Semi-supervised kernels achieved by different spectral transformations.
- g(µ) = exp(-(σ²/2) µ): the diffusion kernel
- g(µ) = 1/(µ + ε): the Gaussian field kernel
- g(µ_i) = µ_i with µ_i ≥ µ_{i+1}, i = 1, ..., n-1: the order-constrained spectral kernel
- g(µ_i) = µ_i with µ_i ≥ w µ_{i+1}, i = 1, ..., q-1: the fast-decay spectral kernel
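For concreteness, the transforms in the table can be written as simple Python callables (a sketch; σ, ε, w, and q are hyperparameters not fixed on this slide). Note that for the order-constrained and fast-decay kernels the spectrum is learned subject to the stated constraints rather than given in closed form, so only constraint checks are shown.

```python
import numpy as np

def diffusion_transform(mu, sigma=1.0):
    # g(mu) = exp(-(sigma^2 / 2) * mu): diffusion kernel.
    return np.exp(-0.5 * sigma**2 * mu)

def gaussian_field_transform(mu, eps=1e-2):
    # g(mu) = 1 / (mu + eps): Gaussian field kernel.
    return 1.0 / (mu + eps)

def satisfies_order_constraint(mu):
    # Order-constrained kernel: mu_i >= mu_{i+1} for i = 1, ..., n-1.
    return bool(np.all(mu[:-1] >= mu[1:]))

def satisfies_fast_decay(mu, w=2.0, q=None):
    # Fast-decay kernel: mu_i >= w * mu_{i+1} for i = 1, ..., q-1.
    q = len(mu) if q is None else q
    return bool(np.all(mu[:q - 1] >= w * mu[1:q]))
```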
