Non-Negative Graph Embedding

Non-Negative Graph Embedding, Jianchao Yang - PowerPoint PPT Presentation

Jianchao Yang, Shuicheng Yan, Yun Fu, Xuelong Li, Thomas Huang. Department of ECE, Beckman Institute and CSL, University of Illinois at Urbana-Champaign. CVPR 2008.


  1. Non-Negative Graph Embedding. Jianchao Yang, Shuicheng Yan, Yun Fu, Xuelong Li, Thomas Huang. Department of ECE, Beckman Institute and CSL, University of Illinois at Urbana-Champaign

  2. Outline • Non-negative Part-based Representation – Non-Negative Matrix Factorization • Non-Negative Graph Embedding (NGE) – Graph Embedding framework – Our formulation • Experiment Results – Face recognition – Localized basis – Robustness to image occlusion • Conclusions

  3. Non-negative Part-based Representation • Why non-negativity? – Better physical interpretation of non-negative data – Examples such as absolute temperatures, light intensities, probabilities, sound spectra, etc. • Why part-based? – Psychological and physiological evidence for part-based representations in the human brain – Perception of the whole as perceptions of the parts

  4. Non-negative Matrix Factorization • Formulation: approximate a non-negative data matrix X by the product of two non-negative factors, X ≈ WH, typically by minimizing ||X − WH||²_F subject to W ≥ 0, H ≥ 0 • Multiplicative update rules guarantee non-negativity
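The slide's equations were embedded as images; as a minimal sketch, here are the standard Lee and Seung multiplicative updates in NumPy (the function name, random initialization, and the eps guard are our additions):

```python
import numpy as np

def nmf(X, r, n_iter=200, eps=1e-10):
    """Standard multiplicative-update NMF: X (m x n, non-negative) ~= W (m x r) @ H (r x n)."""
    m, n = X.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, r))   # strictly positive random init keeps all iterates non-negative
    H = rng.random((r, n))
    for _ in range(n_iter):
        # Lee & Seung (2001) updates; eps guards against division by zero
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Each update multiplies the current factor by a non-negative ratio, so non-negativity is preserved automatically; this is the property the slide refers to.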

  5. What Does NMF Learn? • NMF indeed learns a part-based representation • Problems: – Matrix factorization has no control over the properties of the parts – Used in document clustering, but not good for recognition • How the brain learns the discriminative parts is still unknown

  6. Non-negative Graph Embedding (NGE) • Motivation – Learn the non-negative part-based representation – Want it to be good for classification • Method – Reconstruction for learning the part-based basis – Regularization with discriminant analysis

  7. A Better Scheme (diagram labels: Input, Reconstruction, Discriminant Analysis, Part-Based Basis, Output) • Use all available data for learning the basis, while guided by the labeling information

  8. Learn the Discriminative Parts • One straightforward solution combines reconstruction with a discriminative regularizer, where X is the data matrix, W is the part-based basis matrix, H is the coefficient matrix, and F is a function encoding the discriminative power of the coefficients (see the reconstruction below) • The problem is how to choose F and how to do the optimization
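A plausible form of this straightforward objective, assembled from the definitions above (the way the terms are combined and the tradeoff weight λ are our reconstruction, not copied from the slide):

```latex
\min_{W \ge 0,\; H \ge 0} \; \lVert X - W H \rVert_F^{2} \;+\; \lambda\, F(H)
```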

  9. Graph Embedding • Graph Embedding Framework [Yan et al., 2007] – Intrinsic graph: characterizes the favorable relationships among training data – Penalty graph: characterizes the unfavorable relationships among training data – Objective: minimize the intrinsic-graph distances while constraining the penalty-graph distances • These graphs can be unsupervised, supervised, or semi-supervised
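The objective on this slide was an image. For reference, the standard statement from Yan et al. (2007), with intrinsic similarity matrix S, intrinsic Laplacian L = D − S, a constraint matrix B (typically the penalty-graph Laplacian or a scale normalization), and one-dimensional embedding y (supplied here as the conventional form, not transcribed from the slide):

```latex
y^{*} \;=\; \arg\min_{y^{\top} B y \,=\, d} \;\sum_{i \neq j} \lVert y_i - y_j \rVert^{2}\, S_{ij}
      \;=\; \arg\min_{y^{\top} B y \,=\, d} \; y^{\top} L\, y
```

Minimizing intrinsic-graph distances under a penalty-graph constraint is what makes the embedding discriminative.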

  10. NGE Formulation • Divide the feature space into two parts: the discriminant space and the complementary space for reconstruction • The objective for the discriminant space combines the intrinsic and penalty graph terms (see the reconstruction after the next slide)

  11. NGE Formulation • To make the problem solvable, change the objective to use the complementary space • Given the intrinsic graph and penalty graph, the optimization problem can be formulated as shown below
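The formulation itself was an image on the slide. A plausible reconstruction based on the setup described, splitting the basis W = [W1, W2] and coefficients H = [H1; H2] into a discriminant part (1) and a complementary reconstruction part (2), with intrinsic Laplacian L and penalty Laplacian L^p (the tradeoff weight λ is our notation):

```latex
\min_{W \ge 0,\; H \ge 0}\;
\lVert X - W H \rVert_F^{2}
\;+\; \lambda \left[ \operatorname{tr}\!\left( H_1 L H_1^{\top} \right)
\;+\; \operatorname{tr}\!\left( H_2 L^{p} H_2^{\top} \right) \right]
```

Moving the penalty-graph term into the complementary space, rather than maximizing it in the discriminant space, is the change the slide says makes the problem solvable.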

  12. Preliminaries • Definition 1: A matrix B is called an M-matrix if 1) its off-diagonal entries are less than or equal to zero; and 2) the real parts of all its eigenvalues are positive • Lemma 1: If B is an M-matrix, its inverse is non-negative, that is, B⁻¹(i, j) ≥ 0 • Definition 2: A function G(A, A′) is an auxiliary function for F(A) if G(A, A′) ≥ F(A) and G(A, A) = F(A) • Lemma 2: If G is an auxiliary function of F, then F is non-increasing under the update rule A^(t+1) = arg min_A G(A, A^(t))
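Why Lemma 2 holds, in one line (the standard argument, supplied for completeness): with the update A^(t+1) = arg min_A G(A, A^(t)),

```latex
F\!\left(A^{(t+1)}\right) \;\le\; G\!\left(A^{(t+1)}, A^{(t)}\right)
\;\le\; G\!\left(A^{(t)}, A^{(t)}\right) \;=\; F\!\left(A^{(t)}\right)
```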

  13. Optimization Procedure • Initialize W and H with non-negative values; the optimization is done by alternating between W and H • Optimize W, fixing H: define an auxiliary function for the W-subproblem, which yields the multiplicative update rule for W; the quadratic term uses a diagonal, element-wise positive matrix, which guarantees the non-negativity of W (see below)
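The auxiliary function and the update for W were images on the slide. Under the reconstructed objective above, the graph terms do not involve W, so the W-subproblem reduces to the plain NMF subproblem; a plausible form of the resulting update (a reconstruction, not verified against the paper, with ⊙ and the fraction taken element-wise) is:

```latex
W \;\leftarrow\; W \odot \frac{X H^{\top}}{W H H^{\top}}
```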

  14. Optimization Procedure • Optimize H, fixing W: the auxiliary function is defined analogously, giving separate update rules for the two parts of H • The matrices involved are M-matrices, whose inverses are element-wise non-negative, which guarantees the non-negativity of H (a sketch of the full alternation follows)
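To make the alternation concrete, a minimal NumPy sketch under the reconstructed objective above. These are GNMF-style multiplicative updates that preserve non-negativity by splitting each Laplacian as L = D − S, not the paper's exact auxiliary-function rules; the function names, λ, and the initialization are ours:

```python
import numpy as np

def split_laplacian(L):
    """Split a graph Laplacian L = D - S into its degree part D and similarity part S >= 0."""
    D = np.diag(np.diag(L))
    return D, D - L

def nge_sketch(X, L, Lp, r1, r2, lam=0.1, n_iter=200, eps=1e-10):
    """Alternating multiplicative updates for the reconstructed objective
    ||X - WH||_F^2 + lam * (tr(H1 L H1^T) + tr(H2 Lp H2^T)).
    X: (m, n) non-negative data; L, Lp: (n, n) intrinsic/penalty Laplacians;
    r1, r2: dimensions of the discriminant and complementary parts."""
    m, n = X.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, r1 + r2))
    H = rng.random((r1 + r2, n))
    D, S = split_laplacian(L)
    Dp, Sp = split_laplacian(Lp)
    for _ in range(n_iter):
        # W-step: the graph terms are constant in W, so the plain NMF update applies
        W *= (X @ H.T) / (W @ (H @ H.T) + eps)
        # H-step: block-wise update; splitting L = D - S keeps every factor non-negative
        H1, H2 = H[:r1], H[r1:]
        num = W.T @ X + lam * np.vstack([H1 @ S, H2 @ Sp])
        den = (W.T @ W) @ H + lam * np.vstack([H1 @ D, H2 @ Dp]) + eps
        H *= num / den
    return W, H
```

The split into D and S plays the same role as the M-matrix argument on the slides: every factor in the multiplicative ratio stays non-negative, so H does too.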

  15. General Framework (figure: intrinsic and penalty graphs for Marginal Fisher Analysis) • Our algorithm is a general framework, given the intrinsic and penalty graphs – These graphs can be unsupervised, supervised, or semi-supervised – We used the supervised Marginal Fisher Analysis (MFA) graphs to demonstrate the framework
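As a hedged illustration of the MFA graphs used here, a small sketch of the standard construction from Yan et al. (2007); the function name, the k1/k2 defaults, and the brute-force distance computation are our choices:

```python
import numpy as np

def mfa_graphs(X, y, k1=5, k2=20):
    """Build MFA intrinsic/penalty adjacency matrices (sketch of Yan et al. 2007).

    Intrinsic: each sample connects to its k1 nearest same-class neighbors.
    Penalty:   each sample connects to its k2 nearest different-class neighbors
               (the between-class marginal pairs).
    X: (n, d) samples, y: (n,) labels. Returns symmetric 0/1 matrices of shape (n, n).
    """
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # pairwise squared distances
    np.fill_diagonal(d2, np.inf)                                # exclude self-pairs
    S = np.zeros((n, n))    # intrinsic graph
    Sp = np.zeros((n, n))   # penalty graph
    for i in range(n):
        same = np.where(y == y[i])[0]
        same = same[same != i]
        diff = np.where(y != y[i])[0]
        for j in same[np.argsort(d2[i, same])][:k1]:
            S[i, j] = S[j, i] = 1
        for j in diff[np.argsort(d2[i, diff])][:k2]:
            Sp[i, j] = Sp[j, i] = 1
    return S, Sp
```

The Laplacians of these two graphs are exactly the L and L^p that the NGE objective consumes.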

  16. Experiments • Face Recognition – Tested on three databases: CMU PIE, ORL, and FERET – Compared with the unsupervised algorithms PCA, NMF, and LNMF (S. Li, CVPR 2001), and the supervised algorithms LDA and MFA

  17. Experiments • Learned non-negative part-based bases (figure: basis images from NMF, LNMF, and NGE)

  18. Experiments • Robustness to occlusion (figure: occlusion examples)

  19. Conclusions • Contributions: – Proposed a general framework called Non-Negative Graph Embedding (NGE) – Used the supervised MFA graphs to demonstrate the effectiveness of the algorithm • Limitation: – Like other graph-based methods, NGE suffers from speed and scalability issues during off-line training • Extension: – Unlabeled data can be incorporated into the basis learning, while guided by the available label information

  20. Thank you!
