Non-Negative Graph Embedding - Jianchao Yang - PowerPoint PPT Presentation



SLIDE 1

Non-Negative Graph Embedding

Jianchao Yang, Shuicheng Yan, Yun Fu, Xuelong Li, Thomas Huang
Department of ECE, Beckman Institute and CSL
University of Illinois at Urbana-Champaign

SLIDE 2

Outline

  • Non-negative Part-based Representation


– Non-Negative Matrix Factorization

  • Non-negative Graph Embedding (NGE)


– Graph Embedding framework
– Our formulation

  • Experiment Results

– Face recognition
– Localized basis
– Robust to image occlusion

  • Conclusions
SLIDE 3

Non-negative Part-based Representation

  • Why non-negativity?

– Better physical interpretation of the non-negative data.
– Examples include absolute temperatures, light intensities, probabilities, sound spectra, etc.

  • Why part-based?

– Psychological and physiological evidence for part-based representations in the human brain.
– Perception of the whole as perceptions of the parts.

IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2008

SLIDE 4

Non-negative Matrix Factorization


  • Formulation
  • Multiplicative update rules guarantee non-negativity
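The formulation and update rules on this slide are images in the original deck. As a hedged sketch (my own pure-Python illustration, not the slide's code), Lee and Seung's multiplicative updates for minimizing ||X - WH||_F^2 keep W and H non-negative at every step:

```python
import random


def matmul(A, B):
    """Multiply two matrices stored as lists of lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]


def transpose(A):
    return [list(r) for r in zip(*A)]


def nmf(X, r, iters=500, eps=1e-9):
    """Factor non-negative X (m x n) as W (m x r) times H (r x n)
    using multiplicative update rules: each entry is rescaled by a
    ratio of non-negative quantities, so positivity is preserved."""
    m, n = len(X), len(X[0])
    rng = random.Random(0)
    W = [[rng.random() + 0.1 for _ in range(r)] for _ in range(m)]
    H = [[rng.random() + 0.1 for _ in range(n)] for _ in range(r)]
    for _ in range(iters):
        # H <- H * (W^T X) / (W^T W H)
        WtX = matmul(transpose(W), X)
        WtWH = matmul(matmul(transpose(W), W), H)
        H = [[H[i][j] * WtX[i][j] / (WtWH[i][j] + eps)
              for j in range(n)] for i in range(r)]
        # W <- W * (X H^T) / (W H H^T)
        XHt = matmul(X, transpose(H))
        WHHt = matmul(W, matmul(H, transpose(H)))
        W = [[W[i][j] * XHt[i][j] / (WHHt[i][j] + eps)
              for j in range(r)] for i in range(m)]
    return W, H
```

Because every update multiplies by a ratio of non-negative terms, entries initialized positive can never turn negative, which is the non-negativity guarantee the slide refers to.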


SLIDE 5

What Does NMF Learn?

  • NMF indeed learns a part-based representation.


  • Problems:

– Matrix factorization has no control over the properties of the parts.
– Used in document clustering, but not good for recognition.

  • How the brain learns the discriminative parts is still unknown.


SLIDE 6

Non-negative Graph Embedding (NGE)

  • Motivation


– Learn the non-negative part-based representation
– Want it to be good for classification

  • Method

– Reconstruction for learning the part-based basis
– Regularization with discriminant analysis

SLIDE 7

A Better Scheme

[Figure: a part-based basis learned from Input to Output, combining Reconstruction with Discriminant Analysis]

Use all available data for learning the basis, while guided by the labeling information.

SLIDE 8

Learn the Discriminative Parts

  • One straightforward solution

– Factor the data matrix into a part-based basis matrix and a coefficient matrix, with a function encoding the discriminative power of the coefficients.

  • The problem is how to choose this function and how to do the optimization.
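The objective on this slide is an image in the original. A plausible form of this straightforward solution, in notation assumed here (X the data matrix, W the part-based basis, H the coefficients, F the discriminative term, and lambda a trade-off weight, all my labels), would be:

```latex
\min_{W \ge 0,\; H \ge 0} \; \|X - WH\|_F^2 + \lambda\, F(H)
```

The first term drives the part-based reconstruction; the second regularizes the coefficients toward discriminative power.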


SLIDE 9

Graph Embedding

  • Graph Embedding Framework [Yan et al., 2007]


– Intrinsic Graph: characterizes the favorable relationships among training data.
– Penalty Graph: characterizes the unfavorable relationships among training data.
– Objective:

  • These graphs can be unsupervised, supervised, or semi-supervised.
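As an illustration of this criterion (my own sketch, not code from the paper), one can use the identity that the pairwise weighted scatter equals 2 tr(Y^T L Y) for the graph Laplacian L = D - W, and evaluate the intrinsic-over-penalty ratio directly:

```python
def graph_scatter(Y, W):
    """Sum of W[i][j] * ||y_i - y_j||^2 over all ordered pairs; this
    equals 2 * tr(Y^T L Y) for L = D - W, with samples as rows of Y."""
    n = len(W)
    return sum(W[i][j] * sum((a - b) ** 2 for a, b in zip(Y[i], Y[j]))
               for i in range(n) for j in range(n))


def embedding_objective(Y, W_intrinsic, W_penalty):
    """Graph-embedding criterion: favorable (intrinsic) scatter divided
    by unfavorable (penalty) scatter. Smaller is better: samples linked
    in the intrinsic graph end up close, samples linked in the penalty
    graph end up far apart."""
    return graph_scatter(Y, W_intrinsic) / graph_scatter(Y, W_penalty)
```

For example, a 1-D embedding that places the two intrinsically linked samples near each other and the penalized pair far apart yields a small ratio.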


SLIDE 10

NGE Formulation


  • Divide the feature space into two parts: the discriminant space and the complementary space for reconstruction.

  • The objective is:


SLIDE 11

NGE Formulation


  • To make the problem solvable, change the objective with the complementary space:

  • Given the intrinsic graph and penalty graph, the optimization problem can be formulated as:


SLIDE 12

Preliminaries

  • Definition 1: A matrix B is called an M-matrix if 1) its off-diagonal entries are less than or equal to zero; 2) the real parts of all its eigenvalues are positive.

  • Lemma 1: If B is an M-matrix, its inverse is non-negative, that is, B^{-1}(i,j) >= 0.
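A tiny numeric check of Lemma 1 (my own illustration, not from the slides): the matrix [[2, -1], [-1, 2]] has non-positive off-diagonal entries and eigenvalues 1 and 3, so it is an M-matrix, and its inverse is entry-wise non-negative.

```python
def inverse_2x2(B):
    """Closed-form inverse of a 2x2 matrix."""
    (a, b), (c, d) = B
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]


# B is an M-matrix: off-diagonal entries <= 0, eigenvalues 1 and 3 > 0.
B = [[2.0, -1.0], [-1.0, 2.0]]
B_inv = inverse_2x2(B)  # (1/3) * [[2, 1], [1, 2]], entry-wise non-negative
```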

  • Definition 2: Function G(A, A') is an auxiliary function for F(A) if G(A, A') >= F(A) and G(A, A) = F(A).

  • Lemma 2: If G is an auxiliary function of F, then F is non-increasing under the following update rule:
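The update rule itself is an image in the original slide; the standard auxiliary-function rule (as used in Lee and Seung's convergence analysis for NMF) is:

```latex
A^{(t+1)} = \arg\min_{A} G\bigl(A, A^{(t)}\bigr)
```

Monotonicity then follows in one line: F(A^{(t+1)}) <= G(A^{(t+1)}, A^{(t)}) <= G(A^{(t)}, A^{(t)}) = F(A^{(t)}).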


SLIDE 13

Optimization Procedure

  • Initialize W and H with non-negative values; the optimization is done by alternating between W and H.
  • Optimize W, fixing H. Define the auxiliary function as

Thus the update rule for W is: where a diagonal, element-wise positive matrix guarantees the non-negativity of W.
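As a hedged structural sketch (not the paper's exact multiplicative rules), the alternating scheme has this shape; `update_W` and `update_H` stand in for the auxiliary-function-derived updates:

```python
def alternate(update_W, update_H, W0, H0, iters=50):
    """Alternating optimization: fix H and update W, then fix W and
    update H. If each update is non-increasing in the objective
    (e.g., derived from an auxiliary function), the overall objective
    decreases monotonically across iterations."""
    W, H = W0, H0
    for _ in range(iters):
        W = update_W(W, H)  # minimize over W with H fixed
        H = update_H(W, H)  # minimize over H with W fixed
    return W, H
```

For instance, with exact coordinate updates for the toy objective (w*h - 6)^2, the scheme reaches the optimum in one sweep.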


SLIDE 14

Optimization Procedure

  • Optimize H, fixing W. The auxiliary function is defined as

– To optimize:

These are M-matrices, whose inverses are element-wise non-negative, which guarantees the non-negativity of H.


SLIDE 15

General Framework

Intrinsic and penalty graphs for Marginal Fisher Analysis


  • Our algorithm is a general framework, given the intrinsic and penalty graphs.

– These graphs can be unsupervised, supervised, or semi-supervised.
– We used the supervised Marginal Fisher Analysis (MFA) graph to demonstrate the framework.


SLIDE 16

Experiments

  • Face Recognition


– Tested on three databases: CMU PIE, ORL, and FERET.
– Compared with unsupervised algorithms PCA, NMF, LNMF (S. Li, CVPR 2001) and supervised algorithms LDA and MFA.


SLIDE 17

Experiments

  • Learned non-negative part-based basis

[Figure: part-based basis images learned by NMF, LNMF, and NGE]


SLIDE 18

Experiments

  • Robust to Occlusion

Occlusion Examples

SLIDE 19

Conclusions

  • Contributions:

– Proposed a general framework called Non-Negative Graph Embedding (NGE).
– The supervised MFA graph is used to demonstrate the effectiveness of the algorithm.

  • Limitation:

– Like other graph-based methods, NGE suffers from speed and scalability issues during off-line training.

  • Extension:

– Unlabeled data can be incorporated into the basis learning, while guided by the available label information.


SLIDE 20

Thank you!