
Lecture 2: Principal Components and Eigenfaces, by Mark Hasegawa-Johnson



  1. Lecture 2: Principal Components and Eigenfaces. Mark Hasegawa-Johnson, ECE 417: Multimedia Signal Processing, Fall 2020.

  2. Outline of today's lecture: (1) Review: Gaussians and eigenvectors; (2) Eigenvectors of symmetric matrices; (3) Images as signals; (4) Today's key point: principal components = eigenfaces; (5) How to make it work: Gram matrix, SVD; (6) Summary.

  3. Outline (section transition slide; same six-item outline as slide 2).

  4. Outline of today's lecture: (1) MP 1; (2) Review: Gaussians and eigenvectors; (3) Eigenvectors of a symmetric matrix; (4) Images as signals; (5) Principal components = eigenfaces; (6) How to make it work: Gram matrix and SVD.

  5. Outline (section transition slide; same six-item outline as slide 2).

  6. Scalar Gaussian random variables: mean $\mu = E[X]$ and variance $\sigma^2 = E[(X - \mu)^2]$.

  7. Example: instances of a Gaussian random vector. A Gaussian random vector is $\vec{x} = [x_0, \ldots, x_{D-1}]^T$, with mean vector $\vec{\mu} = E[\vec{x}] = [\mu_0, \ldots, \mu_{D-1}]^T$.

  8. Example: instances of a Gaussian random vector (continued). The covariance matrix is
$$\Sigma = \begin{bmatrix} \sigma_0^2 & \rho_{01} & \cdots & \rho_{0,D-1} \\ \rho_{10} & \sigma_1^2 & & \vdots \\ \vdots & & \ddots & \rho_{D-2,D-1} \\ \rho_{D-1,0} & \cdots & \rho_{D-1,D-2} & \sigma_{D-1}^2 \end{bmatrix},$$
where $\rho_{ij} = E[(x_i - \mu_i)(x_j - \mu_j)]$ and $\sigma_i^2 = E[(x_i - \mu_i)^2]$.
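To make the covariance definitions concrete, here is a minimal numpy sketch (my own illustration, with an assumed 3-dimensional mean and covariance, not values from the lecture): draw many instances of a Gaussian random vector and check that the empirical mean and covariance approach $\vec{\mu}$ and $\Sigma$.

```python
# Minimal sketch (assumed example values, not from the lecture): sample a
# Gaussian random vector and check the entries rho_ij and sigma_i^2 empirically.
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0, 0.5])              # assumed example mean vector
Sigma = np.array([[2.0, 0.3, 0.0],
                  [0.3, 1.0, -0.4],
                  [0.0, -0.4, 0.5]])         # assumed example covariance matrix

X = rng.multivariate_normal(mu, Sigma, size=100000)   # rows are instances x_m
print(X.mean(axis=0))           # close to mu
print(np.cov(X, rowvar=False))  # close to Sigma
```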

  9. Sample mean, sample covariance. In the real world, we don't know $\vec{\mu}$ and $\Sigma$! If we have $M$ instances $\vec{x}_m$ of the Gaussian, we can estimate
$$\vec{\mu} = \frac{1}{M}\sum_{m=0}^{M-1} \vec{x}_m, \qquad \Sigma = \frac{1}{M-1}\sum_{m=0}^{M-1} (\vec{x}_m - \vec{\mu})(\vec{x}_m - \vec{\mu})^T.$$
The sample mean and sample covariance are not the same as the real mean and real covariance, but we'll use the same letters ($\vec{\mu}$ and $\Sigma$) unless the problem requires us to distinguish them. [Figure: examples of $\vec{x}_m - \vec{\mu}$.]
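A minimal sketch of the sample mean and sample covariance formulas above, assuming the $M$ instances are stacked as the rows of a data matrix:

```python
# Minimal sketch (assumed data layout): sample mean and unbiased sample
# covariance of M instances stored as the rows of X.
import numpy as np

def sample_mean_cov(X):
    """X has shape (M, D): M instances of a D-dimensional vector."""
    M = X.shape[0]
    mu = X.mean(axis=0)               # sample mean, shape (D,)
    Xc = X - mu                       # centered instances, x_m - mu
    Sigma = (Xc.T @ Xc) / (M - 1)     # sample covariance, shape (D, D)
    return mu, Sigma
```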

  10. Review: eigenvalues and eigenvectors. The eigenvectors of a $D \times D$ square matrix $A$ are the vectors $\vec{v}$ such that $A\vec{v} = \lambda\vec{v}$ (1). The scalar $\lambda$ is called the eigenvalue. Eq. (1) can have a nonzero solution only if $|A - \lambda I| = 0$ (2).
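A small numpy check of this definition, using an assumed example matrix (not from the slides): `np.linalg.eig` returns the eigenvalues and right eigenvectors, and each pair satisfies $A\vec{v} = \lambda\vec{v}$.

```python
# Minimal sketch (assumed example matrix): verify A v = lambda v for each
# eigenvalue/eigenvector pair returned by numpy.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam, V = np.linalg.eig(A)        # columns of V are right eigenvectors
for d in range(A.shape[0]):
    v = V[:, d]
    assert np.allclose(A @ v, lam[d] * v)
```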

  11. Left and right eigenvectors. We've been working with right eigenvectors and right eigenvalues: $A\vec{v}_d = \lambda_d \vec{v}_d$. There may also be left eigenvectors, which are row vectors $\vec{u}_d^T$ with corresponding left eigenvalues $\kappa_d$: $\vec{u}_d^T A = \kappa_d \vec{u}_d^T$.

  12. Eigenvectors on both sides of the matrix. You can do an interesting thing if you multiply the matrix by its eigenvectors both before and after: $\vec{u}_i^T (A\vec{v}_j) = \vec{u}_i^T (\lambda_j \vec{v}_j) = \lambda_j \vec{u}_i^T \vec{v}_j$, but also $(\vec{u}_i^T A)\vec{v}_j = (\kappa_i \vec{u}_i^T)\vec{v}_j = \kappa_i \vec{u}_i^T \vec{v}_j$. There are only two ways that both of these can be true: either $\kappa_i = \lambda_j$ or $\vec{u}_i^T \vec{v}_j = 0$.

  13. Left and right eigenvectors must be paired! There are only two ways that both equalities can hold: either $\kappa_i = \lambda_j$ or $\vec{u}_i^T \vec{v}_j = 0$. Remember that the eigenvalues solve $|A - \lambda_d I| = 0$. In almost all cases the solutions are all distinct ($A$ has distinct eigenvalues), i.e., $\lambda_i \neq \lambda_j$ for $i \neq j$. That means there is at most one $\lambda_j$ that can equal each $\kappa_i$:
$$\vec{u}_i^T \vec{v}_j = 0 \text{ for } i \neq j, \qquad \kappa_i = \lambda_i \text{ for } i = j.$$
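One way to see the pairing numerically (a sketch using a made-up example matrix assumed to have distinct real eigenvalues): the left eigenvectors of $A$ are the eigenvectors of $A^T$, and after matching each $\kappa_i$ with the equal $\lambda_i$, the products $\vec{u}_i^T \vec{v}_j$ vanish off the diagonal.

```python
# Minimal sketch (assumed example matrix): left eigenvectors come from eig(A.T);
# after pairing kappa_i with lambda_i, u_i^T v_j is approximately 0 for i != j.
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.5, 3.0, 1.0],
              [0.0, 1.0, 1.0]])
lam, V = np.linalg.eig(A)        # right eigenvectors (columns of V)
kappa, U = np.linalg.eig(A.T)    # eigenvectors of A^T are the left eigenvectors

# Reorder U so that column i of U is paired with the equal eigenvalue lam[i].
order = [np.argmin(np.abs(lam - k)) for k in kappa]
U = U[:, np.argsort(order)]

G = U.T @ V                      # G[i, j] = u_i^T v_j; off-diagonal entries ~ 0
print(np.round(G, 6))
```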

  14. Outline (section transition slide; same six-item outline as slide 2).

  15. Properties of symmetric matrices. If $A$ is symmetric with $D$ eigenvectors and $D$ distinct eigenvalues, then $VV^T = V^TV = I$, $V^TAV = \Lambda$, and $A = V\Lambda V^T$.

  16. Symmetric matrices: left = right. If $A$ is symmetric ($A = A^T$), then the left and right eigenvectors and eigenvalues are the same, because $\vec{u}_i^T A = (A^T \vec{u}_i)^T = (A\vec{u}_i)^T$, and that last term is equal to $\lambda_i \vec{u}_i^T$ if and only if $\vec{u}_i = \vec{v}_i$.

  17. Symmetric matrices: eigenvectors are orthonormal. Let's combine the following facts: $\vec{u}_i^T \vec{v}_j = 0$ for $i \neq j$ (any square matrix with distinct eigenvalues); $\vec{u}_i = \vec{v}_i$ (symmetric matrix); and $\vec{v}_i^T \vec{v}_i = 1$ (the standard normalization of eigenvectors for any matrix; this is what $\|\vec{v}_i\| = 1$ means). Putting it all together, we get
$$\vec{v}_i^T \vec{v}_j = \begin{cases} 1 & i = j \\ 0 & i \neq j. \end{cases}$$

  18. The eigenvector matrix. So if $A$ is symmetric with distinct eigenvalues, then its eigenvectors are orthonormal: $\vec{v}_i^T \vec{v}_j = 1$ if $i = j$ and $0$ if $i \neq j$. We can write this as $V^T V = I$, where $V = [\vec{v}_0, \ldots, \vec{v}_{D-1}]$.

  19. The eigenvector matrix is orthonormal: $V^TV = I$, and it also turns out that $VV^T = I$. Proof: $VV^T = VIV^T = V(V^TV)V^T = (VV^T)^2$, and the only full-rank matrix that satisfies $VV^T = (VV^T)^2$ is $VV^T = I$.

  20. Eigenvectors orthogonalize a symmetric matrix. So now, suppose $A$ is symmetric:
$$\vec{v}_i^T A \vec{v}_j = \vec{v}_i^T (\lambda_j \vec{v}_j) = \lambda_j \vec{v}_i^T \vec{v}_j = \begin{cases} \lambda_j & i = j \\ 0 & i \neq j. \end{cases}$$
In other words, if a symmetric matrix has $D$ eigenvectors with distinct eigenvalues, then its eigenvectors orthogonalize $A$:
$$V^T A V = \Lambda = \begin{bmatrix} \lambda_0 & 0 & 0 \\ 0 & \ddots & 0 \\ 0 & 0 & \lambda_{D-1} \end{bmatrix}.$$

  21. A symmetric matrix is the weighted sum of its eigenvectors. One more thing: notice that $A = VV^T A VV^T = V\Lambda V^T$. The last term is
$$V \Lambda V^T = [\vec{v}_0, \ldots, \vec{v}_{D-1}] \begin{bmatrix} \lambda_0 & & 0 \\ & \ddots & \\ 0 & & \lambda_{D-1} \end{bmatrix} \begin{bmatrix} \vec{v}_0^T \\ \vdots \\ \vec{v}_{D-1}^T \end{bmatrix} = \sum_{d=0}^{D-1} \lambda_d \vec{v}_d \vec{v}_d^T.$$

  22. Summary: properties of symmetric matrices. If $A$ is symmetric with $D$ eigenvectors and $D$ distinct eigenvalues, then $A = V\Lambda V^T$, $\Lambda = V^TAV$, and $VV^T = V^TV = I$.
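These properties are easy to verify numerically. The following sketch uses an assumed symmetric example matrix (not from the lecture) and `np.linalg.eigh`, which returns orthonormal eigenvectors as the columns of $V$.

```python
# Minimal sketch (assumed symmetric example matrix): verify VV^T = V^TV = I,
# V^T A V = Lambda, A = V Lambda V^T, and the spectral (outer-product) sum.
import numpy as np

A = np.array([[4.0, 1.0, 0.5],
              [1.0, 3.0, 1.0],
              [0.5, 1.0, 2.0]])
lam, V = np.linalg.eigh(A)               # eigenvalues and orthonormal eigenvectors
Lam = np.diag(lam)

assert np.allclose(V.T @ V, np.eye(3))   # V^T V = I
assert np.allclose(V @ V.T, np.eye(3))   # V V^T = I
assert np.allclose(V.T @ A @ V, Lam)     # V^T A V = Lambda
assert np.allclose(A, V @ Lam @ V.T)     # A = V Lambda V^T
# A equals the weighted sum of eigenvector outer products (slide 21).
assert np.allclose(A, sum(lam[d] * np.outer(V[:, d], V[:, d]) for d in range(3)))
```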

  23. Outline (section transition slide; same six-item outline as slide 2).

  24. How do you treat an image as a signal? An RGB image is a signal in three dimensions: $f[i,j,k]$ = intensity of the signal in the $i$th row, $j$th column, and $k$th color plane. Each value $f[i,j,k]$ is stored either as a floating point number or as an integer. Floating point: usually $x \in [0,1]$, where $x = 0$ means dark and $x = 1$ means bright. Integer: usually $x \in \{0, \ldots, 255\}$, where $x = 0$ means dark and $x = 255$ means bright. The three color planes are usually $k = 0$: red, $k = 1$: green, $k = 2$: blue.
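As a concrete illustration (assuming the Pillow library and a hypothetical file name, neither of which is specified in the lecture), an RGB image can be loaded as exactly such a three-dimensional array:

```python
# Minimal sketch (Pillow assumed; "face.png" is a placeholder path): load an
# RGB image as an (R, C, K) array of 8-bit integers, then rescale to [0, 1].
import numpy as np
from PIL import Image

img = Image.open("face.png").convert("RGB")
f = np.asarray(img)                      # shape (rows, cols, 3), dtype uint8, values 0..255
f_float = f.astype(np.float64) / 255.0   # floating point values in [0, 1]
```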

  25. How do you treat an image as a vector? A vectorized RGB image is created by concatenating all of the colors, for all of the columns, for all of the rows. So if the $m$th image $f_m[i,j,k]$ has $R \approx 200$ rows, $C \approx 400$ columns, and $K = 3$ colors, then we set $\vec{x}_m = [x_{m,0}, \ldots, x_{m,D-1}]^T$, where $x_{m,(iC+j)K+k} = f_m[i,j,k]$, which has a total dimension of $D = RCK \approx 200 \times 400 \times 3 = 240{,}000$.
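This vectorization is a plain row-major flatten; a minimal sketch with the shapes assumed above:

```python
# Minimal sketch (assumed shapes): x_{m,(iC+j)K+k} = f_m[i,j,k] is exactly a
# row-major (C-order) flatten of an (R, C, K) image array.
import numpy as np

R, C, K = 200, 400, 3
f_m = np.random.rand(R, C, K)        # stand-in for one RGB training image
x_m = f_m.reshape(-1)                # shape (R*C*K,) = (240000,)

# Check the index formula for one example pixel and color.
i, j, k = 10, 25, 2
assert x_m[(i * C + j) * K + k] == f_m[i, j, k]
```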

  26. How do you classify an image? Suppose we have a test image $\vec{x}_{\text{test}}$. We want to figure out: who is this person? [Figure: the test datum $\vec{x}_{\text{test}}$.]

  27. Training data. In order to classify the test image, we need some training data. For example, suppose we have the following four images in our training data. Each image $\vec{x}_m$ comes with a label $y_m$, which is just a string giving the name of the individual. [Figure: four training images $\vec{x}_0, \ldots, \vec{x}_3$ with labels $y_0$ = Colin Powell, $y_1$ = Gloria Arroyo, $y_2$ = Megawati Sukarnoputri, $y_3$ = Tony Blair.]
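The slides have not yet said how the comparison is done (the key point of the lecture is to project onto principal components, i.e. eigenfaces, first). Purely as an illustration of the setup, a minimal nearest-neighbor baseline over the raw vectors might look like this:

```python
# Minimal nearest-neighbor sketch (illustration only, not the lecture's method;
# the lecture goes on to project onto eigenfaces before comparing).
import numpy as np

def nearest_neighbor_label(x_test, X_train, y_train):
    """X_train has shape (M, D); y_train is a list of M label strings."""
    dists = np.linalg.norm(X_train - x_test, axis=1)   # distance to each x_m
    return y_train[int(np.argmin(dists))]
```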
