Application of PCA to Facial Recognition
Aaron Kosmatin, Clayton Broman
Math 45, December 17, 2010


  1. Application of PCA to Facial Recognition. Aaron Kosmatin, Clayton Broman. Math 45, December 17, 2010.

  2. Outline
     1 Introduction
     2 Principal Component Analysis
     3 Applying PCA to Images

  3. Applications of Facial Recognition
     Photo organization: Google's Picasa, Facebook's face locator, Apple's iLife, Sony's Picture Motion Browser
     Automation: border crossings that check passports, replacing passwords on computers
     Misc.: many digital cameras, Microsoft Kinect

  4. Challenges
     Computation speed
     False positives
     Organizing information

  5. Solutions
     Principal Component Analysis (PCA)
     Linear Discriminant Analysis
     Elastic Bunch Graph Matching
     Hidden Markov Models
     Dynamic Link Matching

  6. Overview
     PCA uses a covariance matrix to analyze how the dimensions of a space vary with respect to one another. Some dimensions of the covariance matrix will have low variation while others have high, depending on how consistently the data varies from its average. The eigenvectors of the covariance matrix form a basis for the space containing the data, and when the eigenvalues are sorted from highest to lowest they rank the dimensions from highest to lowest variation.

  7. Analysis of Datasets
     When comparing datasets, a more useful measure than the mean alone is required. While [1 3 5 7 9], [3 4 5 6 7], and [5 5 5 5 5] all have the same mean, the values are spread more widely in some sets than in others. This spread is measured by the standard deviation:
     sdev = \sqrt{ \frac{\sum_{i=1}^{n} (X_i - \bar{X})^2}{n-1} }

  8. Variance
     The variance is closely related to the standard deviation; it is its square:
     variance = \frac{\sum_{i=1}^{n} (X_i - \bar{X})^2}{n-1}
     The covariance replaces one factor of (X_i - \bar{X}) with the deviation of a second dimension:
     covariance = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{n-1}
     The covariance shows the amount that two dimensions vary with respect to one another.
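The formulas above can be sketched directly in code. This is a minimal illustration using the n−1 denominator from the slides; the sample sets are the ones from the "Analysis of Datasets" slide:

```python
# Sample statistics with the n-1 (sample) denominator.
def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def covariance(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

# The three sets from the previous slide: same mean, different spread.
a, b, c = [1, 3, 5, 7, 9], [3, 4, 5, 6, 7], [5, 5, 5, 5, 5]
print(mean(a), mean(b), mean(c))              # 5.0 5.0 5.0
print(variance(a), variance(b), variance(c))  # 10.0 2.5 0.0
print(covariance(a, a) == variance(a))        # True: cov of a set with itself
```

Note that the covariance of a dimension with itself is exactly its variance, which is why the diagonal of the covariance matrix on the next slide holds the variances.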

  9. The Covariance Matrix
     The covariance matrix collects all of the covariances into one matrix:
     C = \begin{pmatrix} cov(x,x) & cov(x,y) & cov(x,z) \\ cov(y,x) & cov(y,y) & cov(y,z) \\ cov(z,x) & cov(z,y) & cov(z,z) \end{pmatrix}
     The main diagonal of the covariance matrix holds the variance of each dimension. Since cov(a, b) = cov(b, a), the covariance matrix is symmetric. It can be computed by subtracting the average from each dimension and then forming A^T A / (n-1).
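A quick sketch of this construction, using hypothetical random 2-D data (the 1/(n−1) factor is included so the product matches the covariance definition):

```python
import numpy as np

# Build the covariance matrix from mean-centred data.
rng = np.random.default_rng(0)
data = rng.normal(size=(10, 2))      # 10 samples, 2 dimensions
A = data - data.mean(axis=0)         # subtract the average from each dimension
C = A.T @ A / (len(A) - 1)           # covariance matrix

assert np.allclose(C, C.T)                                # cov(a,b) == cov(b,a)
assert np.allclose(np.diag(C), data.var(axis=0, ddof=1))  # diagonal: variances
assert np.allclose(C, np.cov(data.T))                     # matches NumPy's np.cov
```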

  10. Original Data
      x: −0.4  0.6  0.7  1.8  1.4  2.7  3.0  3.0  3.8  4.2
      y:  0.4  0.0  1.0  1.8  2.2  2.2  2.3  3.5  4.0  3.5
      [Figure: graph of the original data]

  11. Adjusting Data (x̄ = 2.1, ȳ = 2.1)
        x      y    x − x̄   y − ȳ
      −0.4    0.4   −2.5    −1.7
       0.6    0.0   −1.5    −2.1
       0.7    1.0   −1.4    −1.2
       1.8    1.8   −0.2    −0.3
       1.4    2.2   −0.6     0.1
       2.7    2.2    0.6     0.1
       3.0    2.3    0.9     0.2
       3.0    3.5    1.0     1.4
       3.8    4.0    1.7     1.9
       4.2    3.5    2.1     1.4
      [Figure: graph of the adjusted data]

  12. Creating the Covariance Matrix
      cov(x_adjusted, y_adjusted) = \begin{pmatrix} 2.2573 & 1.8629 \\ 1.8629 & 1.8240 \end{pmatrix}

  13. Eigenvalues of the Covariance Matrix
      eigenvalues = \begin{pmatrix} 0.1652 & 0 \\ 0 & 3.9161 \end{pmatrix}
      eigenvectors = \begin{pmatrix} 0.6650 & -0.7468 \\ -0.7468 & -0.6650 \end{pmatrix}
      The dominant eigenvector is \begin{pmatrix} -0.7468 \\ -0.6650 \end{pmatrix}.
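The decomposition above can be checked numerically. A quick sketch; `numpy.linalg.eigh` returns eigenvalues in ascending order, and the sign of each eigenvector is arbitrary:

```python
import numpy as np

# The covariance matrix from the worked example.
C = np.array([[2.2573, 1.8629],
              [1.8629, 1.8240]])
eigenvalues, eigenvectors = np.linalg.eigh(C)  # eigh: for symmetric matrices
print(eigenvalues)                             # ≈ [0.1652, 3.9161]
dominant = eigenvectors[:, np.argmax(eigenvalues)]
print(dominant)                                # ≈ ±[0.7468, 0.6650]
```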

  14. Using Eigenvectors as a Basis for the Data
      FinalData = V^T × AdjustedData^T
        v1        v2
      −0.3691    2.9541
       0.5401    2.4921
      −0.0670    1.8298
       0.0366    0.4173
      −0.5360    0.3831
       0.3221   −0.5185
       0.4381   −0.8003
      −0.4276   −1.6601
      −0.2500   −2.5932
       0.3128   −2.5043
      [Figure: data plotted with the eigenvectors as basis]

  15. Reducing Dimensions
      v2: 2.9541, 2.4921, 1.8298, 0.4173, 0.3831, −0.5185, −0.8003, −1.6601, −2.5932, −2.5043
      Data in terms of the dominant eigenvector.

  16. Restoring Data
      (V × v2Data) + OriginalAverage
        x         y
      −0.1274    0.1378
       0.2175    0.4452
       0.7119    0.8858
       1.7665    1.8255
       1.7920    1.8482
       2.4651    2.4481
       2.6755    2.6356
       3.3174    3.2076
       4.0141    3.8284
       3.9476    3.7692
      Restored data in terms of the dominant eigenvector.
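The whole worked example (centre, diagonalise, project onto the dominant eigenvector, restore) can be reproduced in a few lines. This is a sketch; small differences from the slides' tables come from rounding:

```python
import numpy as np

# The original data from the worked example (one (x, y) point per row).
data = np.array([[-0.4, 0.4], [0.6, 0.0], [0.7, 1.0], [1.8, 1.8],
                 [1.4, 2.2], [2.7, 2.2], [3.0, 2.3], [3.0, 3.5],
                 [3.8, 4.0], [4.2, 3.5]])
mean = data.mean(axis=0)
adjusted = data - mean                        # adjusting the data
C = adjusted.T @ adjusted / (len(data) - 1)   # covariance matrix
eigvals, V = np.linalg.eigh(C)
v = V[:, np.argmax(eigvals)]                  # dominant eigenvector

projected = adjusted @ v                      # reducing to one dimension
restored = np.outer(projected, v) + mean      # restoring the data
print(restored.round(4))
```

The restored points all lie on the line through the mean along the dominant eigenvector: one number per point has replaced two, at the cost of the small variation in the minor direction.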

  17. Why PCA
      PCA highlights where there is high variance in a face: it reduces the data to the areas of high variance and recognizes faces based on those areas. The data being compared are the pixels. The images are converted to column vectors, so the data runs across the rows, not down the columns.

  18. Γ
      Each image is read into the computer and converted to a vector Γ_i. The vectors Γ_i become the columns of a matrix: Γ = [Γ_1 Γ_2 ... Γ_M]

  19. Ψ
      Ψ is the average of the training images:
      \Psi = \frac{1}{M} \sum_{i=1}^{M} \Gamma_i

  20. Φ
      Φ centers Γ around the origin. Φ is created with: Φ_i = Γ_i − Ψ
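The three definitions above (Γ, Ψ, Φ) translate directly to code. A minimal sketch with hypothetical sizes, M = 5 images of 64 × 64 = 4096 pixels, and random values standing in for real faces:

```python
import numpy as np

M, N = 5, 64 * 64
rng = np.random.default_rng(1)
Gamma = rng.random((N, M))         # Γ: each column Γ_i is one image as a vector
Psi = Gamma.mean(axis=1)           # Ψ: the average face, (1/M) Σ Γ_i
Phi = Gamma - Psi[:, None]         # Φ: Φ_i = Γ_i − Ψ, centred about the origin

assert np.allclose(Phi.mean(axis=1), 0.0)  # each pixel now averages to zero
```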

  21. A Large Dataset
      The data could be analyzed by solving for the covariance matrix, but...

  22. A Large Dataset (cont.)
      ...recall that the rows of Φ hold the pixels in the same location in each image. Therefore A = Φ^T.

  23. A Large Dataset (cont.)
      ...and so C = A^T A.

  24. A Large Dataset (cont.)
      ...which means C = ΦΦ^T. With one row per pixel, this is an N × N matrix for N pixels, so computing its eigenvectors directly would require a large amount of computation.

  25. Eigenvectors of the Covariance Matrix
      If we start with: \Phi^T \Phi v_i = \lambda_i v_i

  26. Eigenvectors of the Covariance Matrix (cont.)
      ...and multiply on the left by Φ: \Phi \Phi^T (\Phi v_i) = \lambda_i (\Phi v_i)

  27. Eigenvectors of the Covariance Matrix (cont.)
      The eigenvectors of the covariance matrix ΦΦ^T can therefore be found by taking the eigenvectors v_i of the much smaller matrix Φ^T Φ and multiplying them on the left by Φ.
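A sketch of this trick with hypothetical dimensions (N = 4096 pixels, M = 5 images; random data stands in for the centred face vectors Φ): diagonalise the small M × M matrix Φ^T Φ instead of the N × N matrix ΦΦ^T.

```python
import numpy as np

N, M = 4096, 5
rng = np.random.default_rng(2)
Phi = rng.standard_normal((N, M))     # stand-in for centred face vectors

small = Phi.T @ Phi                   # M × M: cheap to diagonalise
lam, v = np.linalg.eigh(small)
U = Phi @ v                           # Φ v_i: eigenvectors of ΦΦ^T
U /= np.linalg.norm(U, axis=0)        # renormalise each column

# Verify the identity from the slide: ΦΦ^T (Φ v_i) = λ_i (Φ v_i)
for i in range(M):
    assert np.allclose(Phi @ (Phi.T @ U[:, i]), lam[i] * U[:, i])
```

This only finds M eigenvectors rather than all N, but those are exactly the directions with nonzero variance in the training set, which is all the recognition step needs.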

  28. Projecting the Training Set into Facespace
      The variation of the individual faces can be found with: \alpha = V^T \Phi

  29. Identifying Faces
      The procedure for identifying a face Γ′ not contained in the training set follows closely the procedure for creating α.

  30. Identifying Faces (cont.)
      First, subtract Ψ: \Phi' = \Gamma' - \Psi

  31. Identifying Faces (cont.)
      Second, project the image into facespace: \alpha' = V^T \Phi'

  32. Identifying Faces (cont.)
      Third, find the column of α closest to α′ using the standard Euclidean distance formula.
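The three steps, together with the earlier projection of the training set, can be sketched end to end. Synthetic random "faces" stand in for real images here; the query is a noisy copy of training face 4, so its nearest column of α should be column 4:

```python
import numpy as np

rng = np.random.default_rng(3)
N, M = 4096, 8
Gamma = rng.random((N, M))                  # training images as columns
Psi = Gamma.mean(axis=1, keepdims=True)     # Ψ: average face
Phi = Gamma - Psi                           # Φ: centred training set

lam, v = np.linalg.eigh(Phi.T @ Phi)        # small-matrix eigenvector trick
V = Phi @ v[:, -3:]                         # keep the 3 dominant eigenfaces
V /= np.linalg.norm(V, axis=0)
alpha = V.T @ Phi                           # training set in facespace

# A new image Γ′: training face 4 plus a little noise.
Gamma_new = Gamma[:, [4]] + 0.01 * rng.standard_normal((N, 1))
Phi_new = Gamma_new - Psi                   # first: subtract Ψ
alpha_new = V.T @ Phi_new                   # second: project into facespace
distances = np.linalg.norm(alpha - alpha_new, axis=0)  # third: distances in α
print("closest training face:", int(np.argmin(distances)))
```

In practice a distance threshold is also needed so that an image far from every training face can be rejected as unknown, rather than matched to its nearest neighbour.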
