

  1. IN5490 – Advanced Topics in Artificial Intelligence for Intelligent Systems Md. Zia Uddin 16/10/2018

  2. Principal Components Analysis

  3. Principal Component Analysis (PCA) PCA is a way of identifying patterns in data and expressing the data in such a way as to highlight their similarities and differences. It is a powerful tool for analyzing data. Main advantage: compression of the data by reducing the number of dimensions, without much loss of information. This technique is used in image compression, as we will see later.

  4. Principal Component Analysis (PCA) [Figure: principal components PC1 and PC2 plotted against original variables A and B.] • Orthogonal directions of greatest variance in data • Projections along PC1 discriminate the data most along any one axis

  5. Principal Components Analysis (PCA)

  6. Principal Component Analysis (PCA) • The first principal component is the direction of greatest variability (variance) in the data • The second is the next orthogonal (uncorrelated) direction of greatest variability • So first remove all the variability along the first component, and then find the next direction of greatest variability • And so on …

  7. Principal Component Analysis (PCA)

  8. Principal Components

  9. Principal Components

  10. Reconstruction Using PCA

  11. Silhouettes

  12. Eigenvalues [Figure: the top 150 eigenvalues of the corresponding eigenvectors.]

  13. Principal Components

  14. Principal Component Analysis (PCA) Steps 1) Convert each image to a row vector 2) Calculate the mean 3) Subtract the mean 4) Calculate covariance matrix 5) Eigenvalue decomposition 6) Choose top eigenvectors based on eigenvalues 7) Project each image vector to the PCA space
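A minimal NumPy sketch of these seven steps (function and variable names are illustrative; for high-resolution images, practical implementations use the Gram-matrix "snapshot" trick or an SVD instead of forming the full D×D covariance matrix):

```python
import numpy as np

def pca_project(images, n_components):
    # 1) Convert each image to a row vector
    X = images.reshape(len(images), -1).astype(float)   # (N, D), D = H*W
    # 2) Calculate the mean
    mean = X.mean(axis=0)
    # 3) Subtract the mean
    Xc = X - mean
    # 4) Calculate the covariance matrix (D x D; expensive for large images)
    cov = np.cov(Xc, rowvar=False)
    # 5) Eigenvalue decomposition (eigh, since cov is symmetric)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # 6) Choose the top eigenvectors based on eigenvalues
    order = np.argsort(eigvals)[::-1][:n_components]
    W = eigvecs[:, order]                               # (D, n_components)
    # 7) Project each image vector into the PCA space
    return Xc @ W, mean, W
```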

  15. Linear Discriminant Analysis

  16. Linear Discriminant Analysis (LDA) ▪ LDA seeks directions along which the classes are best separated. ▪ It takes into consideration not only the within-class scatter S_W but also the between-class scatter S_B: S_B = Σ_{i=1}^{c} (m_i − m)(m_i − m)^T and S_W = Σ_{i=1}^{c} Σ_{m_k ∈ C_i} (m_k − m_i)(m_k − m_i)^T. ▪ LDA computes a transformation that maximizes the between-class scatter while minimizing the within-class scatter: D_LDA = arg max_D |D^T S_B D| / |D^T S_W D|. ▪ It can be solved by S_W^{−1} S_B D = Λ D, where Λ holds the eigenvalues of S_W^{−1} S_B.
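A hedged NumPy sketch of this formulation (names are illustrative; it follows the slide's unweighted S_B, though some texts weight each class term by its sample count):

```python
import numpy as np

def lda_directions(X, y, n_components):
    """X: (N, D) feature vectors; y: (N,) integer class labels."""
    m = X.mean(axis=0)                         # overall mean m
    D = X.shape[1]
    S_B = np.zeros((D, D))                     # between-class scatter
    S_W = np.zeros((D, D))                     # within-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        m_i = Xc.mean(axis=0)                  # class mean m_i
        d = (m_i - m)[:, None]
        S_B += d @ d.T                         # (m_i - m)(m_i - m)^T
        S_W += (Xc - m_i).T @ (Xc - m_i)       # sum of (m_k - m_i)(m_k - m_i)^T
    # Solve S_W^{-1} S_B d = lambda d; pinv guards against a singular S_W
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(S_W) @ S_B)
    order = np.argsort(eigvals.real)[::-1][:n_components]
    return eigvecs[:, order].real              # columns of D_LDA
```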

  17. Linear Discriminant Analysis (LDA) [Figure: 3-D plot (LDC1, LDC2, LDC3) of LDA applied to the binary silhouettes of different activities: walking, running, skipping, right-hand waving, and both-hand waving.]

  18. Independent Components Analysis

  19. What is ICA? "Independent component analysis (ICA) is a method for finding underlying factors or components from multivariate (multi-dimensional) statistical data. What distinguishes ICA from other methods is that it looks for components that are both statistically independent and non-Gaussian." A. Hyvärinen, J. Karhunen, E. Oja, Independent Component Analysis

  20. ICA Blind Signal Separation (BSS) or Independent Component Analysis (ICA) is the identification and separation of mixtures of sources with little prior information. • Applications include: audio processing, medical data, finance, array processing (beamforming), coding, … and most applications where Factor Analysis and PCA are currently used. • While PCA seeks directions that represent the data best in a least-squares (Σ ||x_0 − x||^2) sense, ICA seeks directions that are maximally independent from each other. • Often used on time series, e.g. separation of multiple targets.

  21. The simple "Cocktail Party" Problem [Figure: sources s_1, s_2 pass through mixing matrix A to give observations x_1, x_2.] x = As, with n sources and m = n observations.

  22. ICA [Figure: observed mixed signals alongside the original source signals recovered by ICA (time series V1–V4 over 250 samples).]

  23. Motivation Two independent sources, mixed at two microphones: x_1(t) = a_11 s_1 + a_12 s_2 and x_2(t) = a_21 s_1 + a_22 s_2, where the coefficients a_ij depend on the distances of the microphones from the speakers.

  24. ICA Model • Use a statistical "latent variables" system • Random variables s_k instead of time signals • x_j = a_j1 s_1 + a_j2 s_2 + … + a_jn s_n, for all j, i.e. x = As • The ICs s are latent variables and are unknown, AND the mixing matrix A is also unknown • Task: estimate A and s using only the observable random vector x • Let's assume that the number of ICs equals the number of observable mixtures and that A is square and invertible • So after estimating A, we can compute W = A^{−1} and hence s = Wx = A^{−1}x
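A toy illustration of the model x = As with two synthetic sources (all signals and the matrix A here are made up for the demo):

```python
import numpy as np

t = np.linspace(0, 1, 500)
s1 = np.sin(2 * np.pi * 5 * t)               # source 1: sinusoid
s2 = np.sign(np.sin(2 * np.pi * 3 * t))      # source 2: square wave
S = np.vstack([s1, s2])                      # n = 2 sources, T = 500

A = np.array([[0.60, 0.40],
              [0.45, 0.55]])                 # mixing matrix (unknown in practice)
X = A @ S                                    # observations: x = As

# If A were known, s would follow immediately from W = A^{-1}:
S_hat = np.linalg.inv(A) @ X                 # s = Wx, equals S up to round-off
```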

  25. Illustration of ICA with 2 signals x_1(t) = a_11 s_1(t) + a_12 s_2(t), x_2(t) = a_21 s_1(t) + a_22 s_2(t), t = 1 … T. [Figure: original sources s and mixed signals x. Step 1: sphering; Step 2: rotation.]

  26. ICA Fixed Point Algorithm Input: X. Random initialization of W. Iterate until convergence: S = W^T X; W = X g(S)^T; W = W (W^T W)^{−1/2}. Output: W, S.
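A compact sketch of this fixed-point scheme, written as symmetric FastICA with g = tanh (this adds whitening, the standard g′ correction term, and SVD-based decorrelation that production implementations also use; treat it as an assumption-laden sketch rather than the slide's exact algorithm):

```python
import numpy as np

def fastica(X, n_iter=200, tol=1e-6):
    """X: (n, T) centred observations; returns the un-mixing W and sources S."""
    # Whitening via the eigen-decomposition of the covariance matrix
    d, E = np.linalg.eigh(np.cov(X))
    X = (E * d ** -0.5) @ E.T @ X              # cov(X) is now ~ identity
    n, T = X.shape
    W = np.linalg.qr(np.random.default_rng(0).normal(size=(n, n)))[0]
    for _ in range(n_iter):
        S = W @ X                              # current source estimates
        G = np.tanh(S)                         # nonlinearity g
        W_new = (G @ X.T) / T - np.diag((1 - G ** 2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W_new)        # symmetric decorrelation:
        W_new = U @ Vt                         # W <- (W W^T)^{-1/2} W
        if np.max(np.abs(np.abs(np.diag(W_new @ W.T)) - 1)) < tol:
            return W_new, W_new @ X            # converged
        W = W_new
    return W, W @ X
```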

  27. ICA Model Basic steps of ICA • Collect the data matrix • Whitening, using the eigenvectors and eigenvalue matrix of the covariance matrix • Initialize the un-mixing matrix W randomly • Apply the iterative procedure with each vector of the un-mixing matrix W on Y to approximate the corresponding basis S until it converges. Enhanced ICA ▪ Apply PCA first ▪ Apply ICA on the PCs (see the sketch below) ▪ Project the silhouette features onto the IC feature space
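The "Enhanced ICA" recipe above could look like this with scikit-learn (the array names and the choice of 50 components are illustrative assumptions):

```python
from sklearn.decomposition import PCA, FastICA

# silhouettes: (N, H*W) row-vectorised binary silhouettes (assumed to exist)
pca = PCA(n_components=50)                      # step 1: apply PCA first
pcs = pca.fit_transform(silhouettes)
ica = FastICA(n_components=50, random_state=0)  # step 2: ICA on the PCs
ic_features = ica.fit_transform(pcs)
# step 3: project new silhouettes into the IC feature space
new_features = ica.transform(pca.transform(new_silhouettes))
```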

  28. ICs

  29. ICA on Binary Silhouettes ▪ ICA looks for statistically independent basis images. ▪ ICA focuses on local feature information. [Figure: ten ICs from all activity silhouettes, alongside the activity binary silhouettes.]

  30. Optical Flows Displacement (u, v): I(x, y, t) = I(x+u, y+v, t−1). How to estimate pixel motion from image I_{t−1} to image I_t? Solve the pixel correspondence problem: given a pixel in I_{t−1}, look for nearby pixels of the same color in I_t. Key assumption, color constancy: a point in I_{t−1} looks the same in I_t. For grayscale images, this is brightness constancy.
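One common way to estimate this per-pixel motion is a dense flow algorithm; below is a sketch using OpenCV's Farnebäck method (the slide does not name a specific estimator, and the frame variables are assumed to be grayscale uint8 arrays):

```python
import cv2

# prev_frame, next_frame: consecutive grayscale activity frames (H, W)
# Positional arguments after the frames: flow=None, pyr_scale, levels,
# winsize, iterations, poly_n, poly_sigma, flags (typical default values).
flow = cv2.calcOpticalFlowFarneback(prev_frame, next_frame, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)
u, v = flow[..., 0], flow[..., 1]   # displacement fields u(x, y), v(x, y)
```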

  31. Optical Flow Features [Figure: (a) 3-D LDA plot of the optical flows of different activities (walking, running, skipping, sitting down, standing up); (b) sample optical flows from walking and running frames.] Once the optical flows of the silhouettes from two consecutive activity frames are obtained, the flow region is divided into 256 sub-blocks, each of size 4×4, and the average flow vector of each sub-block is computed as K_p = (1/n) Σ_{i,j} (px_{i,j}, py_{i,j}), where n = 16 and 1 ≤ p ≤ 256. The flows are augmented and represented as [K_1, K_2, …, K_256]. Finally, the averaged optical flow features are further processed with PCA and LDA.
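A sketch of the sub-block averaging, assuming a 64×64 flow field (256 blocks of 4×4 pixels, so n = 16, consistent with the numbers above):

```python
import numpy as np

def block_average_flow(u, v, block=4):
    """u, v: (64, 64) horizontal/vertical flow components."""
    gh, gw = u.shape[0] // block, u.shape[1] // block          # 16 x 16 grid
    uu = u.reshape(gh, block, gw, block).mean(axis=(1, 3))     # mean px per block
    vv = v.reshape(gh, block, gw, block).mean(axis=(1, 3))     # mean py per block
    # each K_p is the 2-vector (mean px, mean py) of one 4x4 sub-block;
    # concatenate them as [K_1, K_2, ..., K_256]
    return np.stack([uu, vv], axis=-1).reshape(-1)             # length 512
```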

  32. Local Binary Pattern (LBP) ▪ LBP features are local binary patterns based on the intensity values of the pixels surrounding a center pixel. The LBP pattern at a given pixel (x_c, y_c) can be represented as an ordered set of binary comparisons: LBP(x_c, y_c) = Σ_{i=0}^{7} f(g_i − g_e) 2^i, where f(a) = 1 if a ≥ 0 and 0 otherwise, ▪ g_e is the intensity of the given pixel and g_i are the intensities of the surrounding pixels.

  33. Local Binary Pattern (LBP) Example of the LBP operator: the 3×3 neighborhood [43 85 26; 53 41 101; 60 45 25] is thresholded against the center value 41, giving the bits [1 1 0; 1 · 1; 1 1 0]; read in order, they form the binary code 11011110 = 222.
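The worked example can be reproduced in a few lines of NumPy (bits are read row-major here; other LBP variants read the ring clockwise, which changes the code value but not the idea):

```python
import numpy as np

patch = np.array([[43, 85, 26],
                  [53, 41, 101],
                  [60, 45, 25]])
centre = patch[1, 1]                            # g_e = 41
neighbours = np.delete(patch.flatten(), 4)      # the 8 surrounding pixels g_i
bits = (neighbours >= centre).astype(int)       # f(g_i - g_e)
code = int("".join(map(str, bits)), 2)
print(bits, code)                               # [1 1 0 1 1 1 1 0] 222
```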

  34. Local Binary Pattern (LBP) LBP Features: a depth activity image is divided into small regions, and the regions' LBP histograms are concatenated to represent the features for one image ▪ To reduce the high dimensionality, PCA is applied to the LBP features

  35. Local Directional Pattern (LDP) ▪ The Local Directional Pattern (LDP) assigns an eight-bit binary code to each pixel of an input depth image. ▪ The Kirsch edge detector detects edges considering all eight neighbors. ▪ Given a central pixel in the image, the eight directional edge response values {m_k}, k = 0, 1, …, 7, are computed with Kirsch masks M_k in eight different orientations centered on its position.
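A hedged sketch of the LDP responses: build the eight Kirsch masks by rotating the outer ring of the base mask and convolve. The final bit-selection step, keeping the k most prominent directions (k = 3 is the usual LDP convention), is an assumption; the slide stops at the responses themselves.

```python
import numpy as np
from scipy.ndimage import convolve

def kirsch_masks():
    ring = [5, 5, 5, -3, -3, -3, -3, -3]                   # outer ring values
    idx = [(0, 0), (0, 1), (0, 2), (1, 2),
           (2, 2), (2, 1), (2, 0), (1, 0)]                 # ring, clockwise
    masks = []
    for r in range(8):                                     # M_0 .. M_7
        m = np.zeros((3, 3))                               # center stays 0
        for k, (i, j) in enumerate(idx):
            m[i, j] = ring[(k - r) % 8]
        masks.append(m)
    return masks

def ldp_code(image, k=3):
    # Eight directional edge responses m_k, one per Kirsch mask
    resp = np.stack([convolve(image.astype(float), m) for m in kirsch_masks()])
    order = np.argsort(np.abs(resp), axis=0)               # rank directions
    code = np.zeros(image.shape, dtype=np.uint8)
    for bit in order[-k:]:                                 # top-k directions
        code |= (1 << bit).astype(np.uint8)                # set the k-th bits
    return code
```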
