Introduction to Machine Learning 10701: Independent Component Analysis


  1. Introduction to Machine Learning 10701 Independent Component Analysis Barnabás Póczos & Aarti Singh

  2. Independent Component Analysis

  3. Independent Component Analysis Model: observations (mixtures), original signals, ICA-estimated signals.

  4. Independent Component Analysis Model. We observe the mixtures x = As; we want to recover the original sources s. Goal: estimate an unmixing matrix W ≈ A⁻¹ so that y = Wx ≈ s (up to permutation and scaling of the components); a toy numerical version of this model is sketched below.
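
A minimal numerical sketch of this mixing model (not from the slides; the signals, mixing matrix, and variable names are illustrative assumptions):

```python
# Simulate the noiseless linear ICA model x = A s with two independent,
# non-Gaussian sources (a sine wave and a square wave).
import numpy as np

t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(3 * t), np.sign(np.sin(5 * t))]   # sources, shape (n_samples, 2)

A = np.array([[1.0, 0.5],                          # unknown mixing matrix
              [0.7, 1.0]])
X = S @ A.T                                        # observed mixtures, x = A s

# ICA's goal: from X alone, estimate W ~ A^{-1} so that Y = X @ W.T recovers S
# up to permutation and scaling of the components.
```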

  5. The Cocktail Party Problem, solving with PCA. Sources s(t), mixing x(t) = As(t), observations x(t), PCA estimation y(t) = Wx(t).

  6. The Cocktail Party Problem, solving with ICA. Sources s(t), mixing x(t) = As(t), observations x(t), ICA estimation y(t) = Wx(t).
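
A minimal sketch comparing the two approaches on the toy mixtures X from the sketch above (the use of scikit-learn is an assumption; the slides do not prescribe a library):

```python
# PCA only decorrelates the mixtures; ICA actually unmixes them.
from sklearn.decomposition import FastICA, PCA

ica = FastICA(n_components=2, random_state=0)
S_ica = ica.fit_transform(X)      # ICA estimate of the sources (up to order and scale)

pca = PCA(n_components=2)
S_pca = pca.fit_transform(X)      # PCA components remain mixtures of the sources

# Plotting S, X, S_ica, and S_pca side by side shows FastICA recovering the sine
# and square waves, while the PCA components are still superpositions of both.
```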

  7. ICA vs PCA, Similarities. Both perform linear transformations and can be written as matrix factorizations. PCA: low-rank matrix factorization X ≈ US (M < N components) for compression; the columns of U are the PCA vectors. ICA: full-rank matrix factorization X = AS (M = N) to remove dependency among the rows of S; the columns of A are the ICA vectors.

  8. ICA vs PCA, Differences.
     - PCA: X = US with UᵀU = I; ICA: X = AS with A invertible.
     - PCA does compression (M < N); ICA does not compress (same number of features, M = N).
     - PCA removes only correlations, not higher-order dependence; ICA removes correlations and higher-order dependence.
     - PCA: some components are more important than others (based on eigenvalues); ICA: the components are equally important.

  9. ICA vs PCA, Note: PCA vectors are orthogonal; ICA vectors are not orthogonal.

  10. ICA vs PCA (illustration).

  11. ICA basis vectors extracted from natural images: Gabor wavelets, edge detection, receptive fields of V1 cells, deep neural networks.

  12. PCA basis vectors extracted from natural images.

  13. Some ICA Applications.
     Static: image denoising; microarray data processing; decomposing the spectra of galaxies; face recognition; facial expression recognition; feature extraction; clustering; classification; deep neural networks.
     Temporal: medical signal processing (fMRI, ECG, EEG); brain-computer interfaces; modeling of the hippocampus and place cells; modeling of the visual cortex; time series analysis; financial applications; blind deconvolution.

  14. ICA Application: Removing Artifacts from EEG.
     - EEG is like a neural cocktail party.
     - EEG activity is severely contaminated by eye movements, blinks, muscle activity, heart (ECG) artifacts, vessel pulse, electrode noise, and line noise (60 Hz alternating current).
     - ICA can improve the signal: it effectively detects, separates, and removes activity in EEG records coming from a wide variety of artifactual sources (Jung, Makeig, Bell, and Sejnowski).
     - The ICA weights (mixing matrix) help locate the sources.
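
A hedged sketch of this workflow using the open-source MNE-Python toolbox (the library choice, file name, and parameter values are assumptions and are not named on the slides; find_bads_eog also assumes the recording contains an EOG channel):

```python
# ICA-based removal of ocular artifacts from a raw EEG/MEG recording.
import mne
from mne.preprocessing import ICA

raw = mne.io.read_raw_fif("subject_raw.fif", preload=True)  # hypothetical file
raw.filter(l_freq=1.0, h_freq=None)    # high-pass; ICA is sensitive to slow drifts

ica = ICA(n_components=20, random_state=0)
ica.fit(raw)

# Flag components that correlate with eye movements and blinks (needs an EOG channel).
eog_indices, eog_scores = ica.find_bads_eog(raw)
ica.exclude = eog_indices

raw_clean = ica.apply(raw.copy())      # reconstruct the signal without those components
```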

  15. ICA Application: Removing Artifacts from EEG (figure from Jung et al.).

  16. Removing Artifacts from EEG (figure from Jung et al.).

  17. ICA for Image Denoising: original, noisy, Wiener-filtered, median-filtered, and ICA-denoised images (Hoyer, Hyvärinen).

  18. ICA for Motion Style Components. A method for analysis and synthesis of human motion from motion-capture data that provides perceptually meaningful "style" components. 109 markers (327-dimensional data). Motion capture ⇒ data matrix. Goal: find motion style components. ICA ⇒ 6 independent components (emotion, content, ...) (Mori & Hoshino 2002; Shapiro et al. 2006; Cao et al. 2003).

  19. Motion examples: walk; sneaky; walk with sneaky; sneaky with walk.

  20. ICA Theory

  21. Statistical (in)dependence: definitions of independence, Shannon entropy, KL divergence, and mutual information.
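
The slide's formulas did not survive extraction; the standard definitions it refers to, for a random vector y with density p(y) and components y_1, ..., y_N with marginals p_i, are:

$$ \text{Independence:}\quad p(y_1,\dots,y_N) = \prod_{i=1}^{N} p_i(y_i) $$
$$ \text{Shannon (differential) entropy:}\quad H(y) = -\int p(y)\,\log p(y)\,dy $$
$$ \text{KL divergence:}\quad D_{KL}(p\,\|\,q) = \int p(y)\,\log\frac{p(y)}{q(y)}\,dy $$
$$ \text{Mutual information:}\quad I(y_1,\dots,y_N) = D_{KL}\!\Big(p \,\Big\|\, \prod_{i} p_i\Big) = \sum_{i} H(y_i) - H(y) $$

The components are independent if and only if their mutual information is zero.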

  22. Solving the ICA problem with i.i.d. sources

  23. Solving the ICA problem

  24. Whitening (we assume centered data, E[x] = 0).

  25. Whitening, continued.

  26. Whitening solves half of the ICA problem. Note: after whitening, the remaining mixing matrix is orthogonal, and an N-by-N orthogonal matrix has only N(N-1)/2 free parameters, roughly half of the N² parameters of an arbitrary matrix ⇒ whitening solves half of the ICA problem. (Illustration: original, mixed, and whitened data.)
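
A minimal numerical sketch of whitening via the eigendecomposition of the covariance matrix (not from the slides; it reuses the toy mixtures X from the earlier sketch):

```python
# Whitening: find V = C^{-1/2} so that z = V x has identity covariance.
import numpy as np

def whiten(X):
    """X: (n_samples, n_features). Returns whitened data Z and the whitening matrix V."""
    Xc = X - X.mean(axis=0)                    # center the data, E[x] = 0
    C = np.cov(Xc, rowvar=False)               # sample covariance E[x x^T]
    eigvals, E = np.linalg.eigh(C)             # C = E diag(eigvals) E^T
    V = E @ np.diag(eigvals ** -0.5) @ E.T     # whitening matrix C^{-1/2}
    Z = Xc @ V.T                               # whitened data, cov(Z) ~ I
    return Z, V

Z, V = whiten(X)
print(np.round(np.cov(Z, rowvar=False), 2))    # approximately the identity matrix
```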

  27. Solving ICA. ICA task: given x, find y (the estimate of s) and W (the estimate of A⁻¹). ICA solution, y = Wx:
     - Remove the mean, E[x] = 0.
     - Whiten, E[xxᵀ] = I.
     - Find an orthogonal W optimizing an objective function, e.g. by a sequence of 2-D Jacobi (Givens) rotations.
     (Illustration: original, mixed, whitened, and rotated (demixed) data.)

  28. Optimization Using Jacobi Rotation Matrices: each rotation acts only on a pair of coordinates (p, q).
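
The rotation matrix shown on the slide did not survive extraction; the standard form it refers to is a Jacobi (Givens) rotation R(p, q, θ), which equals the identity except for four entries,

$$ R_{pp} = R_{qq} = \cos\theta, \qquad R_{pq} = -\sin\theta, \qquad R_{qp} = \sin\theta, $$

so it rotates only the (p, q) coordinate plane; the orthogonal demixing matrix W is built up as a product of such rotations, each optimized over the single angle θ.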

  29. ICA Cost Functions: Lemma (proof: homework) and the resulting form of the cost function.

  30. ICA Cost Functions, continued. The covariance is fixed at I. Which distribution has the largest entropy for a fixed covariance? The Gaussian ⇒ to separate the sources, move away from the normal distribution.
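
The missing equations on slides 29-30 can be filled in from the definitions above; for whitened data and an orthogonal W, the standard argument is

$$ I(y_1,\dots,y_N) = \sum_{i} H(y_i) - H(y), \qquad y = Wx, \quad H(y) = H(x) + \log|\det W| = H(x), $$

so minimizing the mutual information of the estimated components is equivalent to minimizing $\sum_i H(y_i)$. Each $y_i$ has fixed (unit) variance, and the Gaussian has the largest entropy among all densities with a given variance, so the optimum is reached by making each component as non-Gaussian as possible.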

  31. Central Limit Theorem: the (normalized) sum of independent random variables converges to the normal distribution, so a mixture of sources is more Gaussian than the sources themselves ⇒ for separation, move as far from the normal distribution as possible ⇒ negentropy or |kurtosis| maximization. (Figures from Ata Kaban.)

  32. ICA Algorithms

  33. Maximum Likelihood ICA Algorithm, David J. C. MacKay (1997); the parameters to estimate are the rows of W.

  34. Maximum Likelihood ICA Algorithm, continued.
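
The update equations on slides 33-34 did not survive extraction; the standard maximum-likelihood derivation they follow (as in MacKay, 1997) is, for one observation x and assumed source densities p_i,

$$ \log p(x \mid W) = \log|\det W| + \sum_{i=1}^{N} \log p_i(w_i^\top x), \qquad \frac{\partial \log p(x \mid W)}{\partial W} = (W^\top)^{-1} + z\,x^\top, $$

with $z_i = \frac{d}{da_i}\log p_i(a_i)$ and $a = Wx$. Multiplying the gradient by $W^\top W$ (MacKay's covariant, natural-gradient form) gives the update $\Delta W \propto (I + z\,a^\top)\,W$, which avoids the matrix inverse.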

  35. ICA algorithm based on kurtosis maximization. Kurtosis is the 4th-order cumulant; it measures the distance from normality and the degree of peakedness of a distribution.
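
For reference (a standard formula; the slide's equation did not survive extraction):

$$ \operatorname{kurt}(y) = E[y^4] - 3\,\big(E[y^2]\big)^2, $$

which is zero for a Gaussian; for whitened, zero-mean data with $E[y^2] = 1$ it reduces to $E[y^4] - 3$, and the algorithm maximizes $|\operatorname{kurt}(w^\top z)|$ over unit-norm vectors w.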

  36. The FastICA algorithm (Hyvärinen), probably the most famous ICA algorithm. The constrained optimization (with Lagrange multiplier λ) leads to a fixed-point equation that is solved by the Newton-Raphson method.

  37. Newton method for finding a root

  38. Newton Method for Finding a Root. Goal: find x with f(x) = 0. Linear approximation (1st-order Taylor expansion): f(x + Δx) ≈ f(x) + f'(x)Δx. Therefore, setting the approximation to zero gives Δx = −f(x)/f'(x), i.e. the update x ← x − f(x)/f'(x).

  39. Illustration of Newton's method. Goal: finding a root; in the next step we linearize at the current point x.

  40. Example: Finding a Root, http://en.wikipedia.org/wiki/Newton%27s_method
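
A minimal sketch of the iteration from slide 38 (not from the slides themselves; the example function and starting point are illustrative):

```python
# Newton's method for a root of a 1-D function: x <- x - f(x) / f'(x).
def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)     # solve the linearized equation f(x) + f'(x)*step = 0
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: the positive root of f(x) = x^2 - 2, i.e. sqrt(2).
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
print(root)   # ~1.4142135623...
```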

  41. Newton Method for Finding a Root, multivariate case: for F: ℝᴺ → ℝᴺ, the same linearization gives the update x ← x − J_F(x)⁻¹ F(x), where J_F is the Jacobian (use the pseudo-inverse if the Jacobian has no inverse).

  42. Newton method for FastICA

  43. The FastICA algorithm (Hyvärinen): solve the fixed-point equation, noting the form of the derivative of F.

  44. The FastICA algorithm (Hyvärinen), continued: the Jacobian matrix becomes (approximately) diagonal and can easily be inverted, which yields the fixed-point update reconstructed below.
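
The equations on slides 43-44 did not survive extraction; the standard FastICA derivation (Hyvärinen and Oja) that this Newton argument follows is

$$ F(w) = E[\,z\,g(w^\top z)\,] - \beta w = 0, \qquad J_F(w) = E[\,z z^\top g'(w^\top z)\,] - \beta I \approx \big(E[g'(w^\top z)] - \beta\big)\,I, $$

using $E[zz^\top] = I$ for whitened z, so the Jacobian is approximately diagonal and trivially inverted. After the Newton step and rescaling, the one-unit fixed-point update is

$$ w^{+} = E[\,z\,g(w^\top z)\,] - E[\,g'(w^\top z)\,]\,w, \qquad w \leftarrow w^{+}/\|w^{+}\|. $$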

  45. Other Nonlinearities

  46. Other Nonlinearities: the same Newton step applies with other choices of g, e.g. g(u) = tanh(au) or g(u) = u·exp(−u²/2); the resulting algorithm is sketched below.
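
A minimal sketch of the one-unit fixed-point iteration with g = tanh (a common choice; the whitened matrix Z comes from the whitening sketch above, and the function name is illustrative):

```python
# One-unit FastICA: w+ = E[z g(w^T z)] - E[g'(w^T z)] w, then renormalize.
import numpy as np

def fastica_one_unit(Z, max_iter=200, tol=1e-8, seed=0):
    """Z: (n_samples, n_features) whitened data. Returns a unit vector w."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(Z.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(max_iter):
        y = Z @ w                                    # current component estimates
        g, g_prime = np.tanh(y), 1.0 - np.tanh(y) ** 2
        w_new = (Z * g[:, None]).mean(axis=0) - g_prime.mean() * w
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1.0) < tol  # convergence up to sign
        w = w_new
        if converged:
            break
    return w

w = fastica_one_unit(Z)
s_est = Z @ w       # one recovered independent component (up to sign and scale)
```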

  47. FastICA for several units
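
Slide 47's formulas did not survive extraction; the standard way to extend the one-unit update to several components is deflation (Gram-Schmidt decorrelation) after each iteration,

$$ w_{p+1} \leftarrow w_{p+1} - \sum_{j=1}^{p} (w_{p+1}^\top w_j)\,w_j, \qquad w_{p+1} \leftarrow w_{p+1}/\|w_{p+1}\|, $$

which keeps each new component orthogonal to those already found; alternatively, all rows of W can be updated in parallel and symmetrically decorrelated via $W \leftarrow (W W^\top)^{-1/2}\,W$.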
