Introduction to Machine Learning CMU-10701
- 20. Independent Component Analysis
Barnabás Póczos

Contents:
ICA model
ICA applications
ICA generalizations
ICA theory

Independent Component Analysis
Goal: recover the original independent source signals from the observed mixtures.
Figure: observations (mixtures) and the ICA estimated signals.
Model: x = A s, where s are independent sources and A is an unknown mixing matrix.
We observe: the mixtures x.
We want: the sources s (and A).
Goal: recover s from x alone.
In matrix notation (X is N × M: N observed signals, M samples): PCA writes X = US, ICA writes X = AS.
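A minimal sketch of this generative model (not from the slides; the signal shapes and the mixing matrix below are illustrative assumptions): two independent, non-Gaussian sources are mixed by an unknown matrix A, and only the mixtures X are observed.

```python
import numpy as np

t = np.linspace(0, 8, 2000)                # M = 2000 samples

# Two independent, non-Gaussian sources: the rows of the N x M matrix S
S = np.vstack([np.sign(np.sin(3 * t)),     # square wave
               np.sin(5 * t)])             # sinusoid

# Unknown N x N mixing matrix A (chosen arbitrarily for this example)
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])

# Observed mixtures: X = A S (each column is one observation x = A s)
X = A @ S
print(X.shape)                             # (2, 2000); we only ever see X
```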
PCA vs. ICA:
PCA: X = US with U^T U = I; ICA: X = AS.
PCA does compression; ICA does not.
PCA only removes correlations, not higher-order dependence; ICA removes correlations and higher-order dependence.
PCA: some components are more important than others (based on eigenvalues); ICA: components are equally important.
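To illustrate the contrast, a hedged sketch using scikit-learn's PCA and FastICA on the toy mixtures from the previous snippet (recreated here so the block runs on its own): PCA merely decorrelates the data, while ICA recovers the sources up to permutation and scaling.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# Recreate the toy mixtures from the previous sketch
t = np.linspace(0, 8, 2000)
S = np.vstack([np.sign(np.sin(3 * t)), np.sin(5 * t)])   # independent sources
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S               # observed mixtures

# PCA decorrelates (and can rank/compress components by variance) ...
Y_pca = PCA(n_components=2, whiten=True).fit_transform(X.T)

# ... while ICA also removes higher-order dependence and recovers the
# sources up to permutation and scaling.
Y_ica = FastICA(n_components=2, random_state=0).fit_transform(X.T)

# Cross-correlation of the ICA estimates with the true sources is close
# to a signed permutation matrix; for PCA it generally is not.
print(np.round(np.corrcoef(np.vstack([S, Y_ica.T]))[:2, 2:], 2))
```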
ICA applications:
ECG, EEG
place cells
EEG recordings contain contributions from a wide variety of artifactual sources; ICA can be used to separate these artifacts from the brain signals.
(Jung, Makeig, Bell, and Sejnowski)
Figures from Jung.
(Hoyer, Hyvärinen)
(Mori & Hoshino 2002, Shapiro et al 2006, Cao et al 2003)
Proof: Homework
Proof: Homework
Proof: Homework
Figure: mixed and whitened data.
Remove the mean: E[x] = 0
Whitening: E[xx^T] = I
Find an orthogonal W optimizing an objective function
ICA task: given x, find y (the estimate of s) and W (the estimate of A^-1).
ICA solution: y = Wx
Figure: mixed, whitened, and rotated (demixed) data.
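A minimal sketch of these three steps on the same toy mixtures (the kurtosis-based fixed-point rotation below is one common choice of objective, stated here as an assumption rather than the exact procedure from the slides):

```python
import numpy as np

# Toy mixtures, as in the earlier sketches
t = np.linspace(0, 8, 2000)
S = np.vstack([np.sign(np.sin(3 * t)), np.sin(5 * t)])
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S

# 1) Remove the mean: E[x] = 0
Xc = X - X.mean(axis=1, keepdims=True)

# 2) Whitening: E[xx^T] = I
d, E = np.linalg.eigh(np.cov(Xc))
Z = (E @ np.diag(d ** -0.5) @ E.T) @ Xc

# 3) Find an orthogonal W by a kurtosis-based fixed-point iteration
rng = np.random.default_rng(0)
W, _ = np.linalg.qr(rng.standard_normal((2, 2)))
for _ in range(100):
    Y = W @ Z
    # fixed-point step: w <- E[(w^T z)^3 z] - 3 w for every row w of W
    W_new = (Y ** 3) @ Z.T / Z.shape[1] - 3 * W
    # symmetric orthogonalization keeps W orthogonal
    U, _, Vt = np.linalg.svd(W_new)
    W = U @ Vt

Y = W @ Z   # estimated sources y = Wx, up to permutation and sign
```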
However, the Gaussian is the only ‘nice’ distribution that we cannot recover: if the sources are Gaussian, there is no hope of recovery.
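A small numerical illustration of why (a sketch, not from the slides): independent Gaussian sources are rotationally symmetric after whitening, so a rotated (mixed) version has exactly the same joint distribution as the original, and no objective based on the data can identify the rotation.

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.standard_normal((2, 100_000))        # two independent Gaussian sources

theta = 0.7                                  # an arbitrary mixing rotation
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = R @ S                                    # mixtures

# Both S and X are zero-mean, identity-covariance, and jointly Gaussian,
# hence identically distributed: the data cannot reveal the rotation R.
print(np.round(np.cov(S), 2))
print(np.round(np.cov(X), 2))
```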
Figs borrowed from Ata Kaban
There are more than 100 different ICA algorithms…
Maximum-likelihood ICA, David J.C. MacKay (1997); the rows of W are the learned demixing filters.
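As a hedged sketch of the maximum-likelihood view (assuming a heavy-tailed 1/cosh source prior; the learning rate, iteration count, and toy data are illustrative assumptions, not MacKay's exact algorithm), the natural-gradient update W <- W + lr (I - E[tanh(y) y^T]) W learns the rows of W as demixing filters.

```python
import numpy as np

def ml_ica(X, n_iter=500, lr=0.05):
    """Natural-gradient maximum-likelihood ICA with a 1/cosh source prior.

    X : (n_sources, n_samples) centered mixtures.
    Returns W; its rows are the learned demixing filters and Y = W X
    are the estimated sources.
    """
    n, m = X.shape
    W = np.eye(n)
    for _ in range(n_iter):
        Y = W @ X
        # score function for p(s) ~ 1/cosh(s): -d/dy log p(y) = tanh(y)
        W = W + lr * (np.eye(n) - np.tanh(Y) @ Y.T / m) @ W
    return W

# Example on super-Gaussian (Laplacian) sources
rng = np.random.default_rng(0)
S = rng.laplace(size=(2, 5000))
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S
X = X - X.mean(axis=1, keepdims=True)
Y = ml_ica(X) @ X                            # estimated sources
```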
Probably the most famous ICA algorithm