Clustering & Unsupervised Learning
Ken Kreutz-Delgado (Nuno Vasconcelos)
ECE 175A – Winter 2012 – UCSD
Statistical Learning
Goal: Given a relationship between a feature vector x and a vector y, and iid data samples (x_i, y_i), find an estimate of this relationship, i.e. a function f with f(x) ≈ y.
– Whether to segment depends on the goal: to classify an image as a scene of the wild or of a big city, there is probably no need to segment.
– But to recognize the objects in the image, then you would segment.
– Unfortunately, the segmentation mask is usually not available.
Review: supervised learning
– Model each class with a parametric class-conditional pdf P_{X|Y}(x|i; Theta_i).
– The parameters Theta_i are obtained by maximum likelihood on the class-i training data:
    Theta_i* = argmax_Theta P_{X|Y}(D^(i) | i; Theta)
– For Gaussian classes,
    P_{X|Y}(x|i) = G(x, mu_i, Sigma_i)
                 = (2 pi)^(-d/2) |Sigma_i|^(-1/2) exp( -(1/2) (x - mu_i)^T Sigma_i^(-1) (x - mu_i) )
D^(i) = {x_1^(i), ..., x_{n_i}^(i)} = set of examples from class i

ML estimates for the Gaussian class-conditionals and class priors:
  mu_i = (1/n_i) sum_j x_j^(i)
  Sigma_i = (1/n_i) sum_j (x_j^(i) - mu_i)(x_j^(i) - mu_i)^T
  P_Y(i) = n_i / n
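These per-class ML estimates can be computed in a few lines. A minimal NumPy sketch (the function name and return layout are my own, not from the slides):

```python
import numpy as np

def ml_gaussian_estimates(X, y):
    """ML estimates of per-class Gaussian parameters and class priors.

    X: (n, d) array of feature vectors; y: (n,) array of integer class labels.
    Returns dicts mapping class i -> (mu_i, Sigma_i) and i -> P_Y(i).
    """
    n = len(y)
    params, priors = {}, {}
    for i in np.unique(y):
        Xi = X[y == i]                   # D^(i): examples from class i
        mu = Xi.mean(axis=0)             # mu_i = (1/n_i) sum_j x_j^(i)
        diff = Xi - mu
        Sigma = diff.T @ diff / len(Xi)  # ML (biased) covariance estimate
        params[i] = (mu, Sigma)
        priors[i] = len(Xi) / n          # P_Y(i) = n_i / n
    return params, priors
```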
The Bayes decision rule (BDR):
  i*(x) = argmax_i P_{Y|X}(i|x) = argmax_i P_{X|Y}(x|i) P_Y(i)
For Gaussian classes this becomes
  i*(x) = argmin_i [ d^2(x, mu_i) + alpha_i ]
with
  d^2(x, y) = (x - y)^T Sigma_i^(-1) (x - y)
  alpha_i = log (2 pi)^d |Sigma_i| - 2 log P_Y(i)
If all classes have identity covariance, Sigma_i = I, the BDR assigns x to the nearest class "prototype" or "template" mu_i, corrected by the prior term:
  i*(x) = argmin_i [ ||x - mu_i||^2 - 2 log P_Y(i) ]
In this case the decision boundary between two classes is the hyperplane {x : w^T (x - x_0) = 0} with normal w. x_0 can be any fixed point on the plane; we can choose it to have minimum norm, in which case w and x_0 are then parallel. For two Gaussian classes with identity covariance and equal priors, w = mu_1 - mu_2 and x_0 = (mu_1 + mu_2)/2.
[Figure: data points x_1, x_2, x_3, ..., x_n on either side of a hyperplane with normal w through x_0.]
More generally, with Sigma_i = sigma^2 I:
  i*(x) = argmin_i [ ||x - mu_i||^2 - 2 sigma^2 log P_Y(i) ]
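This isotropic-covariance BDR is easy to sketch in code. A minimal illustration (function name mine; with equal priors it reduces to nearest-prototype classification):

```python
import numpy as np

def bdr_identity_cov(x, mus, priors, sigma2=1.0):
    """BDR when each class is Gaussian with covariance sigma^2 I.

    Picks argmin_i ||x - mu_i||^2 - 2 sigma^2 log P_Y(i); with equal priors
    this is just the nearest class prototype (mean).
    """
    costs = [np.sum((x - mu) ** 2) - 2 * sigma2 * np.log(p)
             for mu, p in zip(mus, priors)]
    return int(np.argmin(costs))
```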
1) Apply the optimal decision rule for the (estimated) class pdfs, creating pseudo-labeled data.
2) Update the pdf estimates by doing parameter estimation within each estimated (pseudo-labeled) class cluster found in step 1.
This yields the K-means algorithm; iterate:
  i*(x_k) = argmin_i ||x_k - mu_i||^2
  mu_i^new = (1/n_i) sum_j x_j^(i)
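The two-step iteration can be sketched directly in NumPy (a minimal sketch, assuming identity covariances and equal priors so the assignment step is nearest-prototype; names are my own):

```python
import numpy as np

def kmeans(X, mu, n_iters=100):
    """K-means. X: (n, d) data; mu: (k, d) initial prototypes."""
    mu = np.asarray(mu, dtype=float)
    for _ in range(n_iters):
        # Step 1: assign each point to its nearest prototype (pseudo-labels)
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)  # (n, k)
        labels = d2.argmin(axis=1)
        # Step 2: re-estimate each prototype as the mean of its cluster
        new_mu = np.array([X[labels == i].mean(axis=0) if np.any(labels == i)
                           else mu[i] for i in range(len(mu))])
        if np.allclose(new_mu, mu):   # converged: assignments stopped changing
            break
        mu = new_mu
    return mu, labels
```

Like all K-means variants, this converges to a local minimum of the distortion, so the result depends on the initial prototypes.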
The number of clusters can be chosen by model selection techniques, e.g. an information criterion, minimum description length.
[Worked example comparing the clustering cost J(k) for different numbers of clusters k: once k reaches the true number of clusters, adding more barely reduces the cost, e.g. J(4) = (1+e) J(2) with e << 1.]
prototypes! With 256 prototypes, each sample is encoded by the index of its nearest prototype (256 indices ~ 1 byte of information).
Recall the BDR:
  i*(x) = argmax_i P_{X|Y}(x|i) P_Y(i)
The mixture model:
  P_X(x) = sum_{c=1}^C pi_c G(x, mu_c, Sigma_c)
where C = # of mixture components, pi_c = cth component "weight", and G(x, mu_c, Sigma_c) = cth "mixture component" = Gaussian pdf.
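Evaluating such a mixture density is straightforward. A minimal sketch, assuming isotropic components Sigma_c = sigma_c^2 I (the function name and that simplification are mine):

```python
import numpy as np

def gmm_pdf(x, pis, mus, sigma2s):
    """P_X(x) = sum_c pi_c G(x, mu_c, sigma_c^2 I) for an isotropic GMM."""
    x = np.asarray(x, dtype=float)
    d = x.size
    total = 0.0
    for pi_c, mu_c, s2 in zip(pis, mus, sigma2s):
        norm = (2 * np.pi * s2) ** (-d / 2)          # Gaussian normalizer
        total += pi_c * norm * np.exp(-np.sum((x - mu_c) ** 2) / (2 * s2))
    return total
```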
(A hard-assignment method like K-means assigns x_k only to a single class via the MAP rule.)
The EM algorithm:
– Start with an initial parameter guess Psi^(0).
– E-step: given current parameters Psi^(i) and observations in D, fill in class assignments
    h_kj^(i) = E{ 1(Z_k = j) | x_k; Psi^(i) }
– M-step: estimate parameters Psi^(i+1) from the data weighted by the assignments h_kj.
– Iterate the E-step and M-step until convergence.
E-step (soft assignments):
  h_kj^(i) = P_{Z|X}(j | x_k; Psi^(i))
           = pi_j^(i) G(x_k, mu_j^(i), Sigma_j^(i)) / sum_{c=1}^C pi_c^(i) G(x_k, mu_c^(i), Sigma_c^(i))
M-step (component weights):
  pi_j^(i+1) = (1/n) sum_{k=1}^n h_kj^(i),  with sum_{j=1}^C pi_j = 1
M-step (means and variances):
  mu_j^(i+1) = sum_k h_kj^(i) x_k / sum_k h_kj^(i)
  (sigma_j^2)^(i+1) = sum_k h_kj^(i) ||x_k - mu_j^(i+1)||^2 / sum_k h_kj^(i)
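The E- and M-step updates above can be put together in a short sketch. This assumes isotropic components (sigma_j^2 I) to keep it compact; names and that simplification are mine, not the slides':

```python
import numpy as np

def em_gmm(X, pis, mus, sigma2s, n_iters=50):
    """EM for a mixture of isotropic Gaussians.

    X: (n, d) data; pis: (C,) weights; mus: (C, d) means; sigma2s: (C,) variances.
    """
    n, d = X.shape
    C = len(pis)
    for _ in range(n_iters):
        # E-step: h_kj = pi_j G(x_k; mu_j, s2_j) / sum_c pi_c G(x_k; mu_c, s2_c)
        h = np.zeros((n, C))
        for j in range(C):
            norm = (2 * np.pi * sigma2s[j]) ** (-d / 2)
            h[:, j] = pis[j] * norm * np.exp(
                -np.sum((X - mus[j]) ** 2, axis=1) / (2 * sigma2s[j]))
        h /= h.sum(axis=1, keepdims=True)
        # M-step: re-estimate each component from the weighted data
        nj = h.sum(axis=0)                  # effective counts sum_k h_kj
        pis = nj / n                        # pi_j = (1/n) sum_k h_kj
        mus = (h.T @ X) / nj[:, None]       # mu_j = sum_k h_kj x_k / sum_k h_kj
        for j in range(C):
            sq = np.sum((X - mus[j]) ** 2, axis=1)
            sigma2s[j] = (h[:, j] @ sq) / (d * nj[j])  # per-coordinate variance
    return pis, mus, sigma2s
```

Replacing the soft assignments h_kj with hard MAP assignments recovers a K-means-style update, which is the connection the slides draw.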