AP & LLE. Xiangliang Zhang, King Abdullah University of Science and Technology. PowerPoint presentation.



SLIDE 1

Selecting representatives and being represented in unsupervised learning

  • AP & LLE

张响亮

Xiangliang Zhang

King Abdullah University of Science and Technology

CNCC, Oct 25, 2018 Hangzhou, China

SLIDE 2

Outline


  • Affinity Propagation (AP)

[Frey and Dueck, Science, 2007]

  • Locally Linear Embedding (LLE)

[Roweis and Saul, Science, 2000]

SLIDE 3

Affinity Propagation


[Frey and Dueck, Science 2007]

SLIDE 4

Affinity Propagation


[Frey and Dueck, NIPS 2005]

We describe a new method that, for the first time to our knowledge, combines the advantages of model-based clustering and affinity-based clustering.

(Figure: mixture model, labeling a component and its mixing coefficient)

SLIDE 5

Clustering: group the similar points together

K-means: minimize $\sum_{m=1}^{k} \sum_{x_i \in C_m} \| x_i - \mu_m \|^2$, where $\mu_m = \frac{1}{|C_m|} \sum_{x_i \in C_m} x_i$

K-medoids / K-medians: minimize $\sum_{m=1}^{k} \sum_{x_i \in C_m} \| x_i - \mu_m \|^2$, where $\mu_m \in \{ x_i \mid x_i \in C_m \}$
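To make the two objectives concrete, here is a minimal sketch (not from the slides; the function names and toy assignment below are illustrative) that evaluates each loss for a fixed partition of the data:

```python
import numpy as np

def kmeans_objective(X, labels, k):
    """Sum of squared distances to cluster means (centers are free vectors)."""
    return sum(((X[labels == m] - X[labels == m].mean(axis=0)) ** 2).sum()
               for m in range(k))

def kmedoids_objective(X, labels, medoids):
    """Same loss, but each center must be one of the data points (a medoid)."""
    return sum(((X[labels == m] - X[medoids[m]]) ** 2).sum()
               for m in range(len(medoids)))
```

Because the medoid is constrained to be a data point, the K-medoids loss is never smaller than the K-means loss for the same partition.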

SLIDE 6

Inspired by greedy k-medoids

Data to cluster: the likelihood of a point belonging to the cluster with a given center, and the Bayesian prior probability that a point is a cluster center.

  • The responsibility of the k-th component generating x_i
  • Assign x_i to the center s_i
  • Choose a new center

SLIDE 7

Understanding the process

Message sent from x_i to each center/exemplar: its preference to be with that exemplar. A hard decision is made for the cluster centers/exemplars.

Introduce "availabilities": messages sent from exemplars to other points, providing soft evidence of how available each exemplar is to serve as a center for each point.

SLIDE 8

The method presented in NIPS '05


  • Responsibilities are computed using likelihoods and availabilities
  • Availabilities are computed using responsibilities, recursively

Affinities

SLIDE 9

Interpretation by Factor Graph


$c_i$ is the index of the exemplar for point $x_i$

Objective function: maximize the net similarity $S(c) = \sum_i s(i, c_i)$, subject to the constraints

Constraints:

  • A cluster should not be empty and has a single exemplar
  • An exemplar must select itself as its exemplar

SLIDE 10

Input and Output of AP in Science '07


Preference (prior): the self-similarity s(k,k); larger preferences yield more exemplars

SLIDE 11

AP: a message passing algorithm

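The responsibility and availability updates from the Science '07 paper can be sketched in NumPy as follows (a minimal sketch with damping; the function name, convergence settings, and the usage pattern of putting preferences on the diagonal of S are illustrative choices, not taken from the slides):

```python
import numpy as np

def affinity_propagation(S, damping=0.5, max_iter=200):
    """Message passing on a similarity matrix S; diag(S) holds the preferences."""
    N = S.shape[0]
    R = np.zeros((N, N))   # responsibilities r(i,k)
    A = np.zeros((N, N))   # availabilities a(i,k)
    I = np.arange(N)
    for _ in range(max_iter):
        # r(i,k) = s(i,k) - max_{k' != k} [a(i,k') + s(i,k')]
        AS = A + S
        top = np.argmax(AS, axis=1)
        first = AS[I, top]
        AS[I, top] = -np.inf
        second = AS.max(axis=1)
        Rnew = S - first[:, None]
        Rnew[I, top] = S[I, top] - second   # for the argmax column, use the runner-up
        R = damping * R + (1 - damping) * Rnew
        # a(i,k) = min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
        Rp = np.maximum(R, 0)
        Rp[I, I] = R[I, I]                  # keep r(k,k) itself on the diagonal
        Anew = Rp.sum(axis=0)[None, :] - Rp
        diag = Anew[I, I].copy()            # a(k,k) = sum_{i' != k} max(0, r(i',k))
        Anew = np.minimum(Anew, 0)
        Anew[I, I] = diag
        A = damping * A + (1 - damping) * Anew
    exemplars = np.flatnonzero(np.diag(A + R) > 0)
    labels = np.argmax(S[:, exemplars], axis=1)
    return exemplars, labels
```

With the negative squared Euclidean distance as similarity and the median similarity as the shared preference, this typically recovers one exemplar per well-separated group.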

SLIDES 12-19

Iterations of Message passing in AP (animation frames)

SLIDE 20

Summary AP

Xiangliang Zhang, KAUST CS340: Data Mining

SLIDE 21

Extensive study of AP


SLIDE 22

Outline


  • Affinity Propagation (AP)

[Frey and Dueck, Science, 2007]

  • Locally Linear Embedding (LLE)

[Roweis and Saul, Science, 2000]

SLIDE 23

Locally Linear Embedding (LLE)


[Roweis and Saul, Science, 2000]

Saul and Roweis. Think globally, fit locally: unsupervised learning of low dimensional manifolds. JMLR 2003.
SLIDE 24

LLE - motivations


SLIDE 25

Inspired by MDS


Multidimensional Scaling (MDS): find an embedding of objects in a low-dimensional space, preserving pairwise distances.

Given: pairwise similarities. To find: the embedding.

Eliminate the need to estimate pairwise distances between widely separated data points?
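For reference, classical MDS can be sketched via double-centering and an eigendecomposition (a minimal sketch, not from the slides; the function name is an assumption):

```python
import numpy as np

def classical_mds(D, d=2):
    """Classical MDS: D is an N x N matrix of squared pairwise distances."""
    N = D.shape[0]
    J = np.eye(N) - np.ones((N, N)) / N   # centering matrix
    B = -0.5 * J @ D @ J                  # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:d]         # top-d eigenvalues
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))
```

When D contains exact Euclidean squared distances of points living in d dimensions, the embedding reproduces those distances (up to rotation and reflection).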

SLIDE 26

LLE – general idea

— Locally, on a fine enough scale, everything looks linear

— Represent each object as a linear combination of its neighbors

— Assumption: the same linear representation will hold in the low-dimensional space

— Find a low-dimensional embedding which minimizes the reconstruction loss


SLIDE 27

LLE – matrix representation

1. Select the K nearest neighbors of each point
2. Reconstruct x_i from its K nearest neighbors


Find $W$ by minimizing $\varepsilon(W) = \sum_i \| x_i - \sum_j w_{ij} x_j \|^2$
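For a single point, the constrained least squares for the weights (solve C w = 1, then rescale so the weights sum to one) can be sketched as below; the function name, regularization choice, and toy numbers are illustrative assumptions:

```python
import numpy as np

def reconstruction_weights(x, neighbors, reg=1e-3):
    """Solve min ||x - sum_j w_j eta_j||^2 s.t. sum_j w_j = 1 for one point."""
    K = len(neighbors)
    Z = neighbors - x                      # shift so x sits at the origin
    C = Z @ Z.T                            # local Gram matrix C_jk
    C += reg * np.trace(C) * np.eye(K)     # regularize (needed when K > dim)
    w = np.linalg.solve(C, np.ones(K))
    return w / w.sum()                     # enforce the sum-to-one constraint
```

The regularization is required whenever the Gram matrix is singular, e.g. when the number of neighbors exceeds the input dimension.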

SLIDE 28

LLE – matrix representation

3. Need to solve the system Y = WY


Find the embedding vectors $Y_i$ by minimizing:

$\Phi(Y) = \sum_{i=1}^{N} \| Y_i - \sum_{j=1}^{N} w_{ij} Y_j \|^2 = \sum_{i=1}^{N} \sum_{j=1}^{N} M_{ij} (Y_i^T Y_j)$, where $M = (I - W)^T (I - W)$

s.t. $\sum_{i=1}^{N} Y_i = 0$ (centered on the origin) and $\frac{1}{N} \sum_{i=1}^{N} Y_i Y_i^T = I$ (unit covariance)

SLIDE 29

LLE – algorithm summary

  • 1. Find k nearest neighbors in X space
  • 2. Solve for reconstruction weights W
  • 3. Compute embedding coordinates Y using weights W:

— Create a sparse matrix M = (I − W)^T (I − W)
— Set Y to be the eigenvectors corresponding to the bottom d non-zero eigenvalues of M


$w = \frac{C^{-1} \mathbf{1}}{\mathbf{1}^T C^{-1} \mathbf{1}}$, where $C_{jk} = (x - \eta_j)^T (x - \eta_k)$ and $\eta_1, \ldots, \eta_K$ are $x$'s $K$ nearest neighbors
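Step 3 of the summary, computing the embedding from the weight matrix, can be sketched as follows (a dense-matrix sketch for clarity; in practice M is sparse and only the bottom eigenvectors are computed; the function name and ring-graph usage are assumptions):

```python
import numpy as np

def embed_from_weights(W, d=2):
    """Embedding coordinates Y from a row-stochastic weight matrix W."""
    N = W.shape[0]
    M = (np.eye(N) - W).T @ (np.eye(N) - W)
    vals, vecs = np.linalg.eigh(M)        # eigenvalues in ascending order
    return vecs[:, 1:d + 1]               # skip the constant 0-eigenvalue vector
```

Because each row of W sums to one, the constant vector is always an eigenvector of M with eigenvalue 0; discarding it leaves d coordinates that are automatically centered, satisfying the constraints on the previous slide.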

SLIDE 30

Continuing, SNE


SNE allows "many-to-one" mappings, in which a single ambiguous object really belongs in several disparate locations in the low-dimensional space, while LLE makes a one-to-one mapping.

p_j|i is the asymmetric probability that i would pick j as its neighbor (Gaussian neighborhood in the original space).

q_j|i is the induced probability that point i picks point j as its neighbor (Gaussian neighborhood in the low-dimensional space).

SLIDE 31

Continuing, t-SNE

— uses a Student-t distribution (heavier tail) rather than a Gaussian to compute the similarity between two points in the low-dimensional space

— symmetrized version of the SNE cost function with simpler gradients
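The two similarity kernels can be sketched side by side (an illustrative sketch, not from the slides; the function names and the fixed bandwidth sigma are simplifying assumptions, whereas real t-SNE picks per-point bandwidths via perplexity):

```python
import numpy as np

def sne_p(X, sigma=1.0):
    """p_{j|i}: probability that i picks j as its neighbor (Gaussian kernel)."""
    D = ((X[:, None] - X[None]) ** 2).sum(-1)
    P = np.exp(-D / (2 * sigma ** 2))
    np.fill_diagonal(P, 0)                      # a point never picks itself
    return P / P.sum(axis=1, keepdims=True)     # normalize per row (asymmetric)

def tsne_q(Y):
    """t-SNE low-dimensional similarity: Student-t with one degree of freedom."""
    D = ((Y[:, None] - Y[None]) ** 2).sum(-1)
    Q = 1.0 / (1.0 + D)                         # heavy-tailed kernel
    np.fill_diagonal(Q, 0)
    return Q / Q.sum()                          # symmetric, one global normalizer
```

The heavier tail of the Student-t kernel lets moderately distant points sit farther apart in the embedding, which is what relieves the "crowding" of Gaussian-based SNE.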


SLIDE 32

LLE - Follow-up work

LLE output strongly depends on the selection of k

Jing Chen and Yang Liu. Locally linear embedding: a survey. Artificial Intelligence Review (2011)

[Chen and Liu, 2011]

SLIDE 33

[Ting and Jordan, 2018]

SLIDE 34

Thank you for your attention!

Lab of Machine Intelligence and kNowledge Engineering (MINE): http://mine.kaust.edu.sa/