
APPLIED MACHINE LEARNING: Methods for Clustering (K-means, Soft K-means, DBSCAN)



  1. MACHINE LEARNING - MSc Course, APPLIED MACHINE LEARNING. Methods for Clustering: K-means, Soft K-means, DBSCAN

  2. Objectives. Learn basic techniques for data clustering:
     • K-means and soft K-means, GMM (next lecture)
     • DBSCAN
     Understand the issues and major challenges in clustering:
     • Choice of metric
     • Choice of number of clusters

  3. What is clustering? Clustering is a type of multivariate statistical analysis also known as cluster analysis, unsupervised classification analysis, or numerical taxonomy. Clustering is the process of partitioning a set of data (or objects) into a set of meaningful sub-classes, called clusters. A cluster is a collection of data objects that are “similar” to one another and thus can be treated collectively as one group.

  4. Classification versus Clustering. Supervised Classification = Classification: we know the class labels and the number of classes. Unsupervised Classification = Clustering: we do not know the class labels and may not know the number of classes.

  5. Classification versus Clustering. Unsupervised Classification = Clustering: a hard problem when no pair of objects has exactly the same features, so we need to determine how similar two or more objects are to one another.

  6. Which clusters can you create? Which two subgroups of pictures are similar and why?

  7. Which clusters can you create? Which two subgroups of pictures are similar and why?

  8. What is Good Clustering? A good clustering method produces high-quality clusters in which:
     • the intra-class (that is, intra-cluster) similarity is high;
     • the inter-class similarity is low.
     Note that the quality measure of a cluster depends on the similarity measure used!
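To make these two criteria concrete, here is a minimal sketch (not part of the original slides; the toy data and the helper name intra_inter_distances are my own assumptions) that computes the average intra-cluster and inter-cluster distances for a labelled 2-D dataset. A good clustering yields a small intra-cluster value and a large inter-cluster value.

```python
import numpy as np

def intra_inter_distances(X, labels):
    """Average pairwise distance within clusters and between clusters."""
    intra, inter = [], []
    n = len(X)
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(X[i] - X[j])
            (intra if labels[i] == labels[j] else inter).append(d)
    return np.mean(intra), np.mean(inter)

# Toy 2-D data: two well-separated blobs (hypothetical example data).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
               rng.normal(3.0, 0.3, (50, 2))])
labels = np.array([0] * 50 + [1] * 50)

intra, inter = intra_inter_distances(X, labels)
print(f"intra-cluster: {intra:.2f}, inter-cluster: {inter:.2f}")
```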

  9. Exercise: • Person1 with glasses • Person1 without glasses • Person2 without glasses • Person2 with glasses. Intra-class similarity is the highest when: a) you choose to classify images with and without glasses, or b) you choose to classify images of person1 against person2?

  10. Exercise: • Person1 with glasses • Person1 without glasses • Person2 without glasses • Person2 with glasses (projection onto the first two principal components after PCA). Intra-class similarity is the highest when: a) you choose to classify images with and without glasses, or b) you choose to classify images of person1 against person2?

  11. Exercise (projection onto e1 against e2): the eigenvector e1 is composed of a mix of the main characteristics of the two faces and is hence explanatory of both. However, since the two faces have little in common, the two groups have different coordinates on e1, while within each group the images with and without glasses have quasi-identical coordinates. Projecting onto e1 hence offers a means to compute a metric of similarity across the two persons.

  12. Exercise (projection onto e1 against e3): when projecting onto e1 and e3, we can separate the images of person1 with and without glasses, as the eigenvector e3 embeds features that are primarily distinctive of person1.
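The projections discussed on these slides can be reproduced with standard PCA. The sketch below is only an illustration on synthetic stand-in data (the original face images are not available here; all variable names are my own): it builds two "persons" that share a small "glasses" perturbation, projects onto the first three principal components, and extracts the coordinates used for the e1-vs-e2 and e1-vs-e3 plots.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Synthetic stand-in for the face images: two "persons" (two distant base
# vectors), each shown with and without a small shared "glasses" perturbation.
person1, person2 = rng.normal(0, 1, 100), rng.normal(0, 1, 100)
glasses = rng.normal(0, 0.2, 100)
samples = [base + g * glasses + rng.normal(0, 0.05, 100)
           for base in (person1, person2)
           for g in (0, 1)
           for _ in range(10)]
X = np.vstack(samples)                      # shape (40, 100)

# Project onto the first three principal components e1, e2, e3.
pca = PCA(n_components=3)
Z = pca.fit_transform(X)

# Z[:, [0, 1]] is the e1-vs-e2 projection, Z[:, [0, 2]] the e1-vs-e3 one.
print("explained variance ratios:", pca.explained_variance_ratio_)
```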

  13. Exercise (projection onto the first two principal components after PCA): how would you design a method to find the groups when you no longer have the class labels?

  14. Sensitivity to Prior Knowledge. [Figure: data in (x1, x2, x3), with the relevant data separated from outliers (noise).] Priors: • the data cluster within a circle • there are 2 clusters.

  15. Sensitivity to Prior Knowledge. [Figure: the same data in (x1, x2, x3).] Priors: • the data follow a complex distribution • there are 3 clusters.

  16. Clusters’ Types. K-means produces globular clusters; DBSCAN produces non-globular clusters. [Figure: examples of globular and non-globular clusters.]
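This contrast is easy to reproduce with off-the-shelf implementations. The sketch below is my own example (not from the slides) using scikit-learn's KMeans and DBSCAN on the two-moons toy dataset; the eps value is a guess that happens to suit this noise level.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.cluster import KMeans, DBSCAN

# Two interleaving half-circles: a classic non-globular clustering problem.
X, _ = make_moons(n_samples=300, noise=0.05, random_state=0)

# K-means imposes two globular (Voronoi-like) cells, which typically mix
# points from both moons rather than recovering them.
km_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# DBSCAN groups points by density and can recover the two non-globular moons.
db_labels = DBSCAN(eps=0.2, min_samples=5).fit_predict(X)

print("K-means cluster sizes:", np.bincount(km_labels))
print("DBSCAN cluster sizes: ", np.bincount(db_labels[db_labels >= 0]))
```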

  17. What is Good Clustering? Requirements for good clustering:
      • discovery of clusters with arbitrary shape
      • ability to deal with noise and outliers
      • insensitivity to the ordering of the input records
      • scalability
      • ability to handle high-dimensional data
      • interpretability and reusability

  18. How to cluster? What choice of model (circle, ellipse) for the cluster? How many models?

  19. K-means Clustering. K-means clustering generates a number K of disjoint clusters so as to minimize J(\mu^1, \dots, \mu^K) = \sum_{k=1}^{K} \sum_{x^i \in C^k} \| x^i - \mu^k \|^2, where x^i is the i-th data point, \mu^k the geometric centroid of cluster k, and C^k the set of points carrying cluster label k. What choice of model (circle, ellipse) for the cluster? A circle. How many models? A fixed number, K = 2. Where to place them for optimal clustering?
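A direct, minimal translation of this cost into code (my own sketch; the names are assumptions) could look like:

```python
import numpy as np

def kmeans_cost(X, centroids, labels):
    """J(mu^1..mu^K) = sum_k sum_{x^i in C^k} ||x^i - mu^k||^2."""
    return sum(np.sum((X[labels == k] - mu) ** 2)
               for k, mu in enumerate(centroids))
```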

  20. K-means Clustering. Initialization: initialize the positions of the centers of the clusters at random. In mldemos, centroids are initialized on distinct data points, with no overlap across centroids.
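A sketch of this initialization scheme, placing each of the K initial centroids on a distinct data point as described above (the function name is mine):

```python
import numpy as np

def init_centroids(X, K, seed=None):
    """Place each initial centroid on a distinct data point (no overlap)."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=K, replace=False)  # replace=False -> no overlap
    return X[idx].copy()
```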

  21. K-means Clustering. Assignment step (E-step): calculate the distance from each data point to each centroid and assign the responsibility of each data point x^i to its closest centroid, k^i = \arg\min_k d(x^i, \mu^k), so that the responsibility of cluster k for point x^i is r_i^k = 1 if k = k^i and 0 otherwise. If a tie happens (i.e. two centroids are equidistant from a data point), the data point is assigned to the winning centroid with the smallest index.
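In code, the assignment step can be written as the vectorised sketch below (my own; note that np.argmin already resolves ties in favour of the centroid with the smallest index, matching the rule above):

```python
import numpy as np

def assign_step(X, centroids):
    """E-step: give each point responsibility 1 for its closest centroid, 0 elsewhere."""
    # Pairwise distances d(x^i, mu^k), shape (N, K).
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    # argmin picks the closest centroid; on a tie it returns the smallest index.
    return np.argmin(dists, axis=1)
```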

  22. K-means Clustering. Update step (M-step): recompute the position of each centroid from the assignment of the points, \mu^k = \sum_i r_i^k x^i / \sum_i r_i^k.
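The corresponding update step as a sketch (it assumes every cluster keeps at least one assigned point, otherwise the mean is undefined):

```python
import numpy as np

def update_step(X, labels, K):
    """M-step: mu^k = (sum_i r_i^k x^i) / (sum_i r_i^k), i.e. the mean of cluster k's points."""
    # Assumes every cluster k has at least one point with labels == k.
    return np.vstack([X[labels == k].mean(axis=0) for k in range(K)])
```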

  23. K-means Clustering. Assignment step (repeated with the updated centroids): calculate the distance from each data point to each centroid and reassign each data point to its closest centroid, with the same tie-breaking rule as before.

  24. K-means Clustering. Update step (M-step): recompute the position of each centroid based on the assignment of the points. Stopping criterion: go back to step 2 and repeat the process until the clusters are stable.

  25. K-means Clustering. [Figure: the resulting partition in the (x1, x2) plane, with the intersection points of the cluster boundaries marked.] K-means creates a hard partitioning of the dataset.

  26. Effect of the distance metric on K-means. [Figure: clustering results obtained with the L2-norm, L1-norm, L3-norm, and L8-norm.]
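To reproduce this experiment, only the distance used in the assignment step needs to change. A small sketch of the Lp (Minkowski) distance family behind these plots (the function name is mine):

```python
import numpy as np

def lp_distance(x, y, p=2):
    """Minkowski (Lp) distance: p=1 is the L1-norm, p=2 the usual Euclidean norm."""
    return np.sum(np.abs(x - y) ** p) ** (1.0 / p)

x, y = np.array([0.0, 0.0]), np.array([3.0, 4.0])
for p in (1, 2, 3, 8):
    print(f"L{p}-norm distance: {lp_distance(x, y, p):.3f}")
```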

  27. K-means Clustering: Algorithm
      1. Initialization: pick K arbitrary centroids and set their geometric means to random values (in mldemos, centroids are initialized on distinct data points, with no overlap across centroids).
      2. Calculate the distance from each data point to each centroid.
      3. Assignment step (E-step): assign the responsibility of each data point x^i to its closest centroid, k^i = \arg\min_k d(x^i, \mu^k), so that r_i^k = 1 if k = k^i and 0 otherwise. If a tie happens (i.e. two centroids are equidistant from a data point), the data point is assigned to the winning centroid with the smallest index.
      4. Update step (M-step): adjust each centroid to be the mean of the data points assigned to it, \mu^k = \sum_i r_i^k x^i / \sum_i r_i^k.
      5. Go back to step 2 and repeat the process until the clusters are stable.
      A compact reference implementation is sketched below.
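Putting the five steps together, a reference implementation might look like this sketch (my own code, not the mldemos implementation; here an empty cluster simply keeps its previous centroid, which is one of several possible conventions):

```python
import numpy as np

def kmeans(X, K, max_iter=100, seed=0):
    """Plain K-means: initialize on data points, then alternate E- and M-steps."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=K, replace=False)].copy()   # step 1
    for _ in range(max_iter):
        # Steps 2-3 (E-step): assign each point to its closest centroid
        # (argmin breaks ties in favour of the smallest centroid index).
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = np.argmin(dists, axis=1)
        # Step 4 (M-step): move each centroid to the mean of its assigned points.
        new_centroids = np.vstack([
            X[labels == k].mean(axis=0) if np.any(labels == k) else centroids[k]
            for k in range(K)])
        # Step 5: stop when the clusters are stable.
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels
```

For example, calling kmeans(X, K=2) on a 2-D dataset would return the two centroids and one hard label per point, i.e. the hard partitioning described on slide 25.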

  28. K-means Clustering. The K-means algorithm is a simple version of Expectation-Maximization applied to a model composed of isotropic Gaussian functions (see next lecture).
