Taal- en spraaktechnologie (Language and Speech Technology). Sophia Katrenko (thanks to R. Navigli and S. P. Ponzetto), Utrecht University, the Netherlands. Lecture 3.

  1. Taal- en spraaktechnologie. Sophia Katrenko (thanks to R. Navigli and S. P. Ponzetto), Utrecht University, the Netherlands. Lecture 3.

  2. Outline: (1) Covered so far; (2) Today: unsupervised Word Sense Disambiguation (WSD) and lexical acquisition.

  3. Recap. Last time, we discussed WSD resources (WordNet, SemCor, SemEval competitions) and methods: dictionary-based WSD (Lesk, 1986), supervised WSD (Gale et al., 1992), minimally supervised WSD (Yarowsky, 1995), and noun categorization.

  4. Today we discuss Chapter 19 (Jurafsky), and more precisely: (1) unsupervised word sense disambiguation and (2) lexical acquisition.

  5. WSD methods: an overview. Source: Navigli and Ponzetto (2010).

  6. Unsupervised WSD. Most methods we have discussed so far focused on classification, where the number of senses is fixed. Noun categorization already shifted the focus to unsupervised learning: the learning itself was unsupervised, while the evaluation was done as for supervised systems. We now move further into unsupervised learning and discuss clustering (as a mechanism) in more detail.

  7. Unsupervised WSD. The sense of a word can never be taken in isolation: the same sense of a word will have similar neighboring words. "You shall know a word by the company it keeps" (Firth, 1957). "For a large class of cases, though not for all, in which we employ the word 'meaning' it can be defined thus: the meaning of a word is its use in the language." (Wittgenstein, Philosophical Investigations, 1953).

  8. Unsupervised WSD relies on the observations above: (1) take word occurrences in some (possibly predefined) contexts; (2) cluster them; (3) assign new words to one of the clusters. The noun categorization task followed only the first two steps (no assignment of new words).
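
A minimal sketch of this three-step procedure (not from the lecture itself), assuming scikit-learn; the example contexts for the ambiguous word "bank" are invented for illustration:

```python
# Minimal sketch of context clustering for unsupervised WSD (assumes scikit-learn).
# Each occurrence of the ambiguous word "bank" is represented by a bag-of-words
# vector of its context; occurrences are clustered, and a new occurrence is
# assigned to the nearest cluster.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.cluster import KMeans

contexts = [
    "the bank approved the loan and mortgage",      # financial sense
    "she deposited money at the bank on friday",    # financial sense
    "they sat on the bank of the river fishing",    # river sense
    "the grassy bank sloped down to the water",     # river sense
]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(contexts)              # step 1: occurrences in context

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)   # step 2: cluster them
print(km.labels_)

new_context = ["he opened an account at the bank"]
X_new = vectorizer.transform(new_context)
print(km.predict(X_new))                            # step 3: assign a new occurrence
```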

  9. Clustering. Clustering is a type of unsupervised machine learning which aims at grouping similar objects together; there is no a priori output (i.e., no labels). A cluster is a collection of objects which are similar (in some way).

  10. Clustering: types of clustering. Exclusive clustering (each datum belongs to exactly one cluster; no overlapping clusters). Overlapping clustering (uses fuzzy sets to cluster data, so that each point may belong to two or more clusters with different degrees of membership). Hierarchical clustering (builds a hierarchy by repeatedly merging the two nearest clusters). Probabilistic clustering.

  11. Clustering. Hierarchical clustering is in turn of two types: bottom-up (agglomerative) and top-down (divisive).

  12. Clustering: hierarchical clustering for Dutch text. Source: Van de Cruys (2006).

  13. Clustering: hierarchical clustering for Dutch dialects. Source: Wieling and Nerbonne (2010). The Goeman-Taeldeman-Van Reenen project data: 1876 phonetically transcribed items for 613 dialect varieties in the Netherlands and Flanders.

  14. Clustering. Now... clustering problems are NP-hard: it is infeasible to try all possible clustering solutions. Clustering algorithms therefore look at only a small fraction of all possible partitions of the data; which portions of the search space are considered depends on the kind of algorithm used.

  15. Clustering: what is a good clustering solution? The intra-cluster similarity is high and the inter-cluster similarity is low. The quality of the clusters depends on the definition and representation of a cluster. The quality of the clustering depends on the similarity measure.
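
One common way to quantify "high intra-cluster and low inter-cluster similarity" is the silhouette score. This is not mentioned on the slide itself; the sketch below assumes scikit-learn and uses toy two-dimensional data:

```python
# Hedged sketch: the silhouette score is close to 1 when clusters are compact
# (high intra-cluster similarity) and well separated (low inter-cluster similarity).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

X = np.array([[0.0, 0.1], [0.2, 0.0], [0.1, 0.2],      # one tight group
              [5.0, 5.1], [5.2, 4.9], [4.9, 5.0]])     # another tight group

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(silhouette_score(X, labels))   # close to 1.0 for this toy data
```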

  16. Clustering: agglomerative clustering works as follows: (1) assign each object to a separate cluster; (2) evaluate all pair-wise distances between clusters; (3) construct a distance matrix using the distance values; (4) look for the pair of clusters with the shortest distance; (5) remove the pair from the matrix and merge them; (6) evaluate all distances from this new cluster to all other clusters, and update the matrix; (7) repeat until the distance matrix is reduced to a single element.
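
A short sketch of bottom-up (agglomerative) clustering, using SciPy's standard implementation instead of the explicit matrix updates listed above; the toy points are invented for illustration:

```python
# Hedged sketch of agglomerative clustering with SciPy (assumes scipy and numpy).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],      # group A
              [4.0, 4.0], [4.1, 4.2], [3.9, 4.1]])     # group B

# linkage() starts from singleton clusters and repeatedly merges the closest
# pair, which is the loop described on the slide; "average" is one of several
# ways to define the distance between merged clusters.
Z = linkage(X, method="average", metric="euclidean")

# Cut the resulting hierarchy into 2 flat clusters.
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)
```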

  17. Clustering: the K-means algorithm partitions $n$ samples (objects) into $k$ clusters. Each cluster $c$ is represented by its centroid: $\mu(c) = \frac{1}{|c|} \sum_{x \in c} x$. The algorithm converges to stable cluster centroids, i.e. it minimizes the sum of squared distances to the cluster centers: $E = \sum_{i=1}^{k} \sum_{x \in c_i} \lVert x - \mu_i \rVert^2$.
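
A small numeric check of the two formulas (toy data, assuming NumPy): the centroid is the per-coordinate mean of a cluster, and E sums the squared distances of every point to the centroid of its own cluster:

```python
import numpy as np

clusters = [np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 3.0]]),   # c_1
            np.array([[10.0, 10.0], [12.0, 10.0]])]           # c_2

E = 0.0
for c in clusters:
    mu = c.mean(axis=0)               # mu(c) = (1/|c|) * sum_{x in c} x
    E += ((c - mu) ** 2).sum()        # add sum_{x in c} ||x - mu||^2
    print("centroid:", mu)

print("E =", E)                       # the objective that K-means minimizes
```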

  18. Clustering: the K-means algorithm. (1) Initialization: select k points in the space represented by the objects being clustered (seed points). (2) Assignment: assign each object to the cluster with the closest centroid (mean). (3) Update: after all objects have been assigned, recalculate the positions of the k centroids (means). (4) Termination: go back to (2) until the centroids no longer move, i.e. there are no more new assignments.
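
A sketch of these four steps in plain NumPy (not the lecture's own code); the random seeding and toy data are assumptions, and empty clusters are not handled:

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    # (1) Initialization: pick k of the objects as seed points.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # (2) Assignment: each object goes to the cluster with the closest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # (3) Update: recompute each centroid as the mean of its cluster.
        new_centroids = np.array([X[labels == i].mean(axis=0) for i in range(k)])
        # (4) Termination: stop when the centroids no longer move.
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

X = np.array([[0.0, 0.1], [0.2, 0.0], [5.0, 5.1], [5.2, 4.9]])
labels, centroids = kmeans(X, k=2)
print(labels, centroids)
```

Off-the-shelf implementations such as scikit-learn's KMeans add smarter seeding (k-means++) and multiple restarts, which the limitations on the next slide motivate.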

  23. K-means: limitations. It is sensitive to the initial seed points (the algorithm does not specify how to initialize the means; often this is done randomly). The number of clusters k must be specified in advance (how do we choose the value of k?). It is unable to handle noisy data and outliers, and unable to model the uncertainty in cluster assignment.
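
One common heuristic for the "how do we choose k" question is the elbow method: run K-means for several values of k and look for the point where the objective E (inertia) stops dropping sharply. This is not on the slide; the sketch assumes scikit-learn and generated toy data:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Three well-separated blobs, so the "elbow" should appear around k = 3.
X = np.vstack([rng.normal(loc=c, scale=0.2, size=(30, 2))
               for c in ([0, 0], [5, 5], [0, 5])])

for k in range(1, 7):
    inertia = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_
    print(k, round(inertia, 1))   # inertia = the objective E; look for the bend
```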
