Generalizing Word Embeddings using Bag of Subwords

  1. Generalizing Word Embeddings using Bag of Subwords
     Jinman Zhao, Sidharth Mudgal, Yingyu Liang
     University of Wisconsin-Madison
     Nov. 2, 2018 @ EMNLP

  2. Word Embeddings
     Text corpus: "Belgium, officially the Kingdom of Belgium, is a country in Western Europe, bordered by France, the Netherlands, Germany and Luxembourg. It covers an area of 30,528 square kilometres (11,787 sq mi) and has a population of more than 11.4 million. The capital and largest city is Brussels; other major cities are Antwerp, Ghent, Charleroi and Liège. The sovereign state of Belgium is a federal constitutional monarchy with a parliamentary system of governance. Its institutional organisation is complex and is structured on both regional and linguistic grounds."
     A model trained over the corpus maps each in-vocabulary word to a vector:
     the       [ -0.1  0.1  0.3 ... ]
     be        [  0.2 -0.3  0.2 ... ]
     and       [  0.1  0.1 -0.1 ... ]
     Belgium   [  0.3 -0.4  0.5 ... ]
     Brussels  [  0.2 -0.3  0.6 ... ]
     Belgian   [  0.2 -0.6  0.4 ... ]
     Out-of-vocabulary words get none:
     decomposable [ ? ? ? ... ]
     preEMNLP     [ ? ? ? ... ]

  3. Word Embedding and Vocabulary
     Word embedding: word ↦ word vector.
     Learnt from a large text corpus.
     Essential to many neural-network-based approaches to NLP tasks.
     Many popular word embedding techniques assume fixed-size vocabularies, e.g. word2vec (Mikolov et al., 2013) and GloVe (Pennington et al., 2014).
     They have little to do with out-of-vocabulary (OOV) words!

  4. Generalize to OOV words?
     1. Estimating word vectors for rare or unseen words can be crucial, e.g. for understanding new trending terms.
     2. We can often guess the meaning of a word from its spelling:
     - "preEMNLP" probably means "before EMNLP".
     - The suffix "-ese" means the people of some place.
     - Chemical names.

  5. Generalize to OOV words?
     1. Estimating word vectors for rare or unseen words can be crucial, e.g. for understanding new trending terms.
     2. We can often guess the meaning of a word from its spelling:
     - "preEMNLP" probably means "before EMNLP".
     - The suffix "-ese" means the people of some place.
     - Chemical names.
     0. Good pre-trained vectors (with fixed-size vocabularies) already exist.

  6. Our Approach: A Learning Task
     Generalizes pre-trained word embeddings (Vocabulary → R^n, word ↦ word vector) towards OOV words, by using the in-vocabulary words as training data and learning a mapping spelling ↦ word vector.
     No context is needed!

  7. Our Bag-of-Subwords Model
     Parameters: a lookup table mapping character n-grams to vectors.
     Word vector = average of the vectors of all its character n-grams.
     The character n-gram lengths are limited to the range [l_min, l_max].
     Training: minimize the mean squared error between the BoS vector and the target (pre-trained) vector, over all words in the vocabulary.
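
     A minimal NumPy sketch of this model and one training step (not the authors' implementation; the fastText-style '<' '>' boundary markers, the defaults l_min=3 and l_max=6, the initialization scale and the learning rate are all assumptions):

         import numpy as np
         from collections import defaultdict

         def char_ngrams(word, l_min=3, l_max=6):
             # All character n-grams of the word with length in [l_min, l_max].
             # The '<' and '>' boundary markers follow fastText; an assumption here.
             w = "<" + word + ">"
             return [w[i:i + n]
                     for n in range(l_min, l_max + 1)
                     for i in range(len(w) - n + 1)]

         class BoSModel:
             # Bag-of-Subwords: a word's vector is the average of the vectors
             # of all its character n-grams, read from a single lookup table.
             def __init__(self, dim):
                 self.rng = np.random.default_rng(0)
                 self.table = defaultdict(
                     lambda: self.rng.normal(scale=0.1, size=dim))

             def embed(self, word):
                 grams = char_ngrams(word)
                 return np.mean([self.table[g] for g in grams], axis=0), grams

             def train_step(self, word, target, lr=0.05):
                 # One SGD step on the squared error ||BoS(word) - target||^2.
                 pred, grams = self.embed(word)
                 err = pred - target
                 for g in grams:          # the gradient splits evenly over grams
                     self.table[g] -= lr * 2.0 * err / len(grams)
                 return float(np.sum(err ** 2))

     Training then just sweeps train_step over every (word, pre-trained vector) pair in the vocabulary for a few epochs.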

  8. Bag-of-Subwords Model
     [Figure: an arbitrary in-vocabulary word, here "precedent", is decomposed into its bag of subwords (pre, rec, prec, rece, ..., ceden, edent); the corresponding bag of vectors (v_pre, v_rec, v_prec, ...) is averaged into v_precedent, and the MSE against the pre-trained vector is minimized for in-vocabulary words.]

  9. Bag-of-Subwords Model
     [Figure: the same construction applied to the OOV word "preEMNLP": its subwords (pre, reE, preE, reEN, ..., eEMNL, EMNLP) are looked up and their vectors averaged into v_preEMNLP.]
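
     Continuing the sketch above, the OOV word needs nothing beyond its spelling; the exact n-grams shown assume the boundary markers and l_min=3 from before:

         model = BoSModel(dim=64)
         # ... train on all (in-vocabulary word, pre-trained vector) pairs ...
         print(char_ngrams("preEMNLP")[:5])  # ['<pr', 'pre', 'reE', 'eEM', 'EMN']
         vec, _ = model.embed("preEMNLP")    # a vector from subwords alone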

  10. Most Related Works
      MIMICK (Pinter et al., 2017) tackles the same task using a character-level bidirectional LSTM model.
      fastText (Bojanowski et al., 2017) uses the same subword-level character n-gram model, but is trained over large text corpora.
      Ours: a subword-level model trained on MIMICK's task of mimicking pre-trained vectors.

  11. Word Similarity Task
      Word pair            Human label   Induced similarity cos(v_w1, v_w2)
      love, sex            6.77          0.6
      tiger, cat           7.35          0.5
      book, paper          7.46          0.6
      computer, keyboard   7.62          0.8
      ...                  ...           ...
      Evaluation: correlation between human labels and induced similarities.
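
      A sketch of this evaluation under the standard protocol, using Spearman rank correlation between the human labels and the induced cosine similarities (the four pairs are the ones shown above; model is the BoS sketch from earlier):

          import numpy as np
          from scipy.stats import spearmanr

          def cosine(u, v):
              return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

          pairs = [("love", "sex", 6.77), ("tiger", "cat", 7.35),
                   ("book", "paper", 7.46), ("computer", "keyboard", 7.62)]
          human = [h for _, _, h in pairs]
          induced = [cosine(model.embed(w1)[0], model.embed(w2)[0])
                     for w1, w2, _ in pairs]
          rho, _ = spearmanr(human, induced)
          print(rho)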

  12. Correlation Our method almost triples the correlation score on common and rare words compared to MIMICK.

  13. Correlation Our method matches the performance of fastText on rare words, without access to contexts. Spelling is effective!

  14. Word Similarity Task
      Target vectors:
      - English PolyGlot vectors
      - Google word2vec vectors
      Evaluation sets:
      - RW = Stanford RareWord
      - WS = WordSim353
      Other approaches:
      - Edit distance
      - fastText trained over a Wikipedia dump

  15. Joint Prediction of Part-of-Speech Tags and Morphosyntactic Attributes
      [Figure: for the sentence fragment "... traveled to attend conference in Belgium ...", each token gets a POS tag (VERB PART VERB NOUN ADP PROPN) and morphosyntactic attributes (e.g. traveled: Mood=Ind, Person=1, Tense=Past, VerbForm=Fin; attend: VerbForm=Inf; conference: Number=Sing).]

  16. Joint Prediction of Part-of-Speech Tags and Morphosyntactic Attributes
      [Figure: a Bi-LSTM over the sentence jointly predicts the POS tags and the morphosyntactic attributes; the setup follows MIMICK (Pinter et al., 2017).]
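
      A hedged PyTorch sketch of such a tagger (the layer sizes, label counts, and the single shared attribute head are assumptions; MIMICK's original setup may predict each attribute with its own classifier):

          import torch
          import torch.nn as nn

          class JointTagger(nn.Module):
              # Bi-LSTM over per-token word vectors, with one linear head
              # for POS tags and one for morphosyntactic attributes.
              def __init__(self, dim=64, hidden=128, n_pos=17, n_attr=30):
                  super().__init__()
                  self.lstm = nn.LSTM(dim, hidden, batch_first=True,
                                      bidirectional=True)
                  self.pos_head = nn.Linear(2 * hidden, n_pos)
                  self.attr_head = nn.Linear(2 * hidden, n_attr)

              def forward(self, word_vecs):    # (batch, seq_len, dim)
                  h, _ = self.lstm(word_vecs)  # (batch, seq_len, 2 * hidden)
                  return self.pos_head(h), self.attr_head(h)

      The word vectors fed in come from the BoS model, so every token, in-vocabulary or OOV, gets an input vector.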

  17. 23 languages: ar / bg / cs / da / el / en / es / eu / fa / he / hi / hu / id / it / kk / lv / ro / ru / sv / ta / tr / vi / zh
      Our method consistently outperforms MIMICK in all 23 languages tested from the Universal Dependencies (UD) dataset.

  18. Efficiency: training time.

  19. 3.5 s/epoch
      Our model takes only 3.5 s/epoch to train over English PolyGlot vectors, with a naive single-thread, CPU-only Python implementation on an ordinary desktop PC.

  20. Conclusion
      A surprisingly simple and fast method to extend pre-trained word vectors towards out-of-vocabulary words, without using any context.
      Intrinsic and extrinsic evaluations show our model's ability to capture lexical knowledge and generate good vectors, using only spellings.
      Can we do more, or better, with spellings only, or with minimal extra context?

  21. Thanks for listening! Q & A
