
Recursive Neural Networks and Its Applications (LU Yangyang)



  1. Recursive Neural Networks and Its Applications
     LU Yangyang, luyy11@sei.pku.edu.cn
     KERE Seminar, Oct. 29, 2014

  2. Outline
     • Recursive Neural Networks
     • RNNs for Factoid Question Answering
       – RNNs for Quiz Bowl
       – Experiments
     • RNNs for Anomalous Event Detection in Newswire
       – Neural Event Model (NEM)
       – Experiments

  3. Outline
     • Recursive Neural Networks
     • RNNs for Factoid Question Answering
     • RNNs for Anomalous Event Detection in Newswire

  4. Introduction
     Artificial Neural Networks:
     • For a single neuron with input x, output y, parameters W, b, and activation function f:
       z = Wx + b,  y = f(z)
     • For a simple (multi-layer) ANN, layer l computes:
       z^(l) = W^(l) x^(l) + b^(l),  y^(l+1) = f(z^(l))
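A minimal NumPy sketch of the forward passes above. The layer sizes, random initialization, and tanh activation are illustrative assumptions, not values from the slides.

```python
import numpy as np

def layer_forward(W, b, x, f=np.tanh):
    """One layer: z = Wx + b, y = f(z)."""
    z = W @ x + b
    return f(z)

# A two-layer "simple ANN": y^(l+1) = f(W^(l) x^(l) + b^(l)).
rng = np.random.default_rng(0)
x = rng.normal(size=4)                        # input vector
W1, b1 = rng.normal(size=(3, 4)), np.zeros(3)
W2, b2 = rng.normal(size=(2, 3)), np.zeros(2)

h = layer_forward(W1, b1, x)                  # hidden layer
y = layer_forward(W2, b2, h)                  # output layer
print(y.shape)                                # (2,)
```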

  5. Introduction (cont.)
     Using neural networks: learning word vector representations

  6. Introduction (cont.)
     Using neural networks: learning word vector representations
     Word-level representation → sentence-level representation?

  7. Introduction (cont.)
     Using neural networks: learning word vector representations
     Word-level representation → sentence-level representation?
     One family of solutions:
     • Composition: using syntactic information

  8. Recursive AutoEncoder¹
     Given a sentence s, we can get its binary parse tree.
     ¹ R. Socher, E. H. Huang, J. Pennington, A. Y. Ng, and C. D. Manning. Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection. NIPS 2011.

  9. Recursive AutoEncoder¹
     Given a sentence s, we can get its binary parse tree.
     • Child nodes: c1, c2
     • Parent node: p = f(W_e [c1; c2] + b)
     • W_e: encoding weight matrix, f: activation function, b: bias

  10. Recursive AutoEncoder¹
     Given a sentence s, we can get its binary parse tree.
     • Child nodes: c1, c2
     • Parent node: p = f(W_e [c1; c2] + b)
     • W_e: encoding weight matrix, f: activation function, b: bias
     Training: encourage the decoded children to be close to the original child representations.
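A small sketch of one recursive-autoencoder step following the formulas above: encode two children into a parent, decode the parent back, and measure the reconstruction error that training would minimize. The dimension n, tanh activation, and squared-error objective are assumptions for illustration; see the cited NIPS 2011 paper for the actual model and training procedure.

```python
import numpy as np

n = 50                                        # word/phrase vector size (assumed)
rng = np.random.default_rng(0)
W_e = rng.normal(scale=0.1, size=(n, 2 * n))  # encoding weights
b_e = np.zeros(n)
W_d = rng.normal(scale=0.1, size=(2 * n, n))  # decoding weights
b_d = np.zeros(2 * n)

def encode(c1, c2):
    """Parent: p = f(W_e [c1; c2] + b)."""
    return np.tanh(W_e @ np.concatenate([c1, c2]) + b_e)

def reconstruction_error(c1, c2):
    """Decode p back to [c1'; c2'] and compare with the original children."""
    p = encode(c1, c2)
    c_rec = np.tanh(W_d @ p + b_d)
    return 0.5 * np.sum((c_rec - np.concatenate([c1, c2])) ** 2)

c1, c2 = rng.normal(size=n), rng.normal(size=n)
print(reconstruction_error(c1, c2))
```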

  11. Dependency-Tree-based RNNs²
     Given a sentence s, we can get its dependency tree. We then add a hidden node for each word node, giving the reformed tree d.
     ² R. Socher, A. Karpathy, Q. V. Le, C. D. Manning, and A. Y. Ng. Grounded Compositional Semantics for Finding and Describing Images with Sentences. TACL 2014.

  12. Dependency-Tree-based RNNs²
     Given a sentence s, we can get its dependency tree. We then add a hidden node for each word node, giving the reformed tree d.

  13. Dependency-Tree-based RNNs²
     Given a sentence s, we can get its dependency tree. We then add a hidden node for each word node, giving the reformed tree d.
     For each node h_i in the reformed tree:
       h_i = f(z_i)                                                    (1)
       z_i = (1 / l(i)) ( W_v x_i + Σ_{j ∈ C(i)} l(j) W_{pos(i,j)} h_j )   (2)
     where
     • x_i, h_i, z_i ∈ R^n and W_v, W_{pos(i,j)} ∈ R^{n×n}
     • l(i): the number of leaf nodes under h_i
     • C(i): the set of hidden nodes under h_i
     • pos(i, j): the position of h_j with respect to h_i, such as l1, r1
     • W_l = (W_{l1}, W_{l2}, ..., W_{l k_l}) ∈ R^{k_l × n × n}, W_r = (W_{r1}, W_{r2}, ..., W_{r k_r}) ∈ R^{k_r × n × n}
     • k_l, k_r: the maximum left and right widths in the dataset
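A rough sketch of equations (1) and (2): a node's hidden vector is a leaf-count-weighted combination of its own word vector (through W_v) and its children's hidden vectors (through position-specific matrices). The dimension, the two-position weight set, and the convention that a node counts itself as one leaf are illustrative assumptions rather than the paper's exact setup.

```python
import numpy as np

n = 8
rng = np.random.default_rng(0)
W_v = rng.normal(scale=0.1, size=(n, n))
W_pos = {"l1": rng.normal(scale=0.1, size=(n, n)),    # one matrix per child position
         "r1": rng.normal(scale=0.1, size=(n, n))}

def dt_rnn_node(x_i, children, f=np.tanh):
    """children: list of (position, h_j, l_j) with l_j = leaf count under h_j."""
    l_i = 1 + sum(l_j for _, _, l_j in children)       # leaves under node i (assumed convention)
    z_i = W_v @ x_i + sum(l_j * (W_pos[pos] @ h_j) for pos, h_j, l_j in children)
    return f(z_i / l_i), l_i                           # eq. (1): h_i = f(z_i)

# Tiny example: a head word with one left and one right dependent, both leaves.
x = rng.normal(size=n)
h_left, _ = dt_rnn_node(rng.normal(size=n), [])
h_right, _ = dt_rnn_node(rng.normal(size=n), [])
h_root, _ = dt_rnn_node(x, [("l1", h_left, 1), ("r1", h_right, 1)])
print(h_root.shape)                                    # (8,)
```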

  14. Tasks using RNNs I
     • R. Socher, A. Karpathy, Q. V. Le, C. D. Manning, and A. Y. Ng. 2014. Grounded Compositional Semantics for Finding and Describing Images with Sentences. TACL.
     • M. Luong, R. Socher, and C. D. Manning. 2013. Better Word Representations with Recursive Neural Networks for Morphology. In CoNLL.
     • R. Socher, J. Bauer, C. D. Manning, and A. Y. Ng. 2013a. Parsing with Compositional Vector Grammars. In ACL.
     • R. Socher, A. Perelygin, J. Wu, J. Chuang, C. Manning, A. Ng, and C. Potts. 2013d. Recursive Deep Models for Semantic Compositionality over a Sentiment Treebank. In EMNLP.
     • E. H. Huang, R. Socher, C. D. Manning, and A. Y. Ng. 2012. Improving Word Representations via Global Context and Multiple Word Prototypes. In ACL.
     • R. Socher, B. Huval, B. Bhat, C. D. Manning, and A. Y. Ng. 2012a. Convolutional-Recursive Deep Learning for 3D Object Classification. In NIPS.
     • R. Socher, B. Huval, C. D. Manning, and A. Y. Ng. 2012b. Semantic Compositionality Through Recursive Matrix-Vector Spaces. In EMNLP.

  15. Tasks using RNNs II
     • R. Socher, E. H. Huang, J. Pennington, A. Y. Ng, and C. D. Manning. 2011a. Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection. In NIPS.
     • R. Socher, C. Lin, A. Y. Ng, and C. D. Manning. 2011b. Parsing Natural Scenes and Natural Language with Recursive Neural Networks. In ICML.
     • R. Socher, J. Pennington, E. H. Huang, A. Y. Ng, and C. D. Manning. 2011c. Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions. In EMNLP.
     • R. Socher, C. D. Manning, and A. Y. Ng. 2010. Learning Continuous Phrase Representations and Syntactic Parsing with Recursive Neural Networks. In NIPS 2010 Deep Learning and Unsupervised Feature Learning Workshop.
     • R. Socher and L. Fei-Fei. 2010. Connecting Modalities: Semi-Supervised Segmentation and Annotation of Images Using Unaligned Text Corpora. In CVPR.
     • L.-J. Li, R. Socher, and L. Fei-Fei. 2009. Towards Total Scene Understanding: Classification, Annotation and Segmentation in an Automatic Framework. In CVPR.

  16. Outline
     • Recursive Neural Networks
     • RNNs for Factoid Question Answering
       – RNNs for Quiz Bowl
       – Experiments
     • RNNs for Anomalous Event Detection in Newswire

  17. RNNs for Factoid Question Answering
     • Paper: A Neural Network for Factoid Question Answering over Paragraphs, EMNLP 2014
     • Mohit Iyyer¹, Jordan Boyd-Graber², Leonardo Claudino¹, Richard Socher³, Hal Daumé III¹
       ¹ University of Maryland, Department of Computer Science and UMIACS
       ² University of Colorado, Department of Computer Science
       ³ Stanford University, Department of Computer Science

  18. Introduction
     Factoid Question Answering:
     • Given a description of an entity, identify the person, place, or thing discussed.
     Quiz Bowl:
     • A task: mapping natural-language text to entities
     • A challenging natural language problem with large amounts of diverse and compositional data
     • Example question shown on the slide; answer: the Holy Roman Empire
     → QANTA: a question-answering neural network with trans-sentential averaging

  19. Quiz Bowl
     • A game: mapping raw text to a large set of well-known entities
     • Questions: 4 to 6 sentences
     • Every sentence in a quiz bowl question is guaranteed to contain clues that uniquely identify its answer, even without the context of the previous sentences.
     • A property called "pyramidality": sentences early in a question contain harder, more obscure clues, while later sentences are "giveaways".
     • Answering the question correctly requires an actual understanding of the sentence.
     • Factoid answers: e.g., history questions ask players to identify specific battles, presidents, or events

  20. Quiz Bowl
     • A game: mapping raw text to a large set of well-known entities
     • Questions: 4 to 6 sentences
     • Every sentence in a quiz bowl question is guaranteed to contain clues that uniquely identify its answer, even without the context of the previous sentences.
     • A property called "pyramidality": sentences early in a question contain harder, more obscure clues, while later sentences are "giveaways".
     • Answering the question correctly requires an actual understanding of the sentence.
     • Factoid answers: e.g., history questions ask players to identify specific battles, presidents, or events
     Solutions: bag-of-words vs. recursive neural networks

  21. Outline
     • Recursive Neural Networks
     • RNNs for Factoid Question Answering
       – RNNs for Quiz Bowl
       – Experiments
     • RNNs for Anomalous Event Detection in Newswire
       – Neural Event Model (NEM)
       – Experiments

  22. How to represent question sentences?
     For a single sentence:
     • A sentence corresponds to a dependency tree.
     • Each node n is associated with a word w, a word vector x_w ∈ R^d, and a hidden vector h_n ∈ R^d.
     • Weights:
       W_r ∈ R^{d×d}: one matrix for each dependency relation r
       W_v ∈ R^{d×d}: used to incorporate x_w at a node into the node vector h_n
       (d = 100 in the experiments)
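A rough sketch of this kind of per-node update, with one weight matrix per dependency relation as described above. The relation names, initialization, bias term, and tanh activation are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

d = 100                                         # hidden/word vector size, as in the experiments
rng = np.random.default_rng(0)
W_v = rng.normal(scale=0.05, size=(d, d))       # incorporates the node's own word vector
b = np.zeros(d)
W_rel = {"nsubj": rng.normal(scale=0.05, size=(d, d)),   # one matrix per dependency relation
         "dobj":  rng.normal(scale=0.05, size=(d, d))}

def node_hidden(x_w, children, f=np.tanh):
    """children: list of (relation, h_child) pairs from the dependency tree."""
    z = W_v @ x_w + b
    for rel, h_child in children:
        z = z + W_rel[rel] @ h_child
    return f(z)

# Example: a verb node with a subject and an object dependent, both leaves.
x_subj, x_obj, x_verb = (rng.normal(size=d) for _ in range(3))
h_subj = node_hidden(x_subj, [])
h_obj = node_hidden(x_obj, [])
h_verb = node_hidden(x_verb, [("nsubj", h_subj), ("dobj", h_obj)])
print(h_verb.shape)                             # (100,)
```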

  23. How to represent a single sentence?
