Introduction to Artificial Intelligence Neural Networks - Deep Learning for NLP


  1. Introduction to Artificial Intelligence Neural Networks - Deep Learning for NLP (Janyl Jumadinova, November 21, 2016)

  2. Neural Networks

  3. Neural Networks

  4. Neural Networks. Neural computing requires a number of neurons to be connected together into a neural network. Neurons are arranged in layers.
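
A minimal sketch, assuming NumPy, of what one such neuron computes: a weighted sum of its inputs plus a bias, passed through an activation function (the inputs and weights below are illustrative, not taken from the slides).

    import numpy as np

    def neuron(x, w, b):
        # A single artificial neuron: weighted sum of the inputs plus a bias,
        # passed through a non-linear activation (a sigmoid here).
        z = np.dot(w, x) + b
        return 1.0 / (1.0 + np.exp(-z))

    # Illustrative inputs, weights, and bias (not from the slides)
    x = np.array([0.5, -1.0, 2.0])
    w = np.array([0.8, 0.2, -0.4])
    b = 0.1
    print(neuron(x, w, b))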

  5. Activation Functions. ◮ The activation function is generally non-linear. ◮ Linear functions are limited because the output is simply proportional to the input.

  6. Activation Functions
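
This slide plots activation functions; as a hedged illustration, here are three commonly used non-linear activations (sigmoid, tanh, and ReLU are assumed choices, not necessarily the ones plotted) next to a linear unit for contrast.

    import numpy as np

    def linear(z):
        return z                          # output proportional to input: limited

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))   # squashes values into (0, 1)

    def tanh(z):
        return np.tanh(z)                 # squashes values into (-1, 1)

    def relu(z):
        return np.maximum(0.0, z)         # zero for negative inputs

    z = np.linspace(-3.0, 3.0, 7)
    for f in (linear, sigmoid, tanh, relu):
        print(f.__name__, f(z).round(2))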

  7. Network structures. Feed-forward networks: ◮ Single-layer perceptrons ◮ Multi-layer perceptrons

  8. Feed-forward example
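
The feed-forward example on this slide is a figure; the sketch below shows one possible forward pass through a network with a single hidden layer (the layer sizes and random weights are assumptions for illustration).

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def feed_forward(x, W1, b1, W2, b2):
        # One forward pass: each layer computes an affine map of the previous
        # layer's activations followed by a non-linearity.
        h = sigmoid(W1 @ x + b1)     # hidden layer
        return sigmoid(W2 @ h + b2)  # output layer

    rng = np.random.default_rng(0)
    x = np.array([1.0, 0.0])                          # 2 inputs
    W1 = rng.normal(size=(3, 2)); b1 = np.zeros(3)    # 3 hidden units
    W2 = rng.normal(size=(1, 3)); b2 = np.zeros(1)    # 1 output unit
    print(feed_forward(x, W1, b1, W2, b2))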

  9. Single-layer Perceptrons. Output units all operate separately – no shared weights. Adjusting weights moves the location, orientation, and steepness of the cliff.
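
A sketch of "output units all operate separately, no shared weights": each output unit of a single-layer perceptron has its own weight vector and bias and is trained independently with the classic perceptron update (the AND/OR targets are an illustrative assumption).

    import numpy as np

    def train_perceptrons(X, Y, epochs=20, lr=0.1):
        # Single-layer perceptrons: one weight vector and bias per output unit,
        # each trained independently with the classic perceptron update.
        W = np.zeros((Y.shape[1], X.shape[1]))
        b = np.zeros(Y.shape[1])
        for _ in range(epochs):
            for x, y in zip(X, Y):
                pred = (W @ x + b > 0).astype(float)
                error = y - pred                    # per-output error; no shared weights
                W += lr * np.outer(error, x)
                b += lr * error
        return W, b

    # Toy targets (an assumption): the two outputs learn logical AND and OR.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    Y = np.array([[0, 0], [0, 1], [0, 1], [1, 1]], dtype=float)   # columns: AND, OR
    W, b = train_perceptrons(X, Y)
    print((X @ W.T + b > 0).astype(int))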

  10. Multi-layer Perceptrons. ◮ Layers are usually fully connected. ◮ Numbers of hidden units are typically chosen by hand.
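
A sketch of a small fully connected multi-layer perceptron trained by plain gradient descent on XOR, a task no single-layer perceptron can solve; the number of hidden units is picked by hand, as the slide notes (the hyperparameters and random seed are illustrative assumptions).

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # XOR is not linearly separable, so a single-layer perceptron cannot learn it;
    # a fully connected network with one hidden layer can.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    rng = np.random.default_rng(1)
    n_hidden = 4                                      # hidden units, chosen by hand
    W1 = rng.normal(size=(2, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(size=(n_hidden, 1)); b2 = np.zeros(1)
    lr = 1.0

    for _ in range(10000):
        # forward pass
        h = sigmoid(X @ W1 + b1)                      # hidden activations, shape (4, n_hidden)
        out = sigmoid(h @ W2 + b2)                    # outputs, shape (4, 1)
        # backward pass: squared-error loss, chain rule through the sigmoids
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

    print(out.round(3))   # approaches [0, 1, 1, 0] for most random initialisations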

  11. A neural network for learning word vectors. ◮ Idea: A word and its context form a positive training sample. ◮ A random word in that same context gives a negative training sample.

  12. A neural network for learning word vectors

  13. A neural network for learning word vectors. These are the word features we want to learn.

  14. A neural network for learning word vectors
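
One common way to realise the idea on slides 11-14 is skip-gram with negative sampling (word2vec-style); the sketch below is an assumed training rule with a hypothetical toy vocabulary, not code taken from the slides.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Hypothetical tiny vocabulary and embedding size, for illustration only.
    vocab = ["the", "cat", "sat", "on", "mat", "dog"]
    dim = 8
    rng = np.random.default_rng(0)
    W_in = rng.normal(scale=0.1, size=(len(vocab), dim))    # the word vectors being learned
    W_out = rng.normal(scale=0.1, size=(len(vocab), dim))   # context ("output") vectors

    def sgns_step(center, context, negative, lr=0.05):
        # One stochastic step: push the observed (word, context) pair towards
        # score 1 and a randomly drawn word in the same context towards score 0.
        v = W_in[center].copy()
        grad_v = np.zeros_like(v)
        for c, label in [(context, 1.0), (negative, 0.0)]:
            u = W_out[c]
            g = sigmoid(u @ v) - label       # gradient of the logistic loss w.r.t. the score
            grad_v += g * u
            W_out[c] -= lr * g * v
        W_in[center] -= lr * grad_v

    # "cat" observed near "sat" is a positive sample; "dog" is a sampled negative.
    sgns_step(vocab.index("cat"), vocab.index("sat"), vocab.index("dog"))
    print(W_in[vocab.index("cat")])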

  15. Deep Learning. ◮ Most current machine learning works well because of human-designed representations and input features. ◮ Machine learning becomes just optimizing weights to best make a final prediction. ◮ Deep learning algorithms attempt to learn multiple levels of representation of increasing complexity/abstraction.

  16. A Deep Architecture

  17. The Need for Distributed Representations. Current NLP systems are incredibly fragile because of their atomic symbol representations.
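
To see why atomic symbols are fragile, compare a one-hot encoding, in which every pair of distinct words is equally unrelated, with dense distributed vectors; the hotel/motel numbers below are illustrative, not learned.

    import numpy as np

    # Atomic (one-hot) symbols: every pair of distinct words is equally unrelated,
    # so knowing about "hotel" tells the system nothing about "motel".
    hotel_onehot = np.array([0, 1, 0, 0, 0])
    motel_onehot = np.array([0, 0, 0, 1, 0])
    print(hotel_onehot @ motel_onehot)      # 0: no notion of similarity

    # Distributed representations: dense vectors in which related words end up
    # close together (the values below are illustrative, not learned).
    hotel = np.array([0.28, -0.46, 0.81, 0.11])
    motel = np.array([0.31, -0.40, 0.77, 0.05])

    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    print(cosine(hotel, motel))             # close to 1: the words look similar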

  18. Handling the recursivity of human language

  19. Recursive Deep Learning: Building on Word Vector Space Models

  20. How should we map phrases into a vector space?
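
One answer explored in this line of work is a recursive composition function that maps two child vectors to a parent vector in the same space; the sketch below assumes a tanh composition of the concatenated children, with random placeholder parameters rather than learned ones.

    import numpy as np

    dim = 4
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(dim, 2 * dim))   # composition matrix
    b = np.zeros(dim)

    def compose(c1, c2):
        # Map two child vectors (words or phrases) to a parent vector
        # of the same dimensionality, so composition can be applied recursively.
        return np.tanh(W @ np.concatenate([c1, c2]) + b)

    # Illustrative word vectors (random placeholders; learned in practice).
    very, good, movie = (rng.normal(size=dim) for _ in range(3))

    very_good = compose(very, good)              # vector for the phrase "very good"
    very_good_movie = compose(very_good, movie)  # vector for "very good movie"
    print(very_good_movie)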
