
Recurrent Neural Networks (CS 6956: Deep Learning for NLP)



  1. Recurrent Neural Networks. CS 6956: Deep Learning for NLP

  2. Overview: 1. Modeling sequences 2. Recurrent neural networks: An abstraction 3. Usage patterns for RNNs 4. Bidirectional RNNs 5. A concrete example: The Elman RNN 6. The vanishing gradient problem 7. Long short-term memory units


  4. Recurrent neural networks • First introduced by Elman (1990) • Provide a mechanism for encoding sequences of arbitrary length into vectors that capture the sequential information • Currently perhaps among the most commonly used tools in the deep learning toolkit for NLP applications

  5. The RNN abstraction. A high-level overview that doesn't go into details. An RNN cell is a unit of differentiable compute that maps inputs to outputs. [Diagram: Input → RNN cell → Output]

  6. The RNN abstraction. A high-level overview that doesn't go into details. An RNN cell is a unit of differentiable compute that maps inputs to outputs. So far, there is no way to build a sequence of such cells. [Diagram: Input → RNN cell → Output]

  7. The RNN abstraction. A high-level overview that doesn't go into details. To allow these cells to be composed, they take a recurrent input from a previous such cell. [Diagram: Input, Recurrent input → RNN cell → Output]

  8. The RNN abstraction. A high-level overview that doesn't go into details. In addition to the output, they also produce a recurrent output that can serve as a memory of past states for the next such cell. [Diagram: Input, Recurrent input → RNN cell → Output, Recurrent output]

  9. The RNN abstraction. Conceptually, two operations: using the input and the recurrent input (also called the previous cell state), compute 1. the next cell state, and 2. the output.
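The two operations on this slide can be sketched in a few lines of Python. The specific update below (a tanh of linear maps, in the style of the Elman RNN covered later in the deck) and all dimensions and weights are illustrative assumptions, not something the slide specifies:

```python
import numpy as np

# Toy dimensions, chosen only for illustration.
input_dim, state_dim, output_dim = 4, 3, 2

rng = np.random.default_rng(0)
W = rng.normal(size=(state_dim, input_dim))   # input -> state
U = rng.normal(size=(state_dim, state_dim))   # state -> state
V = rng.normal(size=(output_dim, state_dim))  # state -> output

def R(prev_state, x):
    """Operation 1: compute the next cell state from the previous state and the input."""
    return np.tanh(U @ prev_state + W @ x)

def O(state):
    """Operation 2: compute the output from the current cell state."""
    return V @ state

x_t = rng.normal(size=input_dim)
s_prev = np.zeros(state_dim)   # the recurrent input, i.e. the previous cell state

s_next = R(s_prev, x_t)        # next cell state
y_t = O(s_next)                # output

print(s_next.shape, y_t.shape)  # (3,) (2,)
```

The only structural commitment here is the abstraction itself: a state-update function R and a readout function O; any differentiable choice of the two would fit the slide.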

  10. The RNN abstraction: A simple example. Sentence: "John lives in Salt Lake City". This template is unrolled for each input.

  11. [Diagram: the cell applied to the initial state and "John", producing Output 1. This computation graph is used here.]

  12. [Diagram: unrolled over "John lives", producing Outputs 1–2.]

  13. [Diagram: unrolled over "John lives in", producing Outputs 1–3.]

  14. [Diagram: unrolled over "John lives in Salt", producing Outputs 1–4.]

  15. [Diagram: unrolled over "John lives in Salt Lake", producing Outputs 1–5.]

  16. [Diagram: unrolled over "John lives in Salt Lake City", producing Outputs 1–6.]
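The unrolling in this example can be sketched as a plain loop over the tokens. The random embeddings and the tanh cell below are illustrative stand-ins (the slides do not specify either):

```python
import numpy as np

tokens = ["John", "lives", "in", "Salt", "Lake", "City"]

# Toy random embeddings stand in for real word vectors.
rng = np.random.default_rng(1)
input_dim, state_dim = 4, 3
emb = {tok: rng.normal(size=input_dim) for tok in tokens}

W = rng.normal(size=(state_dim, input_dim))
U = rng.normal(size=(state_dim, state_dim))

def R(prev_state, x):
    # Illustrative Elman-style state update.
    return np.tanh(U @ prev_state + W @ x)

state = np.zeros(state_dim)   # initial state
outputs = []
for tok in tokens:            # the same template, unrolled once per input
    state = R(state, emb[tok])
    outputs.append(state)     # Output 1 .. Output 6

print(len(outputs))  # 6
```

Each entry of `outputs` corresponds to one of the slide's Output 1–6 boxes; the final state summarizes the whole sentence.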

  17. The RNN abstraction. Sometimes this is represented as a "neural network with a loop". But really, when unrolled, there are no loops: just a big feedforward network. [Diagram: Input, Recurrent input → RNN cell → Output, Recurrent output]

  18. An abstract RNN: Notation
  • Inputs to cells: x_t at the t-th step. These are vectors.
  • Cell states (i.e., recurrent inputs and outputs): s_t at the t-th step. These are also vectors.
  • Outputs: y_t at the t-th step. These are also vectors.
  • At each step:
    – Compute the next cell state: s_{t+1} = R(x_t, s_t)
    – Compute the output: y_t = O(s_{t+1})

  21. An abstract RNN: Notation. Equivalently, re-indexing so that the state produced at step t is s_t:
    – Compute the next cell state: s_t = R(s_{t-1}, x_t)
    – Compute the output: y_t = O(s_t)

  22. An abstract RNN: Notation
  • At each step:
    – Compute the next cell state: s_t = R(s_{t-1}, x_t)
    – Compute the output: y_t = O(s_t)
  • Both of these functions can be parameterized. That is, they can be neural networks whose parameters are trained.
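As a sketch of what "parameterized" means here: in an illustrative Elman-style choice of R, the weight matrices and bias are exactly the trainable parameters (the dimensions below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
state_dim, input_dim = 3, 4

# The parameters of R: these are what training would adjust.
params = {
    "U": rng.normal(size=(state_dim, state_dim)),  # state -> state
    "W": rng.normal(size=(state_dim, input_dim)),  # input -> state
    "b": np.zeros(state_dim),                      # bias
}

def R(prev_state, x, p):
    # s_t = R(s_{t-1}, x_t), here as tanh(U s + W x + b).
    return np.tanh(p["U"] @ prev_state + p["W"] @ x + p["b"])

n_params = sum(v.size for v in params.values())
print(n_params)  # 3*3 + 3*4 + 3 = 24
```

The same holds for O: any differentiable parameterized map works, and gradients flow to its parameters through every unrolled step.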

  23. What does unrolling the RNN do?
  • At each step:
    – Compute the next cell state: s_t = R(s_{t-1}, x_t)
    – Compute the output: y_t = O(s_t)
  • We can write this as:
    – s_1 = R(s_0, x_1)
    – s_2 = R(s_1, x_2) = R(R(s_0, x_1), x_2), which encodes the sequence up to t = 2 into a single vector
    – s_3 = R(s_2, x_3) = R(R(R(s_0, x_1), x_2), x_3), which encodes the sequence up to t = 3
    – s_4 = R(s_3, x_4) = R(R(R(R(s_0, x_1), x_2), x_3), x_4), which encodes the sequence up to t = 4
    – ... and so on
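The equivalence between stepwise unrolling and the fully nested composition can be checked numerically. The tanh cell below is an illustrative assumption, not the slides' definition of R:

```python
import numpy as np

rng = np.random.default_rng(3)
state_dim, input_dim = 3, 4
U = rng.normal(size=(state_dim, state_dim))
W = rng.normal(size=(state_dim, input_dim))

def R(prev_state, x):
    # Illustrative state update: s_t = tanh(U s_{t-1} + W x_t).
    return np.tanh(U @ prev_state + W @ x)

s0 = np.zeros(state_dim)
x1, x2, x3 = (rng.normal(size=input_dim) for _ in range(3))

# Step-by-step unrolling ...
s = s0
for x in (x1, x2, x3):
    s = R(s, x)

# ... is the same computation as the nested composition from the slides:
s3_nested = R(R(R(s0, x1), x2), x3)

print(np.allclose(s, s3_nested))  # True
```

Both forms perform the identical sequence of operations, which is why "an RNN with a loop" and "a big feedforward network" describe the same computation graph.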
