Lecture 14: Recurrent Neural Networks (CS109B Data Science 2)


  1. Lecture 14: Recurrent Neural Networks CS109B Data Science 2 Pavlos Protopapas, Mark Glickman, and Chris Tanner

  2. Online lecture guidelines: We would prefer you have your video on, but it is OK if you have it off. We would prefer you use your real name. All lectures, labs, and a-sections will be live streamed and also available for viewing later on Canvas/Zoom. We will have course staff in the chat online; during lecture you can also use this spreadsheet to enter your own questions or 'up vote' those of your fellow students. Quizzes will be available for 24 hours.

  3. Outline: Why Recurrent Neural Networks (RNNs), Main Concept of RNNs, More Details of RNNs, RNN training, Gated RNN


  5. Background: Many classification and regression tasks involve data that is assumed to be independent and identically distributed (i.i.d.). For example: detecting lung cancer, face recognition, risk of heart attack.

  6. Background: Much of our data is inherently sequential. Examples, ranging from world scale to humanity to individual people: natural disasters (e.g., earthquakes), climate change, stock market, virus outbreaks, speech recognition, machine translation (e.g., English -> French), cancer treatment.

  7. Background: Much of our data is inherently sequential. Example: predicting earthquakes.

  8. Background: Much of our data is inherently sequential. Example: stock market predictions.

  9. Background: Much of our data is inherently sequential. Example: speech recognition. “What is the weather today?” / “What is the weather two day?” / “What is the whether too day?” / “What is, the Wrether to Dae?”

  10. Sequence Modeling: Handwritten Text. Input: image. Output: text. https://towardsdatascience.com/build-a-handwritten-text-recognition-system-using-tensorflow-2326a3487cd5

  11. Sequence Modeling: Text-to-Speech. Input: text. Output: audio.

  12. Sequence Modeling: Machine Translation. Input: text. Output: translated text.

  13. Outline: Why RNNs, Main Concept of RNNs (part 1), More Details of RNNs, RNN training, Gated RNN


  15. What can my NN do? Training: present examples to the NN and let it learn from them (e.g., labeled images of George, Mary, Tom, Suzie).

  16. What can my NN do? Prediction: given a new example, the NN predicts its label (e.g., George, Mary).

  17. What can my NN NOT do? WHO IS IT?

  18. Learn from previous examples over time.

  19. Recurrent Neural Network (RNN): the NN is shown a sequence of frames and outputs George.

  20. Recurrent Neural Network (RNN): "I have seen George moving in this way before." The NN outputs George. RNNs recognize the data's sequential characteristics and use patterns to predict the next likely scenario.

  21. Recurrent Neural Network (RNN). Sentence: "He told me I could have it." WHO IS HE? "I do not know. I need to know who said that and what he said before. Can you tell me more?" Our model requires context, or contextual information, to understand the subject (he) and the direct object (it) in the sentence.

  22. RNN – Another Example with Text. Dialogue: Hellen: "Nice sweater, Joe." Joe: "Thanks, Hellen. It used to belong to my brother and he told me I could have it." WHO IS HE? "I see what you mean now! The pronoun 'he' stands for Joe's brother, while 'it' stands for the sweater." After providing sequential information, the model recognizes the subject (Joe's brother) and the object (the sweater) in the sentence.

  23. Batch_size = 2048

  24. Sequences. We want a machine learning model to understand sequences, not isolated samples. Can an MLP do this? Assume we have a sequence of temperature measurements and we want to take 3 sequential measurements and predict the next one. The data is a single feature column indexed by sample: 1: 35, 2: 32, 3: 45, 4: 48, 5: 41, 6: 39, 7: 36, ...

  25. Sequences (continued). Highlighting the first window: samples 1-3 (35, 32, 45) are the 3 sequential measurements, and sample 4 (48) is the next one to predict.

  26. Sequences (continued). Sliding the window one sample at a time starts building the new dataset: the first two input columns are (35, 32), (32, 45), (45, 48), (48, 41).

  27. Sequences (continued). After the third slide of the window, each row of the new dataset holds 3 consecutive measurements: (35, 32, 45) from samples 1-3, (32, 45, 48) from samples 2-4, (45, 48, 41) from samples 3-5, (48, 41, 39) from samples 4-6.

  28. Sequences (continued). The complete windowed dataset: each row is a window of 3 consecutive measurements, (35, 32, 45), (32, 45, 48), (45, 48, 41), (48, 41, 39), and the value to predict is the measurement that follows each window. A sketch of this construction in code is shown below.
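To make the construction concrete, here is a minimal sketch (not from the slides) of building the overlapping windowed dataset with NumPy. The temperature values and the window size of 3 match the example above; the function name make_windows is my own.

```python
import numpy as np

# Temperature sequence from the slides (one measurement per time step).
temps = np.array([35, 32, 45, 48, 41, 39, 36])

def make_windows(series, window_size=3):
    """Slide a window over the series: each row of X holds `window_size`
    consecutive measurements, and y holds the measurement that follows."""
    X, y = [], []
    for i in range(len(series) - window_size):
        X.append(series[i:i + window_size])   # e.g. [35, 32, 45]
        y.append(series[i + window_size])     # e.g. 48
    return np.array(X), np.array(y)

X, y = make_windows(temps, window_size=3)
print(X)  # [[35 32 45] [32 45 48] [45 48 41] [48 41 39]]
print(y)  # [48 41 39 36]
```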

  29. Windowed dataset. This is called an overlapping windowed dataset, since we are windowing observations to create new ones. We can fit it easily using an MLP: 3 inputs, a hidden layer of 10 ReLU units, a second hidden layer of 10 ReLU units, and 1 output (see the sketch below). But re-arranging the order of the inputs, e.g., feeding each window's measurements in a scrambled order, will produce the same results.
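A minimal Keras sketch of the MLP described on the slide (3 inputs, two hidden layers of 10 ReLU units, 1 output), assuming the arrays X and y from the previous sketch. The training settings (optimizer, epochs) are illustrative, not from the slides.

```python
from tensorflow import keras
from tensorflow.keras import layers

# MLP matching the slide: 3 inputs -> 10 ReLU -> 10 ReLU -> 1 output.
model = keras.Sequential([
    layers.Input(shape=(3,)),
    layers.Dense(10, activation="relu"),
    layers.Dense(10, activation="relu"),
    layers.Dense(1),   # linear output here; the slide's diagram may use ReLU
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=200, verbose=0)

# The MLP treats the 3 inputs as an unordered feature vector: permuting the
# input columns consistently (and retraining) fits the data just as well,
# because the model has no built-in notion of temporal order.
```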

  30. Why not CNNs or MLPs? 1. MLPs/CNNs require fixed input and output size. 2. MLPs/CNNs can't classify inputs in multiple places.

  31. Windowed dataset. What follows after: "I got in the car and"? "drove away". What follows after: "In car the and I got"? Not obvious that it should be "drove away". The order of words matters; this is true for most sequential data. A fully connected network will not distinguish the order and will therefore miss some information.


  33. Outline: Why RNNs, Main Concept of RNNs, More Details of RNNs, RNN training, Gated RNN

  34. Memory. Somehow the computational unit should remember what it has seen before. The unit receives input Y_t and produces output Z_t; it should remember the earlier inputs Y_0, ..., Y_{t-1}.

  35. Memory. Somehow the computational unit should remember what it has seen before. The unit keeps an internal memory: input Y_t goes in, output Z_t comes out, and the internal memory persists inside the unit.

  36. Memory. Somehow the computational unit should remember what it has seen before. We'll call this information the unit's state. The RNN unit takes input Y_t, produces output Z_t, and keeps its state in internal memory.

  37. Memory. In neural networks, once training is over, the weights do not change. This means that the network is done learning and done changing. Then, we feed in values, and it simply applies the operations that make up the network, using the values it has learned. But the RNN units can remember new information after training has completed. That is, they're able to keep changing after training is over.

  38. Memory. Question: How can we do this? How can we build a unit that remembers the past? The memory or state could be written to a file, but in RNNs we keep it inside the recurrent unit, in an array or a vector! Let's work with an example: "Anna Sofia said her shoes are too ugly." Her here means Anna Sofia. "Nikolas put his keys on the table." His here means Nikolas.
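A minimal sketch (my own illustration, not the course's code) of a recurrent unit that keeps its memory in a vector: the hidden state is updated from the current input and the previous state, so information about earlier inputs persists from step to step. The dimensions and random weights are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
input_dim, state_dim = 4, 8

# Weights are fixed once training is over; the state vector is what keeps changing.
W_xh = rng.normal(scale=0.1, size=(state_dim, input_dim))  # input -> state
W_hh = rng.normal(scale=0.1, size=(state_dim, state_dim))  # state -> state
b = np.zeros(state_dim)

def step(x_t, h_prev):
    """One recurrent step: the new state mixes the current input with the
    previous state, so earlier inputs are 'remembered' in the state vector."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b)

h = np.zeros(state_dim)                       # initial state: nothing seen yet
for x_t in rng.normal(size=(5, input_dim)):   # a toy sequence of 5 inputs
    h = step(x_t, h)                          # the state carries information forward
```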
