Neural Translation with Pytorch
GTC 2017 JEREMY HOWARD @JEREMYPHOWARD
I'm assuming some knowledge of: Python, Jupyter, Numpy, word vectors, RNNs. course.fast.ai
Our destination
Some review today
https://github.com/jph00/part2
Created by Chris Callison-Burch. Crawled millions of web pages. Used 'a set of simple heuristics'.
Assume that these documents are translations of each other
Because we are translating at the word level, we need to tokenize the text first. There are many tokenizers available, but we found we got the best results using these simple heuristics.
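The exact heuristics aren't shown on the slide; a minimal regex-based word tokenizer in the same spirit (a hypothetical stand-in, not the tokenizer actually used in the talk) might look like:

```python
import re

def simple_tokenize(text):
    """Lowercase text and split it into word, number, and punctuation tokens.

    A hypothetical sketch of 'a set of simple heuristics' for word-level
    tokenization; real pipelines often use spaCy or Moses-style tokenizers.
    """
    text = text.lower()
    # Words (keeping apostrophe contractions intact), then digit runs,
    # then any single non-space, non-alphanumeric character.
    return re.findall(r"[a-z]+(?:'[a-z]+)?|\d+|[^\sa-z\d]", text)

print(simple_tokenize("Don't translate this, word-by-word!"))
# → ["don't", 'translate', 'this', ',', 'word', '-', 'by', '-', 'word', '!']
```

Punctuation becomes its own token so the model sees a small, consistent vocabulary instead of many rare word+punctuation variants.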
Unrolled stacked RNNs for sequences
[Diagram: unrolled stacked RNN — inputs word 1, word 2, word 3; arrows Input→Hidden, Hidden→Hidden, Hidden→Output]
Equivalent recursive diagram
[Diagram: recursive form — char n input; repeat for steps 1…n−1; hidden state initialized to zeros]
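The unrolled and recursive diagrams compute the same thing: initialize the hidden state to zeros, then apply the same cell at every timestep. A minimal sketch in pure Python (scalar hidden state and made-up weights for clarity, not real learned parameters):

```python
import math

def rnn_forward(inputs, w_in=0.5, w_hidden=0.8):
    """Run a minimal single-unit RNN over a sequence of scalar inputs.

    The hypothetical scalar weights stand in for the input->hidden and
    hidden->hidden weight matrices in the diagram; the structure
    (init to zeros, repeat the same cell) is the point.
    """
    h = 0.0  # hidden state initialized to zeros
    for x in inputs:  # repeat the same cell for each timestep
        h = math.tanh(w_in * x + w_hidden * h)
    return h

final_hidden = rnn_forward([1.0, 0.0, 1.0])
```

In practice the hidden state is a vector, the weights are matrices learned by backpropagation through the unrolled graph, and frameworks like PyTorch provide this loop as `nn.RNN`.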
This and following 3 slides thanks to Chris Manning (Stanford) https://simons.berkeley.edu/talks/christopher-manning-2017-3-27
* Equation from: “Grammar as a Foreign Language”