Code Completion with Neural Attention and Pointer Networks
Jian Li, Yue Wang, Irwin King, and Michael R. Lyu
The Chinese University of Hong Kong Presented by Ondrej Skopek
2 Credits: van Kooten, P. neural_complete. https://github.com/kootenpv/neural_complete. (2017). (illustrative image)
Goal: Predict out-of-vocabulary words using local context
3 Credits: Li, J., Wang, Y., King, I. & Lyu, M. R. Code Completion with Neural Attention and Pointer Networks. (2017).
○ Joint RNN ○ Mixture ○ Attention ○ Pointer network
5 Credits: Olah, C. Understanding LSTM Networks. colah’s blog (2015).
6 Credits: Olah, C. Understanding LSTM Networks. colah’s blog (2015).
7 Credits: Hochreiter, S. & Schmidhuber, J. Long Short-term Memory. Neural Computation 9, 1735–1780 (1997). Olah, C. Understanding LSTM Networks. colah’s blog (2015).
○ Cell state ○ Hidden state
○ Forget gate ○ New memory generation ○ Output gate
8 Credits: Olah, C. Understanding LSTM Networks. colah’s blog (2015).
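The pieces named on the slide can be sketched in a single NumPy LSTM step (a minimal illustration of Hochreiter & Schmidhuber's cell; the stacked weight layout and names are assumptions, not the paper's exact parameterization):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b, k):
    """One LSTM time step.

    x: input vector; h_prev/c_prev: previous hidden and cell state (size k).
    W: maps [x; h_prev] to the four stacked gate pre-activations (shape 4k x (d+k)).
    """
    z = W @ np.concatenate([x, h_prev]) + b  # all four pre-activations at once
    f = sigmoid(z[:k])                       # forget gate
    i = sigmoid(z[k:2*k])                    # input gate
    g = np.tanh(z[2*k:3*k])                  # new memory generation
    o = sigmoid(z[3*k:])                     # output gate
    c = f * c_prev + i * g                   # cell state update
    h = o * np.tanh(c)                       # hidden state
    return h, c
```

The forget gate scales the old cell state, the input gate scales the newly generated memory, and the output gate decides how much of the (squashed) cell state is exposed as the hidden state.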
9 Credits: Bahdanau, D., Cho, K. & Bengio, Y. Neural Machine Translation by Jointly Learning to Align and Translate. (2014).
10 Credits: QI, X. Seq2seq. https://xiandong79.github.io/seq2seq-基础知识. (2017). Bahdanau, D., Cho, K. & Bengio, Y. Neural Machine Translation by Jointly Learning to Align and Translate. (2014).
11 Credits: Vinyals, O., Fortunato, M. & Jaitly, N. Pointer Networks. (2015).
The attention distribution over input positions is used directly to select the next output token
12 Credits: Vinyals, O., Fortunato, M. & Jaitly, N. Pointer Networks. (2015). Bahdanau, D., Cho, K. & Bengio, Y. Neural Machine Translation by Jointly Learning to Align and Translate. (2014).
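A hedged sketch of the idea: additive (Bahdanau-style) attention scores over the L past hidden states, whose softmax a pointer network emits directly as a distribution over input positions. Shapes and weight names here are illustrative, not the paper's:

```python
import numpy as np

def pointer_distribution(H, h_t, W1, W2, v):
    """H: (L, k) past hidden states; h_t: (k,) current decoder state.
    Returns attention weights over the L input positions (sum to 1)."""
    scores = np.tanh(H @ W1.T + h_t @ W2.T) @ v  # additive attention scores, (L,)
    e = np.exp(scores - scores.max())            # numerically stable softmax
    return e / e.sum()
```

In ordinary attention these weights would only be used to build a context vector; a pointer network instead treats the argmax position itself as the prediction, so it can "copy" tokens that are out of vocabulary.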
○ Parsed using a context-free grammar
○ Non-leaf value: EMPTY, unknown value: UNK, end of program: EOF
14 Credits: Li, J., Wang, Y., King, I. & Lyu, M. R. Code Completion with Neural Attention and Pointer Networks. (2017).
○ Predict the “next” node
○ Two separate tasks (type and value)
○ In-order depth-first search + 2 bits of information on children/siblings
15 Credits: Li, J., Wang, Y., King, I. & Lyu, M. R. Code Completion with Neural Attention and Pointer Networks. (2017).
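The AST flattening described above can be sketched as follows (the node representation as dicts is an assumption for illustration; only the traversal order and the 2 extra bits, has-child and has-sibling, come from the slide):

```python
def flatten(node, has_sibling=False, out=None):
    """Flatten an AST (dicts with 'type', optional 'value', optional 'children')
    into a token sequence via depth-first traversal, recording 2 bits per node."""
    if out is None:
        out = []
    children = node.get("children", [])
    out.append({
        "type": node["type"],
        "value": node.get("value", "EMPTY"),  # non-leaf nodes get EMPTY
        "has_child": bool(children),
        "has_sibling": has_sibling,
    })
    for i, child in enumerate(children):
        flatten(child, has_sibling=i < len(children) - 1, out=out)
    return out
```

The two bits let the sequence be decoded back into a tree, so no structure is lost by serializing.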
○ Joint RNN ○ Mixture ○ Attention ○ Pointer network
○ L – input window size (L = 50)
○ V – vocabulary size (differs)
○ k – size of hidden state (k = 1500)
16 Credits: Vinyals, O., Fortunato, M. & Jaitly, N. Pointer Networks. (2015). Bahdanau, D., Cho, K. & Bengio, Y. Neural Machine Translation by Jointly Learning to Align and Translate. (2014).
○ Produce two distributions at time t
○ Condition on both the hidden state and context vector
○ Reuses Attention outputs
Credits: Vinyals, O., Fortunato, M. & Jaitly, N. Pointer Networks. (2015). Bahdanau, D., Cho, K. & Bengio, Y. Neural Machine Translation by Jointly Learning to Align and Translate. (2014).
○ A switcher combines the two distributions into one:
y_t = [s_t * w_t ; (1 - s_t) * l_t], where s_t = sigmoid(W_s [h_t; c_t] + b_s)
Credits: Li, J., Wang, Y., King, I. & Lyu, M. R. Code Completion with Neural Attention and Pointer Networks. (2017).
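A minimal sketch of the pointer mixture, assuming a scalar sigmoid switcher conditioned on the hidden state h_t and context vector c_t (weight names illustrative): it blends the vocabulary distribution w_t (size V) with the pointer distribution l_t (size L) into one distribution over V + L outcomes.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mixture(w_t, l_t, h_t, c_t, W_s, b_s):
    """Blend vocabulary distribution w_t and pointer distribution l_t."""
    s_t = sigmoid(W_s @ np.concatenate([h_t, c_t]) + b_s)   # scalar switch in (0, 1)
    return np.concatenate([s_t * w_t, (1.0 - s_t) * l_t])   # still sums to 1
```

Because s_t + (1 - s_t) = 1, the result is a single valid probability distribution, trainable end to end with ordinary cross-entropy.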
Data
○ http://plml.ethz.ch
○ Each program divided into segments of 50 consecutive tokens
○ Last segment padded with EOF
○ Type embedding (300 dimensions)
○ Value embedding (1200 dimensions)
Model & training parameters
○ Learning rate (with decay)
○ Initialized to 0
○ All other parameters ~ Unif([-0.05, 0.05])
20 Credits: Li, J., Wang, Y., King, I. & Lyu, M. R. Code Completion with Neural Attention and Pointer Networks. (2017).
Training conditions
○ UNK predictions are considered incorrect
Labels
○ If the target value appears in the context window, labeled with its attention position
○ If not, labeled as UNK
21 Credits: Li, J., Wang, Y., King, I. & Lyu, M. R. Code Completion with Neural Attention and Pointer Networks. (2017).
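The labeling rule above can be sketched as a small helper (picking the last occurrence in the window when the target appears more than once is an assumption for illustration):

```python
def make_label(target, vocab, window):
    """Label a target value: in-vocabulary id, else a pointer position
    into the context window, else UNK."""
    if target in vocab:
        return ("vocab", vocab[target])
    if target in window:
        # position of the last occurrence within the attention window
        return ("pointer", len(window) - 1 - window[::-1].index(target))
    return ("vocab", vocab["UNK"])
```

Only targets that are both out of vocabulary and absent from the window fall back to UNK, which is exactly the case the pointer component cannot rescue.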