  1. Neural Program Synthesis Rishabh Singh, Google Brain

  2. Great Collaborators!

  3. Deep Learning and Evolutionary Progression Vision Perceptual Tasks Speech Language Algorithmic Tasks Programming

  4. Neural Program Learning: More Complex Tasks, Generalizability, Interpretability

  5. Long-Term Vision: Agent to win programming contests [TopCoder]; Program Representations; Fuzzing/Security Testing; Program Repair; Program Optimization

  6. Neural Program Induction

  7. Differentiable Neural Computer [Graves et al. Nature 2016]

  8. Neural RAM [Kurach et al. ICLR 2016]: 14 modules; an LSTM controller chooses modules and arguments; differentiable semantics

  9. Neural Program Induction vs. Neural Program Synthesis vs. Meta-Neural Program Synthesis
     Induction: differentiable memory/stack; difficult to generalize; lots of examples; single-task learning; non-interpretable programs (e.g., NTM, DNC)
     Synthesis: functional abstractions; generalizes better; lots of examples; single-task learning; interpretable programs (e.g., QuickSort)
     Meta-synthesis: functional abstractions; strong generalization; few examples; multi-task learning; interpretable programs

  10. Spectrum of Program Meta-Induction. J. Devlin, R. Bunel, R. Singh, M. Hausknecht, P. Kohli [NIPS 2017]

  11. Neural Program Induction vs. Synthesis vs. Meta-Synthesis (comparison table repeated from slide 9)

  12. Neuro-Symbolic Program Synthesis [ICLR 2017] Emilio Parisotto, Abdelrahman Mohamed, Rishabh Singh, Lihong Li, Dengyong Zhou, Pushmeet Kohli

  13. FlashFill in Excel 2013. Gulwani, Harris, Singh [CACM Research Highlight 2012] (* taken from Gulwani, Polozov, Singh, NOW 2017)

  14. FlashFill DSL

  15. Example FlashFill Task
      Input (v)             -> Output
      William Henry Charles -> Charles, W.
      Michael Johnson       -> Johnson, M.
      Barack Rogers         -> Rogers, B.
      Martha D. Saunders    -> Saunders, M.
      Program: Concat(f1, ConstStr(", "), f2, ConstStr("."))
      f1 = SubStr(v, (Word, -1, Start), (Word, -1, End))
      f2 = SubStr(v, CPos(0), CPos(1))
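The synthesized program above can be sketched in Python. The helpers below are stand-ins for the DSL's SubStr operator; in particular, the regex-based last-word extraction is an assumption about the (Word, -1) token semantics.

```python
import re

def substr_last_word(v):
    # SubStr(v, (Word, -1, Start), (Word, -1, End)): the last word token.
    return re.findall(r"[A-Za-z]+", v)[-1]

def substr_first_char(v):
    # SubStr(v, CPos(0), CPos(1)): the first character of the input.
    return v[0]

def flashfill_program(v):
    # Concat(f1, ConstStr(", "), f2, ConstStr("."))
    return substr_last_word(v) + ", " + substr_first_char(v) + "."

for name in ["William Henry Charles", "Michael Johnson",
             "Barack Rogers", "Martha D. Saunders"]:
    print(flashfill_program(name))
```

Running this reproduces the Output column of the table above, e.g. "Martha D. Saunders" maps to "Saunders, M.".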

  16. General Methodology: a sampler over the DSL generates synthetic training data; a neural model is trained on it as the synthesizer
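The sampler-to-synthesizer pipeline can be illustrated with a toy sketch: sample random programs from a DSL, run each on random inputs, and keep (I/O examples, program) pairs as supervised training data. The expression DSL here is a made-up stand-in, not FlashFill.

```python
import random

random.seed(0)

def sample_program():
    # Sample a random arithmetic expression over x, 1, 0 (toy DSL).
    def expr(depth):
        if depth == 0 or random.random() < 0.5:
            return random.choice(["x", "1", "0"])
        return f"({expr(depth - 1)} + {expr(depth - 1)})"
    return f"({expr(2)} + {expr(2)})"

def make_example(prog):
    # Run the sampled program on a random input to get an I/O pair.
    x = random.randrange(10)
    return (x, eval(prog, {"x": x}))

training_data = []
for _ in range(5):
    prog = sample_program()
    ios = [make_example(prog) for _ in range(3)]
    training_data.append((ios, prog))

print(len(training_data))  # 5
```

Each entry pairs a small set of I/O examples with the program that produced them, which is exactly the supervision a neural synthesizer trains on.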

  17. Synthetic Training Data

  18. Real-world Test Data

  19. Neural Architecture: I/O examples encoder + tree decoder

  20. Key Idea: Guided Enumeration
      CFG/DSL: S -> e + e ; e -> x ; e -> 1 ; e -> 0
      Non-terminals = {S, e}; terminals = {x, 1, 0, +}
      Each step applies an action a_i (a production rule) to a non-terminal leaf of the partial tree: starting from S, applying S -> e + e, then e -> x and e -> 1, yields the program f(x) = x + 1.
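As a baseline for the guided enumeration above, a brute-force enumerator over this toy grammar fits in a few lines; the neural model's job is to avoid this exhaustive search by scoring expansions. Using `eval` to run candidate expressions is just a convenience for the sketch.

```python
import itertools

# Toy CFG from the slide: S -> e + e ; e -> x | 1 | 0
TERMINALS = ["x", "1", "0"]

def programs():
    # Expand S -> e + e by choosing a terminal for each 'e'.
    for left, right in itertools.product(TERMINALS, repeat=2):
        yield f"{left} + {right}"

def evaluate(prog, x):
    return eval(prog, {"x": x})

# I/O examples consistent with f(x) = x + 1
examples = [(0, 1), (3, 4)]
consistent = [p for p in programs()
              if all(evaluate(p, i) == o for i, o in examples)]
print(consistent)  # ['x + 1', '1 + x']
```

Even this tiny grammar yields 9 candidates; realistic DSLs blow up exponentially, which is why the enumeration must be guided.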

  21. Key Idea: Guided Enumeration
      Problem: How to assign probabilities to each action a_i such that the global state of the partial program tree is taken into account?

  22. Neural-Guided Enumeration: a neural model f(partial program tree, I/O examples) produces a probability distribution over the next expansion actions

  23. Two Key Challenges: (1) program representation, (2) I/O example representation

  24. Recursive-Reverse-Recursive Neural Network (R3NN)

  25. Recursive Pass. Input: distributed representations of each leaf's symbol. Output: a global root representation.

  26. Reverse-Recursive Pass. Input: the root representation from the recursive pass. Output: global leaf representations.

  27. Conditioning on I/O Examples 1. An LSTM produces a vector for each I/O pair. 2. All I/O pair LSTM vectors are combined into a single conditioning vector c. 3. The R3NN model takes additional input c when generating the program tree. The whole model is trained end-to-end.
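The three steps above can be sketched with NumPy. The random vectors stand in for the per-pair LSTM outputs, and element-wise max pooling is one plausible way to combine them into a single conditioning vector, not necessarily the exact combination used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-ins for step 1: one LSTM output vector per I/O pair.
pair_vectors = rng.normal(size=(4, 8))   # 4 I/O pairs, hidden size 8

# Step 2: pool the per-pair vectors into one conditioning vector c.
# Element-wise max pooling is an assumed choice for this sketch.
c = pair_vectors.max(axis=0)

# Step 3 (not shown): c is fed as an extra input to the R3NN decoder
# at every expansion step while generating the program tree.
print(c.shape)  # (8,)
```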

  28. Cross-Correlation Encoder
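One way to picture a cross-correlation encoder: slide the output string over the input and count character matches at each offset, so copied substrings produce a strong peak. The real encoder operates on learned embeddings rather than raw characters; this character-level version is only illustrative.

```python
# Character-level sketch of a cross-correlation I/O encoder: for each
# relative offset of the output against the input, count aligned
# character matches. Copied substrings show up as a large peak.
def char_cross_correlation(inp, out):
    feats = []
    for shift in range(-len(out) + 1, len(inp)):
        matches = sum(1 for j, ch in enumerate(out)
                      if 0 <= shift + j < len(inp) and inp[shift + j] == ch)
        feats.append(matches)
    return feats

feats = char_cross_correlation("hello world", "world")
print(max(feats))  # 5 -- "world" aligns fully at one offset
```

The peak tells the model that the output was copied from a particular position of the input, a very common pattern in string-transformation tasks.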

  29. Synthetic Data Results (programs with < 13 AST nodes)

  30. FlashFill Benchmarks: batching trees for larger programs; R3NN for contextual program embeddings

  31. RobustFill [ICML 2017]. J. Devlin, J. Uesato, S. Bhupatiraju, R. Singh, A. Mohamed, P. Kohli

  32. Multiple I/O Examples

  33. Extended DSL

  34. 92% Generalization Accuracy

  35. Robustness with Noise

  36. Incorrect Generalization

  37. Program Induction Model

  38. Induction vs Synthesis

  39. Other Synthesis Domains
      More complex DSLs: FlashFill (functional); Karel (imperative with control flow); Python & R scripts (stateful variables); grammar learning (CFGs & CSGs)
      Specification modalities: natural language (NL2Excel); partial programs (sketching)

  40. Karel the Robot Input Output Program

  41. Karel DSL
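A minimal sketch of a Karel-style imperative DSL with control flow, interpreted on a 1-D corridor rather than Karel's 2-D grid; all statement names here are illustrative stand-ins for the actual DSL.

```python
# Toy interpreter for a Karel-like DSL: statements are "move",
# "putMarker", and ("while_front_clear", body). The robot walks a
# 1-D corridor of world_len cells, dropping markers.
def run(program, world_len=5):
    pos, markers = 0, set()
    def front_is_clear():
        return pos < world_len - 1
    def execute(stmts):
        nonlocal pos
        for stmt in stmts:
            if stmt == "move" and front_is_clear():
                pos += 1
            elif stmt == "putMarker":
                markers.add(pos)
            elif isinstance(stmt, tuple) and stmt[0] == "while_front_clear":
                while front_is_clear():
                    execute(stmt[1])
    execute(program)
    return pos, sorted(markers)

# "while frontIsClear: putMarker; move", then a final marker at the wall.
prog = [("while_front_clear", ["putMarker", "move"]), "putMarker"]
print(run(prog))  # (4, [0, 1, 2, 3, 4])
```

The point of the domain is that loops and conditionals make the target programs far harder to synthesize than straight-line FlashFill expressions.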

  42. Synthesis Architecture: CNNs for the grid encoder, LSTMs for the program decoder

  43. Supervised Learning. Accuracy: Top-1 71.91, Top-5 80.00

  44. Multiple Consistent Programs Input Output

  45. Reinforcement Learning
      1. Supervised pre-training
      2. Sample a program from the model
      3. Run the program on the I/O examples
      4. Positive reward if the output matches
      Results (Top-1 / Top-5): Supervised 71.91 / 80.00; REINFORCE 71.99 / 74.11; Beam REINFORCE 77.68 / 82.73
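Steps 2-4 above can be sketched as a toy REINFORCE loop. Here the "model" is just a softmax over three hypothetical candidate programs rather than a seq2seq decoder, and the learning rate and I/O example are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Three hypothetical candidate programs; only x + 1 is correct.
programs = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 1]
io_example = (3, 4)                      # f(3) should be 4

logits = np.zeros(3)
for _ in range(200):
    probs = np.exp(logits) / np.exp(logits).sum()
    a = rng.choice(3, p=probs)           # 2. sample a program
    out = programs[a](io_example[0])     # 3. run it on the input
    reward = 1.0 if out == io_example[1] else 0.0  # 4. output matches?
    # REINFORCE gradient of log prob(a), scaled by the reward.
    grad = -probs
    grad[a] += 1.0
    logits += 0.5 * reward * grad

print(int(np.argmax(logits)))  # 0 -- the x + 1 program wins
```

Because the reward is exact output match rather than token-level likelihood, this objective directly optimizes the generalization metric the table above reports.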

  46. Stanford CS106a Test: 7/16 problems = 43% (neural vs. symbolic comparison)

  47. Neural Program Representations for Software Engineering Applications

  48. Fuzzing for Security Bugs: execute the binary on a seed input, apply random mutations, and keep inputs guided by coverage (e.g., AFL) until one triggers a crash
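The coverage-guided loop on this slide (AFL-style) can be sketched as follows. The target program and its coverage signal are toy stand-ins, not a real instrumented binary, and the "PNG" magic check is invented for illustration.

```python
import random

# Toy target: returns the set of coverage points it reached, and
# "crashes" (raises) on a specific magic prefix. Purely illustrative.
def target(data: bytes):
    cov = set()
    if data[:1] == b"P":
        cov.add("magic")
        if len(data) > 3 and data[1:4] == b"NG\x00":
            raise RuntimeError("crash!")
    return cov

def mutate(seed: bytes) -> bytes:
    # Random single-byte mutation of the seed.
    out = bytearray(seed or b"\x00")
    out[random.randrange(len(out))] = random.randrange(256)
    return bytes(out)

random.seed(0)
corpus, seen_cov, crashes = [b"AAAA"], set(), []
for _ in range(50000):
    data = mutate(random.choice(corpus))
    try:
        cov = target(data)
    except RuntimeError:
        crashes.append(data)          # record crashing inputs
        continue
    if cov - seen_cov:                # new coverage: keep as a seed
        seen_cov |= cov
        corpus.append(data)

print(sorted(seen_cov))
```

The key feedback loop is the coverage check: inputs that reach new code become seeds, so the fuzzer incrementally works its way past branch conditions that pure random testing almost never satisfies.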

  49. Neural Grammar-Based Fuzzing: more coverage, more bugs! Patrice Godefroid, Hila Peleg, Rishabh Singh. Learn&Fuzz: Machine Learning for Input Fuzzing. ASE 2017

  50. Learning Where to Fuzz: identify useful bytes from past fuzzing runs for more coverage and more crashes. Mohit Rajpal, William Blum, Rishabh Singh. Not all bytes are equal: Neural byte sieve for fuzzing.

  51. Neural Program Repair Sahil Bhatia, Rishabh Singh. Automated Correction for Syntax Errors in Programming Assignments using RNNs.

  52. Neural Programmer. Rishabh Singh, rising@google.com
      Specifications: input/output examples, natural language, partial programs
      Long-term vision: an agent to win programming contests [TopCoder]
      Neural architectures for program and spec representation
      Neural Synthesis [ICLR 2017, ICML 2017]; Neural Repair [ICSE 2018, ICLRW 2018]; Program Induction [NIPS 2017]; Neural Fuzzing [ASE 2017, arXiv 2017]
