Machine learning techniques to probe theoretical physics


  1. Akinori Tanaka (RIKEN AIP/iTHEMS) Machine learning techniques to probe theoretical physics

  2. Intro In inSPIRE, a search for "machine learning" OR "deep learning" with date 20xx → 20xx+1

  3. gives the number of results per year. Mainly experimental, a few theoretical. The "deep learning shock" in ILSVRC.

  4. Agenda: I. Reviews on Machine Learning, II. Reviews on selected papers, III. Summary

  5. I. Reviews on Machine Learning “supervised” “un-supervised” “reinforcement” ● Rough classification

  6. I. Reviews on Machine Learning “supervised” “un-supervised” “reinforcement” ● Rough classification

  7. I. Reviews on Machine Learning ● Rough classification "supervised": the "machine" maps each hand-written digit image to a label, e.g. "2", "5", "0". ↑ MNIST dataset

  8. I. Reviews on Machine Learning ● Rough classification "supervised": a well-trained machine reads the digit I wrote as 6; a bad machine reads it as 5. In general, we are trying to learn a "concept" f : X → Y.

  9. I. Reviews on Machine Learning “supervised” “un-supervised” “reinforcement” ● Rough classification

  10. I. Reviews on Machine Learning ● Rough classification "un-supervised": the "machine" sees the data with no answers given.

  11. I. Reviews on Machine Learning ● Rough classification "un-supervised", usage 1: feature extraction; a well-trained machine learns "local coupling consts" (called features), e.g. with an "RBM".

  12. I. Reviews on Machine Learning ● Rough classification “un-supervised” well trained machine 2. Generating data

  13. I. Reviews on Machine Learning ● Rough classification "un-supervised", usage 2: generating data. Joking demo 😝 following Bowman, et al. (2015): train on all hep-th titles (2016) and generate new ones, e.g. "towards a non-perturbative corrections to quantum gravity", "emergent entanglement entropy for 4d superconformal theory", "chiral transport and entanglement entropy in general relativity".

  14. I. Reviews on Machine Learning “supervised” “un-supervised” “reinforcement” ● Rough classification

  15. I. Reviews on Machine Learning ● Rough classification "reinforcement": the "machine" interacts with an environment, acting on it and receiving feedback.

  16. I. Reviews on Machine Learning ● Rough classification "reinforcement": the "machine" interacts with an environment, acting on it and receiving feedback.

  17. I. Reviews on Machine Learning ● Shock of Deep Learning. "supervised": ILSVRC top error rates (%) drop sharply once DL enters; "un-supervised": super-fine generated images, T. Karras, et al. (2017); "reinforcement": AlphaGo Zero, D. Silver, et al. (2017)

  18. I. Reviews on Machine Learning ● Deep Learning: the "machine" = multi-layer perceptron (MLP)

  19. I. Reviews on Machine Learning ● Deep Learning: the "machine" is a stack of layers; the linear layers carry the tunable params.

  20. I. Reviews on Machine Learning ● Deep Learning: non-linear activations are inserted between the linear layers.

  21. I. Reviews on Machine Learning ● Deep Learning: the whole "machine" is the map from input vector x to output vector y, y = f(x).

  22. I. Reviews on Machine Learning ● Deep Learning "Error function": for input x with network output y and answer d, E(y, d) = |y − d|²

  23. I. Reviews on Machine Learning ● Deep Learning "Error function": E(y, d) depends on the tunable params through the output y(x).

  24. I. Reviews on Machine Learning ● Deep Learning: over the data (x, d), update the params by gradient descent, W ← W − ε ∂_W E
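Slides 19 to 24 can be condensed into a runnable sketch: a two-layer perceptron (linear, tanh, linear), the error function E = |y − d|², and the update W ← W − ε ∂_W E. Everything concrete below (the toy sin target, layer sizes, learning rate) is our own illustration, not from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn d = sin(x) on [-pi, pi].
x = rng.uniform(-np.pi, np.pi, size=(256, 1))
d = np.sin(x)

# Two-layer perceptron: Linear -> tanh -> Linear.
W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)

eps = 0.05  # learning rate (the ε in W ← W − ε ∂_W E)
for step in range(2000):
    # Forward pass: alternate linear maps and nonlinearities.
    h = np.tanh(x @ W1 + b1)
    y = h @ W2 + b2
    E = np.mean((y - d) ** 2)          # error function E = |y − d|²

    # Backward pass (hand-written gradients for this small net).
    gy = 2 * (y - d) / len(x)
    gW2 = h.T @ gy; gb2 = gy.sum(0)
    gh = gy @ W2.T * (1 - h ** 2)
    gW1 = x.T @ gh; gb1 = gh.sum(0)

    # Gradient-descent update W ← W − ε ∂_W E.
    W2 -= eps * gW2; b2 -= eps * gb2
    W1 -= eps * gW1; b1 -= eps * gb1

print(f"final E = {E:.4f}")
```

In practice the frameworks on the next slide compute the backward pass automatically; it is written out here only to make the update rule explicit.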

  25. I. Reviews on Machine Learning ● Deep Learning: easy to start. TensorFlow (many users), Keras (easy to write), Chainer (Pythonic), and others… https://developer.nvidia.com/deep-learning-frameworks

  26. I. Reviews on Machine Learning ● Deep Learning Comments: 1. DL works very well. 2. MLP ≠ DL. 3. MLP + tips = DL. For more details: the most famous DL book, and a DL book by a string theorist.

  27. I. Reviews on Machine Learning ● hep-th & machine learning? 1. applications (hep-th ← stat.ML) 2. proposals (hep-th → stat.ML) 3. ?? (hep-th = stat.ML)

  28. Agenda: I. Reviews on Machine Learning; II. Reviews on selected papers: 1. TH ← ML, 2. TH → ML, 3. TH = ML

  29. 1. TH ← ML ● Drawing phase diagrams ● ML Landscape ● Supporting MC simulations

  30. 1. TH ← ML ● Drawing phase diagrams Idea: feed configurations generated by MC simulations (input x) into the "machine" and read off the phase (output y).

  31. 1. TH ← ML ● Drawing phase diagrams Carrasquilla, Melko (2016): input spin configurations sampled at T = 1.72, 2.50, 3.84, 5.00; output label "Cold" or "Hot" (Tc ≈ 2.27)

  32. 1. TH ← ML ● Drawing phase diagrams Carrasquilla, Melko (2016): Cold/Hot labels given explicitly during training; test acc > 90%. Tc = 2 / log(1 + √2) = 2.27…

  33. 1. TH ← ML ● Drawing phase diagrams Carrasquilla, Melko (2016): training on a known model, application to other (similar) models, e.g. with Tc = 4 / log(3) = 3.64…
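The Carrasquilla-Melko setup above can be imitated in miniature. As a hedged sketch we replace the true MC configurations with synthetic stand-ins (cold = mostly aligned spins, hot = random spins) and the deep net with a logistic regression on nearest-neighbour spin products; all names, sizes, and rates below are our own choices.

```python
import numpy as np

rng = np.random.default_rng(1)
L = 16  # linear lattice size

def config(hot):
    """Toy stand-in for an MC-generated Ising configuration."""
    if hot:                       # T >> Tc: spins essentially random
        return rng.choice([-1, 1], size=(L, L))
    s = rng.choice([-1, 1])       # T << Tc: mostly aligned, few flips
    c = np.full((L, L), s)
    flips = rng.random((L, L)) < 0.1
    return np.where(flips, -c, c)

def features(c):
    # Nearest-neighbour products; sign-symmetric, so a single
    # linear layer suffices for this toy cold/hot separation.
    return np.concatenate([(c * np.roll(c, 1, 0)).ravel(),
                           (c * np.roll(c, 1, 1)).ravel()])

X, y = [], []
for _ in range(200):
    hot = rng.random() < 0.5
    X.append(features(config(hot))); y.append(1.0 if hot else 0.0)
X, y = np.array(X), np.array(y)

# Logistic regression trained by gradient descent.
w = np.zeros(X.shape[1]); b = 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    g = p - y
    w -= 0.01 * X.T @ g / len(y); b -= 0.01 * g.mean()

acc = ((p > 0.5) == (y > 0.5)).mean()
print(f"train accuracy = {acc:.2f}")
```

Reported here is training accuracy only; the paper's point is that the trained classifier also transfers to configurations from other, similar models.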

  34. 1. TH ← ML ● Drawing phase diagrams AT, Tomiya (2016): train on data at T = 1.72, 2.50, 3.84, 5.00, update the weights W (decomposed as W = F_a), and draw the phase diagram from the trained machine.

  35. 1. TH ← ML ● Drawing phase diagrams AT, Tomiya (2016): the learned weights W = F_a, viewed as a function of T, locate the transition at Tc ~ 2.27.

  36. 1. TH ← ML ● Drawing phase diagrams Ohtsuki, Ohtsuki (2016): phases of 3D topological insulators; |ψ(x, y, z)|² → 2D image → 4-layered MLP; the trained phases are consistent with the transfer-matrix result.

  37. 1. TH ← ML ● Drawing phase diagrams ● ML Landscape ● Supporting MC simulations

  38. 1. TH ← ML ● Supporting MC simulations Goal: compute ∫_X dx P(x) O(x). Integrable if Aut(X) contains a "good" symmetry 😃; otherwise (usually) non-integrable → MC 😅

  39. 1. TH ← ML ● Supporting MC simulations How? For ∫_X dx P(x) O(x), sample x[i] ~ P(x) (i.i.d.); then (1/N) Σ_{i=1}^N O(x[i]) ≈ ⟨O⟩
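The i.i.d. sampling estimate above, in minimal form. Here P = standard normal and O(x) = x² are our own illustrative choices; the exact answer is ⟨x²⟩ = 1.

```python
import numpy as np

rng = np.random.default_rng(2)

# ⟨O⟩ = ∫ dx P(x) O(x) with P = standard normal, O(x) = x².
# Exact value: ⟨x²⟩ = 1.
N = 100_000
x = rng.normal(size=N)          # x[i] ~ P(x), i.i.d.
estimate = np.mean(x ** 2)      # (1/N) Σ_{i=1}^N O(x[i])
print(estimate)
```

For a generic P(x) one cannot sample i.i.d. directly, which is what motivates the Markov-chain construction on the next slide.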

  40. 1. TH ← ML ● Supporting MC simulations Build a chain x[0], x[1], …, x[N] whose distribution approaches P(x): from x[i], propose x̃[i+1] and set x[i+1] = Metropolis Test(x[i], x̃[i+1]); then (1/N) Σ_{i=1}^N O(x[i]) ≈ ⟨O⟩

  41. 1. TH ← ML ● Supporting MC simulations Caveat: successive samples x[0], x[1], …, x[N] are similar to each other (correlated), so (1/N) Σ_{i=1}^N O(x[i]) converges slowly.

  42. 1. TH ← ML ● Supporting MC simulations Ising Model: one-spin random flip. The autocorrelation Γ(τ) quantifies the similarity between configurations τ steps apart.
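A minimal sketch of the one-spin-flip Metropolis update and the autocorrelation Γ(τ) it produces, on a small 2D Ising model. The lattice size, temperature, and sweep counts are illustrative choices, not from the talk.

```python
import numpy as np

rng = np.random.default_rng(3)
L, T = 16, 2.0                 # lattice size, temperature (below Tc ≈ 2.27)
s = rng.choice([-1, 1], size=(L, L))

def sweep(s):
    # One Metropolis sweep: L*L single-spin random-flip proposals.
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        nb = s[(i+1)%L, j] + s[(i-1)%L, j] + s[i, (j+1)%L] + s[i, (j-1)%L]
        dE = 2 * s[i, j] * nb            # energy cost of flipping s[i, j]
        if dE <= 0 or rng.random() < np.exp(-dE / T):   # Metropolis test
            s[i, j] = -s[i, j]

mags = []
for t in range(600):
    sweep(s)
    if t >= 100:                          # discard thermalization sweeps
        mags.append(s.mean())
m = np.array(mags) - np.mean(mags)

def gamma(tau):
    # Normalized autocorrelation Γ(τ) of the magnetization series.
    return np.mean(m[:len(m)-tau] * m[tau:]) / np.mean(m * m)

print(gamma(1), gamma(50))
```

Near and below Tc, Γ(τ) decays slowly: consecutive one-spin-flip configurations are almost identical, which is exactly the problem the next slides address.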

  43. 1. TH ← ML ● Supporting MC simulations For the Ising model, faster updates than the one-spin random flip exist. Big picture: can ML provide fast updates?

  44. 1. TH ← ML ● Supporting MC simulations Self Learning Monte Carlo, Liu, Qi, Meng, Fu (2016): MC with global updates, driven by an effective Hamiltonian H_eff learned via ML to imitate H.

  45. 1. TH ← ML ● Supporting MC simulations Self Learning Monte Carlo, Liu, Qi, Meng, Fu (2016): from x[i], run a sub-chain x̃[0] → x̃[1] → … → x̃[n] updated by H_eff, then set x[i+1] = Metropolis Test(x̃[0], x̃[n])

  46. 1. TH ← ML ● Supporting MC simulations Self Learning Monte Carlo Liu, Qi, Meng, Fu (2016)

  47. 1. TH ← ML ● Supporting MC simulations Self Learning Monte Carlo, Liu, Qi, Meng, Fu (2016): H_eff(S) = E_0 − j_1 Σ_{<ij>_1} S_i S_j − j_2 Σ_{<ij>_2} S_i S_j − …; choose j_1, j_2, … s.t. |H_eff(S_data) − H(S_data)|² decreases. Nagai, Okumura, AT (2018): using an MLP instead, H_eff(S) = MLP(S) (QMC, S = vertices on the imaginary-time circle).
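The fitting step above is a least-squares problem: pick the couplings of H_eff(S) = E_0 − j_1 Σ_1 S_i S_j − j_2 Σ_2 S_i S_j so that |H_eff(S_data) − H(S_data)|² is minimized. The "true" Hamiltonian below is our own toy stand-in, chosen to lie exactly in the effective model class, so the fit should recover its couplings.

```python
import numpy as np

rng = np.random.default_rng(4)
L = 8

def nn_sum(s):   # Σ_{<ij>_1} S_i S_j  (nearest neighbours)
    return (s * np.roll(s, 1, 0)).sum() + (s * np.roll(s, 1, 1)).sum()

def nnn_sum(s):  # Σ_{<ij>_2} S_i S_j  (next-nearest, diagonal)
    return (s * np.roll(np.roll(s, 1, 0), 1, 1)).sum() + \
           (s * np.roll(np.roll(s, 1, 0), -1, 1)).sum()

def H_true(s):
    # Toy "expensive" Hamiltonian the effective model should imitate.
    return -1.0 * nn_sum(s) - 0.3 * nnn_sum(s) + 5.0

# Collect (features, energy) pairs on sample configurations S_data.
A, E = [], []
for _ in range(100):
    s = rng.choice([-1, 1], size=(L, L))
    A.append([1.0, -nn_sum(s), -nnn_sum(s)])
    E.append(H_true(s))
A, E = np.array(A), np.array(E)

# Least-squares fit of H_eff(S) = E_0 − j_1 Σ_1 S_i S_j − j_2 Σ_2 S_i S_j,
# i.e. minimize |H_eff(S_data) − H(S_data)|².
(E0, j1, j2), *_ = np.linalg.lstsq(A, E, rcond=None)
print(E0, j1, j2)
```

When the true H contains terms outside the ansatz, the fit instead returns the best effective couplings, and the outer Metropolis test corrects for the remaining mismatch.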

  48. 1. TH ← ML ● Supporting MC simulations Using Boltzmann Machines, Huang, Wang (2016); AT, Tomiya (2017): from x[i] (with the usual update giving x'[i]), the machine proposes x̃[i+1], and x[i+1] = Metropolis Test(x[i], x̃[i+1])

  49. 1. TH ← ML ● Supporting MC simulations Using Boltzmann Machines for scalar lattice QFT. Huang, Wang (2016); AT, Tomiya (2017)

  50. 1. TH ← ML ● Drawing phase diagrams ● ML Landscape ● Supporting MC simulations

  51. 1. TH ← ML ● ML Landscape Idea: the "machine" maps geometric data (toric diagram, polytope, …) to invariants (Hodge numbers h^{1,1}, h^{2,1}, Euler number χ, …); Landscape → SM-like theory?

  52. 1. TH ← ML ● ML Landscape ・CY3s ∈ WP^4, CICY3s, CICY4s, quivers (usage of a Mathematica package): He (2017) ・F-theory compactifications: Carifio, Halverson, Krioukov, Nelson (2017) ・Toric diagram → min(vol(SE)): Krefl, Seong (2017). Both MLP and non-MLP machines are used across these works.

  53. Agenda: I. Reviews on Machine Learning; II. Reviews on selected papers: 1. TH ← ML, 2. TH → ML, 3. TH = ML

  54. 2. TH → ML ● Boltzmann machines ("Boltzmann Machine", Hinton, Sejnowski (1983)): design H so that P_ising(x) = e^{−H(x)} / Z imitates P_hand-written(x), with n spins x_i ∈ {+1, −1}.

  55. 2. TH → ML ● Boltzmann machines Naive BM, Hinton, Sejnowski (1983): H = Σ_{i,j} x_i W_ij x_j. How to train W? → Maximize ⟨log(e^{−H} / Z)⟩_{P_true}, i.e. minimize the relative entropy to P_true (↑ hard to compute for non-local H).

  56. 2. TH → ML ● Boltzmann machines Restricted BM, Hinton, Sejnowski (1983): P(x, h) = e^{−H(x,h)} / Z with h ∈ Z_2, H(x, h) = x^T W h + B_x^T x + B_h^T h; integrate out h to get P(x) ≈ P_true(x).
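Integrating out the hidden units of an RBM can be done in closed form, because the sum over h factorizes over the hidden units. The sketch below uses a 0/1 hidden convention and the sign convention H(x, h) = −(x·W·h + B_x·x + B_h·h), both our own choices (the slide's Z_2 convention differs only by a reparametrization), and checks the closed form against a brute-force sum:

```python
import numpy as np

rng = np.random.default_rng(5)
nv, nh = 6, 4
W = rng.normal(0, 0.1, (nv, nh))
Bx = rng.normal(0, 0.1, nv)
Bh = rng.normal(0, 0.1, nh)

def free_energy(x):
    # Unnormalized log P(x) = log Σ_h e^{−H(x,h)} with
    # H(x, h) = −(x·W·h + Bx·x + Bh·h) and h_j ∈ {0, 1}:
    # the hidden sum factorizes, Σ_h Π_j e^{...} = Π_j (1 + e^{...}).
    return Bx @ x + np.sum(np.log1p(np.exp(x @ W + Bh)))

def free_energy_bruteforce(x):
    # Check: explicit sum over all 2^nh hidden configurations.
    total = 0.0
    for k in range(2 ** nh):
        h = np.array([(k >> j) & 1 for j in range(nh)])
        total += np.exp(x @ W @ h + Bx @ x + Bh @ h)
    return np.log(total)

x = rng.choice([0, 1], size=nv).astype(float)
print(free_energy(x), free_energy_bruteforce(x))
```

This closed-form marginal is what makes RBM training tractable compared to the naive BM, where the analogous sum does not factorize.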

  57. 2. TH → ML ● Boltzmann machines Riemann-Theta BM, Krefl, Carrazza, Haghighat, Kahlen (2016): P(x, h) = e^{−H(x,h)} / Z with h ∈ Z, H(x, h) = x^T W h + x^T Σ^{−1} x + h^T σ^{−1} h; integrating out h gives P(x) ∝ θ̃(x^T W | σ^{−1}), aiming at P(x) ≈ P_true(x). Cf. Hinton, Sejnowski (1983).

  58. Agenda: I. Reviews on Machine Learning; II. Reviews on selected papers: 1. TH ← ML, 2. TH → ML, 3. TH = ML
