Machine learning techniques to probe theoretical physics
Akinori Tanaka (RIKEN AIP/iTHEMS)
Intro

In inSPIRE, search “find t machine learning OR deep learning and date 20xx->20xx+1” and count the results per year: mainly experimental, a few theoretical. The surge traces back to the “deep learning shock” in ILSVRC.
Three paradigms: “supervised”, “un-supervised”, “reinforcement”.

“Supervised”: every input comes with an answer. Example: the MNIST dataset of hand-written digits, where the “machine” maps an image to a digit (2, 5, …). A well trained machine reads a hand-written 6 as 6; a bad machine reads it as 5 (“I wrote it” — demo with the speaker’s own digits). In general, supervised learning is trying to learn a “concept” from labelled examples.
“Un-supervised”: no answer given. The machine only sees samples, and a well trained machine learns the underlying distribution, extracting “local coupling consts” (called features), e.g. with an “RBM” (restricted Boltzmann machine).
Joking demo 😝: using a generative language model (Bowman, et al. (2015)) trained on all titles in hep-th (2016), one can generate fake titles such as “towards a non-perturbative corrections to quantum gravity”, “emergent entanglement entropy for 4d superconformal theory”, “chiral transport and entanglement entropy in general relativity”.
“Reinforcement”: the machine learns from rewards instead of given answers; the famous example is AlphaGo Zero.

What deep learning achieved: the ILSVRC top errors (%) dropped sharply once DL entered, and generative models now produce super fine generated images.
“machine” = Multi-layered perceptron (MLP): a composition of Linear maps (tunable params ∈ Linear) and Non-linear activations.

Each (input, answer d̃) pair contributes to the “Error function” over the params:

  E = Σ_Data |y − d̃|²

where y is the machine’s output. Training = gradient descent:

  W ← W − ε ∂W E
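The update rule above can be sketched end-to-end in plain NumPy. This is a minimal illustrative example, assuming a 1-hidden-layer tanh perceptron; the data, network size, and learning rate eps are my choices, not from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))        # inputs
d = np.sin(2 * X)                            # "answers" d~

W1 = rng.normal(0, 0.5, size=(1, 16))        # tunable params of Linear 1
W2 = rng.normal(0, 0.5, size=(16, 1))        # tunable params of Linear 2
eps = 0.05                                   # learning rate

for step in range(2000):
    h = np.tanh(X @ W1)                      # Linear -> Non-linear
    y = h @ W2                               # Linear output
    E = np.sum((y - d) ** 2)                 # "Error function" E = sum |y - d~|^2
    # gradients of E w.r.t. the params (backpropagation by hand)
    grad_y = 2 * (y - d)
    grad_W2 = h.T @ grad_y
    grad_h = grad_y @ W2.T
    grad_W1 = X.T @ (grad_h * (1 - h ** 2))
    # W <- W - eps * dE/dW
    W1 -= eps * grad_W1 / len(X)
    W2 -= eps * grad_W2 / len(X)

print(E / len(X))                            # mean squared error after training
```

The same loop is what TensorFlow/Keras/Chainer automate: they compute the gradients for you and apply the update `W ← W − ε ∂W E`.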
Easy to start: TensorFlow, Keras, Chainer, … (many users, easy to write, Pythonic, and others…)
https://developer.nvidia.com/deep-learning-frameworks

Comments / for more details: the most famous DL book, and a DL book by a string theorist.
“machine” for detecting phases
Carrasquilla, Melko (2016)

Idea: input = configurations generated by MC simulations at various T, labelled Cold / Hot using Tc = 2/log(1 + √2) = 2.27… (←given explicitly). Test acc > 90%.

Training on a Known Model → Application to Other (similar) Models: the same classifier locates the transition of a similar model with Tc = 4/log(3) = 3.64…
[figure: classifier output vs T, sampled around T = 1.72, 2.50, 3.84, 5.00]
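A toy version of this supervised setup can be sketched as follows. Assumptions of mine: instead of real MC configurations, “Cold” is mimicked by biased random spins and “Hot” by unbiased ones, and the classifier is a single logistic neuron on the magnetization:

```python
import numpy as np

rng = np.random.default_rng(1)
N, L = 400, 16                                 # samples per class, lattice size

cold = np.where(rng.random((N, L * L)) < 0.95, 1, -1)   # mostly aligned spins
hot = rng.choice([-1, 1], size=(N, L * L))              # random spins
X = np.vstack([cold, hot]).astype(float)
t = np.array([0] * N + [1] * N)                # labels: 0 = Cold, 1 = Hot

m = np.abs(X.mean(axis=1))                     # |magnetization| feature
w, b = 0.0, 0.0                                # logistic regression params
for _ in range(3000):
    p = 1 / (1 + np.exp(-(w * m + b)))         # predicted prob. of "Hot"
    w -= 0.5 * np.mean((p - t) * m)            # gradient descent on cross-entropy
    b -= 0.5 * np.mean(p - t)

acc = np.mean((p > 0.5) == t)
print(acc)                                     # easily exceeds the 90% quoted
```

Real experiments use full configurations and a neural network, but the mechanism is the same: labels come from a known Tc, and the trained classifier generalizes across temperatures.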
AT, Tomiya (2016)

Data: configurations drawn at each temperature T. Feed them to a network F_a with weight matrix W and update F, W during training. The trained weights W themselves encode the phase structure, reproducing Tc ~ 2.27 without Tc being given.
Ohtsuki, Ohtsuki (2016): a 4-layered MLP fed with |ψ(x, y, z)|² classifies the phases; the training result is consistent with the result by the transfer-matrix method.
Target: expectation values

  ∫_X dx P(x) O(x)

Integrable if Aut(X) ⊃ “good” symmetry; (usually) Non-integrable → MC 😄

Sampling x[i] ∼ P(x) (i.i.d.):

  ∫_X dx P(x) O(x) ≈ (1/N) Σ_{i=1}^{N} O(x[i])

In practice one builds a Markov chain x[0] → x[1] → … → x[N]: propose a change x[i] → x̃[i+1], then set x[i+1] = Metropolis Test(x[i], x̃[i+1]), which guarantees the chain samples ∼ P(x). Successive configurations are similar, similar, similar…, but the estimator is the same (1/N) Σ_{i=1}^{N} O(x[i]).
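The chain and the estimator can be sketched for the 2d Ising model. Assumptions of mine: J = 1, periodic boundary, T = 5.0 (well above Tc), and O = magnetization per spin; none of these values are fixed by the talk:

```python
import numpy as np

rng = np.random.default_rng(2)
L, T, N = 8, 5.0, 20_000                 # lattice size, temperature, chain length
x = rng.choice([-1, 1], size=(L, L))     # x[0]: random initial configuration

def local_dE(s, i, j):
    # energy change if spin (i, j) is flipped (nearest neighbours, periodic)
    nn = s[(i+1) % L, j] + s[(i-1) % L, j] + s[i, (j+1) % L] + s[i, (j-1) % L]
    return 2 * s[i, j] * nn

O_sum = 0.0
for step in range(N):
    i, j = rng.integers(L, size=2)       # propose x~[i+1]: one random spin flip
    dE = local_dE(x, i, j)
    if rng.random() < np.exp(-dE / T):   # Metropolis test -> samples P ~ e^{-H/T}
        x[i, j] *= -1                    # accept; otherwise keep x[i]
    O_sum += x.mean()                    # accumulate O(x[i])

print(O_sum / N)                         # (1/N) sum_i O(x[i]); ~0 at T >> Tc
```

Because each step changes at most one spin, neighbouring samples are strongly correlated; that correlation is exactly the autocorrelation Γ(τ) discussed next.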
Ising Model with one-spin random flips ⇒ large Autocorrelation (similarity) Γ(τ) ⇒ ∃ faster updates.
Self Learning Monte Carlo
Liu, Qi, Meng, Fu (2016)

MC w/ global update designed by ML: from x̃[0] = x[i], run n cheap updates by Heff to reach x̃[n], then accept it against the exact model: x[i+1] = Metropolis Test(x̃[0], x̃[n]).
Using MLP
Nagai, Okumura, AT (2018)

  Heff(S) = E0 − j1 Σ_{<ij>_1} SiSj − j2 Σ_{<ij>_2} SiSj − …

choose E0, j1, j2, … s.t. |Heff(Sdata) − H(Sdata)|² decreases; more generally Heff(S) = MLP(S) (QMC, S = vertices on the imaginary-time circle).
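The fitting step for the effective couplings can be sketched as a linear least-squares problem. Assumptions of mine: only E0 and j1 are kept, and the “true” energies are a toy linear stand-in rather than output of the full model:

```python
import numpy as np

rng = np.random.default_rng(3)
L, M = 8, 200                                    # lattice size, number of configs

def nn_sum(S):
    # sum over nearest-neighbour pairs <ij>_1 (periodic boundary)
    return np.sum(S * np.roll(S, 1, axis=0)) + np.sum(S * np.roll(S, 1, axis=1))

samples = [rng.choice([-1, 1], size=(L, L)) for _ in range(M)]
C1 = np.array([nn_sum(S) for S in samples])      # feature: sum_<ij> S_i S_j
H_data = 3.0 - 1.5 * C1                          # stand-in "true" energies H(Sdata)

# linear least squares for (E0, j1) in Heff = E0 - j1 * C1,
# i.e. minimize |Heff(Sdata) - H(Sdata)|^2
A = np.stack([np.ones(M), -C1], axis=1)
(E0, j1), *_ = np.linalg.lstsq(A, H_data, rcond=None)
print(E0, j1)                                    # recovers 3.0, 1.5 here
```

With more couplings j2, j3, … one adds more feature columns; replacing the linear ansatz by Heff(S) = MLP(S) turns the same fit into neural-network training.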
Using Boltzmann Machines
Huang, Wang (2016); AT, Tomiya (2017)

Replace the proposal of the usual update: draw the candidate x̃[i+1] from a trained Boltzmann machine instead of a local change, then x[i+1] = Metropolis Test(x[i], x̃[i+1]). AT, Tomiya (2017) apply this to scalar lattice QFT.
“machine” for string-theory data
He (2017); Carifio, Halverson, Krioukov, Nelson (2017)

Idea: input = geometric data (polytope, toric diagram, …) → output = invariants h^{1,1}, h^{2,1}, χ, …
・Usage of Mathematica packages
・CY3s ∈ WP^4, CICY3s, CICY4s, Quivers (MLP and not-MLP approaches)
・F-theory compactifications

Krefl, Seong (2017): MLP, toric diagram → min(vol(SE))
“Boltzmann Machine”
Hinton, Sejnowski (1983)

Design H so that P(x) = e^{−H(x)}/Z imitates the data, e.g. Phand-written(x) ≈ Pising(x) = e^{−H(x)}/Z, with spins x ∈ {+1, −1}.

Naive BM:

  H = Σ_{i,j} xi Wij xj

How to train W? → Maximize ⟨log(e^{−H}/Z)⟩_{Ptrue} (equivalently, minimize the relative entropy) — ↑hard to compute (for non-local H).

Restricted BM: introduce hidden units h ∈ Z2 and integrate them out:

  P(x, h) = e^{−H(x,h)}/Z,  H(x, h) = x^T W h + B_x^T x + B_h^T h

so that P(x) = Σ_h P(x, h) approximates Ptrue(x).
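The “integrate out h” step is tractable for an RBM because the hidden sum factorizes. A small check, assuming the common convention H(x, h) = −x^T W h − b^T x − c^T h with binary units (the talk’s signs and bias names may differ):

```python
import numpy as np

rng = np.random.default_rng(4)
nv, nh = 4, 3                                # visible / hidden sizes
W = rng.normal(0, 0.1, size=(nv, nh))
b = rng.normal(0, 0.1, size=nv)
c = rng.normal(0, 0.1, size=nh)

def unnorm_p(x):
    # closed form after integrating out h in {0,1}^nh:
    # P(x) ∝ e^{b.x} * prod_j (1 + e^{c_j + (x^T W)_j})
    return np.exp(b @ x) * np.prod(1 + np.exp(c + x @ W))

def unnorm_p_bruteforce(x):
    # explicit sum over all 2^nh hidden configurations
    total = 0.0
    for k in range(2 ** nh):
        h = np.array([(k >> j) & 1 for j in range(nh)])
        total += np.exp(x @ W @ h + b @ x + c @ h)
    return total

x = rng.integers(0, 2, size=nv)
print(unnorm_p(x), unnorm_p_bruteforce(x))   # the two agree
```

The Riemann-Theta BM discussed next plays the same game with continuous x, where the hidden sum produces a Riemann-Theta function instead of this product.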
Riemann-Theta BM
Krefl, Carrazza, Haghighat, Kahlen (2016)

Continuous visible units: P(x, h) = e^{−H(x,h)}/Z with

  H(x, h) = x^T W h + x^T Σ^{−1} x + h^T σ^{−1} h

Integrating out h gives P(x) ∝ θ̃(x^T W | σ^{−1}), a Riemann-Theta function, so P(x) is known in closed form.
AdS/CFT correspondence
Maldacena (1997); Aharony, Bergman, Jafferis, Maldacena (2008)

Gravity (gauge, scalar, D-brane ingredients) ↔ QFT: D3 → IIB/SYM; D2 → M2 → M/SCS.

More generally, the “holographic principle”: Gravity A ↔ QFT B even with no stringy picture (?), e.g. Kitaev (2015): AdS2 + dilaton / random ψ4 model.

(Assuming the holographic principle) ML: QFT data → bulk gravity, checked for consistency.
Hashimoto, Sugishita, Tanaka, Tomiya (2018)
Hashimoto, Sugishita, Tanaka, Tomiya (2018)

Bulk setup:

  S[φ] = ∫ dx √(−det g) [ −(1/2)(∂µφ)² − (1/2) m² φ² − (1/4!) λ φ⁴ ]

“Parameters” for the theory: mass m², 4-point coupling λ, metric g — the bulk action is acquired after training.

EOM + metric ansatz = ‘BH’-like metric:

  ds² = −f(η) dt² + dη² + g(η) d⃗x²

with the radial coordinate η running from the boundary (bdry) to the horizon, and b.c. imposed at both ends.

The discretized EOM is itself an “MLP”: input (φin, πin) at the boundary, propagate layer by layer in η with tunable parameters m², λ, h(← gµν), output (φout, πout) at the horizon. “Answer” = whether the horizon condition holds (= 0 or not), labelling outputs Good / Bad. Feed data ↓, train ↓ → m̃², λ̃, h̃, + smoothing regularization so that the learned metric is smooth.
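The “discretized EOM = MLP” idea can be sketched as a forward pass. Assumptions of mine: a schematic EOM ∂ηφ = π, ∂ηπ = h(η)π + m²φ + (λ/3!)φ³ integrated with Euler steps, so the signs, factors, and values (n_layers, m2 = −1, …) are illustrative, not the paper’s:

```python
import numpy as np

n_layers, d_eta = 10, 0.1         # one network layer per eta-step
m2, lam = -1.0, 0.1               # mass^2 and 4-point coupling (toy values)
h = np.zeros(n_layers)            # trainable per-layer metric values h(eta_k)

def forward(phi_in, pi_in):
    # propagate (phi, pi) from the boundary to the horizon
    phi, pi = phi_in, pi_in
    for k in range(n_layers):     # one MLP "layer" = one Euler step in eta
        phi, pi = (phi + d_eta * pi,
                   pi + d_eta * (h[k] * pi + m2 * phi + lam / 6 * phi**3))
    return phi, pi                # (phi_out, pi_out) at the horizon

phi_out, pi_out = forward(0.1, 0.0)
# training would tune h (and m2, lam) so that the horizon condition
# (the "answer" = 0 or not) is met for the input data
print(phi_out, pi_out)
```

The only non-standard feature compared with an ordinary MLP is that the “weights” h(η), m², λ have a direct physical meaning, so the trained network can be read back as a bulk metric and action.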
Experiment: input the H–M (field vs magnetization) response data of the boundary theory; “Answer” = Zero/non-zero; ∆±, l, L enter the setup. With the smooth regularization + 1/η regularization: experimental data → metric, mass, coupling.