Tensor Networks for Generative Modeling: From Boltzmann machines to Born machines, and back (PowerPoint PPT presentation)




SLIDE 1

Institute of Physics, Chinese Academy of Sciences, Beijing

Tensor Networks for Generative Modeling

Lei Wang

https://wangleiphy.github.io

From Boltzmann machines to Born machines, and back

Han, Wang, Fan, LW, Zhang, PRX ’18; Chen, Cheng, Xie, LW, Xiang, PRB ’18; Cheng, Chen, LW, Entropy ’18 + unpublished

SLIDE 2

Ψ

Quantum circuits architecture and parametrization

TNSAA: Live long and prosper

Neural networks and Probabilistic graphical models

SLIDE 3

TNSAA in the era of quantum computing

Kim and Swingle, 1711.07500

|ψ_{i+1}⟩ = |ψ_i⟩ |0⟩ |0⟩ |0⟩ |0⟩ ⋯

Noise-resilient quantum circuits with TNS architecture

+ quantum tomography, quantum annealing, quantum error correction, holographic duality, and more

Simulation and validation of quantum circuits

Villalonga et al, 1811.09599

SLIDE 4

Tensor Networks Graphical Models

  • Widely used in speech recognition, bioinformatics, cryptography…
  • Learning, inference, and sampling algorithms (forward-backward message passing, Viterbi, Baum-Welch) => Think of positive MPS

Chen, Cheng, Xie, LW, Xiang, 1701.04831 Critch, Morton, 1210.2812 Robeva, Seigal, 1710.01437

TNSAA in the era of deep learning

Temme, Verstraete,1003.2545

Hidden Markov Model

Transition p(h_t | h_{t−1}) and emission p(o_t | h_t)

[Diagram: chain of hidden states emitting observations]

Zhao, Jaeger, Neural Comput. 2010 Bailly, J. Mach. Learn. Res. 2011
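The correspondence can be made concrete: the forward algorithm of a hidden Markov model is exactly a left-to-right contraction of a matrix product state with non-negative entries. A minimal sketch, with made-up transition and emission tables (the numbers are illustrative, not from the slides):

```python
import numpy as np
from itertools import product

# Toy HMM with 2 hidden states and 2 observation symbols (illustrative numbers).
T = np.array([[0.7, 0.3],
              [0.4, 0.6]])      # T[i, j] = p(h_t = j | h_{t-1} = i)
E = np.array([[0.9, 0.1],
              [0.2, 0.8]])      # E[i, o] = p(o_t = o | h_t = i)
pi = np.array([0.5, 0.5])       # initial distribution p(h_1)

def likelihood(obs):
    """Forward algorithm = sequential contraction of a positive MPS.

    The "MPS matrix" for symbol o is A[o] = T @ diag(E[:, o]); the
    likelihood is a product of these non-negative matrices.
    """
    alpha = pi * E[:, obs[0]]               # alpha_1(h) = p(h_1) p(o_1 | h_1)
    for o in obs[1:]:
        alpha = (alpha @ T) * E[:, o]       # one contraction step per symbol
    return float(alpha.sum())

# Normalization check: likelihoods of all length-3 sequences sum to one.
total = sum(likelihood(seq) for seq in product([0, 1], repeat=3))
```

The same loop, read as tensor algebra, is the contraction of a positive MPS with physical legs fixed to the observed symbols.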

SLIDE 5

Unleash the power of tensor network states and algorithms for generative modeling

SLIDE 6

Discriminative and generative learning

  • Generative learning models the joint distribution p(x, y); discriminative learning models the mapping y = f(x) or the conditional p(y|x)

SLIDE 7

Discriminative and generative learning

“What I cannot create, I do not understand” (Richard Feynman)

SLIDE 8

Generated Art

https://www.christies.com/Features/A-collaboration-between-two-artists-one-human-one-a-machine-9332-1.aspx

$432,500 25 October 2018 Christie’s New York

SLIDE 9

Generated Art

https://www.christies.com/Features/A-collaboration-between-two-artists-one-human-one-a-machine-9332-1.aspx

[Diagram: Gaussian noise → generative network → generated image]

$432,500 25 October 2018 Christie’s New York

SLIDE 10

Generate Molecules

[Diagram: simple distributions over latent attributes ↔ complex distribution over physical configurations; Generate and Inference map between them]

Sanchez-Lengeling & Aspuru-Guzik, Science 2018

SLIDE 11

p(x)


Probabilistic Generative Modeling

How to express, learn, and sample from a high-dimensional probability distribution ?

  • “random” images vs. “natural” images

SLIDE 12

p(x)


Probabilistic Generative Modeling

How to express, learn, and sample from a high-dimensional probability distribution ?

  • “random” images vs. “natural” images

Page 159 “… the images encountered in AI applications occupy a negligible proportion of the volume of image space.”

SLIDE 13

p(x)


Probabilistic Generative Modeling

How to express, learn, and sample from a high-dimensional probability distribution ?

https://blog.openai.com/generative-models/

latent space

SLIDE 14

Boltzmann Machines

statistical physics

p(x) = e^{−E(x)} / Z

SLIDE 15

Generative Modeling using Boltzmann Machines

[Diagram: visible units x_1 x_2 … x_N coupled to hidden units h_1 h_2 h_3 … h_M]

Learn

Negative log-likelihood loss  ℒ = −(1/|𝒟|) ∑_{x∈𝒟} ln p(x)


SLIDE 17

Generate

Generative Modeling using Boltzmann Machines

[Diagram: visible units x_1 x_2 … x_N coupled to hidden units h_1 h_2 h_3 … h_M]

Learn

Negative log-likelihood loss  ℒ = −(1/|𝒟|) ∑_{x∈𝒟} ln p(x)

SLIDE 18

Generate

Generative Modeling using Boltzmann Machines

[Diagram: visible units x_1 x_2 … x_N coupled to hidden units h_1 h_2 h_3 … h_M]

Learn

Negative log-likelihood loss  ℒ = −(1/|𝒟|) ∑_{x∈𝒟} ln p(x) = ⟨E(x)⟩_{x∼𝒟} + ln Z

SLIDE 19

Generate

Generative Modeling using Boltzmann Machines

[Diagram: visible units x_1 x_2 … x_N coupled to hidden units h_1 h_2 h_3 … h_M]

Learn

Negative log-likelihood loss  ℒ = −(1/|𝒟|) ∑_{x∈𝒟} ln p(x) = ⟨E(x)⟩_{x∼𝒟} + ln Z

Gradient  ∇ℒ = ⟨∇E⟩_{x∼𝒟} − ⟨∇E⟩_{x∼p(x)}
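For a model small enough to enumerate, the identity ℒ = ⟨E⟩_{x∼𝒟} + ln Z and the positive-minus-negative-phase structure of the gradient can be checked directly. A minimal sketch for a toy fully visible Boltzmann machine; the couplings and the "dataset" are invented for illustration:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
N = 4
W = rng.normal(scale=0.5, size=(N, N))   # illustrative symmetric couplings
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)

states = np.array(list(product([-1.0, 1.0], repeat=N)))   # all 2^N states

def energy(x):
    return -0.5 * x @ W @ x

E_all = np.array([energy(s) for s in states])
Z = np.exp(-E_all).sum()                 # exact partition function (tiny N only)
p_model = np.exp(-E_all) / Z

data = states[[0, 3, 5]]                 # toy dataset

# NLL two ways: directly, and via the identity on the slide.
nll_direct = -np.mean(np.log([np.exp(-energy(x)) / Z for x in data]))
nll_formula = np.mean([energy(x) for x in data]) + np.log(Z)

# Gradient = data (positive) phase minus model (negative) phase,
# using dE/dW_ij = -x_i x_j / 2 for this parametrization.
pos = np.einsum('ni,nj->ij', data, data) / len(data)
neg = np.einsum('n,ni,nj->ij', p_model, states, states)
grad_W = -0.5 * (pos - neg)
```

In practice Z is intractable and the negative phase must be estimated by (slow-mixing) Gibbs sampling, which is exactly the pain point the Born-machine slides return to.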


SLIDE 21

[Figure: layer-by-layer RBM pretraining of a deep autoencoder with layer sizes 2000, 1000, 500, 30]

Reducing the Dimensionality of Data with Neural Networks

G. E. Hinton and R. R. Salakhutdinov

High-dimensional data can be converted to low-dimensional codes by training a multilayer neural network with a small central layer to reconstruct high-dimensional input vectors. Gradient descent can be used for fine-tuning the weights in such “autoencoder” networks, but this works well only if the initial weights are close to a good solution. We describe an effective way of initializing the weights that allows deep autoencoder networks to learn low-dimensional codes that work much better than principal components analysis as a tool to reduce the dimensionality of data.

Dimensionality reduction facilitates the classification, visualization, communication, and storage of high-dimensional data. A simple and widely used method is principal components analysis (PCA), which finds the directions of greatest variance in the data set and represents each data point by its coordinates along each of these directions. We describe a nonlinear generalization of PCA that uses an adaptive, multilayer “encoder” network…

Science 313 (2006), www.sciencemag.org

Renaissance of deep learning

Wavefunction ansatz, quantum state tomography, quantum error correction, Monte Carlo updates, renormalization group, … see next talk!

Feedback to physics

SLIDE 22

State-of-the-Art: Autoregressive Models

p(x) = ∏_i p(x_i | x_{<i})

1601.06759, 1606.05328

[Figures: PixelCNN on image data; WaveNet on speech data (1609.03499, 1711.10433)]

SLIDE 23

State-of-the-Art: Autoregressive Models

p(x) = ∏_i p(x_i | x_{<i})

1601.06759, 1606.05328

[Figures: PixelCNN on image data; WaveNet on speech data (1609.03499, 1711.10433)]

p(x) = p(x_1) p(x_2 | x_1) p(x_3 | x_1, x_2) ⋯
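The chain-rule factorization implies a simple ancestral sampling loop: draw x_1, then x_2 given x_1, and so on, with each conditional normalized by construction. A toy sketch where a hand-made logistic rule stands in for the neural network (PixelCNN, WaveNet) that would normally produce the conditionals:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
n = 3  # three binary variables

def p_one(prev):
    """Toy conditional p(x_i = 1 | x_{<i}); purely illustrative."""
    return 1.0 / (1.0 + np.exp(-(0.5 * sum(prev) - 0.2)))

def log_prob(x):
    """ln p(x) = sum_i ln p(x_i | x_{<i}) -- the chain-rule factorization."""
    logp = 0.0
    for i in range(n):
        q = p_one(x[:i])
        logp += np.log(q if x[i] == 1 else 1.0 - q)
    return logp

def sample():
    """Ancestral sampling: draw x_1, then x_2 | x_1, and so on."""
    x = []
    for _ in range(n):
        x.append(int(rng.random() < p_one(x)))
    return x

# Each conditional is normalized, so the joint sums to one automatically.
total = sum(np.exp(log_prob(list(s))) for s in product([0, 1], repeat=n))
```

The same structure, with the conditionals computed from tensor contractions instead of a network, reappears in the "zipper sampling" slides below.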

SLIDE 24

WaveNet in the Real World

https://deepmind.com/blog/wavenet-generative-model-raw-audio/ https://deepmind.com/blog/high-fidelity-speech-synthesis-wavenet/ https://deepmind.com/blog/wavenet-launches-google-assistant/

2018 Google I/O

SLIDE 25

Born Machines

quantum physics: p(x) = |Ψ(x)|² / Z

SLIDE 26

Born Machines

quantum physics: Hilbert space, area-law entanglement, p(x) = |Ψ(x)|² / Z
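Concretely, an MPS Born machine assigns each configuration an amplitude through a product of matrices and squares it, per the Born rule above. A minimal sketch with random tensors (the sizes are arbitrary assumptions for illustration):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(2)
n, d, D = 4, 2, 3   # sites, physical dimension, bond dimension
# Open-boundary MPS: first/last tensors have bond dimension 1.
mps = [rng.normal(size=(1 if i == 0 else D, d, 1 if i == n - 1 else D))
       for i in range(n)]

def amplitude(x):
    """Psi(x) = product of the matrices selected by configuration x."""
    m = mps[0][:, x[0], :]
    for i in range(1, n):
        m = m @ mps[i][:, x[i], :]
    return m[0, 0]

# Normalization by brute force (exponential; fine for this tiny example).
Z = sum(amplitude(x) ** 2 for x in product(range(d), repeat=n))

def prob(x):
    return amplitude(x) ** 2 / Z    # Born rule: p(x) = |Psi(x)|^2 / Z

total = sum(prob(x) for x in product(range(d), repeat=n))
```

Squaring guarantees non-negativity without constraining the tensors themselves, which is what separates a Born machine from a positive MPS / graphical model.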

SLIDE 27

Quantum inspired generative modeling

Ψ

Han, Wang, Fan, LW, Zhang, 1709.01662, PRX 18’

SLIDE 28

Learn

Quantum inspired generative modeling

Ψ

Han, Wang, Fan, LW, Zhang, 1709.01662, PRX 18’

SLIDE 29

Learn Generate

Quantum inspired generative modeling

Ψ

Han, Wang, Fan, LW, Zhang, 1709.01662, PRX 18’

SLIDE 30

“Teach a quantum state to write digits”

Learn Generate

Quantum inspired generative modeling

Ψ

Han, Wang, Fan, LW, Zhang, 1709.01662, PRX 18’

SLIDE 31

Generative modeling using Tensor Network States

Stoudenmire, Schwab NIPS 2016 Stoudenmire Q. Sci. Tech. 2018 Liu et al 1710.04833 Liu et al 1803.09111 Glasser et al 1806.05964 Hallam et al 1711.03357 Gallego, Orus 1708.01525 Pestun et al 1711.01416…

Tensor Network Machine Learning

Cichocki et al 1604.05271,1609.00893,1708.09165… Novikov et al 1509.06569 Kossaifi et al 1707.08308

SLIDE 32

[Figure: loss ℒ and model size ln(|T|) versus maximum bond dimension D_max]

The ultimate goal is “learn to forget”

ℒ = −(1/|𝒟|) ∑_{x∈𝒟} ln p(x)

Lower bound ln(|풟|)

Name of the game

Bond dimension

|Ψ|² / Z

SLIDE 33

[Figure: loss ℒ and model size ln(|T|) versus maximum bond dimension D_max]

The ultimate goal is “learn to forget”

ℒ = −(1/|𝒟|) ∑_{x∈𝒟} ln p(x)

Lower bound ln(|풟|)

Name of the game

[Figure: test and training loss versus number of training loops]

Bond dimension

|Ψ|² / Z


SLIDE 35

Nice features of an MPS Born Machine

Expressibility Learning Inference Sampling

Glasser, Clark, Deng, Gao, Chen, Huang… 17’

p(x) = |Ψ|² / Z

SLIDE 36

Tractable Likelihood

Efficient & unbiased learning compared to models with intractable partition functions

Z = [closed tensor-network diagram: the MPS contracted with itself]

tractable via efficient tensor contraction

*applies to TTN and MERA as well

∂Z/∂(tensor) = 2 × [the same network with that tensor removed, i.e. its environment]
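The tractability claim can be checked numerically: contracting the ket layer with the bra layer site by site ("transfer matrices") gives Z in time linear in the number of sites, matching the exponential-cost brute-force sum. A sketch with a random MPS (sizes are arbitrary):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)
n, d, D = 5, 2, 3
mps = [rng.normal(size=(1 if i == 0 else D, d, 1 if i == n - 1 else D))
       for i in range(n)]

def amplitude(x):
    m = mps[0][:, x[0], :]
    for i in range(1, n):
        m = m @ mps[i][:, x[i], :]
    return m[0, 0]

# Brute force: exponential in n.
Z_brute = sum(amplitude(x) ** 2 for x in product(range(d), repeat=n))

# Efficient: contract ket with bra one site at a time, linear in n.
env = np.ones((1, 1))                       # (ket bond, bra bond)
for A in mps:
    env = np.einsum('ab,aic,bid->cd', env, A, A)   # absorb one double-layer site
Z_contract = env[0, 0]
```

This is the contraction the slide draws diagrammatically; the environment left over when one tensor is removed is likewise what ∂Z/∂(tensor) denotes.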

SLIDE 37

Adaptive Learning

Adaptively grows the bond dimensions, thus dynamically tuning the expressibility. [Figure: bond dimensions along the chain for different training images]

*applies to two-site optimization
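The two-site step behind this adaptivity can be sketched as: merge neighboring tensors, act on the merged block (that is where a gradient update would go), then split it back by SVD; the number of retained singular values is the new, adaptively chosen bond dimension. A minimal sketch with arbitrary tensor sizes and cutoff:

```python
import numpy as np

rng = np.random.default_rng(4)
d, Dl, Dr = 2, 3, 3
A = rng.normal(size=(Dl, d, 4))       # left tensor, current shared bond 4
B = rng.normal(size=(4, d, Dr))       # right tensor

# Merge the two sites into one order-4 tensor.
theta = np.einsum('aib,bjc->aijc', A, B)

# Split back by SVD; truncating small singular values adapts the bond.
m = theta.reshape(Dl * d, d * Dr)
U, S, Vt = np.linalg.svd(m, full_matrices=False)
cutoff = 1e-12
k = int((S > cutoff * S[0]).sum())    # new adaptive bond dimension
A_new = U[:, :k].reshape(Dl, d, k)
B_new = (np.diag(S[:k]) @ Vt[:k]).reshape(k, d, Dr)

# With no effective truncation the merged tensor is reproduced exactly.
theta_rec = np.einsum('aib,bjc->aijc', A_new, B_new)
err = np.abs(theta - theta_rec).max()
```

Raising or lowering the cutoff is what lets training "learn to forget": bonds grow where the data demand correlations and shrink where they do not.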
SLIDE 38

Direct Generation

p(x_{<i}) = [tensor-network contraction with the first i−1 variables fixed, the rest summed over]

No thermalization issue, in contrast to the slow-mixing Gibbs sampling of Boltzmann machines

Ferris & Vidal 2012

p(x) = ∏_i p(x_i | x_{<i}) = ∏_i p(x_{<i+1}) / p(x_{<i})

*applies to TTN and MERA as well

SLIDE 39

Direct Generation

p(x_{<i+1}) = [tensor-network contraction with the first i variables fixed, the rest summed over]

No thermalization issue, in contrast to the slow-mixing Gibbs sampling of Boltzmann machines

Ferris & Vidal 2012

p(x) = ∏_i p(x_i | x_{<i}) = ∏_i p(x_{<i+1}) / p(x_{<i})

*applies to TTN and MERA as well

SLIDE 40

Direct Generation

p(x_{<i+2}) = [tensor-network contraction with the first i+1 variables fixed, the rest summed over]

No thermalization issue, in contrast to the slow-mixing Gibbs sampling of Boltzmann machines

Ferris & Vidal 2012

p(x) = ∏_i p(x_i | x_{<i}) = ∏_i p(x_{<i+1}) / p(x_{<i})

*applies to TTN and MERA as well

SLIDE 41

Direct Generation

p(x_{<i+2}) = [tensor-network contraction with the first i+1 variables fixed, the rest summed over]

“Zipper sampling”: no thermalization issue, in contrast to the slow-mixing Gibbs sampling of Boltzmann machines

Ferris & Vidal 2012

p(x) = ∏_i p(x_i | x_{<i}) = ∏_i p(x_{<i+1}) / p(x_{<i})

*applies to TTN and MERA as well
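The zipper sampler can be sketched end to end: precompute right environments of the double-layer network once, then sweep left to right, at each site computing the unnormalized marginals p(x_{<i}, x_i = s) and drawing x_i from their ratio. A sketch with a random MPS (sizes are illustrative); as a check, the product of sampled conditionals equals p(x) = |Ψ(x)|²/Z:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(5)
n, d, D = 4, 2, 3
mps = [rng.normal(size=(1 if i == 0 else D, d, 1 if i == n - 1 else D))
       for i in range(n)]

def amplitude(x):
    m = mps[0][:, x[0], :]
    for i in range(1, n):
        m = m @ mps[i][:, x[i], :]
    return m[0, 0]

# Right environments of the double layer: R[i] sums over sites i..n-1.
R = [None] * (n + 1)
R[n] = np.ones((1, 1))
for i in range(n - 1, -1, -1):
    A = mps[i]
    R[i] = np.einsum('aic,bid,cd->ab', A, A, R[i + 1])
Z = R[0][0, 0]

def prob(x):
    return amplitude(x) ** 2 / Z

def zipper_sample(rng):
    """Draw x_i ~ p(x_i | x_{<i}) = p(x_{<i+1}) / p(x_{<i}), site by site."""
    left = np.ones((1, 1))            # left environment with x_{<i} fixed
    x, p_chain = [], 1.0
    for i in range(n):
        A = mps[i]
        w = np.array([np.einsum('ab,ac,bd,cd->',
                                left, A[:, s, :], A[:, s, :], R[i + 1])
                      for s in range(d)])   # unnormalized p(x_{<i}, x_i = s)
        cond = w / w.sum()                  # p(x_i = s | x_{<i})
        s = int(rng.choice(d, p=cond))
        p_chain *= cond[s]
        left = np.einsum('ab,ac,bd->cd', left, A[:, s, :], A[:, s, :])
        x.append(s)
    return x, p_chain

x, p_chain = zipper_sample(np.random.default_rng(7))
```

Every draw is an exact sample from p(x); there is no Markov chain and hence nothing to thermalize.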

SLIDE 42

Image Restoration

Han, Wang, Fan, LW, Zhang, 1709.01662, PRX 18’

SLIDE 43

Arbitrary generation order, in contrast to autoregressive models such as PixelCNN

Image Restoration

Han, Wang, Fan, LW, Zhang, 1709.01662, PRX 18’

SLIDE 44

?

Quantum Perspective on Deep Learning

SLIDE 45


Quantum Perspective on Deep Learning

Q: How to quantify our inductive biases ? A: Information pattern of probability distributions



SLIDE 48

[Diagrams (a) and (b): factorizations of image probabilities, p(·) × p(·), illustrating different information patterns]

Quantum Perspective on Deep Learning

SLIDE 49

Classical mutual information Quantum Renyi entanglement entropy

The striking similarity suggests a common inductive bias

Cheng, Chen, LW, 1712.04144, Entropy 18’

+Quantitative & interpretable approaches +Principled structure design & learning

Quantum Perspective on Deep Learning

I = −⟨ ln ⟨ p(x, y′) p(x′, y) / (p(x′, y′) p(x, y)) ⟩_{x′,y′} ⟩_{x,y}

S = −ln ⟨ ⟨ Ψ(x, y′) Ψ(x′, y) / (Ψ(x′, y′) Ψ(x, y)) ⟩_{x′,y′} ⟩_{x,y}
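Both estimators can be checked by brute-force enumeration on a toy two-by-two example (the code and names below are illustrative, not from the talk): the inner average reduces to p(x)p(y)/p(x,y) in the classical case, so the outer average gives the mutual information, and the double average gives Tr ρ_A², the Rényi-2 entropy.

```python
import numpy as np

rng = np.random.default_rng(0)
Psi = rng.normal(size=(2, 2))
Psi /= np.linalg.norm(Psi)          # real normalized wave function Psi(x, y)
p = Psi ** 2                        # Born probabilities p(x, y)

def inner(x, y, f):
    """<f(x,y') f(x',y) / (f(x',y') f(x,y))> with (x', y') ~ p."""
    return sum(p[a, b] * f[x, b] * f[a, y] / (f[a, b] * f[x, y])
               for a in range(2) for b in range(2))

# Classical: I = -< ln inner(p) >_{(x,y)~p} equals the mutual information
I2 = -sum(p[x, y] * np.log(inner(x, y, p)) for x in range(2) for y in range(2))
px, py = p.sum(1), p.sum(0)
I_direct = sum(p[x, y] * np.log(p[x, y] / (px[x] * py[y]))
               for x in range(2) for y in range(2))

# Quantum: S = -ln << inner(Psi) >> equals the Renyi-2 entropy -ln Tr rho_A^2
S2 = -np.log(sum(p[x, y] * inner(x, y, Psi) for x in range(2) for y in range(2)))
rhoA = Psi @ Psi.T                  # reduced density matrix of the x subsystem
S2_direct = -np.log(np.trace(rhoA @ rhoA))
```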

slide-50
SLIDE 50 Published as a conference paper at ICLR 2018

DEEP LEARNING AND QUANTUM ENTANGLEMENT: FUNDAMENTAL CONNECTIONS WITH IMPLICATIONS

TO NETWORK DESIGN

Yoav Levine, David Yakira, Nadav Cohen & Amnon Shashua The Hebrew University of Jerusalem {yoavlevine,davidyakira,cohennadav,shashua}@cs.huji.ac.il

ABSTRACT

Formal understanding of the inductive bias behind deep convolutional networks, i.e. the relation between the network’s architectural features and the functions it is able to model, is limited. In this work, we establish a fundamental connection between the fields of quantum physics and deep learning, and use it for obtaining novel theoretical observations regarding the inductive bias of convolutional networks. Specifically, we show a structural equivalence between the function realized by a convolutional arithmetic circuit (ConvAC) and a quantum many-body wave function, which facilitates the use of quantum entanglement measures as quantifiers of a deep network’s expressive ability to model correlations. Furthermore, the construction of a deep ConvAC in terms of a quantum Tensor Network is enabled. This allows us to perform a graph-theoretic analysis of a convolutional network, tying its expressiveness to a min-cut in its underlying graph. We demonstrate a practical outcome in the form of a direct control over the inductive bias via the number of channels (width) of each layer. We empirically validate our findings on standard convolutional networks which involve ReLU activations and max pooling. The description of a deep convolutional network in well-defined graph-theoretic tools and the structural connection to quantum entanglement, are two interdisciplinary bridges that are brought forth by this work.

1 INTRODUCTION


slide-52
SLIDE 52

Mean Field Theory

Physicists’ gifts to Machine Learning

U

Quantum Computing Tensor Networks Monte Carlo Methods

Ψ

slide-53
SLIDE 53

Quantum Circuit Born Machine

Train the quantum circuit as a probabilistic generative model. Quantum sampling complexity underlies “quantum supremacy”

With Liu, Zeng, Wu, Hu 1804.04168, 1808.03425

slide-54
SLIDE 54

Qubit efficient scheme

Product state Measured qubits

Huggins, Patel, Whaley, Stoudenmire, 1803.11537

see also Cramer et al, Nat. Comm. 2010

Tensor network inspired quantum circuit architecture

slide-58
SLIDE 58

Qubit efficient scheme

Huggins, Patel, Whaley, Stoudenmire, 1803.11537

Product state Measured qubits

D = 2^3

see also Cramer et al, Nat. Comm. 2010

Reuse!

MPS with exponentially large bond dimensions

slide-59
SLIDE 59

Well, back to Boltzmann Machines

slide-60
SLIDE 60

Quantitative Performance

Binarized MNIST test NLL (lower is better*):
  • DBM: 81.3
  • RBM: ~84.6
  • PixelCNN: ~86.3

[RBM architecture: visible units x1 … xN, hidden units h1 … hM]

*Lower test NLL may not imply better sample quality, Theis et al, 1511.01844

slide-61
SLIDE 61

Quantitative Performance


MPS (D=100): 101.5

Han et al, PRX 18’

slide-62
SLIDE 62

Quantitative Performance


TTN (D=50): 94.3

With Song Cheng & Pan Zhang, unpublished


slide-63
SLIDE 63

“Positive” PEPS

Probability model: p(x) = [PEPS contraction] / Z = Σ_σ [tensor diagrams omitted]

  • Smaller truncation error
  • No need for double layer tensor

Why positive? It is a hidden Markov random field model:

p(x) = Σ_h ∏_i f_i(x_i, h) / Z

slide-64
SLIDE 64

“Positive” PEPS

Loss function:

ℒ = −⟨ln p̃(x)⟩_{x∼𝒟} + ln Z

Contract the unnormalized p̃(x) for each given sample; contract Z once.

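As a toy sketch of this model and its loss (hypothetical small sizes, random nonnegative factors, and brute-force enumeration standing in for the PEPS contraction):

```python
import numpy as np
import itertools

rng = np.random.default_rng(0)
n, H = 4, 3                            # 4 binary visibles, hidden variable h in {0,1,2}
f = rng.random(size=(n, 2, H))         # nonnegative factors f_i(x_i, h)

def unnorm(x):
    """p~(x) = sum_h prod_i f_i(x_i, h); nonnegative by construction."""
    return float(np.prod([f[i, xi, :] for i, xi in enumerate(x)], axis=0).sum())

# "Contract once": the normalization Z (one big contraction in the PEPS case)
Z = sum(unnorm(x) for x in itertools.product(range(2), repeat=n))

# "Contract for each given sample": the first term of the loss
data = [tuple(rng.integers(0, 2, size=n)) for _ in range(5)]
loss = -np.mean([np.log(unnorm(x)) for x in data]) + np.log(Z)
```

The same two-term structure is what makes a 28×28 PEPS expensive: one contraction per training sample plus one for Z.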

slide-65
SLIDE 65

“Positive” PEPS

Contract a 28×28 PEPS 60000+1 times per epoch!


slide-66
SLIDE 66

Quantitative Performance

Binarized MNIST test NLL (lower is better):
  • DBM: 81.3
  • RBM: ~84.6
  • PixelCNN: ~86.3
  • TTN (D=50): 94.3
  • MPS (D=100): 101.5


slide-67
SLIDE 67

Quantitative Performance

Binarized MNIST test NLL (lower is better):
  • DBM: 81.3
  • RBM: ~84.6
  • PixelCNN: ~86.3
  • PEPS+ (D=5): 92
  • TTN (D=50): 94.3
  • MPS (D=100): 101.5


With Song Cheng & Pan Zhang, unpublished

slide-70
SLIDE 70

Bonus: What does deep learning offer to tensor network states & algorithms?

Differentiable tensor libraries with GPU/TPU/FPGA/… support

slide-71
SLIDE 71

ℒ Loss

∂ℒ/∂x^ℓ = (∂ℒ/∂x^{ℓ+1}) (∂x^{ℓ+1}/∂x^ℓ)

… …

The engine of deep learning: Back-Propagation algorithm

Vector-Jacobian Product

Computes gradients efficiently and accurately via reverse-mode automatic differentiation

[Computation graph: x^ℓ → f^ℓ → x^{ℓ+1} → f^{ℓ+1} → x^{ℓ+2}]

computation graph
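The chain rule above can be hand-rolled for a two-layer toy network (hypothetical layers: a linear map and an elementwise tanh); each backward step is one vector-Jacobian product, and the result can be checked against finite differences:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 3))

def forward(x):
    x1 = W @ x                 # layer l:   linear
    x2 = np.tanh(x1)           # layer l+1: elementwise tanh
    return x1, x2, x2.sum()    # scalar loss L

def backward(x1, x2):
    v = np.ones_like(x2)       # dL/dx2
    v = v * (1.0 - x2 ** 2)    # VJP through tanh          -> dL/dx1
    v = W.T @ v                # VJP through the linear map -> dL/dx
    return v

x0 = rng.normal(size=3)
x1, x2, L = forward(x0)
grad = backward(x1, x2)        # exact gradient in one backward sweep
```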

slide-72
SLIDE 72 https://medium.com/@karpathy/software-2-0-a64152b37c35
  • Computationally homogeneous

Benefits

  • Simple to bake into silicon
  • Constant running time
  • Constant memory usage
  • Highly portable & agile
  • Modules can meld into an optimal whole
  • Better than humans

Andrej Karpathy

Director of AI at Tesla. Previously Research Scientist at OpenAI and PhD student at Stanford. I like to train deep neural nets on large datasets.

Differentiable Programming

Writing software 2.0 by gradient search in the program space

slide-73
SLIDE 73

Differentiable Scientific Programming

  • Most linear algebra operations (Eigen, SVD!) are differentiable
  • Loop/Condition/Sort/Permutations are also differentiable
  • Differentiable ray tracer
  • Differentiable Monte Carlo / Tensor Network / Functional RG / Dynamical Mean Field Theory / Density Functional Theory …

  • ODE integrators are differentiable with O(1) memory

Differentiable fluid simulations and

Differentiable programming is more than training neural networks
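For instance, the gradient of the sum of singular values (the nuclear norm) through an SVD has the closed form U Vᵀ, assuming distinct nonzero singular values. A quick NumPy check against finite differences (illustrative example, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 3))

def nuclear_norm(M):
    # sum of singular values of M
    return np.linalg.svd(M, compute_uv=False).sum()

U, _, Vt = np.linalg.svd(A, full_matrices=False)
grad = U @ Vt     # closed-form d(nuclear norm)/dA
```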

slide-74
SLIDE 74

Differentiable Eigensolver

H Ψ = Ψ Λ,  H given

Forward mode: what happens if H → H + dH? Perturbation theory.
Reverse mode: how should I change H, given ∂ℒ/∂Ψ and ∂ℒ/∂Λ? Inverse perturbation theory.

Hamiltonian engineering via differentiable programming

https://github.com/wangleiphy/DL4CSRC/tree/master/2-ising
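The forward-mode statement is just first-order perturbation theory, dΛ_k = ψ_kᵀ dH ψ_k. A NumPy check on a toy symmetric H (illustrative sketch, not the repository code; H is built with unit spectral gaps so the first-order formula is well-conditioned):

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))
H = Q @ np.diag([0.0, 1.0, 2.0, 3.0]) @ Q.T         # symmetric H, unit gaps
dH = rng.normal(size=(4, 4)); dH = (dH + dH.T) / 2  # perturbation direction

lam, psi = np.linalg.eigh(H)                        # H Psi = Psi Lambda
dlam = np.einsum('ik,ij,jk->k', psi, dH, psi)       # dLambda_k = psi_k^T dH psi_k
```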

slide-75
SLIDE 75

Differentiable Levin-Nave TRG

Computation graph: β → … → ln Z

https://github.com/wangleiphy/TRG

SVD w/ truncation contraction

Automatic differentiation for high order gradients

  • Exact gradient, not finite-difference
  • No additional code
  • Effort comparable to the forward pass

slide-76
SLIDE 76

Z = Σ_{σ} exp( (1/2) Σ_{ij} K_ij σ_i σ_j )

Contraction ln Z

Differentiable spin glass solver

random couplings Kij

Wang, Qin, Zhou, PRB 2014 Rams et al, 1811.06518 TNS studies of Ising spin glasses:

slide-77
SLIDE 77


d ln Z / dK_ij = ⟨σ_i σ_j⟩

Differentiation yields the correlations.
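The identity d ln Z / dK_ij = ⟨σ_i σ_j⟩ can be verified by brute-force enumeration on a few spins (a toy check, with exhaustive summation standing in for the tensor-network contraction):

```python
import numpy as np
import itertools

rng = np.random.default_rng(0)
n = 4
K = rng.normal(size=(n, n)); K = (K + K.T) / 2
np.fill_diagonal(K, 0)                      # random symmetric couplings K_ij

spins = np.array(list(itertools.product([-1, 1], repeat=n)))   # all 2^n configs

def lnZ(K):
    # E(s) = (1/2) sum_ij K_ij s_i s_j, summed over every configuration
    E = 0.5 * np.einsum('ci,ij,cj->c', spins, K, spins)
    return np.log(np.exp(E).sum())

w = np.exp(0.5 * np.einsum('ci,ij,cj->c', spins, K, spins))
corr = np.einsum('c,ci,cj->ij', w, spins, spins) / w.sum()     # <sigma_i sigma_j>
```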

slide-78
SLIDE 78


Tensor network solver for inverse Ising problems via gradient descent

slide-79
SLIDE 79

Gradient based variational optimization

Humans only care about the tensor contraction; differentiable programming takes care of the gradients

Vanderstraeten et al, PRB 16’

ℒ = ln⟨Ψ| ̂ H|Ψ⟩ − ln ⟨Ψ|Ψ⟩

ℒ = ⟨E(x)⟩x∼풟 + ln Z

Gradient optimization works fine for generative modeling. Why not for the variational ground state?

grad = [sum of tensor-network diagrams]
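As a toy illustration of gradient-based variational optimization (dense vectors instead of tensor networks, and minimizing the Rayleigh quotient ⟨Ψ|H|Ψ⟩/⟨Ψ|Ψ⟩, which has the same minimizer as the logarithmic loss on the slide; the Hamiltonian below is a hypothetical 6×6 matrix with known ground energy −1):

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(6, 6)))
H = Q @ np.diag([-1.0, 0.3, 0.8, 1.5, 2.0, 2.5]) @ Q.T  # toy Hamiltonian, E0 = -1

psi = rng.normal(size=6)                 # variational "state"
for _ in range(500):
    psi /= np.linalg.norm(psi)
    E = psi @ H @ psi                    # variational energy <Psi|H|Psi>
    grad = 2.0 * (H @ psi - E * psi)     # gradient of the Rayleigh quotient
    psi -= 0.1 * grad
E = (psi @ H @ psi) / (psi @ psi)        # converges toward the ground energy
```

Replacing the dense H Ψ product by a tensor-network contraction gives the scheme on the slide, with automatic differentiation supplying `grad`.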
slide-80
SLIDE 80

Pan Zhang Song Cheng Jing Chen Tao Xiang Zhiyuan Xie Jin-Guo Liu

Thank You!