Dualities in and from Machine Learning (Sven Krippendorf)


SLIDE 1

Dualities in and from Machine Learning

Sven Krippendorf
 Deep Learning and Physics 2019
 Yukawa Institute for Theoretical Physics, Kyoto October 31st 2019

  • 1
SLIDE 2

Current ML applications
 in high energy

2


SLIDE 3

Improving sensitivity

  • ML techniques are heavily used in deriving experimental bounds.
  • Brief example: improving sensitivity to ultra-light axion-like particles, compared to previous bounds.
  • ML algorithms are good at classification, and detecting particles is a classification problem. Our classifiers:
 
 
 


  • Training: simulate data with and without axions for appropriate X-ray sources

  • Bounds: Compare fake & real data performance
  • Algorithms (sklearn): decision trees, boosted decision trees, random forests, Gaussian Naive Bayes, Gaussian process classifier, SVM, …

Day, SK 1907.07642

[Diagram: spectrum → classifier → 0 (no axions) / 1 (axions)]

Previous bounds:
NGC1275: 1605.01043, other sources: 1704.05256, Athena bounds: 1707.00176; with: Conlon, Day, Jennings; Berg, Muia, Powell, Rummel

[Sketch: photons from an AGN pass through the cluster magnetic field B (axion conversion) and galactic absorption before reaching the X-ray telescope]

$|\gamma(E)\rangle \to \alpha\,|\gamma(E)\rangle + \beta\,|a(E)\rangle$

3

SLIDE 4

Constraining ALPs

  • Photon-axion interconversion in background magnetic fields:

  • One interesting parameter region can be obtained for

photons from sources in and behind galaxy cluster magnetic fields.


4

$\mathcal{L} \supset -\frac{g_{a\gamma\gamma}}{4}\, a F\tilde{F} = g_{a\gamma\gamma}\, a\, \mathbf{E}\cdot\mathbf{B}$

[Plot: conversion probability $P_{\gamma\to a}$ vs $\omega$ [keV]: too rapid oscillations and suppressed conversion at low $\omega$, a sweet spot at X-ray energies, no oscillations at high $\omega$]

NGC1275: 1605.01043, other sources: 1704.05256, Athena bounds: 1707.00176; with: Conlon, Day, Jennings, Rummel; Berg, Muia, Powell

$P_{\gamma\to a} = \frac{1}{2}\,\frac{\Theta^2}{1+\Theta^2}\,\sin^2\!\left(\Delta\sqrt{1+\Theta^2}\right)$

$\Theta = 0.28 \left(\frac{B_\perp}{1\,\mu\mathrm{G}}\right) \left(\frac{\omega}{1\,\mathrm{keV}}\right) \left(\frac{10^{-3}\,\mathrm{cm}^{-3}}{n_e}\right) \left(\frac{10^{11}\,\mathrm{GeV}}{M}\right)$

$\Delta = 0.54 \left(\frac{n_e}{10^{-3}\,\mathrm{cm}^{-3}}\right) \left(\frac{L}{10\,\mathrm{kpc}}\right) \left(\frac{1\,\mathrm{keV}}{\omega}\right)$

SLIDE 5

Improving sensitivity

  • Data: Chandra X-ray observations of bright point sources (AGN, quasars) in or behind galaxy clusters

  • Bounds for ALPs with $m < 10^{-12}\,$eV, due to the absence of characteristic spectral modulations caused by interconversion between photons and axions in the cluster background magnetic field

[Plot: axion coupling $|g_{a\gamma\gamma}|$ (GeV$^{-1}$) vs axion mass $m_a$ (eV), with exclusions from LSW (OSQAR), helioscopes (CAST), haloscopes (ADMX), telescopes, horizontal branch stars, KSVZ/DFSZ lines, VMB (PVLAS), SN 1987A, HESS, NGC1275 (Chandra/Athena), Fermi-LAT]

[Table: bounds on $|g_{a\gamma\gamma}|$ in units of $10^{-12}$ GeV$^{-1}$ per source (A1367, A1795 Quasar, A1795 Sy1; residual and up-residual spectra) and per classifier (DTC, GaussianNB, QDA, RFC), against previous bounds; e.g. A1795 Sy1 (resid.): 1.0, 0.8, 1.2, 1.1, 0.7 vs previous 1.5]

5

SLIDE 6

ML for the string landscape?

See the many talks by Halverson, Ruehle, Shiu.

6

SLIDE 7

Other avenues?

7

SLIDE 8

“Don’t ask what ML can do for you, ask what you can do for ML.”

– Gary Shiu

8

SLIDE 9

Physics ⋂ ML

9

SLIDE 10

Dualities

Betzler, SK: 191x.xxxxx

10

SLIDE 11

The problem

  • Obtain correlators at high(er) accuracy:


  • In physics (incl. condensed matter, holography, string/field

theory), we often use clever data representations to evaluate correlators.

  • Multiple representations can be useful ➔ dualities
  • Dual representations are very good representations for evaluating certain correlation functions (mapping strongly coupled data products to weakly coupled data products).

  • Examples …

$\langle f(\phi_i)\rangle$

think of this as properties of your data

11

SLIDE 12

How are dualities useful in practice?


aka connecting physics questions to data questions

12

SLIDE 13

Examples: dual representations

  • Discrete Fourier transform:

$p_k = \sum_{j=1}^{n} x_j\, e^{-2\pi i jk/n}, \qquad x_k = \frac{1}{n} \sum_{j=1}^{n} p_j\, e^{2\pi i jk/n}$

  • Is there a signal under the noise?

13
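The "signal under the noise" question can be made concrete in a few lines of NumPy. This is a minimal sketch, not from the talk: the signal amplitude, frequency, and the peak-over-mean statistic are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

def sample(with_signal):
    """One time series: unit-variance Gaussian noise, optionally plus a weak sinusoid."""
    x = rng.normal(0.0, 1.0, n)
    if with_signal:
        x += 0.3 * np.sin(2 * np.pi * 50 * np.arange(n) / n)
    return x

def peak_statistic(x):
    """Largest Fourier amplitude relative to the mean amplitude."""
    p = np.abs(np.fft.fft(x))[1 : n // 2]
    return p.max() / p.mean()

noise_stats = [peak_statistic(sample(False)) for _ in range(50)]
signal_stats = [peak_statistic(sample(True)) for _ in range(50)]
# In momentum space the two classes separate cleanly; in position space the
# amplitude-0.3 signal is invisible against unit-variance noise.
print(max(noise_stats), min(signal_stats))
```

The dual (momentum-space) representation concentrates the signal into one bin, so a trivial threshold separates the classes.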

SLIDE 14

Examples: dual representations

  • 2D Ising model: high-/low-temperature self-duality

Original: $H = -J \sum_{\langle i,j\rangle} s_i s_j$, $\quad Z = \sum_s e^{-\beta H(s)}$, $\quad \beta = \frac{1}{k_B T}$

Dual: $H = -J \sum_{\langle i,j\rangle} \sigma_i \sigma_j$, $\quad Z = \sum_\sigma e^{-\tilde\beta H(\sigma)}$, $\quad \tilde\beta = -\frac{1}{2} \log\tanh\beta$

Ordered rep. ↔ Disordered rep. (across $T_\text{critical}$)

Position space? Momentum space?

Kramers, Wannier 1941; Onsager 1944; review: Savit 1980

14
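The duality map can be checked numerically in plain Python; the fixed point of the map is the known critical coupling $\beta_c = \frac{1}{2}\log(1+\sqrt{2})$.

```python
import math

def dual_beta(beta):
    # Kramers-Wannier map: beta_tilde = -(1/2) log tanh(beta)
    return -0.5 * math.log(math.tanh(beta))

beta_c = 0.5 * math.log(1.0 + math.sqrt(2.0))  # self-dual (critical) point

# The map is an involution, and beta_c is its fixed point:
print(dual_beta(beta_c) - beta_c)       # ~0
print(dual_beta(dual_beta(0.3)) - 0.3)  # ~0
```

Because the map exchanges low and high temperatures, requiring it to be self-dual pins down the critical temperature, which is what makes it so useful as a data transformation near criticality.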

SLIDE 15

Which data problem?

  • Some correlation functions are easier to evaluate in dual variables:

$\langle \sigma_i \sigma_j \rangle,\ \langle E(\sigma)\rangle,\ \langle M(\sigma)\rangle$

  • Can we classify the temperature of low-temperature configurations, i.e. which temperature is a sample drawn from (at low temperatures)?

They look rather similar. How about in the dual rep.?

15

SLIDE 16

Data question on Ising

  • But at the dual temperatures, our data takes a different shape:

$P(s) = \frac{e^{-E/T}}{Z}, \qquad P(\sigma) = \frac{e^{-\tilde E/\tilde T}}{Z}$

$\Delta T \ll \Delta \tilde T, \qquad \langle\Delta E\rangle \ll \langle\Delta \tilde E\rangle$

  • It is easier to classify the temperature of a low-temperature configuration in the dual representation …
  • How come?

16


SLIDE 17

Data question on Ising

  • Let’s look at the overlap of energy distributions in finite-size samples:

[Histograms: energy distributions, original variables vs dual variables]

Let’s check the performance.

17

SLIDE 18

Ising: simple network

  • Let’s confirm this with simple networks
  • Side remark: way outperforming standard sklearn classifiers

18


SLIDE 19

Example: dual representations

  • 1D Ising with multiple-spin interactions:

normal: $H(s) = -J \sum_{k=1}^{N-n+1} \prod_{l=0}^{n-1} s_{k+l} - B \sum_{k=1}^{N} s_k$ (here: $B = 0$)

dual: $H(\sigma) = -J \sum_{k=1}^{N-n+1} \sigma_k$ where $\sigma_k = \prod_{l=0}^{n-1} s_{k+l}$

  • Two data questions: 1) energy of a spin configuration, 2) metastable configuration

  • cf. 1605.05199

[Sketch: spin chain $s$ ($N = 10$, $n = 3$) mapped to dual variables $\sigma$, with ghost spins (fixed value) at the boundary]

19
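The dual map above can be written down directly. A NumPy sketch (the random configuration and couplings are illustrative, not from the slide):

```python
import numpy as np

rng = np.random.default_rng(1)
N, n, J = 10, 3, 1.0

s = rng.choice([-1, 1], size=N)  # original spin chain
# dual variables: product of n consecutive spins
sigma = np.array([s[k:k + n].prod() for k in range(N - n + 1)])

# With B = 0 both frames give the same energy, but in the dual frame
# the Hamiltonian is a sum over *independent* dual spins.
H_s = -J * sum(s[k:k + n].prod() for k in range(N - n + 1))
H_sigma = -J * sigma.sum()
print(H_s, H_sigma)
```

In the dual frame the energy is a linear function of the $\sigma_k$, which is exactly why the data questions below become easy there.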

SLIDE 20

Example: dual representations

  • Two data questions: 1) energy of a spin configuration, 2) metastable configuration

  • cf. 1605.05199

[Example ($N = 10$, $n = 3$): spin configurations $s$ with their dual $\sigma$, energies, and stable/metastable labels]

20

SLIDE 21
Example: dual representations

  • Performance on metastability classification (single hidden layer)
  • Deeper networks or CNNs perform better to a certain degree, but at large N or n they show the same feature.

Test accuracy, normal variables:

samples   n=4     n=5     n=8     n=9     n=12
6·10²     0.9113  0.8688  0.8788  0.8813  0.8803
3·10³     −       0.9243  0.9215  0.9223  0.9295
9.5·10³   −       −       0.9424  0.9475  0.9739

Test accuracy, dual variables:

samples   n=4     n=5     n=8     n=9     n=12
6·10²     0.9911  0.9783  0.9819  0.9855  0.9909
3·10³     −       0.9958  0.9977  0.9994  1.0000
9.5·10³   −       −       1.0000  1.0000  1.0000

21

SLIDE 22

Upshot: dual representations simplify answering certain data questions
(i.e. simple networks are sufficient)

22

SLIDE 23

Why interesting for ML?

23

SLIDE 24
  • Finding good representations which allow one to answer the data question is hard (if not impossible).
 
 
 
 
 
 
 
 


  • In this talk we use physics examples. A generalisation to other data products/questions would be interesting. Here, the data question can be addressed with very simple networks in the dual representation.

[Diagram: input data in the normal frame → neural network → dual representation → neural network → output: answer to the data question. The data question is inaccessible in the normal frame (✘) but accessible in the dual frame (✔)]

24

SLIDE 25

DFT: simple network

  • Supervised learning task (binary classification):

Samples $\{((x_R, x_I), y)\}$ in position space or $\{((p_R, p_I), y)\}$ in momentum space; $N$ discrete values; label $y = 0$: noise + signal, $y = 1$: noise.

  • Network:

Layer       Shape     Parameters
Conv1D      (2000,2)  4
Activation  (2000,2)
Dense       1         4001
Activation  1

  • For this network, classification works in momentum space, but not in position space.

25

SLIDE 26

Utilising dual representation

  • Goal: improve performance in position space.
  • Deeper network? Can do the job in principle [the DFT can be implemented with a single dense layer].
  • However, finding it dynamically is `impossible’ with standard optimisers, initialisations, and regularisers.

Layer       Shape     Parameters
Dense       (2000,2)  16000000
Conv1D      (2000,2)  4
Activation  (2000,2)
Dense       1         4001
Activation  1

[Plot: training from a random starting point does not recover the DFT]

26
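The bracketed remark can be made concrete: the DFT is one linear (dense) layer with a fixed complex weight matrix. A NumPy sketch ($n = 8$ for readability; the slide's network uses 2000 points):

```python
import numpy as np

n = 8
idx = np.arange(n)
# Dense-layer weight matrix whose action is exactly the DFT:
W = np.exp(-2j * np.pi * np.outer(idx, idx) / n)

x = np.random.default_rng(2).normal(size=n)
dense_out = W @ x  # one linear layer, no bias, no activation
print(np.allclose(dense_out, np.fft.fft(x)))  # True
```

So the dual representation is expressible by the architecture; the difficulty is that gradient descent does not find this particular weight matrix among the $n^2$ (here $16 \cdot 10^6$) real parameters.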

SLIDE 27

Can we improve the situation by favouring dual representations?

27

SLIDE 28

How to utilise dualities?

  • Learn dual transformations when explicitly known (trivial) and use them as an intermediate step in the architecture.
  • Enforce dual representations via feature separation.
  • When not explicitly known, we can match features of distributions (example: 2D Ising high-/low-temperature duality).
  • Re-obtain “dualities” (1D Ising) by demanding good performance on a medium-hard correlator in an intermediate layer where no information is lost (beyond known dualities).

28

SLIDE 29

DFT from modified loss

  • Adapt the loss function to achieve feature separation, i.e. separating the two classes of data (inspired by triplet loss) [towards generating dual representations dynamically]

  • Note: a different data question (signal injected in position and momentum space) can lead to multiple minima in the loss landscape, i.e. using momentum space or position space.

29

1503.03832

Separation (assist finding the DFT with a modified loss):

$\text{Loss} = |y_\text{noise}|^2 - |y_\text{signal}|^2 + \alpha$

Decorrelation via weight regularisation:

$\text{Loss} = \max\left\{0,\ \beta - \sum |w|^2\right\} + \sum_{i \neq j} \max\{0,\ w_i \cdot w_j\}$
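A minimal NumPy sketch of these two loss terms. The exact placement of the hinges and the default $\alpha$, $\beta$ are assumptions read off the flattened slide formulas, not a definitive implementation:

```python
import numpy as np

def separation_loss(y_noise, y_signal, alpha=1.0):
    # push the network outputs on the two classes apart (triplet-loss inspired)
    return max(0.0, abs(y_noise) ** 2 - abs(y_signal) ** 2 + alpha)

def weight_regulariser(W, beta=1.0):
    # keep the weights from collapsing to zero ...
    norm_term = max(0.0, beta - np.sum(np.abs(W) ** 2))
    # ... and decorrelate the rows w_i of the weight matrix
    gram = W @ W.T
    off_diag = gram - np.diag(np.diag(gram))
    return norm_term + np.sum(np.maximum(0.0, off_diag))

print(separation_loss(0.0, 1.0))      # classes already separated -> 0
print(weight_regulariser(np.eye(2)))  # orthonormal rows -> 0
```

Both terms vanish once the representation is separated and the weight rows are orthogonal, so the regulariser only pushes the network towards a DFT-like basis rather than prescribing it.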

SLIDE 30

DFT from feature separation

  • The final result is a simple network with few parameters utilising the `dual’ representation (first layer in a deeper network).

[Figure: learned Fourier network representation of an input sample]

30

SLIDE 31

Feature separation in Ising

  • Which neural network architecture? Several architectures tried; so far most promising: U-Net (1505.04597)

[Diagram: s-configuration → U-Net (Conv., MaxPooling, UpSampling; cf. Pix2Pix) → σ-configuration]

31

Matching dual energy distribution

SLIDE 32

Enforcing dual representation

  • Observation: deep networks do not find these good representations by themselves.
  • Which ingredient is missing on the neural network side?
  • Workaround when there is a task which is accessible in both frames:
  • Step 1: find the better representation.
  • Step 2: use this representation on the previously inaccessible task.

[Diagram: original → autoencoder → different rep. → original, trained on a simple task; the autoencoder is then fixed and its intermediate representation reused for the sophisticated task]

32

SLIDE 33

Example: 1D Ising model

  • The strategy from the previous page improves performance:

Test accuracy, normal variables:

samples   n=4     n=5     n=8     n=9     n=12
6·10²     0.9113  0.8688  0.8788  0.8813  0.8803
3·10³     −       0.9243  0.9215  0.9223  0.9295
9.5·10³   −       −       0.9424  0.9475  0.9739

Test accuracy, dual variables:

samples   n=4     n=5     n=8     n=9     n=12
6·10²     0.9911  0.9783  0.9819  0.9855  0.9909
3·10³     −       0.9958  0.9977  0.9994  1.0000
9.5·10³   −       −       1.0000  1.0000  1.0000

Now with “dual” representation: test accuracy over 99%

33

SLIDE 34

Intermediate representation

  • Visualise the dependency of the intermediate variables $f_i(s)$ on the input variables $s$:

$M_{ij} = \frac{\left\langle \left(f_i(s_1, \ldots, s_j, \ldots, s_N) - f_i(s_1, \ldots, -s_j, \ldots, s_N)\right)^2 \right\rangle}{\frac{1}{N} \sum_{k=1}^{N} \left\langle \left(f_i(s_1, \ldots, s_k, \ldots, s_N) - f_i(s_1, \ldots, -s_k, \ldots, s_N)\right)^2 \right\rangle}$

  • This resembles the structure of the actual dual variables (results depend on the initialisation): similar but not identical islands.

[Heatmaps of $M_{ij}$: actual dual variables vs learned intermediate variables]

34
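The sensitivity measure $M_{ij}$ can be estimated by flipping one input spin at a time. A NumPy sketch, using the known dual map $\sigma_k = s_k s_{k+1} s_{k+2}$ as a stand-in for the learned intermediate variables $f_i$ (the sample size is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10

def f(s):
    # stand-in for the learned intermediate variables: the known dual map
    return np.array([s[k] * s[k + 1] * s[k + 2] for k in range(N - 2)])

samples = rng.choice([-1, 1], size=(200, N))
M = np.zeros((N - 2, N))
for s in samples:
    for j in range(N):
        t = s.copy()
        t[j] = -t[j]                   # flip input spin j
        M[:, j] += (f(s) - f(t)) ** 2  # sensitivity of f_i to spin j
M /= M.mean(axis=1, keepdims=True)     # per-row normalisation

# Row i of M is nonzero exactly on the spins s_i, s_{i+1}, s_{i+2}
print(M[0])
```

For the exact dual map the heatmap shows three-spin "islands" along the diagonal; the learned intermediate variables produce similar but not identical islands.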

SLIDE 35

Recap

35

SLIDE 36

Perspective

  • Large test arsenal of physics dualities from condensed matter, holography, field/string theory.
  • Dualities are our best tools to describe strongly coupled systems.
  • They allow us to get EFT operators at higher accuracy than normally possible (theory: large number of diagrams; experiment: large amount of data). Think of Yukawa couplings in the heterotic standard embedding.
  • Dual data representations can be obtained via ML. We can design the dual representation to have particular properties, e.g. to display certain quantities straightforwardly. Can we find new useful physics dualities in this way, i.e. new ways to describe dynamical systems? Which correlators should we use to obtain dual representations?
  • A twist for DL procedures? An intermediate layer with no information loss and a representation suitable for a “medium-hard” task. This pre-trained structure should be useful for more sophisticated tasks in these variables.

36

SLIDE 37

“Therefore we can see that the dualities we have been dealing with for antisymmetric tensors are only particular cases of Fourier transforms and finding the dual action reduces to finding Fourier transforms.”

Dualities and Fourier transformation

  • Review on dualities and global symmetries (Quevedo: hep-th/9706210): several dualities can be seen as Fourier transformations.

  • For instance, the duality between massive antisymmetric tensors:

$S = \int d^D x \left(F(\partial H_h) + G(H_h)\right), \qquad \tilde S = \int d^D x \left(\tilde F(\tilde B_{d-h}) + \tilde G(\partial \tilde B_{d-h})\right)$

$Z = \int \mathcal{D}\tilde B_{d-h}\, \mathcal{D}H_h\; e^{\int d^D x \left(H \cdot d\tilde B_{d-h} + G(H_h) + \tilde F(\tilde B_{d-h})\right)}$

with $e^{\tilde F} \overset{\text{FT}}{\longleftrightarrow} e^{F}$ a Fourier transform in the variables $\tilde B_{d-h}$, $H_h$.

37

SLIDE 38

Conclusions

  • Dualities are tightly connected to data representations.
  • Finding good data representations is at the heart of ML, and often `good’ representations are not found.
  • `Good’ representations can be enforced when known to exist.
  • This enables classification for previously `inaccessible’ tasks (1D Ising metastable configurations).
  • Interesting representations can be found from feature separation (e.g. DFT).
  • Interesting aspect: reduced complexity of networks.
  • Dualities in physics motivate multiple minima in a different landscape: that of the cost functions of neural networks.
  • Lots of different kinds of dualities in physics as a playground for efficient neural network architectures.

38

SLIDE 39

Thank you!

39

SLIDE 40

40

SLIDE 41

Neural Networks

  • Layout of neural nets:

[Diagram: input layer → hidden layers → output layer, connected by weights]

$y_i = \text{activation}\left(w_{ij} x_j + b_i\right)$

Input: $x_i$; Output: $y_i$; Parameters of the network: $\{\theta_A\} = \{w_{ij}, b_k\}$

  • Cost function depending on the parameters of the network:

$\text{cost}(\theta_A) = \sum_{\text{training set}} \left| y_\text{desired} - y_\text{predicted}(\theta_A) \right|$

  • Parameters of the network updated using gradient descent:

$\theta_A \to \theta_A - \eta\, \nabla_A \text{cost}(\theta_A)$

41
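The forward pass and gradient-descent update can be sketched end to end in NumPy. This toy uses a single sigmoid unit on made-up linearly separable data, and takes the usual logistic-loss gradient rather than the slide's absolute-value cost (an assumption for a stable example):

```python
import numpy as np

rng = np.random.default_rng(4)

# toy data: label is 1 when the inputs sum to a positive value
x = rng.normal(size=(200, 3))
y_target = (x.sum(axis=1, keepdims=True) > 0).astype(float)

W = rng.normal(size=(1, 3)) * 0.1   # weights w_ij
b = np.zeros(1)                     # biases b_i
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

eta = 0.5                           # learning rate
for _ in range(500):
    y = sigmoid(x @ W.T + b)            # y_i = activation(w_ij x_j + b_i)
    grad = (y - y_target) / len(x)      # d cost / d pre-activation
    W -= eta * grad.T @ x               # theta -> theta - eta * grad(cost)
    b -= eta * grad.sum(axis=0)

accuracy = np.mean((sigmoid(x @ W.T + b) > 0.5) == (y_target > 0.5))
print(accuracy)
```

After a few hundred full-batch updates the decision boundary aligns with the separating plane, illustrating the update rule $\theta_A \to \theta_A - \eta\,\nabla_A \text{cost}$.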

SLIDE 42

Constraining ALPs

  • Photon-axion interconversion in background magnetic fields:

  • One interesting parameter region can be obtained for

photons from sources in and behind galaxy cluster magnetic fields.


42

$\mathcal{L} \supset -\frac{g_{a\gamma\gamma}}{4}\, a F\tilde{F} = g_{a\gamma\gamma}\, a\, \mathbf{E}\cdot\mathbf{B}$

[Plot: conversion probability $P_{\gamma\to a}$ vs $\omega$ [keV]: too rapid oscillations and suppressed conversion at low $\omega$, a sweet spot at X-ray energies, no oscillations at high $\omega$]

NGC1275: 1605.01043, other sources: 1704.05256, Athena bounds: 1707.00176; with: Conlon, Day, Jennings, Rummel; Berg, Muia, Powell

$P_{\gamma\to a} = \frac{1}{2}\,\frac{\Theta^2}{1+\Theta^2}\,\sin^2\!\left(\Delta\sqrt{1+\Theta^2}\right)$

$\Theta = 0.28 \left(\frac{B_\perp}{1\,\mu\mathrm{G}}\right) \left(\frac{\omega}{1\,\mathrm{keV}}\right) \left(\frac{10^{-3}\,\mathrm{cm}^{-3}}{n_e}\right) \left(\frac{10^{11}\,\mathrm{GeV}}{M}\right)$

$\Delta = 0.54 \left(\frac{n_e}{10^{-3}\,\mathrm{cm}^{-3}}\right) \left(\frac{L}{10\,\mathrm{kpc}}\right) \left(\frac{1\,\mathrm{keV}}{\omega}\right)$

SLIDE 43

Standard bounds

  • Spectral distortions (but it is unknown where and at what strength, as this depends on the magnetic field)
  • Poisson noise vs. signal
  • Essentially: fluctuations larger than the noise
  • Human-made features: Fourier-based bounds give no real improvement

43

Conlon, Rummel 1808.05916

[Plots: axion coupling $|g_{a\gamma\gamma}|$ (GeV$^{-1}$) vs axion mass $m_a$ (eV), with exclusions from LSW (OSQAR), helioscopes (CAST), haloscopes (ADMX), telescopes, horizontal branch stars, KSVZ/DFSZ lines, VMB (PVLAS), SN 1987A, HESS, NGC1275 (Chandra/Athena), Fermi-LAT]

NGC1275: 1605.01043 Other sources: 1704.05256 Athena bounds: 1707.00176 with: Conlon, Day, Jennings; Berg, Muia, Powell, Rummel