Generalized Approximate Survey Propagation for High-dimensional Estimation
SLIDE 1

Generalized Approximate Survey Propagation for High-dimensional Estimation

Luca Saglietti

Yue Lu, Harvard University Carlo Lucibello, Bocconi University

SLIDE 2

Outline

  • Generalized Linear Models (GLM)
  • Real-valued phase retrieval
  • Inference model
  • Approximate message-passing
  • Effective landscapes and competition
  • Breaking the replica symmetry
  • Changing the effective landscape
  • Conclusions
SLIDE 3

Generalized Linear Models

3 ingredients:

  • TRUE SIGNAL: x* ∈ R^N, drawn i.i.d. from a prior P₀
  • OBSERVATION MATRIX: A ∈ R^{M×N}, with i.i.d. entries
  • OBSERVED SIGNAL: y ∈ R^M, produced component-wise by a scalar channel f,
    y_μ = f(z_μ),   z_μ = (1/√N) Σ_i A_μi x*_i

High-dimensional limit: M, N → ∞ with α = M/N fixed.

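The three ingredients can be made concrete with a short numerical sketch. The Gaussian prior, the Gaussian matrix, and the sign channel below are illustrative assumptions, not the only choices covered by the GLM framework:

```python
import numpy as np

def generate_glm(N, alpha, f, seed=0):
    """Generate one GLM instance: true signal, observation matrix, observations.

    Illustrative assumptions: i.i.d. N(0,1) prior for the signal and
    i.i.d. N(0,1) entries for the observation matrix A.
    """
    rng = np.random.default_rng(seed)
    M = int(alpha * N)                # high-dimensional limit: alpha = M/N fixed
    x_star = rng.normal(size=N)       # TRUE SIGNAL, drawn from the prior P0
    A = rng.normal(size=(M, N))       # OBSERVATION MATRIX
    z = A @ x_star / np.sqrt(N)       # pre-activations, O(1) per component
    y = f(z)                          # OBSERVED SIGNAL through the channel f
    return x_star, A, y

# Example channel: a one-bit (sign) output, a classic GLM instance.
x_star, A, y = generate_glm(N=200, alpha=3.0, f=np.sign)
```

Swapping `f` changes the inference problem (sign gives a perceptron-like channel, the absolute value gives phase retrieval) while the signal and matrix ingredients stay the same.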
SLIDE 7

An example: Real-valued Phase Retrieval

The model: y_μ = |⟨a_μ, x*⟩| ( + noise )

Fun facts about phase retrieval:

  • Physically meaningful!
  • A global sign (Z₂) symmetry in the signal space: x* and −x* produce the same observations.
  • A sample ratio α above the information-theoretic threshold should provide enough information for a perfect reconstruction.
  • Gradient descent struggles to reconstruct the signal until α is much larger than that threshold.
  • Rigorous results about convexification exist in a high-α regime.

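The sign symmetry is easy to verify numerically. A minimal sketch of the noiseless real-valued phase-retrieval channel (the Gaussian matrix and signal are illustrative assumptions):

```python
import numpy as np

def phase_retrieval_observations(A, x):
    """Noiseless real-valued phase retrieval: only magnitudes are observed."""
    N = A.shape[1]
    return np.abs(A @ x / np.sqrt(N))

rng = np.random.default_rng(1)
N, M = 100, 300
A = rng.normal(size=(M, N))
x_star = rng.normal(size=N)
y = phase_retrieval_observations(A, x_star)

# Z2 symmetry: x* and -x* produce exactly the same observations,
# so the signal can only ever be recovered up to a global sign.
assert np.allclose(y, phase_retrieval_observations(A, -x_star))
```

Any sensible error measure for this problem must therefore be sign-invariant, e.g. min over ±x̂ of the distance to x*.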
SLIDE 13

Inference Model

Sensible choice: a graphical model for the posterior,

    P(x | y) ∝ P_out(y | A x / √N) · P₀(x)

MATCHED: the assumed prior and channel equal the true ones. MISMATCHED: they differ.

Estimators:

  • Bayesian optimal (MMSE): x̂ = E[x | y]
  • Maximum a posteriori (MAP): x̂ = argmax_x P(x | y)

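The two estimators genuinely differ once the posterior is non-Gaussian. A scalar toy example (a Laplace prior with a Gaussian likelihood, both illustrative choices): the MAP estimate is the soft threshold and can be exactly zero, while the posterior mean (MMSE) is not:

```python
import numpy as np

# Scalar model: x ~ Laplace(rate lam), y = x + Gaussian noise of variance sigma2.
lam, sigma2, y = 1.0, 1.0, 0.5

# MAP estimator: posterior mode = soft thresholding (closed form for this model).
x_map = np.sign(y) * max(abs(y) - lam * sigma2, 0.0)

# Bayes-optimal (MMSE) estimator: posterior mean, computed on a grid.
xs = np.linspace(-12.0, 12.0, 6001)
log_post = -(y - xs) ** 2 / (2.0 * sigma2) - lam * np.abs(xs)
post = np.exp(log_post - log_post.max())
post /= post.sum()
x_mmse = float((xs * post).sum())
```

Here the posterior mode sits exactly at 0, while the mean is a small positive value pulled toward y: the two estimators answer different questions about the same posterior.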
SLIDE 16

Approximate Message-passing

How do we obtain the estimator? Easy (if everything is i.i.d.):

BP → (Gaussian ansatz) → rBP → (close on single-site quantities) → AMP (TAP)

Define 2 scalar inference channels:

  • an input channel, encoding the prior dependence
  • an output channel, encoding the data dependence

When do we get a good estimator?

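A minimal sketch of the AMP idea in the simplest case, a linear Gaussian channel with a Gaussian prior, where the scalar input channel is just a linear shrinkage and the state evolution for the effective noise level is explicit. This is an illustrative toy, not the general GAMP of the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
N, alpha, Delta = 500, 2.0, 0.05
M = int(alpha * N)

# Linear Gaussian GLM: y = A x* + noise, columns of A roughly unit norm.
A = rng.normal(size=(M, N)) / np.sqrt(M)
x_star = rng.normal(size=N)
y = A @ x_star + np.sqrt(Delta) * rng.normal(size=M)

# AMP with the Bayes input channel for a N(0,1) prior: eta(r) = r / (1 + tau2).
x = np.zeros(N)
z = y.copy()
tau2 = Delta + 1.0 / alpha               # state evolution init (MSE of x=0 is 1)
for _ in range(30):
    r = x + A.T @ z                      # effective scalar data: r ~ x* + N(0, tau2)
    g = 1.0 / (1.0 + tau2)               # eta'(r): shrinkage of the input channel
    x = g * r                            # input channel: posterior mean given r
    z = y - A @ x + (g / alpha) * z      # residual with the Onsager correction term
    tau2 = Delta + (tau2 * g) / alpha    # state evolution: new effective noise level

mse = np.mean((x - x_star) ** 2)
```

The Onsager term `(g / alpha) * z` is what distinguishes AMP from naive iterative thresholding; it keeps the effective scalar observations `r` behaving like the true signal plus Gaussian noise of variance `tau2`.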
SLIDE 22

Effective Landscapes and Competition

Scenario 1 (possible scenario): IMPOSSIBLE phase, at low SNR (α).

[Figure: effective energy as a function of the overlap ρ, from low overlap to high overlap.]

  • Overlap: ρ = x̂ · x* / N, the correlation between the estimator and the true signal.
  • GD in this effective landscape: its stationary points correspond to fixed points of the message passing.

SLIDE 23

Effective Landscapes and Competition

Scenario 2 (possible scenario): IMPOSSIBLE phase as a function of the SNR (α).

[Figure: effective energy vs overlap ρ, from low overlap to high overlap.]

SLIDE 24

Effective Landscapes and Competition

Scenario 3 (possible scenario): HARD and IMPOSSIBLE phases as a function of the SNR (α).

[Figure: effective energy vs overlap ρ, from low overlap to high overlap.]

SLIDE 25

Effective Landscapes and Competition

Scenario 4 (possible scenario): EASY, HARD, and IMPOSSIBLE phases as a function of the SNR (α).

[Figure: effective energy vs overlap ρ, from low overlap to high overlap.]

SLIDE 26

Breaking the Symmetry

GAMP vs GASP(s)

  • GAMP: replica symmetry (RS) assumption. GASP(s): 1RSB assumption. The difference enters through the input scalar channel.
  • Same computational complexity
  • (Potentially) more expensive element-wise operations
  • How to set the symmetry breaking parameter s?

s: the SYMMETRY BREAKING PARAMETER

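The role of s can be illustrated on a scalar toy analogue of the 1RSB reweighting, φ(s) = (1/s) log E[Z^s]: at s = 1 it gives the annealed average log E[Z], and as s → 0 it recovers the quenched average E[log Z]. This is a generic illustration of the reweighting idea behind the 1RSB input channel, not the actual GASP equations:

```python
import numpy as np

# Toy scalar "partition function": a binary variable x in {-1, +1} in a random
# Gaussian field h = B + sqrt(v) * xi, so Z(h) = 2 cosh(h).  (Illustrative.)
B, v = 0.5, 1.0
xi = np.linspace(-8.0, 8.0, 2001)            # grid for the Gaussian average
w = np.exp(-0.5 * xi ** 2)
w /= w.sum()
log_Z = np.log(2.0 * np.cosh(B + np.sqrt(v) * xi))

def phi(s):
    """1RSB-style reweighted free entropy (1/s) * log E[Z^s]."""
    return float(np.log((w * np.exp(s * log_Z)).sum()) / s)

quenched = float((w * log_Z).sum())                   # E[log Z]   (s -> 0 limit)
annealed = float(np.log((w * np.exp(log_Z)).sum()))   # log E[Z]   (s = 1)
```

Varying s interpolates monotonically between the two averages, which is exactly the extra knob GASP(s) exposes compared to GAMP.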
SLIDE 29

Message-passing equations

[The GAMP and GASP(s) message-passing equations are shown side by side.]

SLIDE 31

Changing the Effective Landscape

Phase retrieval, noiseless case, no regularizer.

[Figure: energy of the effective landscape under the RS and 1RSB ansätze, down to the GROUND STATE; perfect recovery!]

s: explore minima at different energy/complexity levels.

RS: hard below a threshold in α. 1RSB: hard below a smaller threshold in α.

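The continuation idea, annealing a control parameter so that the landscape starts easy and gradually morphs into the hard one while the minimizer is tracked, can be shown on a one-dimensional toy. Here Gaussian smoothing plays the role of the continuation parameter instead of s; this is purely illustrative, not the GASP schedule:

```python
import numpy as np

# Tilted double well f(x) = (x^2 - 1)^2 + 0.3 x:
# global minimum near x = -1.03, local minimum near x = +0.96.
def grad(x):
    return 4.0 * x ** 3 - 4.0 * x + 0.3

# Gaussian smoothing E[f(x + sqrt(v) z)] has a closed-form gradient for this f;
# for v >= 1/3 the smoothed landscape is convex.
def grad_smoothed(x, v):
    return 4.0 * x ** 3 + 2.0 * (6.0 * v - 2.0) * x + 0.3

def gd(x, g, steps=500, lr=0.01):
    for _ in range(steps):
        x -= lr * g(x)
    return x

# Plain gradient descent from a bad initialization: stuck in the local minimum.
x_plain = gd(2.0, grad, steps=3000)

# Continuation: start on the convex smoothed landscape, then anneal v -> 0,
# warm-starting each level from the previous minimizer.
x_cont = 2.0
for v in np.linspace(0.5, 0.0, 11):
    x_cont = gd(x_cont, lambda t: grad_smoothed(t, v))
```

Plain descent ends in the spurious minimum near +0.96, while the annealed run tracks the minimizer into the global basin near -1.03, mirroring how a continuation in s lets GASP reach solutions GAMP misses.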
SLIDE 33

Conclusions

  • In mismatched inference settings, the RS assumption can be wrong.
  • GASP can improve over GAMP, at the same O(N^2) complexity.
  • A simple continuation strategy can push GASP down to the Bayes-optimal (BO) algorithmic threshold.
  • For more details, please check my poster this evening!

THANK YOU FOR YOUR ATTENTION!