SLIDE 1

Sparse Audio Models For Inverse Audio Problems Rémi Gribonval INRIA Rennes - Bretagne Atlantique, France

remi.gribonval@inria.fr

SLIDE 2

November 5th 2012

  • R. GRIBONVAL - Workshop on Sparsity, Compressed Sensing and Applications

Outline

  • Inverse problems in audio processing
    ✓ audio inpainting
    ✓ source localization
  • Learning low-dimensional audio models
    ✓ dictionaries ...

SLIDE 3

Contributors

  • Audio inpainting
    ✓ A. Adler, N. Bertin, V. Emiya, M. Elad, C. Guichaoua, M. Jafari, M. Plumbley
  • Source localization
    ✓ S. Nam
  • Dictionary learning
    ✓ F. Bach, R. Jenatton, K. Schnass

echange.inria.fr small-project.eu

SLIDE 4

Audio inpainting

with A. Adler, V. Emiya, M. Elad, M. Jafari, M. Plumbley

SLIDE 5

Image Inpainting

[Figure: observed image vs. inpainted image]

SLIDE 6

Audio Inpainting?

[Figure: spectrogram (time vs. frequency) and waveforms (time vs. amplitude) illustrating typical degradations]

  • Clicks
  • Limited bandwidth
  • Holes (packet loss)
  • Clipping


SLIDE 8

Declipping as a linear inverse problem

  • Original (unknown) samples: x
  • Clipped (observed) samples: y
  • Subset of reliable samples: those observed below the clipping level
  • Linear inverse problem: the reliable observations are a known linear selection M of the unknown signal,

    y_reliable = M x
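As a toy illustration of this formulation (not the code from the talk; the helper names `clip` and `reliable_mask` are made up here), the mask of unclipped samples directly defines the selection matrix M:

```python
import numpy as np

def clip(x, level):
    """Hard-clip a signal at +/- level (simulated saturation)."""
    return np.clip(x, -level, level)

def reliable_mask(y, level, tol=1e-12):
    """Samples observed strictly inside the clipping level are reliable."""
    return np.abs(y) < level - tol

# Toy signal: a sine wave clipped at 0.8.
t = np.linspace(0.0, 1.0, 100, endpoint=False)
x = np.sin(2 * np.pi * 3 * t)
y = clip(x, 0.8)
mask = reliable_mask(y, 0.8)

# M keeps the rows of the identity indexed by the reliable samples,
# so the observations satisfy y_reliable = M x exactly.
M = np.eye(t.size)[mask]
```

In practice one never materializes M as a dense matrix; indexing with the boolean mask is equivalent and cheaper.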

SLIDE 9

Inverse problems & signal models

[Diagram: observation space vs. signal domain]

Need for a model = prior knowledge

SLIDE 10

Sparse audio models

  • Time domain
  • Time-frequency domain

[Figure: time-domain and time-frequency coefficient plots; black = zero]

SLIDE 11

Mathematical expression

  • Signal / image = high dimensional vector
  • Model = linear combination of basis vectors

(ex: time-frequency atoms, wavelets)

  • Sparsity = small L0 (quasi)-norm

Dictionary of atoms (Mallat & Zhang 93):

    x ∈ R^d,    x ≈ Σ_k z_k d_k = D z,    ‖z‖₀ = Σ_k |z_k|⁰ = card{k : z_k ≠ 0}
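The model above can be sketched numerically; the random Gaussian dictionary below is a stand-in of my own (the talk uses time-frequency atoms or wavelets):

```python
import numpy as np

def l0(z, tol=1e-12):
    """The L0 (quasi)-norm ||z||_0: number of nonzero entries of z."""
    return int(np.sum(np.abs(z) > tol))

# A random overcomplete dictionary D in R^{d x K}; columns = atoms d_k.
rng = np.random.default_rng(0)
d, K = 16, 32
D = rng.standard_normal((d, K))
D /= np.linalg.norm(D, axis=0)  # unit-norm atoms

# A 3-sparse coefficient vector z synthesizes the signal x = D z.
z = np.zeros(K)
z[[2, 11, 25]] = [1.0, -0.5, 2.0]
x = D @ z
```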

SLIDE 12

CoSparse models and inverse problems

[Diagram: observation space vs. signal domain]

SLIDE 18
Audio Declipping

  • Model
    ✓ sparsity in a time-frequency dictionary: x = D z
  • Algorithm:
    ✓ find sparse coefficients ẑ such that y = M D ẑ, via (Orthogonal) Matching Pursuit (Mallat & Zhang 93)
    ✓ + ensure compatibility with the clipping constraint, via convex optimization
    ✓ estimate x̂ = D ẑ

[Figure: waveform (time vs. amplitude) comparing the original, clipped, and declipped signals]

  • A. Adler, V. Emiya, M. Jafari, M. Elad, R. Gribonval and M. D. Plumbley, Audio Inpainting, IEEE Trans. Audio Speech and Language Proc., 2012

see also talk by B. Mailhé
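A minimal sketch of the greedy step, under simplifying assumptions of mine: a random dictionary stands in for the time-frequency one, and the clipping-consistency constraint (the convex-optimization step above) is omitted; only the reliable samples drive the fit:

```python
import numpy as np

def omp(A, b, n_atoms):
    """Plain Orthogonal Matching Pursuit: greedily select the atom most
    correlated with the residual, then refit the support by least squares."""
    residual, support = b.copy(), []
    coef = np.zeros(0)
    for _ in range(n_atoms):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        residual = b - A[:, support] @ coef
    z = np.zeros(A.shape[1])
    z[support] = coef
    return z

# Declipping sketch: fit the sparse model on the reliable samples only
# (y_reliable = M D z), then resynthesize the full signal x_hat = D z_hat.
rng = np.random.default_rng(1)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)
z_true = np.zeros(128)
z_true[[5, 40]] = [1.0, -2.0]
x = D @ z_true
level = 0.8 * np.max(np.abs(x))
y = np.clip(x, -level, level)       # observed, clipped signal
mask = np.abs(y) < level            # reliable (unclipped) samples
z_hat = omp(D[mask], y[mask], n_atoms=2)
x_hat = D @ z_hat
```

The restriction `D[mask]` is exactly the product M D from the slide: the sparse code is estimated from the reliable rows, but the synthesis `D @ z_hat` fills in the clipped samples too.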

SLIDE 19

Source localization

with S. Nam

SLIDE 20

Localization with «few» microphones

  • Possible goals

    ✓ localize emitting sources
    ✓ reconstruct emitted signals
    ✓ extrapolate acoustic field

  • Linear inverse problem
  • Need a model

    y = M x,    y ∈ R^m (time series recorded at the sensors),    x ∈ R^N ((discretized) spatio-temporal acoustic field)

SLIDE 22

Physics-driven design of model

  • Pressure field p(r, t)
  • Wave equation on a domain D:

    (Δp − (1/c²) ∂²p/∂t²)(r, t) = s(r, t),   r in the interior of D

  • Boundary + initial conditions, e.g. rigid walls:

    ∂p/∂n (r, t) = 0,   r ∈ ∂D

Discretizing the sources & boundaries turns the physics into a linear analysis relation:  Ω x = z
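A hypothetical 1D+t finite-difference sketch of this Ω (my own toy discretization, not the one from the talk): the discrete wave operator applied to a space-time field yields the source field, which vanishes wherever the homogeneous equation holds.

```python
import numpy as np

def second_diff(n):
    """Discrete second derivative (tridiagonal [1, -2, 1] stencil)."""
    return -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)

# x is a space-time field of shape (ns, nt); applying the wave operator
# (Laplacian in space minus (1/c^2) * second time derivative) plays the
# role of Omega, so Omega(x) = z is a discretized source field --
# sparse when there are few sources.
ns, nt, c = 20, 30, 1.0
Ls, Lt = second_diff(ns), second_diff(nt)

def omega(x):
    return Ls @ x - (1.0 / c**2) * x @ Lt.T

# A discrete traveling wave x[i, j] = g(i - j) solves the homogeneous
# equation away from the boundary: omega(x) vanishes in the interior
# (the boundary rows pick up truncation terms from the stencil).
i, j = np.arange(ns)[:, None], np.arange(nt)[None, :]
x = np.sin(0.3 * (i - j))
z = omega(x)
```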

SLIDE 23

Group sparse source model

  • Few non-moving sources = spatially sparse

[Figure: space-time grid of coefficients z_{r,t}; the active entries are confined to a few spatial rows r]

SLIDE 24

Group sparse regularization

  • Inverse problem: y = M x
  • Sparse regularization with a mixed norm:

    x̂ = arg min_x  (1/2) ‖y − M x‖₂² + λ ‖Ω x‖₁,₂

Promotes group sparsity, cf. Kowalski & Torresani 2009, Eldar & Mishali 2009, Baraniuk & al 2010, Jenatton & al 2011
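The ℓ₁,₂ mixed norm can be sketched as follows (my own toy illustration; the grouping assumed here is row = spatial location, columns = time, matching the previous slide):

```python
import numpy as np

def mixed_norm_12(Z):
    """||Z||_{1,2}: l2 norm inside each group (one row = one spatial
    location over time), then l1 across groups; it is small when only
    a few rows are active, regardless of how energy spreads in time."""
    return float(np.sum(np.linalg.norm(Z, axis=1)))

# Two coefficient fields with equal energy (Frobenius norm): one with
# 2 active spatial locations, one diffuse over all 50 locations.
rng = np.random.default_rng(0)
sparse = np.zeros((50, 40))
sparse[[7, 23]] = rng.standard_normal((2, 40))
diffuse = rng.standard_normal((50, 40))
diffuse *= np.linalg.norm(sparse) / np.linalg.norm(diffuse)
```

At equal energy the group-sparse field has a much smaller mixed norm, which is why the penalty favors few non-moving sources.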

SLIDE 26
Sparse Field Reconstruction

  • Setting
    ✓ 2D+t vibrating plate, 77×77 grid
    ✓ 2 sources, random locations
    ✓ 6 microphones, random locations
    ✓ known complex boundaries
    ✓ ground truth generated with a naive discretization
  • Results

[Figure: ground truth vs. sparse reconstruction]

  • S. Nam and R. Gribonval. Physics-driven structured cosparse modeling for source localization, ICASSP 2012
SLIDE 31

Localizing the source next door

  • Domain, source and microphones
  • Sparse source localization

[Figure: domain with source and microphone positions, and the sparse localization result]

Reasons of success
  • sparsity of sources
  • known room shape
  • known boundaries

What if the shape is unknown?

SLIDE 32

CoSparse models and inverse problems

[Diagram: observation space vs. signal domain]

SLIDE 34

CoSparse models and inverse problems

[Diagram: observation space vs. signal domain; «Perception» vs. «Knowledge»]

SLIDE 35

Dictionary learning

with K. Schnass, F. Bach, R. Jenatton

SLIDE 39

A quest for the perfect sparse model

Training image database → patch extraction → training patches x_n, with model

    x_n = D z_n,   1 ≤ n ≤ N

(unknown dictionary D, unknown sparse coefficients z_n). Sparse learning of D̂ yields:

  • edge-like atoms [Olshausen & Field 96, Aharon et al 06, Mairal et al 09, ...]
  • shifts of edge-like motifs [Blumensath 05, Jost et al 05, ...]

SLIDE 43
Dictionary Learning = Sparse Matrix Factorization

  • Training collection = point cloud x₁, …, x_N ∈ R^d
  • Each training point is sparsely approximated in the dictionary: x_n ≈ D z_n

SLIDE 44

Dictionary Learning = Sparse Matrix Factorization

    X ≈ D Z

with X = [x₁ … x_N] of size d × N, dictionary D of size d × K, and coefficients Z = [z₁ … z_N] of size K × N with s-sparse columns (at most s nonzero entries each).
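The shapes in this factorization can be sketched with synthetic data (my own toy generator; any real experiment would use image patches or audio frames):

```python
import numpy as np

# Synthetic data matching X ≈ D Z: K unit-norm atoms in R^d, and
# N training vectors, each an s-sparse combination of the atoms.
rng = np.random.default_rng(0)
d, K, N, s = 16, 32, 200, 3
D = rng.standard_normal((d, K))
D /= np.linalg.norm(D, axis=0)
Z = np.zeros((K, N))
for n in range(N):
    support = rng.choice(K, size=s, replace=False)  # random s-sparse support
    Z[support, n] = rng.standard_normal(s)
X = D @ Z   # (d x N) = (d x K) @ (K x N), with s-sparse columns in Z
```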

SLIDE 45

Many approaches


  • Independent component analysis

[see e.g. book by Comon & Jutten 2011]

  • Convex

[Bach et al., 2008; Bradley and Bagnell, 2009]

  • Submodular

[Krause and Cevher, 2010]

  • Bayesian

[Zhou et al., 2009]

  • Non-convex matrix-factorization

[Olshausen and Field, 1997; Pearlmutter & Zibulevsky 2001, Aharon et al. 2006; Lee et al., 2007; Mairal et al., 2010 (... and many other authors)]

SLIDE 46

Learning = constrained minimization

  ✓ Constraint = dictionary with unit-norm columns

    D̂ = arg min_{D ∈ 𝒟} F_X(D),    𝒟 = {D = [d₁, …, d_K] : ‖d_k‖₂ = 1 for all k}
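The constraint set 𝒟 has a simple Euclidean projection, which alternating-minimization schemes typically apply after each dictionary update (a sketch of mine, with a hypothetical `eps` guard against zero columns):

```python
import numpy as np

def project_unit_columns(D, eps=1e-12):
    """Project a dictionary onto the constraint set: unit l2-norm atoms.
    Each column is rescaled to norm 1; eps avoids division by zero."""
    return D / np.maximum(np.linalg.norm(D, axis=0), eps)

rng = np.random.default_rng(0)
D = project_unit_columns(rng.standard_normal((8, 12)))
```

Normalizing the atoms removes the trivial scaling ambiguity between D and Z (an atom scaled up can always be compensated by scaling its coefficients down).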

SLIDE 47

Empirical findings

SLIDE 51

Numerical example (2D)

[Figure: N = 1000 Bernoulli-Gaussian training samples in the plane]

    X = D₀ Z₀,  candidate bases D_{θ₀,θ₁} parameterized by two angles,  cost F_X(D) = ‖D⁻¹_{θ₀,θ₁} X‖₁

Symmetry = permutation ambiguity

SLIDE 52

Numerical example (2D)

[Figure: N = 1000 Bernoulli-Gaussian training samples, and the cost surface F_X(D) = ‖D⁻¹_{θ₀,θ₁} X‖₁ over the angles (θ₀, θ₁), with X = D₀ Z₀]

Empirical observations:
a) Global minima match the angles of the original basis.
b) There is no other local minimum.

SLIDE 55

Sparsity vs coherence (2D)

[Figure: scatter plots of N = 1000 Bernoulli-Gaussian training samples, ranging from sparse to weakly sparse (p → 1) and from incoherent to coherent (µ = |cos(θ₁ − θ₀)| → 1); color scale from 0.5 to 1 gives the empirical probability of success for three events: ground truth = local min, ground truth = global min, no spurious local min]

Rule of thumb: perfect recovery if
a) Incoherence: µ < 1 − p
b) Enough training samples (N large enough)
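The coherence µ of the 2D rotating basis from this experiment can be computed directly (a small sketch of mine; `basis` builds the two unit atoms at angles θ₀, θ₁):

```python
import numpy as np

def coherence(D):
    """Mutual coherence: largest |<d_i, d_j>| over distinct unit-norm atoms."""
    G = np.abs(D.T @ D)          # Gram matrix of inner products
    np.fill_diagonal(G, 0.0)     # ignore |<d_k, d_k>| = 1
    return float(G.max())

def basis(theta0, theta1):
    """2D dictionary with unit atoms at angles theta0 and theta1."""
    return np.array([[np.cos(theta0), np.cos(theta1)],
                     [np.sin(theta0), np.sin(theta1)]])

# For two unit vectors the coherence is exactly |cos(theta1 - theta0)|.
mu = coherence(basis(0.0, np.pi / 3))
```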

SLIDE 56

Empirical findings

  • Stable & robust dictionary identification
    ✓ Global minima often match the ground truth
    ✓ Often, there is no spurious local minimum
  • Role of parameters?
    ✓ sparsity of Z?
    ✓ incoherence of D?
    ✓ noise level?
    ✓ presence / nature of outliers?
    ✓ sample complexity (number of training samples)?

SLIDE 57

Theoretical guarantees

SLIDE 60

Theoretical guarantees

  • Excess risk analysis (~ machine learning): bound F_X(D̂) − min_D E_X F_X(D)
    [Maurer and Pontil, 2010; Vainsencher et al., 2010; Mehta and Gray, 2012]
  • Identifiability analysis (~ signal processing): bound ‖D̂ − D₀‖_F
    [Independent Component Analysis, e.g. book by Comon & Jutten 2011]

Array processing perspective: dictionary ~ directions of arrival; identification ~ source localization.
Neural coding perspective: dictionaries ~ receptive fields.

SLIDE 61

Theoretical guarantees: overview

                 [G. & Schnass 2010]   [Geng & al 2011]   [Jenatton, Bach & G.]
overcomplete     no                    yes                yes
outliers         yes                   no                 yes
noise            no                    no                 yes

Signal model and cost function vary across the three works; e.g. min_D F_X(D) with F_X(D) = min_{D,Z} ‖Z‖₁ s.t. D Z = X.
SLIDE 63

Learning Guarantees vs Empirical Findings

  • Robustness to noise
  • Sample complexity

[Figure, left: relative error vs. number N of training signals, Hadamard-Dirac dictionary in dimension d (d = 8, 16, 32; random and oracle initializations). Right: relative error vs. noise level, Hadamard dictionary in dimension d, with the predicted slope.]

SLIDE 64

To conclude ...

SLIDE 65

CoSparse models and inverse problems

[Diagram: observation space vs. signal domain; «Perception» vs. «Knowledge»]

SLIDE 67

Synthesis vs Analysis

  • Traditional Synthesis Model
    ✓ Synthesis dictionary of atoms: x = D z = Σ_i z_i d_i
    ✓ «Lego» model: building blocks
    ✓ Low-dimension = few atoms: ‖z‖₀ ≪ dimension
    Ex: man-made codes in communications
  • Alternate Analysis Model
    ✓ Analysis operator Ω: ⟨ω_i, x⟩ = 0 for many rows ω_i of Ω
    ✓ «Carving out» model: constraints
    ✓ Low-dimension = many constraints: ‖Ω x‖₀ ≪ dimension
    Ex: coupling with laws of physics, (Δx − (1/c²) ∂²x/∂t²) = 0 in the interior of D

Misleadingly similar models; in fact, fundamentally different! Concept of cosparsity [Nam & al 2011]
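The analysis side of this contrast can be sketched with the simplest possible Ω (finite differences, my own choice here, not the wave operator from the talk): cosparsity counts the zeros of Ωx rather than the nonzeros of z.

```python
import numpy as np

def cosparsity(Omega, x, tol=1e-10):
    """Number of ZEROS of Omega @ x: large when x satisfies many of
    the analysis constraints <omega_i, x> = 0."""
    return int(np.sum(np.abs(Omega @ x) <= tol))

# Omega = first-order finite differences (rows omega_i = e_{i+1} - e_i).
# A piecewise-constant signal with one jump violates a single constraint.
n = 100
Omega = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)
x = np.concatenate([np.zeros(50), np.ones(50)])
```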

SLIDE 69

Time scales in «knowledge building»

  • Expert knowledge (Fourier / wavelets)
    ✓ Harmonic analysis
    ✓ Evolution of species
  • Training from corpus
    ✓ Dictionary learning
    ✓ Individual experience
  • «Online» training / adaptivity?

see also talk by M. Yaghoobi

SLIDE 70
  • Audio inpainting
    ✓ A. Adler, N. Bertin, V. Emiya, M. Elad, C. Guichaoua, M. Jafari, M. Plumbley
  • Source localization
    ✓ S. Nam
  • Dictionary learning
    ✓ F. Bach, R. Jenatton, K. Schnass

echange.inria.fr small-project.eu

THANKS!