Just Relax
Convex Programming Methods for Subset Selection and Sparse Approximation
Joel A. Tropp <jtropp@ices.utexas.edu>
The University of Texas at Austin
Subset Selection

❧ Work in a finite-dimensional inner-product space C^d
❧ Let {ϕ_ω : ω ∈ Ω} be a dictionary of unit-norm elementary signals
❧ Suppose s is an arbitrary input signal from C^d
❧ Let τ be a fixed, positive threshold
❧ The subset selection problem is to solve

    min_{c ∈ C^Ω}  ‖ s − Σ_ω c_ω ϕ_ω ‖₂² + τ² ‖c‖₀

  where the ℓ0 quasi-norm ‖c‖₀ counts the nonzero components of c
❧ The problem arose in statistics more than 50 years ago
❧ Reference: [Miller 2002]
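As a concrete reading of the objective, here is a minimal sketch (Python with NumPy; the dictionary, signal, and helper name are illustrative assumptions of this transcript, not from the talk) that evaluates the subset-selection cost for a candidate coefficient vector:

```python
import numpy as np

def subset_selection_cost(Phi, s, c, tau):
    """Objective ||s - Phi c||_2^2 + tau^2 * ||c||_0, where the columns of
    Phi are the dictionary elements phi_w and ||c||_0 counts nonzeros."""
    residual = s - Phi @ c
    num_nonzero = np.count_nonzero(c)          # the l0 quasi-norm of c
    return np.linalg.norm(residual) ** 2 + tau**2 * num_nonzero

# Tiny illustrative example: 2 atoms in R^2
Phi = np.array([[1.0, 0.0],
                [0.0, 1.0]])
s = np.array([3.0, 0.1])
cost = subset_selection_cost(Phi, s, np.array([3.0, 0.0]), tau=1.0)
# residual energy 0.1^2 plus tau^2 times one nonzero, i.e. about 1.01
```

The τ² penalty makes each retained atom "cost" τ², so a coefficient is worth keeping only if it reduces the residual energy by more than that.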
❧ Linear regression
❧ Lossy compression of audio, images, and video
❧ De-noising functions
❧ Detection and estimation of superimposed signals
❧ Regularization of linear inverse problems
❧ Approximation of functions by low-cost surrogates
❧ Sparse pre-conditioners for conjugate gradient solvers
❧ . . .
Subset selection is combinatorial:

    min_{c ∈ C^Ω}  ‖ s − Σ_ω c_ω ϕ_ω ‖₂² + τ² ‖c‖₀

❧ In general, the problem is NP-hard
❧ References: [Natarajan 1995, Davis et al. 1997]

Replace it with a convex program:

    min_{b ∈ C^Ω}  ½ ‖ s − Σ_ω b_ω ϕ_ω ‖₂² + γ ‖b‖₁

❧ Can be solved in polynomial time with standard software
❧ Reference: [Chen et al. 1999]
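One concrete way to solve the relaxation is iterative soft thresholding (proximal gradient). The sketch below is an assumption of this transcript, not the "standard software" the slide refers to; the random dictionary, seed, and parameter values are all illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    """Entrywise shrinkage toward zero by t (the prox of t * ||.||_1)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(Phi, s, gamma, n_iter=1000):
    """Minimize 0.5*||s - Phi b||_2^2 + gamma*||b||_1 by proximal gradient."""
    L = np.linalg.norm(Phi, 2) ** 2        # Lipschitz constant of the gradient
    b = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ b - s)
        b = soft_threshold(b - grad / L, gamma / L)
    return b

# Illustrative dictionary: 128 random unit-norm atoms in R^64
rng = np.random.default_rng(0)
Phi = rng.standard_normal((64, 128))
Phi /= np.linalg.norm(Phi, axis=0)
c_true = np.zeros(128)
c_true[[5, 40, 99]] = [2.0, -1.5, 1.0]     # a 3-atom superposition
s = Phi @ c_true
b = ista(Phi, s, gamma=0.1)
# The large entries of b typically sit on the true support {5, 40, 99},
# each shrunk slightly toward zero by the l1 penalty
```

Each iteration takes a gradient step on the least-squares term and then applies the soft-threshold operator, which is exactly the proximal map of the ℓ1 penalty.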
[Figure: unit balls of the ℓ0 quasi-norm, the ℓ1 norm, and the ℓ2 norm]
Subset Selection

    min_{c ∈ C^Ω}  ‖ s − Σ_ω c_ω ϕ_ω ‖₂² + τ² ‖c‖₀

Convex Relaxation

    min_{b ∈ C^Ω}  ½ ‖ s − Σ_ω b_ω ϕ_ω ‖₂² + γ ‖b‖₁
❧ If the dictionary is orthonormal, the ℓ0 problem has an analytic solution
❧ Compute the inner products between the signal and the dictionary: c_ω = ⟨s, ϕ_ω⟩
❧ Apply the hard threshold operator with cutoff τ to each coefficient
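In code the orthonormal-case recipe is two lines; the identity dictionary and the signal values below are illustrative assumptions:

```python
import numpy as np

def hard_threshold(c, tau):
    """Keep each coefficient with |c_w| > tau; zero out the rest."""
    return np.where(np.abs(c) > tau, c, 0.0)

Phi = np.eye(4)                       # simplest orthonormal dictionary
s = np.array([3.0, 0.2, -1.5, 0.05])
c = Phi.conj().T @ s                  # inner products <s, phi_w>
c_opt = hard_threshold(c, tau=1.0)    # keeps 3.0 and -1.5, zeros the rest
```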
❧ If the dictionary is orthonormal, the ℓ1 problem has an analytic solution
❧ Compute the inner products between the signal and the dictionary: b_ω = ⟨s, ϕ_ω⟩
❧ Apply the soft threshold operator with cutoff γ to each coefficient
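The analogous sketch for the ℓ1 problem, again with an illustrative identity dictionary; unlike hard thresholding, the surviving coefficients are also shrunk by γ:

```python
import numpy as np

def soft_threshold(c, gamma):
    """Shrink each coefficient toward zero by gamma in magnitude."""
    return np.sign(c) * np.maximum(np.abs(c) - gamma, 0.0)

Phi = np.eye(4)                       # orthonormal dictionary
s = np.array([3.0, 0.2, -1.5, 0.05])
b_star = soft_threshold(Phi.conj().T @ s, gamma=1.0)
# 3.0 -> 2.0 and -1.5 -> -0.5; the small entries vanish
```

This shrinkage is the source of the coefficient bias that Theorems A and B quantify.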
Insight: Subset selection is easy provided that the dictionary is nearly orthonormal
❧ [Donoho–Huo 2001] introduces the coherence parameter µ:

    µ  =  max_{λ ≠ ω}  |⟨ϕ_λ, ϕ_ω⟩|

❧ Related to the packing radius of the dictionary, viewed as a subset of the projective space P^{d−1}(C)
❧ Possible to have |Ω| = d² and µ = 1/√d
[Figure: a dictionary of impulses and complex exponentials; the impulses have entries of size 1, the exponentials entries of magnitude 1/√d]
Theorem A. Fix an input signal and a threshold τ. Suppose that

❧ copt solves the subset selection problem with threshold τ;
❧ copt contains no more than (1/3) µ⁻¹ nonzero components; and
❧ b⋆ solves the convex relaxation with γ = 2τ.

Then it follows that

❧ copt(ω) = 0 implies b⋆(ω) = 0;
❧ |b⋆(ω) − copt(ω)| ≤ 3τ for each ω;
❧ in particular, b⋆(ω) ≠ 0 so long as |copt(ω)| > 3τ; and
❧ the relaxation has a unique solution.
Sparse Approximation

❧ Suppose s is an arbitrary input signal from C^d
❧ Let ε be a fixed, positive error tolerance
❧ The error-constrained sparse approximation problem is

    min_{c ∈ C^Ω}  ‖c‖₀   subject to   ‖ s − Σ_ω c_ω ϕ_ω ‖₂ ≤ ε

❧ Its convex relaxation is

    min_{b ∈ C^Ω}  ‖b‖₁   subject to   ‖ s − Σ_ω b_ω ϕ_ω ‖₂ ≤ δ
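The error-constrained relaxation can be reduced to the penalized form by sweeping the penalty weight γ until the residual meets the tolerance, since the residual of the penalized problem grows with γ. This reduction is a standard trick; the solver below (plain proximal gradient with bisection, in NumPy, with illustrative data) is a sketch assumed by this transcript, not software from the talk.

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def penalized(Phi, s, gamma, n_iter=1000):
    """Proximal gradient for 0.5*||s - Phi b||^2 + gamma*||b||_1."""
    L = np.linalg.norm(Phi, 2) ** 2
    b = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        b = soft_threshold(b - Phi.T @ (Phi @ b - s) / L, gamma / L)
    return b

def error_constrained(Phi, s, delta, n_bisect=25):
    """min ||b||_1 subject to ||s - Phi b||_2 <= delta, by bisection on gamma."""
    lo, hi = 0.0, np.max(np.abs(Phi.T @ s))   # at gamma = hi the solution is b = 0
    for _ in range(n_bisect):
        gamma = (lo + hi) / 2
        if np.linalg.norm(s - Phi @ penalized(Phi, s, gamma)) > delta:
            hi = gamma                         # infeasible: shrink less
        else:
            lo = gamma                         # feasible: try a sparser b
    return penalized(Phi, s, lo)

# Illustrative data: a noisy superposition of two random unit-norm atoms
rng = np.random.default_rng(1)
Phi = rng.standard_normal((32, 64))
Phi /= np.linalg.norm(Phi, axis=0)
c = np.zeros(64)
c[[3, 17]] = [1.5, -2.0]
delta = 0.05 * np.sqrt(32)                    # tolerance at the noise level
s = Phi @ c + 0.05 * rng.standard_normal(32)
b = error_constrained(Phi, s, delta)
```

Setting δ at the noise level, as the relaxation in Theorem B does (up to the √(1 + 6m) factor), lets the constraint absorb the noise while the ℓ1 objective promotes sparsity.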
Theorem B. Fix an input signal, and let m ≤ (1/3) µ⁻¹. Suppose that

❧ copt solves the sparse approximation problem with tolerance ε;
❧ copt contains no more than m nonzero components; and
❧ b⋆ solves the convex relaxation with tolerance δ = ε √(1 + 6m).

Then it follows that

❧ copt(ω) = 0 implies b⋆(ω) = 0;
❧ ‖b⋆ − copt‖₂ ≤ δ; and
❧ the relaxation has a unique solution.

[Donoho et al. 2004] contains related results.
Just Relax: Convex Programming Methods for Subset Selection and Sparse Approximation
Available from <http://www.ices.utexas.edu/~jtropp/>

Other Work . . .

❧ Greedy and iterative algorithms for sparse approximation
❧ Other types of sparse approximation
❧ Construction of packings in Grassmannian manifolds
❧ Matrix nearness and inverse eigenvalue problems