
SLIDE 1

Optimality and Support Projection Algorithm for Sparsity Constrained Minimization

Lili Pan†‡, Naihua Xiu†, Shenglong Zhou†

† Department of Applied Mathematics, Beijing Jiaotong University
‡ School of Science, Shandong University of Technology

Sept 2014

SLIDE 2

Outline

1. Introduction
2. Optimality Conditions (I)
3. Optimality Conditions (II)
4. Gradient Support Projection Algorithms
5. Numerical Experiments
6. Summary

L Pan, S Zhou, N Xiu Optimality and Support Projection Algorithm for Sparsity Constrained Minimization May 2014 2 / 38

SLIDE 3

Introduction

In this talk, we mainly consider nonlinear minimization with sparsity and nonnegativity constraints. By analyzing the tangent cone and normal cone of the sparsity constraint, we derive the first-order necessary optimality conditions ($\alpha$-stability, $T$-stability and $N$-stability) and the second-order necessary and sufficient optimality conditions for the nonlinear problem. Adopting an Armijo-type stepsize rule, we present a gradient support projection algorithmic framework for the problem and establish its full convergence and computational complexity under mild conditions. Numerical experiments demonstrate the excellent performance of the new algorithm on least squares problems without and with noise.

SLIDE 4

Introduction

Model Representation: Sparsity and Nonnegativity Constrained Nonlinear Optimization
\[
\min f(x) \quad \text{s.t.} \quad \|x\|_0 \le s,\ x \ge 0, \tag{1}
\]
where $f : \mathbb{R}^N \to \mathbb{R}$ is a continuously differentiable (or twice continuously differentiable) function and $\|x\|_0$ is the $\ell_0$-norm of $x$, i.e., the number of its nonzero entries. A special case of problem (1) is
\[
\min \|Ax - b\|^2 \quad \text{s.t.} \quad \|x\|_0 \le s,\ x \ge 0, \tag{2}
\]
where $A \in \mathbb{R}^{M \times N}$, $b \in \mathbb{R}^M$, $s < M < N$, and $\|\cdot\|$ is the $\ell_2$-norm.
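As a quick illustration of model (2), a toy instance can be generated as follows. This sketch is ours, not part of the talk; the dimensions, seed, and function name are illustrative choices:

```python
import numpy as np

def make_instance(M=32, N=128, s=4, seed=0):
    """Generate a random instance of the sparse nonnegative least squares
    model (2): b = A @ x_true with ||x_true||_0 = s and x_true >= 0."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((M, N))
    x_true = np.zeros(N)
    support = rng.choice(N, size=s, replace=False)
    x_true[support] = rng.uniform(0.5, 2.0, size=s)  # positive entries
    b = A @ x_true
    return A, b, x_true
```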

SLIDE 5

Model (I)

Introduction

We study the first- and second-order optimality conditions of the model
\[
\min f(x) \quad \text{s.t.} \quad \|x\|_0 \le s. \tag{3}
\]
Let $S := \{\, x \in \mathbb{R}^N \mid \|x\|_0 \le s \,\}$.

Support Projection:
\[
P_S(x) = \{\, y \in \mathbb{R}^N \mid y_i = x_i,\ i \in I_s(x);\ y_i = 0,\ i \notin I_s(x) \,\},
\]
where $I_s(x) := \{j_1, j_2, \dots, j_s\} \subseteq \{1, 2, \dots, N\}$ is a set of $s$ indices of $x$ with
\[
\min_{i \in I_s(x)} |x_i| \ \ge\ \max_{i \notin I_s(x)} |x_i|.
\]
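The support projection amounts to hard thresholding: keep the $s$ entries of largest magnitude and zero out the rest. Since $P_S$ is set-valued under ties in $|x_i|$, the NumPy sketch below (our own code, with an arbitrary tie-break) returns one element of $P_S(x)$:

```python
import numpy as np

def support_projection(x, s):
    """Return one element of P_S(x): keep the s largest-magnitude
    entries of x and set the rest to zero."""
    y = np.zeros_like(x)
    if s <= 0:
        return y
    idx = np.argsort(np.abs(x))[-s:]  # indices of the s largest |x_i|
    y[idx] = x[idx]
    return y
```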

SLIDE 6

Tangent Cone and Normal Cone

Optimality Conditions (I)

Definition of Bouligand Tangent Cone

For any nonempty set $\Omega \subseteq \mathbb{R}^N$, the Bouligand tangent cone $T^B_\Omega(x)$ and the corresponding normal cone $N^B_\Omega(x)$ at a point $x \in \Omega$ are defined as
\[
T^B_\Omega(x) := \Big\{\, d \in \mathbb{R}^N \ \Big|\ \exists\, \{x^k\} \subset \Omega,\ \lim_{k\to\infty} x^k = x,\ \lambda_k \ge 0,\ k = 1, 2, \dots,\ \text{such that } \lim_{k\to\infty} \lambda_k (x^k - x) = d \,\Big\},
\]
\[
N^B_\Omega(x) := \{\, d \in \mathbb{R}^N \mid \langle d, z \rangle \le 0,\ \forall\, z \in T^B_\Omega(x) \,\}.
\]

SLIDE 7

Tangent Cone and Normal Cone

Optimality Conditions (I)

Definition of Clarke Tangent Cone

The Clarke tangent cone $T^C_\Omega(x)$ and the corresponding normal cone $N^C_\Omega(x)$ at a point $x \in \Omega$ are defined as
\[
T^C_\Omega(x) := \Big\{\, d \in \mathbb{R}^N \ \Big|\ \forall\, \{x^k\} \subset \Omega,\ \forall\, \{\lambda_k\} \subset \mathbb{R}_+ \text{ with } \lim_{k\to\infty} x^k = x,\ \lim_{k\to\infty} \lambda_k = 0,\ \exists\, \{y^k\} \text{ such that } \lim_{k\to\infty} y^k = d \text{ and } x^k + \lambda_k y^k \in \Omega,\ k \in \mathbb{N} \,\Big\},
\]
\[
N^C_\Omega(x) := \{\, d \in \mathbb{R}^N \mid \langle d, z \rangle \le 0,\ \forall\, z \in T^C_\Omega(x) \,\}.
\]

SLIDE 8

Tangent Cone and Normal Cone

Optimality Conditions (I)

Bouligand Tangent Cone of Sparse Set

Theorem

For any $x \in S$, let $\Gamma = \mathrm{supp}(x)$. The Bouligand tangent cone and corresponding normal cone of $S$ at $x$ are
\[
T^B_S(x) = \bigcup_{\Gamma \subseteq \Upsilon,\ |\Upsilon| \le s} \mathrm{span}\{\, e_i,\ i \in \Upsilon \,\}, \tag{4}
\]
\[
N^B_S(x) = \begin{cases} \mathrm{span}\{\, e_i,\ i \notin \Gamma \,\}, & \text{if } |\Gamma| = s, \\ \{0\}, & \text{if } |\Gamma| < s, \end{cases} \tag{5}
\]
where $e_i \in \mathbb{R}^N$ is the vector whose $i$th component is one and all others are zero, $\mathrm{span}\{e_i, i \in \Gamma\}$ denotes the subspace of $\mathbb{R}^N$ spanned by $\{e_i, i \in \Gamma\}$, and $\mathrm{supp}(x) = \{\, i \in \{1, \dots, N\} \mid x_i \ne 0 \,\}$.

SLIDE 9

Tangent Cone and Normal Cone

Optimality Conditions (I)

Clarke Tangent Cone of Sparse Set

Theorem

For any $x \in S$, let $\Gamma = \mathrm{supp}(x)$. The Clarke tangent cone and corresponding normal cone of $S$ at $x$ are
\[
T^C_S(x) = \{\, d \in \mathbb{R}^N \mid \mathrm{supp}(d) \subseteq \Gamma \,\} = \mathrm{span}\{\, e_i,\ i \in \Gamma \,\}, \tag{6}
\]
\[
N^C_S(x) = \mathrm{span}\{\, e_i,\ i \notin \Gamma \,\}. \tag{7}
\]

SLIDE 10

α-Stability, N-Stability and T-Stability

Optimality Conditions (I)

Definition

For a real number $\alpha > 0$, a vector $x^* \in S$ is called an $\alpha$-stationary point, an $N^\sharp$-stationary point, or a $T^\sharp$-stationary point of (3) if it satisfies, respectively,
\[
\alpha\text{-stationary point:} \quad x^* \in P_S\big(x^* - \alpha \nabla f(x^*)\big), \tag{8}
\]
\[
N^\sharp\text{-stationary point:} \quad 0 \in \nabla f(x^*) + N^\sharp_S(x^*), \tag{9}
\]
\[
T^\sharp\text{-stationary point:} \quad 0 = \nabla^\sharp_S f(x^*), \tag{10}
\]
where $\nabla^\sharp_S f(x^*) = \arg\min\{\, \|x + \nabla f(x^*)\| \mid x \in T^\sharp_S(x^*) \,\}$ and $\sharp \in \{B, C\}$ stands for the sense of the Bouligand or Clarke tangent cone.
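Relation (8) can be checked numerically. The sketch below is our own illustration: it forms one element of $P_S(x^* - \alpha\nabla f(x^*))$ by hard thresholding and compares it with $x^*$; because $P_S$ is set-valued under ties, a negative answer for one selection is not conclusive in tie cases:

```python
import numpy as np

def is_alpha_stationary(x, grad, alpha, s, tol=1e-10):
    """Check relation (8): x in P_S(x - alpha * grad), with P_S the
    hard-thresholding projection onto S = {x : ||x||_0 <= s}.
    P_S is set-valued under ties; this tests one selection of it."""
    z = x - alpha * grad
    y = np.zeros_like(z)
    idx = np.argsort(np.abs(z))[-s:]  # keep the s largest-magnitude entries
    y[idx] = z[idx]
    return bool(np.linalg.norm(x - y) <= tol)
```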

SLIDE 11

α-Stability, N-Stability and T-Stability

Optimality Conditions (I)

Relationship of the Three Kinds of Stability

Theorem

Under the Bouligand tangent cone, consider model (3) and $\alpha > 0$. If $x^* \in S$ satisfies $\|x^*\|_0 = s$, then
\[
\alpha\text{-stationary point} \Longrightarrow N^B\text{-stationary point} \Longleftrightarrow T^B\text{-stationary point};
\]
if $x^* \in S$ satisfies $\|x^*\|_0 < s$, then
\[
\alpha\text{-stationary point} \Longleftrightarrow N^B\text{-stationary point} \Longleftrightarrow T^B\text{-stationary point} \Longleftrightarrow \nabla f(x^*) = 0.
\]

SLIDE 12

α-Stability, N-Stability and T-Stability

Optimality Conditions (I)

Relationship of the Three Kinds of Stability

| Stationarity | Defining relation | $\|x^*\|_0 = s$ | $\|x^*\|_0 < s$ |
|---|---|---|---|
| $\alpha$-stationary point | $x^* \in P_S(x^* - \alpha\nabla f(x^*))$ | $(\nabla f(x^*))_i = 0$ for $i \in \Gamma$; $\|(\nabla f(x^*))_i\| \le \frac{1}{\alpha} M_s(\|x^*\|)$ for $i \notin \Gamma$ | $\nabla f(x^*) = 0$ |
| $N^B$-stationary point | $-\nabla f(x^*) \in N^B_S(x^*)$ | $(\nabla f(x^*))_i = 0$ for $i \in \Gamma$; $(\nabla f(x^*))_i \in \mathbb{R}$ for $i \notin \Gamma$ | $\nabla f(x^*) = 0$ |
| $T^B$-stationary point | $\nabla^B_S f(x^*) = 0$ | $(\nabla f(x^*))_i = 0$ for $i \in \Gamma$; $(\nabla f(x^*))_i \in \mathbb{R}$ for $i \notin \Gamma$ | $\nabla f(x^*) = 0$ |

Here $\Gamma = \mathrm{supp}(x^*)$ and $M_s(|x^*|)$ denotes the $s$th largest entry of $|x^*|$.

SLIDE 13

α-Stability, N-Stability and T-Stability

Optimality Conditions (I)

Relationship of the Three Kinds of Stability

Theorem

Under the Clarke tangent cone, consider problem (3). For $\alpha > 0$, if $x^* \in S$, then
\[
\alpha\text{-stationary point} \Longrightarrow N^C\text{-stationary point} \Longleftrightarrow T^C\text{-stationary point}.
\]

SLIDE 14

α-Stability, N-Stability and T-Stability

Optimality Conditions (I)

Theorem

Let $f$ satisfy Assumption 1. If $x^* \in S$ is an optimal solution of (3), then for $0 < \alpha < \frac{1}{L_f}$, $x^*$ is also an $\alpha$-stationary point. Conversely, suppose in addition that $f$ is convex; if $\|x^*\|_0 < s$ and $x^*$ is an $\alpha$-stationary point of (3), then $x^*$ is an optimal solution of (3).

SLIDE 15

Second Order Optimality Conditions

Optimality Conditions (I)

Theorem (Second-Order Necessary Optimality)

If $x^* \in S$ is an optimal solution of (3), then for $0 < \alpha < \frac{1}{L_f}$ we have
\[
d^\top \nabla^2 f(x^*)\, d \ge 0, \quad \forall\, d \in T^C_S(x^*), \tag{11}
\]
where $\nabla^2 f(x^*)$ is the Hessian matrix of $f$ at $x^*$.

SLIDE 16

Second Order Optimality Conditions

Optimality Conditions (I)

Theorem (Second-Order Sufficient Optimality)

If $x^* \in S$ is an $\alpha$-stationary point of (3) and satisfies
\[
d^\top \nabla^2 f(x^*)\, d > 0, \quad \forall\, d \in T^C_S(x^*), \tag{12}
\]
then $x^*$ is a strict local minimizer of (3). Moreover, there exist $\eta > 0$ and $\delta > 0$ such that for any $x \in B(x^*, \delta) \cap S$,
\[
f(x) \ge f(x^*) + \eta \|x - x^*\|^2. \tag{13}
\]

SLIDE 17

Optimality Conditions (II)

Support projection and tangent cones for (1):
\[
P_{S \cap \mathbb{R}^N_+}(x) = P_S\big(P_{\mathbb{R}^N_+}(x)\big).
\]

Theorem

For $x \in S \cap \mathbb{R}^N_+$, denote $\mathbb{R}^N_+(x) := \{\, x \in \mathbb{R}^N \mid x_i \ge 0,\ i \notin \Gamma \,\}$. Then
\[
T^B_{S \cap \mathbb{R}^N_+}(x) = T^B_S(x) \cap \mathbb{R}^N_+(x), \qquad
N^B_{S \cap \mathbb{R}^N_+}(x) = T^B_S(x) \cap \big(-\mathbb{R}^N_+(x)\big),
\]
\[
T^C_{S \cap \mathbb{R}^N_+}(x) = T^C_S(x), \qquad
N^C_{S \cap \mathbb{R}^N_+}(x) = N^C_S(x).
\]
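The identity $P_{S \cap \mathbb{R}^N_+}(x) = P_S(P_{\mathbb{R}^N_+}(x))$ gives a two-stage projection: clip to the nonnegative orthant, then keep the $s$ largest entries. A minimal sketch, with our own code and naming:

```python
import numpy as np

def nonneg_support_projection(x, s):
    """Projection onto S ∩ R^N_+: first clip to the nonnegative
    orthant (P_{R^N_+}), then keep the s largest entries (P_S)."""
    z = np.maximum(x, 0.0)     # P_{R^N_+}
    y = np.zeros_like(z)
    idx = np.argsort(z)[-s:]   # s largest entries (all >= 0 after clipping)
    y[idx] = z[idx]
    return y
```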

SLIDE 18

Optimality Conditions (II)

The $\alpha$-stationary point of (1) is defined by
\[
x^* \in P_{S \cap \mathbb{R}^N_+}\big(x^* - \alpha \nabla f(x^*)\big). \tag{14}
\]

Theorem

For any $\alpha > 0$, $x^* \in S \cap \mathbb{R}^N_+$ is an $\alpha$-stationary point of (1) if and only if
\[
\nabla_i f(x^*) \begin{cases} = 0, & \text{if } i \in \mathrm{supp}(x^*), \\ \in \big[-\frac{1}{\alpha} M_s(x^*),\ +\infty\big), & \text{if } i \notin \mathrm{supp}(x^*). \end{cases} \tag{15}
\]

SLIDE 19

Optimality Conditions (II)

Relationship of the Three Kinds of Stability for model (1)

Theorem

Consider model (1) and any $\alpha > 0$.
A) Under the Bouligand tangent cone, if $\|x^*\|_0 = s$ and $x^* \ge 0$, then
\[
\alpha\text{-stationary point} \Longrightarrow N^B\text{-stationary point} \Longleftrightarrow T^B\text{-stationary point}.
\]
B) Under the Clarke tangent cone, if $\|x^*\|_0 \le s$ and $x^* \ge 0$, then
\[
\alpha\text{-stationary point} \Longrightarrow N^C\text{-stationary point} \Longleftrightarrow T^C\text{-stationary point}.
\]

SLIDE 20

Optimality Conditions (II)

Assumption 1. The gradient of the objective function $f$ is Lipschitz continuous with constant $L_f$ over $\mathbb{R}^N$:
\[
\|\nabla f(x) - \nabla f(y)\| \le L_f \|x - y\|, \quad \forall\, x, y \in \mathbb{R}^N. \tag{16}
\]

SLIDE 21

Optimality Conditions (II)

α-stationary point of (1)

Theorem (Second-Order Optimality for model (1))

If $x^* \in S \cap \mathbb{R}^N_+$ is an optimal solution of (1), then for $0 < \alpha < \frac{1}{L_f}$, $x^*$ is also an $\alpha$-stationary point of (1), and moreover
\[
d^\top \nabla^2 f(x^*)\, d \ge 0, \quad \forall\, d \in T^C_S(x^*). \tag{17}
\]
Conversely, if $x^* \in S \cap \mathbb{R}^N_+$ is an $\alpha$-stationary point of (1) and
\[
d^\top \nabla^2 f(x^*)\, d > 0, \quad \forall\, d \in T^C_S(x^*), \tag{18}
\]
then $x^*$ is a strict local minimizer of (1). Moreover, there exist $\gamma > 0$ and $\delta > 0$ such that for any $x \in B(x^*, \delta) \cap S \cap \mathbb{R}^N_+$,
\[
f(x) \ge f(x^*) + \gamma \|x - x^*\|^2. \tag{19}
\]

SLIDE 22

Gradient Support Projection Algorithms

Gradient Support Projection Algorithm for (1)

Step 0. Initialize $x^0 = 0$, $\Gamma_0 = \mathrm{supp}\big(P_{S \cap \mathbb{R}^N_+}(\nabla f(x^0))\big)$, $0 < \alpha_0 < \frac{1}{L_f}$, $0 < \sigma \le \frac{1}{4 L_f}$, $0 < \beta < 1$, $\epsilon > 0$. Set $k \Leftarrow 0$.

Step 1. Compute $\tilde{x}^{k+1} = P_{S \cap \mathbb{R}^N_+}\big(x^k - \alpha_0 \nabla f(x^k)\big)$.

Step 2. If $\mathrm{supp}(\tilde{x}^{k+1}) = \Gamma_k$, set $x^{k+1} = \tilde{x}^{k+1}$ and $\Gamma_{k+1} = \mathrm{supp}(x^{k+1})$. Otherwise set $x^{k+1} = P_{S \cap \mathbb{R}^N_+}\big(x^k - \alpha_k \nabla f(x^k)\big)$ and $\Gamma_{k+1} = \mathrm{supp}(x^{k+1})$, where $\alpha_k = \alpha_0 \beta^{m_k}$ and $m_k$ is the smallest positive integer $m$ such that
\[
f\big(x^k(\alpha_0 \beta^m)\big) \le f(x^k) - \frac{\sigma}{2}\, \frac{\|x^k(\alpha_0 \beta^m) - x^k\|^2}{(\alpha_0 \beta^m)^2},
\]
with $x^k(\alpha) = P_{S \cap \mathbb{R}^N_+}\big(x^k - \alpha \nabla f(x^k)\big)$.

Step 3. If $\|x^{k+1} - x^k\| \le \epsilon$, stop; otherwise set $k \Leftarrow k + 1$ and go to Step 1.
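Steps 0-3 above can be sketched in a few lines of NumPy. The code below is our own illustration, with illustrative default parameters (the theory requires $\alpha_0 < 1/L_f$ and $\sigma \le 1/(4L_f)$), a crude stepsize-underflow guard, and an iteration cap that are not part of the slide:

```python
import numpy as np

def proj_sn(x, s):
    """P_{S ∩ R^N_+}: clip to the nonnegative orthant, keep the s largest entries."""
    z = np.maximum(x, 0.0)
    y = np.zeros_like(z)
    idx = np.argsort(z)[-s:]
    y[idx] = z[idx]
    return y

def gspa(f, grad, n, s, alpha0=0.5, sigma=0.1, beta=0.5, eps=1e-8, max_iter=500):
    """Sketch of the GSPA framework (Steps 0-3) for problem (1)."""
    x = np.zeros(n)                                      # Step 0
    support = np.flatnonzero(proj_sn(grad(x), s))
    for _ in range(max_iter):
        g = grad(x)
        x_try = proj_sn(x - alpha0 * g, s)               # Step 1
        if set(np.flatnonzero(x_try)) == set(support):   # Step 2: support unchanged
            x_new = x_try
        else:                                            # Armijo-type backtracking
            alpha = alpha0
            while True:
                alpha *= beta
                x_new = proj_sn(x - alpha * g, s)
                decrease = 0.5 * sigma * np.sum((x_new - x) ** 2) / alpha ** 2
                if f(x_new) <= f(x) - decrease or alpha < 1e-16:
                    break
        support = np.flatnonzero(x_new)
        if np.linalg.norm(x_new - x) <= eps:             # Step 3
            return x_new
        x = x_new
    return x
```

On a separable test objective $f(x) = \frac{1}{2}\|x - c\|^2$ with sparse nonnegative $c$, the iterates contract toward $c$ on the correct support.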

SLIDE 23

Gradient Support Projection Algorithms

Gradient Support Projection Algorithm for (1)

Lemma

Let Assumption 1 hold and let $x^k$ be the iterate in Step 2 of GSPA. Then
\[
f\big(x^k(\alpha)\big) \le
\begin{cases}
f(x^k) - \frac{1}{2}\big(\frac{1}{\alpha} - L_f\big)\, \|x^k(\alpha) - x^k\|^2, & \alpha \in \big(0, \frac{1}{L_f}\big), \\[6pt]
f(x^k) - \frac{\sigma}{2}\, \frac{\|x^k(\alpha) - x^k\|^2}{\alpha^2}, & \alpha \in \Big[\frac{1 - \sqrt{1 - 4\sigma L_f}}{2 L_f},\ \frac{1 + \sqrt{1 - 4\sigma L_f}}{2 L_f}\Big].
\end{cases}
\]
SLIDE 24

Gradient Support Projection Algorithms

Gradient Support Projection Algorithm for (1)

Theorem

Let Assumption 1 hold and let the sequence $\{x^k\}$ be generated by GSPA. Then:
(i) $\lim_{k\to\infty} \frac{\|x^{k+1} - x^k\|}{\alpha_k} = 0$;
(ii) any accumulation point of $\{x^k\}$ is an $\alpha$-stationary point of (3);
(iii) $\lim_{k\to\infty} \nabla^C_{S \cap \mathbb{R}^N_+} f(x^k) = 0$.

SLIDE 25

Gradient Support Projection Algorithms

Gradient Supp-Projection Algorithm for (2)

Let $r(x) = \frac{1}{2}\|Ax - b\|^2$; we consider problem (2).

Step 0. Initialize $x^0 = 0$, $\Gamma_0 = \mathrm{supp}\big(P_{S \cap \mathbb{R}^N_+}(A^\top b)\big)$, $0 < \sigma \le \frac{1}{4 L_r}$, $0 < \beta < 1$, $\epsilon > 0$. Set $k \Leftarrow 0$.

Step 1. Compute $\tilde{x}^{k+1} = P_{S \cap \mathbb{R}^N_+}\big(x^k - \alpha^k_0 \nabla r(x^k)\big)$ with
\[
\alpha^k_0 = \frac{\|A^\top_{\Gamma_k}(b - A x^k)\|^2}{\|A_{\Gamma_k} A^\top_{\Gamma_k}(b - A x^k)\|^2}.
\]

Step 2. If $\mathrm{supp}(\tilde{x}^{k+1}) = \Gamma_k$, set $x^{k+1} = \tilde{x}^{k+1}$ and $\Gamma_{k+1} = \mathrm{supp}(x^{k+1})$. Otherwise set $x^{k+1} = P_{S \cap \mathbb{R}^N_+}\big(x^k - \alpha_k \nabla r(x^k)\big)$ and $\Gamma_{k+1} = \mathrm{supp}(x^{k+1})$, where $\alpha_k = \alpha^k_0 \beta^{m_k}$ and $m_k$ is the smallest positive integer $m$ such that
\[
r\big(x^k(\alpha^k_0 \beta^m)\big) \le r(x^k) - \frac{\sigma}{2}\, \frac{\|x^k(\alpha^k_0 \beta^m) - x^k\|^2}{(\alpha^k_0 \beta^m)^2},
\]
with $x^k(\alpha) = P_{S \cap \mathbb{R}^N_+}\big(x^k - \alpha \nabla r(x^k)\big)$.

Step 3. If $\|x^{k+1} - x^k\| \le \epsilon$, stop; otherwise set $k \Leftarrow k + 1$ and go to Step 1.
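The support-restricted stepsize $\alpha^k_0$ in Step 1 can be computed directly from the residual; a sketch under our own naming, where `A[:, support]` plays the role of $A_{\Gamma_k}$:

```python
import numpy as np

def exact_stepsize(A, b, x, support):
    """Step 1 stepsize for r(x) = 0.5*||Ax - b||^2 restricted to Γ_k:
    alpha_0^k = ||A_Γ^T (b - Ax)||^2 / ||A_Γ A_Γ^T (b - Ax)||^2."""
    res = b - A @ x
    g_sup = A[:, support].T @ res   # A_Γ^T (b - A x^k)
    denom = A[:, support] @ g_sup   # A_Γ A_Γ^T (b - A x^k)
    return float(g_sup @ g_sup) / float(denom @ denom)
```

For columns that are orthonormal on the current support (e.g. the identity matrix), the stepsize is exactly 1.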

SLIDE 26

Gradient Support Projection Algorithms

Gradient Supp-Projection Algorithm for (2)

Assumption 2. The matrix $A$ is $s$-regular, i.e., any $s$ of its columns are linearly independent; equivalently, $d^\top A^\top A d > 0$ for every $d \ne 0$ with $\|d\|_0 \le s$.
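For small instances, $s$-regularity can be verified by brute force over all column subsets of size $s$. This exhaustive check (our own sketch) is combinatorial in $N$ and $s$ and only practical for tiny problems:

```python
import itertools
import numpy as np

def is_s_regular(A, s):
    """Check Assumption 2 by brute force: every set of s columns of A
    must be linearly independent."""
    _, N = A.shape
    for cols in itertools.combinations(range(N), s):
        if np.linalg.matrix_rank(A[:, cols]) < s:
            return False
    return True
```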

SLIDE 27

Gradient Support Projection Algorithms

Gradient Supp-Projection Algorithm for (2)

Theorem

Let the sequence $\{x^k\}$ be generated by GSPA. If $A$ is $s$-regular, then $\{x^k\}$ converges to a local minimizer of (2).

SLIDE 28

Gradient Support Projection Algorithms

Gradient Supp-Projection Algorithm for (2)

Theorem

If Assumption 2 holds for the matrix $A$, then local solutions of problem (2) exist and there are finitely many of them. Moreover, if $A$ and $b$ satisfy
\[
\Pi_{\Gamma_i} b \ne \Pi_{\Gamma_j} b \quad \text{whenever } \Gamma_i \ne \Gamma_j,\ |\Gamma_i| \le s,\ |\Gamma_j| \le s, \tag{20}
\]
where $\Pi_{\Gamma_i} b = b^\top A_{\Gamma_i} (A^\top_{\Gamma_i} A_{\Gamma_i})^{-1} A^\top_{\Gamma_i} b$, then problem (2) has a unique solution.

SLIDE 29

Numerical Experiments

Greedy methods:

- MP: Matching Pursuit [MZ]
- OMP: Orthogonal Matching Pursuit [DM]
- CoSaMP: Compressive Sampling Matching Pursuit [NT]
- SP: Subspace Pursuit [DM]
- NIHT: Normalized Iterative Hard Thresholding [B]
- ...

[MZ] S. Mallat and Z. Zhang, Matching pursuits with time-frequency dictionaries, IEEE Trans. Signal Process., 41, pp. 3397-3415, 1993.
[NT] D. Needell and J.A. Tropp, CoSaMP: Iterative signal recovery from incomplete and inaccurate samples, Appl. Comput. Harmon. Anal., 26, pp. 301-321, 2009.
[DM] W. Dai and O. Milenkovic, Subspace pursuit for compressive sensing signal reconstruction, IEEE Trans. Inform. Theory, 55, pp. 2230-2249, 2009.
[B] T. Blumensath, Normalized iterative hard thresholding: Guaranteed stability and performance, IEEE J. Sel. Topics Signal Process., 4(2), pp. 298-309, 2010.

SLIDE 30

Numerical Experiments

Exact recovery: GSPA and NIHT for (2) with sparsity and nonnegativity constraints.

Figure: Average results yielded by Non_NIHT and Non_GSPA: average error $\|Ax - b\|_2$, average error $\|x - x_{\mathrm{orig}}\|_\infty$, and average CPU time versus $N$ (2000 to 10000), each for $M = N/2$ and $M = N/4$. [plots omitted]

SLIDE 31

Numerical Experiments

Exact recovery: GSPA, NIHT, CoSaMP (abbreviated CSMP), and SP for (2) with sparsity constraint.

Figure: Average prediction error $\|Ax - b\|_2$ for each iteration with $k = 5\%N$, for $N \in \{1000, 5000, 7000, 10000\}$ and $M = N/4$. [plots omitted]

SLIDE 32

Numerical Experiments

Exact recovery: GSPA, NIHT, CoSaMP and SP for (2) with sparsity constraint.

Table: Average CPU time (s) over 40 simulations with $k = 5\%N$.

| N | M | GSPA | NIHT | CSMP | SP |
|---|---|---|---|---|---|
| 1000 | N/4 | 0.0689 | 0.2583 | 0.1492 | 0.0961 |
| 1000 | N/2 | 0.0677 | 0.2459 | 0.1687 | 0.1307 |
| 3000 | N/4 | 0.5385 | 3.3210 | 1.9171 | 1.1197 |
| 3000 | N/2 | 0.5756 | 2.6228 | 1.8754 | 1.3627 |
| 5000 | N/4 | 1.5583 | 11.246 | 8.0507 | 4.5900 |
| 5000 | N/2 | 1.5114 | 8.0690 | 7.7457 | 5.0981 |
| 7000 | N/4 | 3.0050 | 20.761 | 19.698 | 10.729 |
| 7000 | N/2 | 2.9543 | 16.389 | 19.336 | 12.613 |
| 10000 | N/4 | 6.3880 | 52.257 | 51.680 | 27.864 |
| 10000 | N/2 | 5.9462 | 38.256 | 53.707 | 30.924 |

SLIDE 33

Numerical Experiments

Recovery with noise: GSPA and NIHT for (2) with sparsity constraint.

Figure: Average error $\|Ax - b\|_2$ for each iteration with $k = 5\%N$ over 40 simulations with noise ($N = 1000$ and $N = 5000$; each method with $M = N/4$ and $M = N/2$). [plots omitted]

SLIDE 34

Numerical Experiments

Recovery with noise: GSPA and CoSaMP for (2) with sparsity constraint.

Figure: Average error $\|Ax - b\|_2$ for each iteration with $k = 5\%N$ over 40 simulations with noise ($N = 500$ and $N = 1000$; each method with $M = N/4$ and $M = N/2$). [plots omitted]

SLIDE 35

Numerical Experiments

Recovery with noise: GSPA and SP for (2) with sparsity constraint.

Figure: Average error $\|Ax - b\|_2$ for each iteration with $k = 5\%N$ over 40 simulations with noise ($N = 1000$ and $N = 5000$; each method with $M = N/4$ and $M = N/2$). [plots omitted]

SLIDE 36

Numerical Experiments

Recovery with noise: GSPA, NIHT, CoSaMP and SP for (2) with sparsity constraint.

Table: Average CPU time (s) over 40 simulations with $M = N/4$, $s = 5\%N$, and noise.

| N | GSPA | NIHT | CSMP | SP |
|---|---|---|---|---|
| 1000 | 0.0812 | 0.3226 | 116.87 | 0.1859 |
| 3000 | 0.5797 | 3.9317 | 1416.1 | 1.1631 |
| 5000 | 1.6221 | 9.6857 | -- | 4.9076 |
| 7000 | 3.2252 | 25.306 | -- | 11.556 |
| 10000 | 6.6369 | 38.440 | -- | 28.429 |

SLIDE 37

Summary

Contributions: We have established the first- and second-order optimality conditions for problems (1) and (3), proposed a gradient support projection algorithm for (3), and shown that the new algorithm enjoys full convergence and exceptional numerical performance.

Future work: We will consider conjugate gradient or quasi-Newton directions in place of the negative gradient direction to improve the convergence speed, and we will extend the algorithm to optimization problems with sparsity combined with other complex constraints.

SLIDE 38

Summary
