SLIDE 1

Support Recovery for Orthogonal Matching Pursuit: Upper and Lower bounds

Raghav Somani¹, Chirag Gupta², Prateek Jain¹ & Praneeth Netrapalli¹
¹Microsoft Research Lab - India
²Machine Learning Department, Carnegie Mellon University

December 6, 2018


SLIDE 2

Sparse Linear Regression (SLR)


\hat{x} \;=\; \arg\min_{\|x\|_0 \,\le\, s^*} \; \|Ax - y\|_2^2

Unconditionally, this problem is NP-hard; it becomes tractable under the assumption of Restricted Strong Convexity (RSC). The fundamental quantity capturing its hardness:

Standard optimization: condition number κ = smoothness / strong convexity
Sparse optimization: restricted condition number κ̃ = restricted smoothness / restricted strong convexity
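For concreteness, one standard way to instantiate these restricted notions for the quadratic loss is through restricted eigenvalue bounds on A; the normalization below is an assumed convention rather than something stated on these slides.

```latex
% Restricted strong convexity / restricted smoothness at sparsity level s,
% for f(x) = \|Ax - y\|_2^2 (the 1/n normalization is an assumed convention):
\rho_s^- \,\|v\|_2^2 \;\le\; \tfrac{1}{n}\,\|Av\|_2^2 \;\le\; \rho_s^+ \,\|v\|_2^2
\qquad \text{for all } v \text{ with } \|v\|_0 \le s,
% and the restricted condition number is the ratio of the two:
\tilde{\kappa} \;=\; \rho_s^+ / \rho_s^- .
```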



SLIDE 3

Sparse Linear Regression (SLR)

Setup and Goals

We work under the model y = Ax̄ + η, where y is the vector of observations, A is the measurement matrix, x̄ is an s∗-sparse vector, and η is noise.

Goals of SLR:

1. Bounding the generalization error / excess risk: G(x) := (1/n) ‖A(x − x̄)‖₂².
2. Support recovery: recovering the support of x̄.

We study SLR under the RSC assumption for OMP.
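As a concrete illustration of this setup, here is a minimal synthetic instance together with the excess-risk computation; the dimensions, noise level, and variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Synthetic SLR instance: y = A x_bar + eta, with x_bar an s*-sparse vector.
rng = np.random.default_rng(0)
n, d, s_star = 200, 1000, 10             # samples, dimension, sparsity (illustrative)

A = rng.standard_normal((n, d))           # measurement matrix
x_bar = np.zeros(d)                       # s*-sparse ground-truth vector
true_support = rng.choice(d, size=s_star, replace=False)
x_bar[true_support] = rng.standard_normal(s_star)
eta = 0.1 * rng.standard_normal(n)        # noise
y = A @ x_bar + eta                       # observations

def excess_risk(x):
    """Generalization error / excess risk G(x) := (1/n) * ||A (x - x_bar)||_2^2."""
    return np.linalg.norm(A @ (x - x_bar)) ** 2 / n
```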



SLIDE 4

Orthogonal Matching Pursuit


- Incremental greedy algorithm
- Popular and easy to implement
- Widely studied in the literature

[Figure: the support at iteration k grows at iteration (k+1) by one coordinate, selected greedily.]
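A minimal sketch of this greedy procedure, written as a textbook OMP implementation rather than the authors' code (the function and variable names are illustrative):

```python
import numpy as np

def omp(A, y, num_iters):
    """Orthogonal Matching Pursuit.

    At iteration k+1, greedily add the column of A most correlated with the
    current residual to the support, then refit x by least squares restricted
    to that support (the orthogonal projection step).
    """
    n, d = A.shape
    support = []
    x = np.zeros(d)
    residual = y.copy()
    for _ in range(num_iters):
        # Greedy selection: column with the largest absolute correlation
        # with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j in support:                  # no new coordinate to add
            break
        support.append(j)
        # Orthogonal projection: least-squares refit on the selected support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(d)
        x[support] = coef
        residual = y - A @ x
    return x, support
```

On the synthetic instance sketched earlier, omp(A, y, s_star) typically recovers the planted support when the noise is small; the results that follow quantify how many iterations OMP needs in general.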


SLIDE 5

Results

Known results and our contribution

Generalization bound        Upper bound             Lower bound
  Known                     ∝ σ²s∗κ̃²/n              ∝ σ²s∗κ̃/n
  Ours                      ∝ σ²s∗κ̃ log κ̃ / n       ∝ σ²s∗κ̃/n

Unconditional lower bounds for OMP. Support recovery guarantees and their lower bounds.

Support expansion: known ∝ s∗κ̃², ours ∝ s∗κ̃ log κ̃.
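To get a feel for the improvement from κ̃² to κ̃ log κ̃, a quick arithmetic check (the value κ̃ = 100 is an assumed example, not a number from the paper):

```python
import math

kappa = 100.0                    # assumed example value of the restricted condition number
print(kappa ** 2)                # old dependence: 10000.0
print(kappa * math.log(kappa))   # new dependence: about 460.5
```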



SLIDE 6

Results

A key idea

f(x) = \|Ax - y\|_2^2

If any part of the true support is still unrecovered, then each OMP iteration decreases f by a large additive amount. Since f(x) ≥ 0, support recovery must happen soon. Recovery with a small support, in turn, implies a small generalization error.
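Schematically, with constants suppressed (a paraphrase of the slide's argument, not the paper's exact statement):

```latex
% If the true support is not yet fully recovered at iteration k, the next greedy
% step decreases f by at least some additive amount Delta > 0:
\operatorname{supp}(\bar{x}) \not\subseteq \operatorname{supp}(x_k)
\;\Longrightarrow\;
f(x_k) - f(x_{k+1}) \;\ge\; \Delta .
% Since f \ge 0, at most f(x_0)/\Delta such steps can occur before the full
% support is recovered, and recovery with a small support bounds G(x).
```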



SLIDE 7


Thank You!
