

SLIDE 1

Sparse Recovery and Fourier Sampling

Eric Price (MIT)

SLIDE 2

The Fourier Transform

Conversion between time and frequency domains.

[Figure: the displacement of air over time for a concert A, and its frequency-domain spike, connected by the Fourier transform.]

SLIDE 3

The Fourier Transform is Ubiquitous

Audio, video, medical imaging, radar, GPS, oil exploration.

SLIDE 4

Computing the Discrete Fourier Transform

How do we compute x̂ = Fx?

Naive multiplication: O(n²).

Fast Fourier Transform: O(n log n) time. [Cooley-Tukey, 1965]

"[T]he method greatly reduces the tediousness of mechanical calculations." – Carl Friedrich Gauss, 1805

By hand: 22n log n seconds. [Danielson-Lanczos, 1942]

Can we do much better? When can we compute the Fourier transform in sublinear time?

SLIDE 5

Idea: Leverage Sparsity

Often the Fourier transform is dominated by a small number of peaks.

[Figure: a time signal and two spectra, one exactly sparse and one approximately sparse.]

Sparsity is common: audio, video, medical imaging, radar, GPS, oil exploration.

Goal of this work: a sparse Fourier transform, i.e. a faster Fourier transform on sparse data.

SLIDE 6

Talk Outline

1. Sparse Fourier Transform
   Overview / Technical Details

2. Beyond: Sparse Recovery / Compressive Sensing
   Overview / Adaptivity / Conclusion

SLIDE 7

Talk Outline

1. Sparse Fourier Transform
   Overview / Technical Details

2. Beyond: Sparse Recovery / Compressive Sensing
   Overview / Adaptivity / Conclusion

SLIDE 8

My Contributions

Goal: compute the Fourier transform x̂ = Fx when x̂ is k-sparse.

Theory:
◮ The fastest algorithm for Fourier transforms of sparse data.
◮ The only algorithms faster than the FFT for all k = o(n).

Practice:
◮ Implementation is faster than FFTW for a wide range of inputs.
◮ Orders of magnitude faster than previous sparse Fourier transforms.
◮ Useful in multiple applications.

SLIDE 9

Applications of ideas

http://groups.csail.mit.edu/netmit/sFFT/workshop.html

GPS [HAKI]: 2× faster
Spectrum sensing [HSAHK]: 6× lower sampling rate
Dense FFT over clusters [TPKP]: 2× faster
...

SLIDE 10

Talk Outline

1. Sparse Fourier Transform
   Overview / Technical Details

2. Beyond: Sparse Recovery / Compressive Sensing
   Overview / Adaptivity / Conclusion

SLIDE 11

Theoretical Results

For a signal of size n with k large frequencies:

First on the Boolean cube [GL89, KM92, L93]; adapted to the complexes [Mansour '92, GGIMS02, AGS03, GMS05, Iwen '10, Akavia '10].
◮ All take at least k log⁴ n time.
◮ Only better than the FFT if k ≪ n / log³ n.

Our results [HIKP12a, HIKP12b]:
◮ Exactly k-sparse: O(k log n). ⋆ Optimal if the FFT is optimal.
◮ Approximately k-sparse: O(k log(n/k) log n), with the guarantee

    ‖result − x̂‖₂ ≤ (1 + ε) · min over k-sparse x̂(k) of ‖x̂(k) − x̂‖₂.

◮ Better than the FFT for any k = o(n).

SLIDE 12

Discrete Fourier Transform (DFT)

Definition: given x ∈ Cⁿ, compute the Fourier transform x̂:

    x̂_i = (1/n) Σ_j ω^(−ij) x_j,   for ω = e^(τi/n)   (where τ is the circle constant 6.283...)

Equivalently, x̂ = Fx for F_ij = ω^(−ij)/n.

The inverse transform is almost identical:

    x_i = Σ_j ω^(ij) x̂_j
◮ ω → ω^(−1), rescale.

Lots of nice properties:
◮ Convolution ←→ Multiplication

(A small numerical check of this convention follows.)
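For concreteness, here is a minimal numpy sketch of the convention above (an illustration, not the talk's code; note numpy puts the 1/n on the inverse transform instead):

```python
import numpy as np

def dft(x):
    """Slide's convention: xhat_i = (1/n) sum_j omega^(-ij) x_j, omega = e^(tau*1j/n)."""
    n = len(x)
    tau = 2 * np.pi                              # the circle constant
    omega = np.exp(tau * 1j / n)
    i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    F = omega ** (-(i * j)) / n                  # F_ij = omega^(-ij) / n
    return F @ x                                 # naive O(n^2) multiplication

def idft(xhat):                                  # omega -> omega^(-1), rescale
    n = len(xhat)
    omega = np.exp(2j * np.pi / n)
    i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return (omega ** (i * j)) @ xhat

x = np.random.randn(8) + 1j * np.random.randn(8)
assert np.allclose(idft(dft(x)), x)              # the inverse really inverts
assert np.allclose(dft(x), np.fft.fft(x) / 8)    # matches numpy up to the 1/n scale
```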

SLIDE 13

Algorithm

Simpler case: x̂ is exactly k-sparse.

Theorem

We can compute x̂ in O(k log n) expected time.

Still kind of hard. Simplest case: x̂ is exactly 1-sparse.

Lemma

We can compute a 1-sparse x̂ in O(1) time.

SLIDE 14

Algorithm for k = 1

Lemma

We can compute a 1-sparse x̂ in O(1) time.

x̂ is a single spike of height a at position t:

    x̂_i = a if i = t, 0 otherwise.

Then x = (a, aω^t, aω^(2t), aω^(3t), ..., aω^((n−1)t)).

    x₀ = a
    x₁ = aω^t
    x₁/x₀ = ω^t ⇒ t.

(Related to OFDM, Prony's method, matrix pencil. A runnable sketch follows.)
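A runnable sketch of this two-sample recovery (the demo values are hypothetical; exact sparsity and no noise are assumed):

```python
import numpy as np

def recover_1sparse(x0, x1, n):
    """Recover (t, a) of an exactly 1-sparse xhat from two time samples."""
    a = x0                                        # x_0 = a
    phase = np.angle(x1 / x0)                     # x_1/x_0 = omega^t = e^(2*pi*1j*t/n)
    t = int(round(phase * n / (2 * np.pi))) % n   # read t off the phase
    return t, a

n, t_true, a_true = 64, 13, 2.0 - 1.0j            # hypothetical demo values
j = np.arange(n)
x = a_true * np.exp(2j * np.pi * j * t_true / n)  # x = (a, a*omega^t, a*omega^(2t), ...)
t, a = recover_1sparse(x[0], x[1], n)
assert t == t_true and np.isclose(a, a_true)
```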

SLIDE 15

Algorithm for general k

Reduce general k to k = 1. "Filters": partition frequencies into O(k) buckets.
◮ Sample from the time domain of each bucket with O(log n) overhead.
◮ Recover each bucket with the k = 1 algorithm.

Most frequencies are alone in their bucket, thanks to a random permutation of the spectrum.

[Diagram: x → Permute → Filters, O(k) → 1-sparse recovery on each bucket → x̂′]

Recovers most of x̂:

Lemma (Partial sparse recovery)

In O(k log n) expected time, we can compute an estimate x̂′ such that x̂ − x̂′ is k/2-sparse.

(The bucketing identity is illustrated below.)
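The bucketing step can be seen in isolation with the aliasing identity: subsampling in time makes the spectrum fold, so a B-point FFT of only B time samples returns, in O(B log B), the sum of x̂ over each residue class mod B. A numpy sketch (no permutation or filter yet, and B is assumed to divide n):

```python
import numpy as np

rng = np.random.default_rng(0)
n, B = 1024, 16                          # B = O(k) buckets; assumes B divides n
xhat = np.zeros(n, complex)
supp = rng.choice(n, size=4, replace=False)
xhat[supp] = rng.standard_normal(4) + 1j * rng.standard_normal(4)
x = np.fft.ifft(xhat) * n                # time signal with x_m = sum_g xhat_g omega^(mg)

y = x[:: n // B]                         # only B time samples
buckets = np.fft.fft(y) / B              # bucket f = sum of xhat_g over g = f (mod B)

expected = np.zeros(B, complex)
for g in range(n):
    expected[g % B] += xhat[g]
assert np.allclose(buckets, expected)    # frequencies hashed into O(k) buckets
```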

SLIDE 16

Overall outline

[Diagram: x → Partial k-sparse recovery → x̂′; the residual x̂ − x̂′ is fed back in.]

Lemma (Partial sparse recovery)

In O(k log n) expected time, we can compute an estimate x̂′ such that x̂ − x̂′ is k/2-sparse.

Repeat, k → k/2 → k/4 → · · ·

Theorem

We can compute x̂ in O(k log n) expected time.

SLIDE 17

How can you isolate frequencies?

[Figure: time-domain windows (left) and their spectra (right); multiplication (×) in time is convolution (∗) in frequency.]

n-dimensional DFT: O(n log n). x → x̂.

n-dimensional DFT of the first k terms: O(n log n). x · rect → x̂ ∗ sinc.

B-dimensional DFT of the first k terms: O(B log B). alias(x · rect) → subsample(x̂ ∗ sinc).

(This last identity is checked numerically below.)
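The last line is a concrete FFT identity worth checking: folding ("aliasing") the windowed signal into B terms and taking a B-point DFT equals subsampling the full n-point DFT of x · rect, i.e. of x̂ ∗ sinc. A numpy sketch, assuming B divides n:

```python
import numpy as np

n, B, w = 1024, 16, 64                      # w-term rectangular window, B buckets
x = np.random.randn(n)

xr = np.where(np.arange(n) < w, x, 0.0)     # x * rect (keep the first w terms)
folded = xr.reshape(n // B, B).sum(axis=0)  # alias(x * rect): fold into B terms

lhs = np.fft.fft(folded)                    # B-point DFT: O(B log B)
rhs = np.fft.fft(xr)[:: n // B]             # subsample(xhat convolved with sinc)
assert np.allclose(lhs, rhs)                # same values, from far fewer samples
```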

SLIDE 18

The issue

We want to isolate frequencies, but the sinc filter "leaks": contamination from other buckets.

[Figure: frequency response of the leaky sinc filter.]

We introduce a better filter: a Gaussian (or prolate spheroidal sequence) convolved with a rectangle.

SLIDE 19

Algorithm for exactly sparse signals

[Figure sequence: the original signal x and the goal x̂; the filtered signal F · x and its spectrum F̂ ∗ x̂; F · x aliased to k terms; the computed samples of F̂ ∗ x̂; the resulting knowledge about x̂.]

Lemma

If t is isolated in its bucket and in the "super-pass" region, the value b we compute for its bucket satisfies b = x̂_t. Computing the b for all O(k) buckets takes O(k log n) time.

SLIDE 20

Algorithm for exactly sparse signals

Lemma

For most t, the value b we compute for t's bucket satisfies b = x̂_t. Computing the b for all O(k) buckets takes O(k log n) time.

Time-shift x by one and repeat: b′ = x̂_t ω^t. Divide to get b′/b = ω^t ⇒ we can compute t.
◮ Just like our 1-sparse recovery algorithm, x₁/x₀ = ω^t.

Gives partial sparse recovery: x̂′ such that x̂ − x̂′ is k/2-sparse.

Repeat k → k/2 → k/4 → · · ·: an O(k log n)-time sparse Fourier transform. (A toy end-to-end round is sketched below.)
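Putting the pieces together, a toy one-round partial recovery for an exactly sparse x̂. This is a sketch under simplifying assumptions: B divides n, and there is no spectrum permutation or filter, so colliding frequencies are simply left for later rounds.

```python
import numpy as np

rng = np.random.default_rng(1)
n, B, k = 1024, 64, 8                      # B = O(k) buckets
xhat = np.zeros(n, complex)
supp = rng.choice(n, size=k, replace=False)
xhat[supp] = rng.standard_normal(k) + 1j * rng.standard_normal(k)
x = np.fft.ifft(xhat) * n                  # x_m = sum_g xhat_g omega^(mg)

L = n // B
b0 = np.fft.fft(x[0::L]) / B               # bucket sums of xhat_g
b1 = np.fft.fft(np.roll(x, -1)[0::L]) / B  # same on the signal shifted by one:
                                           #   bucket sums of xhat_g * omega^g
xhat_est = np.zeros(n, complex)
for f in range(B):
    if abs(b0[f]) < 1e-9:                  # empty bucket
        continue
    g = int(round(np.angle(b1[f] / b0[f]) * n / (2 * np.pi))) % n
    if g % B == f:                         # consistency check against collisions
        xhat_est[g] = b0[f]                # b'/b = omega^g gives g; b gives the value

left = np.count_nonzero(np.abs(xhat - xhat_est) > 1e-6)
print(f"{left} of {k} frequencies still unrecovered")   # typically 0 or a few
```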

SLIDE 21

Algorithm for approximately sparse signals

What changes with noise? The architecture is identical: permute, filter into O(k) buckets, and run 1-sparse recovery on each bucket.

Just requires robust 1-sparse recovery.

SLIDE 22

Algorithm for approximately sparse signals: k = 1

Lemma

Suppose x̂ is approximately 1-sparse: |x̂_t| / ‖x̂‖₂ ≥ 90%. Then we can recover it with O(log n) samples and O(log² n) time.

    x_c / x₀ = ω^(ct) + noise

With exact sparsity: log n bits in a single measurement. With noise: only a constant number of useful bits per measurement. Choose Θ(log n) time shifts c to recover t. An error-correcting code with efficient recovery ⇒ the Lemma. (A simplified sketch follows.)
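A sketch of the noise-tolerant location idea, simplified: deterministic shifts c = n/2^(b+1) that fix one bit of t per measurement, rather than the paper's random shifts and error-correcting code. n is assumed to be a power of two.

```python
import numpy as np

def locate(sample, n):
    # Knowing t mod 2^b, the phase of x_{n/2^(b+1)} / x_0 distinguishes the two
    # remaining candidates t and t + 2^b: their predicted phases differ by pi,
    # so a constant-accuracy (few useful bits) measurement decides the bit.
    x0 = sample(0)
    t = 0
    for b in range(int(np.log2(n))):
        c = n >> (b + 1)
        r = sample(c) / x0                 # approximately omega^(ct)
        cands = [t, t + (1 << b)]
        t = min(cands, key=lambda s: abs(r - np.exp(2j * np.pi * c * s / n)))
    return t

n, t_true, a = 1024, 357, 1.5 - 0.5j       # hypothetical demo values
rng = np.random.default_rng(2)
noisy = lambda c: (a * np.exp(2j * np.pi * c * t_true / n)
                   + 0.05 * (rng.standard_normal() + 1j * rng.standard_normal()))
assert locate(noisy, n) == t_true          # O(log n) noisy samples suffice
```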

SLIDE 23

Algorithm for approximately sparse signals: general k

Lemma

If x̂ is approximately 1-sparse, we can recover it with O(log n) samples and O(log² n) time.

Reduce k-sparse to 1-sparse on buckets of size n/k, with log n overhead per sample.

Theorem

If x̂ is approximately k-sparse, we can recover it in O(k log(n/k) log n) time.

SLIDE 24

Empirical performance

Compare to:
◮ FFTW, the "Fastest Fourier Transform in the West"
◮ AAFFT, the [GMS05] sparse Fourier transform

[Plots: run time vs. signal sparsity k (n = 2²²) and run time vs. signal size n (k = 50), for sFFT 3.0 (exact), FFTW, and AAFFT 0.9.]

Faster than FFTW for a wide range of values.

SLIDE 25

Recap of Sparse Fourier Transform

Theory:
◮ The fastest algorithm for Fourier transforms of sparse data.
◮ The only algorithms faster than the FFT for all k = o(n).

Practice:
◮ Implementation is faster than FFTW for a wide range of inputs.
◮ Orders of magnitude faster than previous sparse Fourier transforms.
◮ Useful in multiple applications.

SLIDE 26

Talk Outline

1. Sparse Fourier Transform
   Overview / Technical Details

2. Beyond: Sparse Recovery / Compressive Sensing
   Overview / Adaptivity / Conclusion

SLIDE 27

Sparse Recovery / Compressive Sensing

Robustly recover sparse x from linear measurements y = Ax.

Examples: sparse Fourier sampling; MRI; the single-pixel camera; streaming algorithms (linearity gives A(x + ∆) = Ax + A∆); genetic testing.

SLIDE 28

My Contributions

Sparse Fourier: minimize time complexity [HIKP12b, HIKP12a]
MRI: minimize Fourier sample complexity [GHIKPS13, IKP14]
Camera: use the Earth-Mover Distance metric [IP11, GIP10, GIPR11]
Streaming: improved analysis of Count-Sketch [MP14, PW11, P11]
Genetic testing: first asymptotic gain using adaptivity [IPW11, PW13]

SLIDE 29

Adaptive Sparse Recovery Model

Unknown approximately k-sparse vector x ∈ Rⁿ.

Choose v ∈ Rⁿ, observe y = ⟨v, x⟩. Choose another v and repeat as needed.

Output x′ satisfying

    ‖x′ − x‖₂ ≤ (1 + ε) · min over k-sparse x(k) of ‖x − x(k)‖₂.

Nonadaptively, Θ(k log(n/k)) measurements are necessary and sufficient. [Candès-Romberg-Tao '06, DIPW '10]

Natural question: does adaptivity help?
◮ Studied in [MSW08, JXC08, CHNR08, AWZ08, HCN09, ACD11, ...]

First asymptotic improvement: O(k log log(n/k)) measurements. [IPW '11] (A naive baseline in this model is sketched below.)
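To make the model concrete, a naive adaptive strategy (binary search with indicator measurements) already locates a single spike, though it spends log₂ n measurements rather than the O(log log n) achieved below; the noise level here is an assumption chosen so that each comparison is reliable:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1 << 16
i_true = int(rng.integers(n))
x = 0.01 * rng.standard_normal(n) / np.sqrt(n)    # noise w with ||w||_2 about 0.01
x[i_true] += 1.0                                  # x = e_i + w

lo, hi, m = 0, n, 0
while hi - lo > 1:
    mid = (lo + hi) // 2
    v = np.zeros(n)
    v[lo:mid] = 1.0                               # ask: is i in the left half?
    y = v @ x                                     # one measurement y = <v, x>
    lo, hi = (lo, mid) if abs(y) > 0.5 else (mid, hi)
    m += 1
assert lo == i_true and m == 16                   # log2(n) adaptive measurements
```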

SLIDE 30

Applications of Adaptivity

[Figures only in the original slides.]

SLIDE 31

Outline of Algorithm

Theorem

Adaptive k-sparse recovery is possible with O(k log log(n/k)) measurements.

[Diagram: x → Permute → Partition, O(k) → 1-sparse recovery on each part → x′]

Suffices to solve for k = 1:

Lemma

Adaptive 1-sparse recovery is possible with O(log log n) measurements.

SLIDE 32

1-sparse recovery: non-adaptive lower bound

Lemma

Adaptive 1-sparse recovery is possible with O(log log n) measurements.

Non-adaptive lower bound: why is this hard?

Hard case: x is a random e_i plus Gaussian noise w with ‖w‖₂ ≈ 1. Robust recovery must locate i.

Observations: ⟨v, x⟩ = v_i + ⟨v, w⟩ = v_i + (‖v‖₂/√n) z, for z ∼ N(0, 1).

SLIDE 33

1-sparse recovery: non-adaptive lower bound

Observe ⟨v, x⟩ = v_i + (‖v‖₂/√n) z, where z ∼ N(0, 1).

Shannon 1948: information capacity

    I(i; ⟨v, x⟩) ≤ (1/2) log(1 + SNR),

where SNR denotes the "signal-to-noise ratio,"

    SNR = E[signal²] / E[noise²] = E[v_i²] / (‖v‖₂²/n) = 1.

Finding i needs Ω(log n) non-adaptive measurements. (The computation behind SNR = 1 is spelled out below.)
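The step behind SNR = 1 is worth making explicit; with i uniform over [n]:

```latex
\mathbb{E}_i\left[v_i^2\right] = \frac{1}{n}\sum_{j=1}^{n} v_j^2 = \frac{\|v\|_2^2}{n}
\;\Longrightarrow\;
\mathrm{SNR} = \frac{\mathbb{E}_i[v_i^2]}{\|v\|_2^2/n} = 1
\;\Longrightarrow\;
I\big(i;\langle v,x\rangle\big) \le \tfrac{1}{2}\log(1+1) = \tfrac{1}{2}\text{ bit}.
```

Since pinning down i requires log n bits, Ω(log n) such measurements are needed.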

SLIDE 34

1-sparse recovery: changes in adaptive setting

Information capacity I(i; ⟨v, x⟩) ≤ (1/2) log(1 + SNR), where SNR = E[v_i²] / (‖v‖₂²/n).

If i is independent of v, this is O(1). As we learn about i, we can increase the SNR.

SLIDE 35

1-sparse recovery: idea

x = e_i + w, measured via ⟨v, x⟩ = v_i + ⟨v, w⟩.

Concentrate v on the shrinking candidate set for i. Across rounds, the known bits grow 0 → 1 → 2 → 4 → 8 → · · ·, the SNR grows 2 → 2² → 2⁴ → 2⁸ → 2¹⁶ → · · ·, and the capacity log SNR of the next measurement grows 1 → 2 → 4 → 8 → 16 → · · ·: each measurement can double the number of known bits. (A simplified accounting follows.)
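A simplified accounting of why this terminates in O(log log n) measurements (an assumption-laden sketch, not the paper's exact constants: spread v uniformly over the n/2^b candidates that remain after b bits are known):

```latex
\mathrm{SNR} = \frac{\mathbb{E}[v_i^2]}{\|v\|_2^2/n}
             = \frac{\|v\|_2^2/(n/2^{b})}{\|v\|_2^2/n} = 2^{b}
\quad\Longrightarrow\quad
I\big(i;\langle v,x\rangle\big) \le \tfrac{1}{2}\log\big(1+2^{b}\big) = \Theta(b)\ \text{bits}.
```

So a constant number of measurements can double the number of known bits, and doubling reaches the log n bits that specify i after O(log log n) rounds.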

SLIDE 36

1-sparse recovery

Lemma (IPW11, PW13)

Adaptive 1-sparse recovery takes Θ(log log n) measurements.

[Plots: number of measurements m as a function of n (SNR = 10 dB, k = 1) and as a function of SNR (n = 8192, k = 1), comparing Gaussian measurements with ℓ₁ minimization against adaptive measurements.]

Gives Θ(k log log(n/k)) k-sparse recovery via the general framework.

SLIDE 37

Summary

Sparse Fourier transform
◮ Fastest algorithm for Fourier transforms on sparse data.
◮ Already has applications with substantial improvements.

Broader sparse recovery theory
◮ Sparse Fourier: minimize time complexity [HIKP12]
◮ MRI: minimize Fourier sample complexity [GHIKPS13, IKP14]
◮ Camera: use the Earth-Mover Distance metric [IP11, GIP10, GIPR11]
◮ Streaming: improved analysis of Count-Sketch [MP14, PW11, P11]
◮ Genetic testing: first asymptotic gain using adaptivity [IPW11, PW13]

Lower bounds
◮ Based on Gaussian channel capacity: tight bounds, extensible to adaptive settings.
◮ Based on communication complexity: extends to the ℓ₁ setting.

Thank You

SLIDE 38

The Future

Make sparse Fourier applicable to more problems
◮ Better sample complexity
◮ Incorporate stronger notions of structure

Tight constants in compressive sensing
◮ Analogous to channel capacity in coding theory.
◮ Lower bound techniques, from information theory, should be strong enough.
