Adaptive Sparse Recovery
Eric Price
MIT
2012-04-26 Joint work with Piotr Indyk and David Woodruff, 2011-2012
Eric Price (MIT) Adaptive Sparse Recovery 2012-04-26 1 / 29
◮ Trying to learn x ∈ Rⁿ. (Here, x ∈ {0, 1, 2}ⁿ.)
◮ Choose coefficients v ∈ Rⁿ.
◮ Measure ⟨v, x⟩ with noise.
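A minimal sketch of this measurement model (the `measure` helper and the noise level are illustrative choices, not from the talk):

```python
import random

def measure(v, x, noise_std=0.1):
    """One noisy linear measurement: <v, x> + w, with Gaussian noise w."""
    signal = sum(vi * xi for vi, xi in zip(v, x))
    return signal + random.gauss(0, noise_std)

random.seed(0)
x = [1, 0, 2, 0, 1]        # hidden vector to learn, entries in {0, 1, 2}
v = [1, 0, 0, 0, 0]        # probe the first coordinate
y = measure(v, x)
print(abs(y - x[0]) < 1)   # True: the noisy reading is close to x[0]
```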
◮ Which of n people have a genetic mutation.
◮ Image.
◮ Traffic pattern of packets on a network.
◮ Genetics: most people don’t have the mutation.
◮ Images: mostly smooth with some edges.
◮ Traffic: Zipf distribution.
◮ $30,000 for a 256×256 IR sensor.
◮ What structure? Sparsity.
◮ Genetic testing: mixing blood samples.
◮ Streaming updates: A(x + ∆) = Ax + A∆.
◮ Camera optics: filter in front of lens.
◮ Quickly.
◮ Robustly: get close to x if x is close to k-sparse.
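The streaming bullet is just linearity: a sketch Ax can be maintained under updates x ← x + ∆ without ever storing x. A small illustration (random ±1 rows are one arbitrary choice of sketch matrix):

```python
import random

random.seed(1)
n, m = 8, 3
A = [[random.choice([-1, 1]) for _ in range(n)] for _ in range(m)]

def sketch(A, x):
    """Compute Ax for a dense matrix A."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

x = [0] * n
y = sketch(A, x)                      # sketch of the current vector

# Stream update x += delta: by linearity, update the sketch directly.
delta = [0, 5, 0, 0, -2, 0, 0, 0]
y = [yi + di for yi, di in zip(y, sketch(A, delta))]   # A(x+∆) = Ax + A∆
x = [xi + di for xi, di in zip(x, delta)]

print(y == sketch(A, x))              # True: sketch stayed consistent
```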
◮ Choose matrix Aᵢ based on previous observations (possibly randomized).
◮ Observe Aᵢx.
◮ Number of measurements m is the total number of rows in all Aᵢ.
◮ Number of rounds is r.
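To make the definition concrete, here is a toy adaptive scheme for noiseless 1-sparse x: binary search, where each round's measurement vector depends on all previous answers. This is only an illustration of adaptivity, not the talk's algorithm (which handles noise):

```python
def inner(v, x):
    return sum(vi * xi for vi, xi in zip(v, x))

def find_support_1sparse(x):
    """Adaptively locate the nonzero index of a 1-sparse x.

    Each round's measurement vector (indicator of the left half of the
    current candidate interval) depends on the previous observations."""
    n = len(x)
    lo, hi = 0, n                      # candidate interval [lo, hi)
    rounds = 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        v = [1 if lo <= j < mid else 0 for j in range(n)]
        y = inner(v, x)                # one measurement per round
        lo, hi = (lo, mid) if y != 0 else (mid, hi)
        rounds += 1
    return lo, rounds

x = [0] * 16
x[11] = 7.0
print(find_support_1sparse(x))        # (11, 4): log2(16) rounds of 1 row each
```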
◮ Using r = O(log log n · log∗ k) rounds.
◮ Separating dependence on n and ε.
◮ Yes, but multiple rounds can be costly.
◮ Programmable pixels (mirrors or LCD display): Yes.
◮ Hardwired lens: No.
◮ Adaptivity corresponds to multiple passes.
◮ Router finding most common connections: No.
◮ MapReduce finding most frequent URLs: Yes.
◮ Equivalently, for constant E[vᵢ²] we can decrease v².
◮ α may not be 1.
◮ Work for a specific x with 3/4 probability.
◮ Distribution over A, for fixed w.
◮ Subsample at rate ε/k and apply the lemma, O(k/ε) times.
◮ Replace k by k/2, repeat.
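A sketch of why subsampling at rate ~1/k reduces k-sparse to 1-sparse recovery: in each independent subsample, with constant probability exactly one heavy coordinate survives, so the 1-sparse lemma applies to it. The simulation below only tracks which support coordinates survive each subsample; the parameters are arbitrary:

```python
import random

random.seed(3)
n, k = 10_000, 8
support = set(random.sample(range(n), k))   # the k heavy coordinates

def isolated_hitters(rate, trials):
    """Heavy coordinates that land alone in at least one subsample."""
    found = set()
    for _ in range(trials):
        s = {j for j in support if random.random() < rate}
        if len(s) == 1:          # exactly one survivor: 1-sparse recovery works
            found |= s
    return found

found = isolated_hitters(rate=1.0 / k, trials=50 * k)
print(found == support)          # True (with overwhelming probability)
```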
[Figure: number of measurements vs. SNR (dB), and vs. log n, comparing Gaussian measurements with L1 minimization against adaptive measurements.]
◮ O((k/ε) log log(n/k)) measurements.
◮ O(log∗ k · log log n) rounds.
◮ For k = 1, tight up to an O(log∗ k) factor in rounds.
◮ Separates dependence on ε and n.
◮ Also: 2 rounds, O((k/ε) log(k/ε) + k log(n/k)) measurements.
◮ Algorithm is O(log∗ k) rounds off the lower bound.
◮ Given 4 iterations, how many total blood tests do we need?
◮ A perfect hash, so heavy hitters land in different blocks.
◮ Each heavy hitter dominates the noise in the same block.
◮ Overall, the noise grows by at most a 1 + ε/2 factor.
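A sketch of the perfect-hashing step: with B = O(k²) blocks, a random hash separates all heavy hitters after an expected O(1) retries. The salted multiplicative hash below is an arbitrary toy choice, not the talk's construction:

```python
import random

random.seed(4)
n, k = 1_000_000, 10
heavy = random.sample(range(n), k)      # indices of the heavy hitters
B = 4 * k * k                           # O(k^2) blocks: birthday bound

def pick_perfect_hash(keys, num_blocks):
    """Retry random salted hashes until the keys are pairwise separated."""
    while True:
        salt = random.randrange(1 << 30)
        h = lambda j, s=salt: (j * 2654435761 + s) % num_blocks
        if len({h(j) for j in keys}) == len(keys):   # no collisions among keys
            return h

h = pick_perfect_hash(heavy, B)
print(len({h(j) for j in heavy}) == len(heavy))   # True: no two heavy hitters collide
```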