Adaptive Sparse Recovery with Limited Adaptivity
Akshay Kamath Eric Price
UT Austin
2018-11-27
Akshay Kamath, Eric Price (UT Austin) Adaptive Sparse Recovery with Limited Adaptivity 1 / 33
Outline
1 Introduction
2 Analysis for k = 1
◮ Genetic testing: mixing blood samples.
◮ Streaming updates: A(x + ∆) = Ax + A∆.
◮ Camera optics: filter in front of lens.
◮ Informally: get close to x if x is close to k-sparse.
◮ Choose matrix Ai based on previous observations (possibly …).
◮ Observe Ai x.
◮ Number of measurements m is total number of rows in all Ai.
◮ Number of rounds is R.
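The measurement model above can be sketched as a loop. This is an illustrative sketch only; the `design_matrix` callback, which picks each Ai from the history of past observations, is a hypothetical name of ours:

```python
import numpy as np

def adaptive_recovery(x, rounds, design_matrix):
    """R-round adaptive sketching loop (illustrative sketch only).

    design_matrix(history, r) returns the next matrix A_r, chosen
    from all (A, y) observations collected so far."""
    history = []   # list of (A_r, y_r) pairs across rounds
    m = 0          # total measurements = total rows over all A_r
    for r in range(rounds):
        A = design_matrix(history, r)   # adapt to past observations
        y = A @ x                       # observe A_r x
        history.append((A, y))
        m += A.shape[0]
    return history, m

# Non-adaptive baseline for comparison: ignore history, fresh Gaussian rows.
rng = np.random.default_rng(0)
n = 64
x = np.zeros(n)
x[7] = 1.0   # a 1-sparse signal
history, m = adaptive_recovery(
    x, rounds=2, design_matrix=lambda h, r: rng.standard_normal((3, n)))
```

An adaptive scheme differs only in that the lambda would inspect `h` when building the next matrix.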
◮ [Malioutov, Sanghavi, Willsky ’08], [Castro, Haupt, Nowak, Raz ’08]
◮ [Indyk-Price-Woodruff ’11], [Nakos, Shi, Woodruff, Zhang ’18]
◮ [Arias-Castro, Candès, …]
◮ [Price, Woodruff ’13]: m ≳ log log n.
◮ R = 1 lower bound: Ω(log n).
◮ Adaptive upper bound: O(log log n).
◮ Adaptive lower bound: Ω(log log n).
◮ Given b bits of information about z.
◮ Identifies z to set of size n/2^b.
◮ Increases SNR, E[v_z^2], by 2^b.
◮ Recover b bits of information in one measurement.
◮ 1 → 2 → · · · → log n in log log n measurements.
◮ R = 2: 1 → √log n → log n in √log n measurements/round.
◮ At each stage, have posterior distribution p on z.
◮ b = log n − H(p) bits known.
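The doubling schedule can be checked numerically. A sketch under the simplifying assumption that each measurement adds exactly the stated number of bits:

```python
import math

def measurements_needed(target_bits, step):
    """Count measurements when a measurement taken at b known bits
    leaves step(b) bits known afterwards."""
    b, count = 1, 0
    while b < target_bits:
        b = step(b)
        count += 1
    return count

log_n = 32   # e.g. n = 2**32, so log n = 32 bits identify z

# Fully adaptive: each measurement can double b -> log log n measurements.
fully_adaptive = measurements_needed(log_n, lambda b: 2 * b)

# R = 2: gain about sqrt(log n) bits per measurement,
# so ~sqrt(log n) measurements per round.
gain = int(math.isqrt(log_n))   # 5 ~= sqrt(32)
two_round = measurements_needed(log_n, lambda b: b + gain)
```

With log n = 32, the doubling schedule 1 → 2 → 4 → 8 → 16 → 32 takes 5 = log log n steps, while the additive schedule takes on the order of √log n steps per round.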
◮ Show any measurement gives O(b + 1) bits of information.
◮ b = log n − H(p) = Σ_z p_z log(n p_z) bits known.
◮ Partition by probability level: S_J = {z | p_z ∈ [2^J/n, 2^{J+1}/n)}.
◮ E[J] ≲ b.
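The quantities on this slide can be written out directly. A minimal sketch; the helper names `bits_known` and `level_set` are ours:

```python
import math

def bits_known(p):
    """b = log2(n) - H(p) = sum_z p_z * log2(n * p_z), for a posterior p on [n]."""
    n = len(p)
    return sum(pz * math.log2(n * pz) for pz in p if pz > 0)

def level_set(p, J):
    """S_J = { z : p_z in [2^J / n, 2^(J+1) / n) }."""
    n = len(p)
    lo, hi = 2.0**J / n, 2.0**(J + 1) / n
    return [z for z, pz in enumerate(p) if lo <= pz < hi]

n = 8
uniform = [1 / n] * n            # nothing known: b = 0
point = [1.0] + [0.0] * (n - 1)  # z fully known: b = log2 n = 3
```

The uniform posterior sits entirely in S_0, matching E[J] ≲ b with b = 0.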
◮ O(m) bits learned in first round.
◮ O(m^2) bits in second round.
◮ Hence m ≳ √log n.
◮ Ω(log log n) for unlimited R.
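The counting argument can be written out as a short derivation; a sketch of the reasoning above, with c an absolute constant:

```latex
\begin{align*}
b_1 &\le c\,m && \text{round 1: each measurement gives } O(1) \text{ bits, since } b = 0 \text{ initially} \\
b_2 &\le b_1 + c\,m\,(b_1 + 1) = O(m^2) && \text{round 2: each measurement gives } O(b_1 + 1) \text{ bits} \\
b_2 &\ge \log n - O(1) && \text{recovery must identify } z \text{ among } n \text{ candidates} \\
&\Rightarrow\; m = \Omega\!\left(\sqrt{\log n}\right).
\end{align*}
```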
◮ True but too weak: would get Ω(√k log n), not k√log n.
◮ Strong but false: if algorithm does 1-sparse recovery on first block, it …
◮ But the learned bits are only about that first block.
◮ Strong enough, at least for constant R.
◮ True for product distributions p...
◮ ...but correlated p can make this false.
◮ True!
◮ Strong enough if b > k log k after the first round.
◮ Less restriction on k? Conjecture: …
◮ Better dependence on R?
◮ [IPW ’11]: cleanup is recursive, multiplying rounds by O(log∗ k).
◮ [NSZW ’18]: 1 round setup, 2 rounds cleanup.
◮ Instead, reduce to C-approximate k-sparse recovery for C ≫ 1.
◮ This is solvable nonadaptively in O(k log_C(n/k) · log∗ k) measurements.
1 Throw the coordinates into B = k · 2^√log n buckets.
◮ k log(B/k) = k√log n measurements.
2 Apply k-sparse 2^√log n-approximate recovery.
◮ k log_C n = k√log n measurements.
◮ There will be collisions.
◮ Yet if x has no noise, must find every entry.
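Step 1 (bucketing) can be sketched as follows. The helper `bucketize` is hypothetical: it just hashes coordinates into B = k · 2^√log n buckets and counts how many of the k heavy coordinates collide:

```python
import math
import random

def bucketize(support, n, k):
    """Hash coordinates of [n] into B = k * 2^(sqrt(log2 n)) buckets and
    count collisions among the k heavy coordinates in `support`."""
    B = k * 2 ** int(math.isqrt(int(math.log2(n))))
    h = {z: random.randrange(B) for z in support}   # hash only what we inspect
    buckets = {}
    for z in support:
        buckets.setdefault(h[z], []).append(z)
    collisions = sum(len(v) - 1 for v in buckets.values())
    return B, collisions

random.seed(1)
n, k = 2**16, 8                       # log2 n = 16, sqrt = 4, so B = 8 * 16 = 128
support = random.sample(range(n), k)  # k heavy coordinates
B, collisions = bucketize(support, n, k)
```

With B ≫ k buckets, a birthday-bound argument makes collisions rare but not impossible, which is why the cleanup step is needed.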
◮ Successful recovery must find all but O(1) binary entries of x.
◮ Goal: preimage of k-sparse recovery on y includes large entries in x.
◮ Take union of three independent sparse recovery attempts.
◮ Expected false negatives are O(noise), so can be skipped.
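A toy version of the union trick, under the assumption (ours, for illustration) that each attempt independently finds each heavy entry with probability 0.9; an entry is a false negative only if all three attempts miss it:

```python
import random

def union_of_attempts(heavy, attempt, trials=3):
    """Union of `trials` independent recovery attempts."""
    found = set()
    for _ in range(trials):
        found |= attempt(heavy)
    return found

random.seed(0)
heavy = set(range(20))
# Hypothetical attempt: finds each entry independently with probability 0.9.
attempt = lambda s: {z for z in s if random.random() < 0.9}
found = union_of_attempts(heavy, attempt)
missed = heavy - found   # each entry missed with probability 0.1**3 = 0.001
```

Taking the union drives the per-entry miss probability from 0.1 down to 0.001, which is the sense in which the expected false negatives become negligible.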
◮ Are ω(k) measurements necessary for unlimited R?