
Learning and Optimization for Next Generation Wireless Networks
Tara Javidi, with S. Chiu, A. Lalitha, N. Ronquillo, O. Shayevitz, S. Shubhanshu, and Y. Kaspi


  3. Spectrum Sensing: Problem Statement
  - An angular band B ⊂ [0, 2π) of directions is available for transmission
  - Angular resolution δ ≤ B
  - Subsets of B are used sequentially by the transmitter (receiver)
  - Inspecting a subset a yields a signal-plus-noise measurement: Y_a = a^T(W + Z), with Z ~ N(0, δσ²I) and a, W ∈ {0, 1}^{B/δ}, ||W||_0 = K
  - Objective: minimize E{τ_ε}

  4. Measurement-Dependent Noisy Search

  8. Measurement-Dependent Noisy Search
  - Unknown parameter: W ∈ {0, 1}^{B/δ}, ||W||_0 = 1
  - Actions A(t) ∈ A ⊂ {0, 1}^{B/δ}, chosen sequentially
  - Observation: Y(t) = A(t)(W + Z) = A(t)W + Ẑ; the observation noise variance increases with |A(t)|
  - At times t = 1, ..., τ − 1: take sample A(t) and observe Y(t); then declare Ŵ = d(Y^{τ−1}, A^{τ−1}), with error event 1{Ŵ ≠ W}
  - Objective: find τ, A(0), ..., A(τ − 1), and d(·) minimizing E[τ] subject to Pe ≤ ε
  - Numerical solution via a dynamic programming equation
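The observation model above can be sketched in a few lines of code. All parameter values below (number of bins, per-bin variance) are assumed purely for illustration; the point is that the effective noise in Y = A^T(W + Z) has variance proportional to the size of the queried set:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64            # number of bins B/delta (assumed for illustration)
sigma2 = 0.25     # per-bin noise variance delta * sigma^2 (assumed)

# Unknown parameter: one-hot W, ||W||_0 = 1.
W = np.zeros(N)
W[rng.integers(N)] = 1.0

def measure(A):
    """One observation Y = A^T (W + Z) with Z ~ N(0, sigma2 * I).
    The effective noise A^T Z has variance sigma2 * |A|, so it grows
    linearly with the size of the queried set."""
    Z = rng.normal(0.0, np.sqrt(sigma2), size=N)
    return float(A @ (W + Z))

# Empirically check the variance scaling for |A| = 1 vs |A| = N/2.
A_small = np.zeros(N); A_small[0] = 1.0
A_large = np.zeros(N); A_large[: N // 2] = 1.0
v_small = np.var([measure(A_small) for _ in range(20_000)])
v_large = np.var([measure(A_large) for _ in range(20_000)])
print(v_small, v_large)   # close to sigma2 and sigma2 * N / 2
```

This is the tension the talk exploits: wide queries gather information about many bins at once but at a proportionally higher noise level.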

  13. Simpler Questions of General Consequence
  - Role of the allowable action set A:
    - Designing A can significantly reduce the overhead
    - Even though the noise variance increases linearly with |a|!
  - Select A(t) based on past observations (a feedback scheme) or off-line (non-adaptively)?
    - What is the adaptivity gain?
    - Feedback policies are computationally expensive

  18. Role of Measurements
  - Role of the allowable action set A: the advantages of group testing
    - If A contains only singletons (||A(t)||_0 = 1), the search time is O(B/δ)
    - If A includes intervals, it can be O(log(B/(δε)))
  - Observation: if Y_a = X + Z with X = 1{object in a} and Z ~ N(0, σ²), then E[τ] ≈ log(B/(δε)) / I(X; Y_a)
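The E[τ] ≈ log(B/(δε)) / I(X; Y_a) heuristic above is easy to evaluate. In this sketch (noise level, band size, and target error are assumed values), the mutual information of the binary query channel is estimated by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(1)

sigma = 0.5              # noise level (assumed for illustration)
n = 200_000              # Monte Carlo sample size

# X = 1{object in the queried set}, for a half-interval query (p = 1/2).
x = rng.integers(0, 2, size=n).astype(float)
y = x + sigma * rng.normal(size=n)

def gauss(v, m):
    # Unnormalized Gaussian density; the constant cancels in the ratio.
    return np.exp(-(v - m) ** 2 / (2 * sigma ** 2))

p_cond = gauss(y, x)
p_marg = 0.5 * gauss(y, 0.0) + 0.5 * gauss(y, 1.0)
I = float(np.mean(np.log2(p_cond / p_marg)))   # I(X; Y_a) in bits

B_over_delta, eps = 1024, 1e-3                 # assumed values
tau_est = np.log2(B_over_delta / eps) / I      # E[tau] ~ log(B/(delta*eps)) / I
print(I, tau_est)
```

Since I(X; Y_a) is bounded by one bit per query, the estimate is always at least log₂(B/(δε)) samples; noise only inflates that logarithmic baseline.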

  24. Adaptivity Gain
  - Selecting A(t) based on past observations (a feedback scheme) is computationally expensive
  - Critical to quantify the adaptivity (feedback) gain: E[τ^na_ε] − E[τ*_ε]
  - Asymptotic analysis as B/δ grows
    - Qualitative difference between B growing and δ shrinking:
      - when B grows, the overall noise variance grows;
      - the overall noise is constant even when 1/δ grows
    - Need for a fairly tight non-asymptotic analysis

  31. Our Contributions: Main Take-aways (general K, K = 1)
  - Searching with codebooks with feedback over a stateful channel (K = 1)
    - Reduces the non-adaptive case to known IT problems
    - Adaptive strategy as a variant of a feedback code
  - Non-asymptotic achievability analysis for an adaptive scheme
    - Sorted Posterior Matching (SortPM) search strategy
  - Characterization of the adaptivity gain in two distinct asymptotic regimes as B/δ → ∞:
    - fixed search interval and increasing resolution (initial access)
    - fixed resolution and increasing search interval (primary-user detection)

  32. Code to Search (outline: non-adaptive search, search strategies, upper bound, prior work, generalizations)

  39. Non-asymptotic Converse for Non-adaptive Search
  - Searching via coding over a stateful channel
  - Reduces the non-adaptive case to a known IT problem: Y = X_q + Z_q, with X_q ~ Ber(q) and Z_q ~ N(0, (qB/δ)σ²)
  - Converse: E[τ^NA_ε] ≥ ((1 − ε) log(B/δ) − h(ε)) / C_BPSK(q*, σ√(q*B/δ))
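The converse bound can be evaluated numerically. In this sketch (all parameter values are assumed for illustration), C_BPSK(q, s) is taken to be the mutual information of a binary-input Gaussian channel with input bias q and noise standard deviation s, computed by integration on a grid, and the bound is maximized over q:

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def mi_binary_awgn(q, s):
    """I(X; Y) in bits for X ~ Ber(q), Y = X + N(0, s^2),
    computed by Riemann-sum integration on a fine grid."""
    y = np.linspace(-8 * s, 1 + 8 * s, 40001)
    dy = y[1] - y[0]
    norm = 1.0 / np.sqrt(2 * np.pi * s * s)
    p0 = norm * np.exp(-(y ** 2) / (2 * s * s))
    p1 = norm * np.exp(-((y - 1) ** 2) / (2 * s * s))
    py = (1 - q) * p0 + q * p1
    tiny = 1e-300
    kl0 = np.sum(p0 * np.log2((p0 + tiny) / (py + tiny))) * dy
    kl1 = np.sum(p1 * np.log2((p1 + tiny) / (py + tiny))) * dy
    return (1 - q) * kl0 + q * kl1

# Assumed illustrative values.
B_over_delta, sigma2, eps = 256, 0.01, 1e-2

qs = np.linspace(0.01, 0.5, 50)
caps = [mi_binary_awgn(q, np.sqrt(sigma2 * q * B_over_delta)) for q in qs]
q_star, C_star = float(qs[int(np.argmax(caps))]), float(max(caps))

lower = ((1 - eps) * np.log2(B_over_delta) - h2(eps)) / C_star
print(q_star, C_star, lower)
```

The grid search over q mirrors the q* in the slide: wider queries raise both the signal rate and the measurement-dependent noise, so the capacity-maximizing bias is interior.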

  44. Non-adaptive and Adaptive Search Strategies
  Non-adaptive strategy:
  - Fix the number of samples τ = T
  - Select T such that E{Pe} ≤ ε
  - For all t ≤ T, query a random set a with |a| = q*B/δ, where q* is optimized

  50. Non-adaptive and Adaptive Search Strategies
  Sorted Posterior Matching (SortPM) strategy:
  - Maintain the posterior ρ(t) := (P{W = e_i | A(0 : t − 1), Y(0 : t − 1)})_i
  - Declare i as the target if ρ_i(t) ≥ 1 − ε, i ∈ Ω
  - Otherwise, query the bins to the left of the median of the sorted posterior:
    - observe the (noisy) Y
    - update the posterior via Bayes' rule
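The loop above can be sketched as runnable code. This is a minimal sketch, not the authors' implementation: details the slides leave open are assumed here, namely Gaussian observations, a query set built from the largest-mass bins of the sorted posterior until half the mass is covered, and noise variance proportional to the query size:

```python
import numpy as np

rng = np.random.default_rng(0)

def sort_pm_search(N=128, sigma2=0.5, eps=1e-3, max_steps=10_000):
    """Minimal SortPM sketch (implementation details assumed).

    Maintains the posterior rho over N bins; each step queries the bins
    left of the median of the *sorted* posterior, observes a noisy
    indicator whose noise variance grows with the query size, and
    applies Bayes' rule.  Declares once some bin reaches mass 1 - eps.
    Returns (declared bin correct?, number of samples used)."""
    w = rng.integers(N)                      # true target bin
    rho = np.full(N, 1.0 / N)                # uniform prior
    for t in range(1, max_steps + 1):
        if rho.max() >= 1 - eps:
            return int(np.argmax(rho)) == w, t
        order = np.argsort(rho)[::-1]        # sort posterior, largest first
        cum = np.cumsum(rho[order])
        k = int(np.searchsorted(cum, 0.5)) + 1
        A = np.zeros(N, dtype=bool)
        A[order[:k]] = True                  # query set
        var = sigma2 * k / N                 # measurement-dependent noise
        y = float(A[w]) + rng.normal(0.0, np.sqrt(var))
        lik = np.where(A, np.exp(-(y - 1.0) ** 2 / (2 * var)),
                          np.exp(-(y ** 2) / (2 * var)))
        rho = rho * lik
        rho /= rho.sum()                     # Bayes update
    return int(np.argmax(rho)) == w, max_steps

results = [sort_pm_search() for _ in range(50)]
acc = float(np.mean([ok for ok, _ in results]))
avg_steps = float(np.mean([t for _, t in results]))
print(acc, avg_steps)
```

Note how the strategy is automatically adaptive: once the posterior concentrates, the query set shrinks to a single bin, which also shrinks the measurement noise.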

  52. SortPM: Upper Bound
  Theorem [Lalitha, Ronquillo, and J. '17]. Under SortPM,
    E[τ_SPM] ≤ min_α (log(B/(δε)) + max{log log(B/δ), log log(1/ε)}) / (1 − h(Q((σ²αB/δ)^{−1/2}))) + K(α),
  where h(p) = p log(1/p) + (1 − p) log(1/(1 − p)) is the binary entropy, Q(·) is the Gaussian tail function, and K(·) is a non-increasing function.
  - The analysis is based on a Lyapunov drift
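The leading term of the theorem's bound is straightforward to evaluate. In this sketch all parameter values are assumed, and K(α) is omitted since the talk only states that it is non-increasing; note that without K(α) the minimization degenerates toward the smallest α, and it is exactly K(α) that penalizes small α:

```python
import math

def h2(p):
    """Binary entropy (bits)."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def Q(x):
    """Gaussian tail probability."""
    return 0.5 * math.erfc(x / math.sqrt(2))

# Assumed illustrative values.
B_over_delta, eps, sigma2 = 1024, 1e-3, 0.05

def leading_term(alpha):
    """The bound without the K(alpha) term."""
    num = (math.log2(B_over_delta / eps)
           + max(math.log2(math.log2(B_over_delta)),
                 math.log2(math.log2(1.0 / eps))))
    p = Q((sigma2 * alpha * B_over_delta) ** -0.5)
    return num / (1.0 - h2(p))

alphas = [k / 100 for k in range(1, 100)]
best = min(alphas, key=leading_term)
print(best, leading_term(best))
```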

  54. SortPM: Upper Bound (cont.)
  Corollary [Lalitha, Ronquillo, and J. '17]. Relying on hard-detected output symbols, the asymptotic adaptivity gain for B/δ → ∞ satisfies
    lim_{δ→0} (τ^NA_opt − E[τ^A_opt]) / log(B/δ) = 1/C_BPSK(q*, Bσ²) − 1,
    lim_{B→∞} (τ^NA_opt − E[τ^A_opt]) / ((B/δ) log(B/δ)) ≥ σ² log e.

  55. Prior Work: Measurement-Independent Noise
  - Generalized binary search [Burnashev and Zigangirov '74]
  - Channel coding over a DMC with feedback [Burnashev '75], [Yamamoto and Itoh '79], ..., [Naghshvar, Wigger and J. '13]
  - Posterior matching [Shayevitz and Feder '11]
  - Bisection search with noisy responses [Horstein '63], [Waeber, Frazier, Henderson '13]

  63. Generalizations
  - General noise model: Y(t) = A(t)W + Ẑ, with Ẑ = f(Z, A(t))
  - Fixed (hierarchical) beam patterns

  69. Generalizations and On-going Work
  - Search for multiple targets (K > 1)
    - Noisy sequential group testing [Atia and Saligrama '12]; mapped to an OR MAC [Kaspi, Shayevitz, J. '15]
    - Factor of 1/K in rate, where K bounds (or equals) the number of targets
    - ‡ Case of an adder channel
  - Dynamic case: W(t)
    - Results generalize to an unknown but constant speed (cut the rate by half)
  - Beyond Gaussian
    - Similar results for binary symmetric noise (hard decoding) [Kaspi, Shayevitz, J. '14]

  74. Empirical Network Parameter Tuning
  Network performance is a function f : X → R of the network parameters.
  Assumptions:
  - X is the set of network parameters and protocols
  - f(x) is the network performance; f(x₁) and f(x₂) are "correlated"
  - f is observed with noise: y = f(x) + η(x), where η is non-persistent noise
  Goal: design a sequential strategy for selecting n query points x₁, ..., xₙ to identify a global optimizer of f.
  Performance measures:
  - Simple regret: S_n = f(x*) − f(x*_n)
  - Cumulative regret: R_n = Σ_{t=1}^{n} (f(x*) − f(x_t))  [bandit]
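The two regret notions can be made concrete with a toy example. The performance surface and the uniform-random query strategy below are assumptions chosen only to illustrate how S_n and R_n are computed, not a strategy the talk proposes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the network performance surface (assumed shape):
# a smooth single-peak function over a 1-D parameter grid.
X = np.linspace(0, 1, 201)
f = np.exp(-40 * (X - 0.63) ** 2)       # f(x), peaked at x* = 0.63

def noisy_query(i):
    """y = f(x_i) + eta, eta ~ N(0, 0.1^2) (non-persistent noise)."""
    return f[i] + 0.1 * rng.normal()

# A naive uniform-random strategy, just to illustrate the two regrets.
n = 500
idx = rng.integers(0, len(X), size=n)
ys = np.array([noisy_query(i) for i in idx])

inst_regret = f.max() - f[idx]          # f(x*) - f(x_t) at each step
R_n = float(inst_regret.sum())          # cumulative regret (bandit)
i_hat = idx[np.argmax(ys)]              # recommend the best observed point
S_n = float(f.max() - f[i_hat])         # simple regret
print(R_n, S_n)
```

Simple regret only penalizes the final recommendation, so pure exploration is fine; cumulative regret charges every query, which is what forces the exploration-exploitation trade-off in the bandit view.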

  75. Break: Questions?

  76. Experiment Design: Single-shot (intuitive overview, heuristic approaches, notation, mutual information, EJS, achievability)

  77. Design of Experiments [Blackwell '51]
  - M mutually exclusive hypotheses: H_i ⇔ {θ = i}, i = 1, 2, ..., M
  - Prior ρ(0) = [ρ₁(0), ..., ρ_M(0)], with ρ_i(0) = P(θ = i)
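The hypothesis-testing setup above reduces to a posterior recursion once an observation model is fixed. The Gaussian observation model below is an assumption for illustration only; the slide specifies just the hypotheses and the prior:

```python
import numpy as np

rng = np.random.default_rng(0)

# M mutually exclusive hypotheses with a uniform prior rho(0).
M = 4
rho = np.full(M, 1.0 / M)                 # rho_i(0) = P(theta = i)

def bayes_update(rho, y):
    """One Bayes step under an assumed model: under H_i, y ~ N(i, 1)."""
    lik = np.exp(-(y - np.arange(M)) ** 2 / 2)
    post = rho * lik
    return post / post.sum()

true_theta = 2
for _ in range(30):
    y = true_theta + rng.normal()
    rho = bayes_update(rho, y)
print(rho)                                # mass concentrates on theta = 2
```

Experiment design then asks which experiment (observation channel) to run at each step so that this posterior concentrates as fast as possible.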
