Robust Quantum Minimum Finding with an application to hypothesis selection


  1. Robust Quantum Minimum Finding with an application to hypothesis selection. Yihui Quek (joint work with Clément Canonne (IBM Research Almaden) and Patrick Rebentrost (CQT, NUS)). Stanford University, 21 Apr 2020.

  2. Problem: Hypothesis selection

Problem (Hypothesis selection). Given:
◮ an unknown probability distribution $p_0$, with sample access to it;
◮ $N$ known candidate distributions $\mathcal{P} = \{p_1, \dots, p_N\}$, with a PDF comparator between every pair.
Task: output a distribution $\hat{p} \in \mathcal{P}$ with small $\ell_1$-distance to $p_0$, using as few samples from $p_0$ as possible.
Remark: maximum likelihood does not work for $\ell_1$-distance.

  3. A closely related problem

Problem (Robust minimum finding). Given:
◮ a list of $N$ elements $\{x_i\}_{i=1}^{N}$;
◮ a well-defined distance metric $d(x_i, x_j)$.
Task: find the minimum using an imprecise pairwise comparator between elements.
◮ Comparator imprecision: the comparator outputs the correct answer if the elements are far enough apart; otherwise, no guarantees.
◮ Result: this can be done in $\tilde{O}(\sqrt{N})$ comparator invocations.

  4. Noisy comparator

Input to the comparator: indices $i, j$.
$$\mathrm{NoisyComp}(i, j) = \begin{cases} \operatorname{argmin}\{x_i, x_j\} & \text{if } d(x_i, x_j) > 1, \\ \text{unknown (possibly adversarial)} & \text{otherwise.} \end{cases} \tag{1}$$
Definition (Oracle notation). We denote the oracle implementing the noisy comparator by $\hat{O}$ and the noiseless comparator by $\hat{O}^{(0)}$. We count the number of calls to either $\hat{O}$ or $\hat{O}^{(0)}$.
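Since the slide only specifies the comparator's interface, a classical simulation is easy to write down. Below is a minimal Python sketch; the name noisy_comp, the real-valued elements, and the random coin inside the fudge zone are illustrative assumptions (the slide leaves the in-zone behaviour unspecified and possibly adversarial).

```python
import random

def noisy_comp(x, y, d, threshold=1.0):
    """Model of the noisy comparator in Eq. (1).

    Returns 0 if the first argument is judged smaller, 1 otherwise.
    When the pair is farther apart than `threshold`, the answer is
    guaranteed correct; inside the 'fudge zone' a random coin stands
    in for the slide's 'unknown (possibly adversarial)' behaviour.
    """
    if d(x, y) > threshold:
        return 0 if x < y else 1   # far apart: answer is reliable
    return random.randint(0, 1)    # fudge zone: no guarantee

# Example with real numbers and d(x, y) = |x - y|:
#   noisy_comp(0.2, 5.0, lambda a, b: abs(a - b))  -> always 0
#   noisy_comp(0.2, 0.9, lambda a, b: abs(a - b))  -> unreliable
```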

  5. Classical noisy minimum selection

Definition ($t$-approximation). An element $y \in L$ is a $t$-approximation of the true minimum $y^*$ if it satisfies $d(y, y^*) < t$.
Lemma (Optimal approximation guarantee). For any algorithm that outputs a $t$-approximation with $t < 2$, $\Pr[\text{error}] \ge \frac{1}{2} - \frac{1}{2N}$.
Hence, we will aim for a 2-approximation guarantee.

  6. Classical noisy minimum selection, part 2

How does the run time depend on $N$? Classically, linear is optimal.
Theorem (COMB, Theorem 15 of [AFJOS'16]). A classical randomized algorithm, COMB$(\delta, S)$, outputs a 2-approximation of the minimum with probability $\ge 1 - \delta$, using $O(N \log\frac{1}{\delta})$ queries to the noisy comparator.
We will do this in sublinear time, i.e. $\tilde{O}(\sqrt{N})$.
Assumption: there exists $\Delta \in [N]'$ such that at most $2\Delta$ elements are contained in the fudge zone of any element in the list.
◮ This is a reasonable assumption in most cases, including hypothesis selection.
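COMB's internals are not shown on the slides, so the sketch below is deliberately not COMB: it is the naive linear scan that any noisy-selection algorithm must improve upon, making $N - 1$ comparator calls with no robustness guarantee. It consumes comparators with the noisy_comp-style interface from the previous slide.

```python
def sequential_min(xs, comp):
    """Naive linear-scan minimum with a (possibly noisy) comparator.

    NOT the COMB algorithm of [AFJOS'16]; just a baseline showing the
    O(N) comparator-call budget. COMB adds repetition tricks on top to
    guarantee a 2-approximation with probability >= 1 - delta.
    """
    best = xs[0]
    for x in xs[1:]:
        if comp(x, best) == 0:   # comparator says x is the smaller one
            best = x
    return best
```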

  7. Recap: Dürr–Høyer quantum minimum finding

Figure: schematic of the Dürr–Høyer '96 algorithm; the exponential search subroutine is BBHT '98.
Key point: quantum exponential search rapidly moves the pivot to lower ranks.
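The rank dynamics of Dürr–Høyer can be mimicked classically by replacing quantum exponential search with "draw a uniformly random element of lower rank than the pivot". The toy simulation below (names and structure are ours, not the original algorithm's) reproduces the expected $O(\log N)$ pivot changes but not the quantum query count.

```python
import random

def durr_hoyer_sim(ranks):
    """Classical simulation of (noiseless) Durr-Hoyer minimum finding.

    Quantum exponential search is modeled as sampling a uniformly random
    element strictly below the current pivot: each pivot change halves
    the expected rank, so O(log N) changes suffice in expectation.
    """
    y = random.choice(ranks)        # random initial pivot
    while True:
        smaller = [r for r in ranks if r < y]
        if not smaller:             # pivot is already the minimum
            return y
        y = random.choice(smaller)  # 'marked' element found by the search
```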

  8. Some initial observations

◮ What happens if we naively run Dürr–Høyer with the noisy comparator? Problem: we could, in principle, go back up the ranks!
◮ We will show that we still make positive progress down the ranks in expectation, provided the rank of the pivot is $\Omega(1 + \Delta)$.
◮ However, this stops working once the pivot is $o(1 + \Delta)$ ranks from the minimum.
◮ V1 algorithm: stop iterating when the pivot is, in expectation, $O(1 + \Delta)$ ranks from the minimum. This is already an improvement over $O(N)$.

  9. Subroutine: QSearchWithCutoff

◮ We add an explicit run-time cutoff to exponential search and allow it to use the noisy oracle $\hat{O}$.
Lemma. Let the current pivot $y$ have rank $r > \Delta$. Then QSearchWithCutoff$(\hat{O}, y, 9\sqrt{N/(r - \Delta)})$ succeeds in finding a marked element with probability at least $\frac{1}{2}$.
Proof. Follows from running for twice the expected run time and applying Markov's inequality.
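A toy classical stand-in for QSearchWithCutoff, reused by the later sketches. Only the interface (cutoff, success flag) is faithful: the quantum version finds a marked element in roughly $\sqrt{N/r}$ oracle calls, while this classical mock needs roughly $N/r$ uniform samples, so the speedup itself is not reproduced.

```python
import math
import random

def qsearch_with_cutoff_sim(items, pivot, cutoff, comp):
    """Toy stand-in for QSearchWithCutoff(O, y, cutoff).

    Samples uniform candidates and tests them against the pivot with
    the (noisy) comparator, giving up after `cutoff` attempts. Returns
    (element, True) on success and (None, False) on failure.
    """
    for _ in range(int(math.ceil(cutoff))):
        cand = random.choice(items)
        if comp(cand, pivot) == 0:   # comparator marks cand below the pivot
            return cand, True
    return None, False
```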

  10. We will present three algorithms. Algorithm 1: Pivot-Counting QMF

  11. Algorithm 1: Pivot-Counting QMF

This differs from Dürr–Høyer quantum minimum finding in two ways:
◮ At each iteration, we run QSearchWithCutoff with a finite cutoff at $9\sqrt{N/(1+\Delta)}$, i.e. it may fail to change the pivot.
◮ Instead of cutting off the total run time, we cap the number of runs of QSearchWithCutoff at $N_{\mathrm{trials}}$.
We will differentiate between 'attempted' and 'successful' pivot changes.

  12. PCQMF pseudocode

Algorithm 1: Pivot-Counting Quantum Minimum Finding, PivotQMF$(\hat{O}, \Delta, N_{\mathrm{trials}})$
Input: comparison oracle $\hat{O}$, $\Delta \in [N]'$, $N_{\mathrm{trials}} \in \mathbb{Z}^+$.
  $y \leftarrow \mathrm{Unif}[N]$.
  for $k \in [N_{\mathrm{trials}}]$ do
    $(y', g) \leftarrow$ QSearchWithCutoff$(\hat{O}, y, 9\sqrt{N/(1+\Delta)})$.
    if $O_y(y') = 1$ then $y \leftarrow y'$.
Output: $y$.
For the next few slides, we will pretend all pivot changes are successful. A runnable sketch follows below.
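A Python rendering of the pseudocode above, built on qsearch_with_cutoff_sim from slide 9. Parameter names are ours; only the control flow follows the slide.

```python
import math
import random

def pivot_qmf_sim(items, comp, Delta, n_trials):
    """Sketch of Algorithm 1 (PivotQMF) on top of the toy
    qsearch_with_cutoff_sim. An attempted pivot change only succeeds
    if the noisy comparator confirms the candidate is below the pivot."""
    N = len(items)
    cutoff = 9 * math.sqrt(N / (1 + Delta))   # fixed finite cutoff
    y = random.choice(items)                  # y <- Unif[N]
    for _ in range(n_trials):                 # N_trials attempted changes
        cand, found = qsearch_with_cutoff_sim(items, y, cutoff, comp)
        if found and comp(cand, y) == 0:      # O_y(y') = 1: accept change
            y = cand
    return y
```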

  13. Lemma: worst-case expected improvement

◮ For 'high' rank:
Lemma (Worst-case expected rank change for $r_i > 3(\Delta + 1)$). For $r_i > 3(\Delta + 1)$,
$$\mathbb{E}[r_{i+1} \mid r_i] \le \frac{1}{r_i + \Delta} \sum_{s=1}^{r_i + \Delta} s = \frac{r_i + \Delta + 1}{2} < \frac{2}{3}\, r_i.$$
◮ For 'low' rank:
Lemma. For $r_i \le 3(\Delta + 1)$, $\mathbb{E}[r_{i+1} \mid r_i] \le 4\Delta + 3$.

  14. Crucial intuition

With the noiseless comparator, $\mathbb{E}[r_{i+1} \mid r_i] \le \frac{r_i}{2}$, i.e. the length of the 'remaining list' halves with every pivot change.
◮ With every successful pivot change, we still make positive progress down the ranks with the noisy comparator, only with a worse factor: $2/3$ instead of $1/2$.
◮ Hence the number of successful pivot changes needed is still logarithmic in $N$, namely $\sim \log_{3/2}(N)$; see the calculation below.
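Making the step from the $2/3$ factor to the pivot-change count explicit (this calculation is implicit between this slide and the next):

```latex
% Each successful pivot change contracts the expected rank by a factor
% of at most 2/3 (high-rank lemma), so after N_p successful changes
%   E[rank] <= (2/3)^{N_p} N.
% Requiring this to reach the low-rank regime 4*Delta + 3 gives
\left(\tfrac{2}{3}\right)^{N_p} N \;\le\; 4\Delta + 3
\quad\Longleftrightarrow\quad
N_p \;\ge\; \frac{\log\big(N/(4\Delta+3)\big)}{\log(3/2)},
% which is exactly the N_p used on the next slide.
```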

  15. Completing the argument

Argument so far: with $N_p = \frac{\log(N/(4\Delta+3))}{\log(3/2)}$ successful pivot changes, the expected final rank is $\le 4\Delta + 3$. To finish off:
◮ Each attempt succeeds with probability $> \frac{1}{2}$, so a Chernoff bound controls the number of attempts needed to get $N_p$ successful pivot changes.
◮ Markov's inequality bounds the actual final rank (paying a multiplicative factor).

Name     | Success prob. | Final guarantee      | Run time
PivotQMF | 3/4           | rank(y) ≤ 16∆ + 16   | $\tilde{O}(\sqrt{N/(1+\Delta)})$

  16. Algorithm 2: Repeated PivotQMF

  17. Algorithm 2: Repeated PivotQMF

◮ Use PivotQMF as a basic subroutine to find some element of 'pretty-good' rank with constant probability.
◮ Repeat $\log(1/\delta)$ times to get a pool of indices.
◮ Use classical minimum selection on the pool.

Algorithm 2: Repeated Pivot Quantum Minimum Finding, RepeatedPivotQMF$(\hat{O}, \delta, \Delta)$
Input: $\hat{O}$, $\delta$, $\Delta$.
  $S \leftarrow \emptyset$.
  Stage I: finding a pretty-small element w.h.p.
  for $i = 1, \dots, \log_4(2/\delta)$ do
    $y \leftarrow$ PivotQMF$\big(\hat{O}, \Delta, \big\lceil 8 \max\big(\frac{\log(N/(4\Delta+3))}{\log(3/2)}, 2 \ln N\big) \big\rceil\big)$.
    $S \leftarrow S \cup \{y\}$.
  Stage II: classical minimum selection with the noisy comparator.
  Perform COMB$(\delta/2, S)$.
Output: the output of COMB$(\delta/2, S)$.
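A sketch of Algorithm 2 on top of the earlier sketches. Stage II uses sequential_min as a stand-in for COMB (see the caveat on slide 6); the N_trials expression follows the pseudocode above. Here Delta is the gap parameter $\Delta$ and delta is the failure probability $\delta$.

```python
import math

def repeated_pivot_qmf_sim(items, comp, Delta, delta):
    """Sketch of Algorithm 2 (RepeatedPivotQMF), reusing pivot_qmf_sim
    and sequential_min from the earlier slides."""
    N = len(items)
    # N_trials = ceil(8 * max(log(N/(4*Delta+3))/log(3/2), 2 ln N))
    n_trials = math.ceil(8 * max(
        math.log(N / (4 * Delta + 3)) / math.log(1.5),
        2 * math.log(N)))
    # Stage I: pool of log_4(2/delta) candidates from PivotQMF
    pool = [pivot_qmf_sim(items, comp, Delta, n_trials)
            for _ in range(math.ceil(math.log(2 / delta, 4)))]
    # Stage II: classical minimum selection on the pool
    return sequential_min(pool, comp)
```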

  18. Guarantees

Name             | Success prob. | Final guarantee      | Run time
RepeatedPivotQMF | 1 − δ         | rank(y) ≤ 18∆ + 16   | $\tilde{O}(\sqrt{N/(1+\Delta)} \log(N) + \log^2(1/\delta))$

Intuition:
◮ Use a quick-and-dirty quantum subroutine to find a 'representative' element; bootstrap with repetitions.
◮ Use a slow and precise classical subroutine to select the best of the repetitions.

  19. Algorithm 3: RobustQMF

  20. RobustQMF overview

◮ The rank-approximation guarantee of PivotQMF can be strengthened to a distance guarantee.
◮ Key idea of RobustQMF:
  ◮ Run PivotQMF to get a 'pretty-good' element $Y_{\mathrm{out}}$ of rank $\sim O(\Delta + 1)$.
  ◮ (*) Get a classical sublist of elements ranking below $Y_{\mathrm{out}}$.
  ◮ Run a classical minimum-selection algorithm on the sublist.
◮ Final approximation guarantee: a 2-approximation of the minimum, which is essentially optimal.
(*) This almost works, but our run-time cutoff for exponential search at $9\sqrt{N/(1+\Delta)}$ now comes back to bite us...

  21. Two refinements

◮ Getting the classical sublist of elements ranking below $Y_{\mathrm{out}}$:
  ◮ Fix $Y_{\mathrm{out}}$ as a pivot, then repeatedly apply QSearchWithCutoff$(\hat{O}, Y_{\mathrm{out}}, 9\sqrt{N/(1+\Delta)})$.
◮ The run-time cutoff problem: for $Y_{\mathrm{out}}$ of rank $< 1 + 2\Delta$, the expected run time of exponential search exceeds $9\sqrt{N/(1+\Delta)}$, so the above cutoff may be premature!
  ◮ The denominator of the run-time cutoff comes from the number of marked elements.
  ◮ Fix: append $K$ dummy elements that will always be marked to the list ($K = 2\Delta$ works); a sketch of this padding follows below.
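A short sketch of the dummy-padding fix. Representing dummies by a sentinel object and wrapping the comparator are assumptions of this sketch; the slide only requires that the dummies always count as marked.

```python
DUMMY = object()   # sentinel: always 'marked', never a real list element

def pad_with_dummies(items, Delta):
    """Append K = 2*Delta dummy elements: they keep the number of marked
    elements at least 1 + 2*Delta, so the fixed cutoff
    9*sqrt(N/(1+Delta)) is never premature."""
    return list(items) + [DUMMY] * (2 * Delta)

def comp_with_dummies(comp):
    """Wrap a comparator so dummies always compare as smaller (marked)."""
    def wrapped(a, b):
        if a is DUMMY:
            return 0
        if b is DUMMY:
            return 1
        return comp(a, b)
    return wrapped
```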

  22. RobustQMF

Algorithm 3: Robust Quantum Minimum Finding, RobustQMF$(\hat{O}, \delta, \Delta)$
Input: $\hat{O}$, $\delta$, $\Delta$.
  Stage I: finding a 'pretty-small' element with RepeatedPivotQMF.
  $Y_{\mathrm{out}} \leftarrow$ RepeatedPivotQMF$(\hat{O}, \delta/2)$.
  Stage II: finding even smaller elements.
  $S \leftarrow \{Y_{\mathrm{out}}\}$; $\tilde{T} \leftarrow 19\Delta + 16$.
  for $i = 1, \dots, 2 \ln 2 \cdot \tilde{T} \log(4\tilde{T}/\delta)$ do
    $(y_k, g) \leftarrow$ QSearchWithCutoff$(\hat{O}, Y_{\mathrm{out}}, 9\sqrt{N/(1+\Delta)}, \mathrm{list} = L')$.
    if $O_{Y_{\mathrm{out}}}(y_k) = 1$ and $y_k$ is not a dummy element then $S \leftarrow S \cup \{y_k\}$.
  Remove repeated elements from $S$.
  Stage III: classical minimum selection with the noisy comparator.
  Perform COMB$(\delta/4, S)$.
Output: the output of COMB$(\delta/4, S)$.
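Stitching the pieces together, a sketch of RobustQMF on top of the earlier sketches. The number of Stage II rounds follows the loop bound as reconstructed above ($2\ln 2 \cdot \tilde{T} \log(4\tilde{T}/\delta)$) and should be treated as approximate; sequential_min again stands in for COMB.

```python
import math

def robust_qmf_sim(items, comp, Delta, delta):
    """Sketch of Algorithm 3 (RobustQMF)."""
    # Stage I: find a pretty-small pivot
    y_out = repeated_pivot_qmf_sim(items, comp, Delta, delta / 2)
    # Stage II: collect elements below the pivot on the dummy-padded list
    padded = pad_with_dummies(items, Delta)
    ncomp = comp_with_dummies(comp)
    cutoff = 9 * math.sqrt(len(padded) / (1 + Delta))
    T = 19 * Delta + 16
    S = {y_out}
    n_rounds = math.ceil(2 * math.log(2) * T * math.log(4 * T / delta))
    for _ in range(n_rounds):
        cand, found = qsearch_with_cutoff_sim(padded, y_out, cutoff, ncomp)
        if found and cand is not DUMMY and ncomp(cand, y_out) == 0:
            S.add(cand)   # the set removes repeated elements automatically
    # Stage III: classical minimum selection on S
    return sequential_min(list(S), comp)
```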

  23. Table of main algorithms

Name             | Success prob. | Final guarantee          | Run time
PivotQMF         | 3/4           | rank(y) ≤ 16∆ + 16       | $\tilde{O}(\sqrt{N/(1+\Delta)})$
RepeatedPivotQMF | 1 − δ         | rank(y) ≤ 18∆ + 16       | $\tilde{O}(\sqrt{N/(1+\Delta)})$
RobustQMF        | 1 − δ         | d(y, y*) ≤ 2 (optimal)   | $\tilde{O}(\sqrt{N(1+\Delta)})$

Table: comparison of quantum minimum-finding algorithms. $y^*$: true minimum.

  24. Application: sublinear-time hypothesis selection

  25. Scheffé test (algorithm)

Unknown distribution $q$ (sample access). We want to choose the closer of two hypotheses, $p_i$ and $p_j$.

Algorithm 4: Scheffé test
Input: access to distributions $p_i \in \mathcal{P}$ and $p_j \in \mathcal{P}$; $\{x_k\}_{k=1}^{K}$ i.i.d. samples from the unknown distribution $q$.
  Compute $\mu_S = \frac{1}{K} \sum_{k=1}^{K} \mathbb{1}[x_k \in S_{ij}]$.
Output: if $|p_i(S_{ij}) - \mu_S| \le |p_j(S_{ij}) - \mu_S|$, output $p_i$; otherwise output $p_j$.
Here $S_{ij} = \{x : p_i(x) > p_j(x)\}$ is the Scheffé set of the pair.
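A direct Python rendering for finite discrete distributions. Representing each distribution as a dict of probabilities over an explicit domain is an assumption of this sketch; the slide only requires the set probabilities $p_i(S_{ij})$ and sample access to $q$.

```python
def scheffe_test(p_i, p_j, samples, domain):
    """Scheffe test (Algorithm 4): pick whichever of two known
    distributions better matches the empirical mass of the Scheffe set
    S_ij = {x : p_i(x) > p_j(x)} under the unknown sample source q."""
    S_ij = {x for x in domain if p_i.get(x, 0) > p_j.get(x, 0)}
    mu_S = sum(1 for x in samples if x in S_ij) / len(samples)  # empirical mass
    pi_S = sum(p_i.get(x, 0) for x in S_ij)                     # p_i(S_ij)
    pj_S = sum(p_j.get(x, 0) for x in S_ij)                     # p_j(S_ij)
    return p_i if abs(pi_S - mu_S) <= abs(pj_S - mu_S) else p_j

# Toy example: samples drawn from something close to p1.
p1 = {0: 0.5, 1: 0.5}
p2 = {0: 0.9, 1: 0.1}
samples = [0, 1, 1, 0, 1, 0, 0, 1]
assert scheffe_test(p1, p2, samples, domain={0, 1}) is p1
```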

  26. Algorithm by picture

For a pair of hypotheses $p_1, p_2$...
Figure: illustration of the Scheffé test.
