  1. Probabilistic Computation CSC5802, Computational Complexity Derek Kern December 11th, 2010

  2. What will we be doing?
 We will be exploring the field of probabilistic computation
 We'll see:
− Examples
− Turing machines: deterministic, nondeterministic, and probabilistic
− Probabilistic complexity classes: RP, co-RP, ZPP, BPP, and PP
− Probabilistic class relationships

  3. PRIME(x)
 The decision problem PRIME(x): Given an integer x, is x in the set of prime integers?
 The Sieve of Eratosthenes – the first known solution
− Given an integer x whose primality is in question, write down all of the integers i, 1 < i ≤ x. Starting with the smallest integer q in the list (at the beginning, this will always be 2), remove from the list all multiples of q. Next, remove the multiples of the next remaining integer q in the list; repeat this step until q² > x or until x is removed from the list. If at any point x is removed from the list, then x is composite; otherwise, it is prime.
− Unfortunately, Eratosthenes' Sieve is exponential in the length of its input
 Each q whose multiples must be removed is a prime number. So, for each prime q ≤ √x, the remaining list entries must be traversed, and the number of these entries is on the order of x. Since the input length is n = log₂ x, work proportional to x is exponential in n.
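The sieve described above can be sketched in Python (the function name is mine; the early exit when x itself is removed follows the slide's description):

```python
def is_prime_sieve(x):
    """Decide PRIME(x) with the Sieve of Eratosthenes.

    The work is polynomial in x itself, and therefore exponential
    in the bit-length n = log2(x) of the input.
    """
    if x < 2:
        return False
    # Write down all of the integers 2..x; alive[i] is True while i remains.
    alive = [True] * (x + 1)
    q = 2
    while q * q <= x:                 # repeat until q^2 > x
        if alive[q]:                  # q is the smallest remaining integer
            for multiple in range(q * q, x + 1, q):
                alive[multiple] = False
                if multiple == x:
                    return False      # x was removed, so x is composite
        q += 1
    return alive[x]
```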

  4. PRIME(x)
 Yet, PRIME(x) is known to be in P [AKS2004]
− PRIME(x) is decidable deterministically in roughly Õ(log⁶ x) time, i.e. polynomial in the length of the input
 Prior to 2004, the best known algorithms for solving PRIME were probabilistic
 These probabilistic algorithms have been used for prime testing in a variety of applications (e.g. encryption)
− Many encryption algorithms need prime numbers to operate. Thus, in order to generate a prime number, they choose a number within a range and then test its primality

  5. Solovay/Strassen
 Let's look at the Solovay/Strassen probabilistic algorithm for PRIME(x) (actually, for COMPOSITE)
 Quick note
− Euler's Criterion (EC): Given an odd prime p and a base integer a, a^((p−1)/2) ≡ (a | p) (mod p), where (a | p) is the Jacobi symbol
− This congruence holds for all odd primes p over all bases. It also holds for some odd composites over some bases a; such bases are called 'Euler liars'
− For any odd composite c, at most half of the bases a < c are liars
 The algorithm – Given an integer q to test
− Randomly choose a base a where 2 ≤ a < q
− If EC does not hold for a and q, then return "COMPOSITE"
− If EC holds, return "PROBABLY PRIME"
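A single round of the test can be sketched in Python; the Jacobi-symbol routine is the standard binary algorithm, and the function names are mine:

```python
import random

def jacobi(a, n):
    """Jacobi symbol (a | n) for odd n > 0 (binary algorithm)."""
    a %= n
    result = 1
    while a != 0:
        while a % 2 == 0:              # pull out factors of 2
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a                    # quadratic reciprocity
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0     # 0 means gcd(a, n) > 1

def solovay_strassen_round(q):
    """One round of Solovay/Strassen on an odd q > 2.

    "COMPOSITE" is always correct; "PROBABLY PRIME" is wrong with
    probability at most 1/2 when q is composite.
    """
    a = random.randrange(2, q)         # random base 2 <= a < q
    j = jacobi(a, q) % q               # map -1 to q - 1 for comparison
    if j == 0 or pow(a, (q - 1) // 2, q) != j:
        return "COMPOSITE"             # Euler's Criterion fails
    return "PROBABLY PRIME"
```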

  6. Solovay/Strassen
 Given q and a, imagine that we received the answer "PROBABLY PRIME"
− What do we do?
 Since at most half of the bases less than q are liars, we repeat by randomly choosing another base
− Each time we iterate, our confidence in the answer "PROBABLY PRIME" grows
− In fact, if we repeat the process k times (receiving "PROBABLY PRIME" each time), then the chance that q isn't prime is at most 2^−k
− Somewhere around k ≈ 20, our confidence in the primality of q would be bounded by our confidence in the functioning of the computer

  7. Solovay/Strassen
 Given that an iteration of SS runs in time polynomial in the length of q (it is dominated by one modular exponentiation), iterating SS a constant number of times provides us with a practical method for testing the primality of a given integer to a specific tolerance in polynomial time
 The Solovay/Strassen algorithm places COMPOSITE in the complexity class RP (a.k.a. Randomized Polynomial-time). We'll see RP later...

  8. Turing Machines revisited
 Reminder: A deterministic TM (DTM) M is defined by the quadruple (K, Σ, δ, s) [CP1994]
− K is a finite set of states
− Σ is a finite alphabet (typically {1, 0, #, b})
− δ is a transition function. It maps K × Σ to K × Σ × {m : m is a tape-head movement}
− s is a member of K. It is the initial state
 Reminder: A nondeterministic TM (NDTM) is defined by the quadruple (K, Σ, δ, s) [CP1994]
− Essentially, an NDTM is the same as a DTM, except that δ is not a transition function but a relation
− Since a relation allows many different outcomes for a given K × Σ combination, an oracle is used to guide computation

  9. Turing Machines revisited
 So, what about Probabilistic Turing Machines? How are they defined?
 A probabilistic TM (PTM) M is defined by the quintuple (K, Σ, δ₁, δ₂, s)
− Note that a PTM has two transition functions: δ₁ and δ₂
− At each step, a PTM randomly chooses whether to use δ₁ or δ₂
− Taken together, δ₁ and δ₂ amount to a relation where the path choice is determined randomly
− For an actual PTM, δ₁ and δ₂ will often have most of their transitions in common, which means that random choice only affects a few transitions
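The definition can be made concrete with a small simulator (a sketch under my own encoding: transition functions as dicts, the blank symbol written 'b' as in the alphabet above):

```python
import random

def run_ptm(delta1, delta2, s, tape, accept, max_steps=1000):
    """Simulate a PTM (K, Sigma, delta1, delta2, s).

    delta1 and delta2 map (state, symbol) -> (state, symbol, move),
    with move in {-1, 0, +1}.  At each step, one of the two
    transition functions is chosen uniformly at random.
    """
    cells = dict(enumerate(tape))          # sparse tape; blank is 'b'
    state, head = s, 0
    for _ in range(max_steps):
        if state == accept:
            return True
        symbol = cells.get(head, 'b')
        delta = random.choice((delta1, delta2))
        if (state, symbol) not in delta:
            return False                   # no transition: halt and reject
        state, write, move = delta[(state, symbol)]
        cells[head] = write
        head += move
    return False

# Toy machine: on input "1", delta1 accepts immediately while delta2
# moves to a dead state, so the machine accepts with probability 1/2.
d1 = {('q0', '1'): ('qa', '1', 0)}
d2 = {('q0', '1'): ('qr', '1', 0)}
random.seed(1)
accepts = sum(run_ptm(d1, d2, 'q0', '1', 'qa') for _ in range(1000))
```

Setting delta1 = delta2 recovers an ordinary DTM, as the next slide notes.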

  10. Turing Machines revisited
 How do PTMs relate to DTMs?
− All DTMs are PTMs, since we can simply let δ₁ = δ₂
− PTMs can also be defined in deterministic terms by giving them an extra tape whose cell values have been randomized
 How do PTMs relate to NDTMs?
− The class relationships between PTMs and NDTMs are no more settled than those between DTMs and NDTMs
− As is evidenced by P ?= NP, L ?= NL, and EXPTIME ?= NEXPTIME, we do not understand whether nondeterminism truly allows us to overcome intractability. Thus, this issue will likely remain unsettled until we know the status of P ?= NP

  11. Gamble anyone?
 Wisdom from Papadimitriou [CS1994] – "In some very real sense, computation is inherently randomized. It can be argued that the probability that a computer will be destroyed by a meteorite during any given microsecond of its operation is at least 2^−100."

  12. Monte Carlo vs Las Vegas
 There are two general types of probabilistic algorithms
– Monte Carlo algorithms [ER2009]
• These algorithms always run efficiently
• However, they can, at times, return an incorrect answer, i.e. false positives or false negatives
• The probability of an incorrect answer is known
– Las Vegas algorithms
• These algorithms always return correct answers; no false positives or false negatives
• However, they may not run efficiently
• The probability of inefficient operation is known
 Solovay/Strassen is a Monte Carlo algorithm. It always runs efficiently but may return an incorrect answer
 Las Vegas algorithms were initially used for graph isomorphism testing
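The contrast can be illustrated with a toy search problem (my own example, not from the slides): return an index holding a 1 in an array where half the entries are 1.

```python
import random

def find_one_las_vegas(bits):
    """Las Vegas: the returned index is always correct; only the
    number of probes is random (expected 2 on a half-full array)."""
    while True:
        i = random.randrange(len(bits))
        if bits[i] == 1:
            return i

def find_one_monte_carlo(bits, k=10):
    """Monte Carlo: at most k probes, so the running time is fixed,
    but it may fail (return None) with probability 2**-k here."""
    for _ in range(k):
        i = random.randrange(len(bits))
        if bits[i] == 1:
            return i
    return None
```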

  13. Background: Error Reduction
 Question for the class: Which is of more practical use?
− Choice #1: An algorithm that returns YES or NO such that there is a 0% chance of a false positive and a 51% chance of a false negative. This algorithm is O(n).
− Choice #2: An algorithm that returns YES or NO such that there is a 0% chance of a false positive and an 8% chance of a false negative. This algorithm is O(n²).
 Let k be the number of iterations and 1 − ϵ the per-iteration chance of error; the probability of error over k iterations is (1 − ϵ)^k [CP1994]. So, error diminishes very quickly as we iterate (and receive confirming answers)
 If we iterate Choice #1 for k = 4 iterations, then the chance of a false negative drops to 0.51⁴ ≈ 7%. Furthermore, since k is a constant, Choice #1 is still O(n)
 Thus, Choice #1 is better
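The amplification of Choice #1 can be checked empirically (a simulation of the hypothetical tester on a YES instance, not an algorithm from the slides):

```python
import random

def noisy_yes(p_false_negative=0.51):
    """One run of Choice #1 on a YES instance: never a false
    positive, but it answers NO with probability 0.51."""
    return random.random() >= p_false_negative

def amplified_yes(k=4):
    """Answer YES if any of k independent runs says YES; the
    false-negative rate drops to 0.51**k (about 6.8% for k = 4)."""
    return any(noisy_yes() for _ in range(k))

random.seed(3)
trials = 100_000
error_rate = sum(not amplified_yes() for _ in range(trials)) / trials
```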

  14. Background: Error Reduction
 So, the lesson is: Even though many of the probabilistic complexity classes include error bounds in their definitions, don't get too focused on the precise error bounds [CS1994]
– Given a randomized algorithm A with a chance of incorrectness of 1 − ϵ, we can always create an A' that simply iterates A so that the error of A' fits within prescribed bounds

  15. RP
 Also known as Randomized Polynomial time
 Essentially, RP is a class of languages accepted by Monte Carlo algorithms
 Definition of RP – Given a language L and a polynomial-time PTM M
• If x is in L, then the probability that M accepts x is greater than or equal to ½
• If x is not in L, then the probability that M accepts x is zero
– In other words, we can get false negatives when determining membership in an RP language
– RP contains all languages L for which there is a PTM M that behaves as described above
 Remember Solovay/Strassen? It shows that COMPOSITE is in RP (and hence that PRIME(x) is in co-RP)

  16. co-RP
 The complement of RP
 Like RP, co-RP is a class of languages accepted by Monte Carlo algorithms
 Definition of co-RP – Given a language L and a polynomial-time PTM M
• If x is in L, then the probability that M accepts x is one
• If x is not in L, then the probability that M accepts x is at most ½
– In other words, we can get false positives when determining membership in a co-RP language
– co-RP contains all languages L for which there is a PTM M that behaves as described above

  17. ZPP
 Also known as Zero-error Probabilistic Polynomial time
 Essentially, ZPP is the class of languages accepted by Las Vegas algorithms
 Definition of ZPP [ER2009]
– Given a language L and PTM M
• If x is in L, then M accepts with probability one
• If x is not in L, then M rejects with probability one
• There is some polynomial p(n) such that for all inputs w with |w| = n, the expected running time of M is less than p(n). Equivalently, M can be cut off after p(n) steps; on the (rare) runs that would exceed this bound, it halts and returns 'UNKNOWN'
 So, for algorithms that accept languages in ZPP, YES/NO answers are always accurate. However, there is a chance that we won't get an answer
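A standard way to obtain such a Las Vegas algorithm is to combine one-sided testers of the RP and co-RP kinds for the same language (this is the fact that ZPP = RP ∩ co-RP). The toy testers below, for the language of even numbers, are my own illustration:

```python
import random

def rp_even(x):
    """Toy RP-style tester for L = even numbers: a YES is always
    correct; on a genuine member it says YES only half the time."""
    return x % 2 == 0 and random.random() < 0.5

def corp_even(x):
    """Toy co-RP-style tester: a NO is always correct; on a
    non-member it says NO only half the time."""
    return x % 2 == 0 or random.random() < 0.5

def zpp_decide(x, rounds=64):
    """Las Vegas decision: each round consults both one-sided
    testers.  An RP-YES or a co-RP-NO is certain, so any definite
    answer is correct; otherwise retry.  The expected number of
    rounds is at most 2, and 'UNKNOWN' occurs with probability
    at most 2**-rounds."""
    for _ in range(rounds):
        if rp_even(x):
            return "YES"
        if not corp_even(x):
            return "NO"
    return "UNKNOWN"
```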
