  1. Minimum Circuit Size, Graph Isomorphism, and Related Problems
     Andrew Morgan, University of Wisconsin–Madison, November 1st, 2018
     Based on work with E. Allender, J. Grochow, D. van Melkebeek, and C. Moore

  2. Minimum Circuit Size
     MCSP = {(x, θ) : x has circuit complexity at most θ}
     How hard is MCSP?

  3. How hard is MCSP?
     Some known reductions:
     • Factoring ∈ ZPP^MCSP [Allender–Buhrman–Koucký–van Melkebeek–Ronneburger]
     • DiscreteLog ∈ ZPP^MCSP [Allender–Buhrman–Koucký–van Melkebeek–Ronneburger, Rudow]
     • GI ∈ RP^MCSP [Allender–Das]
     • SZK ⊆ BPP^MCSP [Allender–Das]
     where GI = graph isomorphism and SZK = problems with statistical zero-knowledge protocols.
     One can replace MCSP by MµP for any complexity measure µ polynomially related to circuit size.

  4. KT Complexity
     Describe a string x by a program p so that p(i) = i-th bit of x.
     KT(x) = smallest |p| + T, where
     • p describes x
     • p runs in at most T steps for all i
     MKTP = {(x, θ) : KT(x) ≤ θ}
     Time-bounded Turing machines with advice ≅ circuits ⟹ KT is polynomially related to circuit complexity.
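To make the definition concrete, here is a minimal Python sketch. It is not the formal model (real KT is defined over a fixed universal machine, with |p| measured in bits and T counted exactly); the names `described_string` and `p_zeros` are illustrative inventions.

```python
from typing import Callable, Optional

# A "description" of x is modeled as a function p with p(i) = i-th bit of x,
# and p(i) = None once i is past the end of x (the end-of-string marker).

def described_string(p: Callable[[int], Optional[int]]) -> str:
    """Recover x from its description p by querying bits until end-of-string."""
    bits, i = [], 0
    while (b := p(i)) is not None:
        bits.append(str(b))
        i += 1
    return "".join(bits)

def p_zeros(n: int) -> Callable[[int], Optional[int]]:
    """Description of 0^n: O(log n) bits to store n and polylog(n) time per
    query, witnessing KT(0^n) = polylog(n)."""
    return lambda i: 0 if i < n else None

assert described_string(p_zeros(6)) == "000000"
```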

  5. How hard is MµP?
     Some known reductions:
     • Factoring ∈ ZPP^MµP
     • DiscreteLog ∈ ZPP^MµP
     • GI ∈ RP^MµP
     • SZK ⊆ BPP^MµP
     where GI = graph isomorphism, SZK = problems with statistical zero-knowledge protocols, and MµP = MCSP, MKTP, ...
     Can we eliminate the error in the GI and SZK reductions?

  6. A zero-error reduction
     Theorem. GI ∈ ZPP^MKTP
     This is a fundamentally different reduction from before.
     It extends to any ‘explicit’ isomorphism problem, including several where the best known algorithms are still exponential.
     It doesn’t (yet) work for MCSP.

  7. How do the old reductions work?
     They hinge on MµP breaking PRGs; there is a PRG from any one-way function [Håstad–Impagliazzo–Levin–Luby].
     Inversion Lemma [Allender–Buhrman–Koucký–van Melkebeek–Ronneburger]. There is a poly-time randomized Turing machine M with oracle access to MµP so that the following holds: for any circuit C, if σ ∼ {0,1}^n, then
         Pr[C(τ) = C(σ)] ≥ 1/poly(|C|), where τ = M(C, C(σ)).
     Example: Fix a graph G. Let C map a permutation σ to σ(G). Then M inverts C: if σ(G) is a random permutation of G, then M(C, σ(G)) finds τ such that τ(G) = σ(G) with good probability.
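A minimal Python sketch of the example circuit, assuming graphs are given as adjacency matrices. The helper `apply_perm` is an illustrative invention, and the inverter M is exactly what the lemma supplies, so it is not implemented here.

```python
import random

def apply_perm(G, sigma):
    """Adjacency matrix of sigma(G): edge (u, v) is sent to (sigma(u), sigma(v))."""
    n = len(G)
    H = [[0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            H[sigma[u]][sigma[v]] = G[u][v]
    return H

G = [[0, 1, 0, 0, 1],
     [1, 0, 1, 0, 0],
     [0, 1, 0, 1, 0],
     [0, 0, 1, 0, 1],
     [1, 0, 0, 1, 0]]                      # a 5-cycle, as a stand-in example
C = lambda sigma: apply_perm(G, sigma)     # in the lemma, C is a Boolean circuit

sigma = random.sample(range(5), 5)         # uniform element of S_5
# The Inversion Lemma promises that M(C, C(sigma)), using its MµP oracle,
# outputs some tau with C(tau) = C(sigma) with probability >= 1/poly(|C|).
```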

  8. Example: GI ∈ RP^MCSP
     Theorem [Allender–Das]. GI ∈ RP^MµP
     Given G0 ≅ G1, use M to find an isomorphism.
     Let C(σ) = σ(G0), where σ ∼ S_n.
     M inverts C: given a random σ(G0), M finds τ with τ(G0) = σ(G0).
     G0 ≅ G1 implies that σ(G1) is distributed the same as σ(G0).
     So M(C, σ(G1)) finds τ with τ(G0) = σ(G1) ⟹ GI ∈ RP^MµP.
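The whole reduction then looks roughly as follows, as a sketch only: `M` is passed in as a hypothetical inverter, and `apply_perm` is reused from the sketch above. Note where the one-sidedness comes from: any returned τ is verified deterministically, so a nonisomorphic pair is never accepted.

```python
import random

def rp_gi(G0, G1, M, trials=50):
    """Sketch of the Allender–Das reduction.  M stands in for the inverter
    from the Inversion Lemma (the real M makes MµP oracle queries)."""
    n = len(G0)
    C = lambda sigma: apply_perm(G0, sigma)     # C(sigma) = sigma(G0)
    for _ in range(trials):
        sigma = random.sample(range(n), n)      # uniform sigma in S_n
        y = apply_perm(G1, sigma)               # if G0 ~= G1, sigma(G1) is
                                                # distributed exactly as sigma(G0)
        tau = M(C, y)                           # try to invert C at y
        if tau is not None and apply_perm(G0, tau) == y:
            return True      # tau(G0) = sigma(G1): verified isomorphism, accept
    return False             # one-sided error: nonisomorphic pairs never accepted
```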

  9. Eliminating error?
     Similar results:
     • Factoring ∈ ZPP^MµP
     • DiscreteLog ∈ ZPP^MµP
     • GI ∈ RP^MµP
     • SZK ⊆ BPP^MµP
     How to eliminate the error? MµP is only used to generate witnesses, which are then checked in deterministic polynomial time. Thus, showing GI ∈ coRP^MµP by a similar approach implicitly requires GI ∈ coNP, i.e., NP-witnesses for nonisomorphism.
     Our approach instead uses MKTP to help with verification.

  10. A zero-error reduction
      Theorem. GI ∈ ZPP^MKTP
      Nonisomorphism has NP^MKTP witnesses.
      Key idea: KT complexity is a good estimator for the entropy of samplable distributions.

  11. Graph Isomorphism in ZPP^MKTP

  12. Graph Isomorphism
      GI = decide whether two given graphs (G0, G1) are isomorphic
      Aut(G) = group of automorphisms of G
      Number of distinct permutations of G = n!/|Aut(G)|
      To show GI ∈ ZPP^MKTP, it suffices to show GI ∈ coRP^MKTP, i.e., to witness nonisomorphism.
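The count n!/|Aut(G)| can be checked by brute force on tiny graphs; a sketch reusing `apply_perm` from the earlier snippet, with `P3` (the path on three vertices) as an illustrative input.

```python
from itertools import permutations
from math import factorial

def distinct_copies(G):
    """All distinct adjacency matrices sigma(G) over sigma in S_n.
    Brute force, feasible only for tiny n."""
    n = len(G)
    return {tuple(map(tuple, apply_perm(G, list(p))))
            for p in permutations(range(n))}

# The path on 3 vertices has |Aut| = 2 (identity and the end-swap),
# so it has 3!/2 = 3 distinct permuted copies.
P3 = [[0, 1, 0],
      [1, 0, 1],
      [0, 1, 0]]
assert len(distinct_copies(P3)) == factorial(3) // 2
```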

  13. KT Complexity
      Recall: KT(x) = smallest |p| + T where p describes x in time T.
      Intuition for bounding KT(x): describe a string x by a program p taking advice α so that p^α(i) = i-th bit of x.
      KT(x) is then the smallest |p| + |α| + T where
      • p with advice α describes x
      • p runs in at most T steps for all i

  14. KT Complexity
      Examples:
      1. KT(0^n) = polylog(n): store n in the advice, and define p(i) to output 0 if i ≤ n and end-of-string otherwise.
      2. G = adjacency matrix of a graph: KT(G) ≤ (n choose 2) + polylog(n).
      3. Let y = t copies of G: KT(y) ≤ KT(G) + polylog(nt).
      4. Let y = a sequence of t numbers from {5, 10, 10^300, −46}: O(1) bits describe the set, plus 2t bits describe the sequence given the set, so KT(y) ≤ 2t + polylog(t).
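Example 4 can be made concrete with a short sketch; `encode`/`decode` are illustrative names, and the O(1)-bit description of the set itself is left implicit.

```python
# A sequence over a fixed 4-element set is described by the set (O(1) bits
# in the real model) plus a 2-bit index per element: about 2t bits total,
# matching the KT(y) <= 2t + polylog(t) bound.
S = [5, 10, 10**300, -46]

def encode(seq):
    return "".join(format(S.index(v), "02b") for v in seq)   # 2 bits per entry

def decode(bits):
    return [S[int(bits[i:i + 2], 2)] for i in range(0, len(bits), 2)]

seq = [10, -46, 5, 10**300, 10]
assert decode(encode(seq)) == seq
assert len(encode(seq)) == 2 * len(seq)                      # exactly 2t bits
```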

  15. Witnessing nonisomorphism: rigid graphs
      Let G0, G1 be rigid graphs, i.e., graphs with no non-trivial automorphisms.
      Key fact: if G0 ≅ G1, there are n! distinct graphs among the permutations of G0 and G1; if G0 ≇ G1, there are 2(n!).
      Consider sampling r ∼ {0,1} and π ∼ S_n uniformly, and outputting the adjacency matrix of π(G_r).
      • If G0 ≅ G1, this has entropy s = log(n!).
      • If G0 ≇ G1, this has entropy s + 1.
      Main idea: use the KT-complexity of a random sample to estimate the entropy.

  16. Witnessing nonisomorphism: rigid graphs
      Let y = π(G_r), π ∼ S_n, r ∼ {0,1}.
      Hope: KT(y) is typically near the entropy, and never much larger.
      [Diagram: KT(y) concentrates near s = log(n!) when G0 ≅ G1 and near s + 1 when G0 ≇ G1, with a threshold θ in between.]
      Then KT(y) > θ is a witness of nonisomorphism.

  17. Witnessing nonisomorphism: rigid graphs
      Let y = π_1(G_{r_1}) π_2(G_{r_2}) ··· π_t(G_{r_t}), with π_i ∼ S_n and r_i ∼ {0,1}.
      Truth: KT(y)/t is typically near the entropy, and never much larger.
      [Diagram: KT(y)/t concentrates near s = log(n!) when G0 ≅ G1 and near s + 1 when G0 ≇ G1, with a threshold θ in between.]
      Then KT(y)/t > θ is a witness of nonisomorphism.
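For intuition only, here is a sketch of the sampling and the threshold test, reusing `apply_perm` from earlier. KT is not feasibly computable, and the actual reduction queries the MKTP oracle; the zlib-compressed length below is a crude, uncalibrated stand-in, so only the shape of the test is meaningful.

```python
import random
import zlib
from math import factorial, log2

def sample_y(G0, G1, t):
    """y = pi_1(G_{r_1}) ... pi_t(G_{r_t}) as a flat byte sequence of 0/1 values."""
    n = len(G0)
    bits = bytearray()
    for _ in range(t):
        Gr = random.choice((G0, G1))                 # r_i ~ {0,1}
        pi = random.sample(range(n), n)              # pi_i ~ S_n
        H = apply_perm(Gr, pi)
        bits.extend(H[u][v] for u in range(n) for v in range(n))
    return bytes(bits)

def kt_proxy(y):
    """Crude stand-in for KT (absolute scale is wrong; illustration only)."""
    return 8 * len(zlib.compress(y, 9))

def looks_nonisomorphic(G0, G1, t=200):
    s = log2(factorial(len(G0)))                     # rigid case: s = log(n!)
    theta = s + 0.5                                  # threshold between s and s+1
    return kt_proxy(sample_y(G0, G1, t)) / t > theta
```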

  18. Bounding KT in the isomorphic case
      Let y = π_1(G_{r_1}) π_2(G_{r_2}) ··· π_t(G_{r_t}). Goal: KT(y) ≪ ts + t.
      Since G0 ≅ G1, rewrite y = τ_1(G0) τ_2(G0) ··· τ_t(G0). Describe y as:
      • Fixed data: n, t, the adjacency matrix of G0
      • Per-sample data: τ_1, ..., τ_t
      • Decoding algorithm: to output the j-th bit of y, look up the appropriate τ_i and compute τ_i(G0)
      Suppose each τ_i can be encoded into s bits. Then
          KT(y) < O(1) + [poly(n, log t) + ts] + poly(n, log t),
      where the three terms bound |p|, |α| (fixed plus per-sample data), and T, respectively. Hence
          KT(y) ≤ ts + poly(n, log t) ≪ ts + t   (for t large).
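A sketch of the decoding algorithm, assuming each τ_i is stored as an integer index (decoded by `index_to_perm`, sketched after the next slide) and reusing `apply_perm` from earlier. Note that only the one sample containing bit j is ever reconstructed, which is what keeps the per-query time T small.

```python
def decode_bit(j, n, G0, indices):
    """Return the j-th bit of y = tau_1(G0) ... tau_t(G0), given only the
    fixed data (n, G0) and the per-sample indices of the tau_i."""
    per_sample = n * n                       # bits per adjacency matrix
    i, k = divmod(j, per_sample)             # sample index i, offset k within it
    tau = index_to_perm(indices[i], n)       # decode tau_i from its index
    H = apply_perm(G0, tau)                  # tau_i(G0)
    u, v = divmod(k, n)
    return H[u][v]
```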

  19. Rigid graphs: isomorphic case
      Lehmer Code. There is an indexing of S_n by the numbers 1, ..., n! so that the i-th permutation can be decoded from the binary representation of i in time poly(n).
      Naïve conversion to binary: KT(y) < t⌈s⌉ + poly(n, log t), which is not ≪ ts + t, since rounding s up to ⌈s⌉ can cost up to one bit per sample.
      Blocking trick: amortize the encoding overhead across samples. This yields, for some δ > 0,
          KT(y) ≤ ts + t^{1−δ} poly(n), i.e., KT(y)/t ≤ s + poly(n)/t^δ.
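A standard implementation of the Lehmer-code indexing, as a sketch; both directions run in poly(n) time (treating factorials as precomputable).

```python
from itertools import permutations
from math import factorial

def perm_to_index(perm):
    """Lexicographic rank of perm in S_n, in 0 .. n!-1, via the Lehmer code."""
    index, remaining = 0, sorted(perm)
    for x in perm:
        pos = remaining.index(x)
        index += pos * factorial(len(remaining) - 1)
        remaining.pop(pos)
    return index

def index_to_perm(index, n):
    """Decode a rank back into the corresponding permutation of 0..n-1."""
    remaining, perm = list(range(n)), []
    for k in range(n - 1, -1, -1):
        pos, index = divmod(index, factorial(k))
        perm.append(remaining.pop(pos))
    return perm

assert all(index_to_perm(perm_to_index(list(p)), 4) == list(p)
           for p in permutations(range(4)))
```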

  20. Rigid graphs: recap
      Let y = π_1(G_{r_1}) π_2(G_{r_2}) ··· π_t(G_{r_t}).
      If G0 ≅ G1, then KT(y)/t ≤ s + o(1) always holds.
      If G0 ≇ G1, then, since y consists of t independent samples from a distribution of entropy s + 1, KT(y)/t ≥ s + 1 − o(1) holds w.h.p.
      ⟹ a coRP^MKTP algorithm for GI on rigid graphs

  21. General graphs
      Assume for simplicity that there are as many distinct permutations of G0 as of G1. Let s be the entropy of a random permutation of G_i: s = log(n!/|Aut(G_i)|).
      Sample y = π_1(G_{r_1}) ··· π_t(G_{r_t}) and hope KT(y)/t behaves the same as before:
      [Diagram: KT(y)/t near s when G0 ≅ G1 and near s + 1 when G0 ≇ G1, with a threshold θ in between.]
      If G0 ≇ G1, then KT(y)/t > s + 1 − o(1) w.h.p.
      If G0 ≅ G1, then y has entropy ts; hope a similar encoding shows KT(y)/t ≤ s + o(1).

  22. General graphs
      Same setup: s = log(n!/|Aut(G_i)|), and y = π_1(G_{r_1}) ··· π_t(G_{r_t}) with KT(y)/t compared against a threshold θ between s and s + 1.
      Two complications:
      • Encoding the distinct permutations of G0 as numbers 1, ..., n! is too expensive
      • Knowing θ requires knowing |Aut(G_i)|

  23. General graphs: encoding permutations of graphs
      Indexing the various permutations of a non-rigid graph G as numbers 1, ..., n! is too expensive.
      We need to use numbers 1, ..., N, where N = n!/|Aut(G)|.
      Such a specific encoding exists, but we will see a more general-purpose substitute soon.

  24. General graphs: computing θ
      It suffices to give a probably-approximately-correct overestimator (PAC overestimator) θ̃ for θ:
      [Diagram: θ̃ lies above KT(y)/t in the isomorphic case and below KT(y)/t in the nonisomorphic case.]
      Equivalently, it suffices to give a PAC underestimator for log|Aut(G_i)|, since θ = (log n! − log|Aut(G_i)|) + 1/2.

  25. General graphs: computing θ
      Claim. There is an efficient randomized algorithm using MKTP to PAC-underestimate log|Aut(G)| when given G.
      Proof. Recall that there is a deterministic algorithm that, using an oracle for GI, computes generators for Aut(G). Plug an existing RP^MKTP algorithm in for the GI oracle: this gives generators for a group A with A = Aut(G) w.h.p.
      Prune the generators of A not in Aut(G) ⟹ A ≤ Aut(G).
      |A| can be computed efficiently from its generators. Output log|A|.
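A sketch of the pruning-and-counting step, using sympy's permutation groups for the "compute |A| from its generators" part; `candidate_gens` stands in for the generators returned by the RP^MKTP procedure, which is not implemented here.

```python
import math
from itertools import combinations
from sympy.combinatorics import Permutation, PermutationGroup

def is_automorphism(G, sigma):
    """Check that sigma preserves adjacency on every vertex pair."""
    return all(G[sigma[u]][sigma[v]] == G[u][v]
               for u, v in combinations(range(len(G)), 2))

def log_aut_underestimate(G, candidate_gens):
    """Keep only verified automorphisms, so the generated group A satisfies
    A <= Aut(G) and log|A| never overestimates log|Aut(G)|; w.h.p. the
    candidates were correct and the estimate is exact."""
    gens = [Permutation(g) for g in candidate_gens if is_automorphism(G, g)]
    if not gens:                                   # everything pruned: trivial group
        gens = [Permutation(list(range(len(G))))]
    return math.log2(PermutationGroup(gens).order())
```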

  26. General graphs: recap
      y = π_1(G_{r_1}) π_2(G_{r_2}) ··· π_t(G_{r_t}), s = log(n!/|Aut(G_i)|)
      [Diagram: θ̃ separates KT(y)/t in the isomorphic case from KT(y)/t in the nonisomorphic case.]
      Witness of nonisomorphism: KT(y)/t > θ̃
      Theorem. GI ∈ ZPP^MKTP

  27. Generic Encoding Lemma
