proofs of proximity for distribution testing

(Distribution testing, now with proofs!)
Tom Gur (UC Berkeley), October 14, 2017
FOCS 2017 Workshop: Frontiers in Distribution Testing
Joint work with Alessandro Chiesa (UC Berkeley)


Same problem, different object ...and access ...and distance. Does it really make a difference?

The setting:
∙ Known domain (here [n] = {1, . . . , n})
∙ x is now a distribution, let's call it D ∈ ∆([n])
∙ Property Π ⊆ ∆([n]), proximity parameter ε ∈ (0, 1]
∙ Sample access to D
∙ Decide with high probability: Is D ∈ Π, or δ_TV(D, Π) > ε?
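The total-variation distance used above can be sketched as a small helper (our illustrative code, with distributions represented as probability vectors over [n]; the name `tv_distance` is ours):

```python
def tv_distance(p, q):
    """Total variation distance between two distributions over [n],
    given as equal-length probability vectors:
    delta_TV(p, q) = (1/2) * sum_i |p[i] - q[i]|."""
    assert len(p) == len(q)
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))
```

For instance, two distributions with disjoint supports are at the maximal distance 1.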

different types of proofs

NP distribution testers: a deterministic algorithm T with
∙ sample access to D ∈ ∆([n])
∙ explicit access to ε > 0 and a proof π
such that:
* For every D ∈ Π, there exists a proof π s.t. T^D(ε, π) = 1
* For every D with δ_TV(D, Π) > ε and any "proof" π, Pr[T^D(ε, π) = 0] ≥ 2/3

MA distribution testers: NP distribution testers that are allowed to toss coins

IP distribution testers: MA distribution testers that interact with a prover

some questions

This is all very nice, but:
∙ Are proof-augmented testers stronger than standard testers?
∙ If so, to what extent? Polynomially better? Exponentially better? Large classes?
∙ What are the most important resources? Randomness? Interaction? Private coins?
∙ What can and cannot be achieved with each proof system?

functions vs distributions

first example

support size

Consider the support size problem:
SuppSize_{≤ n/2} = { D ∈ ∆([n]) : |supp(D)| ≤ n/2 }

This is a hard problem (it requires Ω(n / log(n)) samples [Val11])
...unless a prover is giving us support! Or rather, a prover is specifying supp(D)...

Then we only need O(1/ε) samples to detect whether D is ε-far from SuppSize_{≤ k}

Caveat: this requires a long proof (O(n log n) bits)
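The proof-aided test above can be written out as follows (a hedged sketch: the function names and the exact sample count are our illustrative choices, not from the talk). The key point is soundness: if D is ε-far from every distribution of support size at most k, then D places more than ε mass outside any claimed support set of size at most k, so O(1/ε) samples catch a stray element with constant probability.

```python
def np_support_tester(draw, proof_support, eps, k):
    """Proof-aided tester for SuppSize_{<=k} (sketch).
    `proof_support`: the set the prover claims equals supp(D).
    `draw()`: returns one sample from D.
    Reject if the claimed support is too large, or if any of
    O(1/eps) samples lands outside it."""
    if len(proof_support) > k:
        return False  # the claimed support is already too large
    num_samples = int(2 / eps) + 1  # O(1/eps); the constant is illustrative
    return all(draw() in proof_support for _ in range(num_samples))
```

Note the one-sided flavor: a sample outside the claimed support is conclusive evidence against the prover, whereas acceptance only certifies closeness.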

on long proofs – functions

The proof length is a key complexity measure for proofs of proximity.

For functions, linear-length proofs completely trivialize the model! Why?

(How to check that a function f has property Π.)
The tester has explicit access to the proof π.
If π = f, it can directly check whether π ∈ Π.
Hence, it boils down to testing that f is identical to π,
which can easily be done using O(1/ε) queries... for functions.
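The O(1/ε) identity test can be sketched like this (hypothetical names, and the constant in the query count is illustrative): compare f to the claimed truth table π on a few uniformly random coordinates.

```python
import random

def close_to_proof(f, pi, n, eps):
    """Sketch: check that f: [n] -> values agrees with the proof pi
    (a full truth table) by querying O(1/eps) uniform random points.
    If f disagrees with pi on more than an eps fraction of [n],
    some query witnesses a mismatch with constant probability."""
    num_queries = int(2 / eps) + 1  # O(1/eps)
    for _ in range(num_queries):
        i = random.randrange(n)
        if f(i) != pi[i]:
            return False  # a witnessed disagreement
    return True
```

This is exactly the step that has no cheap analogue for distributions, which is the point of the next slide.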

on long proofs – distributions

For distribution testing, testing identity is much harder: O(√n / ε²)

...or even O(‖D^{−ε/16}_{−max}‖_{2/3}) [VV17], where ‖·‖_{2/3} denotes the ℓ_{2/3} quasi-norm, and D^{−ε/16}_{−max} is the distribution obtained by removing the maximal element of D as well as removing a maximal set of elements of total mass ε/16

...or perhaps O(κ_D^{−1}(1 − cε)) [BCG17], where c > 0 is a constant, and κ_D is the K-functional between ℓ₁ and ℓ₂ with respect to the distribution D

But wait, how can the proof fully describe the distribution? The description of D ∈ ∆([n]) may be very large (even infinite...)

Luckily, it suffices to send a granular approximation D_approx of D

What if D ∈ Π, but D_approx is close to, yet not in, Π? We can use a tolerant tester to make sure it rules the same
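The granular-approximation step can be sketched as follows (our illustrative rounding scheme, not necessarily the one in the talk): round every probability down to a multiple of some granularity g, say g = ε/(2n), and return the leftover mass to one element. Each entry moves by less than g and the leftover is less than n·g, so δ_TV(D, D_approx) ≤ n·g, and each entry needs only log(1/g) bits to describe.

```python
def granular_approx(D, g):
    """Round each probability of D down to a multiple of g, then
    return the leftover mass to the first element so the result is
    again a distribution. Since each entry moves by < g and the
    leftover is < n*g, delta_TV(D, approx) <= n*g."""
    approx = [g * int(p / g) for p in D]
    approx[0] += 1.0 - sum(approx)
    return approx
```

With g = ε/(2n) this gives the claimed O(n log(n/ε))-bit description at TV cost at most ε/2, which is why a long but finite proof suffices even though D itself may need unboundedly many bits.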

functions vs distributions

what about short proofs?

So far we saw that:
∙ any property can be tested using O(√n / ε²)-ish samples
∙ there exists a (hard) property that is testable via O(1/ε) samples

Can we get significant savings via short (sublinear) proofs? Yes!

Theorem: There exists a property that requires Ω̃(√n) samples to test, yet, for every β, given a proof of length Õ(n^β) it can be tested using O(n^{1−β}) samples.

But can we do better? Not much... (not without interaction)

limitations of non-interactive proofs of proximity

Lemma: For every Π and every MA distribution tester for Π with proof length p and sample complexity s, it holds that p · s = Ω(SAMP(Π)).

The idea: Distribution testers are not only non-adaptive w.r.t. the samples, but also w.r.t. the proof. Thus, testers can emulate all possible proofs, reusing the samples! Since there are 2^p possible proofs, we need to amplify the soundness to ensure that no error occurs w.h.p. To this end, we invoke the tester O(p) times, increasing the sample complexity to O(p · s).
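The emulation argument above can be sketched as a toy harness (the names `tester` and `draw_batch` and the majority threshold are our illustrative choices): one pool of O(p) sample batches is reused across all 2^p candidate proofs, and the proof-free tester accepts iff some proof makes the amplified tester accept.

```python
import itertools

def emulate_without_proof(tester, draw_batch, batch_size, p, reps):
    """Sketch of the emulation argument. An MA tester reads the proof
    non-adaptively, so a plain tester can draw one pool of sample
    batches, run a soundness-amplified version of `tester` (majority
    vote over `reps` = O(p) independent batches) against each of the
    2^p candidate proofs, and accept iff some proof is accepted."""
    pool = [draw_batch(batch_size) for _ in range(reps)]  # reused for all proofs
    for proof in itertools.product([0, 1], repeat=p):
        votes = sum(tester(batch, proof) for batch in pool)
        if 2 * votes > reps:  # majority of amplified runs accept this proof
            return True
    return False
```

The total sample count is reps · batch_size = O(p · s), matching the lemma: if the resulting proof-free tester were correct with fewer samples than SAMP(Π), the lower bound on plain testing would be violated.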
