Mansour's Conjecture is True for Random DNF Formulas
1. Mansour's Conjecture is True for Random DNF Formulas
Adam Klivans (UT-Austin), Homin K. Lee (UT-Austin), Andrew Wan (Columbia)

2. A Conjecture [M94]
Let f be a t-term DNF formula. Then ∃ a real t^O(log 1/ε)-term polynomial p
(e.g. p(x) = 3/16·x_1x_6 − 1/8·x_3x_7x_8) s.t.
E_{x ∈ {0,1}^n}[(p(x) − f(x))^2] ≤ ε

3. Sparse Approximators [KM91]
Thm: If every f ∈ C has an s-sparse ε-approximator p, then there is a uniform-distribution MQ PAC learner for C that runs in time poly(n, s, ε^−1).
MC ⇒ PAC-learning DNF

4. The Harmonic Sieve [J94]
An MQ PAC-learner for poly(n)-term DNF formulas over the uniform distribution.
• Didn't prove MC.
• Used weak approximators + boosting.

5. …and a decade passed.

6. Sparse Approximators [GKK08]
Thm: If every f ∈ C has an s-sparse ε-approximator p, then there is a uniform-distribution MQ agnostic learner for C that runs in time poly(n, s, ε^−1).
MC ⇒ agnostic learning DNF

7. Agnostic Learning
• f an arbitrary Boolean function
• opt = min_{c ∈ C} Pr_x[c(x) ≠ f(x)]
An agnostic learner is given MQ access to f and w.h.p. outputs h s.t.
Pr_x[h(x) ≠ f(x)] ≤ opt + ε

8. Previous Results
For f a t-term DNF formula, ε-approximators p (E_{x ∈ {0,1}^n}[(p(x) − f(x))^2] ≤ ε) with:
• degree O(log(t/ε)^2) [LMN89]
• t^O(log log t · log(1/ε)) terms [M92]
• degree O(log(t/ε)) [H01]

9. Our Results
ε-approximator p with t^O(log(1/ε)) terms (E_{x ∈ {0,1}^n}[(p(x) − f(x))^2] ≤ ε) for:
• f a t-term random DNF formula
• f a t-term read-k DNF formula
(and [GKK08] gives agnostic learners)

10. Outline
1. Intro
2. How we didn't prove it.
3. How we did prove it.
   a) Read-once DNF formulas
   b) Random DNF formulas
   c) Read-k DNF formulas
4. Pseudorandomness

11. How we didn't prove Mansour's Conjecture, #1
Every f has a unique real polynomial representation with coefficients f̂(S) (the Fourier representation). Analyze the large coefficients using Håstad's random-restriction machinery [LMN89, M92, H01].

12. How we didn't prove Mansour's Conjecture, #2
Entropy-Influence Conjecture: E(f) = O(I(f)), where (using ∑_S f̂(S)^2 = 1):
E(f) := −∑_S f̂(S)^2 log(f̂(S)^2)
I(f) := ∑_S |S| f̂(S)^2
EI ⇒ MC

13. Outline
1. Intro
2. How we didn't prove it.
3. How we did prove it.
   a) Read-once DNF formulas
   b) Random DNF formulas
   c) Read-k DNF formulas
4. Pseudorandomness

14. Polynomial Interpolation
f = T_1 ∨ T_2 ∨ ⋯ ∨ T_t
Let y_f(x) = T_1 + T_2 + ⋯ + T_t (the # of terms satisfied by x).
Interpolate the values of f on {x : y_f(x) ≤ d}.
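The counting function y_f can be sketched directly. The 3-term DNF below is a hypothetical toy example, not one from the slides:

```python
from itertools import product

# Hypothetical toy example: a 3-term DNF over 5 variables.
# A literal (i, True) means x_i; (i, False) means NOT x_i.
TERMS = [
    [(0, True), (1, True)],    # x0 AND x1
    [(2, True), (3, False)],   # x2 AND NOT x3
    [(1, True), (4, True)],    # x1 AND x4
]

def term_satisfied(term, x):
    """True iff assignment x satisfies every literal in the term."""
    return all(x[i] == pos for i, pos in term)

def y_f(x):
    """y_f(x) = T_1 + ... + T_t = number of terms satisfied by x."""
    return sum(term_satisfied(t, x) for t in TERMS)

def f(x):
    """The DNF itself: f(x) = 1 iff at least one term is satisfied."""
    return int(y_f(x) >= 1)

# Interpolation set for d = 1: assignments satisfying at most one term.
d = 1
low = [x for x in product([0, 1], repeat=5) if y_f(x) <= d]
# On this set, f is determined by y_f: 0 when y_f = 0, else 1.
assert all(f(x) == min(y_f(x), 1) for x in low)
```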

15. The Polynomial
P_d(y) = ((−1)^(d+1)/d!)(y−1)(y−2)⋯(y−d) + 1
• P_d(0) = 0
• P_d(y) = 1 for y = 1, …, d
• |P_d(y)| < (y choose d) for y > d
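These properties can be checked numerically; a minimal sketch (the division is done after the exact integer product so the arithmetic stays exact, and the strict binomial bound is checked for d ≥ 2):

```python
from math import comb, factorial

def P(d, y):
    """P_d(y) = ((-1)^(d+1)/d!) (y-1)(y-2)...(y-d) + 1."""
    prod = 1
    for i in range(1, d + 1):
        prod *= (y - i)
    # prod is divisible by d!, so this float division is exact here.
    return ((-1) ** (d + 1)) * prod / factorial(d) + 1

for d in range(2, 8):
    assert P(d, 0) == 0                       # P_d(0) = 0
    for y in range(1, d + 1):
        assert P(d, y) == 1                   # P_d(y) = 1 for y = 1..d
    for y in range(d + 1, 2 * d + 10):
        assert abs(P(d, y)) < comb(y, d)      # |P_d(y)| < C(y, d) for y > d
```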

16. The Polynomial
P_d(y) = ((−1)^(d+1)/d!)(y−1)(y−2)⋯(y−d) + 1
• P_d(y_f(x)) has t^O(d) terms.
• P_d(y_f(x)) = f(x) when x satisfies at most d terms.
• Need to show that x satisfies more terms only with small probability.

17. Read-once DNF Formulas
Read-once: each variable appears at most once, e.g.
x_1 x̄_5 x_8 ∨ x_2 x_3 x̄_18 x_31 ∨ x_4 x_7
⇒ terms are satisfied independently.
How do we show that sums of independent variables are concentrated in a narrow range?

18. Chernoff Bounds
• T = ∑_{i=1}^t T_i (independent r.v.'s with T_i = 1 w.p. μ_i)
• μ = ∑_{i=1}^t μ_i = E[T]
• Can assume μ ≤ log(1/ε), or else f ≈ 1.
Chernoff: Pr[T = j] ≤ (eμ/j)^j
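The bound Pr[T = j] ≤ (eμ/j)^j can be sanity-checked against the exact Poisson-binomial distribution; the μ_i below are arbitrary illustrative values, not from the slides:

```python
from math import e

# Exact distribution of T = sum of independent Bernoulli(mu_i) via
# dynamic programming, compared against Pr[T = j] <= (e*mu/j)^j.
mus = [0.3, 0.1, 0.25, 0.05, 0.2, 0.15]
mu = sum(mus)

# dp[j] = Pr[exactly j of the T_i equal 1]
dp = [1.0]
for m in mus:
    new = [0.0] * (len(dp) + 1)
    for j, p in enumerate(dp):
        new[j] += p * (1 - m)      # T_i = 0
        new[j + 1] += p * m        # T_i = 1
    dp = new

for j in range(1, len(dp)):
    assert dp[j] <= (e * mu / j) ** j
```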

19. MC is true for RO DNFs
E_{x ∈ {0,1}^n}[(p(x) − f(x))^2] = ∑_{j=0}^t Pr[y_f(x) = j] (P_d(y_f(x)) − f(x))^2
≤ ∑_{j=d+1}^t (ed/j)^j (j choose d)^2 ≤ ε for d = O(log(1/ε)).
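A numeric sanity check of this tail sum. Hedged: the slides leave the constant in d = O(log(1/ε)) implicit; the sketch below uses the Chernoff factor (eμ/j)^j from the previous slide with μ = log(1/ε) and the concrete (assumed) choice d = 10·log(1/ε):

```python
from math import comb, e, log

eps = 0.1
mu = log(1 / eps)              # can assume mu <= log(1/eps)
d = int(10 * log(1 / eps))     # one concrete choice of d = O(log(1/eps))
t = 100

# Tail of the error sum: Chernoff factor times the squared binomial
# bound on |P_d(j)| for j > d.
total = sum((e * mu / j) ** j * comb(j, d) ** 2
            for j in range(d + 1, t + 1))
assert total <= eps
```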

20. Outline
1. Intro
2. How we didn't prove it.
3. How we did prove it.
   a) Read-once DNF formulas
   b) Random DNF formulas
   c) Read-k DNF formulas
4. Pseudorandomness

21. MC is true for random DNFs
Our model: choose each term of a t-term DNF independently from the set of all terms of length log(t).
Show that w.h.p. random DNFs behave like RO DNFs, using the method of bounded differences.

22. Outline
1. Intro
2. How we didn't prove it.
3. How we did prove it.
   a) Read-once DNF formulas
   b) Random DNF formulas
   c) Read-k DNF formulas
4. Pseudorandomness

23. Read-k DNF Formulas
Read-k: each variable appears at most k times, e.g.
x̄_1 x_5 x_8 ∨ x_1 x_2 x_3 x_4 ∨ x̄_5 x_7
Terms are no longer independent!

24. The Modified Construction
f = T_1 ∨ T_2 ∨ ⋯ ∨ T_t (terms ordered from longest to shortest)
Let z_f(x) = A_1 + A_2 + ⋯ + A_t, where A_i = T_i ∧ (∧_{j ∼ i, j < i} ¬T_j)
(the # of independent terms satisfied by x; j ∼ i means T_j shares a variable with T_i).
Interpolate the values of f on {x : z_f(x) ≤ d}.
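A minimal sketch of z_f, assuming j ∼ i means "T_j shares a variable with T_i" and that the terms come pre-sorted from longest to shortest; the read-2 formula at the bottom is hypothetical:

```python
def shares_variable(t1, t2):
    """j ~ i: the two terms mention a common variable."""
    return bool({i for i, _ in t1} & {i for i, _ in t2})

def z_f(terms, x):
    """z_f(x) = A_1 + ... + A_t, where A_i = 1 iff T_i is satisfied
    and no earlier overlapping term T_j (j ~ i, j < i) is satisfied."""
    count = 0
    for i, ti in enumerate(terms):
        if not all(x[v] == pos for v, pos in ti):
            continue
        if not any(shares_variable(ti, tj)
                   and all(x[v] == pos for v, pos in tj)
                   for tj in terms[:i]):
            count += 1
    return count

# Hypothetical read-2 example: x0 x1  OR  x1 x2  OR  x3 x4
terms = [[(0, True), (1, True)],
         [(1, True), (2, True)],
         [(3, True), (4, True)]]
x = (1, 1, 1, 1, 1)          # satisfies all three terms
assert z_f(terms, x) == 2    # term 2 overlaps term 1, so A_2 = 0
```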

25. The Polynomial
P_d(z) = ((−1)^(d+1)/d!)(z−1)(z−2)⋯(z−d) + 1
• P_d(z_f(x)) has t^O(kd) terms.
• P_d(z_f(x)) = f(x) when x satisfies ≤ d independent terms.
• Need to show that x satisfies more independent terms only with small probability.

26. Concentration for Read-k
• T_i are r.v.'s with T_i = 1 w.p. μ_i
• μ = ∑_{i=1}^t μ_i
• A = ∑_{i=1}^t A_i (A_i = T_i ∧ (∧_{j ∼ i, j < i} ¬T_j))
• Pr[A = j] ≤ ∑_{|S|=j} E[∏_{i ∈ S} T_i] ≤ (eμ/j)^j

27. Janson Bounds
• T_i are r.v.'s with T_i = 1 w.p. μ_i
• μ = ∑_{i=1}^t μ_i
• Δ = ∑_{i ∼ j} E[T_i T_j]
• Pr[T = 0] ≤ exp(−μ^2/Δ)
By Janson, we can assume μ ≤ 16^k log(1/ε), or else f ≈ 1.

28. Recap
E_{x ∈ {0,1}^n}[(p(x) − f(x))^2] ≤ ε
• ε-approximator p with t^O(log(1/ε)) terms for f a t-term random DNF formula, w.h.p.
• ε-approximator p with t^O(16^k log(1/ε)) terms for f a t-term read-k DNF formula

29. Outline
1. Intro
2. How we didn't prove it.
3. How we did prove it.
   a) Read-once DNF formulas
   b) Random DNF formulas
   c) Read-k DNF formulas
4. Pseudorandomness

30. Pseudorandomness
A distribution X φ-fools C if ∀ f ∈ C: |E[f(X)] − E[f(U)]| ≤ φ.
Seed length is the # of random bits used by X.
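A toy illustration of φ-fooling (the small DNF and the 4-point support of X are hypothetical, chosen only to make the quantities exactly computable):

```python
from itertools import product

def f(x):
    """A tiny DNF: x0 x1  OR  x2 x3."""
    return int((x[0] and x[1]) or (x[2] and x[3]))

# Acceptance probability under the uniform distribution U on {0,1}^4.
uniform = list(product([0, 1], repeat=4))
E_uniform = sum(f(x) for x in uniform) / len(uniform)

# X: uniform over a 4-point support, i.e. a 2-bit "seed" expanded to 4 bits.
support = [(0, 0, 1, 1), (1, 1, 0, 0), (1, 0, 1, 0), (0, 1, 0, 1)]
E_X = sum(f(x) for x in support) / len(support)

# X phi-fools f for any phi at least this large.
phi = abs(E_X - E_uniform)
print(phi)
```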

31. PRGs against DNFs
Seed length for pseudorandom generators against t-term DNF formulas:
• O(log^4(tn/φ)) [LVW93]
• O(log(n) · log^2(t/φ)) [B07]
• O(log(n) + log^2(t/φ) · log log(t/φ)) [DETT10]

32. The Sandwich Bound
If ∃ s(φ)-sparse g and h s.t. ∀x, g(x) ≤ f(x) ≤ h(x), with
E[h(x) − f(x)] ≤ φ and E[f(x) − g(x)] ≤ φ,
then ∃ a distribution that φ-fools f with seed length O(log n + log s(φ)) [B07, DETT10].

33. The Polynomial
P_d(y) = ((−1)^(d+1)/d!)(y−1)(y−2)⋯(y−d) + 1
• P_d(0) = 0
• P_d(y) = 1 for y = 1, …, d
• |P_d(y)| < (y choose d) for y > d
• P_d(y) > 1 for y > d, d odd
• P_d(y) ≤ 0 for y > d, d even
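The parity behavior (which is what makes P_d a one-sided, sandwiching approximator) can be verified numerically; note the boundary case y = d + 1 for even d, where P_d is exactly 0 rather than strictly negative:

```python
from math import factorial

def P(d, y):
    """P_d(y) = ((-1)^(d+1)/d!) (y-1)(y-2)...(y-d) + 1."""
    prod = 1
    for i in range(1, d + 1):
        prod *= (y - i)
    # prod is divisible by d!, so this float division is exact here.
    return ((-1) ** (d + 1)) * prod / factorial(d) + 1

for d in (3, 5, 7):          # d odd: P_d(y) > 1 beyond d
    assert all(P(d, y) > 1 for y in range(d + 1, d + 20))
for d in (2, 4, 6):          # d even: P_d(y) <= 0 beyond d
    assert P(d, d + 1) == 0  # boundary case: exactly 0
    assert all(P(d, y) < 0 for y in range(d + 2, d + 20))
```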

34. PRGs against DNFs
• t-term random DNFs are fooled by PRGs with seed length O(log(n) + log(t) · log(1/φ)), w.h.p.
• t-term read-k DNFs are fooled by PRGs with seed length O(log(n) + log(t) · 16^k · log(1/φ)).
([DETT10] showed O(log(n) + log(t) · log(1/φ)) for RO DNFs.)

35. Open Problems
• Prove Mansour's Conjecture for all t-term DNF formulas.
• Show PRGs against DNFs with seed length O(log(t) · log(1/φ)).

36. The End
