The Probabilistic Method — PowerPoint PPT Presentation

SLIDE 1

The Probabilistic Method

Topics on Randomized Computation, Spring Semester
Co.Re.Lab., N.T.U.A.

SLIDE 2

Overview

In the first part we will see simple methods (basically through examples):

1. The counting method

2. The first moment method

3. The deletion method

4. The second moment method

5. Derandomization with conditional probabilities

The second part is THE part(y):

1. The general Lovász Local Lemma

2. Other (usual and helpful) forms of the LLL

3. A constructive proof of the LLL

SLIDE 3

Counting Expanders

We will rely on the principle: $\Pr(Q(x)) > 0 \Rightarrow \exists x\, Q(x)$.

Definition: An (n,d,a,c) OR-concentrator is a bipartite multigraph G(L,R,E), with |L| = |R| = n, such that:

  • Each vertex in L has degree at most d.
  • For any $S \subseteq L$ with $|S| \le a \cdot n$, we have $|N(S)| \ge c\,|S|$.

Theorem: There is an integer $n_0$ such that for all $n > n_0$ there is an (n,18,1/3,2) OR-concentrator.

We will choose a random graph from a suitable probability space and show that it has positive probability of being an (n,18,1/3,2) OR-concentrator.

SLIDE 4

Counting Expanders

Proof: Our random bipartite graph will have

  • Vertex set $V = L \cup R$.
  • Each $v \in L$ "chooses" d times a neighbor (in R) uniformly at random (multiple edges become one edge).

Let $E_s$ be the event that some subset of s vertices of L has fewer than cs neighbors. We will bound $\Pr[E_s]$ and then sum up over all values $s \le an$ to get a bound on the probability of failure.

Fix an $S \subseteq L$ of size s and a $T \subseteq R$ of size cs.

SLIDE 5

Counting Expanders

  • There are $\binom{n}{s}$ ways of choosing S.
  • There are $\binom{n}{cs}$ ways of choosing T.
  • The probability that T contains all neighbors of S is $\left(\frac{cs}{n}\right)^{ds}$.

Thus

$$\Pr[E_s] \le \binom{n}{s}\binom{n}{cs}\left(\frac{cs}{n}\right)^{ds}.$$

Simplifying for a = 1/3, c = 2, d = 18 and using $\binom{n}{k} \le \left(\frac{ne}{k}\right)^k$ we get

$$\Pr[E_s] \le \left(\frac{ne}{s}\right)^{s}\left(\frac{ne}{cs}\right)^{cs}\left(\frac{cs}{n}\right)^{ds} = \left[e^{1+c}\, c^{\,d-c}\left(\frac{s}{n}\right)^{d-c-1}\right]^{s} \le \left[e^{3}\, 2^{16}\left(\frac{1}{3}\right)^{15}\right]^{s} \le \left(\frac{1}{2}\right)^{s},$$

using $s \le an = n/3$ in the last two steps. Summing up we get

$$\Pr[\text{failure}] \le \sum_{s \le an} \Pr[E_s] < 1.$$
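The union bound above is easy to evaluate numerically. Below is a small sketch (the function name and the sample size n = 60 are my own illustrative choices, not from the slides):

```python
from math import comb

def failure_bound(n, d=18, a=1/3, c=2):
    """Union bound on the probability that the random bipartite multigraph
    (each left vertex picks d uniform right neighbors) fails to be an
    (n, d, a, c) OR-concentrator: sum over s <= a*n of
    C(n, s) * C(n, c*s) * (c*s / n)**(d*s)."""
    total = 0.0
    for s in range(1, int(a * n) + 1):
        total += comb(n, s) * comb(n, c * s) * (c * s / n) ** (d * s)
    return total

# For moderately large n the bound is already far below 1, so a random
# graph is an (n, 18, 1/3, 2) OR-concentrator with positive probability.
print(failure_bound(60))
```

The dominant term is s = 1; the geometric decay in s makes the whole sum tiny well before the asymptotic regime.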

SLIDE 6

The First Moment Method

1. First we design a "thought experiment" in which a random process plays a role.

2. We analyze the random experiment and draw a conclusion using the first moment principle:

$$E[X] \le t \;\Rightarrow\; \Pr(X \le t) > 0.$$

SLIDE 7

Example 1

Theorem: For any undirected graph G(V,E) with n vertices and m edges there is a partition of the vertex set into two sets A, B such that

$$|\{\{u,v\} \in E \mid u \in A \wedge v \in B\}| \ge \frac{m}{2}.$$

Proof:

  • Assign each vertex independently and equiprobably to either A or B.
  • Let $X_{\{u,v\}} = 1$ when $\{u,v\}$ has endpoints in different sets and $X_{\{u,v\}} = 0$ otherwise:

$$\Pr[X_{\{u,v\}} = 1] = 1/2 \;\Rightarrow\; E[X_{\{u,v\}}] = 1/2.$$

  • By linearity of expectation:

$$E[|\text{separated edges}|] = \sum_{\{u,v\} \in E} E[X_{\{u,v\}}] = \frac{m}{2}.$$
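The thought experiment above is easy to simulate. A minimal sketch (the choice of $K_6$ as the test graph and all helper names are mine, for illustration only):

```python
import random

def random_cut(edges, vertices):
    """Sample a uniform random partition (A, B) and count crossing edges."""
    side = {v: random.random() < 0.5 for v in vertices}
    return sum(1 for u, v in edges if side[u] != side[v])

# Average over many trials on a small graph; the empirical mean of the
# cut size should sit near m/2, matching E[cut] = m/2.
random.seed(0)
vertices = range(6)
edges = [(u, v) for u in vertices for v in vertices if u < v]  # K6, m = 15
trials = 20000
avg = sum(random_cut(edges, vertices) for _ in range(trials)) / trials
print(avg)  # close to 7.5
```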

SLIDE 8

Example 2

Theorem: For any set of m clauses there is a truth assignment that satisfies at least m/2 clauses. (A clause is $(x_1 \vee \neg x_2 \vee x_3 \vee \dots \vee x_k)$.)

Proof:

  • Independently set each variable TRUE or FALSE equiprobably.
  • For each clause let $Z_i = 1$ if the i-th clause is satisfied and $Z_i = 0$ otherwise.
  • If the i-th clause has k literals: $\Pr(Z_i = 1) = 1 - 2^{-k}$.
  • For every clause: $E[Z_i] \ge 1/2$.
  • The expected number of satisfied clauses is

$$E\left[\sum_{i=1}^{m} Z_i\right] = \sum_{i=1}^{m} E[Z_i] \ge \frac{m}{2}.$$
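The same experiment for clauses, as a hedged sketch (the clause encoding — a clause is a list of signed variable indices, +v for $x_v$ and −v for $\neg x_v$ — and the example formula are my own, not from the slides):

```python
import random

def satisfied(clauses, assignment):
    """Count clauses satisfied by an assignment. A clause is a list of
    literals: +v means variable v, -v means its negation."""
    return sum(
        any(assignment[abs(l)] == (l > 0) for l in clause)
        for clause in clauses
    )

# An illustrative formula over variables 1..3, m = 4 clauses.
clauses = [[1, -2], [2, 3], [-1, -3], [1, 2, 3]]
random.seed(1)
trials = 20000
avg = sum(
    satisfied(clauses, {v: random.random() < 0.5 for v in (1, 2, 3)})
    for _ in range(trials)
) / trials
print(avg)  # at least m/2 = 2 in expectation
```

For this formula the exact expectation is $3 \cdot \frac{3}{4} + \frac{7}{8} = 3.125$, comfortably above m/2.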

SLIDE 9

Example 3

Theorem: Any instance of k-SAT with $m < 2^k$ clauses is satisfiable.

Proof:

  • Independently set each variable TRUE or FALSE equiprobably.
  • For each clause let $Z_i = 0$ if the i-th clause is satisfied and $Z_i = 1$ otherwise.
  • For every clause: $\Pr(Z_i = 1) = 2^{-k}$, so $E[Z_i] = 2^{-k}$.
  • The expected number of unsatisfied clauses is

$$E\left[\sum_{i=1}^{m} Z_i\right] = \sum_{i=1}^{m} E[Z_i] = m\,2^{-k} < 1.$$

Since the number of unsatisfied clauses is an integer with expectation below 1, some assignment leaves no clause unsatisfied.

SLIDE 10

The Deletion Method

(the "sample and modify" method)

We want to prove that a combinatorial object F exists.

1. First we show that there exists an F' very "close" to F.

2. Then we change F' to F and show that the probability of existence remains positive.

SLIDE 11

Turán's Theorem*

Theorem: Let G(V,E) be a graph. If |V| = n and |E| = nk/2, then

$$a(G) \ge \frac{n}{2k}.$$

Proof: Using probabilistic arguments we will prove the existence of a subset that has many more vertices than edges. Deleting vertices incident to these edges we get an independent set.

Let S be a subset of V containing each vertex independently with probability p (to be fixed later). We have $E[|S|] = np$.

Let G' be the subgraph induced by S. For every $e \in E$ define $Y_e = 1$ if $e \in E(G')$ and $Y_e = 0$ otherwise. Then:

$$E[Y_e] = p^2.$$

SLIDE 12

Turán's Theorem*

Let Y = |E(G')| be the number of edges in the induced subgraph. Then:

$$E[Y] = E\left[\sum_{e \in E} Y_e\right] = \sum_{e \in E} E[Y_e] = \frac{nk}{2}\,p^2.$$

Deletion time: We drop all edges (by deleting one endpoint of each) from G' and we get an independent set S*. We have:

$$E[|S^*|] \ge E[|S| - Y] = E[|S|] - E[Y] = np - \frac{nk}{2}\,p^2.$$

We fix p to maximize this expression. It's a parabola, which attains its maximum at p = 1/k, and so

$$E[|S^*|] \ge \frac{n}{2k}.$$
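The sample-and-modify step can be run as an experiment. This is a sketch under assumptions of my own (a small 4-regular circulant graph, p = 1/k, and the helper name are illustrative, not from the slides):

```python
import random

def sample_and_modify(n, edges, p, trials=2000, seed=0):
    """Turán-style deletion argument as an experiment: sample S by keeping
    each vertex with probability p, then delete one endpoint of every edge
    still inside S. Returns the average size of the resulting independent set."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        s = {v for v in range(n) if rng.random() < p}
        for u, v in edges:
            if u in s and v in s:
                s.discard(v)  # break the edge; s is independent afterwards
        total += len(s)
    return total / trials

# A 4-regular graph on n = 12 vertices (circulant, offsets 1 and 2):
# m = nk/2 with k = 4, so with p = 1/k the bound predicts an independent
# set of expected size at least n/(2k) = 1.5.
n = 12
edges = [(v, (v + d) % n) for v in range(n) for d in (1, 2)]
avg_size = sample_and_modify(n, edges, p=1 / 4)
print(avg_size)
```

Each trial removes at most one vertex per internal edge, so the per-trial set size is at least |S| − Y, matching the expectation computed above.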

SLIDE 13

Erdős's Theorem

Definitions:

1. The chromatic number, x(G), of a graph G is the minimum number of colors needed to color the vertices of G such that adjacent vertices have different colors.

2. By a(G) we denote the cardinality of a maximum independent set of G.

3. The girth, g(G), of a graph G is the length of a shortest cycle in G.

Theorem: For any naturals k, l there exists a graph G such that $x(G) \ge k$ and $g(G) \ge l$.

We will find a graph that has

  • Small a(G)
  • Not many "bad" cycles of length < l (these will be destroyed!)

In the end we'll use $|V(G)| \le x(G)\,a(G)$ and "force" x(G) to get big.

SLIDE 14

Erdős's Theorem

Proof: We choose a random graph $G_{n,p}$ (n vertices, each edge chosen independently with probability p).

  • Small p gives large independent sets and thus small chromatic number.
  • Large p gives small cycles.

Let $p = n^{\theta - 1}$; we'll fix θ later.

Let $(b_1, \dots, b_c)$ be an ordered sequence of distinct vertices. The probability of $b_1 b_2 \cdots b_c b_1$ being a cycle is $p^c$. Let $Y_B = 1$ when this happens and $Y_B = 0$ otherwise.

For a subset $B' = \{b_1, \dots, b_c\}$ of V there are c! ways to form ordered sequences with the vertices of B' (each cycle arises from 2c of them), and there are $\binom{n}{c}$ ways of choosing B'. Let $X_c$ be the number of cycles of length c in G. Combining, we get:

$$E[X_c] = \sum_{B} E[Y_B] = \binom{n}{c}\frac{(c-1)!}{2}\,p^c.$$

SLIDE 15

Erdős's Theorem

Let X be the number of cycles of length no greater than l:

$$E[X] = \sum_{i=3}^{l} E[X_i] = \sum_{i=3}^{l}\binom{n}{i}\frac{(i-1)!}{2}\,p^i = \sum_{i=3}^{l}\frac{n!}{(n-i)!\,2i}\,p^i \le \sum_{i=3}^{l}\frac{n^i p^i}{2i} = \sum_{i=3}^{l}\frac{n^{\theta i}}{2i}.$$

By Markov's inequality:

$$\Pr(X \ge n/2) \le \frac{E[X]}{n/2} \le \sum_{i=3}^{l}\frac{n^{\theta i - 1}}{i} \le (l-2)\,n^{\theta l - 1}.$$

Fixing θ < 1/l we get: $\Pr(X \ge n/2) \xrightarrow{\;n \to \infty\;} 0$.

Let Y be the number of independent sets of size y (to be fixed later) in G. By Markov's inequality:

$$\Pr(a(G) \ge y) = \Pr(Y \ge 1) \le E[Y] = \binom{n}{y}(1-p)^{\binom{y}{2}} < \left(n\,e^{-p(y-1)/2}\right)^{y}.$$

Now let $y = \left\lceil \frac{3}{p}\ln n \right\rceil$. We get:

$$\Pr(a(G) \ge y) \le \left(n\,e^{p/2}\,e^{-py/2}\right)^{y} \le \left(e^{p/2}\,n^{-1/2}\right)^{y}.$$

So $\Pr(a(G) \ge y) \xrightarrow{\;n \to \infty\;} 0$.

SLIDE 16

Erdős's Theorem

By taking n large enough we make both events

  • $X \ge n/2$ and
  • $a(G) \ge y$

have probability < 1/2. So there is a G such that $X < n/2$ and $a(G) < y$.

Deletion time: We remove one vertex from each of the at most n/2 "bad" cycles (of length < l). Thus we get a G' with $g(G') \ge l$, more than n/2 vertices, and $a(G') \le a(G)$.

Putting it all together:

$$x(G') \ge \frac{|V(G')|}{a(G')} \ge \frac{n/2}{\frac{3}{p}\ln n} = \frac{n^{\theta}}{6\ln n} \ge k$$

for large enough n. G' is our graph.

SLIDE 17

The Second Moment Method

  • Method based on Chebyshev's inequality: reaching conclusions using concentration results.

$$\Pr(|X - E[X]| \ge t) \le \frac{\operatorname{var}[X]}{t^2}$$

  • Useful tool for determining the threshold function of an event A:
  • Below the threshold, Pr(A) tends to 0.
  • Above it, Pr(A) tends to 1.

SLIDE 18

Distinct Sums

Let $A = \{a_1, a_2, \dots, a_k\}$. Define $S(A) = \{s(I) : I \subseteq A\}$, where s(I) is the sum of the elements of I.

Question: How large can a subset of {1,…,n} with distinct sums be?

One of size $k = \lfloor \log n \rfloor + 1$ is $A = \{2^{i-1} \mid i = 1, \dots, k\}$.

On the other hand every sum is at most kn, and so

$$2^k \le kn \;\Rightarrow\; k \le \log n + \log\log n + O(1).$$

Theorem: If $A \subset \{1, \dots, n\}$ has distinct sums then

$$|A| \le \log n + \tfrac{1}{2}\log\log n + O(1).$$

SLIDE 19

Distinct Sums

Proof: To get an A "close" to the upper bound we need

  • S(A) "close" to {1,…,kn}
  • The sums of the subsets of A to be spread evenly.

Using Chebyshev's inequality we'll prove that most of the sums are around the middle.

Picking at random a sum from S(A) is equivalent to picking a random subset I of A and then computing its sum.

Let $A = \{a_1, \dots, a_k\}$ and let $X_i = 1 \Leftrightarrow a_i \in I$. Let X = s(I). We have

$$E[X] = \sum_{i=1}^{k} a_i\,E[X_i] = \frac{1}{2}\sum_{i=1}^{k} a_i.$$

SLIDE 20

Distinct Sums

Var(X):

$$E[X^2] = E\left[\Big(\sum_{i} a_i X_i\Big)^2\right] = \frac{1}{2}\sum_{i=1}^{k} a_i^2 + \frac{1}{4}\sum_{i \ne j} a_i a_j \;\Rightarrow\; \operatorname{var}[X] = E[X^2] - E[X]^2 = \frac{1}{4}\sum_{i=1}^{k} a_i^2 \le \frac{n^2 k}{4}.$$

By Chebyshev's inequality:

$$\Pr\left(|X - E[X]| \ge 2\sqrt{\operatorname{var}[X]}\right) \le \frac{1}{4} \;\Rightarrow\; \Pr\left(|X - E[X]| \ge n\sqrt{k}\right) \le \frac{1}{4}.$$

Thus at least ¾ of the sums are inside an interval of length $2n\sqrt{k}$. Since the sums are distinct, we therefore get

$$\frac{3}{4}\,2^k \le 2n\sqrt{k} \;\Rightarrow\; k \le \log n + \tfrac{1}{2}\log\log n + O(1).$$
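A quick numeric illustration of both halves of the argument (the choice n = 64 and the helper name are mine, for illustration):

```python
def subset_sums(a):
    """All 2^k subset sums of the list a, via the doubling trick."""
    sums = [0]
    for x in a:
        sums += [s + x for s in sums]
    return sums

# Powers of two up to n have pairwise-distinct subset sums (binary
# representation), matching the k = floor(log n) + 1 example.
n = 64
a = [2 ** i for i in range(7)]  # {1, 2, ..., 64}, k = 7
sums = subset_sums(a)

# Chebyshev step: at least 3/4 of the sums lie within n*sqrt(k)
# of the mean sum.
k = len(a)
mean = sum(a) / 2
inside = sum(1 for s in sums if abs(s - mean) <= n * k ** 0.5)
print(inside / len(sums))
```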

SLIDE 21

Threshold for Clique

Theorem: Let $G_{n,p}$ be a random graph and K the number of cliques with 4 vertices.

  • If $p = o(n^{-2/3})$ then $\Pr(K \ge 1) \xrightarrow{\;n \to \infty\;} 0$.
  • If $p = \omega(n^{-2/3})$ then $\Pr(K \ge 1) \xrightarrow{\;n \to \infty\;} 1$.

Proof: Let $C_1, \dots, C_t$, $t = \binom{n}{4}$, be the enumeration of the 4-tuples of vertices, and let $X_i = 1$ when $C_i$ induces a 4-clique and $X_i = 0$ otherwise. Then

$$K = \sum_{i=1}^{t} X_i, \qquad E[K] = \binom{n}{4}p^6 \approx \frac{n^4 p^6}{24}.$$

  • In the first case $E[K] \to 0$, so

$$\Pr(K \ge 1) \le E[K] \xrightarrow{\;n \to \infty\;} 0.$$

  • Unfortunately, in the second case we only get $E[K] \to \infty$, which by itself does not imply $\Pr(K \ge 1) \to 1$. Here Chebyshev's inequality proves useful. After bounding Var[K] we can use the fact:

$$\Pr(K = 0) \le \Pr\big(|K - E[K]| \ge E[K]\big) \le \frac{\operatorname{Var}[K]}{(E[K])^2}.$$

SLIDE 22

Threshold for Clique

To compute $\operatorname{Var}[K] = E[K^2] - E[K]^2$:

$$E[K]^2 = \left(\sum_{i=1}^{t} E[X_i]\right)^2 = \sum_{i=1}^{t} E[X_i]^2 + \sum_{i \ne j} E[X_i]E[X_j]$$

$$E[K^2] = E\left[\Big(\sum_{i=1}^{t} X_i\Big)^2\right] = \sum_{i=1}^{t} E[X_i^2] + \sum_{i \ne j} E[X_i X_j]$$

1. If $|C_i \cap C_j| \le 1$ then $X_i, X_j$ are independent, so $E[X_i X_j] = E[X_i]E[X_j]$.

2. If $|C_i \cap C_j| = 2$ the two cliques share one edge, so $E[X_i X_j] = p^5 \cdot p^5 \cdot p = p^{11}$. We count $\binom{n}{4}\binom{4}{2}\binom{n-4}{2}$ such instances.

3. If $|C_i \cap C_j| = 3$ the two cliques share a triangle, so $E[X_i X_j] = p^3 \cdot p^3 \cdot p^3 = p^{9}$. We count $\binom{n}{4}\binom{4}{3}\binom{n-4}{1}$ such instances.

Thus

$$\operatorname{Var}[K] = \sum_{i=1}^{t}\left(E[X_i^2] - E[X_i]^2\right) + \sum_{i \ne j}\left(E[X_i X_j] - E[X_i]E[X_j]\right) \le E[K] + \binom{n}{4}\binom{4}{2}\binom{n-4}{2}p^{11} + \binom{n}{4}\binom{4}{3}\binom{n-4}{1}p^{9},$$

so with $p = \omega(n^{-2/3})$ all three terms are $o\!\left((E[K])^2\right)$. Finally:

$$\lim_{n \to \infty}\Pr(K = 0) \le \lim_{n \to \infty}\frac{\operatorname{Var}[K]}{(E[K])^2} = 0.$$
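A Monte Carlo sketch of the threshold behaviour. The sample points p = 0.05 and p = 0.5 sit on either side of $n^{-2/3} \approx 0.14$ for n = 20; all names and parameters here are illustrative choices of mine:

```python
import random
from itertools import combinations

def count_k4(n, p, rng):
    """Sample G(n, p) and count the 4-cliques it contains."""
    edge = {e: rng.random() < p for e in combinations(range(n), 2)}
    return sum(
        all(edge[pair] for pair in combinations(c, 2))
        for c in combinations(range(n), 4)
    )

rng = random.Random(42)
n, trials = 20, 200
frac_low = sum(count_k4(n, 0.05, rng) >= 1 for _ in range(trials)) / trials
frac_high = sum(count_k4(n, 0.5, rng) >= 1 for _ in range(trials)) / trials
print(frac_low, frac_high)  # near 0 below the threshold, near 1 above it
```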

SLIDE 23

Derandomizing

Let F be a boolean formula in CNF with variables $x_1, \dots, x_n$. Set each $x_i$ = True or False equiprobably and let X denote the number of unsatisfied clauses.

Suppose that E[X] < 1 (e.g., a k-SAT instance with fewer than $2^k$ clauses), so there is a truth assignment that satisfies the formula.

Derandomize..:

  • Set $x_1$ = True, simplify F and compute $E[X \mid x_1 = \text{True}]$.
  • Set $x_1$ = False, simplify F and compute $E[X \mid x_1 = \text{False}]$.

Since E[X] is the average of these two values, it is $E[X \mid x_1 = \text{True}] < 1$ or $E[X \mid x_1 = \text{False}] < 1$. Keep a value of $x_1$ that keeps $E[X \mid x_1] < 1$.

Repeat for all variables and you get $E[X \mid x_1, \dots, x_n] < 1$. With no randomness left, X < 1 forces X = 0: the values of $x_1, \dots, x_n$ form the satisfying truth assignment.
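The walk above can be implemented directly. A sketch, assuming a clause encoding I chose for illustration (a clause is a list of signed variable indices; all helper names are mine):

```python
def expected_unsat(clauses, fixed):
    """E[#unsatisfied clauses] when the variables in `fixed` are set and
    the rest are uniform random. A clause already satisfied contributes 0;
    one with r free literals left contributes 2**-r."""
    total = 0.0
    for clause in clauses:
        if any(fixed.get(abs(l)) == (l > 0) for l in clause):
            continue  # already satisfied by the fixed variables
        free = sum(1 for l in clause if abs(l) not in fixed)
        total += 0.5 ** free  # all remaining literals must come up false
    return total

def derandomize(clauses, variables):
    """Method of conditional expectations: fix variables one by one,
    always keeping E[#unsatisfied] below its current value."""
    fixed = {}
    for v in variables:
        e_true = expected_unsat(clauses, {**fixed, v: True})
        e_false = expected_unsat(clauses, {**fixed, v: False})
        fixed[v] = e_true <= e_false
    return fixed

# A 3-SAT instance with m = 7 < 2^3 clauses, so E[X] = 7/8 < 1 and the
# greedy walk must end at a satisfying assignment.
clauses = [[1, 2, 3], [-1, 2, 3], [1, -2, 3], [1, 2, -3],
           [-1, -2, 3], [-1, 2, -3], [1, -2, -3]]
assignment = derandomize(clauses, [1, 2, 3])
print(assignment)
```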

SLIDE 24

Conditional Probabilities

Generalizing the previous technique we get the "method of conditional probabilities". In general it goes like this:

  • X is a random variable determined by a sequence of random trials $T_1, \dots, T_n$.
  • We want to find a set of outcomes such that $X \le E[X]$.
  • There must be a $t_1 : E[X \mid T_1 = t_1] \le E[X]$. We find it.
  • We repeat to find the outcome

$$t_i : E[X \mid T_1 = t_1, \dots, T_{i-1} = t_{i-1}, T_i = t_i] \le E[X \mid T_1 = t_1, \dots, T_{i-1} = t_{i-1}] \le E[X].$$

  • At the end we get $E[X \mid T_1 = t_1, \dots, T_n = t_n] \le E[X]$. But there is no randomness left, thus we have determined a desired set of outcomes for which $X \le E[X]$.

In order to succeed we need:

1. A "small" number of trials.

2. The computations for determining $t_i$ to be carried out efficiently.

SLIDE 25

Max-cut

Theorem: For any undirected graph G(V,E) with n vertices and m edges there is a partition of the vertex set into two sets A, B such that

$$|\{\{u,v\} \in E \mid u \in A \wedge v \in B\}| \ge \frac{m}{2}.$$

Let C(A,B) denote the number of edges between A and B. We have $E[C(A,B)] \ge \frac{m}{2}$ when vertices equiprobably go to either A or B.

  • To begin with: $v_1$ goes to A (or B) and we get $E[C(A,B) \mid v_1] \ge E[C(A,B)]$.
  • For the intermediate steps, when the k first nodes are already placed in some set:
  • We can compute the cut that these vertices "give" in the final cut.
  • Each of the edges that are "incomplete" has probability ½ to be in the cut.
  • So $E[C(A,B) \mid v_1, \dots, v_k,\, v_{k+1} \in A]$ and $E[C(A,B) \mid v_1, \dots, v_k,\, v_{k+1} \in B]$ can be computed efficiently. We keep the big one.

We'll do n steps to fully determine A, B. Each step needs polynomial time.
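The placement rule above can be sketched in a few lines (the 5-cycle test graph and the helper names are my own illustrative choices):

```python
def derandomized_cut(n, edges):
    """Conditional-expectation max-cut: place vertices one by one, each
    time choosing the side that maximizes E[cut | choices so far].
    An edge with both endpoints placed contributes 0 or 1; an
    incomplete edge contributes 1/2."""
    side = {}

    def expected_cut(partial):
        total = 0.0
        for u, v in edges:
            if u in partial and v in partial:
                total += partial[u] != partial[v]
            else:
                total += 0.5  # incomplete edge: crosses with probability 1/2
        return total

    for v in range(n):
        side[v] = True
        e_true = expected_cut(side)
        side[v] = False
        if e_true > expected_cut(side):
            side[v] = True
    return side

# 5-cycle: m = 5, so the guaranteed cut is at least m/2 = 2.5,
# i.e. at least 3 edges since the cut size is an integer.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
side = derandomized_cut(5, edges)
cut = sum(side[u] != side[v] for u, v in edges)
print(cut)
```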

SLIDE 26

The Lovász Local Lemma

Let $A_1, \dots, A_n$ be some "bad" events with, for all i, $\Pr(A_i) < \frac{1}{2}$.

If the $A_i$ are mutually independent then we can assert that none of them will happen, with positive probability:

$$\Pr\left(\bigcap_{i=1}^{n}\bar{A}_i\right) = \Pr(\bar{A}_1)\cdot\Pr(\bar{A}_2 \mid \bar{A}_1)\cdots\Pr\left(\bar{A}_n \,\Big|\, \bigcap_{i=1}^{n-1}\bar{A}_i\right) = \left(1 - \Pr(A_1)\right)\cdots\left(1 - \Pr\left(A_n \,\Big|\, \bigcap_{i=1}^{n-1}\bar{A}_i\right)\right) > 0.$$

The Lovász Local Lemma states that if each event is dependent on "few" other events, then there is positive probability that none of them will happen.

Definition: A dependency graph of events $A_1, \dots, A_n$ is a digraph G in which

  • For every $A_i$ there is a vertex corresponding to it.
  • $A_i$ is independent of all other $A_j$'s such that $(A_i, A_j)$ is not an edge of G.

Theorem: Let G(V,E) be a dependency graph of the events $A_1, \dots, A_n$. Then

$$\forall i\; \exists x_i \in [0,1) : \Pr(A_i) \le x_i \prod_{(i,j) \in E}(1 - x_j) \;\Rightarrow\; \Pr\left(\bigcap_{i=1}^{n}\bar{A}_i\right) \ge \prod_{i=1}^{n}(1 - x_i) > 0.$$

SLIDE 27

Lovász Local Lemma Proof

Let $S \subseteq \{1, \dots, n\}$. By induction on k = |S| we will show that for any S and $i \notin S$:

$$\Pr\left(A_i \,\Big|\, \bigcap_{j \in S}\bar{A}_j\right) \le x_i.$$

For k = 0 the result follows from $\Pr(A_i) \le x_i \prod_{(i,j) \in E}(1 - x_j) \le x_i$.

For the inductive step we want to compute $\Pr(A_i \mid \bigcap_{j \in S}\bar{A}_j)$. Separate S into $S_1 = \{j \in S : (i,j) \in E\}$ and $S_2 = S \setminus S_1$. By definition:

$$\Pr\left(A_i \,\Big|\, \bigcap_{j \in S}\bar{A}_j\right) = \frac{\Pr\left(A_i \cap \bigcap_{j \in S_1}\bar{A}_j \,\Big|\, \bigcap_{j \in S_2}\bar{A}_j\right)}{\Pr\left(\bigcap_{j \in S_1}\bar{A}_j \,\Big|\, \bigcap_{j \in S_2}\bar{A}_j\right)}.$$

Numerator:

$$\Pr\left(A_i \cap \bigcap_{j \in S_1}\bar{A}_j \,\Big|\, \bigcap_{j \in S_2}\bar{A}_j\right) \le \Pr\left(A_i \,\Big|\, \bigcap_{j \in S_2}\bar{A}_j\right) = \Pr(A_i) \le x_i \prod_{(i,j) \in E}(1 - x_j).$$

Denominator (by the induction hypothesis, since $|S_2| < |S|$):

$$\Pr\left(\bigcap_{j \in S_1}\bar{A}_j \,\Big|\, \bigcap_{j \in S_2}\bar{A}_j\right) \ge \prod_{j \in S_1}(1 - x_j) \ge \prod_{(i,j) \in E}(1 - x_j).$$

Dividing gives $\Pr(A_i \mid \bigcap_{j \in S}\bar{A}_j) \le x_i$. To complete the proof:

$$\Pr\left(\bigcap_{i=1}^{n}\bar{A}_i\right) = \left(1 - \Pr(A_1)\right)\left(1 - \Pr(A_2 \mid \bar{A}_1)\right)\cdots\left(1 - \Pr\left(A_n \,\Big|\, \bigcap_{i=1}^{n-1}\bar{A}_i\right)\right) \ge \prod_{i=1}^{n}(1 - x_i).$$

SLIDE 28

Other Forms of the LLL

  • The basic form: If

1. For all i: $\Pr(A_i) \le p < 1$

2. For all i: $A_i$ is mutually independent of all but at most d of the other events

3. $4pd < 1$ (or $ep(d+1) < 1$)

then with positive probability none of the events will occur.

  • The asymmetric form: If for all i:

1. $A_i$ is mutually independent of $\mathcal{A} \setminus (D_i \cup \{A_i\})$ for some $D_i$

2. $\Pr(A_i) \le \frac{1}{8}$

3. $\sum_{A_j \in D_i} \Pr(A_j) \le \frac{1}{4}$

then with positive probability none of the events will occur.

  • The weighted form: If

1. $A_i$ is mutually independent of $\mathcal{A} \setminus (D_i \cup \{A_i\})$ for some $D_i$

2. There are $t_1, \dots, t_n \ge 1$ and p with $0 \le p < \frac{1}{8}$ such that for all i: $\Pr(A_i) \le p^{t_i}$ and $\sum_{A_j \in D_i} (2p)^{t_j} \le \frac{t_i}{4}$

then with positive probability none of the events will occur.

SLIDE 29

Some Proofs

  • The general (compact!) form:

$$\forall i\; \exists x_i : \Pr(A_i) \le x_i \prod_{(i,j) \in E}(1 - x_j) \;\Rightarrow\; \Pr\left(\bigcap_{i=1}^{n}\bar{A}_i\right) \ge \prod_{i=1}^{n}(1 - x_i).$$

  • For the Weighted LLL set

$$x_i = (2p)^{t_i} \;\overset{p < 1/8}{\Longrightarrow}\; x_i < \left(\tfrac{1}{4}\right)^{t_i} \;\Rightarrow\; (1 - x_i) \ge e^{-1.2 x_i}.$$

  • For the Asymmetric LLL set

$$x_i = 2\Pr(A_i) \;\Rightarrow\; x_i \le \tfrac{1}{4} \;\Rightarrow\; (1 - x_i) \ge e^{-1.2 x_i}.$$

This way, for the asymmetric form:

$$x_i \prod_{A_j \in D_i}(1 - x_j) \ge x_i\, e^{-1.2 \sum_{A_j \in D_i} x_j} = 2\Pr(A_i)\, e^{-2.4 \sum_{A_j \in D_i} \Pr(A_j)} \ge 2\Pr(A_i)\, e^{-0.6} > \Pr(A_i),$$

and for the weighted form:

$$x_i \prod_{A_j \in D_i}(1 - x_j) \ge x_i\, e^{-1.2 \sum_{A_j \in D_i} (2p)^{t_j}} \ge (2p)^{t_i}\, e^{-1.2\, t_i / 4} = p^{t_i}\left(2e^{-0.3}\right)^{t_i} > p^{t_i} \ge \Pr(A_i).$$

  • For the Basic LLL we can assume* d > 1, and then

$$\Pr(A_i) \le \frac{1}{8} \quad\text{and}\quad \sum_{A_j \in D_i}\Pr(A_j) \le pd = \frac{4pd}{4} < \frac{1}{4},$$

so the basic form follows from the asymmetric form.

SLIDE 30

Ramsey Numbers

Definition: R(k,l) is the minimal n such that if the edges of the complete graph on n vertices, $K_n$, are colored Red or Blue, then there is (as a subgraph) a $K_k$ with all edges Red or a $K_l$ with all edges Blue.

For example R(3,3) > 5; in particular R(3,3) = 6.

Using the basic form of the Lovász Local Lemma we will get a lower bound for R(k,k).

Theorem:

$$R(k,k) > \frac{k}{e}\,2^{(k+1)/2}\,(1 + o(1)).$$
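The bound can be explored numerically from the LLL condition $4pd < 1$ with $p = 2^{1-\binom{k}{2}}$ and $d = \binom{k}{2}\binom{n}{k-2}$. This is a small sketch; the function name and the sample value k = 10 are my own:

```python
from math import comb

def ramsey_lower_bound(k):
    """Largest n satisfying the LLL condition 4*p*d < 1 with
    p = 2**(1 - C(k,2)) and d = C(k,2)*C(n, k-2): for such n, some
    2-coloring of K_n has no monochromatic K_k, so R(k,k) > n."""
    n = k
    while 4 * comb(k, 2) * comb(n + 1, k - 2) * 2 ** (1 - comb(k, 2)) < 1:
        n += 1
    return n

# The LLL bound beats the plain counting bound (roughly 2^(k/2))
# by a factor that grows linearly in k.
n10 = ramsey_lower_bound(10)
print(n10)
```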

SLIDE 31

Ramsey Numbers

Proof: Color the edges of $K_n$ uniformly at random.

For every $S \subset V$ with |S| = k, let $A_S$ be the event that S induces a monochromatic k-clique. It is

$$\Pr(A_S) = p = 2^{1 - \binom{k}{2}}.$$

Let G be a dependency graph of the events $A_S$. The events $A_S$, $A_{S'}$ are dependent only if S and S' share at least one edge. Thus

$$d(G) \le \binom{k}{2}\binom{n}{k-2}.$$

By the Lovász Local Lemma, if 4pd < 1, i.e.

$$4 \cdot 2^{1 - \binom{k}{2}} \cdot \binom{k}{2}\binom{n}{k-2} < 1,$$

then there is positive probability that no monochromatic $K_k$ exists. The maximal n for which the above holds is our lower bound.
SLIDE 32

Coloring Hypergraphs

A hypergraph H = (V,E) is a generalization of a graph where $E \subseteq 2^V$. H is:

  • k-uniform if each edge contains exactly k vertices, and
  • k-regular if every vertex participates in exactly k edges.

Let H be a hypergraph. H has property B if there exists a 2-coloring of the vertices such that none of the edges is monochromatic.

Theorem: Let H be a k-uniform, k-regular hypergraph. Then for all k > 8, H has property B.

slide-33
SLIDE 33

Coloring Hypergraphs

Proof: Color the vertices uniformly at random and let A_f be the event that edge f is monochromatic.

  • H is k-uniform. Thus for all f:

      Pr(A_f) = p = (1/2)^{k−1}

  • H is k-regular, so each edge intersects at most k(k−1) other edges, and thus:

      d ≤ k(k−1)

Putting it all together:

  ∀ k > 8:  e·p·(d+1) ≤ e·(k(k−1)+1)·(1/2)^{k−1} < 1

So, using the Lovasz Local Lemma:

  Pr(⋀_f Ā_f) > 0
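The final inequality can be checked numerically. A small sketch confirming that e·p·(d+1) drops below 1 exactly at k = 9, matching the theorem's "for all k > 8":

```python
import math

def lll_condition(k):
    """e * p * (d + 1) with p = (1/2)**(k-1) and d = k*(k-1),
    as in the proof above."""
    p = 0.5 ** (k - 1)
    d = k * (k - 1)
    return math.e * p * (d + 1)

# The symmetric LLL needs e*p*(d+1) <= 1; it first holds at k = 9.
for k in range(2, 13):
    print(k, lll_condition(k) < 1)
```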

slide-34
SLIDE 34

Edge-disjoint Paths

Assume we have a network and n pairs of users who wish to communicate via edge-disjoint paths. Each pair of users i has a collection F_i of m possible paths from which it chooses its path. If the possible paths do not share too many edges, then there is a set of edge-disjoint paths that does the work.

Theorem: If any path in F_i shares edges with no more than k paths in F_j (for all j ≠ i), and 8nk/m ≤ 1, then there are n edge-disjoint paths connecting the n pairs.

slide-35
SLIDE 35

Edge-disjoint Paths

Proof: Each pair chooses equiprobably one path from its m possible paths. Let E_{i,j} denote the event that the paths of i and j share a common edge. It is

  p = Pr(E_{i,j}) ≤ k/m

The event E_{i,j} can depend only on the events E_{i,t} or E_{j,t}, so d < 2n. It is

  4dp < 8nk/m ≤ 1

So, using LLL, we get:

  Pr(⋀_{i,j} Ē_{i,j}) > 0

slide-36
SLIDE 36

Expanders

Definition: A graph G(V,E) is called a β-expander if

  ∀ S ⊂ V with |S| ≤ |V|/2:  |E(S, S̄)| ≥ β|S|

We will show that if we have a β-expander G(V,E), we can partition E into E_1, E_2 so that both G_1(V,E_1) and G_2(V,E_2) are nearly (β/2)-expanders.

Theorem: Let ε > 0, r ≥ 3, and let β be sufficiently large in terms of ε and r. If G is an r-regular β-expander, then there is a partition E(G) = E_1 ∪ E_2 such that each E_i induces a (1−ε)(β/2)-expander.

Proof: We will (as usual) place each edge equiprobably into one of the two sets. We will "define" what is "bad" and use the weighted version of LLL.

slide-37
SLIDE 37

Expanders

For each connected S ⊂ V of size |S| ≤ |V|/2, let A_S be the event that S "fails":

  |E_1(S, S̄)| < (1−ε)(β/2)|S|   or   |E_2(S, S̄)| < (1−ε)(β/2)|S|

It is like throwing |E(S, S̄)| ≥ β|S| fair coins, with |heads| = |E_1(S, S̄)| and |tails| = |E_2(S, S̄)|. So:

1. By the union bound,

     Pr[A_S] ≤ Σ_{i=1,2} Pr[ |E_i(S, S̄)| < (1−ε)(β/2)|S| ]

2. For each i,

     Pr[ |E_i(S, S̄)| < (1−ε)(β/2)|S| ] = Pr[ (β/2)|S| − |E_i(S, S̄)| > ε(β/2)|S| ]

Since |E_1(S, S̄)| + |E_2(S, S̄)| = |E(S, S̄)| ≥ β|S|, we have E[|E_i(S, S̄)|] = |E(S, S̄)|/2 ≥ (β/2)|S|. Using the Chernoff bound:

  Pr[A_S] ≤ 2·e^{−ε²β|S|/6} = p_S
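A Monte Carlo sanity check of the coin-flip tail bound. The values of β, |S|, ε below are hypothetical, and the constant 6 in the exponent follows the Chernoff form used here, so treat the bound as indicative rather than exact:

```python
import math
import random

def tail_estimate(m, eps, trials=20000):
    """Empirical Pr[#heads < (1 - eps) * m / 2] over `trials` runs of m fair coins."""
    threshold = (1 - eps) * m / 2
    hits = sum(sum(random.getrandbits(1) for _ in range(m)) < threshold
               for _ in range(trials))
    return hits / trials

# Hypothetical small instance: |E(S, S-bar)| = beta * |S| = 100 cut edges.
beta, s, eps = 20, 5, 0.5
bound = 2 * math.exp(-eps ** 2 * beta * s / 6)   # 2*e^(-eps^2 * beta * |S| / 6)
print(tail_estimate(beta * s, eps), "<=", bound)
```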

slide-38
SLIDE 38

Expanders

  • G is r-regular. It is known that, in this case, every vertex lies in at most

      C(rt, t) < (e·rt/t)^t = (er)^t

    connected subsets of size t.

  • A_S is dependent on at most (er)^t·|S| other events A_{S'} for which |S'| = t.

  • We have Pr[A_S] ≤ 2·e^{−ε²β|S|/6} = p_S, and so

      Σ_{S': A_{S'} ∈ D(A_S)} p_{S'} ≤ Σ_{t=1}^{n/2} (er)^t·|S|·2·e^{−ε²βt/6} ≤ |S|·Σ_{t=1}^{n/2} (2er·e^{−ε²β/6})^t < |S|/4

    as long as (for every t ≥ 1)

      2er·e^{−ε²β/6} < 1/5,   which holds for   β > 6·log(10er)/ε²

    (since Σ_{t≥1} (1/5)^t = 1/4).

The weighted version of LLL does the work.
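A numeric sketch of the final step, assuming the threshold β > 6·log(10er)/ε², the series ratio 1/5, and the target 1/4 as reconstructed here (the exact constants may differ):

```python
import math

def split_condition_ok(r, eps):
    """Check that beta just above 6*log(10*e*r)/eps**2 gives
    2*e*r*exp(-eps**2*beta/6) < 1/5, so the geometric series sums below 1/4."""
    beta = 6 * math.log(10 * math.e * r) / eps ** 2 * 1.001   # a hair above threshold
    ratio = 2 * math.e * r * math.exp(-eps ** 2 * beta / 6)
    geometric_sum = ratio / (1 - ratio)    # sum over t >= 1 of ratio**t
    return ratio < 1 / 5 and geometric_sum < 1 / 4

print(split_condition_ok(r=3, eps=0.1))   # True
```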