Advanced Algorithms (VIII), Shanghai Jiao Tong University, Chihao Zhang (PowerPoint PPT Presentation)



SLIDE 1

Advanced Algorithms (VIII)

Shanghai Jiao Tong University

Chihao Zhang

April 26, 2020

SLIDE 2

The Probabilistic Method


SLIDE 6

The Probabilistic Method

Design a probability space Ω.

Show that Pr[the object exists] > 0.

Bad events A1, A2, …, Am, each happens w.p. pi.

Is Pr[Ā1 ∧ Ā2 ∧ … ∧ Ām] > 0?


SLIDE 11

We can apply the union bound:

Pr[⋂i∈[m] Āi] = 1 − Pr[⋃i∈[m] Ai] ≥ 1 − ∑i∈[m] pi

So if ∑i∈[m] pi < 1, then Pr[⋂i∈[m] Āi] > 0.

The union bound is tight when the bad events are disjoint.


SLIDE 16

On the other hand, if the bad events are mutually independent…

Pr[⋂i∈[m] Āi] = ∏i∈[m] (1 − pi)

So as long as none of the pi equals 1, Pr[⋂i∈[m] Āi] > 0.

The two cases correspond to two extremes of the dependency.
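A tiny Python sketch (helper names are my own) contrasting the two extremes: the union-bound estimate is valid under any dependency structure, while the product formula is exact under mutual independence.

```python
def union_bound_lower(ps):
    """Lower bound on Pr[no bad event] from the union bound:
    1 - sum(p_i). Valid for ANY dependency structure (may be
    vacuous, i.e. 0, when the sum is at least 1)."""
    return max(0.0, 1.0 - sum(ps))

def independent_exact(ps):
    """Exact value of Pr[no bad event] when the bad events are
    mutually independent: the product of (1 - p_i)."""
    prod = 1.0
    for p in ps:
        prod *= (1.0 - p)
    return prod

ps = [0.01] * 50
print(union_bound_lower(ps))   # 0.5
print(independent_exact(ps))   # ≈ 0.605, strictly above the union bound
```

By the inequality ∏(1 − pi) ≥ 1 − ∑pi, the independent case is always at least as good as the union-bound estimate.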


SLIDE 19

Lovász Local Lemma

The Lovász local lemma (LLL) captures partial dependency between bad events.

(Erdős and Lovász, Infinite and Finite Sets, 1975)


SLIDE 28

The Dependency Graph

We describe the dependency of bad events in a graph:

V = {A1, …, Am}

N(Ai) = {Aj ∣ Ai ∼ Aj}

Δ = maxi∈[m] |N(Ai)|

[Figure: an example dependency graph on vertices A1, A2, A3, A4]

If Ai ⊥ {Aj}j∉N(Ai) and Pr[Ai] ≤ p for every i, and 4Δp ≤ 1, then

Pr[⋂i∈[m] Āi] > 0
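The hypothesis of the symmetric LLL above is easy to check mechanically. A minimal sketch (function and variable names are my own) that takes the event probabilities and the dependency graph's adjacency lists:

```python
def symmetric_lll_holds(probs, neighbors):
    """Check the sufficient condition of the symmetric LLL stated
    above: a uniform bound p on Pr[Ai], maximum dependency-graph
    degree Delta, and 4 * Delta * p <= 1."""
    p = max(probs)                          # uniform bound on Pr[Ai]
    delta = max(len(n) for n in neighbors)  # max degree Delta
    return 4 * delta * p <= 1

# Example: 4 events on a 4-cycle (Delta = 2), each of probability 1/16.
probs = [1 / 16] * 4
neighbors = [[1, 3], [0, 2], [1, 3], [0, 2]]
print(symmetric_lll_holds(probs, neighbors))  # True: 4 * 2 * (1/16) = 0.5 <= 1
```

When the check succeeds, the lemma guarantees Pr[⋂ Āi] > 0, i.e., the desired object exists.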

SLIDE 29

Proof of (Symmetric) LLL


SLIDE 34

For S ⊆ [m], we prove by induction on |S| that

∀i ∉ S, Pr[Ai ∣ ⋂j∈S Āj] ≤ 2p

Assume |S| = s and the statement holds for smaller S.

For every T ⊆ [m], we use FT to denote the event ⋂i∈T Āi.


SLIDE 39

It is clear that for every T ⊆ [m] with |T| ≤ s, Pr[FT] ≥ (1 − 2p)^s > 0.

We partition S into S = S1 ∪ S2, where S1 = {j ∈ S ∣ j ∼ i}.

If |S2| = s, then Pr[Ai ∣ FS] = Pr[Ai ∣ FS2] ≤ p.

Otherwise,

Pr[Ai ∣ FS] = Pr[Ai ∣ FS1 ∩ FS2] = Pr[Ai ∩ FS1 ∩ FS2] / Pr[FS1 ∩ FS2]


SLIDE 46

Pr[Ai ∣ FS] = Pr[Ai ∩ FS1 ∩ FS2] / Pr[FS1 ∩ FS2] = Pr[Ai ∩ FS1 ∣ FS2] / Pr[FS1 ∣ FS2]

Pr[Ai ∩ FS1 ∣ FS2] ≤ Pr[Ai ∣ FS2] ≤ p

Pr[FS1 ∣ FS2] = 1 − Pr[⋃j∈S1 Aj ∣ FS2] ≥ 1 − 2Δp ≥ 1/2

⟹ Pr[Ai ∣ FS] ≤ 2p
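The chain of estimates in this step can be written out in full, filling in why the denominator is at least 1/2: by the induction hypothesis each Pr[Aj ∣ FS2] ≤ 2p for j ∈ S1, and |S1| ≤ Δ, so the union bound applies.

```latex
\begin{align*}
\Pr[A_i \mid F_S]
  &= \frac{\Pr[A_i \cap F_{S_1} \mid F_{S_2}]}{\Pr[F_{S_1} \mid F_{S_2}]},\\
\Pr[A_i \cap F_{S_1} \mid F_{S_2}]
  &\le \Pr[A_i \mid F_{S_2}] \le p
  && \text{($A_i$ is independent of $\{A_j\}_{j \in S_2}$)},\\
\Pr[F_{S_1} \mid F_{S_2}]
  &= 1 - \Pr\Bigl[\,\bigcup_{j \in S_1} A_j \,\Big|\, F_{S_2}\Bigr]
  \ge 1 - |S_1| \cdot 2p \ge 1 - 2\Delta p \ge \tfrac{1}{2},\\
\Pr[A_i \mid F_S]
  &\le \frac{p}{1/2} = 2p.
\end{align*}
```

The last inequality in the third line is exactly where the hypothesis 4Δp ≤ 1 is used.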

SLIDE 47

Applications of LLL


SLIDE 51

Edge-Disjoint Paths

n pairs of users; each pair i has a collection Fi of m paths connecting them.

Each path in Fi shares edges with no more than k paths in Fj, for any j ≠ i.

If 8nk ≤ m, then there is a way to choose n edge-disjoint paths connecting the n pairs.


SLIDE 57

Define the probability space as: "Each pair of users chooses a path from its collection uniformly at random."

For every i ≠ j, define the bad event Eij as "the path chosen in Fi overlaps with the path chosen in Fj".

So we only need to show Pr[⋂{i,j} Ēij] > 0, where the intersection ranges over all pairs {i, j}.


SLIDE 63

For each pair {i, j}, we have Pr[Eij] ≤ k/m.

Eij and Ei′j′ are dependent only when {i, j} ∩ {i′, j′} ≠ ∅.

So the maximum degree of the dependency graph is at most 2n.

The LLL condition 4Δp ≤ 1 then becomes 4 · 2n · (k/m) ≤ 1, i.e., 8nk ≤ m.
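Plugging the parameters of this application into the symmetric LLL condition can be sketched in a few lines of Python (function name is my own; integer arithmetic avoids floating-point issues at the boundary):

```python
def edge_disjoint_lll(n, k, m):
    """Symmetric LLL condition 4 * Delta * p <= 1 for the
    edge-disjoint-paths setup above, with p = k/m and Delta = 2n.
    Rearranged to integers: 4 * Delta * k <= m, i.e. 8nk <= m."""
    delta = 2 * n               # max degree of the dependency graph
    return 4 * delta * k <= m   # same as 4 * delta * (k / m) <= 1

print(edge_disjoint_lll(n=10, k=5, m=400))  # True:  8*10*5 = 400 <= 400
print(edge_disjoint_lll(n=10, k=5, m=399))  # False: 400 > 399
```

When the check passes, the LLL guarantees a choice of n pairwise edge-disjoint paths exists, though it does not by itself produce one.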


SLIDE 69

Satisfiability

Recall that the k-SAT problem is NP-hard for k ≥ 3.

On the other hand, if the formula is sparse, then it is always satisfiable.

Given ϕ = C1 ∧ C2 ∧ … ∧ Cm, where each |Ci| = k.

The degree of a variable x is the number of clauses that x or x̄ belongs to.

Let d be the maximum degree of variables in ϕ.


SLIDE 74

Theorem. If 4kd ≤ 2^k, then ϕ is satisfiable.

The probability space is the uniform distribution over {0,1}^V.

Each clause Ci defines a bad event Ai := "Ci is not satisfied".

We need to show Pr[⋂i∈[m] Āi] > 0.


SLIDE 79

Each clause Ci satisfies Pr[Ai] = 2^−k.

Two clauses are dependent only if they share some variables.

Therefore, the maximum degree of the dependency graph is at most kd.

The LLL condition 4Δp ≤ 1 becomes 4 · kd · 2^−k ≤ 1, i.e., 4kd ≤ 2^k.
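The hypothesis 4kd ≤ 2^k of the theorem can be checked directly from a formula. A minimal sketch (function name and clause encoding are my own; clauses are lists of nonzero ints, +v for variable v and −v for its negation):

```python
from collections import Counter

def sparse_ksat_condition(clauses):
    """Check the theorem's hypothesis 4*k*d <= 2^k for an exact
    k-CNF: k is the clause width, d the maximum number of clauses
    any single variable (positively or negatively) appears in."""
    k = len(clauses[0])
    assert all(len(c) == k for c in clauses), "formula must be exact k-CNF"
    deg = Counter()
    for c in clauses:
        for v in {abs(lit) for lit in c}:  # count each variable once per clause
            deg[v] += 1
    d = max(deg.values())
    return 4 * k * d <= 2 ** k

# A 7-CNF in which every variable occurs in at most 2 clauses: d = 2.
phi = [[1, 2, 3, 4, 5, 6, 7], [-1, -2, -3, 8, 9, 10, 11]]
print(sparse_ksat_condition(phi))          # True:  4*7*2 = 56 <= 128
print(sparse_ksat_condition([[1, 2, 3]]))  # False: 4*3*1 = 12 > 8
```

Note that for k = 3 the condition 12d ≤ 8 can never hold, which matches the intuition that the theorem only helps for wide, sparse formulas.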


SLIDE 83

Asymmetric LLL

In many cases, bad events happen with different probabilities.

Assume there exist x1, …, xn ∈ [0,1) such that

Pr[Ai] ≤ xi ∏j∼i (1 − xj)

Then

Pr[⋂i∈[n] Āi] ≥ ∏i∈[n] (1 − xi) > 0
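The asymmetric hypothesis is again mechanical to verify once the xi are chosen. A minimal sketch (names are my own), using the same 4-cycle dependency graph as before with xi = 1/4:

```python
import math

def asymmetric_lll_holds(probs, neighbors, x):
    """Check the asymmetric LLL hypothesis stated above:
    Pr[Ai] <= x_i * prod over j ~ i of (1 - x_j), for every i."""
    return all(
        probs[i] <= x[i] * math.prod(1 - x[j] for j in neighbors[i])
        for i in range(len(probs))
    )

# 4 events on a cycle (each depends on its two neighbours), x_i = 1/4:
# the right-hand side is 0.25 * 0.75**2 ≈ 0.1406 for every i.
nbrs = [[1, 3], [0, 2], [1, 3], [0, 2]]
print(asymmetric_lll_holds([0.1] * 4, nbrs, [0.25] * 4))  # True
print(asymmetric_lll_holds([0.2] * 4, nbrs, [0.25] * 4))  # False
```

When the check passes, the lemma gives Pr[⋂ Āi] ≥ ∏(1 − xi), here 0.75⁴ ≈ 0.316.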


SLIDE 87

Algorithmic LLL

LLL guarantees the existence of a solution. Can we find one efficiently?
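For bad events determined by independent random variables, the now-standard affirmative answer is the Moser–Tardos resampling algorithm. A minimal, illustrative Python sketch for k-SAT (all names and the clause encoding are my own; the round budget is an arbitrary safety cap, not part of the algorithm's analysis):

```python
import random

def moser_tardos_ksat(clauses, num_vars, seed=0, max_rounds=100_000):
    """Moser-Tardos style resampling for k-SAT (illustrative sketch).
    Clauses are lists of nonzero ints: +v means variable v, -v means
    its negation. Returns a satisfying assignment (list of bools,
    0-indexed) or None if the round budget is exhausted."""
    rng = random.Random(seed)
    # Start from a uniformly random assignment (1-indexed internally).
    assign = [None] + [rng.random() < 0.5 for _ in range(num_vars)]

    def satisfied(clause):
        return any((lit > 0) == assign[abs(lit)] for lit in clause)

    for _ in range(max_rounds):
        violated = [c for c in clauses if not satisfied(c)]
        if not violated:
            return assign[1:]             # satisfying assignment found
        # Resample every variable of one violated clause uniformly.
        for lit in rng.choice(violated):
            assign[abs(lit)] = rng.random() < 0.5
    return None                           # no convergence within budget

clauses = [[1, 2], [-1, 3], [-2, -3]]
print(moser_tardos_ksat(clauses, 3) is not None)  # True
```

Under the LLL condition, Moser and Tardos showed this process terminates after an expected polynomial number of resamplings.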