

SLIDE 1

Maximization of Submodular Functions Seffi Naor Lecture 1 4th Cargese Workshop on Combinatorial Optimization

Seffi Naor Maximization of Submodular Functions

SLIDE 2

Submodular Maximization

Optimization Problem
A family of allowed subsets M ⊆ 2^N is given.

    max f(S)  s.t.  S ∈ M

Question: how is f given?
Value Oracle Model: returns f(S) for a given S ⊆ N.

SLIDE 5

Unconstrained Submodular Maximization

Definition (Unconstrained Submodular Maximization)
Input: Ground set N and a non-monotone submodular f : 2^N → R+.
Goal: Find S ⊆ N maximizing f(S).

SLIDE 7

Undirected Graphs: Cut Function

G = (V, E) ⇒ N = V, f(S) = |δ(S)|, the number of edges with exactly one endpoint in S.
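As a concrete check, here is a short sketch (assuming the graph is given as an edge list) that evaluates f(S) = |δ(S)| and verifies submodularity, f(S) + f(T) ≥ f(S ∪ T) + f(S ∩ T), over all subset pairs of a small graph:

```python
from itertools import combinations

def cut_value(edges, S):
    """f(S) = |delta(S)|: number of edges with exactly one endpoint in S."""
    return sum(1 for u, v in edges if (u in S) != (v in S))

# A 4-cycle with one chord.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
V = range(4)

# Verify f(S) + f(T) >= f(S | T) + f(S & T) for all pairs of subsets.
subsets = [frozenset(c) for r in range(5) for c in combinations(V, r)]
for S in subsets:
    for T in subsets:
        assert (cut_value(edges, S) + cut_value(edges, T)
                >= cut_value(edges, S | T) + cut_value(edges, S & T))
```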

SLIDE 8

Unconstrained Submodular Maximization (Cont.)

Captures combinatorial optimization problems:
Max-Cut in graphs and hypergraphs. Max-DiCut. Max Facility-Location. Variants of Max-SAT.

Additional settings:
Marketing over social networks [Hartline-Mirrokni-Sundararajan-08]
Utility maximization with discrete choice [Ahmed-Atamtürk-09]
Least core value in supermodular cooperative games [Schulz-Uhan-07]
Approximating market expansion [Dughmi-Roughgarden-Sundararajan-09]

SLIDE 10

Unconstrained Submodular Maximization (Cont.)

Operations Research:
[Cherenin-62] [Khachaturov-68] [Minoux-77] [Lee-Nemhauser-Wang-95] [Goldengorin-Sierksma-Tijssen-Tso-98] [Goldengorin-Tijssen-Tso-99] [Goldengorin-Ghosh-04] [Ahmed-Atamtürk-09]

Algorithmic Bounds:
1/4     random solution               [Feige-Mirrokni-Vondrak-07]
1/3     local search                  [Feige-Mirrokni-Vondrak-07]
2/5     non-oblivious local search    [Feige-Mirrokni-Vondrak-07]
≈ 0.41  simulated annealing           [Gharan-Vondrak-11]
≈ 0.42  structural similarity         [Feldman-N-Schwartz-11]

Hardness: Cannot get better than 1/2; the bound is absolute (information-theoretic, in the value oracle model)! [Feige-Mirrokni-Vondrak-07]

Question: Is 1/2 the correct answer?
Yes! [Buchbinder, Feldman, N., Schwartz FOCS 2012]

SLIDE 17

Failure of Greedy Approach

Greedy: useful for monotone f (discrete and continuous settings).
[Fisher-Nemhauser-Wolsey-78] [Calinescu-Chekuri-Pal-Vondrak-07]

Fails for Unconstrained Submodular Maximization: greedy's approximation ratio is unbounded!

Key Insight
If f is submodular, then g(S) = f(N \ S) is also submodular.
The optimal solution of g is N \ OPT, and both optima have the same value.

Questions: Why start with an empty solution and add elements? Why not start with N and remove elements?

SLIDE 21

Attempt I - Geometric Interpretation

[Figure: an animation on the cube [0,1]^3. Start with A = (0,0,0) and B = (1,1,1). In each step the gains ∆1 (adding u_i to A) and ∆2 (removing u_i from B) are compared, and the corresponding endpoint moves: first ∆1 > ∆2, so A ← (1,0,0); then ∆1 < ∆2, so B ← (1,0,1); then ∆1 > ∆2, so A ← (1,0,1). After the last step A = B = (1,0,1).]

SLIDE 33

Attempt I - Algorithm

Notation: N = {u1, u2, . . . , un}

Algorithm I
1. A ← ∅, B ← N.
2. for i = 1 to n do:
     ∆1 ← f(A ∪ {ui}) − f(A).
     ∆2 ← f(B \ {ui}) − f(B).
     if ∆1 ≥ ∆2 then A ← A ∪ {ui}, else B ← B \ {ui}.
3. Return A.

Theorem [Buchbinder-Feldman-N-Schwartz]
Algorithm I is a (1/3)-approximation for Unconstrained Submodular Maximization.
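Algorithm I transcribes directly into code; a minimal sketch (assuming the value oracle f takes a set), run here on a small cut function and compared against the brute-force optimum:

```python
from itertools import combinations

def double_greedy(f, ground):
    """Deterministic double greedy (Algorithm I): a 1/3-approximation."""
    A, B = set(), set(ground)
    for u in ground:
        d1 = f(A | {u}) - f(A)   # gain from adding u to A
        d2 = f(B - {u}) - f(B)   # gain from removing u from B
        if d1 >= d2:
            A.add(u)
        else:
            B.discard(u)
    return A                     # at this point A == B

# Sanity check on a small cut function (cuts are submodular).
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
f = lambda S: sum(1 for u, v in edges if (u in S) != (v in S))
ground = [0, 1, 2, 3]

opt = max(f(set(c)) for r in range(5) for c in combinations(ground, r))
alg = f(double_greedy(f, ground))
assert alg >= opt / 3   # the guarantee of the theorem
```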

SLIDE 35

Attempt I - Analysis

Observation: by submodularity, ∆1 + ∆2 ≥ 0.

Let Ai−1 and Bi−1 denote A and B at the end of iteration i − 1. Then:

∆1 + ∆2 = (f(Ai−1 ∪ ui) − f(Ai−1)) + (f(Bi−1 \ ui) − f(Bi−1))
        = (f(Ai−1 ∪ ui) + f(Bi−1 \ ui)) − (f(Ai−1) + f(Bi−1))
        ≥ (f((Ai−1 ∪ ui) ∪ (Bi−1 \ ui)) + f((Ai−1 ∪ ui) ∩ (Bi−1 \ ui))) − (f(Ai−1) + f(Bi−1))
        = (f(Bi−1) + f(Ai−1)) − (f(Ai−1) + f(Bi−1)) = 0,

since (Ai−1 ∪ ui) ∪ (Bi−1 \ ui) = Bi−1 and (Ai−1 ∪ ui) ∩ (Bi−1 \ ui) = Ai−1.

SLIDE 37

Attempt I - Analysis

Intuition: bound the loss in terms of the gain.

Define: OPTi = (OPT ∪ Ai) ∩ Bi
(agrees with Ai and Bi on the first i elements, and with OPT on the last n − i elements)

Evolution of OPTi:
f(OPT0) = f((OPT ∪ A0) ∩ B0) = f(OPT)
. . .
f(OPTn) = f((OPT ∪ An) ∩ Bn) = f(ALG)

SLIDE 40

Attempt I - Analysis

Lemma
f(OPTi−1) − f(OPTi) ≤
  f(Ai) − f(Ai−1) = ∆1   if ui ∈ Ai
  f(Bi) − f(Bi−1) = ∆2   if ui ∉ Bi

Case (1): ui ∈ Ai (⇒ ∆1 ≥ ∆2). Thus OPTi = OPTi−1 ∪ ui, Ai = Ai−1 ∪ ui, Bi = Bi−1.

If ui ∈ OPTi−1:
f(OPTi−1) − f(OPTi) = 0 ≤ ∆1, since ∆1 ≥ ∆2 and ∆1 ≥ −∆2 together give ∆1 ≥ 0.

Else (ui ∉ OPTi−1):
f(OPTi−1) − f(OPTi) = f(OPTi−1) − f(OPTi−1 ∪ ui) ≤ f(Bi−1 \ ui) − f(Bi−1) = ∆2 ≤ ∆1,
where the first inequality is by submodularity, since OPTi−1 ⊆ Bi−1 \ ui.

Case (2) (ui ∉ Ai): analogous.

SLIDE 45

Attempt I - Analysis Overview (Cont.)

Potential Function: Φi = f(Ai) + f(Bi) + f(OPTi)

Note that:
Φ0 ≥ f(OPT)
Φn = 3 · f(ALG)

The potential is non-decreasing (by the Lemma):
Φi − Φi−1 = f(Ai) + f(Bi) + f(OPTi) − f(Ai−1) − f(Bi−1) − f(OPTi−1) ≥ 0

⇒ 3 · f(ALG) = Φn ≥ Φ0 ≥ f(OPT) ⇒ f(ALG) ≥ (1/3) · f(OPT)

Comment: the analysis is tight.

SLIDE 50

Attempt II

Reminder
∆1 = f(A ∪ {ui}) − f(A)
∆2 = f(B \ {ui}) − f(B)

∆1 + ∆2 ≥ 0 always (submodularity).
∆1 ≥ 0 and ∆2 < 0 ⇒ add ui to A.
∆1 < 0 and ∆2 ≥ 0 ⇒ remove ui from B.

Question: What should we do when ∆1, ∆2 ≥ 0? Or when ∆1 = ∆2?
There is no a-priori reason to prefer one choice over the other!

Solution
When ∆1, ∆2 ≥ 0, choose at random, in proportion to their values.

SLIDE 54

Attempt II

Algorithm II
1. A ← ∅, B ← N.
2. for i = 1 to n do:
     ∆1 ← max{f(A ∪ {ui}) − f(A), 0}.
     ∆2 ← max{f(B \ {ui}) − f(B), 0}.
     With probability ∆1/(∆1 + ∆2): A ← A ∪ {ui}.
     With the complementary probability: B ← B \ {ui}.
3. Return A.

Theorem [Buchbinder-Feldman-N-Schwartz]
Algorithm II is a linear-time tight (1/2)-approximation for Unconstrained Submodular Maximization.
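A sketch of Algorithm II in the same style. The tie-break when ∆1 = ∆2 = 0 is our own convention, not specified above; it is harmless, since by submodularity both raw marginals are then exactly zero. The empirical average on a small cut function illustrates the expectation guarantee:

```python
import random
from itertools import combinations

def randomized_double_greedy(f, ground, rng=random):
    """Randomized double greedy (Algorithm II): 1/2-approx in expectation."""
    A, B = set(), set(ground)
    for u in ground:
        d1 = max(f(A | {u}) - f(A), 0.0)
        d2 = max(f(B - {u}) - f(B), 0.0)
        # If d1 + d2 == 0, both raw marginals are 0; either choice is fine.
        p = 1.0 if d1 + d2 == 0 else d1 / (d1 + d2)
        if rng.random() < p:
            A.add(u)
        else:
            B.discard(u)
    return A

# Same cut-function sanity check as before.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
f = lambda S: sum(1 for u, v in edges if (u in S) != (v in S))
ground = [0, 1, 2, 3]
opt = max(f(set(c)) for r in range(5) for c in combinations(ground, r))

rng = random.Random(0)
runs = [f(randomized_double_greedy(f, ground, rng)) for _ in range(500)]
avg = sum(runs) / len(runs)
assert avg >= opt / 2 - 1e-9   # the expectation guarantee of the theorem
```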

SLIDE 56

Attempt II (Analysis)

Potential Function: Φi = f(Ai) + f(Bi) + 2 · f(OPTi)
(Ai, Bi and OPTi are now random variables.)

Note that:
Φ0 ≥ 2 · f(OPT)
Φn = 4 · f(ALG)

New main observation:
E[f(OPTi−1) − f(OPTi)] ≤ (1/2) · E[f(Ai) − f(Ai−1)] + (1/2) · E[f(Bi) − f(Bi−1)]

E[Φi − Φi−1] ≥ 0 ⇒ 4 · E[f(ALG)] = E[Φn] ≥ Φ0 ≥ 2 · f(OPT)

SLIDE 59

Attempt II - Analysis

Lemma
E[f(OPTi−1) − f(OPTi)] ≤ (1/2) · E[f(Ai) − f(Ai−1)] + (1/2) · E[f(Bi) − f(Bi−1)]

It suffices to prove the lemma conditioned on Ai−1, Bi−1, OPTi−1.

Interesting case: ∆1, ∆2 > 0.
With probability ∆1/(∆1 + ∆2): Ai = Ai−1 ∪ ui, Bi = Bi−1.
With probability ∆2/(∆1 + ∆2): Ai = Ai−1, Bi = Bi−1 \ ui.

SLIDE 62

Attempt II - Analysis

Left-hand side:
E[f(OPTi−1) − f(OPTi)]
  = ∆1/(∆1 + ∆2) · (f(OPTi−1) − f(OPTi−1 ∪ ui)) + ∆2/(∆1 + ∆2) · (f(OPTi−1) − f(OPTi−1 \ ui))
  ≤ ∆1∆2/(∆1 + ∆2)   (to be shown)

Right-hand side:
E[f(Ai) − f(Ai−1)] + E[f(Bi) − f(Bi−1)]
  = ∆1/(∆1 + ∆2) · (f(Ai−1 ∪ ui) − f(Ai−1)) + ∆2/(∆1 + ∆2) · (f(Bi−1 \ ui) − f(Bi−1))
  = ∆1/(∆1 + ∆2) · ∆1 + ∆2/(∆1 + ∆2) · ∆2
  = (∆1² + ∆2²)/(∆1 + ∆2)

SLIDE 64

Attempt II - Analysis

Clearly:
∆1∆2/(∆1 + ∆2) ≤ (1/2) · (∆1² + ∆2²)/(∆1 + ∆2)
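The "clearly" step is just AM-GM; spelled out (valid since ∆1 + ∆2 > 0 in the interesting case):

```latex
\[
\frac{\Delta_1 \Delta_2}{\Delta_1 + \Delta_2}
\;\le\;
\frac{1}{2}\cdot\frac{\Delta_1^2 + \Delta_2^2}{\Delta_1 + \Delta_2}
\quad\Longleftrightarrow\quad
2\Delta_1\Delta_2 \le \Delta_1^2 + \Delta_2^2
\quad\Longleftrightarrow\quad
0 \le (\Delta_1 - \Delta_2)^2 .
\]
```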

SLIDE 65

Attempt II - Analysis

It remains to be convinced that:
E[f(OPTi−1) − f(OPTi)]
  = ∆1/(∆1 + ∆2) · (f(OPTi−1) − f(OPTi−1 ∪ ui)) + ∆2/(∆1 + ∆2) · (f(OPTi−1) − f(OPTi−1 \ ui))
  ≤ ∆1∆2/(∆1 + ∆2)

SLIDE 66

Attempt II - Analysis

If ui ∈ OPTi−1:
f(OPTi−1) − f(OPTi−1 ∪ ui) = 0
f(OPTi−1) − f(OPTi−1 \ ui) ≤ f(Ai−1 ∪ ui) − f(Ai−1) = ∆1
  (by submodularity, since Ai−1 ⊆ OPTi−1 \ ui)
⇒ E[f(OPTi−1) − f(OPTi)] ≤ ∆2/(∆1 + ∆2) · ∆1

Else (ui ∉ OPTi−1):
f(OPTi−1) − f(OPTi−1 \ ui) = 0
f(OPTi−1) − f(OPTi−1 ∪ ui) ≤ f(Bi−1 \ ui) − f(Bi−1) = ∆2
  (by submodularity, since OPTi−1 ⊆ Bi−1 \ ui)
⇒ E[f(OPTi−1) − f(OPTi)] ≤ ∆1/(∆1 + ∆2) · ∆2

SLIDE 68

Unconstrained Submodular Maximization - Hardness

[Figure: the ground set is split into two halves A and B, |A| = |B| = n/2; for S ⊆ N write k = |S ∩ A|, ℓ = |S ∩ B|.]

f(S) ≈  |S| · (n − |S|)          if |k − ℓ| ≤ εn
        k(n − 2ℓ) + ℓ(n − 2k)    otherwise

max f(S) ≈  n²/4    if |k − ℓ| ≤ εn
            n²/2    otherwise

Observation: if (A, B) is a random partition, then
Pr[|k − ℓ| > εn] ≤ 2e^(−ε²n/4)
The case |k − ℓ| > εn is missed!
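The two regimes can be checked numerically; a sketch using the idealized formulas above (treating ≈ as equality):

```python
# Numeric check of the two regimes in the hardness construction.
n, eps = 100, 0.1
half = n // 2

# Best value inside the band |k - l| <= eps * n: |S|(n - |S|) <= n^2/4.
inside = max((k + l) * (n - k - l)
             for k in range(half + 1) for l in range(half + 1)
             if abs(k - l) <= eps * n)

# Best value outside the band: k(n - 2l) + l(n - 2k), maximized at
# (k, l) = (n/2, 0), giving n^2/2.
outside = max(k * (n - 2 * l) + l * (n - 2 * k)
              for k in range(half + 1) for l in range(half + 1)
              if abs(k - l) > eps * n)

assert inside == n * n // 4
assert outside == n * n // 2
```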

SLIDE 72

Continuous Relaxations of Submodular Functions

Minimization: convex closure f⁻, the maximum (pointwise) convex function that lower bounds f:
f⁻(x) = min_D E_{U∼D}[f(U)],  ∀x ∈ [0,1]^N
where D ranges over distributions preserving the marginals: Pr[ui ∈ U] = xi.
If x is integral then f(x) = f⁻(x).

Submodular functions: f⁻ = f^L (the Lovász extension):
f^L(x) = E_θ[f({i : xi > θ})],  θ ∈ [0,1] uniformly at random,  ∀x ∈ [0,1]^N
Nice probabilistic interpretation: correlated rounding of the elements with respect to a uniform choice of θ ∈ [0,1]. The submodular objective value is preserved in expectation.
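The E_θ definition gives both an exact formula (a sum over the sorted prefix sets) and a Monte Carlo estimator. A sketch on a triangle's cut function, assuming a set-valued oracle f:

```python
import random

def lovasz(f, x):
    """Exact Lovasz extension via the sorted-prefix formula."""
    idx = sorted(range(len(x)), key=lambda i: -x[i])   # coords, descending
    vals = [x[i] for i in idx] + [0.0]
    total, S = (1.0 - vals[0]) * f(frozenset()), set()
    for j, i in enumerate(idx):
        S.add(i)   # prefix set: the j+1 largest coordinates
        total += (vals[j] - vals[j + 1]) * f(frozenset(S))
    return total

# Cut function of a triangle (submodular).
edges = [(0, 1), (1, 2), (0, 2)]
f = lambda S: sum(1 for u, v in edges if (u in S) != (v in S))

x = [0.8, 0.5, 0.2]
exact = lovasz(f, x)

# Monte Carlo check of f^L(x) = E_theta[f({i : x_i > theta})]:
# ONE shared theta per sample (correlated rounding), not one per coordinate.
rng = random.Random(1)
def sample():
    theta = rng.random()
    return f(frozenset(i for i in range(len(x)) if x[i] > theta))

mc = sum(sample() for _ in range(20000)) / 20000
assert abs(exact - mc) < 0.05
```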

SLIDE 74

Continuous Relaxations of Submodular Functions

Maximization: concave closure f⁺, the minimum (pointwise) concave function that upper bounds f:
f⁺(x) = max_D E_{U∼D}[f(U)],  ∀x ∈ [0,1]^N
where D preserves the marginals: Pr[ui ∈ U] = xi.
If x is integral then f(x) = f⁺(x).

Submodular functions:
no compact representation, even for submodular functions
could be helpful for (undirected) Max-Cut ...

SLIDE 76

Useful Relaxation of Submodular Maximization

Multilinear Extension:
F(x) = Σ_{R⊆N} f(R) · Π_{ui∈R} xi · Π_{ui∉R} (1 − xi),  ∀x ∈ [0,1]^N

Note:
Simple probabilistic interpretation: independent rounding of the elements to {0,1}.
If x is integral then f(x) = F(x).
F is neither convex nor concave.
Rounding in the unconstrained case is easy: sample independently from the distribution.
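For tiny ground sets F can be evaluated exactly straight from the definition (exponential time in general); a sketch, again on a triangle's cut function:

```python
from itertools import combinations

def multilinear(f, x):
    """Exact multilinear extension F(x); exponential time, small n only."""
    n = len(x)
    total = 0.0
    for r in range(n + 1):
        for R in combinations(range(n), r):
            p = 1.0
            for i in range(n):
                p *= x[i] if i in R else 1.0 - x[i]   # Pr[rounding yields R]
            total += p * f(frozenset(R))
    return total

edges = [(0, 1), (1, 2), (0, 2)]
f = lambda S: sum(1 for u, v in edges if (u in S) != (v in S))

# At x = (1/2, 1/2, 1/2) each edge is cut w.p. 2x(1-x) = 1/2, so
# F = 3/2 by linearity of expectation; at integral points F = f.
assert abs(multilinear(f, [0.5, 0.5, 0.5]) - 1.5) < 1e-9
assert abs(multilinear(f, [1.0, 0.0, 1.0]) - f({0, 2})) < 1e-9
```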

SLIDE 77

Continuous Algorithm

Continuous counterpart of Algorithm II:
The sets A, B ⊆ N are replaced by vectors a, b ∈ [0,1]^N.
Initially: a ← 0^N (empty solution), b ← 1^N (full solution).
In each step the algorithm:
  uses the multilinear extension F instead of the submodular function f;
  assigns a fractional value to the elements (when ∆1, ∆2 ≥ 0) instead of making a randomized choice.

The continuous counterpart of Algorithm II yields: F(ALG) ≥ f(OPT)/2.
slide-79
SLIDE 79

Attempt II - Continuous Approach

Seffi Naor Maximization of Submodular Functions

slide-80
SLIDE 80

Attempt II - Continuous Approach

A= 0,0,0 B= 1,1,1

Seffi Naor Maximization of Submodular Functions

slide-81
SLIDE 81

Attempt II - Continuous Approach

∆2≥ 0 ∆1≥ 0

A= 0,0,0 B= 1,1,1

Seffi Naor Maximization of Submodular Functions

slide-82
SLIDE 82

Attempt II - Continuous Approach

A= 𝛽, 0,0 B= 𝛽, 1,1 𝛽 = ∆1 ∆1 + ∆2

Seffi Naor Maximization of Submodular Functions

slide-83
SLIDE 83

Attempt II - Continuous Approach

∆1≥ 0 ∆2< 0

𝛽 = ∆1 ∆1 + ∆2 A= 𝛽, 0,0 B= 𝛽, 1,1

Seffi Naor Maximization of Submodular Functions

slide-84
SLIDE 84

Attempt II - Continuous Approach

𝛽 = ∆1 ∆1 + ∆2 B= 𝛽, 1,1 A= 𝛽, 1,0

Seffi Naor Maximization of Submodular Functions

slide-85
SLIDE 85

Attempt II - Continuous Approach

∆′1, ∆′2≥ 0

𝛽 = ∆1 ∆1 + ∆2 A= 𝛽, 1,0 B= 𝛽, 1,1

Seffi Naor Maximization of Submodular Functions

slide-86
SLIDE 86

Attempt II - Continuous Approach

𝛽 = ∆1 ∆1 + ∆2 𝛾 = ∆′1 ∆′1 + ∆′2 A=B= 𝛽, 1, 𝛾

Seffi Naor Maximization of Submodular Functions

slide-87
SLIDE 87

Attempt II - Continuous Approach

𝛽 = ∆1 ∆1 + ∆2 𝛾 = ∆′1 ∆′1 + ∆′2 A=B= 𝛽, 1, 𝛾

Seffi Naor Maximization of Submodular Functions

SLIDE 88

Open Questions

Unconstrained Submodular Maximization:
Can the tight (1/2)-approximation be derandomized?
Is there a (1/3)-hardness for deterministic algorithms?