Submodular Maximization applied to Marketing Over Social Networks
Vahab Mirrokni, Google Research, NYC
Marketing over Social Networks
◮ Online Social Networks: MySpace, Facebook.
◮ Monetizing Social Networks.
◮ Viral Marketing: Word-of-Mouth Advertising.
◮ Users influence each others' valuations on a social network.
◮ Marketing policy: in what order, and at what price, do we offer an item to buyers?
Application 2: Guaranteed Banner Advertisement
◮ Online advertisement: $20 billion annual revenue!
◮ Banner advertisement: 22% of the current revenue.
◮ Guaranteed delivery: we pay a penalty for not meeting the demand of each advertiser.
◮ Problem: which set of advertisers should we commit to?
[Diagram: Marketing over Social Networks and Guaranteed Banner Ad Allocation both reduce to Revenue Maximization, which reduces to Submodular Maximization.]
Outline
◮ Submodularity: Definitions, Applications.
◮ Maximizing non-monotone Submodular Functions
  ◮ Approximation Algorithms.
  ◮ Hardness Results.
◮ Application 1: Marketing over Social Networks.
◮ Application 2: Guaranteed Banner Ad Allocation.
Submodular Functions
Model the law of diminishing returns and/or economies of scale. Generalize concave functions to set functions: f is defined on subsets of X.

Definition
A set function f : 2^X → R is submodular iff for any S, T:
f(S) + f(T) ≥ f(S ∪ T) + f(S ∩ T).

Decreasing marginal value: f is submodular iff for all S ⊂ T and j ∉ T:
f(S ∪ {j}) − f(S) ≥ f(T ∪ {j}) − f(T).
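The two characterizations above can be checked against each other by brute force on a tiny ground set. The sketch below is my own illustration (the `cut_like` test function is a made-up example, not from the talk):

```python
from itertools import combinations

def subsets(xs):
    """All subsets of xs, as frozensets."""
    return [frozenset(c) for r in range(len(xs) + 1) for c in combinations(xs, r)]

def is_submodular(f, ground):
    """Check f(S) + f(T) >= f(S ∪ T) + f(S ∩ T) for all S, T."""
    ss = subsets(list(ground))
    return all(f(S) + f(T) >= f(S | T) + f(S & T) for S in ss for T in ss)

def has_decreasing_marginals(f, ground):
    """Check f(S+j) - f(S) >= f(T+j) - f(T) for all S ⊂ T, j ∉ T."""
    ss = subsets(list(ground))
    return all(f(S | {j}) - f(S) >= f(T | {j}) - f(T)
               for S in ss for T in ss if S <= T
               for j in ground if j not in T)

ground = {1, 2, 3}
cut_like = lambda S: len(S) * (len(ground) - len(S))  # concave in |S|, hence submodular
assert is_submodular(cut_like, ground)
assert has_decreasing_marginals(cut_like, ground)
```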
Examples of Submodular Functions
◮ Simple examples: additive functions, concave functions.
  ◮ Additive function: f(S) = Σ_{j∈S} c_j.
  ◮ Concave function: f(S) = g(|S|) for a concave function g.
◮ Set coverage: f(S) = |∪_{j∈S} A_j|.
◮ Cuts in graphs and hypergraphs: f(S) = e(S, S̄).
◮ Other places: Maximum Facility Location, utility functions in economics, rank functions of matroids.

Two categories:
◮ Monotone functions, i.e., f(S) ≤ f(T) for S ⊂ T.
◮ Non-monotone functions: cut functions, Maximum Facility Location.
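As a concrete instance, the set-coverage function can be evaluated directly. The family `A` below is a hypothetical stand-in for the A_j sets in the slide's figure:

```python
# Set coverage f(S) = |∪_{j∈S} A_j| is monotone submodular.
# The family A below is a made-up example, not the one from the talk's figure.
A = {1: {1, 2}, 2: {2, 3}, 3: {3, 4, 5}, 6: {5, 6}}

def coverage(S):
    out = set()
    for j in S:
        out |= A[j]
    return len(out)

print(coverage({2, 3, 6}))  # |A2 ∪ A3 ∪ A6| = |{2,3,4,5,6}| = 5
# Decreasing marginal value: adding set 3 helps less once set 6 is present.
assert coverage({2, 3}) - coverage({2}) >= coverage({2, 6, 3}) - coverage({2, 6})
```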
Submodular Maximization
Given: a value oracle for a submodular f : 2^X → R (a value query submits a set S and receives f(S)).
Goal: maximize f(S) over all sets S ⊆ X, using a small number of queries.
◮ Min-Cut is poly-time solvable.
◮ Submodular function minimization is poly-time solvable [Schrijver, Fleischer-Fujishige-Iwata 2001].
◮ Max-Cut is NP-hard.
◮ α-approximation: the output is at least α times OPT.
◮ For most of this talk, assume f : 2^X → R+ is non-negative.
Applications of Submodular Maximization
◮ Maximizing monotone submodular functions (with a cardinality constraint):
  ◮ Maximum Set Coverage.
  ◮ Maximizing influence in social networks (Kempe, Kleinberg, Tardos, KDD).
  ◮ Optimal sensor installation for outbreak detection (LKGFVG, KDD).
◮ Maximizing non-monotone submodular functions:
  ◮ Maximum Facility Location.
  ◮ Segmentation problems (Kleinberg, Papadimitriou, Raghavan, JACM).
  ◮ Least core value of general supermodular cost games (Schulz, Uhan, APPROX).
  ◮ Optimal marketing over social networks (Hartline, M., Sundararajan, WWW).
  ◮ Revenue maximization for banner advertisement (Feige, Immorlica, M., Nazerzadeh, WWW).
Outline
◮ Submodularity.
◮ Maximizing non-monotone Submodular Functions
  ◮ Approximation Algorithms.
  ◮ Hardness Results.
◮ Application 1: Marketing over Social Networks.
◮ Application 2: Guaranteed Banner Ad Allocation.
Related work
◮ Maximizing monotone submodular functions with a cardinality constraint (|S| ≤ k): greedy (1 − 1/e)-approximation [Nemhauser-Wolsey-Fisher '78, Feige '98].
◮ Maximizing non-monotone submodular functions has been studied in OR [Lee-Nemhauser-Wang '96, Goldengorin-Tijssen-Tso '99]; no guaranteed approximation factor was known.
◮ For special cases, known approximation algorithms:
  ◮ 0.878-approx for Max Cut [Goemans-Williamson '95].
  ◮ 0.874-approx for Max Di-Cut [Lewin-Livnat-Zwick '02].
  ◮ (1 − 1/2^{k−1})-approx for Max Cut in k-uniform hypergraphs.
  ◮ NP-hard to improve the 7/8-approximation for k = 4 [Håstad '01].
Our Results: Non-negative submodular functions
Feige, M., Vondrak (FOCS '07)
1. Approximation algorithms for maximizing non-monotone non-negative submodular functions (examples: cut functions, marketing over social networks, core value of supermodular games):
  ◮ 0.33-approximation (deterministic local search).
  ◮ 0.40-approximation (randomized "smooth local search").
  ◮ 0.50-approximation for symmetric functions.
2. Hardness results:
  ◮ For any fixed ε > 0, a (1/2 + ε)-approximation would require exponentially many queries.
  ◮ Submodular functions with succinct representation: NP-hard to achieve a (3/4 + ε)-approximation.
Submodular Maximization: Local Search
Local operations:
◮ Add v: S′ = S ∪ {v}.
◮ Remove v: S′ = S \ {v}.
A local operation is improving if f(S′) > f(S).

Example run: S = {4} (f = 10) → {4,5} (17) → {4,5,7} (23) → {4,5,7,3} (27) → {4,7,3} (30) → {4,7,3,6} (33) → L = {4,3,6} (f(L) = 34); no improving operation remains. Output L or L̄.

Local Search Algorithm:
◮ Start with S = {v}, where v is the singleton of maximum value.
◮ While there is an improving local operation: perform a local operation such that f(S′) > (1 + ε/n²) f(S).
◮ Return the better of f(S) and f(S̄).

Theorem (Feige, M., Vondrak)
The Local Search Algorithm returns at least (1/3 − ε/n) OPT; for symmetric functions, at least (1/2 − ε/n) OPT (tight analysis).
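The algorithm above fits in a few lines of Python. This is an illustrative sketch under my own simplifications (the random cut instance and the ε default are made up); it runs the add/remove local search and returns the better of the local optimum and its complement:

```python
import random

def local_search(f, X, eps=0.01):
    """Deterministic local search (sketch): start from the best singleton,
    apply significantly-improving add/remove moves, then return the better
    of the local optimum S and its complement X \\ S."""
    n = max(len(X), 1)
    S = {max(X, key=lambda v: f({v}))}          # best singleton
    improved = True
    while improved:
        improved = False
        for v in X:
            S2 = S - {v} if v in S else S | {v}
            if f(S2) > (1 + eps / n**2) * f(S):  # significant improvement
                S, improved = S2, True
                break
    comp = X - S
    return S if f(S) >= f(comp) else comp

# Example: a cut function on a small random graph (non-monotone submodular).
random.seed(0)
V = set(range(8))
E = [(u, v) for u in V for v in V if u < v and random.random() < 0.5]
cut = lambda S: sum(1 for u, v in E if (u in S) != (v in S))
S = local_search(cut, V)
assert cut(S) >= max(cut({v}) for v in V)  # at least as good as any singleton
```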
A Structure Lemma
Lemma
For a locally optimal solution L, any subset C ⊂ L, and any superset C ⊃ L: f(L) ≥ f(C).
If T0 ⊂ T1 ⊂ · · · ⊂ Tk = L ⊂ Tk+1 ⊂ · · · ⊂ Tn, then
f(T0) ≤ f(T1) ≤ · · · ≤ f(Tk−1) ≤ f(L) ≥ f(Tk+1) ≥ · · · ≥ f(Tn).
Proof: Base of induction: f(Tk−1) ≤ f(L) ≥ f(Tk+1). If Ti+1 \ Ti = {ai}, then by submodularity and local optimality of L:
f(Ti+1) − f(Ti) ≥ f(L) − f(L \ {ai}) ≥ 0 ⇒ f(Ti+1) ≥ f(Ti).
Therefore f(T0) ≤ f(T1) ≤ f(T2) ≤ · · · ≤ f(L); the superset side of the chain is symmetric.
Proof of the Theorem via the Structure Lemma
Theorem (Feige, M., Vondrak)
The Local Search Algorithm returns at least (1/3 − ε/n) OPT.
Proof: Find a local optimum L and return either L or L̄. Consider the actual optimum C. By the structure lemma,
◮ f(L) ≥ f(C ∩ L)
◮ f(L) ≥ f(C ∪ L)
Hence, again by submodularity,
2f(L) + f(L̄) ≥ f(C ∩ L) + f(C ∪ L) + f(L̄) ≥ f(C ∩ L) + f(C ∩ L̄) + f(X) ≥ f(C) + f(∅) ≥ OPT.
Consequently, either f(L) or f(L̄) must be at least (1/3) OPT.
Random Subsets
A(p) is a random subset of A: each element is picked independently with probability p.

Sampling Lemma
E[f(A(p))] ≥ p · f(A) + (1 − p) · f(∅).
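The sampling lemma can be verified exactly on a toy submodular function by enumerating the subsets of A. The small cut instance below is my own example, not from the talk:

```python
import itertools

# Check E[f(A(p))] >= p*f(A) + (1-p)*f(∅) for a small cut function
# (a made-up instance; cut functions are submodular, so the lemma applies).
E_edges = [(0, 1), (1, 2), (2, 3), (0, 3), (0, 2)]
f = lambda S: sum(1 for u, v in E_edges if (u in S) != (v in S))

A, p = {0, 1}, 0.5
# Exact expectation: subset B of A appears with probability p^|B| (1-p)^(|A|-|B|).
exp = sum(f(set(B)) * p**len(B) * (1 - p)**(len(A) - len(B))
          for r in range(len(A) + 1) for B in itertools.combinations(A, r))
assert exp >= p * f(A) + (1 - p) * f(set())
```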
The Smooth Local Search Algorithm
◮ For any set A, let R_A = A(2/3) ∪ Ā(1/3): a random set with some bias toward picking elements from A.
◮ Let Φ(A) = E[f(R_A)], a smoothed variant of f(A).
Algorithm:
◮ Perform local search with respect to Φ(A).
◮ When a local optimum A is found, return R_A or Ā.

Theorem (Feige, M., Vondrak)
The Smooth Local Search algorithm returns at least 0.40 OPT.
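The biased sampling step can be sketched in code. This is an illustrative Monte Carlo sketch, not the talk's implementation: `sample_R` draws R_A = A(2/3) ∪ Ā(1/3), and `phi` estimates the smoothed potential Φ(A) = E[f(R_A)] by sampling. The toy cut instance at the end is made up:

```python
import random

def sample_R(A, X, rng):
    """Draw R_A = A(2/3) ∪ Ā(1/3): keep each element of A with prob 2/3
    and each element outside A with prob 1/3."""
    return {v for v in X if rng.random() < (2/3 if v in A else 1/3)}

def phi(f, A, X, trials=2000, seed=0):
    """Monte Carlo estimate of the smoothed potential Φ(A) = E[f(R_A)]."""
    rng = random.Random(seed)
    return sum(f(sample_R(A, X, rng)) for _ in range(trials)) / trials

# Toy usage on a small cut function (hypothetical instance).
E2 = [(0, 1), (2, 3), (0, 2)]
cut2 = lambda S: sum(1 for u, v in E2 if (u in S) != (v in S))
est = phi(cut2, {0, 2}, set(range(4)))
assert 0 <= est <= len(E2)   # a cut value never exceeds the number of edges
```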
Proof Sketch of the Algorithm
Analysis: more complicated, using the sampling lemma for three sets. Let
◮ A = the local optimum found by our algorithm,
◮ R = A(2/3) ∪ Ā(1/3), our random set,
◮ C = the optimum.
Main claims:
1. E[f(R)] ≥ E[f(R ∪ (A ∩ C))].
2. E[f(R)] ≥ E[f(R ∩ (A ∪ C))].
3. (9/20) · E[f(R ∪ (A ∩ C))] + (9/20) · E[f(R ∩ (A ∪ C))] + (1/10) · f(Ā) ≥ 0.4 OPT.
Proof Idea for the Hardness Result
Goal: find two functions f and g that look identical to a typical query with high probability, while max g(S) ≈ 2 · max f(S).
Consider two functions f1, g1 : [0,1]² → R+:
1. f1(x, y) = (x + y)(2 − x − y) ("complete graph cut").
2. g1(x, y) = 2x(1 − y) + 2(1 − x)y ("bipartite graph cut").
Observe: f1(x, x) = g1(x, x), so modify g1(x, y) to g2(x, y) such that f1(x, y) = g2(x, y) whenever |x − y| < ε.
Mapping to set functions: let X = X1 ∪ X2, and take
f(S) = f1(|S ∩ X1|/|X1|, |S ∩ X2|/|X2|) and g(S) = g2(|S ∩ X1|/|X1|, |S ∩ X2|/|X2|).
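The two gadget functions can be checked numerically: they agree on the diagonal, where a typical (balanced) query lands, while their maxima differ by a factor of two. A quick grid check:

```python
# f1(x,y) = (x+y)(2-x-y)        ("complete graph cut")
# g1(x,y) = 2x(1-y) + 2(1-x)y   ("bipartite graph cut")
f1 = lambda x, y: (x + y) * (2 - x - y)
g1 = lambda x, y: 2 * x * (1 - y) + 2 * (1 - x) * y

# They agree on the diagonal: both equal 4x(1-x).
for x in (0.0, 0.25, 0.5, 1.0):
    assert abs(f1(x, x) - g1(x, x)) < 1e-12

# But their maxima differ by a factor of ~2.
grid = [i / 100 for i in range(101)]
max_f = max(f1(x, y) for x in grid for y in grid)   # 1, on the line x + y = 1
max_g = max(g1(x, y) for x in grid for y in grid)   # 2, at (1,0) and (0,1)
assert abs(max_f - 1.0) < 1e-9 and abs(max_g - 2.0) < 1e-9
```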
Similar Technique for Combinatorial Auctions
◮ Using a similar technique, we can show information-theoretic lower bounds for combinatorial auctions.
◮ Setting: buyers i receive disjoint sets S_i of items; buyer i has valuation f_i(S_i).
◮ Goal: partition the items to maximize social welfare, i.e., Σ_i f_i(S_i).

Theorem (M., Schapira, Vondrak, EC '08)
Achieving a factor better than 1 − 1/e needs an exponential number of value queries.
Extra Constraints
◮ Cardinality constraints: |S| ≤ k.
◮ Knapsack constraints: Σ_{i∈S} w_i ≤ C.
◮ Matroid constraints.
◮ Known results for monotone submodular functions:
  ◮ One matroid or cardinality constraint: 1 − 1/e (NWF '78, Vondrak '08).
  ◮ k knapsack constraints: 1 − 1/e (Sviridenko '01, KST '09).
  ◮ k matroid constraints: 1/(k + 1) (NWF '78).
◮ Maximizing non-monotone submodular functions (Lee, M., Nagarajan, Sviridenko, STOC 2009):
  ◮ One matroid or cardinality constraint: 1/4.
  ◮ k knapsack constraints: 1/5.
  ◮ k matroid constraints: 1/(k + 2 + 1/k).
  ◮ k partition matroid constraints: 1/(k + 1 + 1/(k − 1)).
Extra Constraints: Algorithms
Lee, M., Nagarajan, Sviridenko, STOC '09
◮ Local search algorithms.
◮ Cardinality constraints: 1/4.
  ◮ Local search with add, remove, and swap operations.
◮ k knapsack or budget constraints: 1/5.
  1. Solve a fractional variant on the small elements.
  2. Round the fractional solution for the small elements.
  3. Output the better of the solutions for small elements and for large elements.
◮ k matroid constraints: 1/(k + 2 + 1/k).
  ◮ Local search: delete operation and more complicated exchange operations.
Approximating Everywhere
◮ Can we learn a submodular function with a polynomial number of value queries?
◮ After a polynomial number of queries, can we construct an oracle f′ that approximates f?
◮ Goal: approximate f by f′ everywhere.

Theorem (Goemans, Iwata, Harvey, M., SODA '09)
Achieving a factor better than √n needs an exponential number of value queries, even for rank functions of matroids.

Theorem (Goemans, Iwata, Harvey, M., SODA '09)
After a polynomial number of value queries to a monotone submodular function, we can approximate the function everywhere within O(√n · log n).
Outline
◮ Submodularity.
◮ Maximizing non-monotone Submodular Functions.
◮ Application 1: Marketing over Social Networks.
◮ Application 2: Guaranteed Banner Ad Allocation.
Application 1: Marketing over Social Networks
◮ Online Social Networks: MySpace, Facebook.
◮ Viral Marketing: Word-of-Mouth Advertising.
◮ Users influence each others' valuations on a social network.
◮ Especially for networked goods, like Zune, Sprint, or Verizon.
◮ Model the influence among users by submodular (or concave) functions: f_i(S) = g(|N_i ∩ S|).
◮ Example: a buyer's valuation sequence ($100, $110, $115, $118) rises as more neighbors adopt the item.
Marketing Model
◮ Given: a prior (probability distribution) P_i(S) on the valuation of user i, given that a set S of users has already adopted the item.
◮ Goal: design a marketing policy to maximize the expected revenue.
◮ Marketing policy: visit buyers one by one and offer each a price.
  ◮ Ordering of buyers.
  ◮ Pricing at each step.
◮ Optimal (myopic) pricing: the optimal price to maximize revenue at each step (ignoring future influence).
◮ Let f_i(S) be the optimal revenue from buyer i using the optimal (myopic) price, given that a set S of buyers has bought the item. We call this function f_i the influence function.
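For a discrete prior, optimal myopic pricing reduces to choosing the support point p that maximizes p · Pr[value ≥ p]. The sketch below is a minimal illustration with a hypothetical two-point prior (the 60/70 numbers echo the example that follows); it is not the paper's pricing routine:

```python
def myopic_price(values, probs):
    """Optimal myopic price for a buyer with a discrete valuation prior:
    maximize p * Pr[value >= p]. An optimal price always lies in the support.
    The prior here is a hypothetical stand-in for P_i(S)."""
    best_p, best_rev = None, -1.0
    for p in values:
        rev = p * sum(q for v, q in zip(values, probs) if v >= p)
        if rev > best_rev:
            best_p, best_rev = p, rev
    return best_p, best_rev

# Buyer's valuation is 60 or 70 with equal probability.
price, revenue = myopic_price([60, 70], [0.5, 0.5])
assert (price, revenue) == (60, 60.0)   # 60 * 1.0 = 60 beats 70 * 0.5 = 35
```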
Marketing Model: Example
◮ Marketing policy: in what order, and at what price, do we offer a digital good to buyers?
◮ Each buyer has a monotone submodular influence function; the four buyers' valuation sequences are (60, 70), (100, 105), (100, 110, 115, 118), and (50, 60, 65).
◮ Example run: offer 100, then 70, then 65, then 118; each buyer's valuation has risen as earlier buyers adopted the item.
Marketing Model
Marketing policy: in what order, and at what price, do we offer a digital good to buyers?
◮ The problem of designing the optimal marketing policy is NP-hard.
◮ Hartline, M., and Sundararajan [HMS] (WWW).
Influence & Exploit Strategies
Influence & Exploit strategies:
1. Influence: give the item for free to a set A of influential buyers.
2. Exploit: apply the following strategy to the rest:
  ◮ Optimal myopic pricing.
  ◮ Random order.
Optimal (myopic) pricing: at each step, offer the price that maximizes the current expected revenue (ignoring future influence).
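An Influence & Exploit run can be simulated along these lines. This is a simplified sketch under my own assumptions: `price_fn` is a hypothetical callback returning a myopic price and acceptance probability given the current adopters, and the toy example at the end uses deterministic valuations:

```python
import random

def influence_and_exploit(buyers, A, price_fn, trials=1000, seed=0):
    """Simplified Influence-and-Exploit simulation: give the item free to A,
    then visit the rest in random order, offering each buyer the myopic price
    price_fn(i, owners); the buyer accepts with the returned probability."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        owners = set(A)                  # influence step: free items
        rest = [i for i in buyers if i not in A]
        rng.shuffle(rest)                # exploit step: random order
        revenue = 0.0
        for i in rest:
            price, accept_prob = price_fn(i, owners)
            if rng.random() < accept_prob:
                owners.add(i)
                revenue += price
        total += revenue
    return total / trials

# Toy usage: a star network where buyer 0 influences buyers 1 and 2.
neighbors = {0: {1, 2}, 1: {0}, 2: {0}}
def toy_price(i, owners):
    # hypothetical deterministic buyer: value 1 + 0.5 per adopting neighbor,
    # offered exactly that value, so the offer is always accepted
    return 1.0 + 0.5 * len(owners & neighbors[i]), 1.0

rev = influence_and_exploit([0, 1, 2], {0}, toy_price)
assert abs(rev - 3.0) < 1e-9   # each of buyers 1, 2 pays 1.5 after 0 adopts
```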
Q1: How good are Influence & Exploit strategies?
Theorem (Hartline, M., Sundararajan)
Given monotone submodular influence functions, Influence & Exploit strategies give constant-factor approximations to the optimal revenue.
Constant factor:
◮ 0.25 for general distributions,
◮ 0.306 for distributions satisfying a monotone hazard-rate condition,
◮ 0.33 for additive settings and uniform distributions,
◮ 0.66 for undirected graphs and uniform distributions,
◮ 0.94 for complete undirected graphs and uniform distributions.
Q2: How to find influential users?
◮ Question 2: how do we choose a set A of influential users who receive the item for free, so as to maximize the revenue?
◮ Let g(A) be the expected revenue if the initial set is A.
◮ For some small sets A, g(∅) ≤ g(A).
◮ If we give the item for free to all users, g(X) = 0.

Theorem (Hartline, M., Sundararajan)
Given monotone submodular influence functions, the revenue function g is a non-monotone non-negative submodular function.

◮ Thus, to find a set A that maximizes the revenue g(A), we can use the 0.4-approximation local search algorithm of Feige, M., Vondrak.
Future Directions
◮ Marketing over Social Networks:
  ◮ Avoiding price discrimination (fixed price).
  ◮ Iterative pricing with positive network externalities (Akhlaghpour, Ghodsi, Haghpanah, Mahini, M., Nikzad).
◮ Cascading effects and influence propagation:
  ◮ Revenue maximization for fixed-price marketing with influence propagation (M., Sundararajan, Roch).
◮ Learning vs. Marketing.
Thank You
Outline
◮ Submodularity.
◮ Maximizing non-monotone Submodular Functions.
◮ Application 1: Marketing over Social Networks.
◮ Application 2: Guaranteed Banner Ad Allocation.
Application 2: Guaranteed Banner Advertisement
◮ Online Advertisement: $20 billion annual revenue!
◮ Banner Advertisement: 22% of the current revenue.
Guaranteed Banner Advertisement:
◮ Each advertiser i
◮ is interested in a set Si of impressions (e.g., young women in Seattle),
◮ bids bi for each impression,
◮ needs di impressions,
◮ pays a penalty αbi for each unsatisfied unit (Guaranteed Delivery).
[Figure: bipartite graph between advertisers (Si, bi, di) and impressions; advertiser i has demand di = 4.]
Goal: Choose a set T of advertisers to maximize the revenue f(T).
If we give q items to advertiser i, we get qbi − αbi(di − q) = q(1 + α)bi − diαbi.
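The regrouping above is pure algebra; a quick check with arbitrary made-up numbers (q = 3, b = 2, d = 5, α = 0.5):

```python
def payoff(q, b, d, alpha):
    # revenue for q delivered impressions, minus penalty for the d - q unmet units
    return q * b - alpha * b * (d - q)

def payoff_regrouped(q, b, d, alpha):
    # the same quantity regrouped as q(1 + alpha)b - d*alpha*b
    return q * (1 + alpha) * b - d * alpha * b

assert payoff(3, 2.0, 5, 0.5) == payoff_regrouped(3, 2.0, 5, 0.5) == 4.0
```

The regrouped form is useful because the second term, diαbi, depends only on committing to advertiser i, not on how many impressions are actually delivered; it becomes the fixed cost ci on the next slide.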
Guaranteed Banner Advertisement: Submodularity
◮ If we give q items to advertiser i, we get qbi − αbi(di − q) = q(1 + α)bi − diαbi = q(1 + α)bi − ci.
◮ If we commit to a set T of advertisers: f(T) = P(T) − Σi∈T ci = P(T) − C(T).
◮ P(T) is the value of a maximum-weight matching in the bipartite graph between the advertisers in T and the impressions, where each edge incident to advertiser i has weight (1 + α)bi.
[Figure: bipartite graph between the committed set T of advertisers and the impressions.]
◮ P is submodular ⇒ f is submodular, but f can be negative.
◮ In fact, we prove that the problem is not approximable within any constant factor, for any fixed α.
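A small numeric sanity check of these claims (the instance is made up: two compatible advertisers plus one overcommitted one, three impressions): P(T) is computed by brute-force matching, f(T) = P(T) − C(T) is submodular on this instance, and f goes negative when penalties dominate.

```python
alpha = 0.5
# hypothetical advertisers: interested impressions S, per-impression bid b, demand d
ads = {
    "i": {"S": {0, 1}, "b": 3.0, "d": 2},
    "j": {"S": {1},    "b": 2.0, "d": 1},
    "k": {"S": {2},    "b": 1.0, "d": 5},  # demands far more than is available
}

def P(T):
    """Brute-force maximum-weight assignment of impressions to advertisers
    in T: advertiser a takes at most d_a impressions, all from S_a, and each
    assigned impression contributes weight (1 + alpha) * b_a."""
    imps = sorted({j for a in T for j in ads[a]["S"]})
    quota = {a: ads[a]["d"] for a in T}
    def rec(idx):
        if idx == len(imps):
            return 0.0
        best = rec(idx + 1)  # option: leave this impression unassigned
        for a in T:
            if imps[idx] in ads[a]["S"] and quota[a] > 0:
                quota[a] -= 1
                best = max(best, (1 + alpha) * ads[a]["b"] + rec(idx + 1))
                quota[a] += 1
        return best
    return rec(0)

def f(T):
    # committed profit minus the fixed costs c_a = alpha * b_a * d_a
    return P(T) - sum(alpha * ads[a]["b"] * ads[a]["d"] for a in T)

# submodularity: f({i}) + f({j}) = 6 + 2 >= f({i, j}) + f({}) = 5 + 0
assert f({"i"}) + f({"j"}) >= f({"i", "j"}) + f(set())
assert f({"k"}) < 0   # P = 1.5 but cost = 2.5: commitment loses money
```

Advertiser k illustrates why multiplicative approximation guarantees fail: the optimum may be close to zero (or the only feasible commitments negative), so any multiplicative factor is meaningless, motivating the structural approximation on the next slide.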
Structural Approximation
Feige, Immorlica, M., Nazerzadeh [FIMN]: many natural submodular functions have the form
(submodular profit function) − (additive cost function) = P(S) − C(S).
Examples:
◮ Maximum Facility Location Problem.
◮ Segmentation Problems.
◮ Guaranteed Banner Ad Problem.
These objectives can be negative, and are not approximable within any multiplicative approximation factor.
Structural Approximation:
◮ The approximation factor is a function of the structure of the solution.
We design greedy and linear-programming-based tight structural approximation algorithms for the above problems.
Guaranteed Banner Ad Problem
Greedy Algorithm:
1. S = ∅.
2. At each step, add an advertiser i ∈ X \ S that maximizes
(P(S ∪ {i}) − P(S) − ci) / (P(S ∪ {i}) − P(S)),
as long as P(S ∪ {i}) − P(S) − ci > 0.
LP-based Algorithm:
◮ Solve a configurational LP and round it, with a variable x_i^Q for each advertiser i ∈ A and each set Q ⊆ Si of impressions:
max Σ_{i∈A, Q⊆Si} x_i^Q · dibi((1 + α)(|Q|/di − ln(|Q|/di)) − α)
s.t. Σ_{i∈A, Q⊆Si: j∈Q} x_i^Q ≤ 1 ∀j ∈ U
Σ_{Q⊆Si} x_i^Q ≤ 1 ∀i ∈ A
x_i^Q ≥ 0 ∀i ∈ A, ∀Q ⊆ Si
Theorem (Feige, Immorlica, M., Nazerzadeh)
The greedy algorithm and the LP-based algorithm achieve tight structural approximations for (capacitated) maximum facility location, segmentation problems, and the guaranteed banner advertisement problem.
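The greedy algorithm above can be sketched as follows; the coverage function and costs in the toy instance are illustrative stand-ins, not the paper's setting.

```python
def greedy(X, P, cost):
    """Greedily build S: at each step add the element i maximizing the ratio
    (P(S + i) - P(S) - c_i) / (P(S + i) - P(S)), i.e. the fraction of the
    marginal profit kept after paying c_i, while the net gain is positive."""
    S = set()
    while True:
        best, best_ratio = None, float("-inf")
        for i in set(X) - S:
            gain = P(S | {i}) - P(S)
            if gain - cost[i] > 0:        # only consider positive net marginal profit
                ratio = (gain - cost[i]) / gain
                if ratio > best_ratio:
                    best, best_ratio = i, ratio
        if best is None:                  # no profitable addition remains
            return S
        S.add(best)

# Toy instance: P is a coverage function (monotone submodular).
covers = {"a": {1, 2}, "b": {2, 3}}
P = lambda S: len({j for i in S for j in covers[i]})
cost = {"a": 0.5, "b": 3.0}
S = greedy(covers, P, cost)
assert S == {"a"}   # adding b after a would gain 1 in profit but cost 3.0
```

Maximizing the ratio rather than the raw net gain is the point of the structural analysis: it favors elements whose profit is not mostly eaten by their cost, which is what makes the guarantee degrade gracefully with the cost-to-profit structure of the optimum.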
Future Directions
◮ Guaranteed Banner Ad Allocation
◮ Uncertainty in Supply (impressions).
◮ Uncertainty in Demand (advertisers).
◮ Online Allocation Mechanisms.
◮ Truthful Mechanisms.
Algorithmic & Economic Aspects of the Internet
◮ Internet Monetization (Stochastic Optimization, Linear Programming): Marketing over Social Networks; Guaranteed Banner Advertisement.
◮ Search & Large Networks (Refined Random Walks, Local Clustering): PageRank Contributions & Link Spam Detection; Trust-based Recommendation Systems; Overlapping Clustering for Distributed Computation.
◮ Algorithmic Game Theory (Auctions, Mechanism Design).
Algorithms for Search and Large Networks
◮ Local Computation of PageRank Contributions & Link Spam Detection.
◮ Axiomatic Approach to Trust-Based Recommendation Systems.
◮ Overlapping Clustering for Distributed Computation.