Submodular Maximization Applied to Marketing Over Social Networks (PowerPoint Presentation)



slide-1
SLIDE 1

Submodular Maximization applied to Marketing Over Social Networks

Vahab Mirrokni Google Research, NYC


slide-5
SLIDE 5

Marketing over Social Networks

◮ Online Social Networks: MySpace, Facebook.
◮ Monetizing Social Networks.
◮ Viral Marketing: Word-of-Mouth Advertising.
◮ Users influence each others’ valuations on a social network.
◮ Marketing policy: In what order, and at what price, do we offer an item to buyers?

slide-9
SLIDE 9

Application 2: Guaranteed Banner Advertisement

◮ Online Advertisement: $20 billion annual revenue!
◮ Banner Advertisement: 22% of the current revenue.

[Figure: bipartite graph of advertisers (with demands b_i, b_j) matched to impressions.]

◮ Guaranteed Delivery: We pay a penalty for not meeting the demand of each advertiser.
◮ Problem: Which set of advertisers should we commit to?

slide-11
SLIDE 11

Marketing Over Social Networks · Guaranteed Banner Ad Allocation · Revenue Maximization

↓

Submodular Maximization

slide-12
SLIDE 12

Outline

◮ Submodularity: Definitions, Applications.
◮ Maximizing Non-monotone Submodular Functions
  ◮ Approximation Algorithms.
  ◮ Hardness Results.
◮ Application 1: Marketing over Social Networks.
◮ Application 2: Guaranteed Banner Ad Allocation.


slide-16
SLIDE 16

Submodular Functions

Model the law of diminishing returns and/or economies of scale. Generalize concave functions to set functions: f is defined on subsets of X.

Definition

A set function f : 2^X → R is submodular iff for any S, T:
  f(S) + f(T) ≥ f(S ∪ T) + f(S ∩ T).

Decreasing marginal value: f is submodular iff for all S ⊂ T and j ∉ T:
  f(S ∪ {j}) − f(S) ≥ f(T ∪ {j}) − f(T).
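Both equivalent conditions in the definition can be verified by brute force on a tiny ground set; the sketch below is illustrative only (it enumerates every subset, so it is exponential in |X|):

```python
from itertools import chain, combinations

def subsets(xs):
    """All subsets of xs as frozensets."""
    return [frozenset(c) for c in
            chain.from_iterable(combinations(xs, r) for r in range(len(xs) + 1))]

def is_submodular(f, ground):
    """Check f(S) + f(T) >= f(S | T) + f(S & T) for all pairs S, T."""
    subs = subsets(ground)
    return all(f(S) + f(T) >= f(S | T) + f(S & T) - 1e-12
               for S in subs for T in subs)

# g(|S|) for concave g is submodular; |S|^2 (convex in |S|) is not.
ground = frozenset({1, 2, 3, 4})
print(is_submodular(lambda S: len(S) ** 0.5, ground))  # True
print(is_submodular(lambda S: len(S) ** 2, ground))    # False
```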


slide-18
SLIDE 18

Examples of Submodular Functions

◮ Simple Examples: Additive functions, Concave functions.
  ◮ Additive function: f(S) = Σ_{j∈S} c_j.
  ◮ Concave function: f(S) = g(|S|) for a concave function g.
◮ Set coverage: f(S) = |∪_{j∈S} A_j|.

[Figure: sets A1, …, A6; for S = {2, 3, 6}, f(S) = |A2 ∪ A3 ∪ A6|.]
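The coverage example can be written out directly; the contents of A1–A6 below are hypothetical, since the slide only names the sets:

```python
# Coverage function f(S) = |union of A_j for j in S| from the slide.
A = {
    1: {1, 2}, 2: {2, 3, 4}, 3: {4, 5},
    4: {5, 6}, 5: {6, 7}, 6: {7, 8, 1},
}  # hypothetical contents; the slide only names A1..A6

def coverage(S):
    covered = set()
    for j in S:
        covered |= A[j]
    return len(covered)

print(coverage({2, 3, 6}))  # |A2 ∪ A3 ∪ A6| = 7 with the sets above
```

Adding a set to a larger collection covers fewer new elements, which is exactly the decreasing-marginal-value condition.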

slide-19
SLIDE 19

Examples of Submodular Functions

◮ Simple Examples: Additive functions, Concave functions.
  ◮ Additive function: f(S) = Σ_{j∈S} c_j.
  ◮ Concave function: f(S) = g(|S|) for a concave function g.
◮ Set coverage: f(S) = |∪_{j∈S} A_j|.
◮ Cuts in graphs and hypergraphs: f(S) = e(S, S̄); for directed graphs, g(S) = e(S → S̄).
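A cut function is easy to implement from an edge list; the triangle graph below is illustrative:

```python
def cut_value(edges, S):
    """f(S) = e(S, complement): edges with exactly one endpoint in S."""
    S = set(S)
    return sum(1 for u, v in edges if (u in S) != (v in S))

# Triangle graph: every single vertex cuts 2 edges.
edges = [(1, 2), (2, 3), (1, 3)]
print(cut_value(edges, {1}))      # 2
print(cut_value(edges, {1, 2}))   # 2
print(cut_value(edges, set()))    # 0 — f(∅) = f(V) = 0, so cuts are non-monotone
```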


slide-21
SLIDE 21

Examples of Submodular Functions

◮ Simple Examples: Additive functions, Concave functions.
  ◮ Additive function: f(S) = Σ_{j∈S} c_j.
  ◮ Concave function: f(S) = g(|S|) for a concave function g.
◮ Set coverage: f(S) = |∪_{j∈S} A_j|.
◮ Cuts in graphs and hypergraphs: f(S) = e(S, S̄).
◮ Other places:
  ◮ Maximum facility location.
  ◮ Utility functions in economics, rank functions of matroids.

Two categories:

◮ Monotone functions, i.e., f(S) ≤ f(T) for S ⊂ T.
◮ Non-monotone functions: cut functions, maximum facility location.


slide-25
SLIDE 25

Submodular Maximization

Given: a value oracle for a submodular f : 2^X → R (query a set S, receive f(S)).

Goal: Maximize f(S) over all sets S ⊆ X, using a small number of queries.

◮ Min-Cut is poly-time solvable.
◮ Submodular function minimization is poly-time solvable [Schrijver; Fleischer-Fujishige-Iwata 2001].
◮ Max-Cut is NP-hard.
◮ α-approximation: output is at least α times OPT.
◮ For most of this talk, assume f : 2^X → R+ is non-negative.


slide-27
SLIDE 27

Applications of Submodular Maximization

◮ Maximizing monotone submodular functions (with a cardinality constraint):
  ◮ Maximum set coverage.
  ◮ Maximizing influence in social networks (Kempe, Kleinberg, Tardos, KDD).
  ◮ Optimal sensor installation for outbreak detection (LKGFVG, KDD).
◮ Maximizing non-monotone submodular functions:
  ◮ Maximum facility location.
  ◮ Segmentation problems (Kleinberg, Papadimitriou, Raghavan, JACM).
  ◮ Least core value of general supermodular cost games (Schulz, Uhan, APPROX).
  ◮ Optimal marketing over social networks (Hartline, M., Sundararajan, WWW).
  ◮ Revenue maximization for banner advertisement (Feige, Immorlica, M., Nazerzadeh, WWW).

slide-28
SLIDE 28

Outline

◮ Submodularity.
◮ Maximizing Non-monotone Submodular Functions
  ◮ Approximation Algorithms.
  ◮ Hardness Results.
◮ Application 1: Marketing over Social Networks.
◮ Application 2: Guaranteed Banner Ad Allocation.


slide-31
SLIDE 31

Related work

◮ Maximizing monotone submodular functions with a cardinality constraint (|S| ≤ k): Greedy gives a (1 − 1/e)-approximation [Nemhauser-Wolsey-Fisher ’78, Feige ’98].
◮ Maximizing non-monotone submodular functions has been studied in OR [Lee-Nemhauser-Wang ’96, Goldengorin-Tijssen-Tso ’99]; no guaranteed approximation factor was known.
◮ For special cases, approximation algorithms are known:
  ◮ 0.878-approx for Max-Cut [Goemans-Williamson ’95].
  ◮ 0.874-approx for Max-Di-Cut [Livnat-Lewin-Zwick ’02].
  ◮ (1 − 1/2^{k−1})-approx for Max-Cut in k-uniform hypergraphs.
  ◮ NP-hard to improve the 7/8 approximation for k = 4 [Håstad ’01].


slide-37
SLIDE 37

Our Results: Non-negative submodular functions

Feige, M., Vondrak (FOCS ’07)

1. Approximation algorithms for maximizing non-monotone non-negative submodular functions. Examples: cut functions, marketing over social networks, core value of supermodular games.
  ◮ 0.33-approximation (deterministic local search).
  ◮ 0.40-approximation (randomized "smooth local search").
  ◮ 0.50-approximation for symmetric functions.
2. Hardness results:
  ◮ For any fixed ǫ > 0, a (1/2 + ǫ)-approximation would require exponentially many queries.
  ◮ Submodular functions with succinct representation: NP-hard to achieve a (3/4 + ǫ)-approximation.

slide-38
SLIDE 38

Submodular Maximization: Local Search

Local Operation:

◮ Add v: S′ = S ∪ {v}. ◮ Remove v: S′ = S\{v}.

[Example with ground set {1, …, 12}: start with S = {4}, f(S) = 10.]

slide-39
SLIDE 39

Submodular Maximization: Local Search

Local Operation:

◮ Add v: S′ = S ∪ {v}. ◮ Remove v: S′ = S\{v}.

Improving local operation if f (S′) > f (S).

[After adding 5: S = {4, 5}, f(S) = 17.]

slide-40
SLIDE 40

Submodular Maximization: Local Search

Local Operation:

◮ Add v: S′ = S ∪ {v}. ◮ Remove v: S′ = S\{v}.

Improving local operation if f (S′) > f (S).

[After adding 7: S = {4, 5, 7}, f(S) = 23.]

slide-41
SLIDE 41

Submodular Maximization: Local Search

Local Operation:

◮ Add v: S′ = S ∪ {v}. ◮ Remove v: S′ = S\{v}.

Improving local operation if f (S′) > f (S).

[After adding 3: S = {4, 5, 7, 3}, f(S) = 27.]

slide-42
SLIDE 42

Submodular Maximization: Local Search

Local Operation:

◮ Add v: S′ = S ∪ {v}. ◮ Remove v: S′ = S\{v}.

Improving local operation if f (S′) > f (S).

[After removing 5: S = {4, 7, 3}, f(S) = 30.]

slide-43
SLIDE 43

Submodular Maximization: Local Search

Local Operation:

◮ Add v: S′ = S ∪ {v}. ◮ Remove v: S′ = S\{v}.

Improving local operation if f (S′) > f (S).

[After adding 6: S = {4, 7, 3, 6}, f(S) = 33.]

slide-44
SLIDE 44

Submodular Maximization: Local Search

Local Operation:

◮ Add v: S′ = S ∪ {v}. ◮ Remove v: S′ = S\{v}.

Improving local operation if f (S′) > f (S).

[After removing 7: S = {4, 3, 6} is a local optimum L, with f(L) = 34.]

Output L or L̄.

slide-45
SLIDE 45

Submodular Maximization: Local Search

Local Operation:

◮ Add v: S′ = S ∪ {v}. ◮ Remove v: S′ = S\{v}.

Improving local operation if f (S′) > f (S).

Local Search Algorithm:

◮ Start with S = {v}, where v is the singleton of maximum value.
◮ While there is an improving local operation:
  1. Perform a local operation such that f(S′) > (1 + ǫ/n²) f(S).
◮ Return the better of f(S) and f(S̄).

Theorem (Feige, M., Vondrak)

The Local Search Algorithm returns at least (1/3 − ǫ/n)·OPT; for symmetric functions, at least (1/2 − ǫ/n)·OPT (tight analysis).
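The algorithm above can be sketched directly against a value oracle; the max-cut instance at the end is an illustrative test case:

```python
def local_search(f, X, eps=0.01):
    """FMV deterministic local search: returns the better of a local
    optimum S and its complement X \\ S."""
    n = len(X)
    # Start from the best singleton.
    S = {max(X, key=lambda v: f({v}))}
    threshold = 1 + eps / n ** 2
    improved = True
    while improved:
        improved = False
        for v in X:
            S2 = S - {v} if v in S else S | {v}
            if f(S2) > threshold * f(S):
                S, improved = S2, True
                break
    comp = set(X) - S
    return S if f(S) >= f(comp) else comp

# Example: max-cut on a 4-cycle; local search reaches the optimum 4 here.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
cut = lambda S: sum((u in S) != (v in S) for u, v in edges)
best = local_search(cut, set(range(4)))
print(cut(best))  # 4
```

From any starting singleton on the 4-cycle, the only improving move adds the opposite vertex, which already gives the optimal cut.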


slide-48
SLIDE 48

A Structure Lemma

Lemma

For a locally optimal solution L, any subset C ⊂ L, and any superset C ⊃ L: f(L) ≥ f(C).

If T0 ⊂ T1 ⊂ T2 ⊂ · · · ⊂ Tk = L ⊂ Tk+1 ⊂ · · · ⊂ Tn, then
  f(T0) ≤ f(T1) ≤ · · · ≤ f(Tk−1) ≤ f(L) ≥ f(Tk+1) ≥ · · · ≥ f(Tn).

Proof: Base of the induction: f(Tk−1) ≤ f(L) ≥ f(Tk+1). If Ti+1 \ Ti = {ai} for i < k, then by submodularity and the local optimality of L:
  f(Ti+1) − f(Ti) ≥ f(L) − f(L \ {ai}) ≥ 0 ⇒ f(Ti+1) ≥ f(Ti).
Therefore f(T0) ≤ f(T1) ≤ f(T2) ≤ · · · ≤ f(L); the chain above L is symmetric.


slide-50
SLIDE 50

Proof of the Theorem via the Structure Lemma

Theorem (Feige, M., Vondrak)

The Local Search Algorithm returns at least (1/3 − ǫ/n)·OPT.

Proof: Find a local optimum L and return either L or L̄. Consider the actual optimum C. By the structure lemma,

◮ f(L) ≥ f(C ∩ L)
◮ f(L) ≥ f(C ∪ L)

Hence, again by submodularity,
  2f(L) + f(L̄) ≥ f(C ∩ L) + f(C ∪ L) + f(L̄)
              ≥ f(C ∩ L) + f(C ∩ L̄) + f(X)
              ≥ f(C) + f(∅)
              ≥ OPT.
Consequently, either f(L) or f(L̄) must be at least (1/3)·OPT.
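The 1/3 guarantee can be sanity-checked by brute force on small random cut functions (which are non-negative and submodular), comparing every exact local optimum against OPT; a sketch:

```python
import random
from itertools import chain, combinations

def all_subsets(vs):
    return [set(c) for c in
            chain.from_iterable(combinations(vs, r) for r in range(len(vs) + 1))]

def check_third_bound(n=6, p=0.5, seed=0):
    """Brute force: every local optimum L of a random cut function
    satisfies max(f(L), f(complement)) >= OPT / 3."""
    rng = random.Random(seed)
    V = list(range(n))
    edges = [(u, v) for u, v in combinations(V, 2) if rng.random() < p]
    f = lambda S: sum((u in S) != (v in S) for u, v in edges)
    sets_ = all_subsets(V)
    OPT = max(f(S) for S in sets_)
    for L in sets_:
        if all(f(L ^ {v}) <= f(L) for v in V):      # no add/remove improves L
            if max(f(L), f(set(V) - L)) < OPT / 3:
                return False
    return True

print(check_third_bound())  # True
```

Since cut functions are symmetric, the stronger 1/2 bound actually applies, so the check always passes.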

slide-51
SLIDE 51

Our Results

Feige, M., Vondrak [FMV]

1. Approximation algorithms for maximizing non-negative submodular functions:
  ◮ 0.33-approximation (deterministic local search).
  ◮ 0.40-approximation (randomized "smooth local search").
  ◮ 0.50-approximation for symmetric functions.
2. Hardness results:
  ◮ A (1/2 + ǫ)-approximation would require exponentially many queries.
  ◮ Submodular functions with succinct representation: NP-hard to achieve a (3/4 + ǫ)-approximation.

slide-52
SLIDE 52

Random Subsets

A(p) is a random subset of A: each element is picked independently with probability p.

Sampling lemma

E[f(A(p))] ≥ p·f(A) + (1 − p)·f(∅)
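The sampling lemma can be checked by Monte Carlo simulation; the coverage instance below is illustrative:

```python
import random

rng = random.Random(1)
A_sets = {1: {1, 2, 3}, 2: {3, 4}, 3: {4, 5, 6}}

def f(S):
    """Coverage function: number of elements covered by the chosen sets."""
    covered = set()
    for j in S:
        covered |= A_sets[j]
    return len(covered)

def sample(A, p):
    """A(p): keep each element of A independently with probability p."""
    return {j for j in A if rng.random() < p}

A, p = {1, 2, 3}, 0.5
trials = 20000
est = sum(f(sample(A, p)) for _ in range(trials)) / trials
bound = p * f(A) + (1 - p) * f(set())
print(est >= bound)  # the sampling lemma's guarantee — True
```

Here the exact expectation is 3.5, comfortably above the lemma's bound of 3.0.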

slide-53
SLIDE 53

The Smooth Local Search Algorithm

◮ For any set A, let RA = A(2/3) ∪ A(1/3), a random set.

A RA

RA is a random set with some bias in picking elements from A

Let Φ(A) = E[f (RA)] - a smoothed variant of f (A).

slide-54
SLIDE 54

The Smooth Local Search Algorithm

◮ For any set A, let RA = A(2/3) ∪ A(1/3), a random set.

A RA

RA is a random set with some bias in picking elements from A

Let Φ(A) = E[f (RA)] - a smoothed variant of f (A). Algorithm:

◮ Perform local search with respect to Φ(A). ◮ When a local optimum is found, return RA or A.

slide-55
SLIDE 55

The Smooth Local Search Algorithm

◮ For any set A, let R_A = A(2/3) ∪ Ā(1/3), a random set with some bias toward picking elements from A.
◮ Let Φ(A) = E[f(R_A)], a smoothed variant of f(A).

Algorithm:

◮ Perform local search with respect to Φ(A).
◮ When a local optimum A is found, return R_A or Ā.

Theorem (Feige, M., Vondrak)

The Smooth Local Search algorithm returns at least 0.40·OPT.
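A minimal sketch of smooth local search, assuming R_A keeps each element of A with probability 2/3 and each outside element with probability 1/3, and estimating Φ by Monte Carlo with a fixed seed so the search is deterministic and terminates; the directed-cut instance is illustrative:

```python
import random

def smooth_value(f, X, A, samples=300, seed=42):
    """Monte Carlo estimate of Φ(A) = E[f(R_A)], where R_A keeps each
    element of A w.p. 2/3 and each element of X \\ A w.p. 1/3."""
    rng = random.Random(seed)        # fixed seed => deterministic estimate
    total = 0
    for _ in range(samples):
        R = {v for v in sorted(X) if rng.random() < (2 / 3 if v in A else 1 / 3)}
        total += f(R)
    return total / samples

def smooth_local_search(f, X):
    """Local search on the smoothed objective Φ; returns the better of
    a sampled R_A and the complement of the local optimum A."""
    A, improved = set(), True
    while improved:
        improved = False
        for v in sorted(X):
            A2 = A ^ {v}             # toggle v in or out of A
            if smooth_value(f, X, A2) > smooth_value(f, X, A):
                A, improved = A2, True
                break
    rng = random.Random(7)
    R = {v for v in sorted(X) if rng.random() < (2 / 3 if v in A else 1 / 3)}
    comp = set(X) - A
    return R if f(R) >= f(comp) else comp

# Try it on a small directed-cut instance (illustrative).
arcs = [(0, 1), (0, 2), (3, 1)]
dicut = lambda S: sum(u in S and v not in S for u, v in arcs)
best = smooth_local_search(dicut, {0, 1, 2, 3})
print(dicut(best))
```

Each accepted move strictly increases the deterministic estimate of Φ, so no set is revisited and the loop terminates.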

slide-56
SLIDE 56

Proof Sketch of the Algorithm

Analysis: more complicated, using the sampling lemma for 3 sets. Let

◮ A = local optimum found by our algorithm,
◮ R = A(2/3) ∪ Ā(1/3), our random set,
◮ C = optimum.

Main claims:

1. E[f(R)] ≥ E[f(R ∪ (A ∩ C))].
2. E[f(R)] ≥ E[f(R ∩ (A ∪ C))].
3. (9/20)·E[f(R ∪ (A ∩ C))] + (9/20)·E[f(R ∩ (A ∪ C))] + (1/10)·f(Ā) ≥ 0.4·OPT.

slide-57
SLIDE 57

Our Results

1. Approximation algorithms for maximizing non-monotone submodular functions:
  ◮ 0.33-approximation (deterministic local search).
  ◮ 0.40-approximation (randomized "smooth local search").
  ◮ 0.50-approximation for symmetric functions.
2. Hardness results:
  ◮ It is impossible to improve the factor 1/2: a (1/2 + ǫ)-approximation would require exponentially many queries.
  ◮ Submodular functions with succinct representation: NP-hard to achieve a (3/4 + ǫ)-approximation.


slide-60
SLIDE 60

Proof Idea for the Hardness Result

Goal: Find two functions f and g which look identical to a typical query with high probability, yet max g(S) ≈ 2·max f(S).

Consider two functions f1, g1 : [0, 1]² → R+:

1. f1(x, y) = (x + y)(2 − x − y) ("complete graph cut").
2. g1(x, y) = 2x(1 − y) + 2(1 − x)y ("bipartite graph cut").

Observe: f1(x, x) = g1(x, x) ⇒ modify g1(x, y) to g2(x, y) such that f1(x, y) = g2(x, y) whenever |x − y| < ǫ.

Mapping to set functions: Let X = X1 ∪ X2, and define
  f(S) = f1(|S ∩ X1|/|X1|, |S ∩ X2|/|X2|),
  g(S) = g2(|S ∩ X1|/|X1|, |S ∩ X2|/|X2|).
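The two polynomials can be compared numerically on a grid: they agree exactly on the diagonal (so queries to near-balanced sets cannot distinguish them), while their maxima differ by a factor of 2:

```python
# f1 models a "complete graph cut", g1 a "bipartite graph cut".
f1 = lambda x, y: (x + y) * (2 - x - y)
g1 = lambda x, y: 2 * x * (1 - y) + 2 * (1 - x) * y

grid = [i / 100 for i in range(101)]
max_f = max(f1(x, y) for x in grid for y in grid)   # 1, on the line x + y = 1
max_g = max(g1(x, y) for x in grid for y in grid)   # 2, at (1, 0) and (0, 1)
print(round(max_f, 6), round(max_g, 6))             # 1.0 2.0

# On the diagonal the two functions coincide.
assert all(abs(f1(x, x) - g1(x, x)) < 1e-12 for x in grid)
```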

slide-61
SLIDE 61

Similar Technique for Combinatorial Auctions

◮ Using a similar technique, we can show information-theoretic lower bounds for combinatorial auctions.

[Figure: buyers with valuations f1(S1), …, f6(S6) over their bundles S1, …, S6.]

◮ Goal: Partition the items to maximize social welfare, i.e., Σ_i f_i(S_i).

Theorem (M., Schapira, Vondrak, EC ’08)

Achieving a factor better than 1 − 1/e needs an exponential number of value queries.


slide-64
SLIDE 64

Extra Constraints

◮ Cardinality constraints: |S| ≤ k.
◮ Knapsack constraints: Σ_{i∈S} w_i ≤ C.
◮ Matroid constraints.
◮ Known results for monotone submodular functions:
  ◮ One matroid or a cardinality constraint: 1 − 1/e (NWF78, Vondrak08).
  ◮ k knapsack constraints: 1 − 1/e (Sviridenko01, KST09).
  ◮ k matroid constraints: 1/(k+1) (NWF78).
◮ (Lee, M., Nagarajan, Sviridenko (STOC 2009)) Maximizing non-monotone submodular functions:
  ◮ One matroid or a cardinality constraint: 1/4.
  ◮ k knapsack constraints: 1/5.
  ◮ k matroid constraints: 1/(k + 2 + 1/k).
  ◮ k partition matroid constraints: 1/(k + 1 + 1/(k−1)).


slide-67
SLIDE 67

Extra Constraints: Algorithms

Lee, M., Nagarajan, Sviridenko, STOC ’09

◮ Local search algorithms.
◮ Cardinality constraints: 1/4.
  ◮ Local search with add, remove, and swap operations.
◮ k knapsack or budget constraints: 1/5.
  1. Solve a fractional variant on the small elements.
  2. Round the fractional solution for the small elements.
  3. Output the better of the solutions for small and large elements.
◮ k matroid constraints: 1/(k + 2 + 1/k).
  ◮ Local search: delete operation and more complicated exchange operations.


slide-70
SLIDE 70

Approximating Everywhere

◮ Can we learn a submodular function with a polynomial number of queries?
◮ After a polynomial number of queries, can we construct an oracle that approximates f by some f′?

[Figure: query sets S1, …, Sk to the submodular oracle; build an approximate oracle answering f′(S).]

◮ Goal: Approximate f by f′.

Theorem (Goemans, Harvey, Iwata, M., SODA ’09)

Achieving a factor better than √n needs an exponential number of value queries, even for rank functions of matroids.

Theorem (Goemans, Harvey, Iwata, M., SODA ’09)

After a polynomial number of value queries to a monotone submodular function, we can approximate the function everywhere within O(√n log n).

slide-71
SLIDE 71

Outline

◮ Submodularity. ◮ Maximizing non-monotone Submodular Functions ◮ Application 1: Marketing over Social Networks ◮ Application 2: Guaranteed Banner ad Allocation.


slide-75
SLIDE 75

Marketing over Social Networks

◮ Online Social Networks: MySpace, Facebook.
◮ Monetizing Social Networks.
◮ Viral Marketing: Word-of-Mouth Advertising.
◮ Users influence each others’ valuations on a social network.
◮ Marketing policy: In what order, and at what price, do we offer an item to buyers?


slide-81
SLIDE 81

Application 1: Marketing over Social Networks

◮ Online Social Networks: MySpace, Facebook.
◮ Viral Marketing: Word-of-Mouth Advertising.
◮ Users influence each others’ valuations on a social network.
◮ Especially for networked goods, like Zune, Sprint, or Verizon.
◮ Model the influence among users by submodular (or concave) functions: f_i(S) = g(|N_i ∩ S|).

[Figure: as neighbors A, B, C adopt, a buyer's valuation rises: $100, $110, $115, $118.]
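The influence model f_i(S) = g(|N_i ∩ S|) can be sketched as follows; the neighborhood and the concave schedule g are hypothetical, chosen to reproduce the slide's $100, $110, $115, $118 sequence:

```python
# Buyer i's value grows concavely with how many neighbors already own the item.
neighbors = {'C': {'A', 'B', 'D'}}      # N_C: buyer C's neighbors (hypothetical)

def g(k):
    """Concave gain schedule: diminishing marginal boosts of $10, $5, $3."""
    boosts = [0, 10, 15, 18]
    return 100 + boosts[min(k, len(boosts) - 1)]

def influence(i, S):
    """f_i(S) = g(|N_i ∩ S|): buyer i's valuation after the set S has adopted."""
    return g(len(neighbors[i] & set(S)))

for adopted in [set(), {'A'}, {'A', 'B'}, {'A', 'B', 'D'}]:
    print(influence('C', adopted))      # 100, 110, 115, 118
```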

slide-82
SLIDE 82

Marketing Model

◮ Given: A prior (probability distribution) Pi(S) for the

valuation function for user i given that a set S of users already adopted the item.

◮ Goal: Design a Marketing Policy to maximize the expected

revenue.

slide-83
SLIDE 83

Marketing Model

◮ Given: A prior (probability distribution) Pi(S) for the

valuation function for user i given that a set S of users already adopted the item.

◮ Goal: Design a Marketing Policy to maximize the expected

revenue.

◮ Marketing policy: Visit buyers one by one and offer them a

price.

◮ Ordering of buyers. ◮ Pricing at each step.

slide-84
SLIDE 84

Marketing Model

◮ Given: A prior (probability distribution) Pi(S) for the

valuation function for user i given that a set S of users already adopted the item.

◮ Goal: Design a Marketing Policy to maximize the expected

revenue.

◮ Marketing policy: Visit buyers one by one and offer them a

price.

◮ Ordering of buyers. ◮ Pricing at each step.

◮ Optimal (myopic) Pricing: Optimal price to maximize revenue

at each step (ignoring the future influence).

slide-85
SLIDE 85

Marketing Model

◮ Given: a prior (probability distribution) P_i(S) on the valuation of user i, given that a set S of users has already adopted the item.
◮ Goal: Design a marketing policy to maximize the expected revenue.
◮ Marketing policy: visit buyers one by one and offer each a price.
  ◮ Ordering of buyers.
  ◮ Pricing at each step.
◮ Optimal (myopic) pricing: the optimal price to maximize revenue at each step (ignoring future influence).
◮ Let f_i(S) be the optimal revenue from buyer i using the optimal (myopic) price, given that a set S of buyers has bought the item. We call this function f_i the influence function.

slide-86
SLIDE 86

Marketing Model: Example

◮ Marketing policy: In what order and at what price do we offer a digital good to buyers?
◮ Each buyer has a monotone submodular influence function: (60, 70), (100, 105), (100, 110, 115, 118), (50, 60, 65).

[Offer: price 100?]

slide-87
SLIDE 87

Marketing Model: Example

◮ Marketing policy: In what order and at what price do we offer a digital good to buyers?
◮ Each buyer has a monotone submodular influence function: (60, 70), (100, 105), (100, 110, 115, 118), (50, 60, 65).

[Sold at 100; next offer: price 70?]

slide-88
SLIDE 88

Marketing Model: Example

◮ Marketing policy: In what order and at what price do we offer a digital good to buyers?
◮ Each buyer has a monotone submodular influence function: (60, 70), (100, 105), (100, 110, 115, 118), (50, 60, 65).

[Sold at 100 and 70; next offer: price 65?]

slide-89
SLIDE 89

Marketing Model: Example

◮ Marketing policy: In what order and at what price do we offer a digital good to buyers?
◮ Each buyer has a monotone submodular influence function: (60, 70), (100, 105), (100, 110, 115, 118), (50, 60, 65).

[Sold at 100, 70, and 65; final offer: price 118?]
slide-90
SLIDE 90

Marketing Model

Marketing policy: In what order and at what price, do we

  • ffer a digital good to buyers?

◮ Given: A prior (probability distribution) Pi(S) for the

valuation of user i given that a set S of users already adopted the item.

◮ Goal: Design a Marketing Policy to maximize the expected

revenue.

slide-91
SLIDE 91

Marketing Model

Marketing policy: In what order and at what price do we offer a digital good to buyers?

◮ Given: a prior (probability distribution) P_i(S) on the valuation of user i, given that a set S of users has already adopted the item.
◮ Goal: Design a marketing policy to maximize the expected revenue.
◮ The problem is NP-hard.
◮ Hartline, M., and Sundararajan [HMS] (WWW).
slide-92
SLIDE 92

Influence & Exploit Strategies

Influence & Exploit strategies:

  • 1. Influence: Give the item for free to a set A of influential

buyers.

  • 2. Exploit: Apply the following strategy on the rest.

◮ Optimal myopic pricing. ◮ Random order.

slide-93
SLIDE 93

Influence & Exploit Strategies

Influence & Exploit strategies:

1. Influence: Give the item for free to a set A of influential buyers.
2. Exploit: Visit the remaining buyers in random order, charging each the optimal myopic price.

Optimal (myopic) pricing: at each step, offer the price that maximizes the current expected revenue (ignoring future influence).
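An Influence & Exploit strategy can be sketched as follows; `myopic_price` and the two-buyer valuation model are hypothetical illustrations (a uniform prior over a finite value list), not the paper's exact model:

```python
import random

def myopic_price(values):
    """Optimal myopic posted price against a uniform prior over `values`:
    maximize price * P(value >= price)."""
    return max(values, key=lambda p: p * sum(v >= p for v in values) / len(values))

def influence_and_exploit(buyers, valuation, A, rng=random.Random(0)):
    """Give the item free to A, then visit the rest in random order,
    charging each buyer the myopic price for its current value prior."""
    adopted, revenue = set(A), 0.0
    rest = [b for b in buyers if b not in A]
    rng.shuffle(rest)
    for b in rest:
        prior = valuation(b, adopted)      # value distribution given adopters
        price = myopic_price(prior)
        v = rng.choice(prior)              # buyer's realized value
        if v >= price:
            revenue += price
            adopted.add(b)
    return revenue

# Two-buyer illustration: each buyer is worth more once the other owns the item.
vals = lambda b, S: [50, 60] if not S else [80, 90]
print(influence_and_exploit(['x', 'y'], vals, A={'x'}))  # 80.0
```

Seeding buyer x raises buyer y's prior to [80, 90], so the myopic price 80 is always accepted.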

slide-95
SLIDE 95

Q1: How good are influence & exploit strategies?

Theorem (Hartline, M., Sundararajan)

Given monotone submodular influence functions, Influence & Exploit strategies give constant-factor approximations to the

  • ptimal revenue.

Constant factor:

◮ 0.25 for general distributions,
◮ 0.306 for distributions satisfying a monotone hazard-rate condition,
◮ 0.33 for additive settings and the uniform distribution,
◮ 0.66 for undirected graphs and uniform distributions,
◮ 0.94 for complete undirected graphs and uniform distributions.

slide-96
SLIDE 96

Q2: How to find influential users?

◮ Question 2: How do we choose a set A of influential users to give the item for free, to maximize the revenue?

◮ Let g(A) be the expected revenue, if the initial set is A.

slide-97
SLIDE 97

Q2: How to find influential users?

◮ Question 2: How do we choose a set A of influential users to give the item for free, to maximize the revenue?

◮ Let g(A) be the expected revenue if the initial set is A.
◮ For some small sets A, g(∅) ≤ g(A).
◮ If we give the item for free to all users, g(X) = 0.

slide-98
SLIDE 98

Q2: How to find influential users?

◮ Question 2: How do we choose a set A of influential users to give the item for free, to maximize the revenue?

◮ Let g(A) be the expected revenue if the initial set is A.
◮ For some small sets A, g(∅) ≤ g(A).
◮ If we give the item for free to all users, g(X) = 0.

Theorem (Hartline, M., Sundararajan)

Given monotone submodular influence functions, the revenue function g is a non-monotone non-negative submodular function.

slide-99
SLIDE 99

Q2: How to find influential users?

◮ Question 2: How do we choose a set A of influential users to give the item for free, to maximize the revenue?

◮ Let g(A) be the expected revenue if the initial set is A.
◮ For some small sets A, g(∅) ≤ g(A).
◮ If we give the item for free to all users, g(X) = 0.

Theorem (Hartline, M., Sundararajan)

Given monotone submodular influence functions, the revenue function g is a non-monotone non-negative submodular function.

◮ Thus, to find a set A that maximizes the revenue g(A), we can use the 0.4-approximation local search algorithm from Feige, M., Vondrak.
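As a concrete but deliberately simplified sketch of that local search: the code below applies plain add/delete moves until no single change improves g, then returns the better of the local optimum and its complement. The actual Feige-M.-Vondrak algorithm additionally requires each move to improve the value by a (1 + ε/n²) factor to guarantee polynomial running time; that detail is omitted here.

```python
def local_search_submodular(ground, g):
    """Add/delete local search for a non-negative, possibly non-monotone
    submodular set function g (simplified sketch; see the note above)."""
    X = set(ground)
    S = {max(X, key=lambda e: g({e}))}     # start from the best singleton
    improved = True
    while improved:
        improved = False
        for e in list(X - S):              # try adding an element
            if g(S | {e}) > g(S):
                S.add(e)
                improved = True
        for e in list(S):                  # try deleting an element
            if g(S - {e}) > g(S):
                S.remove(e)
                improved = True
    comp = X - S
    return S if g(S) >= g(comp) else comp  # local optimum vs. its complement
```

A standard non-monotone submodular test case is the cut function of a graph: on the path 0-1-2, the algorithm finds a set cutting both edges.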

slide-100
SLIDE 100

Future Directions

◮ Marketing over Social Networks

◮ Avoiding Price Discrimination (Fixed-Price).
◮ Iterative Pricing with Positive Network Externalities (Akhlaghpour, Ghodsi, Haghpanah, Mahini, M., Nikzad).

◮ Cascading Effect and Influence Propagation.
◮ Revenue Maximization for Fixed-Priced Marketing with Influence Propagation (M., Sundararajan, Roch).

◮ Learning vs. Marketing.

slide-101
SLIDE 101

Thank You

slide-102
SLIDE 102

Outline

◮ Submodularity.
◮ Maximizing non-monotone Submodular Functions.
◮ Application 1: Marketing over Social Networks.
◮ Application 2: Guaranteed Banner Ad Allocation.

slide-103
SLIDE 103

Application 2: Guaranteed Banner Advertisement

Online Advertisement: $20 billion annual revenue!
Banner Advertisement: 22% of the current revenue.

slide-104
SLIDE 104

Application 2: Guaranteed Banner Advertisement

Online Advertisement: $20 billion annual revenue!
Banner Advertisement: 22% of the current revenue.
Guaranteed Banner Advertisement:

◮ Each advertiser i

◮ Interested in a set Si of impressions (e.g., young women in Seattle),
◮ Bids bi for each impression,
◮ Needs di impressions,
◮ Penalty αbi for each unsatisfied unit (Guaranteed Delivery).

[Figure: bipartite graph between advertisers (Si, bi, di) and impressions; the example advertiser has di = 4.]

slide-105
SLIDE 105

Application 2: Guaranteed Banner Advertisement

Online Advertisement: $20 billion annual revenue!
Banner Advertisement: 22% of the current revenue.
Guaranteed Banner Advertisement:

◮ Each advertiser i

◮ Interested in a set Si of impressions (e.g., young women in Seattle),
◮ Bids bi for each impression,
◮ Needs di impressions,
◮ Penalty αbi for each unsatisfied unit (Guaranteed Delivery).


Goal: Choose a set T of advertisers to maximize revenue, f(T).

slide-106
SLIDE 106

Application 2: Guaranteed Banner Advertisement

Online Advertisement: $20 billion annual revenue!
Banner Advertisement: 22% of the current revenue.
Guaranteed Banner Advertisement:

◮ Each advertiser i

◮ Interested in a set Si of impressions (e.g., young women in Seattle),
◮ Bids bi for each impression,
◮ Needs di impressions,
◮ Penalty αbi for each unsatisfied unit (Guaranteed Delivery).


Goal: Choose a set T of advertisers to maximize revenue, f(T).
If we give q impressions to advertiser i, we get qbi − αbi(di − q) = q(1 + α)bi − diαbi.

slide-107
SLIDE 107

Guaranteed Banner Advertisement: Submodularity

If we give q items to advertiser i, we get qbi − αbi(di − q) = q(1 + α)bi − diαbi = q(1 + α)bi − ci.
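A quick numeric check of this identity, with illustrative values (q = 3, bi = 2, di = 4, α = 0.5; the numbers are made up):

```python
def advertiser_revenue(q, b, d, alpha):
    """Payment for q served impressions, minus the penalty alpha*b for
    each of the (d - q) unserved impressions: the left-hand side above."""
    return q * b - alpha * b * (d - q)

q, b, d, alpha = 3, 2.0, 4, 0.5
lhs = advertiser_revenue(q, b, d, alpha)
rhs = q * (1 + alpha) * b - d * alpha * b   # rearranged: q(1+α)bi − diαbi
assert abs(lhs - rhs) < 1e-12               # both sides equal 5.0 here
```

The rearrangement matters because it splits the revenue into a per-impression gain q(1 + α)bi and a fixed commitment cost ci = diαbi, which is what makes f(T) below a (submodular profit) − (additive cost) function.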

slide-108
SLIDE 108

Guaranteed Banner Advertisement: Submodularity

If we give q items to advertiser i, we get qbi − αbi(di − q) = q(1 + α)bi − diαbi = q(1 + α)bi − ci.
If we commit to a set T of advertisers: f(T) = P(T) − Σi∈T ci = P(T) − C(T).

P(T) = the maximum weight of a matching in the bipartite graph between the advertisers in T and the impressions, where every edge of advertiser i has weight (1 + α)bi.

slide-109
SLIDE 109

Guaranteed Banner Advertisement: Submodularity

If we give q items to advertiser i, we get qbi − αbi(di − q) = q(1 + α)bi − diαbi = q(1 + α)bi − ci.
If we commit to a set T of advertisers: f(T) = P(T) − Σi∈T ci = P(T) − C(T).

P(T) = the maximum weight of a matching in the bipartite graph between the advertisers in T and the impressions, where every edge of advertiser i has weight (1 + α)bi.

P is submodular ⇒ f is submodular, but it can be negative.
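To make the non-monotonicity of f concrete, here is a brute-force evaluation of f(T) = P(T) − C(T) on a toy instance. The instance, names, and brute-force matching are all for illustration only; real inputs need a proper capacitated-matching solver.

```python
from itertools import product

def matching_profit(T, impressions, S, b, d, alpha):
    """P(T): best assignment of impressions to committed advertisers in T.
    Impression j may serve i only if j is in S[i]; advertiser i takes at
    most d[i] impressions; each assigned impression is worth (1+alpha)*b[i].
    Brute force over all assignments, so tiny instances only."""
    T = list(T)
    best = 0.0
    for assign in product([None] + T, repeat=len(impressions)):
        load = {i: 0 for i in T}
        weight = 0.0
        feasible = True
        for j, i in zip(impressions, assign):
            if i is None:
                continue
            if j not in S[i] or load[i] >= d[i]:
                feasible = False
                break
            load[i] += 1
            weight += (1 + alpha) * b[i]
        if feasible:
            best = max(best, weight)
    return best

def committed_revenue(T, impressions, S, b, d, alpha):
    """f(T) = P(T) - C(T); committing to an advertiser incurs d[i]*alpha*b[i]."""
    return matching_profit(T, impressions, S, b, d, alpha) - \
        sum(d[i] * alpha * b[i] for i in T)
```

On two impressions with advertiser A (wants both, d = 2) and advertiser B (wants one impression but demands d = 3), committing to B only adds cost: f({A, B}) < f({A}), exhibiting non-monotonicity.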

slide-110
SLIDE 110

Guaranteed Banner Advertisement: Submodularity

If we give q items to advertiser i, we get qbi − αbi(di − q) = q(1 + α)bi − diαbi = q(1 + α)bi − ci.
If we commit to a set T of advertisers: f(T) = P(T) − Σi∈T ci = P(T) − C(T).

P(T) = the maximum weight of a matching in the bipartite graph between the advertisers in T and the impressions, where every edge of advertiser i has weight (1 + α)bi.

P is submodular ⇒ f is submodular, but it can be negative. In fact, we prove that the problem is not approximable within any constant factor, for any constant α.

slide-111
SLIDE 111

Structural Approximation

Feige, Immorlica, M., Nazerzadeh [FIMN]: There are many natural submodular functions of the form (submodular profit function) − (additive cost function) = P(S) − C(S). Examples:

◮ Maximum Facility Location Problem.
◮ Segmentation Problems.
◮ Guaranteed Banner Ad Problem.

slide-112
SLIDE 112

Structural Approximation

Feige, Immorlica, M., Nazerzadeh [FIMN]: There are many natural submodular functions of the form (submodular profit function) − (additive cost function) = P(S) − C(S). Examples:

◮ Maximum Facility Location Problem.
◮ Segmentation Problems.
◮ Guaranteed Banner Ad Problem.

The above objectives can be negative, so the problems are not approximable within any multiplicative approximation factor.

slide-113
SLIDE 113

Structural Approximation

Feige, Immorlica, M., Nazerzadeh [FIMN]: There are many natural submodular functions of the form (submodular profit function) − (additive cost function) = P(S) − C(S). Examples:

◮ Maximum Facility Location Problem.
◮ Segmentation Problems.
◮ Guaranteed Banner Ad Problem.

The above objectives can be negative, so the problems are not approximable within any multiplicative approximation factor.

Structural Approximation

◮ The approximation factor is a function of the structure of the solution.

slide-114
SLIDE 114

Structural Approximation

Feige, Immorlica, M., Nazerzadeh [FIMN]: There are many natural submodular functions of the form (submodular profit function) − (additive cost function) = P(S) − C(S). Examples:

◮ Maximum Facility Location Problem.
◮ Segmentation Problems.
◮ Guaranteed Banner Ad Problem.

The above objectives can be negative, so the problems are not approximable within any multiplicative approximation factor.

Structural Approximation

◮ The approximation factor is a function of the structure of the solution.

We design greedy and linear programming-based tight structural approximation algorithms for the above problems.

slide-115
SLIDE 115

Guaranteed Banner Ad Problem

Greedy Algorithm:

  • 1. S = ∅.
  • 2. At each step, add an advertiser i ∈ X\S that maximizes

(P(S∪{i}) − P(S) − ci) / (P(S∪{i}) − P(S))

if P(S ∪ {i}) − P(S) − ci > 0.

slide-116
SLIDE 116

Guaranteed Banner Ad Problem

Greedy Algorithm:

  • 1. S = ∅.
  • 2. At each step, add an advertiser i ∈ X\S that maximizes

P(S∪{i})−P(S)−ci P(S∪{i})−P(S)

if P(S ∪ {i}) − P(S) − ci > 0. LP-based Algorithm:

◮ Solve a Configurational LP and Round it.

max

  • i∈A,Q⊆Si X Q

i dibi((1 + α)(|Q| di − ln |Q| di ) − α)

s.t.

  • i∈A,Q∈Si:j∈S X Q

i

≤ 1 ∀j ∈ U

  • Q∈Si X Q

i

≤ 1 ∀i ∈ A X Q

i

≥ 0 ∀i ∈ A, ∀Q ∈ Si

slide-117
SLIDE 117

Guaranteed Banner Ad Problem

Greedy Algorithm:

  • 1. S = ∅.
  • 2. At each step, add an advertiser i ∈ X\S that maximizes

P(S∪{i})−P(S)−ci P(S∪{i})−P(S)

if P(S ∪ {i}) − P(S) − ci > 0. LP-based Algorithm:

◮ Solve a Configurational LP and Round it.

max

  • i∈A,Q⊆Si X Q

i dibi((1 + α)(|Q| di − ln |Q| di ) − α)

s.t.

  • i∈A,Q∈Si:j∈S X Q

i

≤ 1 ∀j ∈ U

  • Q∈Si X Q

i

≤ 1 ∀i ∈ A X Q

i

≥ 0 ∀i ∈ A, ∀Q ∈ Si Theorem[Feige, Immorlica, M., Nazerzadeh] The greedy algorithm and the LP-based algorithm achieve tight structural approximation for (capacitated) maximum facility location, segmentation problems, and guaranteed banner advertisement problem.

slide-118
SLIDE 118

Future Directions

◮ Guaranteed Banner ad Allocation

◮ Uncertainty in Supply (impressions). ◮ Uncertainty in Demand (advertisers). ◮ Online Allocation Mechanisms. ◮ Truthful Mechanisms.

slide-119
SLIDE 119

Marketing over Social Networks

Algorithmic & Economic aspects of the Internet

Internet Monetization (Stochastic Optimization, Linear Programming) Search & Large Networks Algorithmic Game Theory (Auctions, Mechanism Design) Guaranteed Banner Advertisement

slide-120
SLIDE 120

Marketing over Social Networks

Algorithmic & Economic aspects of the Internet

Internet Monetization (Stochastic Optimization, Linear Programming) Search & Large Networks (Refined Random Walks, Local Clustering) Algorithmic Game Theory (Auctions, Mechanism Design) Guaranteed Banner Advertisement

slide-121
SLIDE 121

Marketing over Social Networks Trust-based Recommendation Systems Overlapping Clustering for Distributed Comp.

Algorithmic & Economic aspects of the Internet

Internet Monetization (Stochastic Optimization, Linear Programming) Search & Large Networks (Refined Random Walks, Local Clustering) PageRank Contributions & Link Spam Detection Algorithmic Game Theory (Auctions, Mechanism Design) Guaranteed Banner Advertisement

slide-122
SLIDE 122

Algorithms for Search and Large Networks

◮ Local Computation of PageRank Contributions & Link

Spam Detection

slide-123
SLIDE 123

Algorithms for Search and Large Networks

◮ Local Computation of PageRank Contributions & Link

Spam Detection

◮ Axiomatic Approach to Trust-Based Recommendation

Systems

slide-124
SLIDE 124

Algorithms for Search and Large Networks

◮ Local Computation of PageRank Contributions & Link

Spam Detection

◮ Axiomatic Approach to Trust-Based Recommendation

Systems

◮ Overlapping Clustering for Distributed Computation

slide-125
SLIDE 125

Thank You