Submodular Maximization Applied to Marketing over Social Networks

Vahab Mirrokni, Google Research, NYC


Our Results: Non-negative Submodular Functions (Feige, M., Vondrak, FOCS'07)
  1. Approximation algorithms for maximizing non-monotone, non-negative submodular functions. Examples: cut functions, marketing over social networks, core value of supermodular games.
     - 0.33-approximation (deterministic local search).
     - 0.40-approximation (randomized "smooth local search").
     - 0.50-approximation for symmetric functions.
  2. Hardness results:
     - For any fixed ε > 0, a (1/2 + ε)-approximation would require exponentially many value queries.
     - Submodular functions with succinct representation: NP-hard to achieve a (3/4 + ε)-approximation.

Submodular Maximization: Local Search
Local operations:
  - Add v: S' = S ∪ {v}.
  - Remove v: S' = S \ {v}.
A local operation is improving if f(S') > f(S).

Example (cut function on a small graph; figure omitted): starting from S = {4} with f(S) = 10, successive improving operations yield S = {4,5} (f = 17), S = {4,5,7} (f = 23), S = {4,5,7,3} (f = 27), S = {4,7,3} (f = 30), S = {4,7,3,6} (f = 33), and finally the local optimum L = S = {4,3,6} with f(L) = 34. Output the better of L and its complement L̄.

Submodular Maximization: Local Search
Local Search Algorithm:
  - Start with S = {v}, where v is the singleton of maximum value.
  - While there is an improving local operation, perform a local operation such that f(S') > (1 + ε/n²) f(S).
  - Return the better of f(S) and f(S̄).

Theorem (Feige, M., Vondrak). The Local Search Algorithm returns at least (1/3 − ε/n)·OPT; for symmetric functions, at least (1/2 − ε/n)·OPT (tight analysis).
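To make the procedure concrete, here is a minimal Python sketch of the deterministic local search above, run on a small illustrative cut function. The graph, edge weights, and names are assumptions for illustration, not from the talk.

```python
def local_search(f, ground_set, eps=0.01):
    """Deterministic local search for non-monotone submodular maximization.

    Start from the best singleton, repeatedly apply an add/remove move that
    improves f by a (1 + eps/n^2) factor, then return the better of the
    local optimum and its complement, as on the slide.
    """
    n = len(ground_set)
    S = {max(ground_set, key=lambda v: f({v}))}   # best singleton
    threshold = 1 + eps / n**2
    improved = True
    while improved:
        improved = False
        for v in ground_set:
            S2 = S - {v} if v in S else S | {v}
            if f(S2) > threshold * f(S):
                S, improved = S2, True
                break
    complement = set(ground_set) - S
    return max(S, complement, key=f)

# Toy cut function on an undirected weighted graph (illustrative weights).
edges = {(1, 2): 3, (1, 3): 2, (2, 4): 5, (3, 4): 4, (2, 3): 1}
def cut(S):
    # Total weight of edges with exactly one endpoint in S.
    return sum(w for (u, v), w in edges.items() if (u in S) != (v in S))

best = local_search(cut, {1, 2, 3, 4})
print(best, cut(best))
```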

A Structure Lemma
Lemma. For a locally optimal solution L, and any subset C ⊂ L or superset L ⊂ C, f(L) ≥ f(C).

In particular, if T_0 ⊂ T_1 ⊂ T_2 ⊂ ⋯ ⊂ T_k = L ⊂ T_{k+1} ⊂ ⋯ ⊂ T_n, then
f(T_0) ≤ f(T_1) ≤ ⋯ ≤ f(T_{k−1}) ≤ f(L) ≥ f(T_{k+1}) ≥ ⋯ ≥ f(T_n).

Proof: Base of the induction: f(T_{k−1}) ≤ f(L) ≥ f(T_{k+1}) by local optimality. If T_{i+1} \ T_i = {a_i} with T_{i+1} ⊆ L, then by submodularity and local optimality of L:
f(T_{i+1}) − f(T_i) ≥ f(L) − f(L \ {a_i}) ≥ 0  ⟹  f(T_{i+1}) ≥ f(T_i).
Therefore f(T_0) ≤ f(T_1) ≤ f(T_2) ≤ ⋯ ≤ f(L). The chain above L is symmetric: for T_i ⊇ L, f(T_{i+1}) − f(T_i) ≤ f(L ∪ {a_i}) − f(L) ≤ 0.

Proof of the Theorem (via the Structure Lemma)
Theorem (Feige, M., Vondrak). The Local Search Algorithm returns at least (1/3 − ε/n)·OPT.

Proof: Find a local optimum L and return the better of L and L̄. Let C be an actual optimum. By the structure lemma,
  - f(L) ≥ f(C ∩ L),
  - f(L) ≥ f(C ∪ L).
Hence, again by submodularity,
2 f(L) + f(L̄) ≥ f(C ∩ L) + f(C ∪ L) + f(L̄)
             ≥ f(C ∩ L) + f(C ∩ L̄) + f(X)
             ≥ f(C) + f(∅)
             ≥ OPT.
Consequently, either f(L) or f(L̄) must be at least (1/3)·OPT.

Our Results (Feige, M., Vondrak [FMV])
  1. Approximation algorithms for maximizing non-negative submodular functions:
     - 0.33-approximation (deterministic local search).
     - 0.40-approximation (randomized "smooth local search").
     - 0.50-approximation for symmetric functions.
  2. Hardness results:
     - A (1/2 + ε)-approximation would require exponentially many queries.
     - Submodular functions with succinct representation: NP-hard to achieve a (3/4 + ε)-approximation.

Random Subsets
A(p) is a random subset of A: each element of A is picked independently with probability p.

Sampling Lemma. E[f(A(p))] ≥ p·f(A) + (1 − p)·f(∅).
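A quick numerical sanity check of the sampling lemma, reusing the toy cut function; this is an illustrative sketch only, and the set A and probability p are arbitrary choices.

```python
import random

edges = {(1, 2): 3, (1, 3): 2, (2, 4): 5, (3, 4): 4, (2, 3): 1}
def cut(S):
    # Cut value: total weight of edges with exactly one endpoint in S.
    return sum(w for (u, v), w in edges.items() if (u in S) != (v in S))

def sample_subset(A, p):
    # A(p): include each element of A independently with probability p.
    return {v for v in A if random.random() < p}

A, p = {1, 2, 4}, 0.5
est = sum(cut(sample_subset(A, p)) for _ in range(20000)) / 20000
bound = p * cut(A) + (1 - p) * cut(set())
print(f"E[f(A(p))] ~ {est:.2f}  >=  p*f(A) + (1-p)*f(empty) = {bound:.2f}")
```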

The Smooth Local Search Algorithm
  - For any set A, let R_A = A(2/3) ∪ Ā(1/3), a random set with some bias toward picking elements from A.
  - Let Φ(A) = E[f(R_A)], a smoothed variant of f(A).

Algorithm:
  - Perform local search with respect to Φ(A).
  - When a local optimum A is found, return R_A or Ā.

Theorem (Feige, M., Vondrak). The Smooth Local Search algorithm returns at least 0.40·OPT.
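A rough Python sketch of smooth local search. Estimating Φ(A) by Monte Carlo sampling is an assumption made for illustration (the analysis presumes sufficiently accurate estimates of Φ), and the toy graph is made up.

```python
import random

edges = {(1, 2): 3, (1, 3): 2, (2, 4): 5, (3, 4): 4, (2, 3): 1}
GROUND = {1, 2, 3, 4}

def cut(S):
    return sum(w for (u, v), w in edges.items() if (u in S) != (v in S))

def R(A):
    # R_A = A(2/3) ∪ Ā(1/3): keep elements of A w.p. 2/3, add outside w.p. 1/3.
    return ({v for v in A if random.random() < 2/3}
            | {v for v in GROUND - A if random.random() < 1/3})

def phi(A, samples=2000):
    # Monte Carlo estimate of the smoothed objective Φ(A) = E[f(R_A)].
    return sum(cut(R(A)) for _ in range(samples)) / samples

def smooth_local_search():
    A = set()
    improved = True
    while improved:
        improved = False
        base = phi(A)
        for v in GROUND:
            A2 = A - {v} if v in A else A | {v}
            if phi(A2) > base + 1e-6:   # small slack against sampling noise
                A, improved = A2, True
                break
    # Return the better of a sample R_A and the complement of A.
    return max([R(A), GROUND - A], key=cut)

print(smooth_local_search())
```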

Proof Sketch for Smooth Local Search
Analysis: more complicated, using the sampling lemma for three sets. Let
  - A = the local optimum found by our algorithm,
  - R = A(2/3) ∪ Ā(1/3), our random set,
  - C = an optimum.
Main claims:
  1. E[f(R)] ≥ E[f(R ∪ (A ∩ C))].
  2. E[f(R)] ≥ E[f(R ∩ (A ∪ C))].
  3. (9/20)·E[f(R ∪ (A ∩ C))] + (9/20)·E[f(R ∩ (A ∪ C))] + (1/10)·f(Ā) ≥ 0.4·OPT.

Our Results
  1. Approximation algorithms for maximizing non-monotone submodular functions:
     - 0.33-approximation (deterministic local search).
     - 0.40-approximation (randomized "smooth local search").
     - 0.50-approximation for symmetric functions.
  2. Hardness results:
     - It is impossible to improve on the factor 1/2:
     - a (1/2 + ε)-approximation would require exponentially many queries;
     - submodular functions with succinct representation: NP-hard to achieve a (3/4 + ε)-approximation.

Proof Idea for the Hardness Result
Goal: find two submodular functions f and g that look identical to a typical query with high probability, yet max g(S) ≈ 2·max f(S).

Consider two functions f_1, g_1 : [0,1]² → R₊:
  1. f_1(x, y) = (x + y)(2 − x − y)  ("complete graph cut").
  2. g_1(x, y) = 2x(1 − y) + 2(1 − x)y  ("bipartite graph cut").
Observe: f_1(x, x) = g_1(x, x). So modify g_1(x, y) to g_2(x, y) such that f_1(x, y) = g_2(x, y) whenever |x − y| < ε.

Mapping to set functions: let X = X_1 ∪ X_2, and
f(S) = f_1(|S ∩ X_1|/|X_1|, |S ∩ X_2|/|X_2|),  g(S) = g_2(|S ∩ X_1|/|X_1|, |S ∩ X_2|/|X_2|).
(Figure omitted: S covers a fraction x of X_1 and a fraction y of X_2.)
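The claimed factor-2 gap between the two continuous profiles is easy to verify numerically; a small sketch (grid resolution and variable names are arbitrary choices):

```python
import numpy as np

def f1(x, y):
    # "Complete graph cut" profile.
    return (x + y) * (2 - x - y)

def g1(x, y):
    # "Bipartite graph cut" profile.
    return 2 * x * (1 - y) + 2 * (1 - x) * y

xs = np.linspace(0, 1, 201)
X, Y = np.meshgrid(xs, xs)
print("max f1 =", f1(X, Y).max())   # 1.0, attained whenever x + y = 1
print("max g1 =", g1(X, Y).max())   # 2.0, attained at (1, 0) and (0, 1)
print("agree on diagonal:", np.allclose(f1(xs, xs), g1(xs, xs)))  # True
```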

Similar Technique for Combinatorial Auctions
  - Using a similar technique, we can show information-theoretic lower bounds for combinatorial auctions (figure omitted: items partitioned into bundles S_1, …, S_6 among buyers with valuations f_1(S_1), …, f_6(S_6)).
  - Goal: partition the items to maximize social welfare, i.e., Σ_i f_i(S_i).

Theorem (M., Schapira, Vondrak, EC'08). Achieving a factor better than 1 − 1/e needs an exponential number of value queries.

Extra Constraints
  - Cardinality constraint: |S| ≤ k.
  - Knapsack constraint: Σ_{i∈S} w_i ≤ C.
  - Matroid constraints.

Known results for monotone submodular functions (see the greedy sketch below):
  - Matroid or cardinality constraint: 1 − 1/e (NWF78, Vondrak08).
  - k knapsack constraints: 1 − 1/e (Sviridenko01, KST09).
  - k matroid constraints: 1/(k+1) (NWF78).

(Lee, M., Nagarajan, Sviridenko, STOC 2009) Maximizing non-monotone submodular functions:
  - One matroid or cardinality constraint: 1/4.
  - k knapsack constraints: 1/5.
  - k matroid constraints: 1/(k + 2 + 1/k).
  - k partition matroid constraints: 1/(k + 1 + 1/(k−1)).
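For context, the classic greedy that achieves the 1 − 1/e bound for a single cardinality constraint on monotone functions can be sketched as follows; the coverage function and its data are illustrative toys.

```python
def greedy_cardinality(f, ground_set, k):
    """Classic greedy for max f(S) s.t. |S| <= k, f monotone submodular.

    Nemhauser-Wolsey-Fisher (1978): achieves a (1 - 1/e)-approximation.
    """
    S = set()
    for _ in range(k):
        best, gain = None, 0.0
        for v in ground_set - S:
            delta = f(S | {v}) - f(S)   # marginal gain of adding v
            if delta > gain:
                best, gain = v, delta
        if best is None:                # no positive marginal gain left
            break
        S.add(best)
    return S

# Toy monotone submodular function: set coverage (illustrative data).
sets = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c", "d", "e"}, 4: {"e"}}
coverage = lambda S: len(set().union(*(sets[i] for i in S)) if S else set())
print(greedy_cardinality(coverage, set(sets), k=2))  # e.g. {1, 3}
```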

Extra Constraints: Algorithms (Lee, M., Nagarajan, Sviridenko, STOC'09)
Local search algorithms:
  - Cardinality constraint (factor 1/4): local search with add, remove, and swap operations (sketched below).
  - k knapsack or budget constraints (factor 1/5):
    1. Solve a fractional variant on the small elements.
    2. Round the fractional solution for the small elements.
    3. Output the better of the solution for the small elements and the solution for the large elements.
  - k matroid constraints (factor 1/(k + 2 + 1/k)): local search with a delete operation and more complicated exchange operations.
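A simplified sketch of the add/remove/swap local search for one cardinality constraint. This omits the repeated runs and approximate-improvement thresholds that the STOC'09 analysis needs for the 1/4 guarantee, so it illustrates the move set rather than the full algorithm.

```python
def local_search_cardinality(f, ground_set, k):
    """Local search for max f(S) s.t. |S| <= k, with add/remove/swap moves,
    in the spirit of Lee-M.-Nagarajan-Sviridenko (simplified sketch)."""
    S = set()
    improved = True
    while improved:
        improved = False
        candidates = []
        if len(S) < k:                                   # add moves
            candidates += [S | {v} for v in ground_set - S]
        candidates += [S - {v} for v in S]               # remove moves
        candidates += [S - {u} | {v}                     # swap moves
                       for u in S for v in ground_set - S]
        for S2 in candidates:
            if f(S2) > f(S):
                S, improved = S2, True
                break
    return S
```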

Approximating Everywhere
  - Can we learn a submodular function with a polynomial number of queries?
  - After a polynomial number of value queries f(S_1), …, f(S_k) to a submodular oracle, can we construct an approximate oracle f′ (diagram omitted)?
  - Goal: approximate f everywhere by f′.

Theorem (Goemans, Iwata, Harvey, M., SODA'09). Achieving a factor better than √n needs an exponential number of value queries, even for rank functions of matroids.

Theorem (Goemans, Iwata, Harvey, M., SODA'09). After a polynomial number of value queries to a monotone submodular function, we can approximate the function everywhere within O(√n log n).

Outline
  - Submodularity.
  - Maximizing non-monotone submodular functions.
  - Application 1: Marketing over social networks.
  - Application 2: Guaranteed banner ad allocation.

Marketing over Social Networks
  - Online social networks: MySpace, Facebook.
  - Monetizing social networks.
  - Viral marketing: word-of-mouth advertising.
  - Users influence each other's valuations on a social network.
  - Marketing policy: in what order, and at what price, do we offer an item to buyers?

Application 1: Marketing over Social Networks
  - Online social networks: MySpace, Facebook.
  - Viral marketing: word-of-mouth advertising.
  - Users influence each other's valuations on a social network, especially for networked goods like Zune, Sprint, or Verizon.
  - Model the influence among users by submodular (or concave) functions: f_i(S) = g(|N_i ∩ S|).
  - Example (figure omitted): buyer A's valuation rises from $100 to $110, $115, and $118 as each of three neighbors adopts the item.
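A tiny sketch of the concave-of-cardinality influence model f_i(S) = g(|N_i ∩ S|), with g chosen to reproduce the ($100, $110, $115, $118) progression from the slide; the neighbor sets and the exact shape of g are assumptions for illustration.

```python
# Concave-of-cardinality influence model: f_i(S) = g(|N_i ∩ S|).
neighbors = {"A": {"B", "C", "D"}, "B": {"A"}, "C": {"A"}, "D": {"A"}}

def g(k, base=100.0):
    # Diminishing increments (+10, +5, +3) give the concave progression
    # ($100, $110, $115, $118) from the slide.
    increments = [0, 10, 15, 18]
    return base + increments[min(k, len(increments) - 1)]

def influence_value(i, S):
    return g(len(neighbors[i] & S))

print(influence_value("A", set()))            # 100.0
print(influence_value("A", {"B"}))            # 110.0
print(influence_value("A", {"B", "C", "D"}))  # 118.0
```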

Marketing Model
  - Given: a prior (probability distribution) P_i(S) on the valuation of user i, given that a set S of users has already adopted the item.
  - Goal: design a marketing policy to maximize the expected revenue.
  - Marketing policy: visit buyers one by one and offer each a price; the policy fixes the ordering of buyers and the pricing at each step.
  - Optimal (myopic) pricing: at each step, offer the price that maximizes the current expected revenue, ignoring future influence.
  - Let f_i(S) be the optimal revenue from buyer i under the optimal myopic price, given that the set S of buyers has already bought the item. We call this function f_i the influence function.
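For instance, under an assumed uniform prior Uniform[0, m] on a buyer's valuation (an illustration, not a distribution specified in the talk), the myopic price and the induced influence function have a simple closed form: the seller maximizes p · Pr[v ≥ p] = p(1 − p/m), giving p* = m/2 and expected revenue m/4.

```python
# Myopic pricing sketch under an assumed Uniform[0, m] valuation prior.
def myopic_price_uniform(m):
    # argmax of p * (1 - p/m) over p in [0, m] is p* = m/2.
    return m / 2.0

def influence_function_uniform(m):
    p = myopic_price_uniform(m)
    return p * (1 - p / m)   # expected revenue = m/4

print(influence_function_uniform(118.0))  # 29.5
```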

Marketing Model: Example
  - Marketing policy: in what order, and at what price, do we offer a digital good to buyers?
  - Each buyer has a monotone submodular influence function.
  - Walk-through (figure omitted): four buyers with influence values (100, 105), (60, 70), (50, 60, 65), and (100, 110, 115, 118), where the j-th entry is the buyer's value after j − 1 neighbors have adopted. Offer the first buyer $100 (accepted); the second buyer, now with one adopting neighbor, $70 (accepted); the third, with two adopting neighbors, $65 (accepted); and finally the fourth, with three adopting neighbors, $118.

Marketing Model
Marketing policy: in what order, and at what price, do we offer a digital good to buyers?
  - Given: a prior (probability distribution) P_i(S) on the valuation of user i, given that a set S of users has already adopted the item.
  - Goal: design a marketing policy to maximize the expected revenue.
  - The problem is NP-hard (Hartline, M., Sundararajan [HMS], WWW).

Influence & Exploit Strategies
  1. Influence: give the item for free to a set A of influential buyers.
  2. Exploit: visit the remaining buyers in random order with optimal myopic pricing, i.e., at each step offer the price that maximizes the current expected revenue (ignoring future influence).
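A minimal simulation of an Influence & Exploit strategy under the same assumed uniform-prior toy model as above; all data (network, valuations, prior) are illustrative assumptions.

```python
import random

def influence_and_exploit(A, buyers, neighbors, g, trials=5000):
    """Simulate Influence & Exploit: give the item free to A, then visit
    the rest in random order at the myopic price for Uniform[0, m]."""
    total = 0.0
    for _ in range(trials):
        adopters = set(A)                      # step 1: free items for A
        revenue = 0.0
        rest = [b for b in buyers if b not in A]
        random.shuffle(rest)                   # step 2: random order
        for b in rest:
            m = g(len(neighbors[b] & adopters))
            price = m / 2                      # myopic price under Uniform[0, m]
            if random.uniform(0, m) >= price:  # buyer accepts w.p. 1/2
                revenue += price
                adopters.add(b)
        total += revenue
    return total / trials

buyers = ["A", "B", "C", "D"]
neighbors = {"A": {"B", "C", "D"}, "B": {"A"}, "C": {"A"}, "D": {"A"}}
g = lambda k: [100, 110, 115, 118][min(k, 3)]
print(influence_and_exploit({"A"}, buyers, neighbors, g))
```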

Q1: How Good Are Influence & Exploit Strategies?
Theorem (Hartline, M., Sundararajan). Given monotone submodular influence functions, Influence & Exploit strategies give constant-factor approximations to the optimal revenue.

The constant factor is:
  - 0.25 for general distributions,
  - 0.306 for distributions satisfying a monotone hazard-rate condition,
  - 0.33 for additive settings and uniform distributions,
  - 0.66 for undirected graphs and uniform distributions,
  - 0.94 for complete undirected graphs and uniform distributions.

Q2: How to Find Influential Users?
  - Question 2: how do we choose the set A of influential users who get the item for free, so as to maximize the revenue?
  - Let g(A) be the expected revenue when the initial set is A.
  - For some small sets A, g(∅) ≤ g(A); but if we give the item for free to all users, g(X) = 0. So g is non-monotone.

Theorem (Hartline, M., Sundararajan). Given monotone submodular influence functions, the revenue function g is a non-monotone, non-negative submodular function.

  - Thus, to find a set A that maximizes the revenue g(A), we can use the 0.4-approximation smooth local search algorithm of Feige, M., Vondrak.
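Reusing the influence_and_exploit sketch from above (with its assumed uniform-prior toy model), the non-monotonicity of g is easy to see numerically: seeding one influential buyer helps, but giving the item to everyone leaves no one to charge.

```python
# g(A) estimated by simulation: empty seed, one influential seed, all seeded.
for A in [set(), {"A"}, set(buyers)]:
    print(sorted(A), round(influence_and_exploit(A, buyers, neighbors, g), 1))
# Expect g({"A"}) > g(set()) > g(all buyers) = 0.
```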

Future Directions
  - Marketing over social networks:
    - Avoiding price discrimination (fixed-price strategies).
    - Iterative pricing with positive network externalities (Akhlaghpour, Ghodsi, Haghpanah, Mahini, M., Nikzad).
    - Cascading effects and influence propagation.
    - Revenue maximization for fixed-price marketing with influence propagation (M., Sundararajan, Roch).
    - Learning vs. marketing.
