Submodular Maximization - Seffi Naor - Lecture 3, 4th Cargese Workshop on Combinatorial Optimization


  1. Submodular Maximization. Seffi Naor. Lecture 3, 4th Cargese Workshop on Combinatorial Optimization.

  2-4. Continuous Relaxation. Recap: a continuous relaxation for maximization.
     Multilinear extension:
       F(x) = ∑_{R ⊆ N} f(R) · ∏_{u_i ∈ R} x_i · ∏_{u_i ∉ R} (1 − x_i),   for all x ∈ [0, 1]^N.
     Simple probabilistic interpretation: F(x) = E[f(R)], where R contains each element u_i independently with probability x_i (a sampling-based evaluation sketch follows below).
     If x is integral, i.e. the characteristic vector of a set S, then F(x) = f(S).
     Multilinear Relaxation. What are the properties of F? It is neither convex nor concave.
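
By the probabilistic interpretation, F(x) can be evaluated approximately by sampling. The sketch below is illustrative only: the names `estimate_F` and `coverage`, the toy data, and the sample count are assumptions, not part of the lecture.

```python
import random

def estimate_F(f, x, samples=10_000):
    """Monte Carlo estimate of the multilinear extension F(x) = E[f(R)],
    where R contains element i independently with probability x[i].
    `f` is a set-function oracle taking a frozenset of indices (illustrative)."""
    n = len(x)
    total = 0.0
    for _ in range(samples):
        R = frozenset(i for i in range(n) if random.random() < x[i])
        total += f(R)
    return total / samples

# Toy example: a coverage function on 3 elements (assumed data).
covers = {0: {"a", "b"}, 1: {"b", "c"}, 2: {"c"}}
def coverage(S):
    covered = set()
    for i in S:
        covered |= covers[i]
    return len(covered)

print(estimate_F(coverage, [0.5, 0.5, 0.5]))  # approximate F at (1/2, 1/2, 1/2); exact value is 2.0
```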

  5-6. Properties of the Multilinear Extension.
     Lemma. The multilinear extension F satisfies:
       - If f is non-decreasing, then ∂F/∂x_i ≥ 0 everywhere in the cube, for all i.
       - If f is submodular, then ∂²F/∂x_i ∂x_j ≤ 0 everywhere in the cube, for all i, j.
     Useful for proving:
     Theorem. The multilinear extension F satisfies:
       - If f is non-decreasing, then F is non-decreasing in every direction d ≥ 0.
       - If f is submodular, then F is concave in every direction d ≥ 0.
       - If f is submodular, then F is convex in every direction e_i − e_j, for all i, j ∈ N.
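
The lemma can be seen from the standard identities for the partial derivatives of a multilinear F (not stated verbatim on the slides, but they are what the proof uses); here R ∼ x denotes a random set containing each element independently with probability x_i:

```latex
\[
\frac{\partial F}{\partial x_i}(x)
   = \mathbb{E}_{R \sim x}\bigl[\, f(R \cup \{i\}) - f(R \setminus \{i\}) \,\bigr] \;\ge\; 0
   \qquad \text{if } f \text{ is non-decreasing},
\]
\[
\frac{\partial^2 F}{\partial x_i \, \partial x_j}(x)
   = \mathbb{E}_{R \sim x}\bigl[\, f(R \cup \{i,j\}) - f(R \cup \{i\} \setminus \{j\})
     - f(R \cup \{j\} \setminus \{i\}) + f(R \setminus \{i,j\}) \,\bigr] \;\le\; 0
   \qquad \text{if } f \text{ is submodular}.
\]
```

The second expression is exactly the decreasing-marginals inequality taken in expectation, which is where submodularity enters.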

  7. Properties of the Multilinear Extension. Summarizing:
       f⁻(x)  ≤  F(x)  ≤  f⁺(x),
     where f⁻ is the convex closure (equal to the Lovász extension f_L for submodular f), F is the multilinear extension, and f⁺ is the concave closure.
     Any extension can be described as E[f(R)], where R is chosen from a distribution that preserves the marginals x_i.
       - The concave closure maximizes this expectation, but is hard to compute.
       - The convex closure minimizes this expectation and has a nice characterization (the Lovász extension).
       - The multilinear extension is somewhere in the "middle": R contains each element independently.
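
A tiny illustrative example (not from the slides): take N = {a, b} and the monotone submodular function f(∅) = 0, f({a}) = f({b}) = f({a, b}) = 1, i.e. both elements "cover" the same single item. At x = (1/2, 1/2):

```latex
\[
f^-(x) = f_L(x) = \tfrac12, \qquad
F(x) = 1 - \tfrac12 \cdot \tfrac12 = \tfrac34, \qquad
f^+(x) = 1.
\]
```

The convex closure is attained by the nested distribution (probability 1/2 on {a, b} and 1/2 on ∅), the multilinear extension is the probability that an independent sample is non-empty, and the concave closure is attained by putting probability 1/2 on each of {a} and {b}; indeed f⁻(x) ≤ F(x) ≤ f⁺(x).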

  8-11. Continuous Relaxation. The constrained submodular maximization problem: given a family of allowed subsets M ⊆ 2^N,
       max f(S)   s.t.   S ∈ M.
     Following the paradigm for relaxing linear maximization problems, let P_M be the convex hull of the feasible sets (their characteristic vectors), and solve
       max F(x)   s.t.   x ∈ P_M.
     Comparing the linear and submodular relaxations:
       - Optimizing a fractional solution: linear: easy; submodular: not clear ...
       - Rounding a fractional solution: linear: hard (problem dependent); submodular: easy (pipage rounding for matroids).
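
For concreteness (an illustrative special case, not on the slides), if the allowed family is a cardinality constraint M = { S : |S| ≤ k }, then P_M is described by the box and a single budget constraint, and the relaxation reads:

```latex
\[
\max \; F(x) \qquad \text{s.t.} \qquad \sum_{i \in N} x_i \le k, \quad 0 \le x_i \le 1 \;\; \forall i \in N,
\]
```

since the convex hull of the characteristic vectors of sets of size at most k is precisely this polytope (the uniform matroid polytope).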

  12-15. Pipage Rounding on Matroids. Work of [Ageev-Sviridenko-04], [Călinescu-Chekuri-Pál-Vondrák-08].
     For a matroid M, the associated matroid polytope is
       P_M = { x ∈ [0, 1]^N : ∑_{i ∈ S} x_i ≤ r_M(S)  for all S ⊆ N },
     where r_M(·) is the rank function of M. The extreme points of P_M are the characteristic vectors of the independent sets of M.
     Observation: if f is linear, a point x can be rounded by writing it as a convex combination of extreme points.
     Question: what do we do if f is (general) submodular?
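
The polytope description can be checked directly on small ground sets by enumerating all subsets. A minimal sketch, assuming a rank oracle is available as a Python callable; the function and variable names are illustrative:

```python
from itertools import combinations

def in_matroid_polytope(x, rank, tol=1e-9):
    """Brute-force test of x ∈ P_M = {x ∈ [0,1]^N : ∑_{i∈S} x_i ≤ r_M(S) for all S},
    given a rank oracle. Exponential in |N|; meant only to illustrate the definition."""
    n = len(x)
    if any(xi < -tol or xi > 1 + tol for xi in x):
        return False
    for size in range(1, n + 1):
        for S in combinations(range(n), size):
            if sum(x[i] for i in S) > rank(frozenset(S)) + tol:
                return False
    return True

# Example: the uniform matroid of rank 2 on 4 elements, r_M(S) = min(|S|, 2).
uniform_rank = lambda S: min(len(S), 2)
print(in_matroid_polytope([0.5, 0.5, 0.5, 0.5], uniform_rank))  # True
print(in_matroid_polytope([0.9, 0.9, 0.9, 0.0], uniform_rank))  # False: 2.7 > r({0,1,2}) = 2
```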

  16-17. Pipage Rounding on Matroids. Rounding for a general submodular function f:
       - If x is non-integral, there are i, j ∈ N for which 0 < x_i, x_j < 1.
       - Recall that F is convex in every direction e_i − e_j.
       - Hence F is non-decreasing in one of the two directions ±(e_i − e_j).
     Rounding algorithm: suppose the direction e_i − e_j is non-decreasing.
       - Let δ be the maximum feasible change (limited by a tight set A).
       - If either x_i + δ or x_j − δ is integral: progress.
       - Otherwise there exists a tight set A′ ⊂ A with i ∈ A′, j ∉ A′ (so |A′| < |A|); recurse on A′: progress.
       - Eventually we reach a minimal tight set (contained in all tight sets) within which any pair of coordinates can be increased/decreased: progress.
     (A simplified code sketch for the special case of a cardinality constraint follows below.)
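
The sketch below specializes the rounding loop to a cardinality (uniform-matroid) constraint, where moving along ±(e_i − e_j) keeps ∑ x_i unchanged, so the tight-set bookkeeping from the slide disappears. It is a simplified illustration under that assumption, not the general procedure: F is evaluated exactly by enumeration (fine for toy sizes), and a single leftover fractional coordinate is rounded down for feasibility.

```python
from itertools import combinations

def exact_F(f, x):
    """Exact multilinear extension by enumerating all subsets (toy sizes only)."""
    n = len(x)
    total = 0.0
    for size in range(n + 1):
        for S in combinations(range(n), size):
            p = 1.0
            for i in range(n):
                p *= x[i] if i in S else 1 - x[i]
            total += p * f(frozenset(S))
    return total

def pipage_round_cardinality(f, x, eps=1e-9):
    """Round a fractional x with sum(x) <= k (uniform matroid): repeatedly pick two
    fractional coordinates i, j and jump to an endpoint of the segment along
    ±(e_i − e_j); since F is convex along e_i − e_j, the better of the two
    endpoints does not decrease F. Each step makes one more coordinate integral."""
    x = list(x)
    while True:
        frac = [i for i, v in enumerate(x) if eps < v < 1 - eps]
        if len(frac) < 2:
            break
        i, j = frac[0], frac[1]
        d1 = min(1 - x[i], x[j])      # largest move in direction  e_i - e_j
        d2 = min(x[i], 1 - x[j])      # largest move in direction  e_j - e_i
        y1 = list(x); y1[i] += d1; y1[j] -= d1
        y2 = list(x); y2[i] -= d2; y2[j] += d2
        x = y1 if exact_F(f, y1) >= exact_F(f, y2) else y2
    # At most one fractional coordinate may remain; rounding it down keeps sum(x) <= k
    # (a simplification of the lecture's procedure, which stays inside P_M throughout).
    return [0 if eps < v < 1 - eps else int(round(v)) for v in x]
```

For a general matroid the step size is limited by tight rank constraints, which is exactly what the δ and tight-set argument on the slide handles.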

  18. Continuous Greedy. The Continuous Greedy Algorithm [Călinescu-Chekuri-Pál-Vondrák-08] computes an approximate fractional solution, assuming:
       - f is monotone (for now ...);
       - P_M is downward closed (in particular, 0 ∈ P_M).

  19. Continuous Greedy.

  20. Continuous Greedy.
       x(0) = 0,
       dx_i(t)/dt = y*_i(t),   where   y*(t) = argmax { ∑_{i=1}^{n} (∂F(x(t))/∂x_i) · y_i  :  y ∈ P_M }.

  21-30. Continuous Greedy (the same update rule as on slide 20).
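
A discretized sketch of these dynamics, again specialized (as a simplifying assumption) to the cardinality polytope { x ∈ [0,1]^N : ∑ x_i ≤ k }, where the inner linear maximization argmax { ⟨∇F(x), y⟩ : y ∈ P_M } is simply the indicator of the k largest estimated partial derivatives. Partial derivatives are estimated by sampling marginal values; all names and parameters are illustrative. For monotone submodular f, running T steps of size 1/T yields a fractional point x(1) with F(x(1)) ≥ (1 − 1/e)·OPT, up to discretization and sampling error.

```python
import random

def continuous_greedy_cardinality(f, n, k, T=100, samples=200):
    """Discretized continuous greedy: x(0) = 0; at each of T steps, estimate the
    gradient of F at x(t) and move by (1/T) * y*, where y* maximizes the
    linearized objective over P_M (for the cardinality polytope: top-k coordinates).
    `f` is a monotone submodular oracle on frozensets of {0,...,n-1} (illustrative)."""
    x = [0.0] * n
    for _ in range(T):
        # Estimate dF/dx_i(x) = E[f(R ∪ {i}) − f(R \ {i})] with R sampled from x.
        grad = [0.0] * n
        for _ in range(samples):
            R = frozenset(i for i in range(n) if random.random() < x[i])
            for i in range(n):
                grad[i] += f(R | {i}) - f(R - {i})
        grad = [g / samples for g in grad]
        # y* = indicator of the k largest entries (valid since grad >= 0 for monotone f).
        top = sorted(range(n), key=lambda i: grad[i], reverse=True)[:k]
        for i in top:
            x[i] = min(1.0, x[i] + 1.0 / T)
    return x  # fractional point in P_M; round with pipage rounding as above
```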
