CS675: Convex and Combinatorial Optimization, Fall 2019: Convex Optimization Problems


  1. CS675: Convex and Combinatorial Optimization, Fall 2019. Convex Optimization Problems. Instructor: Shaddin Dughmi

  2. Outline: 1. Convex Optimization Basics; 2. Common Classes; 3. Interlude: Positive Semi-Definite Matrices; 4. More Convex Optimization Problems

  3. Recall: Convex Optimization Problem. A problem of minimizing a convex function (or maximizing a concave function) over a convex set: minimize f(x) subject to x ∈ X, where X ⊆ R^n is convex and f : R^n → R is convex. Terminology: decision variable(s), objective function, feasible set, optimal solution/value, ε-optimal solution/value.
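
To connect the terminology above to something executable, here is a minimal sketch of one concrete convex problem written with cvxpy; the library choice and the particular instance (projecting a point onto the probability simplex) are illustrative assumptions, not part of the lecture.

```python
# Illustrative sketch (not from the lecture): a convex problem in cvxpy,
# mapping the terminology above to code. Requires: pip install cvxpy numpy
import cvxpy as cp
import numpy as np

n = 5
a = np.arange(n, dtype=float)                     # arbitrary data for the example

x = cp.Variable(n)                                # decision variable x in R^n
objective = cp.Minimize(cp.sum_squares(x - a))    # convex objective f(x) = ||x - a||^2
constraints = [x >= 0, cp.sum(x) == 1]            # feasible set X: the probability simplex (convex)

prob = cp.Problem(objective, constraints)
opt_value = prob.solve()                          # optimal value; x.value holds an optimal solution
print(opt_value, x.value)
```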

  4. Standard Form. Instances are typically formulated in the following standard form: minimize f(x) subject to g_i(x) ≤ 0 for i ∈ C_1, and a_i⊺x = b_i for i ∈ C_2, where each g_i is convex. Terminology: equality constraints, inequality constraints, active/inactive at x, feasible/infeasible, unbounded. In principle, every convex optimization problem can be formulated in this form (possibly implicitly); recall that every convex set is the intersection of halfspaces. When there is no objective function (or, equivalently, f(x) = 0 for all x), we say this is a convex feasibility problem.
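
As a companion sketch (again using cvxpy as an assumed tool, not something prescribed by the slides), a convex feasibility problem is just the standard form with a zero objective; the constraint data below is made up for the example.

```python
# Illustrative sketch: a convex feasibility problem, i.e. standard form with f(x) = 0.
import cvxpy as cp
import numpy as np

n = 3
x = cp.Variable(n)
a = np.ones(n)

constraints = [
    cp.sum_squares(x) - 4 <= 0,   # g_1(x) = ||x||^2 - 4 <= 0   (convex inequality)
    a @ x == 1,                   # a^T x = b                   (affine equality)
]

prob = cp.Problem(cp.Minimize(0), constraints)   # no objective: feasibility problem
prob.solve()
print(prob.status, x.value)       # status 'optimal' means a feasible point was found
```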

  5. Local and Global Optimality. x ∈ X is locally optimal if there is an open ball B centered at x such that f(x) ≤ f(y) for all y ∈ B ∩ X; it is globally optimal if it is an optimal solution. Fact: for a convex optimization problem, every locally optimal feasible solution is globally optimal. Proof: Let x be locally optimal and let y be any other feasible point. The line segment from x to y is contained in the feasible set, so by local optimality f(x) ≤ f(θx + (1 − θ)y) for θ sufficiently close to 1. Convexity of f (Jensen's inequality) then gives f(x) ≤ f(θx + (1 − θ)y) ≤ θf(x) + (1 − θ)f(y); rearranging yields f(x) ≤ f(y), so y is no better than x.
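
This fact is what makes local search reliable on convex problems. As a hedged illustration (the function, solver, and starting points below are my own choices, not from the slides), running a generic local minimizer from several random starts on a convex function should return essentially the same value every time.

```python
# Illustrative sketch: on a convex problem, a local method finds the global optimum
# regardless of the starting point. Function and starts chosen arbitrarily.
import numpy as np
from scipy.optimize import minimize

def f(x):
    # A smooth, strictly convex function on R^2: log(e^{x0} + e^{x1}) + ||x||^2
    return np.logaddexp(x[0], x[1]) + np.dot(x, x)

rng = np.random.default_rng(0)
values = [minimize(f, rng.normal(size=2)).fun for _ in range(5)]
print(values)   # all five values agree up to solver tolerance
```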

  6. Representation. Typically, by a problem we mean a family of instances, each of which is described either explicitly via problem parameters, or implicitly via an oracle, or something in between. Explicit representation: a family of linear programs of the form maximize c⊺x subject to Ax ≤ b, x ≥ 0 may be described by the data c ∈ R^n, A ∈ R^{m×n}, and b ∈ R^m.
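
For instance (an illustrative sketch; scipy is my choice of solver here, not something the slides reference), an explicitly represented LP family maps directly onto a generic solver call that consumes exactly the data (c, A, b).

```python
# Illustrative sketch: solving an explicitly represented LP  max c^T x  s.t.  Ax <= b, x >= 0.
# scipy's linprog minimizes, so we pass -c; the data below is made up.
import numpy as np
from scipy.optimize import linprog

c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0],
              [3.0, 1.0]])
b = np.array([4.0, 6.0])

res = linprog(-c, A_ub=A, b_ub=b, bounds=(0, None))   # bounds encode x >= 0
print(res.x, -res.fun)    # optimal solution and optimal value of the maximization
```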

  7. Representation. Oracle representation: at their most abstract, convex optimization problems of the form minimize f(x) subject to x ∈ X are described via a separation oracle for X and for epi f. Given additional data about an instance, namely a range [L, H] for its optimal value and a ball of volume V containing X, the ellipsoid method returns an ε-optimal solution using only poly(n, log((H − L)/ε), log V) oracle calls.
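
As a concrete picture of what a separation oracle is (this example is assumed, not lecture-provided), here is a sketch for the Euclidean ball X = {x : ||x − c|| ≤ r}: given a query point, the oracle either certifies membership or returns a hyperplane separating the point from X.

```python
# Illustrative sketch of a separation oracle for X = { x : ||x - center|| <= radius }.
# Returns ("member", None) if x is in X, otherwise ("separate", (a, b)) with
# a^T x > b >= a^T y for every y in X.
import numpy as np

def ball_separation_oracle(x, center, radius):
    d = x - center
    dist = np.linalg.norm(d)
    if dist <= radius:
        return ("member", None)
    a = d / dist                    # unit normal pointing from X toward x
    b = a @ center + radius         # max of a^T y over y in X
    return ("separate", (a, b))     # hyperplane a^T z = b separates x from X

# Example query: the point (3, 0) lies outside the unit ball around the origin.
print(ball_separation_oracle(np.array([3.0, 0.0]), np.zeros(2), 1.0))
```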

  8. Representation. In between: consider the following fractional relaxation of the Traveling Salesman Problem, described by a network (V, E) and distances d_e for e ∈ E: minimize Σ_e d_e x_e subject to Σ_{e ∈ δ(S)} x_e ≥ 2 for all S ⊂ V with S ≠ ∅, and x ≥ 0. The representation of this LP is implicit, in the form of a network. Using this representation, separation oracles can be implemented efficiently and used as subroutines in the ellipsoid method.
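
To make "separation oracles can be implemented efficiently" concrete: for the cut constraints above, one standard approach (sketched below with networkx as an assumed tool; none of this code is from the lecture) is to compute a global minimum cut of the graph with edge capacities x_e. If the minimum cut value is below 2, the cut side S yields a violated constraint Σ_{e ∈ δ(S)} x_e ≥ 2; otherwise x satisfies them all.

```python
# Illustrative sketch of a separation oracle for the constraints sum_{e in delta(S)} x_e >= 2.
# Uses a global minimum cut (Stoer-Wagner); requires networkx.
import networkx as nx

def cut_separation_oracle(V, x):
    """V: iterable of nodes; x: dict mapping undirected edges (u, v) to values x_e >= 0.
    Returns None if all cut constraints hold, else a set S whose constraint is violated."""
    G = nx.Graph()
    G.add_nodes_from(V)
    for (u, v), val in x.items():
        G.add_edge(u, v, weight=val)           # capacities are the current x_e
    cut_value, (S, _) = nx.stoer_wagner(G)     # global minimum-weight cut
    if cut_value >= 2:
        return None                            # every cut constraint is satisfied
    return S                                   # the constraint for this S is violated

# Example: a 4-cycle with all x_e = 0.5 violates the cut constraints (min cut = 1).
V = [0, 1, 2, 3]
x = {(0, 1): 0.5, (1, 2): 0.5, (2, 3): 0.5, (3, 0): 0.5}
print(cut_separation_oracle(V, x))
```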

  9. Equivalence. Next up: we look at some common classes of convex optimization problems. Technically, not all of them will be convex in their natural representation; however, we will show that they are "equivalent" to a convex optimization problem. Equivalence: loosely speaking, two optimization problems are equivalent if an optimal solution to one can easily be "translated" into an optimal solution for the other. Note: deciding whether an optimization problem is equivalent to a tractable convex optimization problem is, in general, a black art honed by experience; there is no silver bullet.

  10. Outline: 1. Convex Optimization Basics; 2. Common Classes; 3. Interlude: Positive Semi-Definite Matrices; 4. More Convex Optimization Problems

  11. Linear Programming. We have already seen linear programming: minimize c⊺x subject to Ax ≤ b.

  12. Linear Fractional Programming. Generalizes linear programming: minimize (c⊺x + d)/(e⊺x + f) subject to Ax ≤ b and e⊺x + f > 0. The objective is quasiconvex (in fact, quasilinear) over the open halfspace where the denominator is positive.
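
One standard way to exploit this quasiconvexity (a sketch under my own assumptions; the lecture has not introduced this method, and a direct transformation to an LP also exists) is bisection on the objective value t: for fixed t, the sublevel condition (c⊺x + d)/(e⊺x + f) ≤ t becomes the linear constraint c⊺x + d ≤ t(e⊺x + f), so each bisection step is an LP feasibility problem.

```python
# Illustrative sketch: minimize (c^T x + d)/(e^T x + f) s.t. Ax <= b, e^T x + f > 0
# by bisection on t, solving an LP feasibility problem at each step. Data is made up.
import numpy as np
from scipy.optimize import linprog

def feasible_for_t(t, c, d, e, f, A, b, eps=1e-9):
    # Feasible iff there exists x with Ax <= b, (c - t e)^T x <= t f - d, and e^T x + f >= eps.
    A_ub = np.vstack([A, (c - t * e)[None, :], -e[None, :]])
    b_ub = np.concatenate([b, [t * f - d], [f - eps]])
    res = linprog(np.zeros(len(c)), A_ub=A_ub, b_ub=b_ub, bounds=(None, None))
    return res.success

def solve_by_bisection(c, d, e, f, A, b, lo, hi, tol=1e-6):
    # lo, hi: assumed known bounds on the optimal value.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (lo, mid) if feasible_for_t(mid, c, d, e, f, A, b) else (mid, hi)
    return hi

# Tiny example in one variable: minimize (x + 1)/(x + 2) subject to 0 <= x <= 3.
c, d, e, f = np.array([1.0]), 1.0, np.array([1.0]), 2.0
A, b = np.array([[1.0], [-1.0]]), np.array([3.0, 0.0])
print(solve_by_bisection(c, d, e, f, A, b, lo=0.0, hi=1.0))   # ~0.5, attained at x = 0
```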
