

  1. Primal heuristic for MINLPs in SCIP
Ambros Gleixner and Felipe Serrano
Zuse Institute Berlin · serrano@zib.de
SCIP Optimization Suite · http://scip.zib.de
Workshop on Discrepancy Theory and Integer Programming
Amsterdam · June 12, 2018

  2. Outline
• Introduction: LP-based Branch and Bound
• Spatial Branch and Bound
• Heuristics: Sub-NLP, NLP diving, Multi-start, MPEC, Undercover, RENS
• Conclusion
Gleixner, Serrano · MINLPs in SCIP · 1 / 24

  3. Mixed-Integer Nonlinear Programs (MINLPs)
min cᵀx
s.t. g_k(x) ≤ 0 ∀ k ∈ [m]
     x_i ∈ ℤ ∀ i ∈ I ⊆ [n]
     x_i ∈ [ℓ_i, u_i] ∀ i ∈ [n]
The functions g_k ∈ C¹([ℓ, u], ℝ) can be convex or nonconvex.
[Figure: surface plots of a convex and a nonconvex function]

  4. One way of solving MINLPs to global optimality
• Methods for finding (good) feasible solutions → primal heuristics
• Proof that there is no better solution → LP-based spatial branch and bound

  5. LP-based spatial Branch & Bound
• Build a polyhedral relaxation R (possibly in an extended formulation)
• Solve R and get a solution x*
• If x* is feasible, we are done. If not:
• Try to strengthen R by separating x*
• When that is not possible, branch, possibly on continuous variables (spatially)
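The loop above can be sketched in miniature. The toy below does a 1-D global minimization in which a Lipschitz lower bound stands in for the polyhedral relaxation R; the objective x·sin(3x), the interval [0, 3], and the Lipschitz constant 10 are assumptions made for illustration, not from the slides.

```python
import math

def spatial_bb(f, lo, hi, lip, tol=1e-6):
    """Toy 1-D spatial branch and bound for min f(x) on [lo, hi].
    The Lipschitz bound f(mid) - lip*(b - a)/2 plays the role of the
    polyhedral relaxation R: a cheap lower bound on each interval."""
    best_x, best_val = lo, f(lo)          # incumbent (any evaluation is feasible)
    stack = [(lo, hi)]
    while stack:
        a, b = stack.pop()
        mid = (a + b) / 2
        fmid = f(mid)
        if fmid < best_val:               # improve the incumbent
            best_x, best_val = mid, fmid
        lower = fmid - lip * (b - a) / 2  # "solve the relaxation"
        if lower >= best_val - tol:       # prune: node cannot improve
            continue
        stack += [(a, mid), (mid, b)]     # branch spatially on x
    return best_x, best_val

x_opt, val = spatial_bb(lambda x: x * math.sin(3 * x), 0.0, 3.0, lip=10.0)
```

On termination the incumbent is within `tol` of the global minimum, because every discarded interval had a relaxation bound no better than the incumbent.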


  10. Building polyhedral relaxations: the problem
• We can start with the variables' bounds as our relaxation.
• Then we have to solve the separation problem: given {x ∈ [ℓ, u] : g(x) ≤ 0} and x̄ s.t. g(x̄) > 0, either
  • find a separating inequality, or
  • prove that none exists.
• Very expensive for general g
• However, when g is convex, it is as easy as computing a gradient: g(x̄) + ∇g(x̄)(x − x̄) ≤ 0
• Idea: find a convex underestimator ĝ of g(x)
• Then {x ∈ [ℓ, u] : g(x) ≤ 0} ⊆ {x ∈ [ℓ, u] : ĝ(x) ≤ 0}
• If ĝ(x̄) > 0 we can separate.
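The gradient cut for a convex g can be written in a few lines. The disc constraint g(x, y) = x² + y² − 4 ≤ 0 and the point (2, 2) below are assumed examples, not from the slides.

```python
def gradient_cut(g, grad_g, xbar):
    """Linearize a convex constraint g(x) <= 0 at a point xbar with
    g(xbar) > 0. Convexity gives g(x) >= g(xbar) + grad_g(xbar).(x - xbar),
    so g(xbar) + grad_g(xbar).(x - xbar) <= 0 is valid for every feasible
    x, while its left-hand side at xbar equals g(xbar) > 0."""
    b, a = g(xbar), grad_g(xbar)
    return lambda x: b + sum(aj * (xj - xbj) for aj, xj, xbj in zip(a, x, xbar))

# assumed example: the disc g(x, y) = x^2 + y^2 - 4 <= 0
g = lambda p: p[0] ** 2 + p[1] ** 2 - 4
grad = lambda p: (2 * p[0], 2 * p[1])
cut = gradient_cut(g, grad, (2.0, 2.0))     # (2, 2) is infeasible: g = 4 > 0
```

Evaluating the returned cut shows the separation: it is positive at (2, 2) and nonpositive at every feasible point such as (0, 0) or (1, 1).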


  13. Building polyhedral relaxations: an example
• x² + y² + 2·exp(xy³) ≤ 3 with x, y ∈ [−2, 2]
• exp(·) > 0 ⇒ x² + y² ≤ x² + y² + 2·exp(xy³)
[Figure: feasible region of the constraint inside the box [−2, 2]²]
• Admittedly, an ad-hoc argument
• In practice: if functions are simple enough, we know convex/concave envelopes
• If a function is not simple enough, make it simpler!

  14. Building polyhedral relaxations in SCIP
• x² + y² + 2·exp(xy³) ≤ 3
• Introduce auxiliary variables:
  • z₁ = y³
  • z₂ = x·z₁
  • x² + y² + 2·exp(z₂) ≤ 3
• We can find polyhedral relaxations of z₁ = y³
• For z₂ = x·z₁ with x ∈ [x̲, x̄] and z₁ ∈ [z̲₁, z̄₁] we have the McCormick inequalities:
  max{x̲z₁ + z̲₁x − x̲z̲₁, x̄z₁ + z̄₁x − x̄z̄₁} ≤ z₂ ≤ min{x̲z₁ + z̄₁x − x̲z̄₁, x̄z₁ + z̲₁x − x̄z̲₁}
• Finally, x² + y² + 2·exp(z₂) is convex


  16. Spatial Branch and Bound
Solutions might not be separable: conv{(x, y) : x² = y, x ∈ [ℓ, u]} is
  x² ≤ y ≤ ℓ² + (u² − ℓ²)/(u − ℓ)·(x − ℓ) ∀ x ∈ [ℓ, u].
Branching on a nonlinear variable in a nonconvex constraint allows for tighter relaxations.
[Figure: the parabola y = x² with its secant overestimator, before and after branching]
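The secant overestimator above simplifies nicely, since (u² − ℓ²)/(u − ℓ) = u + ℓ. A minimal sketch, with the interval [0, 2] chosen as an assumed example:

```python
def secant(x, l, u):
    """Secant (chord) overestimator of x**2 on [l, u]: the line through
    (l, l**2) and (u, u**2); its slope (u**2 - l**2)/(u - l) equals u + l,
    so it simplifies to l**2 + (u + l)*(x - l)."""
    return l * l + (u + l) * (x - l)

# on [0, 2] the secant is the line y = 2x: tight at both endpoints
vals = [(x, x * x, secant(x, 0.0, 2.0)) for x in [0.0, 0.5, 1.0, 1.5, 2.0]]
```

Branching shrinks [ℓ, u], which lowers the secant and tightens the relaxation, which is exactly the point of the slide.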

  17. Outline
• Introduction: LP-based Branch and Bound
• Spatial Branch and Bound
• Heuristics: Sub-NLP, NLP diving, Multi-start, MPEC, Undercover, RENS
• Conclusion


  19. Sub-NLP
• Idea: fix integer variables to integer values and run a local NLP solver
• Good for: MINLP
Let x̄ be the LP optimum of the current node's relaxation:
• If there is an i ∈ I such that x̄ᵢ ∉ ℤ → STOP
• Fix xᵢ to x̄ᵢ for all i ∈ I.
• Solve the remaining NLP to local optimality using x̄ as initial point.
Extends MIP heuristics to MINLP:
• SCIP runs its MIP heuristics on a MIP relaxation of the MINLP
• use a heuristic's proposed solution as x̄
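The steps above can be sketched as follows. The tiny model, the LP optimum (0.9, 3.0), and the plain gradient descent standing in for the local NLP solver (SCIP would call a real one, e.g. Ipopt) are all assumptions for illustration.

```python
def subnlp_heuristic(xbar_int, grad_x, x0, int_tol=1e-6):
    """Sketch of the sub-NLP heuristic: stop unless every integer
    variable is integral in the relaxation optimum; otherwise fix them
    and solve the remaining NLP locally. A toy gradient descent stands
    in for the real local NLP solver."""
    for v in xbar_int:
        if abs(v - round(v)) > int_tol:
            return None                       # STOP: heuristic does not run
    y = [round(v) for v in xbar_int]          # fix the integer variables
    x = x0                                    # warm start at the LP optimum
    for _ in range(200):
        x -= 0.1 * grad_x(x, y)               # toy local NLP solve
    return x, y

# assumed toy MINLP: min (x - 0.3)^2 + (y - 3)^2 with y integer,
# and LP optimum (x, y) = (0.9, 3.0)
grad = lambda x, y: 2 * (x - 0.3)
res = subnlp_heuristic([3.0], grad, x0=0.9)
```

With ȳ = 3 already integral, the heuristic fixes y = 3 and the continuous solve drives x to 0.3; with a fractional ȳ it would return immediately.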

  20. NLP diving
• Idea: solve NLP relaxations, fixing an integer variable after each NLP
• Good for: MINLP
• solve the NLP relaxation
• fix an integer variable
• propagate
• repeat
• if a fixing is infeasible, backtrack:
  • undo the last fixing and fix to another value
  • if infeasible again, abort
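A toy version of the dive: the separable model below (an assumption, not from the slides) lets each "NLP relaxation" be solved in closed form, and propagation and backtracking are omitted because no fixing can become infeasible in it.

```python
def nlp_dive(targets):
    """Toy NLP-diving loop for min sum_i (x_i - t_i)^2 with every x_i
    integer. In this separable model the NLP relaxation is solved in
    closed form: free variables sit at their targets, fixed ones stay
    fixed. After each 'solve', the most fractional variable is fixed."""
    fixed = {}
    while len(fixed) < len(targets):
        relax = [fixed.get(i, t) for i, t in enumerate(targets)]   # NLP relaxation
        # fix the most fractional unfixed variable to the nearest integer
        i = max((i for i in range(len(targets)) if i not in fixed),
                key=lambda i: min(relax[i] % 1, 1 - relax[i] % 1))
        fixed[i] = round(relax[i])
    return [fixed[i] for i in range(len(targets))]
```

For targets (1.4, 2.7) the dive fixes x₁ first (fractionality 0.4 vs. 0.3) and ends at the integer point (1, 3).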

  21. Multi-start heuristic [Chinneck et al. 2013]
• Idea: use different starting points for the NLP solver
• Good for: NLPs and MINLPs without too many integer variables
Example:
  4 ≤ x² + y² ≤ 9
  4 ≤ (x − 2)² + y² ≤ 9
  x ∈ [−4, 6], y ∈ [−4, 4]

  22. Multi-start in SCIP
Input: nonlinear constraints gᵢ(x) ≤ 0
1. Generate random points in [ℓ, u]
2. For each point xᵏ, move it toward the feasible region: for each violated constraint gᵢ(xᵏ) > 0 compute
     sᵢᵏ = −(gᵢ(xᵏ)/‖∇gᵢ(xᵏ)‖²)·∇gᵢ(xᵏ)
   and update xᵏ ← xᵏ + (1/n)·Σᵢ₌₁ⁿ sᵢᵏ
3. Identify clusters of points C₁, …, C_p
4. For each cluster:
   • build the convex combination y = (1/|C_j|)·Σ_{x ∈ C_j} x
   • round each fractional integer variable to the closest integer
   • use the resulting point as starting point for the NLP
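Step 2 can be sketched on the two-ring example from the previous slide; the exact step direction and the averaging over all n constraints are reconstructed from the garbled slide text, and the starting point (1, 0.5) is an assumption (clustering and rounding, steps 3 and 4, are omitted).

```python
def consensus_step(x, cons):
    """One feasibility step of step 2 above: every violated constraint
    g_i(x) <= 0 contributes the projection-like step
    s_i = -g_i(x)/||grad g_i(x)||^2 * grad g_i(x), and x moves by the
    average (1/n) * sum_i s_i over all n constraints."""
    n = len(cons)
    move = [0.0] * len(x)
    for g, grad in cons:
        v = g(x)
        if v > 0:                              # satisfied constraints contribute 0
            d = grad(x)
            nrm2 = sum(dj * dj for dj in d)
            for j, dj in enumerate(d):
                move[j] -= v / nrm2 * dj
    return [xj + mj / n for xj, mj in zip(x, move)]

# the two-ring example from the previous slide, written as g_i(x) <= 0
cons = [
    (lambda p: 4 - p[0]**2 - p[1]**2,        lambda p: (-2*p[0], -2*p[1])),
    (lambda p: p[0]**2 + p[1]**2 - 9,        lambda p: (2*p[0], 2*p[1])),
    (lambda p: 4 - (p[0]-2)**2 - p[1]**2,    lambda p: (-2*(p[0]-2), -2*p[1])),
    (lambda p: (p[0]-2)**2 + p[1]**2 - 9,    lambda p: (2*(p[0]-2), 2*p[1])),
]

x = [1.0, 0.5]                                 # one sampled starting point
for _ in range(100):
    x = consensus_step(x, cons)
```

From (1, 0.5), which violates both inner-ring constraints, the iterates move up the symmetry axis x = 1 into the intersection of the two rings.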
