  1. Hybrid CSP & Global Optimization. Michel RUEHER, University of Nice Sophia-Antipolis / I3S – CNRS, France. Courtesy of Alexandre Goldsztejn, Yahia Lebbah, Claude Michel. June 2011, ACP Summer School “Hybrid Methods for Constraint Programming”, Turunç

  2. Outline

  3. The Problem
  We consider the continuous global optimisation problem

      P ≡  min  f(x)
           s.t. g_j(x) = 0,  j = 1..k
                g_j(x) ≤ 0,  j = k+1..m
                x̲ ≤ x ≤ x̄

  with
  ◮ X = [x̲, x̄]: a vector of intervals, i.e. a box of ℝ^n
  ◮ f : ℝ^n → ℝ and g_j : ℝ^n → ℝ
  ◮ f and the g_j are continuously differentiable on X

  4. Trends in global optimisation
  ◮ Performance: the most successful systems (Baron, αBB, ...) use local methods and linear relaxations → not rigorous (they work with floats)
  ◮ Rigour: systems mainly rely on interval computation ... the available ones (e.g., Globsol) are quite slow
  ◮ Challenge: to combine the advantages of both approaches in an efficient and rigorous global optimisation framework

  5. Example of a flaw due to a lack of rigour
  Consider the following optimisation problem:

      min  x
      s.t. y − x² ≥ 0
           y − x²(x − 2) + 10⁻⁵ ≤ 0
           x, y ∈ [−10, +10]

  Baron 6.0 and Baron 7.2 find 0 as the minimum ... yet no feasible point exists with x = 0: the first constraint forces y ≥ 0 there, while the second forces y ≤ −10⁻⁵.
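  A quick numeric sanity check of that contradiction (a small illustration added here, not part of the slides):

```python
# At x = 0, the first constraint y - x^2 >= 0 forces y >= 0, while the second
# constraint y - x^2*(x - 2) + 1e-5 <= 0 forces y <= -1e-5, so x = 0 is infeasible.
def c1(x, y):
    return y - x**2                      # must be >= 0

def c2(x, y):
    return y - x**2 * (x - 2) + 1e-5     # must be <= 0

for y in (0.0, -1e-5):                   # two candidate values of y at x = 0
    print(y, c1(0.0, y) >= 0, c2(0.0, y) <= 0)
# y = 0.0   -> c1 holds, c2 is violated
# y = -1e-5 -> c2 holds, c1 is violated
```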

  6. Branch and Bound Algorithm (1)
  ◮ BB Algorithm – Scheme
    While ℒ ≠ ∅ do        % ℒ initialized with the input box
      • Select a box B from the set of current boxes ℒ
      • Reduce B (filtering or tightening)
      • Lower bounding of f in box B
      • Upper bounding of f in box B
      • Update of f̲ and f̄ (the current bounds on the global minimum)
      • Split B (if not empty)
  ◮ Upper bounding – critical issue: to prove the existence of a feasible point in a reduced box
  ◮ Lower bounding – critical issue: to achieve an efficient pruning

  7. Branch and bound algorithm (2)

    Function BB(IN x, ε; OUT S, [L, U])
      % S: set of proven feasible points
      % f_x denotes the interval of possible values of f in the box x
      % nbStarts: number of starting points in the first upper bounding
      ℒ := {x};  (L, U) := (−∞, +∞);  S := UpperBounding(x, nbStarts);
      while w([L, U]) > ε ∧ ℒ ≠ ∅ do
        x′ := x″ such that f̲_x″ = min{ f̲_x″ : x″ ∈ ℒ };  ℒ := ℒ \ {x′};
        f̄_x′ := min(f̄_x′, U);
        x′ := Prune(x′);
        f̲_x′ := LowerBound(x′);
        S := S ∪ UpperBounding(x′, 1);
        if x′ ≠ ∅ then (x′₁, x′₂) := Split(x′);  ℒ := ℒ ∪ {x′₁, x′₂};
        if ℒ ≠ ∅ then (L, U) := ( min{ f̲_x″ : x″ ∈ ℒ }, min{ f̄_x″ : x″ ∈ S } )
      endwhile
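  A compact Python sketch of this loop (illustrative only; prune, lower_bound, upper_bounding and split stand for the interval/LP machinery of the following slides, and their signatures are assumptions):

```python
import heapq
import itertools

def branch_and_bound(x0, eps, prune, lower_bound, upper_bounding, split, nb_starts=5):
    # Hypothetical callbacks: prune(box, U) -> reduced box or None,
    # lower_bound(box) -> float, upper_bounding(box, n) -> list of (point, f_value),
    # split(box) -> list of sub-boxes.
    tie = itertools.count()                       # tie-breaker so boxes are never compared
    S = list(upper_bounding(x0, nb_starts))       # proven feasible points found so far
    U = min((f for _, f in S), default=float("inf"))
    heap = [(lower_bound(x0), next(tie), x0)]     # boxes ordered by their lower bound
    L = heap[0][0]
    while heap and U - L > eps:
        L, _, box = heapq.heappop(heap)           # box with the smallest lower bound
        box = prune(box, U)                       # filtering / tightening with f <= U
        if box is None:
            continue
        lb = lower_bound(box)
        S += upper_bounding(box, 1)               # try to improve the incumbent
        U = min([U] + [f for _, f in S])
        if lb <= U:                               # otherwise the box cannot hold the optimum
            for child in split(box):
                heapq.heappush(heap, (lower_bound(child), next(tie), child))
        if heap:
            L = heap[0][0]
    return S, (L, U)
```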

  8. Computing “sharp” upper bounds
  ◮ Upper bounding
    • local search → approximate feasible point x_approx
    • epsilon-inflation process and proof → provide a feasible box x_proved
    • compute f* := min(f(x_proved), f*)
  ◮ Critical issue: to prove the existence of a feasible point in a reduced box
    • Singularities
    • Guess point too far from a feasible region (local search works with floats)
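  A rough sketch of the epsilon-inflation idea (the rigorous existence test itself, typically an interval Newton or Borsuk/Miranda style test, is abstracted behind a hypothetical proves_feasible predicate):

```python
def inflate_and_prove(x_approx, proves_feasible, eps0=1e-8, factor=8.0, max_iter=20):
    """Grow a small box around an approximate feasible point until a rigorous
    interval test certifies that it contains a true feasible point."""
    eps = eps0
    for _ in range(max_iter):
        box = [(xi - eps, xi + eps) for xi in x_approx]   # inflate around the guess
        if proves_feasible(box):                          # placeholder for the interval proof
            return box                                    # proven feasible box
        eps *= factor                                     # inflate further and retry
    return None                                           # proof failed (e.g. singular case)
```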

  9. Using the lower bound to get an upper bound
  (Figure: one Branch & Bound step, where P is the set of feasible points, R its linear relaxation, L the safe lower bound and U the current upper bound.)
  Idea: modify the safe lower bound ... to get an upper bound!

  10. Lower bound: a good starting point to find a feasible upper bound?
  (Figure: the set F of feasible points, the set of non-feasible points, an approximate feasible point, and N, the optimal solution of R.)
  N, the optimal solution of R, is not a feasible point of P but is (maybe) a good starting point:
  ◮ BB splits the domains at each iteration: smaller box → N closer to the optima of P
  ◮ The proof process inflates a box around the guess point → compensates for the distance to the feasible region

  11. Method
  ◮ Correction procedure to get a better feasible point from a given approximate feasible point
    → exploits Newton-Raphson for under-constrained systems of equations (and the Moore-Penrose inverse)
  Good convergence when the starting point is nearly feasible

  12. Handling square systems of equations
  ◮ g = (g_1, ..., g_m) : ℝ^n → ℝ^m (n = m)
    → Newton-Raphson step:  x^(i+1) = x^(i) − J_g(x^(i))⁻¹ g(x^(i))
  Converges well if the exact solution to be approximated is not singular
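  For illustration (not from the slides), one such step in numpy on a toy square system:

```python
import numpy as np

def newton_step(g, jac, x):
    """One Newton-Raphson step x <- x - J_g(x)^(-1) g(x) for a square system."""
    return x - np.linalg.solve(jac(x), g(x))

# Toy example: intersect the unit circle with the line y = x.
g = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[0] - v[1]])
jac = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [1.0, -1.0]])

x = np.array([1.0, 0.5])
for _ in range(6):
    x = newton_step(g, jac, x)
print(x)   # close to (0.7071, 0.7071), i.e. sqrt(2)/2
```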

  13. Handling under-constrained systems of equations
  Manifold of solutions → the linearized system l(x) = 0 is under-constrained
  → Choose a solution x^(1) of l(x) = 0
  Best choice: the solution of l(x) = 0 closest to x^(0)
  It can easily be computed with the Moore-Penrose inverse:
      x^(i+1) = x^(i) − A_g⁺(x^(i)) g(x^(i))
  where A_g⁺ ∈ ℝ^(n×m) is the Moore-Penrose inverse of A_g, i.e. it yields the solution of the linearized equation which minimizes ||x^(1) − x^(0)||
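  A numpy sketch of this correction step on a toy under-constrained system (one equation, two unknowns; names and values are illustrative):

```python
import numpy as np

def correction_step(g, jac, x):
    """Moore-Penrose step: move to the solution of the linearized system
    g(x) + J_g(x) (x' - x) = 0 that is closest to the current point x."""
    return x - np.linalg.pinv(jac(x)) @ g(x)

# Toy system: a single constraint x^2 + y^2 - 1 = 0 in R^2 (a manifold of solutions).
g = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0])
jac = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]]])

x = np.array([0.9, 0.6])       # nearly feasible guess (e.g. coming from a local search)
for _ in range(4):
    x = correction_step(g, jac, x)
print(x, g(x))                 # a nearby point on the circle, residual ~ 0
```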

  14. Handling under-constrained systems of equations and inequalities
  ◮ Under-constrained systems of equations and inequalities → introduce slack variables (see the sketch below)
  ◮ Initial values for the slack variables have to be provided: a slightly positive value → breaks the symmetry → good convergence
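  One plausible reading of this transformation, with squared slacks so that an inequality becomes an equation (the squared form and all concrete values below are my assumptions for illustration, not taken from the slide):

```python
import numpy as np

# Rewrite the inequality y - x^2 <= 0 as the equation y - x^2 + s^2 = 0 and apply
# the same Moore-Penrose correction step to the augmented variables v = (x, y, s).
def g_aug(v):
    x, y, s = v
    return np.array([y - x**2 + s**2])

def jac_aug(v):
    x, y, s = v
    return np.array([[-2.0 * x, 1.0, 2.0 * s]])

v = np.array([1.0, 1.5, 0.03])     # slack started slightly positive: at s = 0 the
for _ in range(5):                 # derivative w.r.t. s vanishes (the +/- s symmetry)
    v = v - np.linalg.pinv(jac_aug(v)) @ g_aug(v)
print(v, g_aug(v))                 # y - x^2 = -s^2 <= 0 : the inequality now holds
```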

  15. A new upper bounding strategy

    Function UpperBounding(IN x, x*_LP; INOUT S′)
      % S′: list of proven feasible boxes
      % x*_LP: the optimal solution of the LP relaxation of P(x)
      S′ := ∅
      x*_corr := FeasibilityCorrection(x*_LP)   % improve the feasibility of x*_LP
      x_p := InflateAndProve(x*_corr, x)
      if x_p ≠ ∅ then S′ := S′ ∪ {x_p} endif
      return S′

  16. Experiments
  ◮ Significant set of benchmarks from the COCONUT project
  ◮ Selection of 35 benchmarks where Icos did find the global minimum while relying on an unsafe local search
  ◮ 31 benchmarks are solved and proved within a 30 s timeout
  ◮ Almost all benchmarks are solved in much less time and with many more proven solutions

  17. Using CSP to boost safe OBR
  ◮ OBR (optimality-based reduction): uses the known bounds of the objective function → to reduce the size of the domains
  ◮ Refutation techniques → boosting safe OBR

  18. Lower bounding
  ◮ Relaxing the problem
    • linear relaxation R of P:  min d^T x  s.t.  Ax ≤ b
    • an LP solver → f*_R (a lower bound of f on the box)
    → numerous splittings
  ◮ OBR is a way to speed up the reduction process
  (Figure: the feasible set of P contained in its linear relaxation R.)
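  For illustration (not from the slides), computing such an LP lower bound with scipy on a made-up relaxation min d^T x, Ax ≤ b over the current box:

```python
import numpy as np
from scipy.optimize import linprog

d = np.array([1.0, 0.0])                 # relaxed objective, e.g. "min x"
A = np.array([[-1.0, 1.0],               # made-up linearized constraints over the box
              [ 1.0, 1.0]])
b = np.array([2.0, 4.0])
box = [(-10.0, 10.0), (-10.0, 10.0)]     # current box for (x, y)

res = linprog(d, A_ub=A, b_ub=b, bounds=box)
print(res.fun)                           # f*_R, a lower bound on f over the box
# Note: the floating-point LP optimum is not rigorous by itself; a safe bound
# requires an additional certified post-processing of the LP solution.
```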

  19. Optimality-Based Reduction
  ◮ Introduced by Ryoo and Sahinidis
    • takes advantage of the known bounds of the objective function to reduce the size of the domains
    • uses a well-known property of the saddle point to compute new bounds for the domains from the known bounds of the objective function

  20. Theorems of OBR
  ◮ Let [L, U] be the domain of f, where
    • U is an upper bound of the initial problem P
    • L is a lower bound of a convex relaxation R of P
  If the constraint x_i − x̄_i ≤ 0 is active at the optimal solution of R and has a corresponding multiplier λ*_i > 0 (λ* being the optimal solution of the dual of R), then
      x_i ≥ x′_i  with  x′_i = x̄_i − (U − L) / λ*_i
  If x′_i > x̲_i, the domain of x_i can be shrunk to [x′_i, x̄_i] without loss of any global optimum
  ◮ Similar theorems hold for x̲_i − x_i ≤ 0 and g_i(x) ≤ 0.
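  A tiny numeric illustration of this reduction rule (the numbers are made up):

```python
def obr_reduce_upper_active(x_lo, x_hi, lam, L, U):
    """When the bound constraint x_i <= x_hi is active with dual multiplier lam > 0,
    every global optimum satisfies x_i >= x_hi - (U - L) / lam."""
    new_lo = x_hi - (U - L) / lam
    return (max(x_lo, new_lo), x_hi)

# Made-up values: domain [0, 10], multiplier 2, objective bounds L = 3, U = 7.
print(obr_reduce_upper_active(0.0, 10.0, lam=2.0, L=3.0, U=7.0))   # -> (8.0, 10.0)
```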

  21. OBR: intuitions
  ◮ Ryoo & Sahinidis 96
  (Figure: with objective bounds L and U, the active bound constraints give the reduced bounds x′_i = x̄_i − (U − L)/λ*_i and x′_i = x̲_i + (U − L)/λ*_i.)
    • does not modify the branch and bound process itself
    • almost for free!
