


Automatically learned Strategies (Special Topics in Meta-heuristics)

Eric Bourreau, Remi Coletta

LIRMM (CS lab) Montpellier University France {Eric.Bourreau,Remi.Coletta}@lirmm.fr

EURO XXI 2/17

Motivations (keyword)

  • Decomposition of the solving parts:

– SS: Search Strategy (structure): Var/Val choice (heuristic)
– ES: Exploration Strategy (walk): DFS/BFS, …
– LS: Limitation Strategy (restriction): complete/partial, …

  • SS: Min, Max, Split, Input, FirstFail, Nearest, RedCost, Phero
  • ES: DFS, BFS, BFS-LA, BrFS, PTS, IterSamp, ACO, RDS
  • LS: LDS(i), SOL(k), NOD(n), Leaves(n), nbBack(n), Credit, Barrier, DBDFS

Strategy as keyword [Towards an on-line optimisation framework, EOLE Consortium, OLCP'01]



Motivations (sentence)

  • Rewrite old heuristics:

– Masure = DFS(LOC(DP))
– Pesant = DFS(k-Opt(Nearest))
– Benoist = Branch(Move(Flow))
– Rousseau = SEQ(LDS(3,vs1), VNS(1000, vs2))

  • Prop
  • FAIL
  • Success
  • Assign
  • SEQ
  • ALT
  • While
  • THEN

[Perron, ILOG, CP'99] [Laburthe, Salsa, Constraints'02] [de Givry, Tools, Comp&OR'04]

Search ::= BasicSearch | CompoundSearch
BasicSearch ::= Decision | BasicComp
Decision ::= Refinement | Transformation
Refinement ::= SETV(Variable, Value) | REMV(Variable, Value) | ... | PROP(Integer)
Transformation ::= ASS(Variable, Value) | UNASS(Variable) | ... | ENLARGE(Variable, Values)
BasicComp ::= FAIL | SUCCESS | SEQ(Search1, Search2) | ALT(Search1, Search2)
CompoundSearch ::= GENERATE(Variable) | GENERATE(Variables) | ... | FLIP(Variables) | LOCS(Integer, Variables) | ...

  • Add grammar and control
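As a rough illustration of how the combinators named in the grammar compose (this is an invented sketch, not the actual Salsa/ILOG semantics), a "search" can be modeled as a function from a state to a new state, or None on failure:

```python
# Rough model of the search combinators from the grammar (SEQ, ALT, FAIL,
# SUCCESS, ASS). A "search" maps a state (a dict of assignments) to a new
# state, or None on failure. Invented illustration only.

def SUCCESS(state):
    return state

def FAIL(state):
    return None

def ASS(var, val):
    # Transformation: assign var := val in a copy of the state.
    def search(state):
        return {**state, var: val}
    return search

def SEQ(s1, s2):
    # Run s1, then s2 on its result; fail if either fails.
    def search(state):
        mid = s1(state)
        return s2(mid) if mid is not None else None
    return search

def ALT(s1, s2):
    # Try s1; if it fails, fall back to s2 on the original state.
    def search(state):
        out = s1(state)
        return out if out is not None else s2(state)
    return search

# Example: assign x then y; if that failed, keep the original state.
strategy = ALT(SEQ(ASS("x", 1), ASS("y", 2)), SUCCESS)
print(strategy({}))  # {'x': 1, 'y': 2}
```

SEQ and ALT here mirror sequential composition and left-biased choice; real CP search combinators also undo domain filtering on backtracking, which this sketch omits.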


Motivations (extend and learn)

<Build> ::= Insert(i) | <LDS> | DO(<Build>,<Optimize>) | FORALL(<LDS>,<Optimize>)
<Optimize> ::= CHAIN(n,m) | TREE(n,m,k) | LNS(n,h,<Build>) | LOOP(n,<Optimize>) | THEN(<Optimize>)
<LDS> ::= LDS(i,n,l)


[Caseau, A Meta-Heuristic Factory for the VRP, CP'99]



Main Drawbacks

  • In the discovery part:

– For each new generation, you must evaluate the quality of the new strategy, which can be time-consuming (especially for bad strategies!)

  • Still too close to the problem area

– Patterns are dedicated to:

  • routing problems (k-opt),
  • scheduling (shuffling),
  • frequency assignment (quasi-clique resolution)


Outline

  • Motivations / Drawbacks
  • My Solution : Constraint Programming
  • New Framework
  • Experiments (in progress)
  • Conclusion

  • Encountered Problems



Time Machine Learning

  • Quality predictor to evaluate performance quickly

– Build a training set (a few thousand runs)
– Validate the prediction on the important results (time, number of nodes)
– Evaluate it on new strategies in polynomial time

  • Possible choices

– Neural network (perceptron)
– Decision tree and regression (M5P, J48)
– Instance-based learning (k-means, SVM)

[Hutter, Hamadi, Hoos, Leyton-Brown, CP'06] [Adenso-Díaz, Laguna, OR'06]
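One of the listed options, instance-based learning, can be sketched as a k-nearest-neighbours predictor over heuristic parameter vectors; the training data below is fabricated purely for illustration:

```python
# Sketch of a quality predictor in the spirit of the slide: instance-based
# learning (k-nearest neighbours) that predicts the runtime of a new strategy
# from a training set of (parameter vector, observed time) pairs.
# All numbers below are made up for illustration.

def knn_predict(training, query, k=3):
    """Predict runtime as the mean time over the k nearest parameter vectors."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(training, key=lambda pair: dist(pair[0], query))[:k]
    return sum(time for _, time in nearest) / k

# (parameters encoding a strategy, measured solving time in seconds)
training_set = [
    ((0, 1, 2), 1.0),
    ((0, 1, 3), 1.2),
    ((5, 5, 5), 9.0),
    ((4, 5, 5), 8.5),
]

print(knn_predict(training_set, (0, 1, 2), k=2))  # close to the fast strategies
```

The point of such a predictor is the slide's "polynomial time" evaluation: scoring a candidate strategy costs a few distance computations instead of an actual solver run.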


Restricted Vocabulary

  • Few parameters … use CSP
  • Decision tree on the parameters to extract a still-significant subset of parameters

– First interest: reduce the number of parameters
– Second interest: sort the parameters (useful for the next step: crossover)

  • Generic representation (integer)
  • Solving = variable ordering in DFS search


Constraint Satisfaction Problem

  • An instance of a CSP is a triplet (X, D, C)

– X: a set of variables
– D: the possible domains of the variables in X
– C: a set of constraints on X

  • A solution is:

– an instantiation of every variable in X
– with a value from its domain in D
– satisfying every constraint in C

  • During the search for a solution:

– a filtering algorithm for each constraint
– removes inconsistent values from the variable domains
– and propagates this information through the constraint network


Illustrative example

  • Map Coloring

– X: one variable Mi per region
– D: Mi :: [1,2,3,4]
– C: Mi ≠ Mj for adjacent regions; alldifferent(Ma, Mb, Mc) for mutually adjacent ones

– Solving heuristics (from the slide):
  • VARIABLE(minimal domain, maximum connected)
  • CONSTRAINTS(family, quality, efficiency)
  • Backdoor
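A minimal sketch of solving such a map-coloring CSP with DFS, forward-checking filtering, and the "minimal domain" variable choice; the map and its adjacencies below are invented, not the one on the slide:

```python
# Map-coloring CSP solved by DFS with forward checking and the
# minimal-remaining-domain variable-ordering heuristic (first-fail).
# The regions and adjacencies are a made-up example.

def solve(domains, neighbours, assignment=None):
    assignment = assignment or {}
    unassigned = [v for v in domains if v not in assignment]
    if not unassigned:
        return assignment
    # Variable choice: minimal remaining domain.
    var = min(unassigned, key=lambda v: len(domains[v]))
    for val in domains[var]:
        # Filtering: remove `val` from the domains of unassigned neighbours.
        pruned = {n: [x for x in domains[n] if x != val]
                  for n in neighbours[var] if n not in assignment}
        if any(len(d) == 0 for d in pruned.values()):
            continue  # a neighbour's domain was wiped out: try the next value
        new_domains = {**domains, **pruned, var: [val]}
        result = solve(new_domains, neighbours, {**assignment, var: val})
        if result is not None:
            return result
    return None  # dead end: backtrack

domains = {m: [1, 2, 3, 4] for m in "ABCD"}
neighbours = {"A": ["B", "C"], "B": ["A", "C", "D"],
              "C": ["A", "B", "D"], "D": ["B", "C"]}
solution = solve(domains, neighbours)
print(solution)
```

Forward checking only filters the direct neighbours of the assigned variable; a real CP solver would propagate each constraint's filtering to a fixpoint across the whole constraint network, as the previous slide describes.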



Selected Attributes on variables and constraints

  • Static attributes

– Arity (number of variables involved in the constraint)
– Propagation cost (complexity of the filtering algorithm family)
– Propagation efficiency
– Tightness (number of valid solutions divided by number of possible assignments)

  • Dynamic attributes (x2: specific/family)

– Average time
– Number of calls
– Average number of pruned values

Score for the selected variable: problem/instance dependent.
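One hypothetical way to turn these attributes into a variable score is a weighted linear combination; the attribute names and weights below are invented for illustration, and, as the slide notes, suitable weights are problem/instance dependent:

```python
# Hypothetical variable-scoring function combining static and dynamic
# attributes. Weights are problem/instance dependent; these values are
# invented for illustration only.

def score(attrs, weights):
    """Linear combination of attribute values; higher score = selected first."""
    return sum(weights[name] * attrs.get(name, 0.0) for name in weights)

attrs = {
    "arity": 3,            # static: variables involved in the constraint
    "prop_cost": 0.5,      # static: complexity of the filtering algorithm
    "tightness": 0.8,      # static: valid solutions / possible assignments
    "avg_pruned": 2.4,     # dynamic: average values pruned per call
}
weights = {"arity": 0.1, "prop_cost": -0.2, "tightness": 1.0, "avg_pruned": 0.5}

print(score(attrs, weights))  # arithmetically 0.3 - 0.1 + 0.8 + 1.2 = 2.2
```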


Framework

Pipeline (from the slide diagram):

– Problem (modeled with a CSP) + n randomized heuristics
– Database of running examples: {a1_0, e1_0, a1_1, e1_1, …, e1_22, time_1; …; an_0, en_0, an_1, en_1, …, en_22, time_n}
– Decision tree → filtered database: {a1_σ(0), a1_σ(1), …, e1_σ(15), time_1; …; an_σ(0), an_σ(1), …, en_σ(15), time_n}
– Neural network + genetic algorithm → new generations → best candidates: {a1_σ(0), a1_σ(1), …, e1_σ(15), time_1; …; an'_σ(0), an'_σ(1), …, en'_σ(15), time_n'}
– Best generation → best heuristic: a*_σ(0), a*_σ(1), …, e*_σ(15), time~
– Classical heuristic validation
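The loop in the diagram (a genetic algorithm screened by a learned predictor) might look roughly like this toy sketch; the fitness and predictor functions are stand-ins, not the authors' implementation:

```python
# Toy sketch of the framework loop: a genetic algorithm over integer-encoded
# heuristics, where a cheap predictor (a stand-in here; the real one would be
# the trained neural network) screens offspring so only promising candidates
# get an expensive real evaluation. All components are simplified placeholders.
import random

random.seed(0)

def true_runtime(genes):        # expensive: actually running the heuristic
    return sum((g - 3) ** 2 for g in genes) + 1

def predicted_runtime(genes):   # cheap surrogate (would be a trained model)
    return sum(abs(g - 3) for g in genes) + 1

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(genes):
    i = random.randrange(len(genes))
    return genes[:i] + [random.randrange(8)] + genes[i + 1:]

population = [[random.randrange(8) for _ in range(5)] for _ in range(10)]
for generation in range(20):
    offspring = [mutate(crossover(*random.sample(population, 2)))
                 for _ in range(30)]
    # Screen with the predictor; only the 10 most promising get a real run.
    screened = sorted(offspring, key=predicted_runtime)[:10]
    population = sorted(population + screened, key=true_runtime)[:10]

best = min(population, key=true_runtime)
print(best, true_runtime(best))
```

The screening step is what makes the discovery phase cheaper: most offspring are rejected by the predictor without ever paying for a real solver run.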



Encountered problems (1)

  • Diversification of heuristics

– Necessary for a good learning curve for the prediction
– The decision tree doesn't have enough information:

  • it did not reject attributes
  • it did not sort attributes

Consequences:
– The database must be big, so running the samples takes a long time
– The learning curve is impacted (backpropagation), so the learning phase takes a long time
– The gene order in the chromosome is impacted, requiring a crossover more complex than the basic one


Encountered Problems (2)

  • Discretization problem for the targeted class:

– Non-uniform distribution
– Extreme points take a very long time to compute



Preliminary experiments

  • Sudoku

– Trained on:

  • CPU time
  • Number of nodes

– Promising results:

  • -24% CPU / -2% nodes
  • -27% CPU / -12% nodes

  • Extension to MagicSquare … TimeTabling
  • Another branch: [map coloring / RLFAP / RCPSP]


Conclusion

  • Promising new directions

– no programming skills, no Ph.D. in OR, … required
– the solving part is fully automated

  • Dynamic calibration = adding depth to the learning curve (a kind of adaptive strategy)

Perspectives



Thanks to the crew