Meta-algorithmic techniques in SAT solving: Automated configuration, selection and beyond (PowerPoint PPT Presentation)


slide-1
SLIDE 1

Meta-algorithmic techniques in SAT solving: Automated configuration, selection and beyond

Holger H. Hoos

BETA Lab Department of Computer Science University of British Columbia Canada

SAT/SMT Summer School Trento, Italy, 2012/06/12

slide-2
SLIDE 2

What fuels progress in SAT solving?

◮ Insights into SAT (theory)

◮ Creativity of algorithm designers ⇝ heuristics

◮ Advanced debugging techniques ⇝ fuzz testing, delta-debugging

◮ Principled experimentation ⇝ statistical techniques, experimentation platforms

◮ SAT competitions

Holger Hoos: Meta-algorithmic techniques in SAT solving 2

slide-3
SLIDE 3

2009: 5 of 27 medals
2011: 28 of 54 medals

Holger Hoos: Meta-algorithmic techniques in SAT solving 3

slide-4
SLIDE 4

Meta-algorithmic techniques

◮ algorithms that operate upon other algorithms (SAT solvers) = meta-heuristics

◮ here: algorithms whose inputs include one or more SAT solvers

  ◮ configurators (e.g., ParamILS, GGA, SMAC)
  ◮ selectors (e.g., SATzilla, 3S)
  ◮ schedulers (e.g., aspeed; also: 3S, SATzilla)
  ◮ (parallel) portfolios (e.g., ManySAT, ppfolio)
  ◮ run-time predictors
  ◮ experimentation platforms (e.g., EDACC, HAL)

Holger Hoos: Meta-algorithmic techniques in SAT solving 4

slide-5
SLIDE 5

Why are meta-algorithmic techniques important?

◮ no one knows how best to solve SAT (or any other NP-hard problem) ⇝ no single dominant solver

◮ state-of-the-art performance often achieved by combinations of various heuristic choices (e.g., pre-processing; variable/value selection heuristics; restart rules; data structures; ...) ⇝ complex interactions, unexpected behaviour

◮ performance can be tricky to assess due to

  ◮ differences in behaviour across problem instances
  ◮ stochasticity

⇝ potential for suboptimal choices in solver development and applications

Holger Hoos: Meta-algorithmic techniques in SAT solving 5

slide-6
SLIDE 6

Why are meta-algorithmic techniques important?

◮ human intuitions can be misleading, abilities are limited ⇝ substantial benefit from augmentation with computational techniques

◮ use of fully specified procedures (rather than intuition / ad hoc choices) can improve reproducibility and facilitate scientific analysis / understanding

Holger Hoos: Meta-algorithmic techniques in SAT solving 6

slide-7
SLIDE 7

SAT is a hotbed for meta-algorithmic work!

◮ Drosophila problem for computing science (and beyond)

  ◮ prototypical NP-hard problem
  ◮ prominent in various areas of CS and beyond
  ◮ important applications
  ◮ conceptual simplicity aids solver design / development

◮ active and diverse community

◮ SAT competitions

Holger Hoos: Meta-algorithmic techniques in SAT solving 7

slide-8
SLIDE 8

Outline

1. Introduction
2. Algorithm configuration
3. Algorithm selection
4. Algorithm scheduling
5. Parallel algorithm portfolios
6. Programming by Optimisation (PbO)

Holger Hoos: Meta-algorithmic techniques in SAT solving 8

slide-9
SLIDE 9

Traditional algorithm design approach:

◮ iterative, manual process

◮ designer gradually introduces/modifies components or mechanisms

◮ performance is tested on benchmark instances

◮ design often starts from a generic or broadly applicable problem-solving method (e.g., an evolutionary algorithm)

Holger Hoos: Meta-algorithmic techniques in SAT solving 9

slide-10
SLIDE 10

Note:

◮ During the design process, many decisions are made.

◮ Some choices take the form of parameters, others are hard-coded.

◮ Design decisions interact in complex ways.

Holger Hoos: Meta-algorithmic techniques in SAT solving 10

slide-11
SLIDE 11

Problems:

◮ Design process is labour-intensive.

◮ Design decisions are often made in an ad-hoc fashion, based on limited experimentation and intuition.

◮ Human designers typically over-generalise observations and explore few designs.

◮ Implicit assumptions of independence and monotonicity are often incorrect.

◮ Number of components and mechanisms tends to grow in each stage of the design process.

⇝ complicated designs, unfulfilled performance potential

Holger Hoos: Meta-algorithmic techniques in SAT solving 11

slide-12
SLIDE 12

Solution: Automated Algorithm Configuration

◮ Key idea: expose design choices as parameters, then use an automated procedure to effectively explore the resulting configuration space

◮ Given: algorithm A with parameters p̄ and configuration space C; set Π of benchmark instances; performance metric m

◮ Objective: find configuration c* ∈ C for which A performs best on Π according to metric m

Holger Hoos: Meta-algorithmic techniques in SAT solving 12

slide-13
SLIDE 13

Algorithm configuration

(schematic: algorithm configuration; performance scale Lo–Hi)

Holger Hoos: Meta-algorithmic techniques in SAT solving 13

slide-14
SLIDE 14

Example: SAT-based software verification

Hutter, Babić, HH, Hu (2007)

◮ Goal: solve a suite of SAT-encoded software verification instances as fast as possible

◮ new DPLL-style SAT solver Spear (by Domagoj Babić)
  = highly parameterised heuristic algorithm (26 parameters, ≈ 8.3 × 10^17 configurations)

◮ manual configuration by algorithm designer

◮ automated configuration using ParamILS, a generic algorithm configuration procedure (Hutter, HH, Stützle 2007)

Holger Hoos: Meta-algorithmic techniques in SAT solving 14

slide-15
SLIDE 15

Spear: Empirical results on software verification benchmarks

solver                          num. solved   mean run-time
MiniSAT 2.0                     302/302       161.3 CPU sec
Spear original                  298/302       787.1 CPU sec
Spear generic. opt. config.     302/302        35.9 CPU sec
Spear specific. opt. config.    302/302         1.5 CPU sec

◮ ≈ 500-fold speedup through use of an automated algorithm configuration procedure (ParamILS)

◮ new state of the art (winner of 2007 SMT Competition, QF BV category)

Holger Hoos: Meta-algorithmic techniques in SAT solving 15

slide-16
SLIDE 16

SPEAR on software verification

(scatter plot: run-time of default config. [CPU sec] vs auto-config. [CPU sec], log scales from 10^-2 to 10^4)

Holger Hoos: Meta-algorithmic techniques in SAT solving 16

slide-17
SLIDE 17

Advantages of using automated configurators:

◮ enables better exploration of larger design spaces ⇝ better performance

◮ lets human designer focus on higher-level issues ⇝ more effective use of human expertise / creativity

◮ uses principled, fully formalised methods to find good configurations ⇝ better reproducibility, fairer comparisons, insights

◮ can be used to customise algorithms for a specific application context with minimal human effort ⇝ expanded range of successful applications

Holger Hoos: Meta-algorithmic techniques in SAT solving 17

slide-18
SLIDE 18

Approaches to automated algorithm configuration

◮ Standard optimisation techniques (e.g., CMA-ES – Hansen & Ostermeier 2001; MADS – Audet & Orban 2006)

◮ Advanced sampling methods (e.g., REVAC – Nannen & Eiben 2006–9)

◮ Racing (e.g., F-Race – Birattari, Stützle, Paquete, Varrentrapp 2002; Iterative F-Race – Balaprakash, Birattari, Stützle 2007)

◮ Model-free search (e.g., ParamILS – Hutter, HH, Stützle 2007; Hutter, HH, Leyton-Brown, Stützle 2009; GGA – Ansótegui, Sellmann, Tierney 2009)

◮ Sequential model-based optimisation (e.g., SPO – Bartz-Beielstein 2006; SMAC – Hutter, HH, Leyton-Brown 2011–12)

Holger Hoos: Meta-algorithmic techniques in SAT solving 18

slide-19
SLIDE 19

Iterated Local Search (Initialisation)

Holger Hoos: Meta-algorithmic techniques in SAT solving 19

slide-20
SLIDE 20

Iterated Local Search (Initialisation)

Holger Hoos: Meta-algorithmic techniques in SAT solving 19

slide-21
SLIDE 21

Iterated Local Search (Local Search)

Holger Hoos: Meta-algorithmic techniques in SAT solving 19

slide-22
SLIDE 22

Iterated Local Search (Perturbation)

Holger Hoos: Meta-algorithmic techniques in SAT solving 19

slide-23
SLIDE 23

Iterated Local Search (Local Search)

Holger Hoos: Meta-algorithmic techniques in SAT solving 19

slide-24
SLIDE 24

Iterated Local Search

Selection (using Acceptance Criterion)

Holger Hoos: Meta-algorithmic techniques in SAT solving 19

slide-25
SLIDE 25

Iterated Local Search (Perturbation)

Holger Hoos: Meta-algorithmic techniques in SAT solving 19

slide-26
SLIDE 26

ParamILS

◮ iterated local search in configuration space

◮ initialisation: pick best of default + R random configurations

◮ subsidiary local search: iterative first improvement, change one parameter in each step

◮ perturbation: change s randomly chosen parameters

◮ acceptance criterion: always select the better configuration

◮ number of runs per configuration increases over time; ensure that the incumbent always has the same number of runs as challengers
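The steps above can be sketched as a toy iterated local search over a discrete configuration space. This is a minimal illustration, not the actual ParamILS implementation: the run-count management of BasicILS/FocusedILS is omitted, and `evaluate` is assumed to be a cheap, deterministic cost function (in practice it would run the target solver on training instances).

```python
import random

def paramils(space, evaluate, default, R=10, s=3, iters=50, seed=0):
    """ParamILS-style iterated local search.  `space` maps parameter
    name -> list of values; `evaluate(config)` returns a cost to
    minimise (e.g. mean run-time on training instances)."""
    rng = random.Random(seed)
    names = list(space)

    def random_config():
        return {n: rng.choice(space[n]) for n in names}

    def neighbours(c):                     # one-exchange neighbourhood
        for n in names:
            for v in space[n]:
                if v != c[n]:
                    yield {**c, n: v}

    def local_search(c):                   # iterative first improvement
        improved = True
        while improved:
            improved = False
            for nb in neighbours(c):
                if evaluate(nb) < evaluate(c):
                    c, improved = nb, True
                    break
        return c

    # initialisation: best of default + R random configurations
    candidates = [default] + [random_config() for _ in range(R)]
    incumbent = local_search(min(candidates, key=evaluate))

    for _ in range(iters):
        # perturbation: change s randomly chosen parameters
        pert = dict(incumbent)
        for n in rng.sample(names, min(s, len(names))):
            pert[n] = rng.choice(space[n])
        cand = local_search(pert)
        # acceptance criterion: always keep the better configuration
        if evaluate(cand) < evaluate(incumbent):
            incumbent = cand
    return incumbent
```

On a toy separable cost function this reliably recovers the optimal configuration; the real difficulty in algorithm configuration lies in `evaluate` being expensive and noisy.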

Holger Hoos: Meta-algorithmic techniques in SAT solving 20

slide-27
SLIDE 27

SATenstein: Automatically Building Local Search Solvers for SAT

KhudaBukhsh, Xu, HH, Leyton-Brown (2009)

Frankenstein: create a perfect human being from scavenged body parts
SATenstein: create perfect SAT solvers using components scavenged from existing solvers

General approach:

◮ components from GSAT, WalkSAT, dynamic local search and G2WSAT algorithms

◮ flexible SLS framework (derived from UBCSAT)

◮ find performance-optimising instantiations using ParamILS

Holger Hoos: Meta-algorithmic techniques in SAT solving 21

slide-28
SLIDE 28

Challenge:

◮ 41 parameters (mostly categorical)

◮ over 2 × 10^11 configurations

◮ 6 well-known distributions of SAT instances (QCP, SW-GCP, R3SAT, HGEN, FAC, CBMC-SE)

◮ 11 challenger algorithms (includes all winning SLS solvers from SAT competitions 2003–2008)

Holger Hoos: Meta-algorithmic techniques in SAT solving 22

slide-29
SLIDE 29

Result:

◮ factor 70–1000 performance improvements over best challengers on QCP, HGEN, CBMC-SE

◮ factor 1.4–2 performance improvements over best challengers on SW-GCP, R3SAT, FAC

Holger Hoos: Meta-algorithmic techniques in SAT solving 23

slide-30
SLIDE 30

SATenstein-LS vs VW on CBMC-SE

(scatter plot: median run-time [CPU sec] of SATenstein-LS[CBMC-SE] vs VW on CBMC-SE, log scales from 10^-2 to 10^3; points marked “solved by both” / “solved by one”)

Holger Hoos: Meta-algorithmic techniques in SAT solving 24

slide-31
SLIDE 31

SATenstein-LS vs Oracle on CBMC-SE

(scatter plot: median run-time [CPU sec] of SATenstein-LS[CBMC-SE] vs Oracle on CBMC-SE, log scales from 10^-2 to 10^3; points marked “solved by both” / “solved by one”)

Holger Hoos: Meta-algorithmic techniques in SAT solving 25

slide-32
SLIDE 32

Algorithm selection

Off-line (per-distribution) algorithm selection:

◮ Given: set S of algorithms for a problem; representative distribution (or set) Π of problem instances

◮ Objective: select from S the algorithm expected to solve instances from Π most efficiently (w.r.t. an aggregate performance measure)

= single best solver ⇝ SAT competitions

Holger Hoos: Meta-algorithmic techniques in SAT solving 26

slide-33
SLIDE 33

Methods for off-line algorithm selection

◮ exhaustive evaluation: run all solvers on all instances

◮ racing methods: eliminate solvers as soon as they “fall significantly behind” the current leader (Maron & Moore 1994; Birattari, Stützle, Paquete, Varrentrapp 2002)
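A toy version of the racing idea: run the surviving solvers instance by instance and drop any solver that falls behind the leader. Here a fixed fall-behind margin stands in for the statistical tests used by F-Race, and the `run` callback and solver names are illustrative.

```python
from statistics import mean

def race(solvers, instances, run, margin=1.5, min_rounds=3):
    """Toy racing procedure for off-line algorithm selection.
    `run(solver, instance)` returns an observed run-time; after
    `min_rounds` instances, any solver whose mean run-time exceeds
    `margin` times the best mean is eliminated.  Real racing methods
    (F-Race) use statistical tests instead of a fixed margin."""
    alive = {s: [] for s in solvers}
    for i, inst in enumerate(instances, 1):
        for s in alive:
            alive[s].append(run(s, inst))
        if i >= min_rounds and len(alive) > 1:
            best = min(mean(ts) for ts in alive.values())
            alive = {s: ts for s, ts in alive.items()
                     if mean(ts) <= margin * best}
    # winner: surviving solver with the best mean run-time
    return min(alive, key=lambda s: mean(alive[s]))
```

The appeal of racing over exhaustive evaluation is that clearly dominated solvers stop consuming CPU time early, while close contenders keep accumulating evidence.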

Holger Hoos: Meta-algorithmic techniques in SAT solving 27

slide-34
SLIDE 34

Racing (for Algorithm Selection)

(diagram: algorithms 1–5 run on a stream of problem instances)

(Initialisation)

Holger Hoos: Meta-algorithmic techniques in SAT solving 28

slide-35
SLIDE 35

Racing (for Algorithm Selection)

(diagram: weaker algorithms are eliminated as the race proceeds)

(Initialisation)

Holger Hoos: Meta-algorithmic techniques in SAT solving 28

slide-36
SLIDE 36

Racing (for Algorithm Selection)

(diagram: the race ends with a single winner)

(Initialisation)

Holger Hoos: Meta-algorithmic techniques in SAT solving 28

slide-37
SLIDE 37

Potential problem:

inhomogeneous benchmark sets / distributions ⇝ may not correctly identify the single best solver

Solutions:

◮ use sets of instances at each stage of the race

◮ racing within (reasonably) homogeneous subsets

◮ order instances according to difficulty (ordered racing) (Styles & HH – in preparation)

Note:

racing can also be used for algorithm configuration (requires special techniques for handling large configuration spaces) (Balaprakash, Birattari, Stützle 2007)

Holger Hoos: Meta-algorithmic techniques in SAT solving 29

slide-38
SLIDE 38

Instance-based algorithm selection (Rice 1976):

◮ Given: set S of algorithms for a problem; problem instance π

◮ Objective: select from S the algorithm expected to solve π most efficiently, based on (cheaply computable) features of π

Note:

Best-case performance is bounded by an oracle that selects the best s ∈ S for each π = virtual best solver (VBS)

Holger Hoos: Meta-algorithmic techniques in SAT solving 30

slide-39
SLIDE 39

Instance-based algorithm selection

(diagram: a feature extractor feeds a selector, which chooses among component algorithms)

Holger Hoos: Meta-algorithmic techniques in SAT solving 31

slide-40
SLIDE 40

Key components:

◮ set of (state-of-the-art) solvers

◮ set of cheaply computable, informative features

◮ efficient procedure for mapping features to solvers (selector)

◮ training data

◮ procedure for building a good selector based on training data (selector builder)

Holger Hoos: Meta-algorithmic techniques in SAT solving 32

slide-41
SLIDE 41

Methods for instance-based selection:

◮ classification-based: predict the best solver, e.g., using ...

  ◮ decision trees (Guerri & Milano 2004)
  ◮ case-based reasoning (Gebruers, Guerri, Hnich, Milano 2004)
  ◮ (weighted) k-nearest neighbours (Malitsky, Sabharwal, Samulowitz, Sellmann 2011; Kadioglu, Malitsky, Sabharwal, Samulowitz, Sellmann 2011)
  ◮ pairwise cost-sensitive decision forests + voting (Xu, Hutter, HH, Leyton-Brown 2012)

◮ regression-based: predict running time for each solver, select the one predicted to be fastest (Leyton-Brown, Nudelman, Shoham 2003; Xu, Hutter, HH, Leyton-Brown 2007–9)

Holger Hoos: Meta-algorithmic techniques in SAT solving 33

slide-42
SLIDE 42

Instance features:

◮ Use generic and problem-specific features that correlate with performance and can be computed (relatively) cheaply:

  ◮ number of clauses, variables, ...
  ◮ constraint graph features
  ◮ local & complete search probes

◮ Use statistics of distributions as features, e.g., the variation coefficient of node degree in the constraint graph

◮ Consider combinations of features (e.g., pairwise products ⇝ quadratic basis function expansion).
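A few such syntactic features can be computed in one pass over a DIMACS-style clause list. This sketch uses variable occurrence counts as a crude stand-in for constraint graph degrees; the feature names are illustrative and not the actual SATzilla feature set.

```python
from statistics import mean, pstdev

def cnf_features(clauses, n_vars):
    """Cheap syntactic SAT features of the kind used by portfolio
    selectors: counts, ratios and degree statistics.  `clauses` is a
    list of clauses, each a list of non-zero integer literals."""
    n_clauses = len(clauses)
    degree = [0] * (n_vars + 1)           # occurrences per variable
    pos = neg = 0
    for cl in clauses:
        for lit in cl:
            degree[abs(lit)] += 1
            if lit > 0:
                pos += 1
            else:
                neg += 1
    occ = degree[1:]
    mu = mean(occ)
    return {
        "n_vars": n_vars,
        "n_clauses": n_clauses,
        "clause_var_ratio": n_clauses / n_vars,
        "mean_clause_len": mean(len(cl) for cl in clauses),
        "pos_lit_fraction": pos / (pos + neg),
        # variation coefficient of variable occurrence counts
        "degree_vc": pstdev(occ) / mu if mu else 0.0,
    }
```

Probing features (short local or complete search runs) are typically added on top of such syntactic features, since they capture run-time behaviour that pure syntax misses.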

Holger Hoos: Meta-algorithmic techniques in SAT solving 34

slide-43
SLIDE 43

SATzilla 2007–9 (Xu, Hutter, HH, Leyton-Brown):

◮ use state-of-the-art complete (DPLL/CDCL) and incomplete (local search) SAT solvers

◮ extract (up to) 84 polytime-computable instance features

◮ use ridge regression on selected features to predict solver run-times from instance features (one model per solver)

◮ run the solver with the best predicted performance

Holger Hoos: Meta-algorithmic techniques in SAT solving 35

slide-44
SLIDE 44

Some bells and whistles:

◮ use pre-solvers to solve ‘easy’ instances quickly

◮ build run-time predictors for various types of instances; use a classifier to select the best predictor based on instance features

◮ predict the time required for feature computation; if that time is too long (or an error occurs), use a back-up solver

◮ use the method by Schmee & Hahn (1979) to deal with censored run-time data

⇝ prizes in 5 of the 9 main categories of the 2009 SAT Solver Competition (3 gold, 2 silver medals)

Holger Hoos: Meta-algorithmic techniques in SAT solving 36

slide-45
SLIDE 45

The problem with standard classification approaches

Crucial assumption: solvers behave similarly on instances with similar features. But do they really?

◮ uninformative features

◮ correlated features

◮ feature normalisation (tricky!)

◮ cost of misclassification

Holger Hoos: Meta-algorithmic techniques in SAT solving 37

slide-46
SLIDE 46

SATzilla 2011–12 (Xu, Hutter, HH, Leyton-Brown 2012):

◮ uses cost-based decision forests to directly select a solver based on features

◮ one predictive model for each pair of solvers (which is better?)

◮ majority voting (over pairwise predictions) to select the solver to be run

(further details, results next Monday, 11:30 session)
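The voting scheme can be sketched independently of the underlying models. Here `pairwise_predict` is an assumed callback standing in for the per-pair cost-sensitive decision forests: given two solvers and the instance features, it returns whichever solver it predicts to be faster.

```python
from collections import Counter
from itertools import combinations

def pairwise_select(solvers, pairwise_predict, features):
    """Majority voting over pairwise predictions, SATzilla-2012 style.
    `pairwise_predict(a, b, features)` returns a or b, whichever is
    predicted faster on the instance described by `features`; the
    solver collecting the most pairwise votes is selected."""
    tally = Counter()
    for a, b in combinations(solvers, 2):
        tally[pairwise_predict(a, b, features)] += 1
    return max(tally, key=tally.get)
```

Casting selection as many two-way comparisons sidesteps the hard multi-class problem and lets each model be trained with costs reflecting the actual run-time lost by picking the wrong solver of the pair.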

Holger Hoos: Meta-algorithmic techniques in SAT solving 38

slide-47
SLIDE 47

Hydra: Automatically Configuring Algorithms for Portfolio-Based Selection

Xu, HH, Leyton-Brown (2010)

Note:

◮ SATenstein builds solvers that work well on average on a given set of SAT instances
  but: may have to settle for compromises on broad, heterogeneous sets

◮ SATzilla builds an algorithm selector based on a given set of SAT solvers
  but: success depends entirely on the quality of the given solvers

Idea: combine the two approaches ⇝ portfolio-based selection from a set of automatically constructed solvers

Holger Hoos: Meta-algorithmic techniques in SAT solving 39

slide-48
SLIDE 48

Configuration + Selection = Hydra

(diagram: a parametric algorithm in multiple configurations, combined with a feature extractor and selector)

Holger Hoos: Meta-algorithmic techniques in SAT solving 40

slide-49
SLIDE 49

Simple combination:

1. build solvers for various types of instances using automated algorithm configuration

2. construct a portfolio-based selector from these

Drawback: requires suitably defined sets of instances

Better solution:

iteratively build & add solvers that improve the performance of the given portfolio ⇝ Hydra

Note: Hydra builds portfolios solely using

◮ a generic, highly configurable solver (e.g., SATenstein)

◮ instance features (as used in SATzilla)
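Hydra's outer loop can be sketched as a greedy portfolio builder. Both callbacks are stand-ins for the real machinery: `configure(portfolio)` abstracts a configurator run (e.g. ParamILS) that is rewarded for beating the current portfolio, and `portfolio_perf` abstracts the selector-based portfolio evaluation (lower is better).

```python
def hydra(configure, portfolio_perf, rounds=3):
    """Greedy Hydra-style loop: repeatedly configure a parametric
    solver against the current portfolio and add the result, stopping
    when the new configuration no longer improves the portfolio."""
    portfolio = []
    for _ in range(rounds):
        cand = configure(portfolio)
        if portfolio and portfolio_perf(portfolio + [cand]) >= portfolio_perf(portfolio):
            break                          # no further improvement
        portfolio.append(cand)
    return portfolio
```

In the toy test below, `portfolio_perf` is the virtual-best-solver run-time over a small run-time matrix, and `configure` simply picks the complementary configuration from a fixed candidate pool.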

Holger Hoos: Meta-algorithmic techniques in SAT solving 41

slide-50
SLIDE 50

Results on mixture of 6 well-known benchmark sets

(scatter plot: PAR score of Hydra[BM, 1] vs Hydra[BM, 7], log scales)

Holger Hoos: Meta-algorithmic techniques in SAT solving 42

slide-51
SLIDE 51

Results on mixture of 6 well-known benchmark sets

(line plot: PAR score, 50–300, over 1–8 Hydra steps, for Hydra[BM] on training/test sets and SATenFACT on training/test sets)

Holger Hoos: Meta-algorithmic techniques in SAT solving 43

slide-52
SLIDE 52

Note:

◮ Hydra can use arbitrary algorithm configurators and selector builders

◮ different approaches are possible: e.g., ISAC, based on feature-based instance clustering and distance-based selection (Kadioglu, Malitsky, Sellmann, Tierney 2010)

Holger Hoos: Meta-algorithmic techniques in SAT solving 44

slide-53
SLIDE 53

Algorithm scheduling

Note:

SATzilla and 3S∗ use a sequence of (pre-)solvers before instance-based selection.

∗ Kadioglu, Malitsky, Sabharwal, Samulowitz, Sellmann (2011) Holger Hoos: Meta-algorithmic techniques in SAT solving 45

slide-54
SLIDE 54

Algorithm Scheduling

(diagram: solvers arranged into a sequential schedule)

Holger Hoos: Meta-algorithmic techniques in SAT solving 46

slide-55
SLIDE 55

Questions:

1. How to determine that sequence?

2. How much performance can be obtained from solver scheduling alone?

Holger Hoos: Meta-algorithmic techniques in SAT solving 47

slide-56
SLIDE 56

Methods for algorithm scheduling:

◮ exhaustive search (as done in SATzilla)
  ⇝ expensive; limited to few solvers and cutoff times

◮ based on an optimisation procedure

  ◮ using integer programming (IP) techniques ⇝ 3S – Kadioglu et al. (2011)

  ◮ using an answer-set-programming (ASP) formulation + solver ⇝ aspeed – HH, Kaminski, Schaub, Schneider (2012)
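For a handful of solvers and coarse time slices, the underlying optimisation problem can even be brute-forced. This toy version (not the IP or ASP encodings of 3S and aspeed) allocates a CPU budget across solvers so as to maximise the number of training instances solved; slice order does not affect the solved count, only measures like PAR.

```python
from itertools import product

def best_schedule(runtimes, budget, step=1):
    """Brute-force time-slice allocation for a static solver schedule.
    `runtimes[s][i]` is solver s's run-time on instance i; an instance
    counts as solved if some solver's slice covers its run-time.
    Returns (allocation dict, number of instances solved)."""
    solvers = list(runtimes)
    n = len(next(iter(runtimes.values())))
    slots = range(0, budget + 1, step)
    best = (None, -1)
    for alloc in product(slots, repeat=len(solvers)):
        if sum(alloc) > budget:            # respect the total budget
            continue
        solved = sum(any(runtimes[s][i] <= t
                         for s, t in zip(solvers, alloc))
                     for i in range(n))
        if solved > best[1]:
            best = (dict(zip(solvers, alloc)), solved)
    return best
```

The search space grows exponentially in the number of solvers, which is exactly why 3S and aspeed delegate the problem to IP and ASP solvers respectively.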

Holger Hoos: Meta-algorithmic techniques in SAT solving 48

slide-57
SLIDE 57

Preliminary performance results:

Performance of pure scheduling can be surprisingly close to that of combined scheduling + selection (full SATzilla).

(HH, Kaminski, Schaub, Schneider – in preparation; Xu, Hutter, HH, Leyton-Brown – in preparation)

Holger Hoos: Meta-algorithmic techniques in SAT solving 49

slide-58
SLIDE 58

Notes:

◮ the ASP solver clasp used by aspeed is powered by a (state-of-the-art) SAT solver core

◮ pure algorithm scheduling (e.g., aspeed) does not require instance features

◮ sequential schedules can be parallelised easily and effectively (HH, Kaminski, Schaub, Schneider 2012)

Holger Hoos: Meta-algorithmic techniques in SAT solving 50

slide-59
SLIDE 59

Parallel algorithm portfolios

Key idea:

Exploit complementary strengths by running multiple algorithms (or instances of a randomised algorithm) concurrently.

Holger Hoos: Meta-algorithmic techniques in SAT solving 51

slide-60
SLIDE 60

Parallel Algorithm Portfolios

Holger Hoos: Meta-algorithmic techniques in SAT solving 51

slide-61
SLIDE 61

Parallel algorithm portfolios

Key idea:

Exploit complementary strengths by running multiple algorithms (or instances of a randomised algorithm) concurrently.
⇝ risk vs reward (expected running time) tradeoff, robust performance on a wide range of instances

Huberman, Lukose, Hogg (1997); Gomes & Selman (1997, 2000)

Note:

◮ can be realised through time-sharing / multi-tasking

◮ particularly attractive for multi-core / multi-processor architectures

Holger Hoos: Meta-algorithmic techniques in SAT solving 51

slide-62
SLIDE 62

Application to decision problems (like SAT, SMT):

Concurrently run the given component solvers until the first of them solves the instance.
⇝ running time on instance π = (# solvers) × (running time of VBS on π)

Examples:

◮ ManySAT (Hamadi, Jabbour, Sais 2009; Guo, Hamadi, Jabbour, Sais 2010)

◮ Plingeling (Biere 2010–11)

◮ ppfolio (Roussel 2011)

⇝ excellent performance (see 2009, 2011 SAT competitions)
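The basic mechanism can be sketched with threads standing in for cores. Real portfolios such as ManySAT and Plingeling run separate solver processes, kill the losers, and may share learnt clauses between components; this sketch omits all of that (the losing threads are simply abandoned as daemons).

```python
import queue
import threading

def parallel_portfolio(solvers, instance, timeout=10.0):
    """Minimal parallel portfolio for a decision problem: run all
    component solvers concurrently and return the first (name, result)
    pair produced.  `solvers` maps a solver name to a callable that
    takes the instance and returns its answer."""
    out = queue.Queue()
    for name, solve in solvers.items():
        t = threading.Thread(
            target=lambda n=name, s=solve: out.put((n, s(instance))),
            daemon=True)
        t.start()
    return out.get(timeout=timeout)   # first solver to finish wins
```

On k cores with independent components this realises the "running time of the portfolio = running time of its fastest member" behaviour the slide describes for the parallel case.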

Holger Hoos: Meta-algorithmic techniques in SAT solving 52

slide-63
SLIDE 63

Types of parallel portfolios:

◮ static vs instance-based vs dynamic

◮ with or without communication (SAT: clause sharing)

Methods for constructing (static) parallel portfolios:

◮ manual (by human expert, based on experimentation)

◮ classification maximisation (Petrik & Zilberstein 2006)

◮ case-based reasoning + greedy construction heuristic (Yun & Epstein 2012)

Holger Hoos: Meta-algorithmic techniques in SAT solving 53

slide-64
SLIDE 64

Constructing portfolios from a single parametric solver

HH, Leyton-Brown, Schaub, Schneider (under review)

Key idea: take a single parametric solver and find configurations that together form an effective parallel portfolio (analogous to Hydra).

Note: this makes it possible to obtain parallel solvers automatically from sequential sources (automatic parallelisation).

Methods for constructing such portfolios:

◮ global optimisation: simultaneous configuration of all component solvers

◮ greedy construction: add + configure one component at a time

Holger Hoos: Meta-algorithmic techniques in SAT solving 54

slide-65
SLIDE 65

Preliminary results on competition application instances (4 components)

solver                      PAR1    PAR10    #timeouts
ManySAT (1.1)               1887    16 003   213/679
ManySAT (2.0)               1998    17 373   232/679
Plingeling (276)            1850    15 437   205/679
Plingeling (587)            1684    13 812   183/679
Greedy-MT4 (Lingeling)      1717    13 712   181/679
ppfolio                     1646    13 310   176/679
CryptoMiniSat               1600    12 271   161/679
VBS over all of the above   1282    10 296   136/679

Holger Hoos: Meta-algorithmic techniques in SAT solving 55

slide-66
SLIDE 66

Algorithm configuration

(schematic: algorithm configuration; performance scale Lo–Hi)

Holger Hoos: Meta-algorithmic techniques in SAT solving 56

slide-67
SLIDE 67

Algorithm configuration

(schematic: algorithm configuration; performance scale Lo–Hi)

Holger Hoos: Meta-algorithmic techniques in SAT solving 56

slide-68
SLIDE 68

The next step: Programming by Optimisation (PbO)

HH (2010–12)

Key idea:

◮ specify large, rich design spaces of solvers for a given problem

◮ avoid premature, uninformed, possibly detrimental design choices

◮ actively develop promising alternatives for design components

◮ automatically make choices to obtain a solver optimised for the given use context

Holger Hoos: Meta-algorithmic techniques in SAT solving 57

slide-69
SLIDE 69

(diagram: design space of solvers + application context ⇝ optimised solver)

Holger Hoos: Meta-algorithmic techniques in SAT solving 58

slide-70
SLIDE 70

(diagram: design space of solvers + application context ⇝ optimised solver, parallel portfolio, instance-based selector)

Holger Hoos: Meta-algorithmic techniques in SAT solving 58

slide-71
SLIDE 71

(diagram: design space of solvers + application context ⇝ optimised solver, parallel portfolio, instance-based selector)

Holger Hoos: Meta-algorithmic techniques in SAT solving 58

slide-72
SLIDE 72

Levels of PbO:

Level 4: Make no design choice prematurely that cannot be justified compellingly.
Level 3: Strive to provide design choices and alternatives.
Level 2: Keep and expose design choices considered during software development.
Level 1: Expose design choices hardwired into existing code (magic constants, hidden parameters, abandoned design alternatives).
Level 0: Optimise settings of parameters exposed by existing software.

Holger Hoos: Meta-algorithmic techniques in SAT solving 59

slide-73
SLIDE 73

Success in optimising speed:

Application (system), # design choices              Speedup     PbO level
SAT-based software verification (Spear), 41         4.5–500 ×   2–3
  Hutter, Babić, HH, Hu (2007)
AI planning (LPG), 62                               3–118 ×     1
  Vallati, Fawcett, Gerevini, HH, Saetti (2011)
Mixed integer programming (CPLEX), 76               2–52 ×
  Hutter, HH, Leyton-Brown (2010)

... and solution quality:

University timetabling, 18 design choices, PbO level 2–3 ⇝ new state of the art; UBC exam scheduling
Fawcett, Chiarandini, HH (2009)

Holger Hoos: Meta-algorithmic techniques in SAT solving 60

slide-74
SLIDE 74

Software development in the PbO paradigm

(diagram: PbO-<L> source(s) → PbO-<L> weaver → parametric <L> source(s) + design space description → PbO design optimiser (using benchmark inputs and use context) → instantiated <L> source(s) → deployed executable)

Holger Hoos: Meta-algorithmic techniques in SAT solving 61

slide-75
SLIDE 75

Design space specification

Option 1: use language-specific mechanisms

◮ command-line parameters

◮ conditional execution

◮ conditional compilation (#ifdef)

Option 2: generic programming language extension

Dedicated support for ...

◮ exposing parameters

◮ specifying alternative blocks of code

Holger Hoos: Meta-algorithmic techniques in SAT solving 62

slide-76
SLIDE 76

Advantages of generic language extension:

◮ reduced overhead for programmer

◮ clean separation of design choices from other code

◮ dedicated PbO support in software development environments

Key idea:

◮ augmented sources: PbO-Java = Java + PbO constructs, ...

◮ tool to compile down into the target language: weaver

Holger Hoos: Meta-algorithmic techniques in SAT solving 63

slide-77
SLIDE 77

(diagram: PbO-<L> source(s) → PbO-<L> weaver → parametric <L> source(s) + design space description → PbO design optimiser (using benchmark inputs and use context) → instantiated <L> source(s) → deployed executable)

Holger Hoos: Meta-algorithmic techniques in SAT solving 64

slide-78
SLIDE 78

Exposing parameters

...
numerator -= (int) (numerator / (adjfactor+1) * 1.4);
...

...
##PARAM(float multiplier=1.4)
numerator -= (int) (numerator / (adjfactor+1) * ##multiplier);
...

◮ parameter declarations can appear at arbitrary places (before or after first use of the parameter)

◮ access to parameters is read-only (values can only be set/changed via command line or config file)

Holger Hoos: Meta-algorithmic techniques in SAT solving 65

slide-79
SLIDE 79

Specifying design alternatives

◮ Choice: set of interchangeable fragments of code that represent design alternatives (instances of the choice)

◮ Choice point: location in a program at which a choice is available

##BEGIN CHOICE preProcessing
<block 1>
##END CHOICE preProcessing

Holger Hoos: Meta-algorithmic techniques in SAT solving 66

slide-80
SLIDE 80

Specifying design alternatives

◮ Choice: set of interchangeable fragments of code that represent design alternatives (instances of the choice)

◮ Choice point: location in a program at which a choice is available

##BEGIN CHOICE preProcessing=standard
<block S>
##END CHOICE preProcessing

##BEGIN CHOICE preProcessing=enhanced
<block E>
##END CHOICE preProcessing

Holger Hoos: Meta-algorithmic techniques in SAT solving 66

slide-81
SLIDE 81

Specifying design alternatives

◮ Choice: set of interchangeable fragments of code that represent design alternatives (instances of the choice)

◮ Choice point: location in a program at which a choice is available

##BEGIN CHOICE preProcessing
<block 1>
##END CHOICE preProcessing
...
##BEGIN CHOICE preProcessing
<block 2>
##END CHOICE preProcessing

Holger Hoos: Meta-algorithmic techniques in SAT solving 66

slide-82
SLIDE 82

Specifying design alternatives

◮ Choice: set of interchangeable fragments of code that represent design alternatives (instances of the choice)

◮ Choice point: location in a program at which a choice is available

##BEGIN CHOICE preProcessing
<block 1a>
##BEGIN CHOICE extraPreProcessing
<block 2>
##END CHOICE extraPreProcessing
<block 1b>
##END CHOICE preProcessing

Holger Hoos: Meta-algorithmic techniques in SAT solving 66

slide-83
SLIDE 83
Holger Hoos: Meta-algorithmic techniques in SAT solving 67

slide-84
SLIDE 84

The Weaver

transforms PbO-<L> code into <L> code (<L> = Java, C++, ...)

◮ parametric mode:

  ◮ expose parameters
  ◮ make choices accessible via (conditional, categorical) parameters

◮ (partial) instantiation mode:

  ◮ hardwire (some) parameters into code (expose others)
  ◮ hardwire (some) choices into code (make others accessible via parameters)

Holger Hoos: Meta-algorithmic techniques in SAT solving 68

slide-85
SLIDE 85
Holger Hoos: Meta-algorithmic techniques in SAT solving 69

slide-86
SLIDE 86

Design space optimisation

Use previously discussed techniques for ...

◮ automated algorithm configuration (e.g., ParamILS, ...)

◮ instance-based selector construction (e.g., Hydra)

◮ parallel portfolio construction (e.g., HH, Leyton-Brown, Schaub, Schneider – under review)

Much room for further improvements:

◮ better optimisation procedures (e.g., for configuration)

◮ meta-optimisation (optimising the optimisers)

◮ scenario-based optimiser selection

◮ parallel portfolios of design optimisation procedures

Holger Hoos: Meta-algorithmic techniques in SAT solving 70

slide-87
SLIDE 87

Cost & concerns

But what about ...

◮ Computational complexity? ◮ Cost of development? ◮ Limitations of scope?

Holger Hoos: Meta-algorithmic techniques in SAT solving 71

slide-88
SLIDE 88

Computationally too expensive?

Spear revisited:

◮ total configuration time on software verification benchmarks: ≈ 30 CPU days

◮ wall-clock time on a 10-CPU cluster: ≈ 3 days

◮ cost on Amazon Elastic Compute Cloud (EC2): 61.20 USD (= 42.58 EUR)

◮ 61.20 USD pays for ...

  ◮ 1:45 hours of an average software engineer
  ◮ 8:26 hours at minimum wage

Holger Hoos: Meta-algorithmic techniques in SAT solving 72

slide-89
SLIDE 89

Too expensive in terms of development?

Design and coding:

◮ tradeoff between performance/flexibility and overhead ◮ overhead depends on level of PbO ◮ traditional approach: cost from manual exploration of

design choices!

Testing and debugging:

◮ design alternatives for individual mechanisms and components

can be tested separately effort linear (rather than exponential) in the number of design choices

Holger Hoos: Meta-algorithmic techniques in SAT solving 73

slide-90
SLIDE 90

Limited to the “niche” of NP-hard problem solving?

Some PbO-flavoured work in the literature:

◮ computing-platform-specific performance optimisation of linear algebra routines (Whaley et al. 2001)

◮ optimisation of sorting algorithms using genetic programming (Li et al. 2005)

◮ compiler optimisation (Pan & Eigenmann 2006; Cavazos et al. 2007)

◮ database server configuration (Diao et al. 2003)

Holger Hoos: Meta-algorithmic techniques in SAT solving 74

slide-91
SLIDE 91

The road ahead

◮ Support for PbO-based software development

  ◮ weavers for PbO-C, PbO-C++, PbO-Java
  ◮ PbO-aware development platforms
  ◮ improved / integrated PbO design optimiser

◮ Best practices

◮ Many further applications

◮ Scientific insights

Holger Hoos: Meta-algorithmic techniques in SAT solving 75

slide-92
SLIDE 92

Communications of the ACM, 55(2), pp. 70–80, February 2012

www.prog-by-opt.net

slide-93
SLIDE 93

Meta-algorithmic techniques . . .

◮ leverage computational power to construct better algorithms

◮ liberate human designers from boring, menial tasks and let them focus on higher-level design issues

◮ facilitate principled design of heuristic algorithms

◮ profoundly change how we build and use algorithms

Holger Hoos: Meta-algorithmic techniques in SAT solving 76

slide-94
SLIDE 94

Three (slightly provocative?) predictions

◮ Meta-algorithmic techniques will become indispensable for solving SAT, SMT and all other NP-hard problems.

◮ Programming by Optimisation (or a closely related approach) will become as ubiquitous as compilation.

◮ Driven by SAT, SMT, PbO and advances in automated testing / debugging, program synthesis from higher-level designs will become practical and widely used.

Holger Hoos: Meta-algorithmic techniques in SAT solving 77