SLIDE 1

On Automated Parameter Tuning, with Applications in Next-Generation Manufacturing

Lars Kotthoff and Patrick Johnson

Artificially Intelligent Manufacturing Center, University of Wyoming
larsko,pjohns27@uwyo.edu
UCC, 02 April 2019


SLIDE 2

Big Picture

▷ advance the state of the art through meta-algorithmic techniques
▷ rather than inventing new things, use existing things more intelligently – automatically
▷ invent new things through combinations of existing things


SLIDE 3

Motivation – Performance Differences

[Scatter plot: per-instance runtimes of the Virtual Best SAT solver vs. the Virtual Best CSP solver; both axes 0.1–1000 s, log scale]

Hurley, Barry, Lars Kotthoff, Yuri Malitsky, and Barry O’Sullivan. “Proteus: A Hierarchical Portfolio of Solvers and Transformations.” In CPAIOR, 2014.

SLIDE 4

Motivation – Performance Improvements

[Scatter plot: SPEAR, original default (s) vs. SPEAR, optimized for SWV (s); both axes 10⁻² to 10⁴ s, log scale]

Hutter, Frank, Domagoj Babic, Holger H. Hoos, and Alan J. Hu. “Boosting Verification by Automatic Tuning of Decision Procedures.” In FMCAD ’07: Proceedings of the Formal Methods in Computer Aided Design, 27–34. Washington, DC, USA: IEEE Computer Society, 2007.

SLIDE 5

What to Tune – Parameters

▷ anything you can change that makes sense to change
▷ e.g. search heuristic, variable ordering, type of global constraint decomposition
▷ not random seed, whether to enable debugging, etc.
▷ some will affect performance, others will have no effect at all
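The distinction can be made concrete by writing a parameter space down explicitly. A minimal sketch, where the solver parameters and their values are hypothetical:

```python
# Hypothetical parameter space for a constraint solver: every entry is
# something that makes sense to change. Random seed and debug flags are
# deliberately left out -- they are not meaningful tuning targets.
param_space = {
    "search_heuristic": ["dom/wdeg", "impact", "activity"],
    "variable_ordering": ["lex", "min-domain", "max-degree"],
    "restart_factor": [1.1, 1.5, 2.0],
}

# number of distinct configurations this space defines
n_configs = 1
for values in param_space.values():
    n_configs *= len(values)
print(n_configs)  # 27
```

Even this tiny space defines 27 configurations; real solvers have dozens of parameters, which is why exhaustive evaluation quickly becomes infeasible.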


SLIDE 6

Automated Parameter Tuning

Frank Hutter and Marius Lindauer, “Algorithm Configuration: A Hands-on Tutorial”, AAAI 2016

SLIDE 7

General Approach

▷ evaluate algorithm as black-box function
▷ observe effect of parameters without knowing the inner workings
▷ decide where to evaluate next
▷ balance diversification/exploration and intensification/exploitation
▷ repeat until satisfied
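This loop can be sketched in a few lines; `evaluate` and `sample_config` are illustrative stand-ins for the black-box algorithm and for whatever strategy decides where to evaluate next:

```python
import random

def tune(evaluate, sample_config, budget, seed=1):
    """Generic black-box tuning loop: observe the effect of parameters
    without knowing the algorithm's inner workings, keep the best."""
    rng = random.Random(seed)
    best_config, best_score = None, float("inf")
    for _ in range(budget):
        config = sample_config(rng)   # decide where to evaluate next
        score = evaluate(config)      # black-box evaluation, e.g. runtime
        if score < best_score:
            best_config, best_score = config, score
    return best_config, best_score

# toy objective: a single parameter whose best value is 3
best, score = tune(lambda c: (c - 3) ** 2,
                   lambda rng: rng.uniform(0, 10), budget=100)
```

Grid search, random search, local search, and surrogate-model-based search (next slides) differ only in how `sample_config` picks the next point.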


SLIDE 8

Grid and Random Search

▷ evaluate certain points in parameter space

Bergstra, James, and Yoshua Bengio. “Random Search for Hyper-Parameter Optimization.” J. Mach. Learn. Res. 13, no. 1 (February 2012): 281–305.
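A minimal sketch contrasting the two. The toy objective is hypothetical and deliberately depends on only one of its two parameters, which is the case where Bergstra and Bengio show random search has the advantage: it tries more distinct values of the parameter that matters.

```python
import itertools
import random

def objective(x, y):
    # toy objective: only x matters, y has no effect at all
    return (x - 0.73) ** 2

# grid search: 5 x 5 = 25 evaluations, but only 5 distinct values of x
grid = [(i / 4, j / 4) for i, j in itertools.product(range(5), repeat=2)]
best_grid = min(objective(x, y) for x, y in grid)

# random search: the same budget of 25 evaluations tries 25 distinct x
rng = random.Random(0)
best_random = min(objective(rng.random(), rng.random()) for _ in range(25))
```

With the same budget, the grid can get no closer to the optimum at 0.73 than its nearest grid line (x = 0.75), while random search samples 25 distinct values along the axis that matters.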


SLIDE 9

Local Search

▷ start with random configuration
▷ change a single parameter (local search step)
▷ if better, keep the change, else revert
▷ repeat, stop when resources exhausted or desired solution quality achieved
▷ restart occasionally with new random configurations
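The steps above can be sketched as a first-improvement local search over a discrete parameter space; the function names and the toy objective are illustrative:

```python
import random

def local_search(evaluate, space, steps, restart_every=25, seed=0):
    """Local search as on the slide: change one parameter at a time,
    keep improvements, revert otherwise, restart occasionally."""
    rng = random.Random(seed)

    def random_config():
        return {p: rng.choice(vals) for p, vals in space.items()}

    current = random_config()              # start with random configuration
    current_score = evaluate(current)
    best, best_score = dict(current), current_score
    for step in range(1, steps + 1):
        if step % restart_every == 0:      # occasional random restart
            current = random_config()
            current_score = evaluate(current)
        neighbour = dict(current)          # local search step:
        p = rng.choice(list(space))        # change a single parameter
        neighbour[p] = rng.choice(space[p])
        score = evaluate(neighbour)
        if score < current_score:          # if better, keep the change;
            current, current_score = neighbour, score
        if current_score < best_score:     # otherwise the change is discarded
            best, best_score = dict(current), current_score
    return best, best_score

# toy objective: two integer parameters, optimum at a=4, b=7
space = {"a": list(range(10)), "b": list(range(10))}
best, score = local_search(lambda c: abs(c["a"] - 4) + abs(c["b"] - 7),
                           space, steps=300)
```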


SLIDE 10

Local Search Example

(Initialisation)

graphics by Holger Hoos

SLIDE 11

Local Search Example

(Initialisation)

graphics by Holger Hoos

SLIDE 12

Local Search Example

(Local Search)

graphics by Holger Hoos

SLIDE 13

Local Search Example

(Local Search)

graphics by Holger Hoos

SLIDE 14

Local Search Example

(Perturbation)

graphics by Holger Hoos

SLIDE 15

Local Search Example

(Local Search)

graphics by Holger Hoos

SLIDE 16

Local Search Example

(Local Search)

graphics by Holger Hoos

SLIDE 17

Local Search Example

(Local Search)

graphics by Holger Hoos

SLIDE 18

Local Search Example

Selection (using Acceptance Criterion)

graphics by Holger Hoos

SLIDE 19

Surrogate-Model-Based Search

▷ evaluate small number of initial (random) configurations
▷ build surrogate model of parameter-performance surface based on this
▷ use model to predict where to evaluate next
▷ repeat, stop when resources exhausted or desired solution quality achieved
▷ allows targeted exploration of promising configurations
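A rough sketch of this loop, using a random forest as the surrogate and expected improvement computed from the spread of the per-tree predictions. This approximates, rather than reproduces, the machinery in tools like mlrMBO or SMAC; the toy objective and all names are illustrative:

```python
import math
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def expected_improvement(model, X_cand, y_best):
    """Expected improvement for minimization; uncertainty is roughly
    approximated by the std. dev. of the forest's per-tree predictions."""
    per_tree = np.stack([t.predict(X_cand) for t in model.estimators_])
    mu, sigma = per_tree.mean(axis=0), per_tree.std(axis=0) + 1e-9
    z = (y_best - mu) / sigma
    Phi = np.vectorize(lambda v: 0.5 * (1 + math.erf(v / math.sqrt(2))))
    phi = np.exp(-z ** 2 / 2) / math.sqrt(2 * math.pi)
    return (y_best - mu) * Phi(z) + sigma * phi

f = lambda x: np.sin(3 * x) + x ** 2          # toy black-box objective
rng = np.random.default_rng(0)

X = rng.uniform(-1, 1, (5, 1))                # small initial (random) design
y = f(X).ravel()
cand = np.linspace(-1, 1, 200).reshape(-1, 1)
for _ in range(10):                           # repeat: fit, predict, evaluate
    model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
    ei = expected_improvement(model, cand, y.min())
    x_next = cand[ei.argmax()]                # evaluate where EI is largest
    X = np.vstack([X, [x_next]])
    y = np.append(y, f(x_next[0]))
```

The acquisition function (here EI) is what balances exploration (high uncertainty) against exploitation (low predicted value).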


SLIDE 20

Surrogate-Model-Based Search Example

[Plot: observed values y, surrogate predictions yhat, and expected improvement ei over parameter x; design points marked "init" and "prop". Iter = 1, Gap = 1.9909e−01]

Bischl, Bernd, Jakob Richter, Jakob Bossek, Daniel Horn, Janek Thomas, and Michel Lang. “MlrMBO: A Modular Framework for Model-Based Optimization of Expensive Black-Box Functions,” March 9, 2017. http://arxiv.org/abs/1703.03373.

SLIDE 21

Surrogate-Model-Based Search Example

[Plot: y, yhat, and ei over x; design points marked "init", "seq", and "prop". Iter = 2, Gap = 1.9909e−01]

(Bischl et al., 2017)

SLIDE 22

Surrogate-Model-Based Search Example

[Plot: y, yhat, and ei over x; design points marked "init", "seq", and "prop". Iter = 3, Gap = 1.9909e−01]

(Bischl et al., 2017)

SLIDE 23

Surrogate-Model-Based Search Example

[Plot: y, yhat, and ei over x; design points marked "init", "seq", and "prop". Iter = 4, Gap = 1.9992e−01]

(Bischl et al., 2017)

SLIDE 24

Surrogate-Model-Based Search Example

[Plot: y, yhat, and ei over x; design points marked "init", "seq", and "prop". Iter = 5, Gap = 1.9992e−01]

(Bischl et al., 2017)

SLIDE 25

Surrogate-Model-Based Search Example

[Plot: y, yhat, and ei over x; design points marked "init", "seq", and "prop". Iter = 6, Gap = 1.9996e−01]

(Bischl et al., 2017)

SLIDE 26

Surrogate-Model-Based Search Example

[Plot: y, yhat, and ei over x; design points marked "init", "seq", and "prop". Iter = 7, Gap = 2.0000e−01]

(Bischl et al., 2017)

SLIDE 27

Surrogate-Model-Based Search Example

[Plot: y, yhat, and ei over x; design points marked "init", "seq", and "prop". Iter = 8, Gap = 2.0000e−01]

(Bischl et al., 2017)

SLIDE 28

Surrogate-Model-Based Search Example

[Plot: y, yhat, and ei over x; design points marked "init", "seq", and "prop". Iter = 9, Gap = 2.0000e−01]

(Bischl et al., 2017)

SLIDE 29

Surrogate-Model-Based Search Example

[Plot: y, yhat, and ei over x; design points marked "init", "seq", and "prop". Iter = 10, Gap = 2.0000e−01]

(Bischl et al., 2017)

SLIDE 30

Two-Slide MBO

# http://www.cs.uwyo.edu/~larsko/mbo.py
# (imports added here so the two slides run stand-alone)
import random
import numpy as np
from sklearn import datasets, svm
from sklearn.model_selection import cross_val_score

iris = datasets.load_iris()

params = { 'C': np.logspace(-2, 10, 13), 'gamma': np.logspace(-9, 3, 13) }
param_grid = [ { 'C': x, 'gamma': y }
               for x in params['C'] for y in params['gamma'] ]
# [{'C': 0.01, 'gamma': 1e-09}, {'C': 0.01, 'gamma': 1e-08}, ...]

initial_samples = 3
evals = 10
random.seed(1)

def est_acc(pars):
    clf = svm.SVC(**pars)
    return np.median(cross_val_score(clf, iris.data, iris.target, cv = 10))

data = []
for pars in random.sample(param_grid, initial_samples):
    acc = est_acc(pars)
    data += [ list(pars.values()) + [ acc ] ]
# [[1.0, 0.1, 1.0],
#  [1000000000.0, 1e-07, 1.0],
#  [0.1, 1e-06, 0.9333333333333333]]


SLIDE 31

Two-Slide MBO

from sklearn.ensemble import RandomForestRegressor   # (import added)

regr = RandomForestRegressor(random_state = 0)
for it in range(0, evals):          # loop over the evaluation budget
    df = np.array(data)
    regr.fit(df[:,0:2], df[:,2])
    preds = regr.predict([ list(pars.values()) for pars in param_grid ])
    i = preds.argmax()
    acc = est_acc(param_grid[i])
    data += [ list(param_grid[i].values()) + [ acc ] ]
    print("{}: best predicted {} for {}, actual {}"
          .format(it, round(preds[i], 2), param_grid[i], round(acc, 2)))

i = np.array(data)[:,2].argmax()
print("Best accuracy ({}) for parameters {}".format(data[i][2], data[i][0:2]))


SLIDE 32

Two-Slide MBO (slide 3)

0: best predicted 0.99 for {'C': 1.0, 'gamma': 1e-09}, actual 0.93
1: best predicted 0.99 for {'C': 1000000000.0, 'gamma': 1e-09}, actual 0.93
2: best predicted 0.99 for {'C': 1000000000.0, 'gamma': 0.1}, actual 0.93
3: best predicted 0.97 for {'C': 1.0, 'gamma': 0.1}, actual 1.0
4: best predicted 0.99 for {'C': 1.0, 'gamma': 0.1}, actual 1.0
5: best predicted 1.0 for {'C': 1.0, 'gamma': 0.1}, actual 1.0
6: best predicted 1.0 for {'C': 1.0, 'gamma': 0.1}, actual 1.0
7: best predicted 1.0 for {'C': 1.0, 'gamma': 0.1}, actual 1.0
8: best predicted 1.0 for {'C': 0.01, 'gamma': 0.1}, actual 0.93
9: best predicted 1.0 for {'C': 1.0, 'gamma': 0.1}, actual 1.0
Best accuracy (1.0) for parameters [1.0, 0.1]


SLIDE 33

Application – Optimizing Graphene Oxide Reduction

▷ reduce graphene oxide to graphene through laser irradiation
▷ allows creating electrically conductive lines in insulating material
▷ laser parameters need to be tuned carefully to achieve good results


SLIDE 34

From Graphite/Coal to Carbon Electronics

Overview of the Process


SLIDE 35

Evaluation of Irradiated Material

[Spectra of the irradiated material, with labelled peaks at 595, 833, 1071, 1309, 1428, 1527, 1785, and 2023 cm⁻¹]

SLIDE 36

Morphology of Irradiated Material


SLIDE 37

Surrogate-Model-Based Optimization

[Plot: achieved ratio (2–6) over optimization iterations (10–50)]

SLIDE 38

Surrogate-Model-Based Optimization

[Plot: achieved ratio (2–8) over iterations, with points from during training and after the 1st prediction, predicted vs. actual; micrographs at 50 µm scale]

  • Predictions work even with small training dataset (19 points)
  • AI model achieved IG/ID ratio (>6) after 1st prediction

SLIDE 39

Explored Parameter Space

[Plot: explored parameter space, with evaluations numbered 1–48 and coloured by achieved ratio (2–6)]

SLIDE 40

Tools and Resources

iRace http://iridia.ulb.ac.be/irace/
TPOT https://github.com/EpistasisLab/tpot
mlrMBO https://github.com/mlr-org/mlrMBO
SMAC http://www.cs.ubc.ca/labs/beta/Projects/SMAC/
Spearmint https://github.com/HIPS/Spearmint
TPE https://jaberg.github.io/hyperopt/

COSEAL group for COnfiguration and SElection of ALgorithms: https://www.coseal.net/
Out soon: edited book on automated machine learning (Frank Hutter, Lars Kotthoff, Joaquin Vanschoren): https://www.automl.org/book/
More on our applications: https://www.uwyo.edu/ceas/engineering-initiative/aim/


SLIDE 41

We’re hiring!

Several funded positions available.
