

  1. Applications of Machine Learning in Engineering (and Parameter Tuning) Lars Kotthoff, University of Wyoming, larsko@uwyo.edu, RMACC, 23 May 2019. Slides available at https://www.cs.uwyo.edu/~larsko/slides/rmacc19.pdf

  2. Optimizing Graphene Oxide Reduction ▷ reduce graphene oxide to graphene through laser irradiation ▷ allows creating electrically conductive lines in insulating material ▷ laser parameters need to be tuned carefully to achieve good results

  3. From Graphite/Coal to Carbon Electronics – Overview of the Process

  4. Evaluation of Irradiated Material

  5. ML-Optimized Laser Parameters [scatter plot: achieved Ratio over 50 iterations of tuning]

  6. ML-Optimized Laser Parameters [plot: predicted vs. actual Ratio during training and after the 1st prediction; micrographs at 50 µm scale] • predictions work even with small training dataset (19 points) • AI model achieved I_G/I_D ratio (>6) after 1st prediction

  7. Explored Parameter Space [scatter plot: Ratio achieved by each of the 48 evaluated configurations across the parameter space]

  8. Design of New Materials ▷ optimize parameters of pattern generator for energy absorption of material

  9. Big Picture ▷ advance the state of the art through meta-algorithmic techniques ▷ rather than inventing new things, use existing things more intelligently – automatically ▷ invent new things through combinations of existing things

  10. What to Tune – Parameters ▷ anything you can change that makes sense to change ▷ e.g. heuristic, power of a laser, kernel for a machine learning method ▷ not random seed, whether to enable debugging, etc. ▷ some will affect performance, others will have no effect at all
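A parameter (configuration) space of this kind can be written down as a plain mapping from each tunable parameter to its allowed values. The sketch below is illustrative only; all names and values are made up, not taken from the slides.

```python
# Two hypothetical configuration spaces, written as plain dicts
# mapping each tunable parameter to the values it may take.
# All names and values are illustrative.
laser_space = {
    "power_watts": [1.0, 2.0, 3.0, 4.0],  # laser power -- likely affects results
    "scan_speed_mm_s": [50, 100, 200],    # scan speed
}
svm_space = {
    "kernel": ["linear", "rbf", "poly"],  # kernel for a machine learning method
    "C": [0.1, 1.0, 10.0],                # regularization strength
}
# Deliberately excluded: random seed, debug flags -- changing them
# either only adds noise or has no effect on solution quality.

def size(space):
    """Number of distinct configurations in a discrete space."""
    n = 1
    for values in space.values():
        n *= len(values)
    return n

print(size(svm_space))  # 3 kernels x 3 values of C = 9 configurations
```

Even this tiny space has 9 configurations; spaces grow multiplicatively with each added parameter, which is why exhaustive evaluation quickly becomes infeasible.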

  11. Automated Parameter Tuning – Frank Hutter and Marius Lindauer, “Algorithm Configuration: A Hands-on Tutorial”, AAAI 2016

  12. General Approach ▷ evaluate algorithm as black-box function ▷ observe effect of parameters without knowing the inner workings ▷ decide where to evaluate next ▷ balance diversification/exploration and intensification/exploitation ▷ repeat until satisfied
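The loop above can be sketched generically. This is a minimal sketch, not any particular tuner: the "decide where to evaluate next" step here is pure random exploration, whereas real tuners also exploit regions that already look good. The objective function is a made-up toy.

```python
import random

def tune(evaluate, space, budget, seed=0):
    """Generic black-box tuning loop: evaluate() is opaque, we only
    observe the mapping from a configuration to its score."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(budget):
        # decide where to evaluate next (here: uniform random sampling)
        cfg = {name: rng.choice(values) for name, values in space.items()}
        score = evaluate(cfg)  # black-box call, no access to inner workings
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

# toy objective with its optimum at x=3, y=2
space = {"x": range(0, 10), "y": range(0, 5)}
best_cfg, best_score = tune(
    lambda c: -((c["x"] - 3) ** 2 + (c["y"] - 2) ** 2), space, budget=100)
```

Stopping after a fixed budget is the simplest termination criterion; the next slides discuss why choosing that budget well is itself a problem.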

  13. When are we done? ▷ most approaches incomplete ▷ cannot prove optimality, not guaranteed to find optimal solution (with finite time) ▷ performance highly dependent on configuration space ▷ how do we know when to stop?

  14. Resource Budget – how much time/how many function evaluations? ▷ too much → wasted resources ▷ too little → suboptimal result ▷ use statistical tests ▷ evaluate on parts of the instance set ▷ for runtime: adaptive capping

  15. Grid and Random Search ▷ evaluate certain points in parameter space – Bergstra, James, and Yoshua Bengio. “Random Search for Hyper-Parameter Optimization.” J. Mach. Learn. Res. 13, no. 1 (February 2012): 281–305.
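Both strategies can be sketched in a few lines; the spaces and objectives below are hypothetical. Grid search walks a fixed discrete grid, while random search samples continuous ranges; Bergstra & Bengio's argument is that at equal budget, random search tries more distinct values of the parameters that actually matter.

```python
import itertools
import random

def grid_search(evaluate, space):
    """Grid search: evaluate every combination of a discrete space
    (dict of parameter -> list of values); returns the best config."""
    points = itertools.product(*space.values())
    best = max(points, key=lambda p: evaluate(dict(zip(space, p))))
    return dict(zip(space, best))

def random_search(evaluate, space, budget, seed=0):
    """Random search: sample `budget` configurations uniformly from
    continuous ranges (dict of parameter -> (lo, hi) bounds)."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(budget):
        cfg = {p: rng.uniform(lo, hi) for p, (lo, hi) in space.items()}
        score = evaluate(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg
```

Neither strategy uses the results of earlier evaluations to choose later ones; the local and surrogate-model-based searches on the following slides do.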

  16. Local Search ▷ start with random configuration ▷ change a single parameter (local search step) ▷ if better, keep the change, else revert ▷ repeat, stop when resources exhausted or desired solution quality achieved ▷ restart occasionally with new random configurations

  17.–25. Local Search Example (graphics by Holger Hoos) [sequence of figures showing Initialisation, repeated Local Search steps, a Perturbation, further Local Search steps, and Selection (using Acceptance Criterion)]
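The local-search procedure illustrated above can be sketched as follows. This is a minimal sketch assuming a discrete parameter space; the restart interval and the toy objective in the usage below are arbitrary choices, not from the slides.

```python
import random

def local_search(evaluate, space, budget, seed=0):
    """Local search over a discrete space (dict of parameter name ->
    list of allowed values); maximizes evaluate()."""
    rng = random.Random(seed)
    random_cfg = lambda: {p: rng.choice(vals) for p, vals in space.items()}
    cfg = random_cfg()                     # start with random configuration
    cur = evaluate(cfg)
    best_cfg, best = dict(cfg), cur
    for step in range(1, budget + 1):
        if step % 25 == 0:                 # occasional random restart
            cfg = random_cfg()
            cur = evaluate(cfg)
        p = rng.choice(list(space))        # pick one parameter ...
        old = cfg[p]
        cfg[p] = rng.choice(space[p])      # ... and change only it
        new = evaluate(cfg)
        if new > cur:
            cur = new                      # if better, keep the change
        else:
            cfg[p] = old                   # else revert
        if cur > best:
            best_cfg, best = dict(cfg), cur
    return best_cfg, best
```

The restarts are what keep the search from getting permanently stuck on a local optimum, at the cost of discarding progress.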

  26. Surrogate-Model-Based Search ▷ evaluate small number of initial (random) configurations ▷ build surrogate model of parameter-performance surface based on this ▷ use model to predict where to evaluate next ▷ repeat, stop when resources exhausted or desired solution quality achieved ▷ allows targeted exploration of promising configurations

  27.–33. Surrogate-Model-Based Search Example – Bischl, Bernd, Jakob Richter, Jakob Bossek, Daniel Horn, Janek Thomas, and Michel Lang. “mlrMBO: A Modular Framework for Model-Based Optimization of Expensive Black-Box Functions,” March 9, 2017. http://arxiv.org/abs/1703.03373. [sequence of plots over x ∈ [−1, 1] showing the true objective y, the surrogate prediction yhat, the expected improvement ei, and the initial vs. proposed points; the gap to the optimum shrinks from 1.5281e−01 at iteration 1 to 2.1938e−06 at iteration 7]
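The surrogate-model loop illustrated above can be sketched with a deliberately simple stand-in for the model: the prediction at a candidate is the value of the nearest evaluated point, and the distance to it acts as the uncertainty. Real tuners like mlrMBO or SMAC use Gaussian processes or random forests with a proper acquisition function (e.g. expected improvement); everything below is an illustrative simplification.

```python
import random

def surrogate_search(evaluate, lo, hi, budget, n_init=4, seed=0):
    """Surrogate-model-based search on one continuous parameter,
    maximizing evaluate(); nearest-neighbor surrogate for clarity."""
    rng = random.Random(seed)
    X = [rng.uniform(lo, hi) for _ in range(n_init)]  # initial random configs
    y = [evaluate(x) for x in X]                      # evaluate them
    for _ in range(budget - n_init):
        def acquisition(c):
            # surrogate: nearest evaluated point gives the prediction,
            # its distance gives the uncertainty
            d, fy = min((abs(c - xi), yi) for xi, yi in zip(X, y))
            return fy + d  # exploit good predictions + explore far regions
        candidates = [rng.uniform(lo, hi) for _ in range(200)]
        nxt = max(candidates, key=acquisition)  # model decides where to go
        X.append(nxt)
        y.append(evaluate(nxt))  # only now is the expensive function called
    i = max(range(len(y)), key=y.__getitem__)
    return X[i], y[i]

# maximize a toy objective with its optimum at x = 0.3
x, fx = surrogate_search(lambda x: -(x - 0.3) ** 2, -1.0, 1.0, budget=30)
```

The key property, as in the mlrMBO plots, is that the expensive function is called only at points the (cheap) model judges promising or uncertain.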

  34. Tools and Resources: TPOT https://github.com/EpistasisLab/tpot · mlrMBO https://github.com/mlr-org/mlrMBO · SMAC http://www.cs.ubc.ca/labs/beta/Projects/SMAC/ · Spearmint https://github.com/HIPS/Spearmint · TPE https://jaberg.github.io/hyperopt/ · iRace http://iridia.ulb.ac.be/irace/ · COSEAL group for COnfiguration and SElection of ALgorithms: https://www.coseal.net/ · Further reading: Jones, Donald R., Matthias Schonlau, and William J. Welch. “Efficient Global Optimization of Expensive Black-Box Functions.” J. of Global Optimization 13, no. 4 (December 1998): 455–92.

  35. https://www.automl.org/book/

  36. I’m hiring! Several positions available.

  37. Exercises ▷ (install Python and/or scikit-learn) ▷ download https://www.cs.uwyo.edu/~larsko/mbo/mbo.py and run it ▷ try a different data set ▷ try tuning different/more parameters ▷ …
