Hyperparameter Optimization Strategies


  1. Hyperparameter optimization strategies
     git clone https://github.com/IASIAI/hyperparameter-optimization-strategies.git
     meetup.com/IASI-AI · facebook.com/AI.Iasi · iasiai.net

  2. Gabriel Marchidan · software architect
     Bogdan Burlacu · AI researcher, PhD

  3. “Algorithms are conceived in analytic purity in the high citadels of academic research, heuristics are midwifed by expediency in the dark corners of the practitioner’s lair” (Fred Glover, 1977)

  4. Contents
     ● Problem statement
     ● Disclaimer
     ● Grid search
     ● Random search
     ● Bayesian optimization
     ● Covariance Matrix Adaptation Evolution Strategy (CMA-ES)

  5. Problem statement
     ● Hyperparameters are parameters whose values are set prior to the start of the learning process; by contrast, the values of the model's other parameters are derived via training.
     ● The problem of (hyper)parameter optimization is not specific to ML.

  6. Problem statement (figure slide)

  7. Problem statement (figure slide)
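
As a point of reference for the problem statement above, hyperparameter optimization is commonly written as a nested optimization problem (a standard textbook formulation, not necessarily the notation used on the original figure slides):

      \lambda^{*} = \operatorname*{arg\,min}_{\lambda \in \Lambda}
          \mathcal{L}\left( A_{\lambda}(D_{\mathrm{train}}),\; D_{\mathrm{valid}} \right)

where \Lambda is the hyperparameter space, A_{\lambda} is the learning algorithm configured with hyperparameters \lambda and fitted on the training data, and \mathcal{L} is the loss measured on held-out validation data.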

  8. Disclaimer
     Things that will improve a real-life algorithm more than in-depth hyperparameter optimization:
     ● Having better data
     ● Having more data
     ● Changing the algorithm (or the weights with which multiple algorithms' results are combined)
     So don't start with this!

  9. Grid search
     ● Scans the parameter space in a grid pattern with a certain step size
     ● Hence the name "grid search"

  10. Grid search
     ● Grid search probes parameter configurations deterministically, by laying a grid of candidate configurations over the parameter space
     ● For each continuous dimension of the parameter space, a step size is chosen (this defines the resolution of the grid)

  11. Grid search
     ● Requires a lot of function evaluations
     ● Is highly impractical for algorithms with more than 4 parameters
     ● The number of function evaluations grows exponentially with each additional parameter (the curse of dimensionality)
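
A minimal grid search sketch using scikit-learn, which the setup slide later installs; the model (an SVM classifier) and the parameter grid are illustrative choices, not taken from the slides:

      from sklearn.datasets import load_iris
      from sklearn.model_selection import GridSearchCV
      from sklearn.svm import SVC

      X, y = load_iris(return_X_y=True)

      # A 3 x 3 grid yields 9 configurations; every extra parameter
      # multiplies the number of evaluations (curse of dimensionality).
      param_grid = {"C": [0.1, 1.0, 10.0], "gamma": [0.01, 0.1, 1.0]}

      search = GridSearchCV(SVC(), param_grid, cv=5)
      search.fit(X, y)
      print(search.best_params_, search.best_score_)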

  12. Curse of dimensionality (figure slide)

  13. Random search
     ● Random points are chosen from the parameter space
     ● In turn, this samples random points from the solution space

  14. Random search
     ● Random search suggests configurations sampled at random from your parameter space
     ● The best result is saved along with the corresponding parameters
     ● The next point is sampled either uniformly from the whole parameter space or randomly from a sphere around the current best
     ● The process repeats until a termination criterion is met (usually a number of iterations)

  15. Random search
     ● Can be applied to functions that are not continuous or differentiable
     ● Makes no assumptions about the properties of the function
     ● Has multiple variants: fixed step, optimum step, adaptive step, etc.
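
A minimal sketch of the loop described on slide 14, assuming an illustrative objective, search bounds, and a 50/50 choice between global sampling and sampling around the current best (none of these specifics come from the slides):

      import numpy as np

      rng = np.random.default_rng(42)

      def objective(x):
          # Stand-in for a model's validation error as a function of
          # its hyperparameters; any black-box function works here.
          return float(np.sum(x ** 2))

      low, high = -5.12, 5.12        # parameter space bounds
      dim, n_iter, radius = 2, 200, 0.5

      best_x = rng.uniform(low, high, dim)
      best_f = objective(best_x)

      for _ in range(n_iter):
          if rng.random() < 0.5:     # sample the whole space...
              x = rng.uniform(low, high, dim)
          else:                      # ...or a sphere around the current best
              x = np.clip(best_x + rng.normal(0.0, radius, dim), low, high)
          f = objective(x)
          if f < best_f:             # keep the best result and its parameters
              best_x, best_f = x, f

      print(best_x, best_f)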

  16. Bayesian optimization
     ● With each observation we improve a model of the objective function
     ● We sample the points that have the highest chance of improving on the best value found so far

  17. Bayesian optimization (figure slide)

  18. Bayesian optimization
     ● To use Bayesian optimization, we need a way to flexibly model distributions over objective functions
     ● Gaussian Processes are a particularly elegant technique for this
     ● Used for problems where each evaluation is costly in time or resources
     ● Historically, Gaussian Process regression (kriging) was developed to help prospect for gold

  19. Bayesian optimization (figure slide)
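
A minimal Bayesian optimization sketch with scikit-optimize (skopt), which the setup slide below installs; the toy objective and bounds are illustrative, not from the slides:

      from skopt import gp_minimize

      def objective(params):
          # Toy stand-in for an expensive model evaluation.
          x, y = params
          return (x - 1.0) ** 2 + (y + 2.0) ** 2

      # gp_minimize fits a Gaussian Process surrogate to the observations
      # made so far and picks each next point via an acquisition function.
      result = gp_minimize(objective,
                           dimensions=[(-5.0, 5.0), (-5.0, 5.0)],
                           n_calls=25, random_state=0)
      print(result.x, result.fun)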

  20. CMA-ES
     ● CMA-ES stands for Covariance Matrix Adaptation Evolution Strategy
     ● It is an evolutionary algorithm for difficult non-linear, non-convex black-box optimization problems in continuous domains
     ● CMA-ES is considered state-of-the-art in evolutionary computation and has been adopted as one of the standard tools for continuous optimization

  21. CMA-ES
     ● It is an evolutionary algorithm
     ● Solutions are represented as real-valued parameter vectors
     ● Initial solutions are generated randomly
     ● Subsequent solutions are generated from the fittest solutions of the previous generation by recombination and mutation

  22. CMA-ES
     ● The population size λ is the same at every step (λ > 4, usually λ > 20)
     ● Each value of a candidate's parameter vector is generated by sampling a multivariate normal distribution
     ● The distribution is updated by covariance matrix adaptation (the "CMA" part), based on the best solutions found in the current step (the "ES" selection)

  23. CMA-ES
     ● The mean is updated each step to provide a new centroid for sampling new solutions
     ● Two paths tracking the time evolution of the distribution's mean are recorded, called search or evolution paths
     ● The two paths carry information about the correlation between consecutive iterations

  24. CMA-ES
     ● Each iteration the mean is adjusted
     ● The two evolution paths are updated
     ● A new step size is calculated
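
A sketch of this loop using the third-party cma package (pip install cma; note it is not among the dependencies listed on the setup slide), with an illustrative sphere objective:

      import cma
      import numpy as np

      def sphere(x):
          # Illustrative objective; in hyperparameter tuning this would
          # be the validation error of the configuration encoded by x.
          return float(np.sum(np.asarray(x) ** 2))

      # Start at [0.5, ..., 0.5] with initial step size sigma0 = 0.5;
      # the package defaults the population size to 4 + floor(3 * ln(n)).
      es = cma.CMAEvolutionStrategy(5 * [0.5], 0.5)
      while not es.stop():
          candidates = es.ask()      # sample lambda candidates from the distribution
          es.tell(candidates, [sphere(c) for c in candidates])  # rank and adapt
      print(es.result.xbest, es.result.fbest)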

  25. Running the simulations
     ● Windows: WinPython (https://winpython.github.io/)
     ● Linux and macOS: Python 3.5+, SciPy, scikit-learn, skopt (scikit-optimize)
     ● A Python virtualenv is recommended on Linux and macOS
     ● To test, you should be able to run the examples at https://scikit-optimize.github.io/
     git clone https://github.com/IASIAI/hyperparameter-optimization-strategies.git

  26. Simulation
     ● Trying to find the global optimum of the Rastrigin function (a standard multimodal benchmark)
     ● Will run, in turn:
        ○ Grid search
        ○ Random search
        ○ Bayesian optimization
        ○ CMA-ES
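
For reference, the Rastrigin function (defined on the Wikipedia page cited in the bibliography) is, in n dimensions:

      f(\mathbf{x}) = A\,n + \sum_{i=1}^{n} \left[ x_i^{2} - A \cos(2 \pi x_i) \right],
      \qquad A = 10, \quad x_i \in [-5.12, 5.12]

Its many regularly spaced local optima, with the global minimum f(\mathbf{0}) = 0, make it a standard stress test for global optimization methods.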

  27. Bibliography
     ● https://www.lri.fr/~hansen/cmaesintro.html
     ● https://blog.sigopt.com/posts/evaluating-hyperparameter-optimization-strategies
     ● https://cloud.google.com/blog/big-data/2017/08/hyperparameter-tuning-in-cloud-machine-learning-engine-using-bayesian-optimization
     ● https://en.wikipedia.org/wiki/Hyperparameter_(machine_learning)
     ● https://en.wikipedia.org/wiki/CMA-ES
     ● https://en.wikipedia.org/wiki/Rastrigin_function

  28. Questions?

  29. Thank you!
