SLIDE 1

DYNAMIC RESAMPLING FOR GUIDED EVOLUTIONARY MULTI-OBJECTIVE OPTIMIZATION OF STOCHASTIC SYSTEMS

Florian Siegmund, Amos H.C. Ng University of Skövde, Sweden Kalyanmoy Deb Indian Institute of Technology Kanpur, India

SLIDE 2

Outline

  • Background: Guided Search
  • Algorithm-independent resampling techniques
  • Distance-based Resampling
  • Numerical Experiments
  • Conclusions/Future Work
SLIDE 3

Background: Guided Multi-objective Search

  • Precondition: Limited simulation budget
  • High budget required to explore Pareto-front
  • High-dimensional objective spaces
  • Costly evaluation of stochastic models
  • Focus on interesting areas in objective space
  • R-NSGA-II: Reference point
SLIDE 4

Background: Reference point-based NSGA-II

  • Evolutionary Optimization Algorithm
  • Deb et al. (2006)
SLIDE 5

NSGA-II: Selection step

SLIDE 6

NSGA-II

SLIDE 7

NSGA-II

SLIDE 8

R-NSGA-II: Selection step

SLIDE 9

Diversity Control

  • R-NSGA-II on deterministic benchmark problem ZDT1
SLIDE 10

R-NSGA-II: Interactive

SLIDE 11

R-NSGA-II: Interactive

SLIDE 12

R-NSGA-II: Interactive

SLIDE 13

Stochastic Simulation ⇒ Resampling

  • Noisy problem:
  • The simulation model produces varying output for the same input when evaluated multiple times
  • True output values are unknown
  • ⇒ Performance degradation
  • Handle the noisy problem: use the sample mean and sample standard deviation
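As a minimal sketch of this idea (the function and model names are illustrative, not from the talk): each solution is simulated several times for the same input and the noisy outputs are aggregated with the sample mean and sample standard deviation.

```python
import math
import random

def resample(simulate, x, n):
    """Evaluate a stochastic model n times for the same input x and
    aggregate the noisy outputs with sample statistics."""
    outputs = [simulate(x) for _ in range(n)]
    mean = sum(outputs) / n
    # Sample standard deviation: n - 1 in the denominator (requires n > 1).
    var = sum((y - mean) ** 2 for y in outputs) / (n - 1)
    return mean, math.sqrt(var)

# Toy stochastic model: true value x^2, plus additive Gaussian noise.
random.seed(0)
noisy_model = lambda x: x * x + random.gauss(0.0, 0.15)
mean, sd = resample(noisy_model, 2.0, 30)  # estimates of 4.0 and 0.15
```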

SLIDE 14

Resampling

  • Common: Static Resampling
  • Limited Budget
  • Trade-off: Exploration vs. Exploitation
  • Sampling allocation strategy needed
SLIDE 15

Resampling strategies

  • Selection Sampling
  • Accuracy Sampling
  • Selection Sampling for single-objective problems: OCBA
  • R-NSGA-II uses a scalar fitness criterion, but only as a secondary criterion
  • ⇒ complex

SLIDE 16

Resampling techniques

  • Selection Sampling for EMO: complex
  • ⇒ Approximations for a similar effect
  • Strategy: good knowledge of solutions close to R will support the algorithm

SLIDE 17

Criteria for resampling

  • General
    • Time
    • Dominance-relation
    • Variance
    • Constraint Violation
  • Reference point
    • Distance to reference point
    • Progress
SLIDE 18

Basic resampling techniques

  • Time-based resampling
  • Dominance-based
  • Pareto-rank-based
  • Standard Error Dynamic Resampling
SLIDE 19

Time-based Resampling

  • Transformation function
  • NumSamples ∈ {Min, .., Max}
  • Sampling need ∈ [0, 1]
  • Mapping: Need → NumSamples

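The mapping can be sketched as follows; the rounding scheme and the budget bounds (1 to 30 samples, as in the later experiments) are assumptions, since the slide only states Need → NumSamples.

```python
def samples_from_need(need, min_samples=1, max_samples=30):
    """Map a sampling need in [0, 1] to a sample count in {Min, .., Max}."""
    need = min(1.0, max(0.0, need))          # clamp to [0, 1]
    return min_samples + round(need * (max_samples - min_samples))
```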
SLIDE 20

Time-based Resampling

  • Linear allocation

SLIDE 21

Time-based Resampling

  • Delayed allocation

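A sketch of the two time-based sampling needs, assuming t is the number of evaluations spent out of a total budget T (the delay exponent k is an assumed parameterization):

```python
def linear_need(t, T):
    """Linear allocation: the sampling need grows uniformly with time."""
    return t / T

def delayed_need(t, T, k=3):
    """Delayed allocation: the need stays low early in the run and rises
    steeply towards the end, saving samples for the converged population."""
    return (t / T) ** k
```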
SLIDE 22

Standard Error Dynamic Resampling

  • Based on variance information
  • Single-objective version by Di Pietro (2004)
  • Multi-objective version:
  • Adds k samples at a time
  • Checks whether the standard error se_i of each objective mean stays below a maximum threshold

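A multi-objective SEDR loop might look like the following sketch; the batch size k, the budget, and the threshold value are assumptions, and the stopping rule checks the largest per-objective standard error against the threshold:

```python
import math

def sedr(simulate, x, k=2, max_samples=30, se_threshold=0.05):
    """Standard Error Dynamic Resampling (sketch): add k samples at a
    time until the standard error of every objective mean falls below
    a threshold, or the sampling budget is exhausted."""
    samples = []                       # each entry: tuple of objective values
    means = None
    while len(samples) < max_samples:
        samples.extend(simulate(x) for _ in range(k))
        n = len(samples)
        if n < 2:
            continue                   # need two samples for a variance
        means = [sum(obj) / n for obj in zip(*samples)]
        # Standard error of the mean for each objective.
        ses = [math.sqrt(sum((s[j] - m) ** 2 for s in samples) / (n - 1) / n)
               for j, m in enumerate(means)]
        if max(ses) <= se_threshold:
            break
    return means, samples
```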
SLIDE 23

Distance-based Resampling

  • Infeasible case
  • Feasible case
SLIDE 24

Infeasible case

  • R is not attainable by any solution
  • R is "below" the Pareto-front
  • Minimum distance to R

SLIDE 25

Infeasible case

  • R is not attainable by any solution
  • ⇒ Maximum number of samples is never assigned
  • ⇒ Adapt the transformation function
SLIDE 26

Infeasible case

Use the adapted transformation function until a solution is found that dominates R.
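The distance-based allocation above can be sketched like this; the Euclidean distance, the normalization constant d_max, and the way the minimum observed distance d_min shifts the transformation are assumptions about details the slides leave open:

```python
import math

def distance_based_need(objectives, ref, d_max=1.0):
    """Distance-based sampling need: solutions close to the reference
    point R get a need near 1, distant solutions a need near 0."""
    d = math.dist(objectives, ref)
    return min(1.0, max(0.0, 1.0 - d / d_max))

def adapted_need(objectives, ref, d_min, d_max=1.0):
    """Infeasible case: R lies below the Pareto-front, so the distance
    never reaches 0 and the maximum budget would never be assigned.
    Shifting by the smallest distance found so far (d_min) lets the
    best solutions reach need = 1 again."""
    d = math.dist(objectives, ref) - d_min
    return min(1.0, max(0.0, 1.0 - d / d_max))
```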

SLIDE 27

Feasible case

  • R is "over" the Pareto-front
  • Solutions can be found that dominate R
SLIDE 28

Feasible case

  • Problem: better solutions have a larger distance to R
  • Define the Virtual Reference Point VR as the solution that is
  • 1. Non-dominated
  • 2. Closest to R
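A sketch of selecting VR under these two conditions, assuming minimization and Euclidean distance (the helper names are illustrative):

```python
import math

def dominates(a, b):
    """Pareto dominance for minimization: a is no worse in every
    objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def virtual_reference_point(population, ref):
    """Choose the Virtual Reference Point VR: the non-dominated solution
    closest to R. Measuring distances to VR instead of R avoids
    penalizing solutions that improve past R."""
    nd = [p for p in population
          if not any(dominates(q, p) for q in population if q is not p)]
    return min(nd, key=lambda p: math.dist(p, ref))
```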
SLIDE 29

Feasible case

SLIDE 30

Numerical experiments

  • Benchmark functions
  • Production line scheduling
  • Performance Measurement:
  • Measure average distance to R
  • From the α% closest solutions of the population
  • After every generation
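The performance measure can be sketched as below; the value of α and the use of Euclidean distance are the only assumed details:

```python
import math

def avg_distance_to_ref(population, ref, alpha=0.25):
    """Average distance to the reference point R over the alpha fraction
    of the population that is closest to R, computed after a generation."""
    dists = sorted(math.dist(p, ref) for p in population)
    k = max(1, int(alpha * len(dists)))   # at least one solution
    return sum(dists[:k]) / k
```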
SLIDE 31

Numerical experiments

  • Benchmark functions
  • ZDT1
  • Additive noise
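For reference, the ZDT1 benchmark with additive Gaussian noise (σ = 0.15, as in the experiments below) can be written as:

```python
import math
import random

def zdt1(x):
    """Deterministic two-objective ZDT1 benchmark (minimization),
    x[i] in [0, 1]; the Pareto-front is f2 = 1 - sqrt(f1)."""
    f1 = x[0]
    g = 1.0 + 9.0 * sum(x[1:]) / (len(x) - 1)
    f2 = g * (1.0 - math.sqrt(f1 / g))
    return f1, f2

def noisy_zdt1(x, sigma=0.15, rng=random):
    """ZDT1 with additive Gaussian noise on both objectives."""
    f1, f2 = zdt1(x)
    return f1 + rng.gauss(0.0, sigma), f2 + rng.gauss(0.0, sigma)
```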
SLIDE 32

R-NSGA-II, Static Resampling

  • Benchmark function ZDT1, additive noise σ = 0.15
  • R-NSGA-II, 5000 evaluations, R = (0.5, 0)
SLIDE 33

Distance-based Resampling

  • Benchmark function ZDT1, additive noise σ = 0.15
  • R-NSGA-II, 5000 evaluations, R = (0.5, 0)
SLIDE 34

R-NSGA-II, Time-based Resampling

  • Benchmark function ZDT1, additive noise σ = 0.15
  • R-NSGA-II, R = (0.5, 0)
SLIDE 35

Standard Error Dynamic Resampling

  • Benchmark function ZDT1, additive noise σ = 0.15
  • R-NSGA-II, R = (0.5, 0), SEDR
SLIDE 36

Numerical experiments

  • Production line model
  • Minimize work-in-progress (WiP)
  • Maximize throughput (TH)
  • High-noise problem, CV = 1.5
  • 1 to 30 samples per solution
SLIDE 37

Results

  • R = (WiP, TH) = (8, 0.8)
SLIDE 38

Results

  • R = (WiP, TH) = (8, 0.8)
SLIDE 39

Conclusions

  • Distance-based resampling is effective for R-NSGA-II
  • It performs better than algorithm-independent strategies
SLIDE 40

Future work

  • Evaluate on different noisy industrial SBO problems
  • Computing cluster with 100 workstations
  • Cooperation with Volvo
  • Obtain true output values for performance measurement
  • Accuracy Sampling
  • Selection Sampling
  • Distance to R is a scalar value
  • Apply an existing ranking-and-selection method, such as OCBA
  • Simplifies the algorithm: only one fitness criterion
SLIDE 41

Questions?

  • Thank you!