SLIDE 1

Genetic Algorithms for Optimization of Noisy Fitness Functions and Adaptation to Changing Environments

Hajime Kita (1) and Yasuhito Sano (2)

(1) Kyoto University, (2) Nissan Motor Co. Ltd.

SMAPIP, July 2003 (1)

SLIDE 2

Outline of Talk

  • Optimization of Uncertain Functions
    – Problems
    – Applications
  • Approach: Genetic Algorithms
  • Genetic Algorithms for Noisy Fitness Functions
    – Memory-based Fitness Estimation GA
  • Genetic Algorithms for Adaptation to Changing Environments
    – Genetic Algorithm using Sub-Population

SLIDE 3

Optimization of Uncertain Functions

Problems

  • Optimization problem having uncertain parameters δ:

    min_x f(x, δ)

  • Optimization of a noisy fitness function:

    min_x ⟨f(x) + δ⟩

  • Adaptation to changing environments:

    min_{x_t} f(x_t, δ_t)   for t = 1, 2, ...

    where δ_t is unknown, or predictable only to some extent, when deciding x_t.

SLIDE 4

Applications

Online adaptation of systems working in the real world:

  • Systems that are difficult to simulate precisely.
  • Optimization through experiments.
  • Online adaptation during use.

Simulation-based optimization of large and complex systems.

  • Optimization through simulation using random numbers.
  • Random fluctuation of observed performance.
  • Ex.: Traffic, communication and production systems.

SLIDE 5

Challenges

  • The problems themselves are more difficult than usual optimization problems.
  • The available number of fitness evaluations is severely restricted in practical applications.
  • Trade-off between
    – ‘to know more about the system (for good estimation)’ and
    – ‘to behave better in the system (for good optimization)’.

SLIDE 6

Example (Noisy Fitness, Online Adaptation): Problem of Engine Control for Motorcycle

  • Task: to improve engine response by dynamic control of the air-fuel ratio.
    – Acceleration operation of the throttle.
    – Time lag of fuel injection causes engine power loss.
    – Dynamic compensation of fuel injection for acceleration.
  • Challenges:
    – Difficulty in constructing a precise simulator.
    – Large noise in the observation of acceleration.
    – Optimization within 1000 evaluations.

SLIDE 7

Intake Tube of an Engine

[Figure: (a) intake of the engine — throttle plate, fuel injector, air flow, fuel flow; (b) evaluation system — MVEM, initial A/F, neural network, limiter, with engine speed and throttle angle as inputs; (c) simulation result — best fitness fBest vs. number of evaluations for Standard GA, Sample 3-GA, Sample 10-GA, MFEGA, tested-MFEGA, and Noiseless GA.]

SLIDE 8

Example (Noisy Fitness, Simulation-based Optimization): Problem of Multi-Car Elevator Control

  • Multi-car elevator: an elevator system having several cars in a single elevator shaft, driven by linear motors.
  • Applicability of conventional elevator group control is limited.
  • Task: to design the controller through simulation-based optimization.
  • Challenges:
    – The discrete-event simulation uses random numbers.
    – Observable variables are imperfect.
    – A single simulation run takes about 30 seconds.

SLIDE 9

[Figure: Multi-Car Elevator System — zones 1 and 2, garage floor, terminal floor, up and down cars, escape, hall operation panel.]

SLIDE 10

[Figure: Performance histograms for (a) MFEGA, (b) Sample-5 GA, and (c) Standard GA; (d) performance vs. generation for the 11-dim problem and the 22-dim problem with seeds 27, 37, and 47.]

SLIDE 11

Example (Changing Environment): Problem of Boat Engine Control

  • Operation of a motor boat: steering and engine throttle.
  • The dynamics of the boat change greatly depending on the steering.
  • Task: to achieve steady velocity during steering by engine control.
  • Challenges:
    – To detect the driver’s steering intent.
    – Optimization of engine control in several operation modes (go straight, turn, slalom).
    – Limited number of evaluations.

SLIDE 12

[Figure: Control scheme in a changing environment — START → prior estimation of the environment → choice of a control scheme among the population → adoption of the control scheme → operation of the system → evaluation of the performance → posterior estimation of the environment → update of the population of control schemes, then back to the prior estimation.]

SLIDE 13

[Figure: (a) model of the environment — straight, turn, slalom; (b) sample data — true vs. estimated environment over environment terms; (c) result of GASP — average fitness vs. number of evaluations for environments E1, E2, E3, with the optima of E1, E2, and E3 marked (artificial fitness).]

SLIDE 14

Approach: Genetic Algorithms

Genetic Algorithms (GAs)

Optimization, adaptation and learning algorithms inspired by the natural selection theory of evolution:

  • Generate an initial population of solution candidates.
  • Repeat:
    – Generate new individuals using crossover and mutation operations.
    – Evaluate individuals by the fitness function.
    – Select good individuals as survivors.
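The loop above can be sketched in a few lines. This is a minimal real-coded GA; the sphere fitness, blend crossover, and Gaussian mutation are illustrative choices, not the operators used later in the talk (UNDX):

```python
import random

def fitness(x):
    # Illustrative fitness: sphere function (minimization).
    return sum(xi * xi for xi in x)

def crossover(p1, p2):
    # Blend crossover: each child component lies between the parents'.
    return [a + random.random() * (b - a) for a, b in zip(p1, p2)]

def mutate(x, rate=0.1, scale=0.1):
    # Gaussian mutation applied component-wise with probability `rate`.
    return [xi + random.gauss(0, scale) if random.random() < rate else xi
            for xi in x]

def simple_ga(dim=5, pop_size=20, generations=50):
    # Generate an initial population of solution candidates.
    pop = [[random.uniform(-1, 1) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Generate new individuals using crossover and mutation operations.
        children = [mutate(crossover(*random.sample(pop, 2)))
                    for _ in range(pop_size)]
        # Evaluate by the fitness function; select good individuals as survivors.
        pop = sorted(pop + children, key=fitness)[:pop_size]
    return pop[0]

best = simple_ga()
```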

SLIDE 15

Advantages of GAs

  • Direct optimization method that uses only fitness function values,
  • Stochastic search for global optimization, and
  • Robustness of population-based search.

SLIDE 16

GAs for Noisy Fitness Functions

Problem

    min_x ⟨F(x)⟩,   F(x) = f(x) + δ

where x: continuous decision variable, F(x): observation of the fitness value, f(x): true fitness function, δ: additive noise with ⟨δ⟩ = 0.

SLIDE 17

GA Approaches to Noisy Fitness

Application of Conventional GAs:

  • Use the self-averaging nature of population-based search [Tsutsui, Ghosh 1997].
  • Requires large population sizes and takes a long time to converge.

GA with Multiple Sampling:

  • Sample the fitness value several times for each individual and use the mean [Fitzpatrick, Grefenstette 1988].
  • Reduces variance without specific assumptions on the fitness function.
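The multiple-sampling idea amounts to averaging repeated noisy observations; a sketch, where the sphere function and the Gaussian noise model are my illustrative choices:

```python
import random

def noisy_fitness(x, sigma=1.0):
    # Observation F(x) = f(x) + delta, delta ~ N(0, sigma^2).
    true_value = sum(xi * xi for xi in x)
    return true_value + random.gauss(0, sigma)

def sampled_fitness(x, n_samples=10, sigma=1.0):
    # Mean of n_samples observations: the noise variance shrinks
    # from sigma^2 to sigma^2 / n_samples.
    return sum(noisy_fitness(x, sigma) for _ in range(n_samples)) / n_samples
```

The cost is n_samples times more fitness evaluations per individual, which is exactly the drawback when the evaluation budget is severely restricted.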

SLIDE 18

  • Improvement by adjustment and restriction of sampling [Branke 1998].
  • Requires a large number of fitness evaluations.

Referring to Fitness Values of Other Individuals:

  • Evaluate an individual using the fitness values of nearby individuals.
  • Requires some assumption (model) of the fitness function.
  • Use the fitness values of parents [Tamaki, Arai 1997; Tanooka et al. 1999]: systematic error due to selection.
  • Use nearby individuals [Branke 1998].
  • Memory-based Fitness Estimation GA (MFEGA) [Sano, Kita 2000].

SLIDE 19

Memory-based Fitness Estimation GA

Concept of MFEGA

  • Aim: to use the fitness values sampled during the search as far as possible.
  • Store sampled fitness values into memory as the search history.
  • Introduce a simple stochastic model of fitness values for estimation.
  • Estimate the fitness values at points of interest using the history, for the selection operation in the GA.

SLIDE 20

A Stochastic Model of Fitness Functions

  • Fitness values of individuals F(h) are distributed randomly around the fitness value at the point of interest x.
  • The variance of the fitness value depends only on the distance d of h from the point of interest x:

    f(h) ∼ N(f(x), kd)
    δ ∼ N(0, σ²)
    F(h) = f(h) + δ ∼ N(f(x), kd + σ²)

SLIDE 21

[Figure: Stochastic model of noisy fitness — observed values F(h) at distance d from x are scattered around F(x) with variance kd + σ².]

SLIDE 22
Estimation of Fitness

ML estimation:

    f̃(x) = [ F(x) + Σ_{l=2}^{H} F(h_l) / (k′d_l + 1) ] / [ 1 + Σ_{l=2}^{H} 1 / (k′d_l + 1) ]

where h_l, l = 1, ..., H are the H individuals in the search history, d_l is the distance of h_l from x, and k′ = k/σ².

Estimation of k′: ML estimation assuming f(x) to be the mean of the fitness values of the solutions near the best solution found.
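A sketch of this estimator. The Euclidean distance and a flat list of (point, value) pairs for the history are my assumptions, and k′ is passed in as a constant rather than re-estimated by maximum likelihood as in the talk:

```python
import math

def estimate_fitness(x, F_x, history, k_prime):
    # f~(x): weighted mean of the sample at x and the stored samples,
    # each history entry (h, F(h)) weighted by 1 / (k' d + 1),
    # where d is the distance of h from the point of interest x.
    num, den = F_x, 1.0
    for h, F_h in history:
        d = math.dist(x, h)
        w = 1.0 / (k_prime * d + 1.0)
        num += w * F_h
        den += w
    return num / den
```

With an empty history the estimate reduces to the raw observation F(x); a stored sample taken at x itself (d = 0) gets the same weight as F(x), while far-away samples contribute little.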

SLIDE 23

Prototype Algorithm of the MFEGA

Use UNDX [Ono, Kobayashi 1997] for crossover; no mutation.

(Initialization)

  • 1. Initialize the population of M individuals x1, ..., xM randomly.
  • 2. Let the evaluation counter e = 0. Set the maximum number of evaluations to E.
  • 3. Let the history H = ∅.

(Main Loop)

  • 4. Choose xp1 and xp2 from the population as parents.
  • 5. Produce xc1, ..., xcC by applying the crossover to the parents.
  • 6. Let y1 = xp1, y2 = xp2, and yi+2 = xci, i = 1, ..., C. Call F = {y1, ..., yC+2} a family.

SLIDE 24

  • 7. Sample the fitness value F(yi) for each yi, i = 1, ..., C + 2.
  • 8. Let e = e + C + 2.
  • 9. Store the sampled values into the history H, i.e., H = H ∪ {(yi, F(yi)) | i = 1, ..., C + 2}.

(Estimation of the Parameter)

  • 10. Select the individual hmin having the smallest sampled fitness value from H.
  • 11. Estimate k′ and σ².

(Remedy for extrapolative search — tested-MFEGA)

  • 12. Find the individual yBest that has the smallest sampled fitness value in F.
  • 13. Let YAccept = {yi ∈ F such that F(yi) < F(yBest) + Z}.

SLIDE 25

(Estimation of Fitness and Generation Alternation)

  • 14. Estimate f̃(yi) for each yi ∈ YAccept.
  • 15. Substitute the two individuals having the smallest f̃(yi) among YAccept into xp1, xp2.
  • 16. If e ≤ E, go to Step 4; otherwise terminate the algorithm.
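The steps above can be sketched as a single loop. Here UNDX is replaced by a simple blend crossover, k′ is held constant instead of being re-estimated (Step 11), and the tested-MFEGA acceptance test (Steps 12-13) is omitted, so this is a structural sketch only:

```python
import math
import random

def noisy_fitness(x, sigma=1.0):
    # Noisy sphere function as an illustrative objective.
    return sum(xi * xi for xi in x) + random.gauss(0, sigma)

def estimate(x, F_x, history, k_prime):
    # Memory-based fitness estimate f~(x) (Step 14).
    num, den = F_x, 1.0
    for h, F_h in history:
        w = 1.0 / (k_prime * math.dist(x, h) + 1.0)
        num += w * F_h
        den += w
    return num / den

def mfega(dim=3, M=30, C=5, E=1400, k_prime=1.0):
    # Steps 1-3: random population, evaluation counter, empty history.
    pop = [[random.uniform(-0.5, 0.5) for _ in range(dim)] for _ in range(M)]
    history, e = [], 0
    while e <= E:
        # Steps 4-6: choose parents, build a family of C children + 2 parents.
        i, j = random.sample(range(M), 2)
        p1, p2 = pop[i], pop[j]
        children = [[a + random.random() * (b - a) for a, b in zip(p1, p2)]
                    for _ in range(C)]
        family = [p1, p2] + children
        # Steps 7-8: one fitness sample per family member.
        sampled = [(y, noisy_fitness(y)) for y in family]
        e += C + 2
        # Steps 14-15: the two members with the smallest estimated
        # fitness replace the parents in the population.
        ranked = sorted(sampled,
                        key=lambda yf: estimate(yf[0], yf[1], history, k_prime))
        pop[i], pop[j] = ranked[0][0], ranked[1][0]
        # Step 9: store the new samples in the history.
        history.extend(sampled)
    return pop
```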

SLIDE 26

Numerical Examples

Test Function:

  • Sphere function:

    F1(x) = Σ_{j=1}^{10} xj² + δ,   δ ∼ N(0, σ²_F1),   σ²_F1 = 1.0

  • Ill-scaled function.
  • Offset of the optimum.

Initial population: sampled randomly in [−0.5, 0.5]^10.
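The test function and initial population read, for instance, as below; the ill-scaled and offset variants are only named on the slide, so their exact constants are not reproduced here:

```python
import random

def f1(x, sigma2=1.0):
    # F1(x) = sum_{j=1}^{10} x_j^2 + delta, delta ~ N(0, sigma2).
    return sum(xi * xi for xi in x) + random.gauss(0, sigma2 ** 0.5)

def initial_population(M=30, dim=10):
    # Initial population sampled randomly in [-0.5, 0.5]^10.
    return [[random.uniform(-0.5, 0.5) for _ in range(dim)]
            for _ in range(M)]
```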

SLIDE 27

Tested Algorithms:

  • Standard GA: uses a single fitness sample per evaluation.
  • Sample 10-GA: uses the mean of 10 fitness samples.
  • MFEGA.
  • tested-MFEGA.

Parameters: M = 30, C = 5, max. evaluations = 1400.

SLIDE 28

[Figure: Results of experiments — best fitness fBest vs. number of evaluations for Noiseless GA, MFEGA, tested-MFEGA, Sample 10-GA, and Standard GA on (a) F1, (b) the ill-scaled function, and (c) the offset-optimum function.]

SLIDE 29

Applications

  • Application to Engine Control for Motorcycle.
  • Application to Multi-car Elevator Control.

SLIDE 30

GAs for Adaptation to Changing Environments

What can we know about the environmental change?

  • The change is small.
  • Detection, estimation, or prediction is possible.
  • The change is recurrent.

GA approaches

Search-based Approach:

  • Keeping the diversity of the population is important.

SLIDE 31

  • Add randomly generated individuals; increase the mutation rate (hyper-mutation) [Grefenstette 1992; Cobb, Grefenstette 1993].
  • Selection operation considering maintenance of diversity [Mori, Kita, Nishikawa 1996].

Memory-based Approach:

  • Memorize the results of adaptation and recall them.
  • Redundant genetic representation such as diploidy [Goldberg 1989; Ookura 1995].
  • Keep memory explicitly, as in the Immune Algorithm [Mori et al. 1993] and the Memory-based TDGA [Mori et al. 1997].
  • GA using sub-populations (GASP) [Sano et al. 2002].

SLIDE 32

A Model of Changing Environment

  • The general problem is difficult to solve.
  • And of no use in realistic applications.
  • Hence, start the discussion from a practical application.
  • Application:
    – Engine control of a motor boat.
    – Environmental change: steering operations ‘go straight’, ‘slalom’, and ‘turn’.
    – Goal: keep the engine performance constant.
  • Problem: several environments switch randomly.

SLIDE 33

[Figure: Model of changing environment — environments E1, E2, E3 with fitness functions f(x, E1), f(x, E2), f(x, E3) and random transitions among them.]

SLIDE 34

Control and Optimization Scheme

  • Prior estimation: estimation of the environment before applying the control scheme.
  • Posterior estimation: estimation of the environment after operation. More accurate than the prior estimation, but may not be perfect.
  • Performance evaluation.
  • To construct the optimizer using this information.

SLIDE 35

[Figure: Control scheme in a changing environment — START → prior estimation of the environment → choice of a control scheme among the population → adoption of the control scheme → operation of the system → evaluation of the performance → posterior estimation of the environment → update of the population of control schemes, then back to the prior estimation.]

SLIDE 36

Genetic Algorithm using Sub-Population (GASP)

Concept of GASP

  • Prepare a sub-population corresponding to each environment that appears.
  • Switch sub-populations according to the prior/posterior estimations.

Prototype Algorithm

  • 1. Initialize the individuals in each sub-population.
  • 2. For each sub-population, prepare a family by selecting parents and applying crossover to them.

SLIDE 37

  • 3. Obtain a prior estimation.
  • 4. Apply a control scheme from the family corresponding to the prior estimation.
  • 5. Operate the system.
  • 6. Obtain the evaluation of the applied control scheme.
  • 7. Obtain a posterior estimation.
  • 8. Put the evaluated control scheme into the family corresponding to the posterior estimation.
  • 9. If all the members of a family have been evaluated, apply the selection operation to the corresponding sub-population and prepare a new family for it.
  • 10. Go to Step 3 until the termination criterion holds.
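The ten steps above can be sketched as follows. The environment estimators and the `operate` routine are stubs to be supplied by the application (sensors and the real system in the boat example), and the scheme representation, blend crossover, and minimization-based selection rule are my illustrative choices:

```python
import random

def gasp(operate, prior_env, posterior_env, n_envs=3,
         subpop_size=6, family_size=4, steps=60):
    # Sub-population entries are (scheme, fitness); fitness None = unevaluated.
    def child(subpop):
        # Select two parents and blend-cross them (Steps 2 and 9).
        p1, p2 = random.sample([s for s, _ in subpop], 2)
        return [a + random.random() * (b - a) for a, b in zip(p1, p2)]

    # Step 1: initialize the individuals in each sub-population.
    subpops = [[([random.uniform(-1, 1) for _ in range(2)], None)
                for _ in range(subpop_size)] for _ in range(n_envs)]
    families = [[child(sp) for _ in range(family_size)] for sp in subpops]
    done = [[] for _ in range(n_envs)]       # evaluated family members

    for _ in range(steps):
        env = prior_env()                                        # Step 3
        # Step 4: take a scheme from the family matching the prior estimate.
        scheme = families[env].pop() if families[env] else child(subpops[env])
        perf = operate(scheme)                                   # Steps 5-6
        true_env = posterior_env(env)                            # Step 7
        done[true_env].append((scheme, perf))                    # Step 8
        if len(done[true_env]) >= family_size:                   # Step 9
            merged = subpops[true_env] + done[true_env]
            merged.sort(key=lambda sf: float('inf') if sf[1] is None else sf[1])
            subpops[true_env] = merged[:subpop_size]
            done[true_env] = []
            families[true_env] = [child(subpops[true_env])
                                  for _ in range(family_size)]
    return subpops                                               # Step 10
```

Minimization is assumed; unevaluated members sort last, so they are gradually displaced by evaluated control schemes.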

SLIDE 38

Application

  • Control of Engine for Motor Boat
