CS486/686 Lecture Slides (c) 2005 K. Larson and P. Poupart


Local Search

CS 486/686, University of Waterloo, May 12, 2005


Outline

  • Iterative improvement algorithms
  • Hill climbing search
  • Simulated annealing
  • Genetic algorithms


Introduction

  • So far we have studied algorithms which systematically explore search spaces
    – Keep one or more paths in memory
    – When the goal is found, the solution consists of a path to the goal
  • For many problems the path is unimportant


Examples

  • Vehicle routing
  • Channel routing


Examples

  • Job shop scheduling
  • Boolean satisfiability, e.g. A v ~B v C, ~A v C v D, B v D v ~E, ~C v ~D v ~E, …


Introduction

  • Informal characterization
    – Combinatorial structure being optimized
    – There is a cost function to be optimized
  • At least we want to find a good solution
    – Searching all possible states is infeasible
    – No known algorithm for finding the solution efficiently
    – Some notion of similar states having similar costs


Example - TSP

  • Goal is to minimize the length of the route
  • Constructive method:
    – Start from scratch and build up a solution
  • Iterative improvement method:
    – Start with a solution and try to improve it


Constructive method

  • For the optimal solution we could use A*!
    – But we do not really need to know how we got to the solution – we just want the solution
    – Can be very expensive to run


Iterative improvement methods

  • Idea: Imagine all possible solutions laid out on a landscape
    – We want to find the highest (or lowest) point


Iterative improvement methods

  • 1. Start at some random point on the landscape
  • 2. Generate all possible points to move to
  • 3. Choose a point of improvement and move to it
  • 4. If you are stuck then restart


Iterative improvement methods

  • What does it mean to "generate points to move to"?
    – Sometimes called generating the move set
  • Depends on the application

TSP 2-swap
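As an illustration of a move set, the TSP 2-swap mentioned above can be read as reversing one contiguous segment of the tour (the classic 2-opt move); that reading, and the list-of-cities representation, are assumptions for this sketch:

```python
from itertools import combinations

def two_swap_moves(tour):
    """Move set for TSP: every tour obtained by reversing one
    contiguous segment (a 2-opt style 2-swap)."""
    for i, j in combinations(range(len(tour)), 2):
        yield tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

# All neighbours of a 4-city tour
moves = list(two_swap_moves([0, 1, 2, 3]))
```

With n cities this generates C(n, 2) neighbours per state, which fits the later advice that a good move set is "not too many, not too few".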


Hill climbing

  • 1. Start at some initial configuration S
  • 2. Let V = Eval(S)
  • 3. Let N = Move_Set(S)
  • 4. For each Xi ∈ N
    – Let Vmax = maxi Eval(Xi) and Xmax = argmaxi Eval(Xi)
  • 5. If Vmax ≤ V, return S
  • 6. Let S = Xmax and V = Vmax. Go to 3

“Like trying to find the peak of Mt Everest in the fog”, Russell and Norvig
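The pseudocode above translates directly into a short sketch; `eval_fn` and `move_set` are hypothetical stand-ins for the problem-specific Eval and Move_Set:

```python
def hill_climb(S, eval_fn, move_set):
    """Steepest-ascent hill climbing: repeatedly jump to the best
    neighbour; stop when no neighbour improves the current value."""
    V = eval_fn(S)                           # step 2
    while True:
        N = move_set(S)                      # step 3
        X_max = max(N, key=eval_fn)          # step 4
        V_max = eval_fn(X_max)
        if V_max <= V:                       # step 5: no improvement
            return S
        S, V = X_max, V_max                  # step 6, then back to 3

# Toy landscape: maximize -(x - 3)^2 over the integers, moves are +/- 1
best = hill_climb(0, lambda x: -(x - 3) ** 2, lambda x: [x - 1, x + 1])
```

On this toy unimodal landscape the loop walks straight to the peak at x = 3 and stops, since both neighbours there are worse.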


Hill Climbing

  • Always take a step in the direction that improves the current solution value the most
    – Greedy
  • Good things about hill climbing
    – Easy to program!
    – Requires no memory of where we have been!
    – It is important to have a "good" set of moves
      • Not too many, not too few


Hill Climbing

  • Issues with hill climbing
    – It can get stuck!
    – Local maximum (local minimum)
    – Plateaus

[Figure: objective function over the state space, marking the global maximum, a local maximum, a "flat" local maximum, a shoulder, and the current state.]


Improving on hill climbing

  • Plateaus
    – Allow for sideways moves, but be careful since we may move sideways forever!
  • Local maximum
    – Random restarts: "If at first you do not succeed, try, try again"
    – Random restarts work well in practice
  • Randomized hill climbing
    – Like hill climbing, except you choose a random state from the move set and move to it if it is better than the current state. Continue until you are bored
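The two fixes above can be combined in one sketch: randomized hill climbing (one random neighbour per step, move only on improvement) wrapped in random restarts. The step and restart counts are arbitrary illustrative choices:

```python
import random

def randomized_hill_climb(random_state, eval_fn, move_set,
                          steps=1000, restarts=10, rng=random):
    """Randomized hill climbing with random restarts: pick one random
    neighbour per step, accept it only if it improves, and start over
    from a fresh random state after each run."""
    best, best_val = None, float("-inf")
    for _ in range(restarts):
        S = random_state()
        V = eval_fn(S)
        for _ in range(steps):
            X = rng.choice(move_set(S))      # one random neighbour
            VX = eval_fn(X)
            if VX > V:                       # move only on improvement
                S, V = X, VX
        if V > best_val:                     # keep the best run's result
            best, best_val = S, V
    return best

# Toy example: maximize -(x - 7)^2 over the integers in [0, 20]
random.seed(0)
result = randomized_hill_climb(
    lambda: random.randint(0, 20),
    lambda x: -(x - 7) ** 2,
    lambda x: [max(0, x - 1), min(20, x + 1)])
```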


Hill climbing example: GSAT

Configuration: A=1, B=0, C=1, D=0, E=1

A v ~B v C     1
~A v C v D     1
B v D v ~E     0
~C v ~D v ~E   1
~A v ~C v E    1

Goal is to maximize the number of satisfied clauses: Eval(config) = # satisfied clauses

GSAT Move_Set: Flip any 1 variable

WALKSAT (Randomized GSAT):
  Pick a random unsatisfied clause; consider flipping each variable in the clause.
  If any improve Eval, then accept the best.
  If none improve Eval, then with prob p pick the move that is least bad; with prob (1-p) pick a random one.
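A runnable sketch of GSAT on the clauses above (just the greedy flip loop; random restarts and WALKSAT's random-walk step are omitted for brevity):

```python
import random

# The slide's clauses: each literal is (variable, is_positive)
clauses = [
    [("A", True),  ("B", False), ("C", True)],   # A v ~B v C
    [("A", False), ("C", True),  ("D", True)],   # ~A v C v D
    [("B", True),  ("D", True),  ("E", False)],  # B v D v ~E
    [("C", False), ("D", False), ("E", False)],  # ~C v ~D v ~E
    [("A", False), ("C", False), ("E", True)],   # ~A v ~C v E
]

def num_satisfied(config, clauses):
    """Eval(config) = number of satisfied clauses."""
    return sum(any(config[v] == s for v, s in c) for c in clauses)

def gsat(clauses, variables, max_flips=100, rng=random):
    """GSAT move set: flip any one variable; always take the best flip."""
    config = {v: rng.choice([True, False]) for v in variables}
    for _ in range(max_flips):
        if num_satisfied(config, clauses) == len(clauses):
            break
        def after_flip(v):                   # Eval after flipping v
            flipped = dict(config, **{v: not config[v]})
            return num_satisfied(flipped, clauses)
        v = max(variables, key=after_flip)   # greedy: best single flip
        config[v] = not config[v]
    return config

# The configuration from the slide satisfies 4 of the 5 clauses
config = {"A": True, "B": False, "C": True, "D": False, "E": True}
```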


Simulated Annealing

  • Hill climbing algorithms which never make downhill moves are incomplete
    – Can get stuck at local maxima (minima)
  • A random walk is complete but very inefficient

New idea: Allow the algorithm to make some "bad" moves in order to escape local maxima.


Simulated annealing

  • 1. Let S be the initial configuration and V = Eval(S)
  • 2. Let i be a random move from the move set and let Si be the next configuration, Vi = Eval(Si)
  • 3. If V < Vi then S = Si and V = Vi
  • 4. Else with probability p, S = Si and V = Vi
  • 5. Go to 2 until you are bored

Simulated annealing

  • How should we choose the probability of accepting a "bad" move?
    – Idea 1: p = 0.1 (or some other fixed value)?
    – Idea 2: Probability that decreases with time?
    – Idea 3: Probability that decreases with time and as V - Vi increases?


Selecting moves in simulated annealing

  • If new value Vi is better than old value V, then definitely move to the new solution
  • If new value Vi is worse than old value V, then move to the new solution with probability exp(-(V - Vi)/T)

Boltzmann distribution: T > 0 is a parameter called temperature. It starts high and decreases over time towards 0. If T is close to 0 then the probability of making a bad move is almost 0.
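Putting the loop and the Boltzmann acceptance rule together gives a minimal sketch; the geometric cooling schedule and all constants here are illustrative assumptions, not the slide's prescription:

```python
import math
import random

def simulated_annealing(S, eval_fn, random_move,
                        T0=10.0, cooling=0.95, steps=2000, rng=random):
    """Accept uphill moves always; accept a downhill move with the
    Boltzmann probability exp(-(V - Vi) / T); cool T towards 0."""
    V, T = eval_fn(S), T0
    for _ in range(steps):
        Si = random_move(S)
        Vi = eval_fn(Si)
        # exp(...) <= 1 in this branch, since Vi <= V when it is reached
        if Vi > V or rng.random() < math.exp(-(V - Vi) / T):
            S, V = Si, Vi
        T = max(T * cooling, 1e-9)           # keep T strictly positive
    return S

# Toy example: maximize -(x - 5)^2 over the integers, moves are +/- 1
random.seed(0)
best = simulated_annealing(0, lambda x: -(x - 5) ** 2,
                           lambda x: x + random.choice([-1, 1]))
```

Early on, T is large and most downhill moves are accepted (random walk); by the end, T is near zero and the loop behaves like randomized hill climbing, matching the exploration/exploitation phases described on the next slide.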


Properties of simulated annealing

  • If T is decreased slowly enough then simulated annealing is guaranteed (in theory) to reach the best solution
    – Annealing schedule is critical
  • When T is high: Exploratory phase (random walk)
  • When T is low: Exploitation phase (randomized hill climbing)


Genetic Algorithms

  • Problems are encoded into a representation which allows certain operations to occur
    – Usually a bit string is used
    – The representation is key – it needs to be thought out carefully
  • An encoded candidate solution is an individual
  • Each individual has a fitness, a numerical value associated with the quality of its solution
  • A population is a set of individuals
  • Populations change over generations by applying strategies to them


Typical genetic algorithm

  • Initialize: Population P consists of N random individuals (bit strings)
  • Evaluate: for each x ∈ P, compute fitness(x)
  • Loop
    – For i = 1 to N do
      • Choose 2 parents, each with probability proportional to fitness scores
      • Crossover the 2 parents to produce a new bit string (child)
      • With some small probability mutate the child
      • Add the child to the population
    – Until some child is fit enough or you get bored
  • Return the best child in the population according to the fitness function
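A runnable sketch of the loop above on bit strings; the OneMax fitness (count of 1 bits), the fitness epsilon, and all parameter values are illustrative assumptions:

```python
import random

def genetic_algorithm(fitness, n_bits=8, pop_size=20,
                      generations=50, p_mutate=0.05, rng=random):
    """Fitness-proportional parent selection, one-point crossover,
    and a small per-child mutation probability, as in the loop above."""
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # epsilon guards against an all-zero-fitness population
        weights = [fitness(x) + 1e-6 for x in pop]
        children = []
        for _ in range(pop_size):
            a, b = rng.choices(pop, weights=weights, k=2)  # proportional
            cut = rng.randrange(1, n_bits)                 # crossover point
            child = a[:cut] + b[cut:]
            if rng.random() < p_mutate:                    # mutate child
                i = rng.randrange(n_bits)
                child[i] = 1 - child[i]
            children.append(child)
        pop = children
        if max(fitness(x) for x in pop) == n_bits:         # fit enough
            break
    return max(pop, key=fitness)

# OneMax: evolve towards the all-ones string; fitness = number of 1 bits
random.seed(0)
best = genetic_algorithm(sum)
```

Here the whole population is replaced each generation; keeping the best individuals across generations (elitism) is a common refinement not shown on the slide.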


Crossover

  • Consists of combining parts of individuals to create new individuals
  • Choose a random crossover point
    – Cut the individuals there and swap the pieces

101|0101, 011|1110 → (crossover) → 011|0101, 101|1110

Implementation: use a crossover mask m. Given two parents a and b, the offspring are (a ∧ m) ∨ (b ∧ ~m) and (a ∧ ~m) ∨ (b ∧ m).
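The mask implementation can be sketched with integer bit operations; `n_bits` bounds the complement so that ~m stays within the string width (the function and parameter names are illustrative):

```python
def mask_crossover(a, b, m, n_bits=7):
    """Crossover with mask m: offspring are (a AND m) OR (b AND NOT m)
    and (a AND NOT m) OR (b AND m)."""
    not_m = ~m & ((1 << n_bits) - 1)     # complement limited to n_bits
    return (a & m) | (b & not_m), (a & not_m) | (b & m)

# The slide's example: parents 1010101 and 0111110, cut after the
# first 3 bits, i.e. mask m = 1110000
c1, c2 = mask_crossover(0b1010101, 0b0111110, 0b1110000)
```

A one-point crossover at position k is just the special case where m is k ones followed by zeros; an arbitrary mask also covers uniform crossover.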


Mutation

  • Mutation allows us to generate desirable features that are not present in the original population
  • Typically mutation just means flipping a bit in the string

100111 mutates to 100101


Genetic Algorithms

[Figure: one generation of a genetic algorithm on 8-digit strings: (a) initial population, (b) fitness function (values 24, 23, 20, 11), (c) selection with probabilities 29%, 31%, 26%, 14%, (d) cross-over, (e) mutation.]



Genetic algorithms and search

  • Why are genetic algorithms a type of search?
    – States: possible solutions
    – Operators: mutation, crossover, selection
    – Parallel search: several solutions are maintained in parallel
    – Hill-climbing on the fitness function
    – Mutation and crossover allow us to get out of local optima


Discussion of local search

  • Useful for optimization problems!
  • Often the second best way to solve a problem
    – If you can, use A* or linear programming or…
    – But local search is easy to program ☺
  • Hill climbing always moves in the (locally) best direction
    – Can get stuck, but random restarts can be really effective
  • Simulated annealing allows moves downhill


Next class

  • Constraint satisfaction problems (CSPs)
    – Russell and Norvig, Chapter 5 (mainly sections 5.1-5.3)