Local search, Han Hoogeveen, April 28, 2015 (PowerPoint presentation)



SLIDE 1

Local search

Han Hoogeveen

April 28, 2015

SLIDE 2

Basic recipe

Initialisation

  • 0. Determine initial solution x

Iteration

  • 1. Determine ‘neighbor’ y of x by changing x a little
  • 2. Decide to reject or accept y as your current solution
  • 3. Go to Step 1, unless some stopping criterion is satisfied.

Remarks:

◮ It usually works very well, but there are no guarantees.
◮ It takes some computation time (not real-time).
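The recipe above can be sketched as one generic loop. A minimal sketch in Python; the toy problem (minimizing x² over the integers) and all function names are illustrative assumptions, not part of the slides:

```python
import random

def local_search(initial, neighbor, accept, max_iters=10_000):
    """Generic local-search skeleton following the recipe above."""
    x = initial                          # Step 0: initial solution
    for _ in range(max_iters):           # Step 3: stopping criterion
        y = neighbor(x)                  # Step 1: change x a little
        if accept(x, y):                 # Step 2: reject or accept y
            x = y
    return x

# Toy illustration (hypothetical): minimize f(x) = x^2 over the integers.
best = local_search(
    initial=50,
    neighbor=lambda x: x + random.choice([-1, 1]),
    accept=lambda x, y: y * y <= x * x,  # here: accept when not worse
)
```

The `neighbor` and `accept` parameters are exactly the two design choices the later slides vary: the neighborhood structure and the acceptance criterion.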

SLIDE 3

Naming

◮ Neighbor of solution x: an alternative solution that can be obtained by changing x according to some given recipe (algorithm).

◮ Neighborhood of solution x: the set containing all neighbors of x.

◮ Neighborhood structure: the recipe (algorithm) to determine neighbors.

From now on we assume that we are looking for a feasible solution x with minimum cost; the cost of x is denoted by f(x) (it may be quite complicated to define f(x)).

SLIDE 4

Example: the traveling salesman problem

Definition TSP: We are given a set of vertices (cities) with a given distance between each pair of cities. The goal is to find a tour of minimum length that visits each city exactly once. We are looking for a subgraph of minimum length such that

◮ each city is connected to exactly two other cities;
◮ the selected edges form one tour (without subcycles).
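For TSP, the cost f(x) of a tour x is its total length. A minimal sketch, assuming the distances are given as a matrix; the 4-city instance is a made-up illustration:

```python
def tour_length(tour, dist):
    """Length of a closed tour: distances between consecutive cities,
    plus the closing edge from the last city back to the first."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

# Hypothetical symmetric 4-city instance.
dist = [
    [0, 1, 4, 2],
    [1, 0, 3, 5],
    [4, 3, 0, 1],
    [2, 5, 1, 0],
]
# The tour 0 -> 1 -> 2 -> 3 -> 0 has length 1 + 3 + 1 + 2 = 7.
```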

SLIDE 5

Example

(figure: an example instance with 20 cities, numbered 1-20)

SLIDE 6

A not so good solution

(figure: a poor tour through the 20 cities)

SLIDE 7

2-Opt

(figure: a 2-opt move on the tour segment through cities 14-19)

SLIDE 8

2-Opt

(figure: the 2-opt move on cities 14-19, continued)

SLIDE 9

2-Opt

(figure: the tour through cities 14-19 after the 2-opt move)
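The 2-opt move illustrated above removes two edges and reconnects the tour by reversing the segment in between. A minimal sketch; representing the tour as a Python list is an assumption:

```python
def two_opt_move(tour, i, j):
    """2-opt: remove edges (tour[i-1], tour[i]) and (tour[j], tour[j+1]),
    then reconnect by reversing the segment tour[i..j]."""
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

# Reversing positions 1..3 of [0, 1, 2, 3, 4] gives [0, 3, 2, 1, 4].
```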

SLIDE 10

2-Opt (Shift)

(figure: a shift move on the tour segment through cities 5-10)

SLIDE 11

2-Opt (Shift)

(figure: the shift move on cities 5-10, continued)

SLIDE 12

2-Opt (Shift)

(figure: the shift move on the tour segment through cities 4-11)

SLIDE 13

Examples of local search methods

◮ Iterative improvement
◮ Simulated annealing
◮ Tabu search
◮ Genetic algorithms
◮ Ant colony optimization
◮ ... (many other examples from nature)

SLIDE 14

Iterative improvement

Iteration

  • 1. Determine a neighbor y of x by changing x a little
  • 2. If f(y) ≤ f(x), then x ← y; go to Step 1, unless you stop because of some stopping criterion.

The acceptance criterion is that the cost does not increase.
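The iteration above, sketched in Python; the toy objective f(x) = (x − 3)² and the ±1 neighborhood are made-up illustrations:

```python
import random

def iterative_improvement(x, f, neighbor, max_iters=10_000):
    """Accept a neighbor only when the cost does not increase."""
    for _ in range(max_iters):        # stop after a fixed number of iterations
        y = neighbor(x)               # Step 1: determine a neighbor
        if f(y) <= f(x):              # Step 2: acceptance criterion
            x = y
    return x

# Toy illustration: minimize f(x) = (x - 3)^2 over the integers.
best = iterative_improvement(
    x=10,
    f=lambda x: (x - 3) ** 2,
    neighbor=lambda x: x + random.choice([-1, 1]),
)
```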

SLIDE 15

Iterative improvement (2)

Disadvantage: When all neighbors have higher cost, you cannot escape from a solution x that may be much worse than the optimum (x is then a local optimum instead of a global optimum).

Possible remedy: Repeat the procedure with a large number of different initial solutions (multi-start).

Better remedy: Allow deteriorations. This leads to the methods

◮ Simulated annealing
◮ Tabu search

SLIDE 16

Simulated annealing in general

◮ Stochastic search process; decisions are made on the basis of a stochastic experiment.

◮ A neighbor y of x is chosen randomly from the neighborhood.

◮ Improvements are always accepted; deteriorations are accepted with a certain probability that depends on the size of the deterioration and the state of the process.

◮ Continue until some stopping criterion is met.

◮ Always remember the best solution so far.

SLIDE 17

Simulated annealing: iteration

  • 1. Choose neighbor y from the neighborhood of x; compute f(y).
  • 2. If f(y) ≤ f(x) (the cost of y is not higher), then x ← y (accept y as the new solution). If f(y) > f(x), then accept y with probability p (definition follows).
  • 3. If necessary, adjust the control parameter T, which indicates the state of the process.
  • 4. If the stopping criterion is not satisfied, then go to Step 1.
SLIDE 18

Simulated annealing: technical details

◮ T is the control parameter; it gets decreased over time.

◮ The start value of T is chosen such that in the beginning approximately half of the deteriorations get accepted (rule of thumb).

◮ Every Q iterations, T is decreased by multiplying it with α, where α = 0.99 or 0.95 (or something like that).

◮ Q is related to the size of the neighborhood (rule of thumb).

◮ The probability of accepting a deterioration is equal to

    p = exp((f(x) − f(y)) / T)
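Putting the iteration and these technical details together, a minimal sketch of the annealing loop; the parameter values and the toy objective are illustrative assumptions:

```python
import math
import random

def simulated_annealing(x, f, neighbor, T=10.0, alpha=0.95, Q=100,
                        max_iters=20_000):
    """Improvements are always accepted; a deterioration is accepted with
    probability p = exp((f(x) - f(y)) / T); every Q iterations the control
    parameter T is multiplied by the cooling factor alpha."""
    best = x
    for k in range(1, max_iters + 1):
        y = neighbor(x)
        if f(y) <= f(x) or random.random() < math.exp((f(x) - f(y)) / T):
            x = y
            if f(x) < f(best):
                best = x              # always remember the best solution so far
        if k % Q == 0:
            T *= alpha                # adjust the control parameter
    return best

# Toy illustration: minimize f(x) = (x - 7)^2 over the integers.
best = simulated_annealing(100, lambda x: (x - 7) ** 2,
                           lambda x: x + random.choice([-1, 1]))
```

Note that when f(y) > f(x) the exponent is negative, so p is always a valid probability in (0, 1), and it shrinks as T cools down.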

SLIDE 19

Usual stopping criteria

◮ The number of iterations has reached a certain limit (time).

◮ The number of accepted deteriorations has dropped to 1% or 2%.

◮ The best solution has not been improved for a long time.

After the process has stopped, you can allow a restart by increasing T again and choosing for x:

◮ The best solution so far, to which you apply some major changes.

◮ An old solution that looks interesting.

SLIDE 20

Note that

◮ Simulated annealing works only if it is possible to make ‘small’ changes.

◮ The start values of the parameters can differ from the ones above; you may need some tweaking.

◮ You are allowed to choose some optimal features of y, as long as there is enough randomness.

SLIDE 21

Tabu search: general

◮ Do not take just any neighbor y, but the best one (or one that is better than x).

◮ Keep track of a ‘tabu list’ that contains former solutions (or characteristics of former solutions); these are ‘tabu’ and cannot be chosen as y.

◮ Continue until some stopping criterion is met.

◮ Always remember the best solution so far.

SLIDE 22

Tabu search: iteration

  • 1. Choose the first neighbor y of x such that f(y) ≤ f(x). If such a y does not exist, then determine the best neighbor of x (the order of search is important here).
  • 2. If y is not tabu, then x ← y. If y is tabu, then continue your search, unless y improves the best solution so far.
  • 3. Adjust the tabu list: add x (or characteristics of x); remove the oldest information from the tabu list.
  • 4. If the stopping criterion is not satisfied, then go to Step 1.
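A sketch of the tabu-search iteration. Two simplifications relative to the slides: it scans the whole neighborhood for the best admissible neighbor (best-improvement) rather than using the first-improvement order described above, and the tabu list stores full solutions rather than characteristics. The toy problem is an illustrative assumption:

```python
from collections import deque

def tabu_search(x, f, neighbors, tabu_len=10, max_iters=1_000):
    """Move to the best admissible neighbor (even if it is worse); a
    neighbor is admissible if it is not tabu, or if it improves the best
    solution so far (the aspiration criterion of Step 2)."""
    best = x
    tabu = deque([x], maxlen=tabu_len)   # oldest entries drop off automatically
    for _ in range(max_iters):
        candidates = [y for y in neighbors(x)
                      if y not in tabu or f(y) < f(best)]
        if not candidates:
            break                        # every neighbor is tabu
        x = min(candidates, key=f)       # best admissible neighbor
        tabu.append(x)                   # Step 3: adjust the tabu list
        if f(x) < f(best):
            best = x                     # remember the best solution so far
    return best

# Toy illustration: minimize f(x) = (x - 5)^2 over the integers.
best = tabu_search(0, lambda x: (x - 5) ** 2, lambda x: [x - 1, x + 1])
```

The fixed-length `deque` implements Step 3 directly: appending a new entry silently discards the oldest one once the list is full.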
SLIDE 23

Stopping criteria

◮ The maximum number of iterations has been reached.

◮ The maximum number of iterations since the last improvement has been reached.