Population-Based Search (2-3-16) - PowerPoint PPT Presentation



SLIDE 1

Population-Based Search

2-3-16

SLIDE 2

Reading Quiz

Question 1: Which of the following attributes do stochastic beam search and genetic algorithms NOT share?

  a) a temperature parameter
  b) a population of states
  c) probabilistic survival of individuals
  d) exhaustive generation of neighbors

SLIDE 3

Simulated Annealing

state = get_candidate()
best_state = state
temp = INIT_TEMP
for round = 1:MAX_ITERS:
    neighbor = random_step(state)
    if cost(neighbor) < cost(best_state):
        best_state = neighbor
    if accept(state, neighbor, temp):
        state = neighbor
    temp *= DECAY
return best_state

function accept(state, neighbor, temp):
    delta = cost(state) - cost(neighbor)
    r ~ U[0,1]
    return r < e^(delta / temp)
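A minimal runnable sketch of this pseudocode in Python. The toy problem (minimizing (x - 3)^2 over the reals with uniform random steps) is an assumption for demonstration, not from the slides; the `delta >= 0` shortcut in `accept` is behavior-equivalent to the slide's test, since delta >= 0 makes e^(delta/temp) >= 1, and it avoids overflow once the temperature becomes tiny.

```python
import math
import random

INIT_TEMP, DECAY, MAX_ITERS = 20.0, 0.95, 1000

def accept(state, neighbor, temp, cost):
    # Always accept improvements; accept worsening moves with
    # probability e^(delta / temp), which shrinks as temp decays.
    delta = cost(state) - cost(neighbor)
    if delta >= 0:
        return True
    return random.random() < math.exp(delta / temp)

def simulated_annealing(get_candidate, random_step, cost):
    state = get_candidate()
    best_state = state
    temp = INIT_TEMP
    for _ in range(MAX_ITERS):
        neighbor = random_step(state)
        if cost(neighbor) < cost(best_state):
            best_state = neighbor
        if accept(state, neighbor, temp, cost):
            state = neighbor
        temp *= DECAY
    return best_state

# Toy usage (assumed problem): minimize (x - 3)^2 over the reals.
random.seed(0)
result = simulated_annealing(
    get_candidate=lambda: random.uniform(-10, 10),
    random_step=lambda x: x + random.uniform(-1, 1),
    cost=lambda x: (x - 3) ** 2,
)
```

Early on the high temperature makes this close to a random walk; as temp decays it behaves like hill climbing, which is why `best_state` is tracked separately from `state`.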

SLIDE 4

Exercise

INIT_TEMP = 20; DECAY = .95; round = 4

Given the following state:

  • cost(DC, Baltimore, Boston, Philly, NY) = 16.6

If the following neighbor is generated, what is the probability that it is accepted?

  • cost(Boston, Baltimore, DC, Philly, NY) = 13.8

If the following neighbor is generated, what is the probability that it is accepted?

  • cost(Philly, Baltimore, Boston, DC, NY) = 19.7

function accept(state, neighbor, temp):
    delta = cost(state) - cost(neighbor)
    r ~ U[0,1]
    return r < e^(delta / temp)
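A worked version of this exercise, under the assumption that "round = 4" means the temperature has already decayed three times (temp = 20 * 0.95^3, roughly 17.15). Since r ~ U[0,1], the acceptance probability is min(1, e^(delta/temp)).

```python
import math

INIT_TEMP, DECAY = 20.0, 0.95
temp = INIT_TEMP * DECAY ** 3  # assumed: three decays before round 4 (~17.15)

def accept_prob(cost_state, cost_neighbor, temp):
    delta = cost_state - cost_neighbor  # positive when the neighbor is better
    return min(1.0, math.exp(delta / temp))

p_better = accept_prob(16.6, 13.8, temp)  # delta = +2.8 -> accepted with probability 1
p_worse = accept_prob(16.6, 19.7, temp)   # delta = -3.1 -> e^(-3.1/17.15), about 0.83
```

The lower-cost neighbor is always accepted; the higher-cost neighbor is still accepted most of the time at this temperature, which is the point of annealing early in the search.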

SLIDE 5

Choosing a temperature

SLIDE 6

Choosing a decay rate

SLIDE 7

Beam Search

population = [POP_SIZE random states]
temp = INIT_TEMP
for i = 1:MAX_ITERS:
    candidates = []
    for individual in population:
        add all neighbors of individual to candidates
    population = [POP_SIZE lowest-cost candidates]
    temp *= DECAY
return best state encountered
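A runnable sketch of deterministic beam search. The temperature in the pseudocode above only matters for the stochastic variant on the next slides, so it is omitted here; the toy problem (minimizing |x - 42| over the integers) and the neighbor function are assumptions for demonstration.

```python
import random

POP_SIZE, MAX_ITERS = 4, 50

def beam_search(neighbors, cost, initial_states):
    # Each round, expand every individual and keep only the
    # POP_SIZE lowest-cost candidates (the "beam").
    population = list(initial_states)
    best = min(population, key=cost)
    for _ in range(MAX_ITERS):
        candidates = []
        for individual in population:
            candidates.extend(neighbors(individual))
        population = sorted(candidates, key=cost)[:POP_SIZE]
        best = min([best] + population, key=cost)
    return best

# Toy usage (assumed problem): minimize |x - 42|, stepping by +/-1 or +/-3.
random.seed(1)
start = [random.randint(0, 100) for _ in range(POP_SIZE)]
result = beam_search(
    neighbors=lambda x: [x - 3, x - 1, x + 1, x + 3],
    cost=lambda x: abs(x - 42),
    initial_states=start,
)
```

Because selection is purely greedy, the beam can collapse onto near-identical states; that loss of diversity is what the stochastic variant addresses.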

SLIDE 8

Beam Search Illustrated

SLIDE 9

Stochastic Beam Search

population = [POP_SIZE random states]
temp = INIT_TEMP
for i = 1:MAX_ITERS:
    candidates = []
    for individual in population:
        add all neighbors of individual to candidates
    population = gibbs_samples(candidates, temp, POP_SIZE)
    temp *= DECAY
return best state encountered

SLIDE 10

Stochastic Beam Search - helper function

function gibbs_samples(candidates, temp, POP_SIZE):
    weights = [e^(-cost(n)/temp) for each candidate n]
    distribution = [weights[n] / sum(weights) for each candidate n]
    population = [POP_SIZE random draws from distribution]
    return population

Note that the samples are drawn independently, which means that some states may be repeated in the new population.
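A direct Python translation of the helper, taking `cost` as a parameter. The example call at the bottom (candidates that are their own costs) is an assumption purely for demonstration.

```python
import math
import random

def gibbs_samples(candidates, temp, pop_size, cost):
    # Boltzmann/Gibbs weighting: lower cost -> larger weight,
    # and low temperature sharpens the preference for cheap states.
    weights = [math.exp(-cost(c) / temp) for c in candidates]
    total = sum(weights)
    distribution = [w / total for w in weights]
    # Independent draws with replacement, so states may repeat.
    return random.choices(candidates, weights=distribution, k=pop_size)

# Toy usage (assumed): candidates whose cost is the value itself.
random.seed(2)
population = gibbs_samples([10.0, 11.0, 30.0], temp=5.0, pop_size=4, cost=lambda c: c)
```

`random.choices` samples with replacement, matching the note above that states may be repeated in the new population.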

SLIDE 11

Exercise: construct the distribution

INIT_TEMP = 20; DECAY = .95; round = 4

Suppose we have the following candidates:

  • cost(Boston, DC, Philly, NY) = 13.8
  • cost(Philly, Boston, DC, NY) = 16.6
  • cost(NY, Boston, Philly, DC) = 13.8
  • cost(DC, Philly, Boston, NY) = 13.8
  • cost(DC, NY, Philly, Boston) = 16.6
  • cost(DC, Boston, NY, Philly) = 13.8

What is the probability distribution from which the next population will be drawn?

w = [e^(-cost(n)/temp) for each candidate n]
distr = [w[n] / sum(w) for each candidate n]
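A worked version of this exercise, again assuming three decays before round 4 (temp = 20 * 0.95^3, roughly 17.15). Candidates with equal cost get equal probability, so the distribution has only two distinct values.

```python
import math

INIT_TEMP, DECAY = 20.0, 0.95
temp = INIT_TEMP * DECAY ** 3  # assumed: three decays before round 4 (~17.15)

costs = [13.8, 16.6, 13.8, 13.8, 16.6, 13.8]  # the six candidate tours, in order
w = [math.exp(-c / temp) for c in costs]
distr = [wi / sum(w) for wi in w]
# The four 13.8-cost tours share one probability (~0.175),
# the two 16.6-cost tours another (~0.149).
```

Note how mild the preference is at this temperature: the cheaper tours are only slightly favored, which keeps the population diverse early in the search.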

SLIDE 12

Genetic Algorithms

population = [POP_SIZE random states]
temp = INIT_TEMP
for i = 1:MAX_ITERS:
    new_population = []
    for j = 1:POP_SIZE:
        parent1, parent2 = select(population)
        add offspring(parent1, parent2) to new_population
    population = new_population
    temp *= DECAY
return best state encountered

SLIDE 13

Selection

There are many ways this could be done, but a good one is Gibbs sampling. Call the function from before with POP_SIZE = 2. This means we need to add a temperature parameter.

function gibbs_samples(candidates, temp, POP_SIZE):
    weights = [e^(-cost(n)/temp) for each candidate n]
    distribution = [weights[n] / sum(weights) for each candidate n]
    population = [POP_SIZE random draws from distribution]
    return population

SLIDE 14

Reproduction: crossover and mutation

function offspring(parent1, parent2):
    cross_point = random location in state representation
    child = parent1[start:cross_point] + parent2[cross_point:end]
    for each variable in child:
        if (small mutation probability):
            give it a random new value
    return child
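A runnable sketch of single-point crossover with per-variable mutation. The mutation rate of 0.02 matches the N-Queens example on the next slide; `num_values` (the range of values a variable can take) is a parameter I've added so the helper stays self-contained.

```python
import random

MUTATION_PROB = 0.02  # per-variable mutation rate (matches the next slide)

def offspring(parent1, parent2, num_values):
    # Single-point crossover: prefix from parent1, suffix from parent2.
    cross_point = random.randint(0, len(parent1) - 1)
    child = list(parent1[:cross_point]) + list(parent2[cross_point:])
    # Independent per-variable mutation to a random new value.
    for i in range(len(child)):
        if random.random() < MUTATION_PROB:
            child[i] = random.randint(0, num_values - 1)
    return tuple(child)

# Usage with the parents from the N-Queens slide (whose values span 0..6).
random.seed(3)
child = offspring((1, 2, 4, 2, 2, 5), (2, 6, 4, 6, 5, 0), num_values=7)
```

Mutation is what lets the search reintroduce values that crossover alone could never recover once they vanish from the population.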

SLIDE 15

Reproduction in N-Queens

parent1 = (1,2,4,2,2,5), parent2 = (2,6,4,6,5,0)
cross_point = random integer in [0,5] # suppose it’s 4

  • offspring after crossover = (1,2,4,2,5,0)

each element mutated with probability .02 # suppose the element at index 1 is mutated
mutated elements get random new values # suppose the new value is 0

  • offspring returned = (1,0,4,2,5,0)

(board diagram of the returned offspring)

SLIDE 16

Discussion: state representation for GAs

What effect does crossover have on the state representation for these problems? Is there an alternative state representation that would work better?