Genetic Algorithms

Evolution – Darwin’s Natural Selection
MSE 2400 EaLiCaRA – Dr. Tom Way

• IF there are organisms that reproduce, and
• IF offspring inherit traits from their progenitors, and
• IF there is variability of traits, and
• IF the environment cannot support all members of a growing population,
• THEN those members of the population with less-adaptive traits (determined by the environment) will die out, and
• THEN those members with more-adaptive traits (determined by the environment) will thrive.
The result is the evolution of species.

Basic Idea of the Principle of Natural Selection
• “Select The Best, Discard The Rest”

An Example of Natural Selection
• Giraffes have long necks. Giraffes with slightly longer necks could feed on leaves of higher branches when all the lower ones had been eaten off. They had a better chance of survival. The favorable characteristic propagated through generations of giraffes, and the evolved species now has long necks.
• NOTE: Longer necks may have been a deviant characteristic (mutation) initially, but since it was favorable, it was propagated over generations and is now an established trait. So, some mutations are beneficial.

Evolution Through Natural Selection
• Initial population of animals
• Struggle for existence – survival of the fittest
• Surviving individuals reproduce, propagate favorable characteristics
• Millions of years
• Evolved species (favorable characteristic now a trait of the species)

How Genetic Algorithms Work
• Genetic Algorithms implement optimization strategies by simulating the evolution of species through natural selection.
• They iteratively improve a set of possible answers to a problem by combining the best parts of possible answers to form (hopefully) better answers.

Genetic Algorithms Background
• Invented by John Holland, 1975
• Made popular by John Koza, 1992

Evolution
• Organisms (animals or plants) produce a number of offspring which are almost, but not entirely, like themselves.
• Extinction and adaptation: some of these offspring may survive to produce offspring of their own, and some won’t.
• The “better adapted” offspring are more likely to survive.
• Over time, later generations become better and better adapted.
• Genes and chromosomes: genes are the “instructions” for building an organism; a chromosome is a sequence of genes.
• Genetic Algorithms use this same process to “evolve” better programs.

Genetic Algorithm Concept
• A genetic algorithm (GA) introduces the principles of evolution and genetics into a search among possible solutions to a given problem.
• This is done by creating, within a machine, a population of individuals represented by chromosomes, in essence a set of character strings analogous to the DNA in our own chromosomes.

Survival of the Fittest
• The main principle of evolution used in GAs is “survival of the fittest”.
• Good solutions survive, while bad ones die.

So what is a genetic algorithm?
• Genetic algorithms are a randomized heuristic search strategy.
• Basic idea: simulate natural selection, where the population is composed of candidate solutions.
• The focus is on evolving a population from which strong and diverse candidates can emerge via mutation and crossover (mating).

Basic Algorithm
• Create an initial population, either random or “blank”.
• While the best candidate so far is not a solution:
  – Create a new population using successor functions.
  – Evaluate the fitness of each candidate in the population.
• Return the best candidate found.
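The basic algorithm above can be sketched as a short Python loop. This is a minimal sketch, assuming the caller supplies `init_population`, `fitness`, `is_solution`, and `make_successors`; all of these names are illustrative, not part of the slides.

```python
def genetic_algorithm(init_population, fitness, is_solution,
                      make_successors, generation_limit=1000):
    """Basic GA loop: keep creating successor populations until the
    best candidate passes the solution test (or the limit is reached)."""
    population = init_population()
    for _ in range(generation_limit):
        best = max(population, key=fitness)       # evaluate fitness of each candidate
        if is_solution(best):
            return best
        population = make_successors(population)  # selection + crossover + mutation
    return max(population, key=fitness)           # best candidate found so far
```

The generation limit doubles as a stopping criterion, so the loop terminates even when no exact solution is ever produced.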

Flowchart of a Genetic Algorithm
• Begin: initialize the population, evaluate the solutions, set T = 0.
• If an optimum solution has been found, stop.
• Otherwise: selection, crossover, mutation; set T = T + 1 and evaluate again.

Crossover
• Crossover is similar to natural reproduction.
• Crossover combines genetic material from two parents in order to produce superior offspring.
• A few types of crossover:
  – One-point
  – Multiple-point
• E.g. (figure: Parent 1 = 0 1 2 3 4 5 6 7 and Parent 2 = 7 6 5 4 3 2 1 0, with segments exchanged between them)

Mutation
• Mutation introduces randomness into the population.
• Why “mutation”? The idea is to reintroduce divergence into a converging population.
• Mutation is performed on a small part of the population, in order to avoid entering an unstable state.
• E.g. parent 1 1 1 0 1 0 0 0 0 1, child 0 0 1 0 1 0 1 1 0 1
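The two operators just described can be sketched for bit-string candidates. This is a minimal Python sketch, assuming candidates are lists of 0/1 values; the function and parameter names are illustrative.

```python
import random

def one_point_crossover(parent1, parent2):
    """Cut both parents at the same random point and swap the tails."""
    point = random.randint(1, len(parent1) - 1)
    return (parent1[:point] + parent2[point:],
            parent2[:point] + parent1[point:])

def mutate(bits, p=0.1):
    """Flip each bit independently with a small probability p."""
    return [b ^ (random.random() < p) for b in bits]
```

Multiple-point crossover repeats the same cut-and-swap at several points instead of one.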

Fitness Function
• The fitness function is the evaluation function used to evaluate the solutions and find the better ones.
• Fitness is computed for each individual using the fitness function, which then determines which solutions are better than others.

Selection
• The selection operation copies a single individual, probabilistically selected based on fitness, into the next generation of the population.
• Several possible ways:
  – Keep the strongest.
  – Keep some of the weaker solutions.

Selection – Survival of the Strongest
• Previous generation: 0.93, 0.51, 0.72, 0.31, 0.12, 0.64
• Next generation: 0.93, 0.72, 0.64

Stopping Criteria
• The final problem is to decide when to stop execution of the algorithm. Two possible ways:
  – First approach: stop after production of a definite number of generations.
  – Second approach: stop when the improvement in average fitness over two generations is below a threshold.

Basic Components
• Candidate representation: important to choose this well; more work here means less work on the successor functions.
• Successor function(s): mutation, crossover.
• Fitness function.
• Solution test.
• Some parameters: population size, generation limit.

Simple Example – Alternating String
• Let’s try to evolve a length-4 alternating string.
• Initial population: C1 = 1000, C2 = 0011.
• We roll the dice and end up creating C1’ = cross(C1, C2) = 1011 and C2’ = cross(C1, C1) = 1000.
• We mutate C1’ and the fourth bit flips, giving 1010. We mutate C2’ and get 1001.
• We run our solution test on each. C1’ is a solution, so we return it and are done.
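The probabilistic, fitness-based selection described above is commonly implemented as roulette-wheel (fitness-proportionate) selection. A sketch, assuming non-negative fitness values; the function names are illustrative and not from the slides.

```python
import random

def roulette_select(population, fitness):
    """Pick one individual with probability proportional to its fitness."""
    weights = [fitness(c) for c in population]
    pick = random.uniform(0, sum(weights))
    running = 0.0
    for candidate, w in zip(population, weights):
        running += w
        if w > 0 and pick <= running:
            return candidate
    return population[-1]  # guard against floating-point rounding

def next_generation(population, fitness):
    """Build the next generation by repeated probabilistic copying."""
    return [roulette_select(population, fitness) for _ in population]
```

Python’s standard library offers the same idea directly via `random.choices(population, weights=weights)`.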

Candidate Representation
• We want to encode candidates in a way that makes mutation and crossover easy.
• The typical candidate representation is a binary string. This string can be thought of as the genetic code of a candidate, thus the term “genetic algorithm”!
• Other representations are possible, but they make crossover and mutation harder.

Candidate Representation Example
• Let’s say we want to represent a rule for classifying bikes as mountain bikes or hybrid, based on these attributes:
  – Make (Bridgestone, Cannondale, Nishiki, or Gary Fisher)
  – Tire type (knobby, treads)
  – Handlebar type (straight, curved)
  – Water bottle holder (Boolean)
• We can encode a rule as a binary string, where each bit represents whether a value is accepted:

  Make: B C N G | Tires: K T | Handlebars: S C | Water bottle: Y N

• The candidate will be a bit string of length 10, because we have 10 possible attribute values.
• Let’s say we want a rule that will match any bike that is made by Bridgestone or Cannondale, has treaded tires, and has straight handlebars. This rule could be represented as 1100011011:

  1 1 0 0 | 0 1 | 1 0 | 1 1
  B C N G | K T | S C | Y N

Successor Functions
• Mutation: given a candidate, return a slightly different candidate.
• Crossover: given two candidates, produce one that has elements of each.
• We don’t always generate a successor for each candidate. Rather, we generate a successor population based on the candidates in the current population, weighted by fitness.
• If your candidate representation is just a binary string, then these are easy:
  – Mutate(c): copy c as c’. For each bit b in c’, flip b with probability p. Return c’.
  – Cross(c1, c2): create a candidate c such that c[i] = c1[i] if i % 2 = 0, and c[i] = c2[i] otherwise. Return c.
• Alternatively, use any other scheme such that c gets roughly equal information from c1 and c2.

Fitness Function
• The fitness function is analogous to a heuristic that estimates how close a candidate is to being a solution.
• In general, the fitness function should be consistent for better performance. However, even if it is, there are no guarantees. This is a probabilistic algorithm!
• In our classification rule example, one possible fitness function would be information gain over the training data.
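The Mutate and Cross pseudocode above translates almost line-for-line into Python. A minimal sketch, assuming candidates are equal-length lists of 0/1 values:

```python
import random

def mutate(c, p=0.05):
    """Copy c; flip each bit of the copy with probability p."""
    return [bit ^ (random.random() < p) for bit in c]

def cross(c1, c2):
    """Take even-indexed bits from c1 and odd-indexed bits from c2."""
    return [c1[i] if i % 2 == 0 else c2[i] for i in range(len(c1))]
```

Because the interleaving alternates strictly, each child gets roughly equal information from both parents, as the slide requires.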
