Genetic Algorithms [Read Chapter 9] [Exercises 9.1, 9.2, 9.3, 9.4]
SLIDE 1  Genetic Algorithms
[Read Chapter 9]  [Exercises 9.1, 9.2, 9.3, 9.4]
  • Evolutionary computation
  • Prototypical GA
  • An example: GABIL
  • Genetic Programming
  • Individual learning and population evolution
Lecture slides for textbook Machine Learning, T. Mitchell, McGraw Hill, 1997
SLIDE 2  Evolutionary Computation
  1. Computational procedures patterned after biological evolution
  2. Search procedure that probabilistically applies search operators to a set of points in the search space
SLIDE 3  Biological Evolution
Lamarck and others:
  • Species "transmute" over time
Darwin and Wallace:
  • Consistent, heritable variation among individuals in a population
  • Natural selection of the fittest
Mendel and genetics:
  • A mechanism for inheriting traits
  • genotype → phenotype mapping
SLIDE 4  GA(Fitness, Fitness_threshold, p, r, m)
  • Initialize: P ← p random hypotheses
  • Evaluate: for each h in P, compute Fitness(h)
  • While [max_h Fitness(h)] < Fitness_threshold:
    1. Select: Probabilistically select (1 − r)·p members of P to add to P_S, with
         Pr(h_i) = Fitness(h_i) / Σ_{j=1}^{p} Fitness(h_j)
    2. Crossover: Probabilistically select (r·p)/2 pairs of hypotheses from P. For each pair ⟨h_1, h_2⟩, produce two offspring by applying the Crossover operator. Add all offspring to P_S.
    3. Mutate: Invert a randomly selected bit in m·p random members of P_S
    4. Update: P ← P_S
    5. Evaluate: for each h in P, compute Fitness(h)
  • Return the hypothesis from P that has the highest fitness.
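The loop above translates almost line for line into Python. Everything specific here — population size 20, r = 0.6, m = 0.05, 12-bit strings, and the bit-counting "OneMax" fitness — is an illustrative assumption, not something fixed by the slide:

```python
import random

def ga(fitness, fitness_threshold, p=20, r=0.6, m=0.05, length=12, max_gens=200):
    """Prototypical GA loop: select, crossover, mutate, update, evaluate."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(p)]

    def select_one(pop, fits):
        # fitness-proportionate: Pr(h_i) = Fitness(h_i) / sum_j Fitness(h_j)
        return random.choices(pop, weights=fits, k=1)[0]

    for _ in range(max_gens):
        fits = [fitness(h) for h in pop]
        if max(fits) >= fitness_threshold:
            break
        # 1. Select: carry (1 - r)*p members into P_S
        new_pop = [select_one(pop, fits)[:] for _ in range(int((1 - r) * p))]
        # 2. Crossover: (r*p)/2 pairs, two offspring each (single-point)
        while len(new_pop) < p:
            h1, h2 = select_one(pop, fits), select_one(pop, fits)
            cut = random.randrange(1, length)
            new_pop += [h1[:cut] + h2[cut:], h2[:cut] + h1[cut:]]
        # 3. Mutate: invert one random bit in m*p random members of P_S
        for h in random.sample(new_pop, max(1, int(m * p))):
            i = random.randrange(length)
            h[i] = 1 - h[i]
        pop = new_pop[:p]                      # 4. Update: P <- P_S
    return max(pop, key=fitness)               # Return the fittest hypothesis

random.seed(0)                                 # reproducible demo run
best = ga(fitness=sum, fitness_threshold=12)   # "OneMax": count of 1-bits
print(sum(best))
```

With fitness-proportionate selection alone the population converges quickly on this toy problem; the early break fires once any member reaches the threshold.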
SLIDE 5  Representing Hypotheses
Represent
  (Outlook = Overcast ∨ Rain) ∧ (Wind = Strong)
by
  Outlook  Wind
    011     10
Represent
  IF Wind = Strong THEN PlayTennis = yes
by
  Outlook  Wind  PlayTennis
    111     10       10
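Under this encoding, testing whether an example satisfies a rule is a per-attribute bit lookup. A minimal sketch; the attribute/value ordering and the dict-based example format are assumptions for illustration:

```python
# Assumed attribute ordering and value sets for the slide's example.
ATTRS = [("Outlook", ["Sunny", "Overcast", "Rain"]),
         ("Wind", ["Strong", "Weak"])]

def matches(rule_bits, example):
    """True iff the example satisfies every attribute constraint in the rule."""
    i = 0
    for name, values in ATTRS:
        segment = rule_bits[i:i + len(values)]     # one substring per attribute
        if segment[values.index(example[name])] != "1":
            return False
        i += len(values)
    return True

rule = "011" + "10"   # (Outlook = Overcast v Rain) ^ (Wind = Strong)
print(matches(rule, {"Outlook": "Rain", "Wind": "Strong"}))    # True
print(matches(rule, {"Outlook": "Sunny", "Wind": "Strong"}))   # False
```

Note that an all-ones substring like "111" acts as "don't care", which is why the second rule on the slide leaves Outlook unconstrained.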
SLIDE 6  Operators for Genetic Algorithms
Single-point crossover:
  Initial strings:  11101001000   00001010101
  Crossover mask:   11111000000
  Offspring:        11101010101   00001001000
Two-point crossover:
  Crossover mask:   00111110000
  Offspring:        11001011000   00101000101
Uniform crossover:
  Crossover mask:   10011010011
  Offspring:        10001000100   01101011001
Point mutation:
  11101001000  →  11101011000
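All three crossover operators in the figure are instances of one mask rule: one offspring copies each bit from the first parent where the mask is 1 and from the second where it is 0, and the other offspring takes the complementary choice. A small sketch reproducing the figure's strings:

```python
def apply_mask(a, b, mask):
    """Mask-based crossover: o1 takes bits from a where mask is 1 (from b
    where 0); o2 is the complementary offspring."""
    o1 = "".join(x if k == "1" else y for x, y, k in zip(a, b, mask))
    o2 = "".join(y if k == "1" else x for x, y, k in zip(a, b, mask))
    return o1, o2

a, b = "11101001000", "00001010101"
print(apply_mask(a, b, "11111000000"))  # single-point: a contiguous prefix mask
print(apply_mask(a, b, "00111110000"))  # two-point: a contiguous middle block
print(apply_mask(a, b, "10011010011"))  # uniform: each mask bit chosen at random
```

The operators differ only in how the mask is sampled; point mutation is separate, flipping a single bit of one parent.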
SLIDE 7  Selecting Most Fit Hypotheses
Fitness-proportionate selection:
  Pr(h_i) = Fitness(h_i) / Σ_{j=1}^{p} Fitness(h_j)
  ... can lead to crowding
Tournament selection:
  • Pick h_1, h_2 at random with uniform probability
  • With probability p, select the more fit
Rank selection:
  • Sort all hypotheses by fitness
  • Probability of selection is proportional to rank
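The three schemes can be sketched as follows; the tie-breaking rule and the default tournament probability p_win = 0.9 are assumptions:

```python
import random

def proportionate(pop, fitness):
    # Pr(h_i) = Fitness(h_i) / sum_j Fitness(h_j)
    return random.choices(pop, weights=[fitness(h) for h in pop], k=1)[0]

def tournament(pop, fitness, p_win=0.9):
    # Pick two at random; with probability p_win return the fitter one.
    h1, h2 = random.sample(pop, 2)
    better, worse = (h1, h2) if fitness(h1) >= fitness(h2) else (h2, h1)
    return better if random.random() < p_win else worse

def rank(pop, fitness):
    # Selection probability proportional to rank, not to raw fitness.
    ranked = sorted(pop, key=fitness)          # worst first, rank 1
    return random.choices(ranked, weights=range(1, len(pop) + 1), k=1)[0]

pop = [[0, 0, 1], [1, 1, 1], [0, 1, 1]]
print(proportionate(pop, sum), tournament(pop, sum), rank(pop, sum))
```

Tournament and rank selection both decouple selection pressure from the raw fitness scale, which is what mitigates the crowding problem noted for the proportionate scheme.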
SLIDE 8  GABIL [DeJong et al. 1993]
Learn disjunctive set of propositional rules, competitive with C4.5
Fitness:
  Fitness(h) = (correct(h))²
Representation:
  IF a_1 = T ∧ a_2 = F THEN c = T;  IF a_2 = T THEN c = F
represented by
  a_1 a_2 c    a_1 a_2 c
   10  01 1     11  10 0
Genetic operators: ???
  • want variable-length rule sets
  • want only well-formed bitstring hypotheses
SLIDE 9  Crossover with Variable-Length Bitstrings
Start with
       a_1 a_2 c    a_1 a_2 c
h_1:    10  01 1     11  10 0
h_2:    01  11 0     10  01 0
1. choose crossover points for h_1, e.g., after bits 1, 8
2. now restrict points in h_2 to those that produce bitstrings with well-defined semantics, e.g., ⟨1,3⟩, ⟨1,8⟩, ⟨6,8⟩.
If we choose ⟨1,3⟩, the result is
       a_1 a_2 c
h_3:    11  10 0
       a_1 a_2 c    a_1 a_2 c    a_1 a_2 c
h_4:    00  01 1     11  11 0     10  01 0
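A sketch of the restriction: a cut in h_2 is legal when it falls at the same offset within a 5-bit rule as the corresponding cut in h_1, which is what keeps both offspring parseable into whole rules. The helper names here are mine:

```python
def offspring(h1, h2, cuts1, cuts2):
    """Two-point crossover: swap the middle segments of the two bitstrings."""
    (a, b), (c, d) = cuts1, cuts2
    return h1[:a] + h2[c:d] + h1[b:], h2[:c] + h1[a:b] + h2[d:]

def legal_cuts(h2, cuts1, rule_len=5):
    """Cut points in h2 must sit at the same offsets within a rule
    as the cut points already chosen for h1."""
    a, b = cuts1
    return [(c, d) for c in range(len(h2) + 1) for d in range(c, len(h2) + 1)
            if c % rule_len == a % rule_len and d % rule_len == b % rule_len]

h1 = "1001111100"   # 10 01 1 | 11 10 0
h2 = "0111010010"   # 01 11 0 | 10 01 0
print(legal_cuts(h2, (1, 8)))            # [(1, 3), (1, 8), (6, 8)]
print(offspring(h1, h2, (1, 8), (1, 3)))  # a 1-rule child and a 3-rule child
```

This reproduces the slide's example: the legal h_2 cut pairs are exactly ⟨1,3⟩, ⟨1,8⟩, ⟨6,8⟩, and choosing ⟨1,3⟩ yields the one-rule h_3 and three-rule h_4.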
SLIDE 10  GABIL Extensions
Add new genetic operators, also applied probabilistically:
  1. AddAlternative: generalize constraint on a_i by changing a 0 to 1
  2. DropCondition: generalize constraint on a_i by changing every bit to 1
And, add new fields to the bitstring to determine whether to allow these:
  a_1 a_2 c    a_1 a_2 c    AA DC
   01  11 0     10  01 0     1  0
So now the learning strategy also evolves!
SLIDE 11  GABIL Results
Performance of GABIL comparable to symbolic rule/tree learning methods C4.5, ID5R, AQ14
Average performance on a set of 12 synthetic problems:
  • GABIL without AA and DC operators: 92.1% accuracy
  • GABIL with AA and DC operators: 95.2% accuracy
  • symbolic learning methods ranged from 91.2% to 96.6%
SLIDE 12  Schemas
How to characterize evolution of the population in a GA?
Schema = string containing 0, 1, * ("don't care")
  • Typical schema: 10**0*
  • Instances of above schema: 101101, 100000, ...
Characterize the population by the number of instances representing each possible schema
  • m(s, t) = number of instances of schema s in the population at time t
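Counting m(s, t) is a direct schema-matching pass over the population; a small sketch:

```python
def matches_schema(schema, bits):
    """A string is an instance of a schema if it agrees on every non-* position."""
    return all(s == "*" or s == b for s, b in zip(schema, bits))

def m(schema, pop):
    """m(s, t): number of instances of schema s in the population."""
    return sum(matches_schema(schema, h) for h in pop)

pop = ["101101", "100000", "111111", "100001"]
print(m("10**0*", pop))   # 3: every member except 111111 matches
```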
SLIDE 13  Consider Just Selection
  • f̄(t) = average fitness of population at time t
  • m(s, t) = number of instances of schema s in population at time t
  • û(s, t) = average fitness of instances of s at time t
Probability of selecting h in one selection step:
  Pr(h) = f(h) / Σ_{i=1}^{n} f(h_i) = f(h) / (n · f̄(t))
Probability of selecting an instance of s in one step:
  Pr(h ∈ s) = Σ_{h ∈ s ∩ p_t} f(h) / (n · f̄(t)) = (û(s, t) / (n · f̄(t))) · m(s, t)
Expected number of instances of s after n selections:
  E[m(s, t+1)] = (û(s, t) / f̄(t)) · m(s, t)
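The derivation can be checked empirically: the closed form û(s,t)/f̄(t) · m(s,t) should match the average schema count after n fitness-proportionate draws. The tiny population and the ones-plus-one fitness below are illustrative assumptions:

```python
import random

def u_hat_m(schema, pop, fitness):
    """Return (u_hat(s,t), m(s,t)) for schema s over the population."""
    inst = [h for h in pop if all(s in ("*", b) for s, b in zip(schema, h))]
    return sum(fitness(h) for h in inst) / len(inst), len(inst)

def ones(h):
    return h.count("1") + 1            # +1 keeps every fitness positive

pop = ["10110", "10001", "01111", "00000"]
u_hat, m_s = u_hat_m("10***", pop, ones)
f_bar = sum(ones(h) for h in pop) / len(pop)
exact = u_hat / f_bar * m_s            # E[m(s, t+1)] from the slide

# Monte-Carlo check: average schema count after n = |pop| selections
random.seed(0)
trials = 20000
total = 0
for _ in range(trials):
    sel = random.choices(pop, weights=[ones(h) for h in pop], k=len(pop))
    total += sum(h.startswith("10") for h in sel)
print(exact, total / trials)           # the two numbers agree closely
```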
SLIDE 14  Schema Theorem
  E[m(s, t+1)] ≥ (û(s, t) / f̄(t)) · m(s, t) · (1 − p_c · d(s)/(l − 1)) · (1 − p_m)^{o(s)}
where
  • m(s, t) = number of instances of schema s in population at time t
  • f̄(t) = average fitness of population at time t
  • û(s, t) = average fitness of instances of s at time t
  • p_c = probability of single-point crossover operator
  • p_m = probability of mutation operator
  • l = length of single bit strings
  • o(s) = number of defined (non-"*") bits in s
  • d(s) = distance between leftmost, rightmost defined bits in s
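The bound is a one-line computation once o(s), d(s), and the selection term are in hand. A sketch over the same kind of toy population as above, with crossover and mutation rates chosen arbitrarily:

```python
def schema_bound(schema, pop, fitness, p_c, p_m):
    """Lower bound on E[m(s, t+1)] from the schema theorem."""
    l = len(schema)
    defined = [i for i, s in enumerate(schema) if s != "*"]
    o_s = len(defined)                        # o(s): number of defined bits
    d_s = defined[-1] - defined[0]            # d(s): defining length
    inst = [h for h in pop if all(s in ("*", b) for s, b in zip(schema, h))]
    f_bar = sum(fitness(h) for h in pop) / len(pop)
    u_hat = sum(fitness(h) for h in inst) / len(inst)
    selection = u_hat / f_bar * len(inst)     # selection-only expectation
    return selection * (1 - p_c * d_s / (l - 1)) * (1 - p_m) ** o_s

def ones(h):
    return h.count("1") + 1                   # +1 keeps fitness positive

pop = ["10110", "10001", "01111", "00000"]
bound = schema_bound("10***", pop, ones, p_c=0.6, p_m=0.01)
print(bound)   # strictly below the selection-only expectation of 28/13
```

The two correction factors are exactly the survival probabilities: short defining length d(s) protects against single-point crossover, and few defined bits o(s) protects against mutation.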
SLIDE 15  Genetic Programming
Population of programs represented by trees, e.g. for
  sin(x) + √(x² + y)
[Figure: expression tree with root +, left subtree sin(x), right subtree √ applied to (x^2 + y)]
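Nested tuples are one convenient tree encoding; evaluating the example tree sin(x) + √(x² + y) is then a recursive fold. The operator set here is an assumption:

```python
import math

# A program is a nested tuple (operator, child, ...), a variable name, or a constant.
OPS = {"+": lambda a, b: a + b, "^": lambda a, b: a ** b,
       "sin": math.sin, "sqrt": math.sqrt}

def evaluate(tree, env):
    if isinstance(tree, tuple):
        op, *children = tree
        return OPS[op](*(evaluate(c, env) for c in children))
    return env.get(tree, tree)   # look up variables; constants pass through

# sin(x) + sqrt(x^2 + y)
prog = ("+", ("sin", "x"), ("sqrt", ("+", ("^", "x", 2), "y")))
print(evaluate(prog, {"x": 0.0, "y": 4.0}))   # sin(0) + sqrt(0 + 4) = 2.0
```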
SLIDE 16  Crossover
[Figure: two parent program trees built from +, sin, ^, x, y, 2; two offspring are produced by swapping a randomly chosen subtree of one parent with a randomly chosen subtree of the other]
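GP crossover swaps a randomly chosen subtree of one parent with one from the other. A sketch over the tuple encoding of the previous slide; subtrees, replace, and crossover are hypothetical helper names:

```python
import random

def subtrees(tree, path=()):
    """Yield (path, subtree) for every node; a path is a tuple of child indices."""
    yield path, tree
    if isinstance(tree, tuple):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(tree, path, new):
    """Return a copy of tree with the subtree at path replaced by new."""
    if not path:
        return new
    i = path[0]
    return tree[:i] + (replace(tree[i], path[1:], new),) + tree[i + 1:]

def crossover(t1, t2, rng=random):
    """Swap a random subtree of t1 with a random subtree of t2."""
    p1, s1 = rng.choice(list(subtrees(t1)))
    p2, s2 = rng.choice(list(subtrees(t2)))
    return replace(t1, p1, s2), replace(t2, p2, s1)

t1 = ("+", ("sin", "x"), "y")
t2 = ("sqrt", ("+", "x", 2))
o1, o2 = crossover(t1, t2, rng=random.Random(0))
```

Because whole subtrees are exchanged, both offspring remain syntactically valid programs; the total node count of the pair is conserved.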
SLIDE 17  Block Problem
[Figure: blocks u, i, v, a, n, e, r, s, l distributed between a stack and the table]
Goal: spell UNIVERSAL
Terminals:
  • CS ("current stack") = name of the top block on the stack, or F
  • TB ("top correct block") = name of the topmost correct block on the stack
  • NN ("next necessary") = name of the next block needed above TB in the stack
SLIDE 18  Primitive functions:
  • (MS x): ("move to stack"), if block x is on the table, moves x to the top of the stack and returns the value T. Otherwise, does nothing and returns the value F.
  • (MT x): ("move to table"), if block x is somewhere in the stack, moves the block at the top of the stack to the table and returns the value T. Otherwise, returns F.
  • (EQ x y): ("equal"), returns T if x equals y, and returns F otherwise.
  • (NOT x): returns T if x = F, else returns F
  • (DU x y): ("do until"), executes the expression x repeatedly until expression y returns the value T
SLIDE 19  Learned Program
Trained to fit 166 test problems.
Using a population of 300 programs, found this after 10 generations:
  (EQ (DU (MT CS) (NOT CS))
      (DU (MS NN) (NOT NN)))
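The learned program can be executed against a small interpreter for the primitives of the previous slide. The initial block configuration, the do-until iteration limit, and the modelling of F as Python False are all assumptions; whatever the starting stack, the first DU unstacks everything to the table and the second stacks NN until the goal is spelled:

```python
GOAL = list("universal")

class World:
    def __init__(self, stack, table):
        self.stack, self.table = list(stack), set(table)   # stack[0] = bottom
    def correct_prefix(self):
        k = 0
        while k < len(self.stack) and self.stack[k] == GOAL[k]:
            k += 1
        return k

def run(expr, w, limit=100):
    """Interpret a block-world program; F is modelled as Python False."""
    if expr == "CS":                      # name of the top block, or F
        return w.stack[-1] if w.stack else False
    if expr == "TB":                      # topmost correctly placed block
        k = w.correct_prefix()
        return w.stack[k - 1] if k else False
    if expr == "NN":                      # next block needed above TB
        k = w.correct_prefix()
        return GOAL[k] if k < len(GOAL) else False
    op, *args = expr
    if op == "MS":                        # move block x from table to top of stack
        x = run(args[0], w)
        if x in w.table:
            w.table.remove(x); w.stack.append(x); return True
        return False
    if op == "MT":                        # if x is in the stack, move top block to table
        x = run(args[0], w)
        if x in w.stack:
            w.table.add(w.stack.pop()); return True
        return False
    if op == "EQ":
        return run(args[0], w) == run(args[1], w)
    if op == "NOT":
        return run(args[0], w) is False
    if op == "DU":                        # do x repeatedly until y returns T
        for _ in range(limit):
            run(args[0], w)
            if run(args[1], w) is True:
                return True
        return False

learned = ("EQ", ("DU", ("MT", "CS"), ("NOT", "CS")),
                 ("DU", ("MS", "NN"), ("NOT", "NN")))

w = World(stack="vrn", table="uiaesl")    # assumed initial configuration
run(learned, w)
print("".join(w.stack))                   # universal
```

Note the EQ at the root contributes nothing semantically; it is just a connective that forces both DU subexpressions to execute, which is typical of GP solutions.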
SLIDE 20  Genetic Programming
More interesting example: design electronic filter circuits
  • Individuals are programs that transform a beginning circuit to a final circuit, by adding/subtracting components and connections
  • Use population of 640,000, run on 64-node parallel processor
  • Discovers circuits competitive with best human designs
SLIDE 21  GP for Classifying Images [Teller and Veloso, 1997]
Fitness: based on coverage and accuracy
Representation:
  • Primitives include Add, Sub, Mult, Div, Not, Max, Min, Read, Write, If-Then-Else, Either, Pixel, Least, Most, Ave, Variance, Difference, Mini, Library
  • Mini refers to a local subroutine that is separately co-evolved
  • Library refers to a global library subroutine (evolved by selecting the most useful Minis)
Genetic operators:
  • Crossover, mutation
  • Create "mating pools" and use rank-proportionate reproduction
SLIDE 22  Biological Evolution
Lamarck (19th century):
  • Believed individual genetic makeup was altered by lifetime experience
  • But current evidence contradicts this view
What is the impact of individual learning on population evolution?
SLIDE 23  Baldwin Effect
Assume
  • Individual learning has no direct influence on individual DNA
  • But ability to learn reduces the need to "hard wire" traits in DNA
Then
  • Ability of individuals to learn will support a more diverse gene pool
    – Because learning allows individuals with various "hard wired" traits to be successful
  • More diverse gene pool will support faster evolution of the gene pool
→ individual learning (indirectly) increases rate of evolution
SLIDE 24  Baldwin Effect
Plausible example:
  1. New predator appears in environment
  2. Individuals who can learn (to avoid it) will be selected
  3. Increase in learning individuals will support more diverse gene pool
  4. resulting in faster evolution
  5. possibly resulting in new non-learned traits such as instinctive fear of the predator
SLIDE 25  Computer Experiments on Baldwin Effect [Hinton and Nowlan, 1987]
Evolve simple neural networks:
  • Some network weights fixed during lifetime, others trainable
  • Genetic makeup determines which are fixed, and their weight values
Results:
  • With no individual learning, population failed to improve over time
  • When individual learning allowed:
    – Early generations: population contained many individuals with many trainable weights
    – Later generations: higher fitness, while number of trainable weights decreased
SLIDE 26  Summary: Evolutionary Programming
  • Conduct randomized, parallel, hill-climbing search through H
  • Approach learning as optimization problem (optimize fitness)
  • Nice feature: evaluation of Fitness can be very indirect
    – consider learning rule set for multistep decision making
    – no issue of assigning credit/blame to individual steps