Khaled Rasheed, Computer Science Dept., University of Georgia




  1. Khaled Rasheed, Computer Science Dept., University of Georgia, http://www.cs.uga.edu/~khaled

  2. } Genetic algorithms ◦ Parallel genetic algorithms } Genetic programming } Evolution strategies } Classifier systems } Evolutionary programming } Related topics } Conclusion

  3. } Fitness = Height } Survival of the fittest

  4. } Maintain a population of potential solutions } New solutions are generated by selecting, combining and modifying existing solutions ◦ Crossover ◦ Mutation } Objective function = Fitness function ◦ Better solutions favored for parenthood ◦ Worse solutions favored for replacement

  5. } Maximize 2X^2 - Y + 5 where X:[0,3], Y:[0,3]

  6. } Maximize 2X^2 - Y + 5 where X:[0,3], Y:[0,3]
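The toy objective above can be attacked with a minimal real-valued GA. The following Python sketch is illustrative only (population size, mutation width, and the truncation-selection scheme are assumptions, not from the slides); it uses arithmetic crossover and Gaussian mutation, favoring better solutions for parenthood and replacing worse ones, as slide 4 describes.

```python
import random

def fitness(ind):
    # Objective from the slide: maximize 2*X^2 - Y + 5, X in [0,3], Y in [0,3]
    x, y = ind
    return 2 * x**2 - y + 5

def make_individual():
    return [random.uniform(0, 3), random.uniform(0, 3)]

def mutate(ind, sigma=0.3):
    # Gaussian mutation, clipped back into the legal range
    return [min(3.0, max(0.0, g + random.gauss(0, sigma))) for g in ind]

def crossover(p1, p2):
    # Arithmetic crossover: child is the midpoint of the parents
    return [(a + b) / 2 for a, b in zip(p1, p2)]

def run_ga(pop_size=30, generations=50):
    pop = [make_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # better solutions favored for parenthood
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children               # worse solutions replaced
    return max(pop, key=fitness)

best = run_ga()
```

The true optimum is X = 3, Y = 0 with fitness 23; a run of this sketch typically ends very close to it.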

  7. } Representation } Fitness function } Initialization strategy } Selection strategy } Crossover operators } Mutation operators

  8. } Representation } Fitness function } Initialization strategy } Selection strategy } Crossover operators } Mutation operators } Replacement strategy

  9. } Proportional selection (roulette wheel) ◦ Selection probability of individual = individual’s fitness / sum of fitness } Rank-based selection ◦ Example: decreasing arithmetic/geometric series ◦ Better when fitness range is very large or small } Tournament selection ◦ Virtual tournament between randomly selected individuals using fitness
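Two of the selection schemes on this slide can be sketched in a few lines of Python (the function names and the tournament size `k=3` are illustrative assumptions):

```python
import random

def roulette_select(population, fitnesses):
    # Proportional selection: P(individual) = fitness / sum of fitnesses
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for ind, f in zip(population, fitnesses):
        running += f
        if running >= pick:
            return ind
    return population[-1]

def tournament_select(population, fitnesses, k=3):
    # Virtual tournament among k randomly chosen individuals; fittest wins
    contestants = random.sample(range(len(population)), k)
    return population[max(contestants, key=lambda i: fitnesses[i])]
```

Note that roulette selection assumes non-negative fitness and degrades when the fitness range is extreme, which is exactly why the slide recommends rank-based selection in that case.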

  10. } Point crossover (classical) ◦ Parent1 = x1,x2,x3,x4,x5,x6 ◦ Parent2 = y1,y2,y3,y4,y5,y6 ◦ Child = x1,x2,x3,x4,y5,y6 } Uniform crossover ◦ Parent1 = x1,x2,x3,x4,x5,x6 ◦ Parent2 = y1,y2,y3,y4,y5,y6 ◦ Child = x1,x2,y3,x4,y5,y6 } Arithmetic crossover ◦ Parent1 = x1,x2,x3 ◦ Parent2 = y1,y2,y3 ◦ Child = (x1+y1)/2, (x2+y2)/2, (x3+y3)/2
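The three crossover operators above, sketched in Python over list-encoded individuals (a minimal illustration; real GAs often produce two complementary children per mating):

```python
import random

def point_crossover(p1, p2):
    # Classical one-point crossover: split at a random cut point, join the halves
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:]

def uniform_crossover(p1, p2):
    # Each gene comes from either parent with equal probability
    return [random.choice(pair) for pair in zip(p1, p2)]

def arithmetic_crossover(p1, p2):
    # Child is the component-wise average of the parents (real-valued genes)
    return [(a + b) / 2 for a, b in zip(p1, p2)]
```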

  11. } Change one or more components } Let Child = x1,x2,P,x3,x4... } Gaussian mutation: ◦ P ← P ± Δp ◦ Δp: (small) random normal value } Uniform mutation: ◦ P ← Pnew ◦ Pnew: random uniform value } Boundary mutation: ◦ P ← Pmin OR Pmax } Binary mutation = bit flip

  12. } Finds global optima } Can handle discrete, continuous and mixed variable spaces } Easy to use (short programs) } Robust (less sensitive to noise, ill conditions)

  13. } Relatively slower than other methods (not suitable for easy problems) } Theory lags behind applications

  14. } Coarse-grained GA at high level } Fine-grained GA at low level

  15. } Coarse-grained GA at high level } Global parallel GA at low level

  16. } Coarse-grained GA at high level } Coarse-grained GA at low level

  17. } Introduced (officially) by John Koza in his book (Genetic Programming, 1992) } Early attempts date back to the 50s (evolving populations of binary object codes) } Idea is to evolve computer programs } Declarative programming languages usually used (Lisp) } Programs are represented as trees

  18. } A population of trees representing programs } The programs are composed of elements from the FUNCTION SET and the TERMINAL SET } These sets are usually fixed sets of symbols } The function set forms "non-leaf" nodes (e.g. +, -, *, sin, cos) } The terminal set forms leaf nodes (e.g. x, 3.7, random())
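A program tree over a function set and terminal set can be represented and evaluated very compactly. The slide mentions Lisp; the following is a hypothetical Python analogue using nested tuples, with a small function set taken from the slide's examples:

```python
import math

# Function set: non-leaf nodes. Terminal set: leaf nodes (variables, constants).
FUNCTIONS = {
    '+': lambda a, b: a + b,
    '-': lambda a, b: a - b,
    '*': lambda a, b: a * b,
    'sin': lambda a: math.sin(a),
}

def evaluate(tree, env):
    if isinstance(tree, tuple):            # non-leaf: apply the function to its children
        fn = FUNCTIONS[tree[0]]
        return fn(*(evaluate(c, env) for c in tree[1:]))
    if isinstance(tree, str):              # leaf: variable looked up in the environment
        return env[tree]
    return tree                            # leaf: numeric constant

# The tree for sin(x) * (x + 3.7):
program = ('*', ('sin', 'x'), ('+', 'x', 3.7))
```

Fitness evaluation in GP then amounts to running `evaluate` over a set of input/output cases and scoring the results.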

  19. } Fitness is usually based on I/O traces } Crossover is implemented by randomly swapping subtrees between individuals } GP usually does not extensively rely on mutation (random nodes or subtrees) } GPs are usually generational (sometimes with a generation gap) } GP usually uses huge populations (1M individuals)

  20. } More flexible representation } Greater application spectrum } If tractable, evolving a way to make “things” is more useful than evolving the “things” } Example: evolving a learning rule for neural networks (Amr Radi, GP98) vs. evolving the weights of a particular NN

  21. } Extremely slow } Very poor handling of numbers } Very large populations needed

  22. } Genetic programming with linear genomes (Wolfgang Banzhaf) ◦ Kind of going back to the evolution of binary program codes } Hybrids of GP and other methods that better handle numbers: ◦ Least squares methods ◦ Gradient-based optimizers ◦ Genetic algorithms, other evolutionary computation methods } Evolving things other than programs ◦ Example: electric circuits represented as trees (Koza, AI in Design 1996)

  23. } Were invented to solve numerical optimization problems } Originated in Europe in the 1960s } Initially: two-membered or (1+1) ES: ◦ one PARENT generates one OFFSPRING per GENERATION ◦ by applying normally distributed (Gaussian) mutations ◦ until offspring is better and replaces parent ◦ This simple structure allowed theoretical results to be obtained (speed of convergence, mutation size) } Later: enhanced to a (µ+1) strategy which incorporated crossover
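The (1+1) ES loop described above fits in a few lines. A minimal sketch, assuming a fixed mutation width `sigma` (real ES implementations adapt it, e.g. via the 1/5 success rule or self-adaptation) and a made-up sphere objective for illustration:

```python
import random

def sphere(x):
    # Toy objective to minimize (illustrative; any numeric function works)
    return sum(v * v for v in x)

def one_plus_one_es(f, x, sigma=0.5, iterations=500):
    # (1+1) ES: one parent generates one offspring per generation by
    # Gaussian mutation; the offspring replaces the parent only if better.
    best = f(x)
    for _ in range(iterations):
        child = [v + random.gauss(0, sigma) for v in x]
        fc = f(child)
        if fc < best:                      # offspring better: replace parent
            x, best = child, fc
    return x, best

x, fx = one_plus_one_es(sphere, [5.0, -5.0])
```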

  24. } Schwefel introduced the multi-membered ESs, now denoted by (µ + λ) and (µ, λ) } (µ, λ) ES: The parent generation is disjoint from the child generation } (µ + λ) ES: Some of the parents may be selected to "propagate" to the child generation

  25. } Real-valued vectors consisting of two parts: ◦ Object variable: just like a real-valued GA individual ◦ Strategy variable: a set of standard deviations for the Gaussian mutation } This structure allows for "self-adaptation" of the mutation size ◦ Excellent feature for dynamically changing fitness landscapes

  26. } In machine learning we seek a good hypothesis } The hypothesis may be a rule, a neural network, a program, etc. } GAs and other EC methods can evolve rules, NNs, programs, etc. } Classifier systems (CFS) are the most explicit GA-based machine learning tool

  27. } Rule and message system ◦ if <condition> then <action> } Apportionment of credit system ◦ Based on a set of training examples ◦ Credit (fitness) given to rules that match the example ◦ Example: Bucket brigade (auctions for examples, winner takes all, existence taxes) } Genetic algorithm ◦ evolves a population of rules or a population of entire rule systems

  28. } Evolves a population of rules; the final population is used as the rule and message system } Diversity maintenance among rules is hard } If done well, converges faster } Need to specify how to use the rules to classify ◦ what if multiple rules match an example? ◦ exact matching only, or inexact matching allowed?

  29. } Each individual is a complete set of rules or complete solution } Avoids the hard credit assignment problem } Slow because of the complexity of the space

  30. } Classical EP evolves finite state machines (or similar structures) } Relies on mutation (no crossover) } Fitness based on training sequence(s) } Good for sequence problems (DNA) and prediction in time series

  31. } Add a state (with random transitions) } Delete a state (reassign state transitions) } Change an output symbol } Change a state transition } Change the start state
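The five FSM mutations can be sketched over a simple machine encoding. This encoding (a dict mapping `(state, input)` to `(output, next_state)`) and the input/output alphabets are hypothetical choices for illustration, not a standard EP representation:

```python
import random

INPUTS = ['0', '1']
OUTPUTS = ['a', 'b']

def random_fsm(n_states=3):
    trans = {(s, i): (random.choice(OUTPUTS), random.randrange(n_states))
             for s in range(n_states) for i in INPUTS}
    return {'n': n_states, 'start': 0, 'trans': trans}

def add_state(fsm):
    # Add a state with random transitions
    s = fsm['n']
    for i in INPUTS:
        fsm['trans'][(s, i)] = (random.choice(OUTPUTS), random.randrange(s + 1))
    fsm['n'] += 1

def delete_state(fsm):
    # Delete the last state and reassign transitions that pointed to it
    if fsm['n'] <= 1:
        return
    s = fsm['n'] - 1
    for i in INPUTS:
        del fsm['trans'][(s, i)]
    fsm['n'] -= 1
    for key, (out, nxt) in fsm['trans'].items():
        if nxt >= fsm['n']:
            fsm['trans'][key] = (out, random.randrange(fsm['n']))
    if fsm['start'] >= fsm['n']:
        fsm['start'] = 0

def change_output(fsm):
    # Change the output symbol of a random transition
    key = random.choice(list(fsm['trans']))
    _, nxt = fsm['trans'][key]
    fsm['trans'][key] = (random.choice(OUTPUTS), nxt)

def change_transition(fsm):
    # Redirect a random transition to a random state
    key = random.choice(list(fsm['trans']))
    out, _ = fsm['trans'][key]
    fsm['trans'][key] = (out, random.randrange(fsm['n']))

def change_start(fsm):
    fsm['start'] = random.randrange(fsm['n'])
```

An EP run would apply one or more of these mutations to each parent machine per generation and keep whichever machines score best on the training sequence(s).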

  32. } No specific representation } Similar to Evolution Strategies ◦ Most work in continuous optimization ◦ Self adaptation common } No crossover ever used!

