
Christian Jacob, University of Calgary — Emergent Computing — CPSC 565 — Winter 2003

Particle Swarm Optimization (PSO)

Adaptive Swarms for Optimization


Christian Jacob

Department of Computer Science University of Calgary

CPSC 565 — Winter 2003


PSO: An Overview

  • Developed by

    – Russ Eberhart, Purdue School of Engineering and Technology, Indianapolis
    – Jim Kennedy, Bureau of Labor Statistics, Washington, DC

  • A concept for optimizing non-linear functions using particle swarm methodology

  • Has roots in Artificial Life and Evolutionary Computation
  • Simple concept
  • Easy to implement
  • Computationally efficient
  • Effective on a wide variety of problems

Evolution of Concept and Paradigms

  • Discovered through simplified social model simulation
  • Related to bird flocking, fish schooling, and swarming theory
  • Related to evolutionary computation:

    – Genetic algorithms
    – Evolution strategies

  • Kennedy developed the “cornfield vector” for birds seeking food
  • The bird flock became a swarm
  • Expanded to multi-dimensional search
  • Incorporated acceleration by distance
  • Paradigm simplified

Flocks, Herds, and Schools

  • Separation: avoid collisions
  • Alignment: match neighbours’ velocity and orientation
  • Cohesion: steer toward the center


PSO Algorithm

  1. Initialize the population in hyperspace: stochastically assign locations and velocities.
  2. Evaluate the fitness of the individual particles.
  3. Keep track of the location where each individual had its highest fitness.
  4. Modify velocities based on the previous best and the global (or neighbourhood) best positions; neighbourhoods do not change.
  5. Terminate if some condition is met.
  6. Otherwise, go to step 2.

Fly solutions through problem space …
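The six steps above can be sketched as a minimal global-best PSO in Python. This is an illustrative sketch, not the course’s implementation: the sphere test function, all parameter values, and the function names are assumptions.

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, vmax=1.0):
    """Minimal global-best PSO following steps 1-6 (minimizes f)."""
    # 1. Initialize positions and velocities stochastically.
    x = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    v = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]            # 3. per-particle best positions
    pbest_val = [f(xi) for xi in x]        # 2. evaluate fitness
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):                 # 5./6. loop until the budget is spent
        for i in range(n_particles):
            for d in range(dim):
                # 4. modify velocity toward the personal and global best
                v[i][d] = (w * v[i][d]
                           + c1 * random.random() * (pbest[i][d] - x[i][d])
                           + c2 * random.random() * (gbest[d] - x[i][d]))
                v[i][d] = max(-vmax, min(vmax, v[i][d]))  # clamp to ±Vmax
                x[i][d] += v[i][d]
            val = f(x[i])                  # 2. re-evaluate fitness
            if val < pbest_val[i]:         # 3. update the personal best
                pbest[i], pbest_val[i] = x[i][:], val
                if val < gbest_val:        # update the global best
                    gbest, gbest_val = x[i][:], val
    return gbest, gbest_val

random.seed(1)
sphere = lambda point: sum(c * c for c in point)  # assumed test function
best, best_val = pso(sphere)
```

On the sphere function the swarm collapses quickly onto the origin; the seed is fixed only for reproducibility.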


Particle Swarm Optimization

DEMO


Particle Swarms

  • Sociocognitive space

    – High-dimensional
    – Abstract: attitudes, behaviours, cognition
    – Heterogeneous with respect to evaluation (dissonance)
    – Multiple individuals

  • An individual is characterized by

    – Position = “mental state”: xi
    – Changes = velocity: vi


Particle Swarms: “Code”

  • Individuals (particles) learn from their own experience:

    – vi := vi + φ() · (pi - xi)
    – xi := xi + vi

  • xi: current position of individual i
  • vi: current velocity of individual i
  • pi: so-far best position of individual i
  • (pi - xi): acceleration towards the previous best
  • φ(): generates a random positive number
  • This formula, iterated over time, causes the individual’s trajectory to oscillate around its previous best point pi in sociocognitive space.
  • The velocity of individual i is stochastically adjusted depending on previous successes.
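A tiny one-dimensional simulation of this pbest-only rule makes the oscillation visible. This is an illustrative sketch: the fixed pi, the starting point, and modelling φ() as a uniform random draw are all assumptions.

```python
import random

random.seed(3)
p_i = 2.0             # fixed previous-best position (assumed)
x_i, v_i = 0.0, 0.0   # current position and velocity
trajectory = []
for _ in range(200):
    v_i = v_i + random.uniform(0, 2) * (p_i - x_i)  # vi := vi + φ()·(pi - xi)
    x_i = x_i + v_i                                 # xi := xi + vi
    trajectory.append(x_i)

# The particle repeatedly overshoots p_i from both sides instead of settling on it.
above = sum(1 for x in trajectory if x > p_i)
below = sum(1 for x in trajectory if x < p_i)
```

The trajectory crosses pi again and again, which is exactly the oscillation the slide describes; without a limiting mechanism (see the “Drunkard’s Walk” slide) the amplitude can also grow.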


Particle Swarms with Neighbourhood Best

  • Sociocognitive space can contain many individuals that influence one another:

    – vi := vi + φ1() · (pi - xi) + φ2() · (pg - xi)
    – xi := xi + vi

  • pg: previous best position in the population
  • (pi - xi): acceleration towards the individual’s previous best
  • (pg - xi): acceleration towards the global best
  • φ1(), φ2(): generate random positive numbers
  • Evaluate your present position.
  • Compare it to your previous best and the neighbourhood best.
  • Imitate self and others.

General PSO Update Algorithm

  • Global version:

    – vid := w vid + c1 φ1() · (pid - xid) + c2 φ2() · (pgd - xid)
    – xid := xid + vid

  • d: dimension index
  • c1, c2: positive constants that set exploration vs. exploitation
  • w: inertia weight
  • φ1(), φ2(): generate random positive numbers
  • For the neighbourhood version, change pgd to pld.
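In the neighbourhood version, pld is the best previous position found within particle i’s neighbourhood. The slide does not fix a topology; a ring neighbourhood of radius k is a common assumption, and the sketch below (function name and example values are hypothetical) shows how the local-best index could be found.

```python
def local_best(pbest_val, i, k=1):
    """Index of the best (lowest) previous fitness within a ring
    neighbourhood of radius k around particle i; indices wrap around."""
    n = len(pbest_val)
    neighbours = [(i + off) % n for off in range(-k, k + 1)]
    return min(neighbours, key=lambda j: pbest_val[j])

# Example: fitness-so-far values of 5 particles arranged on a ring.
vals = [4.0, 1.0, 3.0, 0.5, 2.0]
local_best(vals, 0)  # neighbours {4, 0, 1} → 1
local_best(vals, 3)  # neighbours {2, 3, 4} → 3
```

With small k, good positions spread slowly around the ring, which tends to preserve diversity longer than the global-best version.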


The “Drunkard’s Walk”

  • The particle will explode out of control if it is not limited in some way. Three methods are widely used:

  • Vmax:

    vi := vi + φ1() · (pi - xi) + φ2() · (pg - xi)
    if vi > Vmax then vi := Vmax
    else if vi < -Vmax then vi := -Vmax

  • Inertia weight α:

    vi := α vi + φ1() · (pi - xi) + φ2() · (pg - xi)

  • Constriction coefficient χ:

    vi := χ (vi + φ1() · (pi - xi) + φ2() · (pg - xi))
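The three limiting methods might be sketched as follows. This is a hedged, one-dimensional sketch: φ() is modelled as a uniform random draw, and the function names are illustrative, not from the slides.

```python
import random

def phi(limit=1.0):
    """Stand-in for φ1(), φ2(): a random positive number."""
    return random.uniform(0, limit)

def vmax_update(v, x, p_i, p_g, vmax):
    """Method 1: unconstrained update, then clamp the velocity to ±Vmax."""
    v = v + phi() * (p_i - x) + phi() * (p_g - x)
    return max(-vmax, min(vmax, v))

def inertia_update(v, x, p_i, p_g, alpha):
    """Method 2: damp the previous velocity by the inertia weight α."""
    return alpha * v + phi() * (p_i - x) + phi() * (p_g - x)

def constriction_update(v, x, p_i, p_g, chi):
    """Method 3: scale the entire update by the constriction coefficient χ."""
    return chi * (v + phi() * (p_i - x) + phi() * (p_g - x))
```

Note the difference in scope: Vmax clips only the result, α damps only the old velocity, and χ scales the whole right-hand side including the attraction terms.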


Important Parameters: Vmax

  • An important parameter in PSO; typically the only one adjusted
  • Clamps the particles’ velocities on each dimension
  • Determines the “fineness” with which regions are searched:

    – If too high, particles can fly past optimal solutions
    – If too low, particles can get stuck in local minima

  • Set Vmax to the dynamic range of the variables.

Important Parameters: Inertia Weight α

  • Inertia weight α:

    vi := α vi + φ1() · (pi - xi) + φ2() · (pg - xi)

  • It seems possible to get rid of Vmax by setting α equal to the dynamic range of each variable.
  • α must then be selected carefully and/or decreased over the run.
  • Hence, the inertia weight α seems to have attributes of the temperature in simulated annealing.
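One common way to decrease the inertia weight over the run is a linear schedule. This is a sketch; the 0.9 → 0.4 endpoints are a frequent choice in the PSO literature, not taken from these slides.

```python
def inertia(t, t_max, a_start=0.9, a_end=0.4):
    """Linearly decrease the inertia weight from a_start to a_end,
    shifting from exploration early on to exploitation near the end."""
    return a_start - (a_start - a_end) * t / t_max

inertia(0, 100)    # 0.9: early, large steps, exploration
inertia(100, 100)  # 0.4: late, small steps, exploitation
```

This mirrors the annealing analogy on the slide: a high “temperature” (inertia) early in the run, cooled gradually toward the end.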


Evolutionary Computation & Particle Swarms

  • Culture as evolution (anthropology)
  • Adaptation / learning
  • Memetics
  • Evolutionary epistemology
  • Change vs. selection
  • Fitness and dissonance
  • Cooperation vs. competition
  • Evolution = competitive struggle
  • PS = cooperation inherent

PS = 5th EC paradigm?


Basic Principles of Swarm Intelligence

  • Proximity principle:

– The population should be able to carry out simple space and time computations.

  • Quality principle:

– The population should be able to respond to quality factors in the environment.

  • Diverse response principle:

– The population should not commit its activities along excessively narrow channels.

  • Stability principle:

– The population should not change its mode of behaviour every time the environment changes.

  • Adaptability principle:

– The population must be able to change its behaviour mode when it’s worth the computational price.


Adherence to Swarm Intelligence Principles

  • Proximity:

– N-dimensional space calculations carried out over a series of time steps

  • Quality:

– Population responds to quality factors pbest and gbest (or lbest)

  • Diverse response:

– Responses allocated between pbest and gbest (or lbest)

  • Stability:

– Population changes state only when gbest (or lbest) changes

  • Adaptability:

– Population does change state when gbest (or lbest) changes


Enhancements of PSO

  • The elitist concept from GAs might be helpful in PSO.

    – Carry the global best particle into the next generation?

  • A Gaussian distribution could be incorporated into the stochastic velocity changes.

    – The variance might then act like the inertia weight.
    – Put noise on the decrease of the inertia weights (better convergence).

  • Vmax could be assigned on a parameter-by-parameter basis.

    – Analogous to controlling the severity of mutation in GAs & EP.


GAs vs. PSO: Crossover

  • PSO does not have crossover.
  • Acceleration toward the personal and global best is a similar concept.
  • Particles midway between swarms also exhibit crossover-like features.
  • The recombination operator in evolution strategies may be more analogous.


GAs vs. PSO: Mutation

  • GAs are not actually ergodic:

    – A number of mutations is probably required.
    – Low-fitness individuals will not survive selection.
    – The probability of survival decreases geometrically with generations.

  • EP (for parameter optimization) is ergodic: it can reach any point in one jump.
  • PSO seems to fall between GA and EP:

    – Any particle can eventually go anywhere.

  • PSO’s mutation-like behaviour is directional:

    – GA and EP mutation are omni-directional.


GAs vs. PSO: Selection

  • GA selection supports survival of the fittest (when using an elitist strategy).
  • There is no selection in PSO:

    – All particles survive for the length of the run.
    – The number of particles does not change.

  • PSO is the only “evolutionary algorithm” that does not remove candidate population members.


PSO as an Evolutionary Algorithm

  • Distinctions among the EC paradigms continue to blur.
  • New hybrid PSO approaches will be emphasized.
  • Practical PSO applications will be emphasized, in addition to benchmarking.
  • Focus on how the PSO paradigms work.
  • Many more (PSO) hybrids to come …

PSO Example


References

  • Kennedy, J., and R. C. Eberhart (2001). Swarm Intelligence. San Francisco: Morgan Kaufmann Publishers.
  • Kennedy, J., and R. C. Eberhart (2002). Tutorial on Particle Swarm Optimization. 2002 World Congress on Computational Intelligence, Hawaii, USA.