

SLIDE 1

Ant Colony Optimization and the Minimum Cut Problem

Timo Kötzing, Per Kristian Lehre, Frank Neumann, Pietro S. Oliveto

March 25, 2010

SLIDE 2

Ant Colony Optimization (ACO)

◮ We want to analyze the use of Ant Colony Optimization (ACO) for the Minimum Cut Problem.
◮ As input, the ACO algorithm gets a weighted undirected graph G on n vertices.
◮ The ACO algorithm iteratively computes partitions of G's vertices into two non-empty sets, one per iteration.
◮ The algorithm keeps track of the best-so-far candidate solution.
◮ We analyze the random variable counting the number of iterations required until an optimal solution is found.

2/8 Kötzing, Lehre, Neumann, Oliveto
ACO and MinCut


SLIDE 8

Idea for Constructing Solutions

◮ Idea for constructing solutions (Karger and Stein):
◮ Any forest of n − 2 edges constitutes a partition of the vertices into two non-empty sets (the vertex sets of the two trees).
◮ Karger and Stein give an algorithm with expected runtime O(n²).
◮ Our ACO algorithm lets ants choose n − 2 edges sequentially (without creating cycles) to build candidate solutions.
◮ The probability for an edge e to be picked depends on two values associated with that edge:
  ◮ its weight w(e); and
  ◮ the pheromone value τe on e.
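The contraction idea above can be sketched in a few lines. This is not the authors' algorithm, only a minimal illustration of the principle that choosing n − 2 cycle-free edge merges leaves exactly two vertex groups, which define a cut; it samples edges uniformly, whereas the construction on the slide biases the choice by weight and pheromone.

```python
import random

def karger_cut(n, edges):
    """One run of random contraction: merge endpoints of randomly
    chosen edges until only two vertex groups remain, then return
    the weight of the cut between them.
    `edges` is a list of (u, v, w) triples on vertices 0..n-1."""
    parent = list(range(n))

    def find(x):
        # union-find root lookup with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    merges = 0
    while merges < n - 2:
        u, v, w = random.choice(edges)   # uniform here; weighted in general
        ru, rv = find(u), find(v)
        if ru != rv:                     # skip self-loops: no cycles created
            parent[ru] = rv
            merges += 1

    # cut value: total weight of edges crossing the two groups
    return sum(w for u, v, w in edges if find(u) != find(v))
```

A single run only finds a minimum cut with some probability, so in practice one repeats the run and keeps the smallest cut seen.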


SLIDE 16

Pheromones

◮ Pheromones are additional information on the edges.
◮ A higher pheromone value on an edge e means that e is more likely to be chosen for the next solution.
◮ Initially, all pheromone values are the same.
◮ After each iteration, every edge used in the best-so-far solution gets pheromone value h.
◮ All other edges get pheromone value l.
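The two-level update rule above can be sketched as follows; the values of h and l here are illustrative placeholders, not taken from the talk.

```python
def update_pheromones(edges, best_solution, h=2.0, l=0.5):
    """Two-level pheromone update: edges in the best-so-far solution
    get the high value h, all other edges the low value l.
    `edges` and `best_solution` are collections of edge identifiers."""
    return {e: (h if e in best_solution else l) for e in edges}
```

Because every pheromone value is reset to either h or l, only the ratio h/l matters for how strongly the next ant is pulled toward the best-so-far solution.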


SLIDE 22

Heuristic Information vs. Pheromone Values

◮ Remember: The probability for an edge e to be picked depends on the two values w(e) and τe.
◮ How do we balance these two values?
◮ We use two parameters, α and β.
◮ For an edge e with associated pheromone value τe and weight w(e), the ant chooses e proportionally to τe^α · (w(e))^β.
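The selection rule τe^α · (w(e))^β amounts to a roulette-wheel draw over the feasible edges. A minimal sketch, with illustrative parameter names:

```python
import random

def choose_edge(candidates, tau, w, alpha=1.0, beta=1.0):
    """Pick one edge from `candidates` with probability proportional
    to tau[e]**alpha * w[e]**beta, the rule on the slide.
    `tau` and `w` map edge identifiers to pheromone and weight."""
    scores = [tau[e] ** alpha * w[e] ** beta for e in candidates]
    return random.choices(candidates, weights=scores, k=1)[0]
```

Setting α = 0 ignores pheromones (pure greedy bias by weight), while β = 0 ignores the weights, which matches the parameter regimes analyzed on the results slide.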


SLIDE 28

Results

Selection rule: the ant chooses edge e proportionally to τe^α · (w(e))^β.

We proved the following expected optimization times of ACO.

◮ If α = 0 and β = 1 (greedy only), O(n²).
◮ If α = 1 and β = 1 with constant pheromone bounds, times are still polynomially bounded.
◮ If α = 1 and β = 1 with at least linear pheromone bound ratio, times are not polynomially bounded.
◮ If β > 1, times are not polynomially bounded.
◮ If α = 1 and β = 0 for sensible pheromone bounds, times are again not polynomially bounded.


SLIDE 35

Conclusions

◮ Don't use an ACO algorithm to solve the Min-Cut Problem.
◮ ACO can simulate Karger and Stein's algorithm.
◮ We now understand better how ACO algorithms work.
◮ We now understand better how to analyze ACO algorithms.


SLIDE 40

Thank you.
