


Chapter 5

Graph Algorithms

Algorithm Theory WS 2012/13 Fabian Kuhn


Graphs

Extremely important concept in computer science. Graph G = (V, E):

  • V: node (or vertex) set
  • E ⊆ V × V: edge set

  – Simple graph: no self‐loops, no multiple edges
  – Undirected graph: we often think of edges as sets of size 2 (e.g., {u, v})
  – Directed graph: edges are sometimes also called arcs
  – Weighted graph: (positive) weight on edges (or nodes)

  • (simple) path: sequence v_0, …, v_k of nodes such that

(v_i, v_{i+1}) ∈ E for all i ∈ {0, …, k − 1}

Many real‐world problems can be formulated as optimization problems on graphs


Graph Optimization: Examples

Minimum spanning tree (MST):

  • Compute a min. weight spanning tree of a weighted undirected graph

Shortest paths:

  • Compute (length) of shortest paths (single source, all pairs, …)

Traveling salesperson (TSP):

  • Compute shortest TSP path/tour in weighted graph

Vertex coloring:

  • Color the nodes such that neighbors get different colors
  • Goal: minimize the number of colors

Maximum matching:

  • Matching: set of pair‐wise non‐adjacent edges
  • Goal: maximize the size of the matching

Network Flow

Flow Network:

  • Directed graph G = (V, E), E ⊆ V × V
  • Each (directed) edge e has a capacity c(e) ≥ 0

  – Amount of flow (traffic) that the edge can carry

  • A single source node s ∈ V and a single sink node t ∈ V

Flow: (informally)

  • Traffic from s to t such that each edge carries at most its capacity

Examples:

  • Highway system: edges are highways, flow is the traffic
  • Computer network: edges are network links that can carry

packets, nodes are switches

  • Fluid network: edges are pipes that carry liquid

Network Flow: Definition

Flow: function f: E → ℝ≥0

  • f(e) is the amount of flow carried by edge e

Capacity Constraints:

  • For each edge e ∈ E, f(e) ≤ c(e)

Flow Conservation:

  • For each node v ∈ V ∖ {s, t},

∑_{e into v} f(e) = ∑_{e out of v} f(e)

Flow Value:

|f| ≔ ∑_{e out of s} f(e) − ∑_{e into s} f(e)

Example: Flow Network

[Figure: example flow network with edge capacities 20, 20, 10, 10, 30]


Notation

We define: f^in(v) ≔ ∑_{e into v} f(e), f^out(v) ≔ ∑_{e out of v} f(e)

  • For a set S ⊆ V:

f^in(S) ≔ ∑_{e into S} f(e), f^out(S) ≔ ∑_{e out of S} f(e)

  • Flow conservation: ∀v ∈ V ∖ {s, t}: f^in(v) = f^out(v)

Flow value: |f| = f^out(s) − f^in(s). For simplicity: Assume that all capacities are positive integers
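The constraints above can be checked mechanically. A minimal sketch, not from the slides (function and variable names are my own; edges and flows are dicts keyed by node pairs):

```python
from collections import defaultdict

def flow_value(capacity, f, s, t):
    """Verify capacity constraints and flow conservation for f, then
    return the flow value |f| = f_out(s) - f_in(s).
    capacity, f: dicts mapping an edge (u, v) to a number."""
    f_in, f_out = defaultdict(int), defaultdict(int)
    for (u, v), c in capacity.items():
        fe = f.get((u, v), 0)
        assert 0 <= fe <= c, "capacity constraint violated"
        f_out[u] += fe
        f_in[v] += fe
    for v in set(f_in) | set(f_out):
        if v not in (s, t):
            assert f_in[v] == f_out[v], "flow conservation violated"
    return f_out[s] - f_in[s]
```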


The Maximum‐Flow Problem

Maximum Flow: Given a flow network, find a flow of maximum possible value

  • Classical graph optimization problem
  • Many applications (also beyond the obvious ones)
  • Requires new algorithmic techniques

Maximum Flow: Greedy?

Does greedy work? A natural greedy algorithm:

  • As long as possible, find an s‐t‐path with free capacity and add as much flow as possible to the path

[Figure: example flow network with edge capacities 20, 20, 10, 10, 30]


Improving the Greedy Solution

  • Try to push 10 more units of flow on an edge out of s
  • Too much incoming flow at the intermediate node: reduce the flow on one of its incoming edges
  • Add that flow on another outgoing edge instead

[Figure: example flow network with edge capacities 20, 20, 10, 10, 30]


Residual Graph

Given a flow network G = (V, E) with capacities c(e) (for e ∈ E). For a flow f on G, define the directed residual graph

G_f = (V, E_f) as follows:

  • Node set V (as in G)
  • For each edge e = (u, v) in G, there are two edges in E_f:

  – forward edge (u, v) with residual capacity c(e) − f(e)
  – backward edge (v, u) with residual capacity f(e)

[Figure: example flow network with edge capacities 20, 20, 10, 10, 30]
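The forward/backward edge construction translates directly into code; a small sketch (names are mine; residual capacities of coinciding entries are accumulated):

```python
def residual_graph(capacity, f):
    """Build G_f: for each edge (u, v) of G, a forward edge (u, v) with
    residual capacity c(e) - f(e) and a backward edge (v, u) with
    residual capacity f(e). Returns a dict (u, v) -> residual capacity."""
    rg = {}
    for (u, v), c in capacity.items():
        fe = f.get((u, v), 0)
        rg[(u, v)] = rg.get((u, v), 0) + (c - fe)  # forward edge
        rg[(v, u)] = rg.get((v, u), 0) + fe        # backward edge
    return rg
```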


Residual Graph: Example

[Figure: the example flow network with its edge capacities]


Residual Graph: Example

Flow

[Figure: the example network with flow values on the edges]


Residual Graph: Example

Residual Graph

[Figure: the residual graph with residual capacities]


Augmenting Path

Residual Graph

[Figure: the residual graph with residual capacities; an augmenting path is highlighted]


Augmenting Path

Augmenting Path

[Figure: the example network with the augmenting path highlighted]


Augmenting Path

New Flow

[Figure: the example network with the new (augmented) flow values]


Augmenting Path

Definition: An augmenting path P is a (simple) s‐t‐path in the residual graph G_f on which each edge has residual capacity > 0.

bottleneck(P, f): minimum residual capacity on any edge of the augmenting path P

Augment flow f to get flow f′:

  • For every forward edge (u, v) on P:

f′(u, v) ≔ f(u, v) + bottleneck(P, f)

  • For every backward edge (u, v) on P:

f′(v, u) ≔ f(v, u) − bottleneck(P, f)


Augmented Flow

Lemma: Given a flow f and an augmenting path P, the resulting augmented flow f′ is legal and its value is |f′| = |f| + bottleneck(P, f). Proof:


Augmented Flow

Lemma: Given a flow f and an augmenting path P, the resulting augmented flow f′ is legal and its value is |f′| = |f| + bottleneck(P, f). Proof:


Ford‐Fulkerson Algorithm

  • Improve the flow using an augmenting path as long as possible:

1. Initially, f(e) = 0 for all edges e ∈ E;
2. while there is an augmenting s‐t‐path in G_f do
3.   let P be an augmenting s‐t‐path in G_f;
4.   f′ ≔ augment(f, P);
5.   update f to be f′;
6.   update the residual graph G_f
7. end;
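The loop above can be made runnable; a sketch with my own names and data layout, using BFS to find augmenting paths (which makes it the Edmonds–Karp variant rather than an arbitrary-path Ford‐Fulkerson):

```python
from collections import deque

def ford_fulkerson(capacity, s, t):
    """capacity: dict (u, v) -> integer capacity. Returns |f|."""
    residual = {}            # residual[u][v] = residual capacity of (u, v)
    for (u, v), c in capacity.items():
        residual.setdefault(u, {})
        residual.setdefault(v, {})
        residual[u][v] = residual[u].get(v, 0) + c
        residual[v].setdefault(u, 0)    # backward edge, initially 0
    value = 0
    while True:
        # BFS for an augmenting s-t-path with positive residual capacities
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, rc in residual.get(u, {}).items():
                if rc > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:             # no augmenting path left: done
            return value
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        b = min(residual[u][v] for u, v in path)   # bottleneck(P, f)
        for u, v in path:               # augment f along P
            residual[u][v] -= b
            residual[v][u] += b
        value += b
```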


Ford‐Fulkerson Running Time

Theorem: If all edge capacities are integers, the Ford‐Fulkerson algorithm terminates after at most C iterations, where

C ≔ ∑_{e out of s} c(e).

Proof:


Ford‐Fulkerson Running Time

Theorem: If all edge capacities are integers, the Ford‐Fulkerson algorithm can be implemented to run in O(mC) time. Proof:


s‐t Cuts

Definition: An s‐t cut is a partition (A, B) of the vertex set such that s ∈ A and t ∈ B

[Figure: the example flow network with an s‐t cut]


Cut Capacity

Definition: The capacity c(A, B) of an s‐t‐cut (A, B) is defined as

c(A, B) ≔ ∑_{e out of A} c(e).

[Figure: the example network with a cut and its capacity]


Cuts and Flow Value

Lemma: Let f be any s‐t flow and (A, B) any s‐t cut. Then |f| = f^out(A) − f^in(A). Proof:


Cuts and Flow Value

Lemma: Let f be any s‐t flow and (A, B) any s‐t cut. Then |f| = f^out(A) − f^in(A). Lemma: Let f be any s‐t flow and (A, B) any s‐t cut. Then |f| = f^in(B) − f^out(B). Proof:


Upper Bound on Flow Value

Lemma: Let f be any s‐t flow and (A, B) any s‐t cut. Then |f| ≤ c(A, B). Proof:


Ford‐Fulkerson Gives Optimal Solution

Lemma: If f is an s‐t flow such that there is no augmenting path in G_f, then there is an s‐t cut (A*, B*) in G for which |f| = c(A*, B*). Proof:

  • Define A*: set of nodes that can be reached from s on a path with positive residual capacities in G_f

  • For B* = V ∖ A*, (A*, B*) is an s‐t cut

  – By definition, s ∈ A* and t ∉ A*


Ford‐Fulkerson Gives Optimal Solution

Lemma: If f is an s‐t flow such that there is no augmenting path in G_f, then there is an s‐t cut (A*, B*) in G for which |f| = c(A*, B*). Proof:


Ford‐Fulkerson Gives Optimal Solution

Lemma: If f is an s‐t flow such that there is no augmenting path in G_f, then there is an s‐t cut (A*, B*) in G for which |f| = c(A*, B*). Proof:


Ford‐Fulkerson Gives Optimal Solution

Theorem: The flow returned by the Ford‐Fulkerson algorithm is a maximum flow. Proof:


Min‐Cut Algorithm

Ford‐Fulkerson also gives a min‐cut algorithm: Theorem: Given a flow f of maximum value, we can compute an s‐t cut of minimum capacity in O(m) time. Proof:


Max‐Flow Min‐Cut Theorem

Theorem (Max‐Flow Min‐Cut Theorem): In every flow network, the maximum value of an s‐t flow is equal to the minimum capacity of an s‐t cut. Proof:


Integer Capacities

Theorem: (Integer‐Valued Flows) If all capacities in the flow network are integers, then there is a maximum flow for which the flow of every edge is an integer. Proof:


Non‐Integer Capacities

What if capacities are not integers?

  • rational capacities:

– can be turned into integers by multiplying them with a large enough integer – the algorithm still works correctly

  • real (non‐rational) capacities:

– not clear whether the algorithm always terminates

  • even for integer capacities, the running time can depend linearly on the value of the maximum flow

Slow Execution

  • Number of iterations: 2000 (value of max. flow)

[Figure: network with edge capacities 1000, 1000, 1000, 1000, 1]


Improved Algorithm

Idea: Find the best augmenting path in each step

  • best: path with maximum bottleneck capacity
  • Best path might be rather expensive to find

 find an almost best path

  • Scaling parameter Δ:

(initially, Δ = max capacity out of s, rounded down to the next power of 2)

  • As long as there is an augmenting path that improves the flow by at least Δ, augment using such a path

  • If there is no such path: Δ ≔ Δ/2
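A runnable sketch of this capacity-scaling idea (structure and names are my own; the initial Δ follows the rule stated above):

```python
from collections import deque

def scaling_max_flow(capacity, s, t):
    """Only augment along s-t-paths whose edges all have residual
    capacity >= delta; halve delta when no such path remains."""
    residual = {}
    for (u, v), c in capacity.items():
        residual.setdefault(u, {})
        residual.setdefault(v, {})
        residual[u][v] = residual[u].get(v, 0) + c
        residual[v].setdefault(u, 0)

    def find_path(delta):
        # BFS using only edges with residual capacity >= delta
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, rc in residual.get(u, {}).items():
                if rc >= delta and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return None
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        return path

    # initial Delta: max capacity out of s, rounded down to a power of 2
    cmax = max(residual.get(s, {}).values(), default=0)
    delta = 1
    while 2 * delta <= cmax:
        delta *= 2
    value = 0
    while delta >= 1:
        path = find_path(delta)
        while path is not None:
            b = min(residual[u][v] for u, v in path)
            for u, v in path:
                residual[u][v] -= b
                residual[v][u] += b
            value += b
            path = find_path(delta)
        delta //= 2          # no Delta-augmenting path left: halve Delta
    return value
```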


Scaling Parameter Analysis

Lemma: If all capacities are integers, the number of different scaling parameters used is at most 1 + ⌊log₂ C⌋.

  • Δ‐scaling phase: time during which the scaling parameter is Δ

Length of a Scaling Phase

Lemma: If f is the flow at the end of the Δ‐scaling phase, the maximum flow in the network has value at most |f| + mΔ.


Length of a Scaling Phase

Lemma: The number of augmentations in each scaling phase is at most 2m.


Running Time: Scaling Max Flow Alg.

Theorem: The number of augmentations of the algorithm with scaling parameter and integer capacities is at most O(m log C). The algorithm can be implemented in time O(m² log C).


Strongly Polynomial Algorithm

  • Time of the regular Ford‐Fulkerson algorithm with integer capacities: O(mC)
  • Time of the algorithm with scaling parameter: O(m² log C)
  • m² log C is polynomial in the size of the input, but not in n and m
  • Can we get an algorithm that runs in time polynomial in n and m?
  • Always picking a shortest augmenting path leads to running time O(m²n)

Preflow‐Push Max‐Flow Algorithm

Definition: An s‐t preflow is a function f: E → ℝ≥0 such that

  • For each edge e ∈ E: f(e) ≤ c(e)
  • For each node v ∈ V ∖ {s}: f^in(v) ≥ f^out(v)
  • Excess of node v: ex_f(v) ≔ f^in(v) − f^out(v)
  • A preflow with excess ex_f(v) = 0 for all v ∉ {s, t} is a flow with value |f| = ex_f(t).


Preflows and Labelings

Height function h: V → ℤ≥0

  • Assigns an integer height h(v) to each node v ∈ V

Source and Sink Conditions:

  • h(s) = n and h(t) = 0

Steepness Condition:

  • For all edges (u, v) of G_f with residual capacity > 0:

h(u) ≤ h(v) + 1

(residual graph for a preflow defined as before for flows)

  • A preflow f and a labeling h are called compatible if the source, sink, and steepness conditions are satisfied


Compatible Labeling

[Figure: nodes arranged by height 1–8; arrows: edges of G_f with positive residual capacity]


Preflows with Compatible Labelings

Lemma: If a preflow f is compatible with a labeling h, then there is no s‐t path in G_f with only positive residual capacities.


Flows with Compatible Labelings

Lemma: If an s‐t flow f is compatible with a labeling h, then f is a flow of maximum value.


Turning a Preflow into a Flow

Algorithm Idea:

  • Start with a preflow and a compatible labeling
  • Extend the preflow while keeping a compatible labeling
  • As soon as f is a flow (all nodes except s, t have excess 0), f is a maximum flow

Initialization:

  • Labeling h: h(s) = n, h(v) = 0 for all v ≠ s
  • Preflow f:

  – Edges (s, v) out of s need residual capacity 0: f(s, v) = c(s, v)
  – Preflow on other edges does not matter: f(e) = 0


Initialization

Initial labeling h: h(s) = n, h(v) = 0 for v ≠ s. Initial preflow f: edges e out of s: f(e) = c(e), other edges e: f(e) = 0. Claim: Initial labeling and preflow are compatible.


Push

Consider some node u with excess ex_f(u) > 0:

  • Assume u has a neighbor v in the residual graph G_f such that the edge (u, v) has positive residual capacity and h(u) > h(v): push flow from u to v

  • If (u, v) is a forward edge: increase f(u, v) by min{ex_f(u), c(u, v) − f(u, v)}
  • If (u, v) is a backward edge: decrease f(v, u) by min{ex_f(u), f(v, u)}

Relabel

Consider some node u with excess ex_f(u) > 0:

  • Assume that it is not possible to push flow to a neighbor in G_f:

For all edges (u, v) in G_f with positive residual capacity, we have h(u) ≤ h(v). relabel u: h(u) ≔ h(u) + 1


Preflow‐Push Algorithm

  • As long as there is a node u with excess ex_f(u) > 0: if possible, do a push operation from u to a neighbor, otherwise relabel u

Lemma: Throughout the Preflow‐Push Algorithm:
i. Labels are non‐negative integers
ii. If capacities are integers, f is an integer preflow
iii. The preflow f and the labeling h are compatible

If the algorithm terminates, f is a maximum flow.
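The generic loop can be sketched directly in code; a minimal version with my own names (an arbitrary active node is processed; relabel increases h(u) by 1, as above):

```python
def preflow_push(capacity, s, t):
    """capacity: dict (u, v) -> integer capacity. Returns the maximum
    flow value, i.e. the excess that ends up at the sink t."""
    residual, nodes = {}, set()
    for (u, v), c in capacity.items():
        residual.setdefault(u, {})
        residual.setdefault(v, {})
        residual[u][v] = residual[u].get(v, 0) + c
        residual[v].setdefault(u, 0)
        nodes |= {u, v}
    n = len(nodes)
    height = {v: 0 for v in nodes}
    height[s] = n                           # h(s) = n, h(t) = 0
    excess = {v: 0 for v in nodes}
    for v, rc in list(residual[s].items()):  # saturate all edges out of s
        if rc > 0:
            residual[s][v] = 0
            residual[v][s] += rc
            excess[v] += rc
    active = [v for v in nodes - {s, t} if excess[v] > 0]
    while active:
        u = active[-1]
        for v, rc in residual[u].items():
            # push: positive residual capacity and u is one level higher
            if rc > 0 and height[u] == height[v] + 1:
                b = min(excess[u], rc)
                residual[u][v] -= b
                residual[v][u] += b
                excess[u] -= b
                excess[v] += b
                if v not in (s, t) and v not in active:
                    active.append(v)
                break
        else:
            height[u] += 1                  # relabel u
        if excess[u] == 0:
            active.remove(u)
    return excess[t]
```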


Properties of Preflows

Lemma: If f is a preflow and node v has excess ex_f(v) > 0, then there is a path with positive residual capacities in G_f from v to s.


Heights

Lemma: During the algorithm, all nodes v have h(v) ≤ 2n − 1.


Number of Relabelings

Lemma: During the algorithm, each node is relabeled at most 2n − 1 times.

  • Hence: total number of relabeling operations < 2n²

Number of Saturating Push Operations

  • A push operation on edge (u, v) is called saturating if:

  – (u, v) is a forward edge and after the push, f(u, v) = c(u, v)
  – (u, v) is a backward edge and after the push, f(v, u) = 0

Lemma: The number of saturating push operations is at most 2nm.


Number of Non‐Saturating Push Ops.

Lemma: There are O(n²m) non‐saturating push operations. Proof:

  • Potential function:

Φ(f, h) ≔ ∑_{v : ex_f(v) > 0} h(v)

  • At all times, Φ(f, h) ≥ 0
  • Non‐saturating push on (u, v):

  – Before the push: ex_f(u) > 0, after the push: ex_f(u) = 0
  – The push might increase ex_f(v) from 0 to > 0, and h(v) = h(u) − 1
   the push decreases Φ(f, h) by at least 1

  • Relabel: increases Φ(f, h) by 1
  • Saturating push on (u, v): ex_f(v) might be positive afterwards

   Φ(f, h) increases by at most h(v) ≤ 2n − 1


Number of Non‐Saturating Push Ops.

Lemma: There are O(n²m) non‐saturating push operations. Proof:

  • Potential function Φ(f, h) ≥ 0, initially Φ = 0
  • Each non‐saturating push decreases Φ(f, h) by at least 1
  • Each relabel increases Φ(f, h) by 1
  • Each saturating push increases Φ(f, h) by at most 2n − 1

Preflow‐Push Algorithm

Theorem: The preflow‐push algorithm computes a maximum flow after at most O(n²m) push and relabel operations.


Choosing a Maximum Height Node

Lemma: If we always choose a node with ex_f(v) > 0 at maximum height, there are at most O(n³) non‐saturating push operations. Proof: New potential function:

H ≔ max_{v : ex_f(v) > 0} h(v)


Choosing a Maximum Height Node

Lemma: If we always choose a node with ex_f(v) > 0 at maximum height, there are at most O(n³) non‐saturating push operations. Proof: New potential function:

H ≔ max_{v : ex_f(v) > 0} h(v)


Improved Preflow‐Push Algorithm

Theorem: If we always use a maximum height node with positive excess, the preflow‐push algorithm computes a maximum flow after at most O(n³) push and relabel operations. Theorem: The preflow‐push algorithm that always chooses a maximum height node with positive excess can be implemented in time O(n³). Proof: see exercises


Maximum Flow Applications

  • Maximum flow has many applications
  • Reducing a problem to a max flow problem can even be seen as

an important algorithmic technique

  • Examples:

– related network flow problems – computation of small cuts – computation of matchings – computing disjoint paths – scheduling problems – assignment problems with some side constraints – …


Undirected Edges and Vertex Capacities

Undirected Edges:

  • Undirected edge {u, v}: add edges (u, v) and (v, u) to the network

Vertex Capacities:

  • Not only edges, but also (or only) nodes have capacities
  • Capacity c(v) of node v ∉ {s, t}: f^in(v) = f^out(v) ≤ c(v)
  • Replace node v by an edge (v_in, v_out) of capacity c(v):
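The node-splitting step can be written as a transformation on the capacity map; a sketch (the `_in`/`_out` suffix convention and the dict representation are my own):

```python
INF = float('inf')

def split_nodes(capacity, node_cap, s, t):
    """Replace every node v by an edge v_in -> v_out with capacity c(v)
    (infinite if v has no capacity); redirect each edge (u, v) to
    (u_out, v_in). Returns (new capacities, new source, new sink)."""
    nodes = {u for e in capacity for u in e}
    cap2 = {(v + '_in', v + '_out'): node_cap.get(v, INF) for v in nodes}
    for (u, v), c in capacity.items():
        cap2[(u + '_out', v + '_in')] = c
    return cap2, s + '_in', t + '_out'
```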

Minimum s‐t Cut

Given: undirected graph G = (V, E), nodes s, t ∈ V. s‐t cut: partition (A, B) of V such that s ∈ A, t ∈ B. Size of cut (A, B): number of edges crossing the cut. Objective: find an s‐t cut of minimum size.


Edge Connectivity

Definition: A graph G = (V, E) is k‐edge connected for an integer k ≥ 1 if the graph (V, E ∖ X) is connected for every edge set X ⊆ E, |X| ≤ k − 1. Goal: Compute the edge connectivity k of G (and an edge set of size k that divides G into 2 parts)

  • a minimum such edge set is a minimum s‐t cut for some s, t ∈ V

  – Actually for all s, t in different components of (V, E ∖ X)

  • Possible algorithm: fix s and find a min s‐t cut for all t ≠ s

Minimum s‐t Vertex‐Cut

Given: undirected graph G = (V, E), nodes s, t ∈ V. s‐t vertex cut: set X ⊂ V such that s, t ∉ X and s and t are in different components of the sub‐graph G ∖ X induced by V ∖ X. Size of vertex cut: |X|. Objective: find an s‐t vertex‐cut of minimum size.

  • Replace each undirected edge {u, v} by (u, v) and (v, u)
  • Compute a max s‐t flow for edge capacities ∞ and node capacities c(v) = 1 for v ≠ s, t
  • Replace each node v by v_in and v_out:
  • A min edge cut corresponds to a min vertex cut in G

Vertex Connectivity

Definition: A graph G = (V, E) is k‐vertex connected for an integer k ≥ 1 if the sub‐graph induced by V ∖ X is connected for every node set X ⊆ V, |X| ≤ k − 1. Goal: Compute the vertex connectivity of G (and a node set of size k that divides G into 2 parts)

  • Compute a minimum s‐t vertex cut for fixed s and all t

Edge‐Disjoint Paths

Given: Graph G = (V, E) with nodes s, t ∈ V. Goal: Find as many edge‐disjoint s‐t paths as possible. Solution:

  • Find a max s‐t flow in G with edge capacities c(e) = 1 for all e ∈ E

Flow induces edge‐disjoint paths:

  • Integral capacities  can compute integral max flow
  • Get edge‐disjoint paths by greedily picking them
  • Correctness follows from flow conservation
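Peeling the paths out of an integral unit-capacity flow can be done greedily, as the last bullets note; a sketch, assuming the flow is given as the set of edges carrying one unit and contains no flow cycles (names are mine):

```python
def extract_paths(flow_edges, s, t):
    """flow_edges: edges (u, v) carrying one unit of an integral flow
    with unit capacities. Greedily peels off edge-disjoint s-t paths;
    flow conservation guarantees every walk started at s reaches t."""
    out = {}
    for u, v in flow_edges:
        out.setdefault(u, []).append(v)
    paths = []
    while out.get(s):
        path, u = [s], s
        while u != t:
            u = out[u].pop()    # follow and consume one outgoing unit
            path.append(u)
        paths.append(path)
    return paths
```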

Vertex‐Disjoint Paths

Given: Graph G = (V, E) with nodes s, t ∈ V. Goal: Find as many internally vertex‐disjoint s‐t paths as possible. Solution:

  • Find a max s‐t flow in G with node capacities c(v) = 1 for all v ∈ V ∖ {s, t}

Flow induces vertex‐disjoint paths:

  • Integral capacities  can compute integral max flow
  • Get vertex‐disjoint paths by greedily picking them
  • Correctness follows from flow conservation

Menger’s Theorem

Theorem (edge version): For every graph G = (V, E) with nodes s, t ∈ V, the size of the minimum s‐t (edge) cut equals the maximum number of pairwise edge‐disjoint paths from s to t. Theorem (node version): For every graph G = (V, E) with nodes s, t ∈ V, the size of the minimum s‐t vertex cut equals the maximum number of pairwise internally vertex‐disjoint paths from s to t.

  • Both versions can be seen as a special case of the max‐flow min‐cut theorem


Baseball Elimination

  • Only wins/losses possible (no ties), winner: team with most wins
  • Which teams can still win (at least as many wins as the top team)?
  • Boston is eliminated (cannot win):

  – Boston can get at most 79 wins, New York already has 81 wins

  • If for teams i, j: wᵢ + rᵢ < w_j (wᵢ: wins, rᵢ: games left to play):

 team i is eliminated

  • Sufficient condition, but not a necessary one!

Team       Wins  Losses  To Play | NY  Balt.  T. Bay  Tor.  Bost.
New York    81     70      11    |  ‐    2      4      2     3
Baltimore   79     77       6    |  2    ‐      2      1     1
Tampa Bay   78     76       8    |  4    2      ‐      1     1
Toronto     76     80       6    |  2    1      1      ‐     2
Boston      72     83       7    |  3    1      1      2     ‐


Baseball Elimination

  • Can Toronto still finish first?
  • Toronto can get 76 + 6 = 82 > 81 wins, but:

NY and Tampa have to play 4 more times against each other  if NY wins one, it gets 82 wins; otherwise, Tampa has 82 wins

  • Hence: Toronto cannot finish first
  • How about the others? How can we solve this in general?

(standings table as above)


Max Flow Formulation

  • Can team 3 finish with most wins?
  • Team 3 can finish first iff all source‐game edges are saturated

[Figure: flow network — the source is connected to the game nodes 1‐2, 1‐4, 1‐5, 2‐4, 2‐5, 4‐5 (capacity: remaining number of games between the 2 teams); each game node is connected to its two team nodes 1, 2, 4, 5; team node i is connected to the sink with capacity: number of wins team i can have and still not beat team 3]


Reason for Elimination

  • Detroit could finish with 49 + 27 = 76 wins
  • Consider R = {NY, Bal, Bos, Tor}

  – Have together already won 278 games
  – Must together win at least 27 more games (the games among themselves)

  • On average, the teams in R win (278 + 27)/4 = 76.25 games

Team       Wins  Losses  To Play | NY  Balt.  Bost.  Tor.  Detr.
New York    75     59      28    |  ‐    3      8      7     3
Baltimore   71     63      28    |  3    ‐      2      7     4
Boston      69     66      27    |  8    2      ‐
Toronto     63     72      27    |  7    7             ‐
Detroit     49     86      27    |  3    4                    ‐

AL East: Aug 30, 1996


Reason for Elimination

Certificate of elimination: R ⊆ X (a set of teams), with

w(R) ≔ ∑_{i∈R} wᵢ   (#wins of the teams in R)

r(R) ≔ ∑_{i,j∈R} r_{i,j}   (#remaining games among the teams in R)

Team x ∈ X is eliminated by R if

(w(R) + r(R)) / |R| > w_x + r_x.
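The certificate condition is plain arithmetic; a sketch checking it against the AL East example of the previous slide (function and variable names are mine):

```python
def eliminated_by(R, wins, togo, remaining, x):
    """Team x is eliminated by R if (w(R) + r(R)) / |R| > w_x + r_x.
    remaining: dict (i, j) -> games left between teams i and j."""
    wR = sum(wins[i] for i in R)
    rR = sum(g for (i, j), g in remaining.items() if i in R and j in R)
    return (wR + rR) / len(R) > wins[x] + togo[x]
```

With the 1996 AL East data, R = {NY, Bal, Bos, Tor} yields (278 + 27)/4 = 76.25 > 76 = 49 + 27, certifying Detroit's elimination.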


Reason for Elimination

Theorem: Team x is eliminated if and only if there exists a subset R ⊆ X of the teams such that x is eliminated by R. Proof Idea:

  • Minimum cut gives a certificate…
  • If x is eliminated, the max flow solution does not saturate all outgoing edges of the source.
  • Team nodes of unsaturated source‐game edges are saturated
  • The source side of the min cut contains all teams of saturated team‐dest. edges of unsaturated source‐game edges
  • The set of team nodes on the source side of the min cut gives a certificate

Circulations with Demands

Given: Directed network with positive edge capacities Sources & Sinks: Instead of one source and one destination, several sources that generate flow and several sinks that absorb flow. Supply & Demand: sources have supply values, sinks demand values Goal: Compute a flow such that source supplies and sink demands are exactly satisfied

  • The circulation problem is a feasibility rather than a maximization

problem


Circulations with Demands: Formally

Given: Directed network G = (V, E) with

  • Edge capacities c(e) > 0 for all e ∈ E
  • Node demands d(v) ∈ ℝ for all v ∈ V

  – d(v) > 0: node needs flow and therefore is a sink
  – d(v) < 0: node has a supply of −d(v) and is therefore a source
  – d(v) = 0: node is neither a source nor a sink

Flow: Function f: E → ℝ≥0 satisfying

  • Capacity Conditions: ∀e ∈ E: 0 ≤ f(e) ≤ c(e)
  • Demand Conditions: ∀v ∈ V: f^in(v) − f^out(v) = d(v)

Objective: Does a flow satisfying all conditions exist? If yes, find such a flow f.


Example

[Figure: example circulation network with node demands and edge capacities]


Condition on Demands

Claim: If there exists a feasible circulation with demands d(v) for v ∈ V, then ∑_v d(v) = 0. Proof:

  • ∑_v d(v) = ∑_v (f^in(v) − f^out(v))
  • f(e) of each edge e appears twice in the above sum with different signs  the overall sum is 0

Total supply = total demand: Define

D ≔ ∑_{v : d(v) > 0} d(v) = ∑_{v : d(v) < 0} −d(v)


Reduction to Maximum Flow

  • Add a “super‐source” s* and a “super‐sink” t* to the network

[Figure: network with s* and t* added; s* supplies the sources with flow, t* siphons flow out of the sinks]

Example

[Figure: the example circulation network with s* and t* added, and the capacities of the new edges]


Formally…

Reduction: Get graph G′ from graph G as follows

  • Node set of G′ is V ∪ {s*, t*}
  • Edge set E′ is E and the edges

  – (s*, v) for all v with d(v) < 0; the capacity of the edge is −d(v)
  – (v, t*) for all v with d(v) > 0; the capacity of the edge is d(v)

Observations:

  • Capacity of a min s*‐t* cut is at most D (e.g., the cut ({s*}, V ∪ {t*}))
  • A feasible circulation on G can be turned into a feasible flow of value D of G′ by saturating all (s*, v) and (v, t*) edges.
  • Any flow of G′ of value D induces a feasible circulation on G

  – (s*, v) and (v, t*) edges are saturated
  – By removing these edges, we get exactly the demand constraints
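Constructing G′ is mechanical; a sketch (the names s*, t* follow the slides, the dict representation is mine):

```python
def circulation_to_maxflow(capacity, demand):
    """Build G' per the reduction above: edges (s*, v) with capacity
    -d(v) for supplies, (v, t*) with capacity d(v) for demands.
    Returns (capacities of G', D); a feasible circulation exists iff
    the max s*-t* flow in G' has value D."""
    cap = dict(capacity)
    D = 0
    for v, d in demand.items():
        if d < 0:
            cap[('s*', v)] = -d
        elif d > 0:
            cap[(v, 't*')] = d
            D += d
    return cap, D
```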


Circulation with Demands

Theorem: There is a feasible circulation with demands d(v), v ∈ V on graph G if and only if there is a flow of value D on G′.

  • If all capacities and demands are integers, there is an integer circulation

The max‐flow min‐cut theorem also implies the following: Theorem: The graph G has a feasible circulation with demands d(v), v ∈ V if and only if for all cuts (A, B):

∑_{v∈B} d(v) ≤ c(A, B).


Circulation: Demands and Lower Bounds

Given: Directed network G = (V, E) with

  • Edge capacities c(e) > 0 and lower bounds ℓ(e) ≥ 0 for e ∈ E
  • Node demands d(v) ∈ ℝ for all v ∈ V

  – d(v) > 0: node needs flow and therefore is a sink
  – d(v) < 0: node has a supply of −d(v) and is therefore a source
  – d(v) = 0: node is neither a source nor a sink

Flow: Function f: E → ℝ≥0 satisfying

  • Capacity Conditions: ∀e ∈ E: ℓ(e) ≤ f(e) ≤ c(e)
  • Demand Conditions: ∀v ∈ V: f^in(v) − f^out(v) = d(v)

Objective: Does a flow satisfying all conditions exist? If yes, find such a flow f.


Solution Idea

  • Define an initial circulation f₀(e) ≔ ℓ(e)

Satisfies the capacity constraints: ∀e ∈ E: ℓ(e) ≤ f₀(e) ≤ c(e)

  • Define L(v) ≔ f₀^in(v) − f₀^out(v) = ∑_{e into v} ℓ(e) − ∑_{e out of v} ℓ(e)

  • If L(v) = d(v), the demand condition is satisfied at v by f₀; otherwise, we need to superimpose another circulation f₁ such that

f₁^in(v) − f₁^out(v) = d(v) − L(v)

  • Remaining capacity of edge e:

c₁(e) ≔ c(e) − ℓ(e)

  • We get a circulation problem with new demands d(v) − L(v), new capacities c₁(e), and no lower bounds


Eliminating a Lower Bound: Example

[Figure: example — eliminating a lower bound of 2 on one edge adjusts the demands of its two endpoints and reduces the edge capacity by 2]


Reduce to Problem Without Lower Bounds

Graph G = (V, E):

  • Capacity: For each edge e ∈ E: ℓ(e) ≤ f(e) ≤ c(e)
  • Demand: For each node v ∈ V: f^in(v) − f^out(v) = d(v)

Model lower bounds with supplies & demands: Create network G′ (without lower bounds):

  • For each edge e ∈ E: c′(e) ≔ c(e) − ℓ(e)
  • For each node v ∈ V: d′(v) ≔ d(v) − L(v)
  • Flow: f′(e) ≔ f(e) − ℓ(e)
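The transformation to G′ in code form; a sketch (dict-based representation and names are mine):

```python
def eliminate_lower_bounds(capacity, lower, demand):
    """c'(e) = c(e) - l(e); d'(v) = d(v) - L(v), where L(v) is the net
    lower-bound flow into v. Returns (c', d')."""
    cap2 = {e: c - lower.get(e, 0) for e, c in capacity.items()}
    d2 = dict(demand)
    for (u, v), lb in lower.items():
        d2[v] = d2.get(v, 0) - lb   # lb units already arrive at v
        d2[u] = d2.get(u, 0) + lb   # lb units already leave u
    return cap2, d2
```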

Circulation: Demands and Lower Bounds

Theorem: There is a feasible circulation in G (with lower bounds) if and only if there is a feasible circulation in G′ (without lower bounds).

  • Given a circulation f in G, f′ = f − ℓ is a circulation in G′

  – The capacity constraints are satisfied because ℓ(e) ≤ f(e) ≤ c(e)
  – Demand conditions: f′^in(v) − f′^out(v) = d(v) − L(v) = d′(v)

  • Given a circulation f′ in G′, f = f′ + ℓ is a circulation in G

  – The capacity constraints are satisfied because 0 ≤ f′(e) ≤ c(e) − ℓ(e)
  – Demand conditions: f^in(v) − f^out(v) = d′(v) + L(v) = d(v)


Integrality

Theorem: Consider a circulation problem with integral capacities, flow lower bounds, and node demands. If the problem is feasible, then it also has an integral solution. Proof:

  • Graph ′ has only integral capacities and demands
  • Thus, the flow network used in the reduction to solve

circulation with demands and no lower bounds has only integral capacities

  • The theorem now follows because a max flow problem with

integral capacities also has an optimal integral solution

  • It also follows that with the max flow algorithms we studied,

we get an integral feasible circulation solution.


Matrix Rounding

  • Given: a matrix (d_{i,j}) of real numbers
  • row i sum: aᵢ = ∑_j d_{i,j}, column j sum: b_j = ∑_i d_{i,j}
  • Goal: Round each d_{i,j}, as well as each aᵢ and b_j, up or down to the next integer so that the sum of the rounded elements in each row (column) equals the rounded row (column) sum
  • Original application: publishing census data

Example:

original data:                 possible rounding:
 3.14   6.80   7.30 | 17.24     3   7   7 | 17
 9.60   2.40   0.70 | 12.70    10   2   1 | 13
 3.60   1.20   6.50 | 11.30     3   1   7 | 11
16.34  10.40  14.50            16  10  15


Matrix Rounding

Theorem: For any matrix, there exists a feasible rounding. Remark: Just rounding to the nearest integer doesn’t work

original data:
0.35 0.35 0.35 | 1.05
0.55 0.55 0.55 | 1.65
0.90 0.90 0.90

[Tables: a feasible rounding vs. rounding to the nearest integer, which violates the row and column sums]


Reduction to Circulation

[Figure: reduction of the example matrix to a circulation instance — one node per row and per column; the edge (row i, column j) gets flow lower bound ⌊d_{i,j}⌋ and capacity ⌈d_{i,j}⌉ (e.g., 3.14 → [3, 4], 2.40 → [2, 3]); the row‐sum and column‐sum edges get bounds such as [12, 13] and [10, 11]]

Matrix elements and row/column sums give a feasible circulation that satisfies all lower bound, capacity, and demand constraints; all demands are 0
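The per-entry bounds come straight from floor and ceiling; a tiny sketch (names are mine):

```python
import math

def rounding_bounds(matrix):
    """For each entry d, the corresponding circulation edge gets flow
    lower bound floor(d) and capacity ceil(d)."""
    return {(i, j): (math.floor(d), math.ceil(d))
            for i, row in enumerate(matrix)
            for j, d in enumerate(row)}
```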


Matrix Rounding

Theorem: For any matrix, there exists a feasible rounding. Proof:

  • The matrix entries d_{i,j} and the row and column sums aᵢ and b_j give a feasible circulation for the constructed network
  • Every feasible circulation gives matrix entries with corresponding row and column sums (follows from the demand constraints)
  • Because all demands, capacities, and flow lower bounds are integral, there is an integral solution to the circulation problem  gives a feasible rounding!


Matching


Gifts‐Children Graph

  • Which child likes which gift can be represented by a graph

Matching

Matching: Set of pairwise non‐incident edges. Maximal Matching: a matching s.t. no more edges can be added. Maximum Matching: a matching of maximum possible size. Perfect Matching: a matching of size n/2 (every node is matched)


Bipartite Graph

Definition: A graph G = (V, E) is called bipartite iff its node set can be partitioned into two parts V = U ∪ W such that for each edge (u, v) ∈ E,

|{u, v} ∩ U| = 1.

  • Thus, edges are only between the two parts


Santa’s Problem

Maximum Matching in Bipartite Graphs: Every child can get a gift iff there is a matching of size #children

Clearly, every matching is at most as big. If #children = #gifts, there is a solution iff there is a perfect matching

SLIDE 102

Reducing to Maximum Flow

  • Like for edge‐disjoint paths: add a source s with an edge to every node on one side and a sink t with an edge from every node on the other side

all capacities are 1

SLIDE 103

Reducing to Maximum Flow

Theorem: Every integer solution to the max flow problem on the constructed graph induces a maximum bipartite matching of G. Proof:

  • 1. A flow of value k induces a matching M of size k

– Left nodes (gifts) have incoming capacity 1
– Right nodes (children) have outgoing capacity 1
– Left and right nodes are incident to at most 1 edge of M with f(e) = 1

  • 2. A matching of size k implies a flow of value k

– For each edge (u, v) of the matching: f(s, u) = f(u, v) = f(v, t) = 1
– All other flow values are 0
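The reduction can be specialized so that no explicit flow network is built: repeatedly searching for augmenting paths directly in the bipartite graph is equivalent to running Ford-Fulkerson on the unit-capacity network above. A sketch (the function name and representation are my own):

```python
def max_bipartite_matching(adj):
    """Maximum bipartite matching by repeated augmenting-path search.
    adj maps each left node to the list of right nodes it is connected to."""
    match = {}  # right node -> left node currently matched to it

    def augment(u, visited):
        for v in adj[u]:
            if v in visited:
                continue
            visited.add(v)
            # take v if it is free, or if its current partner can be re-matched
            if v not in match or augment(match[v], visited):
                match[v] = u
                return True
        return False

    # each successful augmentation grows the matching by one edge
    return sum(augment(u, set()) for u in adj)
```

Each call to `augment` corresponds to pushing one unit of flow from s to t along an augmenting path.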

SLIDE 104

Running Time of Max. Bipartite Matching

Theorem: A maximum matching of a bipartite graph can be computed in time O(m · n)

SLIDE 105

Perfect Matching?

  • There can only be a perfect matching if both sides of the partition have size n/2.
  • There is no perfect matching iff there is an s‐t cut of size < n/2 in the flow network.
SLIDE 106

s‐t Cuts

Partition (A, B) of the node set V such that s ∈ A and t ∈ B

  • If u ∈ B: edge (s, u) is in cut (A, B)
  • If v ∈ A: edge (v, t) is in cut (A, B)
  • Otherwise (if u ∈ A, v ∈ B), all edges from u to some v ∈ B are in cut (A, B)

SLIDE 107

Hall’s Marriage Theorem

Theorem: A bipartite graph G = (U ∪ W, E) for which |U| = |W| has a perfect matching if and only if ∀ U′ ⊆ U: |N(U′)| ≥ |U′|, where N(U′) ⊆ W is the set of neighbors of the nodes in U′.
Proof: No perfect matching ⟺ some s‐t cut has capacity < n/2

  • 1. Assume there is a U′ for which |N(U′)| < |U′|:
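For intuition, Hall's condition can be checked by brute force on small instances (exponential in the size of the left side; the function name and adjacency format are my own):

```python
from itertools import combinations

def hall_condition(adj):
    """Check Hall's condition: every subset U' of the left side must have
    at least |U'| distinct neighbors (adj: left node -> right neighbors)."""
    left = list(adj)
    for k in range(1, len(left) + 1):
        for subset in combinations(left, k):
            neighbors = set()
            for u in subset:
                neighbors.update(adj[u])
            if len(neighbors) < k:
                return False  # found U' with |N(U')| < |U'|
    return True
```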

SLIDE 108

Hall’s Marriage Theorem

Theorem: A bipartite graph G = (U ∪ W, E) for which |U| = |W| has a perfect matching if and only if ∀ U′ ⊆ U: |N(U′)| ≥ |U′|, where N(U′) ⊆ W is the set of neighbors of the nodes in U′.
Proof: No perfect matching ⟺ some s‐t cut has capacity < n/2

  • 2. Assume that there is a cut (A, B) of capacity < n/2
SLIDE 110

What About General Graphs?

  • Can we efficiently compute a maximum matching if G is not bipartite?
  • How good is a maximal matching?

– A matching that cannot be extended…

  • Vertex Cover: a set S ⊆ V of nodes such that ∀ {u, v} ∈ E: {u, v} ∩ S ≠ ∅.
  • A vertex cover covers all edges by incident nodes
SLIDE 111

Vertex Cover vs Matching

Consider a matching M and a vertex cover S
Claim: |M| ≤ |S|
Proof:

  • At least one node of every edge {u, v} ∈ M is in S
  • It needs to be a different node for different edges from M
SLIDE 112

Vertex Cover vs Matching

Consider a matching M and a vertex cover S
Claim: If M is maximal and S is minimum, then |S| ≤ 2|M|
Proof:

  • M is maximal: for every edge {u, v} ∈ E, either u or v (or both) are matched
  • Every edge e ∈ E is “covered” by at least one matching edge
  • Thus, the set of all nodes of matching edges gives a vertex cover of size 2|M|.

SLIDE 113

Maximal Matching Approximation

Theorem: For any maximal matching M and any maximum matching M*, it holds that |M*| ≤ 2|M|. Proof: The matched nodes of M form a vertex cover S of size 2|M|, and by the first claim |M*| ≤ |S| = 2|M|. Theorem: The set of all matched nodes of a maximal matching M is a vertex cover of size at most twice the size of a minimum vertex cover.
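The greedy construction behind this 2-approximation can be sketched as follows (the function name and edge-list format are my own):

```python
def greedy_maximal_matching(edges):
    """Scan the edges once and keep every edge whose endpoints are both
    still unmatched. The result is a maximal matching, hence at least
    half the size of a maximum matching."""
    matched = set()
    matching = []
    for u, v in edges:
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched.update((u, v))
    return matching
```

A single pass over the edge list suffices, so this runs in O(m) time.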

SLIDE 114

Augmenting Paths

Consider a matching M of a graph G = (V, E):

  • A node v ∈ V is called free iff it is not matched

Augmenting Path: An (odd‐length) path that starts and ends at a free node and visits edges in E ∖ M and edges in M alternatingly.

  • The matching can be improved using an augmenting path by switching the role of each edge along the path
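Switching roles along an augmenting path is exactly a symmetric difference of edge sets, which is easy to see on a toy instance (the 4-node path and the names below are my own):

```python
# Undirected edges as frozensets so that {u, v} == {v, u}.
edge = lambda u, v: frozenset((u, v))

# Path a - b - c - d: the edge b-c is matched, a and d are free,
# so (a, b), (b, c), (c, d) is an augmenting path.
matching = {edge('b', 'c')}
path = {edge('a', 'b'), edge('b', 'c'), edge('c', 'd')}

# Switching the role of each edge along the path is the symmetric
# difference; the matching grows by exactly one edge.
improved = matching ^ path
```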

SLIDE 115

Augmenting Paths

Theorem: A matching M of G = (V, E) is maximum if and only if there is no augmenting path. Proof:

  • Consider a non‐maximum matching M and a maximum matching M* and define M′ ≔ M ∖ M*, M*′ ≔ M* ∖ M
  • Note that M′ ∩ M*′ = ∅ and |M′| < |M*′|
  • Each node v ∈ V is incident to at most one edge in both M′ and M*′
  • M′ ∪ M*′ induces even cycles and paths; because |M*′| > |M′|, some path starts and ends with an M*′‐edge, i.e., it is an augmenting path for M
SLIDE 116

Finding Augmenting Paths

free nodes

augmenting path

  • odd cycle
SLIDE 117

Blossoms

  • If we find an odd cycle attached by an alternating path (the stem) to a free node (the root), the cycle is called a blossom
  • Contract the blossom into a single node: graph G becomes graph G′
  • Matching M without the matching edges inside the blossom is a matching M′ of G′.
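The contraction step can be sketched on adjacency lists (the function name, the representation, and the duplicate handling are my own; the matched-edge bookkeeping is omitted):

```python
def contract_blossom(adj, blossom, b):
    """Contract the node set `blossom` (an odd cycle) into a single new
    node `b`, as in the construction of G'; self-loops are dropped."""
    new_adj = {}
    for u, nbrs in adj.items():
        if u in blossom:
            continue
        # redirect edges into the blossom to the contracted node b
        redirected = [b if v in blossom else v for v in nbrs]
        new_adj[u] = list(dict.fromkeys(redirected))  # drop duplicates
    # b inherits all edges leaving the blossom
    new_adj[b] = list(dict.fromkeys(
        v for u in blossom for v in adj[u] if v not in blossom))
    return new_adj
```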

SLIDE 118

Lemma: Graph G has an augmenting path w.r.t. matching M iff G′ has an augmenting path w.r.t. matching M′. Also: The matching M can be computed efficiently from M′.

Contracting Blossoms

Note: If the stem has length 0, the root of the blossom is free and thus also the contracted blossom node is free in G′.

SLIDE 119

Edmonds’ Blossom Algorithm

Algorithm Sketch:

  • 1. Build an alternating tree for each free node f
  • 2. Starting from an explored node u at even distance from a free node f in the tree of f, explore some unexplored edge {u, v}:

1. If v is an unexplored node, v is matched to some neighbor w: add w to the tree (w is now explored)
2. If v is explored and in the same tree:
   at odd distance from the root ⟹ ignore and move on
   at even distance from the root ⟹ blossom found
3. If v is explored and in another tree:
   at odd distance from the root ⟹ ignore and move on
   at even distance from the root ⟹ augmenting path found

SLIDE 120

Running Time

Finding a Blossom: contract it and repeat on the smaller graph
Finding an Augmenting Path: improve the matching
Theorem: The algorithm can be implemented in time O(m · n²).

SLIDE 121

Matching Algorithms

We have seen:

  • O(m · n) time alg. to compute a max. matching in bipartite graphs
  • O(m · n²) time alg. to compute a max. matching in general graphs

Better algorithms:

  • Best known running time (bipartite and general graphs): O(m · √n)

Weighted matching:

  • Edges have weights; find a matching of maximum total weight
  • Bipartite graphs: flow reduction works in the same way
  • General graphs: can also be solved in polynomial time (Edmonds’ algorithm is used as a black box)

SLIDE 122

Happy Holidays!