Finding small stabilizers for unstable graphs

Adrian Bock 1, Karthik Chandrasekaran 2, Jochen Könemann 3, Britta Peis 4, and Laura Sanità 3
(1 Lausanne, 2 Boston, 3 Waterloo, 4 Aachen)

Let's recall some basics on matchings and vertex covers. Given an undirected graph G = (V, E),

◮ a matching is a set M ⊆ E of pairwise non-adjacent edges,
◮ a vertex cover is a set of vertices C ⊆ V such that each edge has at least one endpoint in C.
◮ Finding a maximum matching is "easy", whereas finding a minimum vertex cover is "hard".
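As a quick illustration (not part of the talk), both quantities can be brute-forced on toy graphs. The helper names `nu` and `tau` are ours; real maximum-matching algorithms run in polynomial time, while this enumeration is exponential:

```python
from itertools import combinations

def nu(vertices, edges):
    """Maximum matching size, trying edge subsets from large to small."""
    for k in range(len(edges), -1, -1):
        for M in combinations(edges, k):
            endpoints = [v for e in M for v in e]
            if len(endpoints) == len(set(endpoints)):  # pairwise non-adjacent
                return k
    return 0

def tau(vertices, edges):
    """Minimum vertex cover size, trying vertex subsets from small to large."""
    for k in range(len(vertices) + 1):
        for C in combinations(vertices, k):
            if all(u in C or v in C for (u, v) in edges):
                return k
    return len(vertices)

V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (2, 3), (3, 0)]  # a 4-cycle
print(nu(V, E), tau(V, E))            # -> 2 2
```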

As usual, let

◮ ν(G) = max{ |M| | M matching in G }, and
◮ τ(G) = min{ |C| | C vertex cover in G }.

Consider the corresponding linear relaxations

◮ νf(G) = max{ Σ_{e∈E} y_e | y(δ(v)) ≤ 1 ∀v ∈ V; y ∈ R^E_+ },
◮ τf(G) = min{ Σ_{v∈V} x_v | x_u + x_v ≥ 1 ∀uv ∈ E; x ∈ R^V_+ }.

By duality theory: ν(G) ≤ νf(G) = τf(G) ≤ τ(G).
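Since the fractional matching LP always admits a half-integral optimum, νf can be sampled on tiny graphs by enumerating y_e ∈ {0, 1/2, 1}. A brute-force sketch (the helper name `nu_f` is ours):

```python
from itertools import product

def nu_f(vertices, edges):
    """Optimal fractional matching value, using half-integrality."""
    best = 0.0
    for ys in product((0.0, 0.5, 1.0), repeat=len(edges)):
        load = {v: 0.0 for v in vertices}
        for y, (u, w) in zip(ys, edges):
            load[u] += y
            load[w] += y
        if all(l <= 1.0 for l in load.values()):   # y(delta(v)) <= 1
            best = max(best, sum(ys))
    return best

# On a triangle the LP beats the integral optimum: nu = 1 but nu_f = 3/2.
print(nu_f([0, 1, 2], [(0, 1), (1, 2), (2, 0)]))   # -> 1.5
```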

In general: ν(G) ≤ νf(G) = τf(G) ≤ τ(G).

If ν(G) = τ(G), the graph G is called a König-Egerváry graph.

[Figure: a fractional vertex cover assigning 1/2 to every vertex.]

If ν(G) = τf(G), the graph G is called stable.

Stable graphs have a rich history in game theory: in various graph-theoretic settings, stable graphs are exactly those instances for which stable outcomes exist.

This talk: given an unstable graph, how can we find a small stabilizer, i.e., a subset F ⊆ E such that G \ F is stable?
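Combining the two brute-force quantities gives a toy stability test ν(G) = νf(G); helper names are ours, not the talk's:

```python
from itertools import combinations, product

def nu(V, E):
    # maximum matching size by exhaustive search
    for k in range(len(E), -1, -1):
        for M in combinations(E, k):
            ends = [v for e in M for v in e]
            if len(ends) == len(set(ends)):
                return k
    return 0

def nu_f(V, E):
    # fractional matching value; a half-integral optimum always exists
    best = 0.0
    for ys in product((0.0, 0.5, 1.0), repeat=len(E)):
        load = {v: 0.0 for v in V}
        for y, (u, w) in zip(ys, E):
            load[u] += y
            load[w] += y
        if all(l <= 1.0 for l in load.values()):
            best = max(best, sum(ys))
    return best

def is_stable(V, E):
    return nu(V, E) == nu_f(V, E)

print(is_stable([0, 1, 2], [(0, 1), (1, 2), (2, 0)]))             # triangle -> False
print(is_stable([0, 1, 2, 3], [(0, 1), (1, 2), (2, 3), (3, 0)]))  # 4-cycle  -> True
```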

Assignment games [Shapley & Shubik '71]: The assignment game is a cooperative game where

◮ the players are represented by the vertices of some given instance G = (V, E), and
◮ the value of each coalition S ⊆ V is ν(G[S]), i.e., the maximum size of a matching achieved solely by the players in S.

The core of the game consists of all "fair" allocations of ν(G) among the player set V, i.e.,

Core(G) = { x ∈ R^V | x(V) = ν(G); x(S) ≥ ν(G[S]) ∀S ⊆ V }.

Note: Core(G) ≠ ∅ ⟺ G is stable.
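The core condition can be checked directly on a tiny instance by enumerating all coalitions (helper names `nu` and `in_core` are ours):

```python
from itertools import combinations

def nu(V, E):
    # maximum matching size by exhaustive search
    for k in range(len(E), -1, -1):
        for M in combinations(E, k):
            ends = [v for e in M for v in e]
            if len(ends) == len(set(ends)):
                return k
    return 0

def in_core(V, E, x):
    """x(V) = nu(G) and x(S) >= nu(G[S]) for every coalition S."""
    if abs(sum(x[v] for v in V) - nu(V, E)) > 1e-9:
        return False
    for k in range(len(V) + 1):
        for S in combinations(V, k):
            ES = [(u, w) for (u, w) in E if u in S and w in S]  # induced edges
            if sum(x[v] for v in S) < nu(list(S), ES) - 1e-9:
                return False
    return True

# Path a-b-c: paying the whole unit to the middle player lies in the core.
V, E = ['a', 'b', 'c'], [('a', 'b'), ('b', 'c')]
print(in_core(V, E, {'a': 0, 'b': 1, 'c': 0}))   # -> True
```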

Network bargaining games [Kleinberg & Tardos '08]: Players are the nodes of an undirected graph G = (V, E).

[Figure: a path whose edges all carry value 1.]

Edges indicate the possible unit-valued collaborations. Each player can collaborate with at most one other player. Players start negotiating . . .

Solution/outcome: a tuple (M, x) consisting of a matching M ⊆ E and an allocation x ∈ R^V of the value |M| among the endpoints of M.

An outcome (M, x) is stable if x_u + x_v ≥ 1 for each edge {u, v} ∈ E.

[Figures: an outcome with a blocking edge, and a stable outcome.]

Note: G admits a stable outcome ⟺ ν(G) = τf(G).
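A sketch of the stability check for a proposed outcome (M, x); the helper name `is_stable_outcome` is ours:

```python
def is_stable_outcome(V, E, M, x):
    # outcome: each matched edge's unit value is split between its endpoints,
    # and unmatched players receive nothing
    matched = {v for e in M for v in e}
    outcome = (all(abs(x[u] + x[v] - 1) < 1e-9 for (u, v) in M)
               and all(x[v] == 0 for v in V if v not in matched))
    # stability: no blocking edge, i.e. x_u + x_v >= 1 on EVERY edge of G
    stable = all(x[u] + x[v] >= 1 - 1e-9 for (u, v) in E)
    return outcome and stable

V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (2, 3)]
print(is_stable_outcome(V, E, [(0, 1), (2, 3)], {0: 0, 1: 1, 2: 1, 3: 0}))  # -> True
# Matching only the middle edge leaves (0,1) and (2,3) as blocking edges:
print(is_stable_outcome(V, E, [(1, 2)], {0: 0, 1: 0.5, 2: 0.5, 3: 0}))      # -> False
```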

Known: The following statements are equivalent:

◮ ν(G) = τf(G), i.e., G is stable;
◮ Core(G) ≠ ∅;
◮ the graph G = (V, E) admits a stable outcome;
◮ G admits a "balanced" outcome;
◮ the set of inessential vertices forms an independent set;
◮ G contains no M-flower for a maximum matching M.

(M-flower = M-blossom + even M-alternating path.)

[Figure: an M-flower with vertex values 1, 0.5, 0.5, 0.5, 0.5, 0.5, 1.]

Edmonds' algorithm either computes a stable solution (M, x) or gives a certificate (an M-flower) proving that none exists:

Algorithm:

  • 1. compute a maximum matching M;
  • 2. consider the M-alternating forest;
  • 3. either there is an M-flower, proving that G is not stable, or
  • 4. the Gallai-Edmonds decomposition V = B ∪ A ∪ C defines a cover x of size |M|.

[Figure: the decomposition V = B ∪ A ∪ C with cover values x(v) ∈ {0, 1/2, 1}.]
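On toy graphs the decomposition can be recovered by brute force: B collects the inessential vertices (exposed by at least one maximum matching), A their neighbours outside B, and C the rest. A sketch with our own helper names (exponential, unlike the real algorithm):

```python
from itertools import combinations

def all_max_matchings(V, E):
    # all maximum matchings; a matching of size k has 2k distinct endpoints
    for k in range(len(E), -1, -1):
        found = [M for M in combinations(E, k)
                 if len({v for e in M for v in e}) == 2 * k]
        if found:
            return found
    return [()]

def gallai_edmonds(V, E):
    Ms = all_max_matchings(V, E)
    # B: inessential vertices, exposed by at least one maximum matching
    B = {v for v in V
         if any(v not in {u for e in M for u in e} for M in Ms)}
    # A: vertices outside B with a neighbour in B;  C: everything else
    A = {v for v in V if v not in B
         and any(v in e and (set(e) - {v}) <= B for e in E)}
    C = set(V) - B - A
    return B, A, C

V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (1, 3)]   # a star with centre 1
B, A, C = gallai_edmonds(V, E)
print(sorted(B), sorted(A), sorted(C))   # -> [0, 2, 3] [1] []
```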

Our focus: graphs that are not stable. Can we stabilize unstable graphs by altering the graph in as few places as possible?

Suppose G admits no matching and fractional cover of equal size.

Call F ⊆ E a stabilizer if G \ F = G[E \ F] is stable.

Min Stabilizer problem: find a stabilizer F of minimum cardinality, if possible such that ν(G) = ν(G \ F).

Theorem (Key)

If F is a minimum stabilizer, then ν(G \ F) = ν(G).
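Both the definition and the key theorem can be checked by brute force on a toy instance (helper names are ours; this enumeration is exponential, not the talk's algorithm):

```python
from itertools import combinations, product

def nu(V, E):
    # maximum matching size by exhaustive search
    for k in range(len(E), -1, -1):
        for M in combinations(E, k):
            ends = [v for e in M for v in e]
            if len(ends) == len(set(ends)):
                return k
    return 0

def nu_f(V, E):
    # fractional matching value via half-integral enumeration
    best = 0.0
    for ys in product((0.0, 0.5, 1.0), repeat=len(E)):
        load = {v: 0.0 for v in V}
        for y, (u, w) in zip(ys, E):
            load[u] += y
            load[w] += y
        if all(l <= 1.0 for l in load.values()):
            best = max(best, sum(ys))
    return best

def min_stabilizers(V, E):
    """All minimum-cardinality F such that G \\ F is stable."""
    for k in range(len(E) + 1):
        found = [F for F in combinations(E, k)
                 if nu(V, [e for e in E if e not in F])
                 == nu_f(V, [e for e in E if e not in F])]
        if found:
            return found
    return []

V, E = [0, 1, 2], [(0, 1), (1, 2), (2, 0)]   # triangle: unstable
for F in min_stabilizers(V, E):
    rest = [e for e in E if e not in F]
    print(F, nu(V, rest) == nu(V, E))   # every minimum stabilizer preserves nu
```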

Thus, there is a minimum stabilizer F so that at least one maximum matching M survives. We use this structural insight to show:

Theorem

The minimum stabilizer problem is NP-hard (reduction from vertex cover), even if the matching M is given.

Theorem

There is a 2-approximation for the stabilizer problem with given M.

Open: a constant-factor approximation for the general min stabilizer problem?

Theorem

There is a 4ω-approximation, where ω is the sparsity of the graph.

Theorem

There is a 2-approximation for regular graphs.

Theorem (Key)

If F is a minimum stabilizer, then ν(G \ F) = ν(G).

Proof.

Let F be a minimum stabilizer. Choose a maximum matching M with |F ∩ M| minimal, and suppose, for contradiction, that F ∩ M ≠ ∅. Delete the edges in F \ M. The resulting graph G′ is not stable. Moreover, M is a maximum matching in G′. Thus, there exists some M-flower in G′, not hit by F. Since G \ F is stable, M \ F is not a maximum matching in G \ F. Thus, there exists some (M \ F)-augmenting path P in G \ F. However, as M is a maximum matching in G, one end of the path must be adjacent to an edge f ∈ M ∩ F. It follows that M̂ = M∆(P + f) is a maximum matching in G with |M̂ ∩ F| < |M ∩ F|. Contradiction!

Theorem

F stabilizer of G ⟹ |F| ≥ 2(νf(G) − ν(G)).

Proof sketch:

◮ Consider the Gallai-Edmonds decomposition V = B ∪ A ∪ C.
◮ Choose a maximum matching M linking as many isolated components in G[B] as possible.
◮ Let U1, . . . , Uk denote the non-trivial components in G[B] with at least one M-exposed vertex.
◮ Each Ui has at least one vertex vi that is essential in G \ F; we can assume vi is M-exposed.
◮ Choose a maximum matching N in G \ F.
◮ N∆M is a disjoint union of even cycles and even paths.
◮ Each of the k paths starting at some vi has at least one edge in F.
◮ Thus, |F| ≥ k, as desired.
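The bound can be verified exhaustively on a small unstable graph, two disjoint triangles, using the brute-force helpers from before (names ours):

```python
from itertools import combinations, product

def nu(V, E):
    # maximum matching size by exhaustive search
    for k in range(len(E), -1, -1):
        for M in combinations(E, k):
            ends = [v for e in M for v in e]
            if len(ends) == len(set(ends)):
                return k
    return 0

def nu_f(V, E):
    # fractional matching value via half-integral enumeration
    best = 0.0
    for ys in product((0.0, 0.5, 1.0), repeat=len(E)):
        load = {v: 0.0 for v in V}
        for y, (u, w) in zip(ys, E):
            load[u] += y
            load[w] += y
        if all(l <= 1.0 for l in load.values()):
            best = max(best, sum(ys))
    return best

V = [0, 1, 2, 3, 4, 5]
E = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3)]   # two disjoint triangles
bound = 2 * (nu_f(V, E) - nu(V, E))                     # nu_f = 3, nu = 2, bound = 2
for k in range(len(E) + 1):
    for F in combinations(E, k):
        rest = [e for e in E if e not in F]
        if nu(V, rest) == nu_f(V, rest):                # F is a stabilizer
            assert k >= bound
print("every stabilizer has size >=", bound)
```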
slide-81
SLIDE 81

Theorem

There is a poly-time algorithm that finds a stabilizer F with ν(G) = ν(G \F) and |F| ≤ 4ω ·OPT,

slide-82
SLIDE 82

Theorem

There is a poly-time algorithm that finds a stabilizer F with ν(G) = ν(G \F) and |F| ≤ 4ω ·OPT, where ω is the sparsity of G.

slide-83
SLIDE 83

Theorem

There is a poly-time algorithm that finds a stabilizer F with ν(G) = ν(G \F) and |F| ≤ 4ω ·OPT, where ω is the sparsity of G.

Lemma

If ν(G) < νf (G), can find in poly-time a set L ⊆ E with |L| ≤ 4ω, ν(G) = ν(G \ L), and νf (G \ L) = νf (G) − 1

2.

slide-84
SLIDE 84

Theorem

There is a poly-time algorithm that finds a stabilizer F with ν(G) = ν(G \F) and |F| ≤ 4ω ·OPT, where ω is the sparsity of G.

Lemma

If ν(G) < νf (G), can find in poly-time a set L ⊆ E with |L| ≤ 4ω, ν(G) = ν(G \ L), and νf (G \ L) = νf (G) − 1

2.

Algorithm:

◮ Initialize F ← ∅; ◮ WHILE G \ F is not stable:

◮ apply Lemma to find L; ◮ F ← F ∪ L;

slide-85
SLIDE 85

Theorem

There is a poly-time algorithm that finds a stabilizer F with ν(G) = ν(G \F) and |F| ≤ 4ω ·OPT, where ω is the sparsity of G.

Lemma

If ν(G) < νf (G), can find in poly-time a set L ⊆ E with |L| ≤ 4ω, ν(G) = ν(G \ L), and νf (G \ L) = νf (G) − 1

2.

Algorithm:

◮ Initialize F ← ∅; ◮ WHILE G \ F is not stable:

◮ apply Lemma to find L; ◮ F ← F ∪ L;

Note: In each iteration, at most 4ω edges are added to F; At most 2(νf (G) − ν(G)) ≤OPT iterations.
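A toy rendering of the outer loop, with the Lemma's poly-time subroutine replaced by brute force: we simply search for a smallest L that preserves ν and drops νf by exactly 1/2. Helper names are ours and the search is exponential, so this is only a sketch of the loop's shape, not the talk's algorithm:

```python
from itertools import combinations, product

def nu(V, E):
    # maximum matching size by exhaustive search
    for k in range(len(E), -1, -1):
        for M in combinations(E, k):
            ends = [v for e in M for v in e]
            if len(ends) == len(set(ends)):
                return k
    return 0

def nu_f(V, E):
    # fractional matching value via half-integral enumeration
    best = 0.0
    for ys in product((0.0, 0.5, 1.0), repeat=len(E)):
        load = {v: 0.0 for v in V}
        for y, (u, w) in zip(ys, E):
            load[u] += y
            load[w] += y
        if all(l <= 1.0 for l in load.values()):
            best = max(best, sum(ys))
    return best

def stabilize(V, E):
    F, rest = [], list(E)
    while nu(V, rest) < nu_f(V, rest):          # G \ F not yet stable
        target = nu_f(V, rest) - 0.5            # nu_f must drop by exactly 1/2
        for k in range(1, len(rest) + 1):
            cand = next((L for L in combinations(rest, k)
                         if nu(V, [e for e in rest if e not in L]) == nu(V, E)
                         and nu_f(V, [e for e in rest if e not in L]) == target),
                        None)
            if cand:
                F += list(cand)
                rest = [e for e in rest if e not in cand]
                break
    return F

V, E = [0, 1, 2], [(0, 1), (1, 2), (2, 0)]
F = stabilize(V, E)
print(len(F), nu(V, [e for e in E if e not in F]) == nu(V, E))   # -> 1 True
```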

Lemma

If ν(G) < νf(G), the following algorithm returns a set L ⊆ E with |L| ≤ 4ω, ν(G) = ν(G \ L), and νf(G \ L) = νf(G) − 1/2.

Algorithm:

  • 1. Compute an optimal (half-integral) fractional matching ŷ with the least number of odd cycles C1, . . . , Cm in its support.
  • 2. Compute an optimal (half-integral) fractional vertex cover x̂.
  • 3. Construct a maximum matching M in the support of ŷ.
  • 4. Mark all vertices in H := C1.
  • 5. If there exists a marked u ∈ H with |Lu := {uv ∈ E | x̂_v = 1/2}| ≤ 4ω, STOP; return L = Lu.
  • 6. Else, choose a marked vertex in H adjacent to some w ∉ H with x̂_w = 1/2.
  • 7. Add w and its matching neighbour z to H; mark z and iterate.

Invariant: Isolating a marked vertex won't decrease ν(G), but τf(G) will decrease by 1/2.

Summary

Stable graphs are nicely characterized. The problem of finding a stabilizer of minimum cardinality is at least as hard as vertex cover. We have (so far)

◮ a 2-approximation for regular graphs;
◮ a 2-approximation for the min stabilizer problem w.r.t. a fixed matching M (also vertex-cover-hard);
◮ a 4ω-approximation;
◮ ν(G) = ν(G \ F) for any minimum stabilizer F.

We would love to find a constant-factor approximation!

Thank you!