SLIDE 1
Finding small stabilizers for unstable graphs
Adrian Bock (Lausanne), Karthik Chandrasekaran (Boston), Jochen Könemann (Waterloo), Britta Peis (Aachen), and Laura Sanità (Waterloo)
SLIDE 2–5
Let’s recall some basics on matchings and vertex covers. Given an undirected graph G = (V, E),
◮ a matching is a set M ⊆ E of pairwise non-adjacent edges,
◮ a vertex cover is a set C ⊆ V of vertices such that each edge has at least one endpoint in C.
◮ Finding a maximum matching is “easy” (polynomial time), whereas finding a minimum vertex cover is “hard” (NP-hard).
SLIDE 6–8
As usual, let
◮ ν(G) = max{|M| : M matching in G}, and
◮ τ(G) = min{|C| : C vertex cover in G}.
Consider the corresponding linear relaxations
◮ νf(G) = max{ Σ_{e∈E} y_e | y(δ(v)) ≤ 1 ∀v ∈ V; y ∈ R^E_+ },
◮ τf(G) = min{ Σ_{v∈V} x_v | x_u + x_v ≥ 1 ∀uv ∈ E; x ∈ R^V_+ }.
By LP duality: ν(G) ≤ νf(G) = τf(G) ≤ τ(G).
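The chain ν(G) ≤ νf(G) = τf(G) ≤ τ(G) is easy to check numerically on tiny graphs. The sketch below is my own brute-force illustration (not from the talk); it uses the fact that τf always has a half-integral optimum, so it suffices to search x ∈ {0, 1/2, 1}^V:

```python
from itertools import combinations, product

def nu(V, E):
    """Maximum matching size: brute force over all edge subsets."""
    best = 0
    for k in range(1, len(E) + 1):
        for S in combinations(E, k):
            ends = [v for e in S for v in e]
            if len(ends) == len(set(ends)):   # edges pairwise non-adjacent
                best = max(best, k)
    return best

def tau(V, E):
    """Minimum vertex cover size: brute force over vertex subsets."""
    return min(len(C) for k in range(len(V) + 1)
               for C in combinations(V, k)
               if all(u in C or v in C for u, v in E))

def tau_f(V, E):
    """Minimum fractional vertex cover; an optimum is always
    half-integral, so search x in {0, 1/2, 1}^V."""
    best = float("inf")
    for vals in product((0.0, 0.5, 1.0), repeat=len(V)):
        x = dict(zip(V, vals))
        if all(x[u] + x[v] >= 1 for u, v in E):
            best = min(best, sum(vals))
    return best

# the triangle: nu = 1 < nu_f = tau_f = 3/2 < tau = 2
V, E = [0, 1, 2], [(0, 1), (1, 2), (0, 2)]
print(nu(V, E), tau_f(V, E), tau(V, E))   # 1 1.5 2
```

On the triangle both inequalities of the chain are strict, which makes it a convenient first test case.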
SLIDE 9–14
In general: ν(G) ≤ νf(G) = τf(G) ≤ τ(G).
If ν(G) = τ(G), graph G is called a König–Egerváry graph.
[Figure: fractional solution with all values 1/2]
If ν(G) = τf(G), graph G is called stable.
Stable graphs have a rich history in game theory: in various graph-theoretic settings, stable graphs are exactly those instances for which stable outcomes exist.
This talk: Given an unstable graph, how can we find a small stabilizer, i.e., a subset F ⊆ E such that G \ F is stable?
SLIDE 15–19
Assignment games [Shapley & Shubik ’71]: The assignment game is a cooperative game where
◮ the players are represented by the vertices of some given instance G = (V, E), and
◮ the value of each coalition S ⊆ V is ν(G[S]), i.e., the maximum size of a matching achieved solely by the players in S.
The core of the game consists of all “fair” allocations of ν(G) among the player set V, i.e.,
Core(G) = {x ∈ R^V | x(V) = ν(G); x(S) ≥ ν(G[S]) ∀S ⊆ V}.
Note: Core(G) ≠ ∅ ⟺ G is stable.
SLIDE 20–29
Network bargaining games [Kleinberg & Tardos ’08]: Players are the nodes of an undirected graph G = (V, E).
[Figure: edges labelled 1; later a “blocking edge” example with allocation 1/2, 1/2 and a “stable outcome” with allocation 1, 1]
Edges indicate the possible unit-valued collaborations. Each player can collaborate with at most one other player. Players start negotiating . . .
Solution/outcome: A tuple (M, x) consisting of a matching M ⊆ E and an allocation x ∈ R^V of value |M| among the endpoints of M.
Outcome (M, x) is stable if x_u + x_v ≥ 1 for each edge {u, v} ∈ E.
Note: G admits a stable outcome ⟺ ν(G) = τf(G).
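As a small illustration (my own sketch, not from the slides; the function name and tolerance are mine), a check for whether an outcome (M, x) is stable: the allocation must be supported on the endpoints of M, sum to |M|, and leave no blocking edge:

```python
def is_stable_outcome(E, M, x):
    """(M, x): matching M and allocation x of value |M| over M's endpoints."""
    matched = {v for e in M for v in e}
    if any(x.get(v, 0) > 0 for v in x if v not in matched):
        return False                      # only endpoints of M get paid
    if abs(sum(x.values()) - len(M)) > 1e-9:
        return False                      # allocation must sum to |M|
    # stability: no blocking edge, i.e. x_u + x_v >= 1 on every edge
    return all(x.get(u, 0) + x.get(v, 0) >= 1 - 1e-9 for u, v in E)

# path on 3 vertices with matching {(0,1)}: splitting the unit 1/2-1/2
# leaves edge (1, 2) blocking, while giving vertex 1 the full unit is stable
E = [(0, 1), (1, 2)]
M = [(0, 1)]
print(is_stable_outcome(E, M, {0: 0.5, 1: 0.5}))  # False
print(is_stable_outcome(E, M, {0: 0.0, 1: 1.0}))  # True
```

The second outcome mirrors the slide's point: on this path a stable outcome exists, but it forces the middle player to capture the whole unit.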
SLIDE 30–36
Known: The following statements are equivalent:
◮ ν(G) = τf(G), i.e., G is stable;
◮ Core(G) ≠ ∅;
◮ graph G = (V, E) admits a stable outcome;
◮ G admits a “balanced” outcome;
◮ the set of inessential vertices forms an independent set;
◮ G contains no M-flower for a maximum matching M.
(An M-flower = an M-blossom + an even M-alternating path.)
[Figure: M-flower with fractional cover values 1, 0.5, 0.5, 0.5, 0.5, 0.5, 1]
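The inessential-vertex criterion is easy to test exhaustively on small graphs. In the sketch below (my own brute-force illustration), a vertex v is inessential iff some maximum matching exposes it, i.e. ν(G − v) = ν(G):

```python
from itertools import combinations

def nu(E):
    """Maximum matching size: brute force over edge subsets."""
    best = 0
    for k in range(1, len(E) + 1):
        for S in combinations(E, k):
            ends = [v for e in S for v in e]
            if len(ends) == len(set(ends)):
                best = max(best, k)
    return best

def inessential(V, E):
    """v is inessential iff some maximum matching exposes v,
    i.e. nu(G - v) == nu(G)."""
    full = nu(E)
    return {v for v in V if nu([e for e in E if v not in e]) == full}

def is_stable(V, E):
    """Criterion from the slide: the inessential vertices
    form an independent set."""
    I = inessential(V, E)
    return not any(u in I and v in I for u, v in E)

C4 = ([0, 1, 2, 3], [(0, 1), (1, 2), (2, 3), (3, 0)])
C5 = ([0, 1, 2, 3, 4], [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)])
print(is_stable(*C4), is_stable(*C5))   # True False
```

C4 has a perfect matching, so every vertex is essential and the graph is stable; in C5 every vertex is inessential (deleting any vertex leaves a P4 with ν = 2), so the inessential set spans edges and C5 is unstable.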
SLIDE 37–44
Edmonds’ algorithm either computes a stable solution (M, x), or gives a certificate (an M-flower) proving that none exists.
Algorithm:
- 1. compute a max matching M;
- 2. consider the M-alternating forest;
- 3. either there is an M-flower proving that G is not stable, or
- 4. the Gallai–Edmonds decomposition V = B ∪ A ∪ C defines a cover x of size |M|.
[Figure: Gallai–Edmonds decomposition with cover values x(v) = 1/2 on B, x(v) = 1 on A, x(v) = 0 on C]
SLIDE 45–52
Our focus: graphs that are not stable. Can we stabilize unstable graphs by altering the graph in as few places as possible?
Suppose G admits no matching and fractional cover of equal size.
[Figure: unstable example with unit edge values]
Call F ⊆ E a stabilizer if G \ F = G[E \ F] is stable.
Min Stabilizer problem: find a stabilizer F of minimum cardinality, if possible such that ν(G) = ν(G \ F).
Theorem (Key)
If F is a minimum stabilizer, then ν(G \ F) = ν(G).
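On tiny instances both the Min Stabilizer problem and the key theorem can be checked exhaustively. The following sketch is my own brute force (again using the half-integrality of τf); it finds a minimum-cardinality stabilizer and confirms that ν survives its removal:

```python
from itertools import combinations, product

def nu(E):
    """Maximum matching size, brute force."""
    best = 0
    for k in range(1, len(E) + 1):
        for S in combinations(E, k):
            ends = [v for e in S for v in e]
            if len(ends) == len(set(ends)):
                best = max(best, k)
    return best

def tau_f(V, E):
    """Minimum fractional vertex cover (half-integral optimum)."""
    best = float("inf")
    for vals in product((0.0, 0.5, 1.0), repeat=len(V)):
        x = dict(zip(V, vals))
        if all(x[u] + x[v] >= 1 for u, v in E):
            best = min(best, sum(vals))
    return best

def min_stabilizer(V, E):
    """Smallest F such that G \\ F is stable, by exhaustive search."""
    for k in range(len(E) + 1):
        for F in combinations(E, k):
            rest = [e for e in E if e not in F]
            if nu(rest) == tau_f(V, rest):      # nu = tau_f: stable
                return list(F)

# the triangle is unstable: nu = 1 < tau_f = 3/2
V, E = [0, 1, 2], [(0, 1), (1, 2), (0, 2)]
F = min_stabilizer(V, E)
print(F, nu([e for e in E if e not in F]) == nu(E))   # one edge, True
```

Removing any single edge of the triangle leaves a path with ν = τf = 1, so the minimum stabilizer has size 1 and, as the key theorem predicts, the maximum matching size is preserved.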
SLIDE 53–59
Thus, there is a minimum stabilizer F so that at least one maximum matching M survives. We use this structural insight to show:
Theorem
The minimum stabilizer problem is NP-hard (reduction from vertex cover), even if the matching M is given.
Theorem
There is a 2-approximation for the stabilizer problem with given M.
Open: Is there a constant-factor approximation for the general min stabilizer problem?
Theorem
There is a 4ω-approximation, where ω is the sparsity of the graph.
Theorem
There is a 2-approximation for regular graphs.
SLIDE 60–70
Theorem (Key)
If F is a minimum stabilizer, then ν(G \ F) = ν(G).
Proof.
Let F be a minimum stabilizer. Choose a max matching M with |F ∩ M| minimal. Suppose F ∩ M ≠ ∅. Delete the edges in F \ M. Since F \ M is a proper subset of the minimum stabilizer F, the resulting graph G′ is not stable. Moreover, M is a max matching in G′. Thus, there exists some M-flower in G′, not hit by F. Since G \ F is stable, M \ F is not a max matching in G \ F. Thus, there exists some (M \ F)-augmenting path P in G \ F. However, as M is a max matching in G, one end of the path must be adjacent to some f ∈ M ∩ F. It follows that M̂ = M Δ (P + f) is a max matching in G with |M̂ ∩ F| < |M ∩ F|. Contradiction!
SLIDE 71–80
Theorem
F stabilizer of G ⟹ |F| ≥ 2(νf(G) − ν(G)).
Proof (sketch, with figures):
- Consider the Gallai–Edmonds decomposition V = B ∪ A ∪ C.
- Choose a max matching M linking as many isolated components in G[B] as possible.
- Let U_1, . . . , U_k denote the non-trivial components in G[B] with at least one M-exposed vertex.
- Each U_i has at least one vertex v_i that is essential in G \ F.
- Can assume that v_i is M-exposed.
- Choose a max matching N in G \ F.
- N Δ M is a disjoint union of even cycles and even paths.
- Note: each of the k paths starting at some v_i contains at least one edge of F.
- Thus, |F| ≥ k ≥ 2(νf(G) − ν(G)), as desired.
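The lower bound can be sanity-checked numerically. This sketch is my own brute force (and the “bowtie” example is mine, not from the talk); it compares the minimum stabilizer size to 2(νf(G) − ν(G)):

```python
from itertools import combinations, product

def nu(E):
    """Maximum matching size, brute force."""
    best = 0
    for k in range(1, len(E) + 1):
        for S in combinations(E, k):
            ends = [v for e in S for v in e]
            if len(ends) == len(set(ends)):
                best = max(best, k)
    return best

def tau_f(V, E):
    """Minimum fractional vertex cover (half-integral optimum)."""
    best = float("inf")
    for vals in product((0.0, 0.5, 1.0), repeat=len(V)):
        x = dict(zip(V, vals))
        if all(x[u] + x[v] >= 1 for u, v in E):
            best = min(best, sum(vals))
    return best

def min_stabilizer(V, E):
    """Smallest F such that G \\ F is stable, by exhaustive search."""
    for k in range(len(E) + 1):
        for F in combinations(E, k):
            rest = [e for e in E if e not in F]
            if nu(rest) == tau_f(V, rest):
                return list(F)

C5 = ([0, 1, 2, 3, 4], [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)])
# "bowtie" (my example): two triangles sharing vertex 0
BOW = ([0, 1, 2, 3, 4],
       [(0, 1), (0, 2), (1, 2), (0, 3), (0, 4), (3, 4)])
for V, E in (C5, BOW):
    F = min_stabilizer(V, E)
    gap = tau_f(V, E) - nu(E)      # = nu_f(G) - nu(G) by LP duality
    print(len(F), 2 * gap, len(F) >= 2 * gap)
```

In both cases νf − ν = 1/2, the bound gives |F| ≥ 1, and one edge deletion indeed suffices, so the bound is tight here.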
SLIDE 81–85
Theorem
There is a poly-time algorithm that finds a stabilizer F with ν(G) = ν(G \ F) and |F| ≤ 4ω · OPT, where ω is the sparsity of G.
Lemma
If ν(G) < νf(G), we can find in poly-time a set L ⊆ E with |L| ≤ 4ω, ν(G) = ν(G \ L), and νf(G \ L) = νf(G) − 1/2.
Algorithm:
◮ Initialize F ← ∅;
◮ WHILE G \ F is not stable:
◮ apply the Lemma to find L;
◮ F ← F ∪ L.
Note: In each iteration, at most 4ω edges are added to F; there are at most 2(νf(G) − ν(G)) ≤ OPT iterations.
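The WHILE loop above can be sketched as follows. This is my own illustration: the Lemma's actual subroutine (half-integral matchings, marking) is replaced by a brute-force stand-in that finds a small L preserving ν and lowering νf by exactly 1/2, so only the outer loop structure matches the talk:

```python
from itertools import combinations, product

def nu(E):
    """Maximum matching size, brute force."""
    best = 0
    for k in range(1, len(E) + 1):
        for S in combinations(E, k):
            ends = [v for e in S for v in e]
            if len(ends) == len(set(ends)):
                best = max(best, k)
    return best

def tau_f(V, E):
    """Minimum fractional vertex cover (half-integral optimum)."""
    best = float("inf")
    for vals in product((0.0, 0.5, 1.0), repeat=len(V)):
        x = dict(zip(V, vals))
        if all(x[u] + x[v] >= 1 for u, v in E):
            best = min(best, sum(vals))
    return best

def find_L(V, E):
    """Brute-force stand-in for the Lemma: smallest L keeping
    nu(G) and lowering nu_f(G) (= tau_f(G)) by exactly 1/2."""
    for k in range(1, len(E) + 1):
        for L in combinations(E, k):
            rest = [e for e in E if e not in L]
            if nu(rest) == nu(E) and tau_f(V, rest) == tau_f(V, E) - 0.5:
                return list(L)

def stabilize(V, E):
    """Outer loop from the slide: peel off one L per iteration."""
    F = []
    while True:
        rest = [e for e in E if e not in F]
        if nu(rest) == tau_f(V, rest):      # G \ F is stable
            return F
        F += find_L(V, rest)

V, E = [0, 1, 2, 3, 4], [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
F = stabilize(V, E)
print(F, nu([e for e in E if e not in F]) == nu(E))   # [(0, 1)] True
```

On C5 the gap νf − ν is 1/2, so a single iteration suffices, matching the iteration bound on the slide.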
SLIDE 86–95
Lemma
If ν(G) < νf(G), the following algorithm returns a set L ⊆ E with |L| ≤ 4ω, ν(G) = ν(G \ L), and νf(G \ L) = νf(G) − 1/2.
Algorithm:
- 1. Compute an optimal (half-integral) fractional matching ŷ with the least number of odd cycles C1, . . . , Cm in its support.
- 2. Compute an optimal (half-integral) fractional vertex cover x̂.
- 3. Construct a max matching M in the support of ŷ.
- 4. Mark all vertices in H := C1.
- 5. If there exists a marked u ∈ H with |L_u := {uv ∈ E | x̂_v = 1/2}| ≤ 4ω, STOP; return L = L_u.
- 6. Else, choose a marked vertex in H adjacent to some w ∉ H with x̂_w = 1/2.
- 7. Add w and its matching neighbour z to H; mark z and iterate.
Invariant: Isolating a marked vertex won’t decrease ν(G), but will decrease τf(G) by 1/2.
SLIDE 96–100
Summary
Stable graphs are nicely characterized. The problem of finding a stabilizer of minimum cardinality is at least as hard as vertex cover. We have (so far):
◮ a 2-approximation for regular graphs;
◮ a 2-approximation for the min stabilizer problem w.r.t. a fixed matching M (also vertex-cover-hard);
◮ a 4ω-approximation;
◮ ν(G) = ν(G \ F) for any minimum stabilizer F.
We would love to find a constant-factor approximation!
Thank you!