SLIDE 1

Computing sparsity stuff in real world graphs

Marcin Pilipczuk

a lot of slides by Wojciech Nadara and Michał Pilipczuk
Institute of Informatics, University of Warsaw, Poland

Shonan Village Center, 4th March 2019

Marcin Pilipczuk Sparsity 1/38

SLIDE 2

Mandatory slide

Current research (and my stay here) is part of projects that have received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme, grant agreement No 714704. Experimental work (later in the talk) was supported by the "Recent trends in kernelization: theory and experimental evaluation" project, carried out within the Homing programme of the Foundation for Polish Science, co-financed by the European Union under the European Regional Development Fund.

SLIDES 3-11

What does it mean to be sparse?

For the purpose of this talk we focus on graphs.
Q: What does it mean that a graph is sparse?

Positive examples: bounded max degree, planar, bounded treewidth, ...
Negative examples: cliques, bicliques, ...

Attempt 1. Edge density bounded by a constant: density(G) := |E(G)|/|V(G)| ≤ c.
Note: density(G) is half of the average degree.
Problem: Take a clique of size n plus n² − n isolated vertices. This graph has density < 1/2.
Issue: Although the density is small, the graph contains a dense substructure.
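The counterexample is easy to verify numerically; a minimal sketch in Python (variable names mine):

```python
# Edge density of "K_n plus n^2 - n isolated vertices": globally sparse,
# yet it contains a dense substructure (the clique itself).

def density(num_vertices, num_edges):
    # density(G) = |E(G)| / |V(G)|, i.e. half of the average degree
    return num_edges / num_vertices

n = 100
clique_edges = n * (n - 1) // 2   # edges of K_n
total_vertices = n * n            # K_n plus n^2 - n isolated vertices

print(density(total_vertices, clique_edges))  # 0.495, below 1/2
print(density(n, clique_edges))               # 49.5 = (n-1)/2: the hidden dense part
```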

SLIDES 12-16

What does it mean to be sparse?

Attempt 2. Every subgraph of G has bounded edge density: max_{H ⊆ G} density(H) ≤ c.
Remark: This is essentially equivalent to bounded degeneracy or bounded arboricity.
Problem: Take a clique K_n with each edge subdivided once. In every subgraph of this graph, the number of edges is at most twice the number of vertices.
Issue: We see a dense structure "at depth" 1.
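Since Attempt 2 is essentially degeneracy, the subdivided-clique example can be checked with the standard min-degree peeling algorithm; a sketch (function names mine, graphs as dict-of-sets):

```python
from itertools import combinations

def degeneracy(adj):
    """Degeneracy of a graph given as a dict-of-sets, computed by
    repeatedly removing a minimum-degree vertex."""
    adj = {v: set(ns) for v, ns in adj.items()}   # work on a copy
    best = 0
    while adj:
        v = min(adj, key=lambda u: len(adj[u]))   # a minimum-degree vertex
        best = max(best, len(adj[v]))
        for w in adj[v]:
            adj[w].discard(v)
        del adj[v]
    return best

def subdivided_clique(n):
    """K_n with every edge subdivided once."""
    adj = {i: set() for i in range(n)}
    for u, v in combinations(range(n), 2):
        s = ('sub', u, v)                         # subdivision vertex of uv
        adj[s] = {u, v}
        adj[u].add(s)
        adj[v].add(s)
    return adj

print(degeneracy(subdivided_clique(30)))  # 2, matching "edges <= 2 * vertices"
```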

SLIDES 17-22

Shallow minors

Need: a notion of embedding that looks at constant "depth".

Graph H is a minor of G if there is a minor model φ of H in G. The model φ maps vertices u ∈ V(H) to pairwise disjoint connected subgraphs φ(u) of G, called branch sets. If uv ∈ E(H), then there should be an edge between φ(u) and φ(v).

Graph H is a depth-d minor of G if there is a minor model of H in G in which each branch set has radius at most d.

Idea: Replace subgraphs with shallow minors in the definition of sparsity.
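The conditions on a depth-d minor model can be checked mechanically; a sketch that verifies disjointness, connectivity, radius, and edge realization for a given map φ (representation and function names mine):

```python
from collections import deque

def bfs_within(adj, src, allowed):
    """BFS distances from src, staying inside the vertex set `allowed`."""
    dist = {src: 0}
    q = deque([src])
    while q:
        v = q.popleft()
        for w in adj[v]:
            if w in allowed and w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    return dist

def is_depth_d_minor_model(adj_G, edges_H, phi, d):
    """Check that phi (vertex of H -> set of vertices of G) is a minor model
    of H in G in which every branch set has radius at most d."""
    sets = list(phi.values())
    if len(set().union(*sets)) != sum(len(s) for s in sets):
        return False                       # branch sets are not disjoint
    for s in sets:                         # connected, with radius <= d
        if not any(set(bfs_within(adj_G, c, s)) == s
                   and max(bfs_within(adj_G, c, s).values()) <= d
                   for c in s):
            return False
    for u, v in edges_H:                   # every H-edge realized in G
        if not any(w in adj_G[x] for x in phi[u] for w in phi[v]):
            return False
    return True

# contracting the two halves of a path P4 gives a single edge at depth 1
path = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
phi = {'a': {0, 1}, 'b': {2, 3}}
print(is_depth_d_minor_model(path, [('a', 'b')], phi, 1))  # True
print(is_depth_d_minor_model(path, [('a', 'b')], phi, 0))  # False: radius 1 > 0
```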

SLIDES 23-31

Sparsity, formally

Note: Sparsity is a property of a graph class, not of a single graph.
Notation: C▽d = {depth-d minors of graphs from C}.
Ex: C▽0 is the closure of C under subgraphs.

Bounded expansion. A class of graphs C has bounded expansion if there is a function f: N → N such that density(H) ≤ f(d) for all d ∈ N and H ∈ C▽d.

Nowhere dense. A class of graphs C is nowhere dense if there is a function t: N → N such that K_{t(d)} ∉ C▽d for all d ∈ N. Equivalently, C▽d ≠ Graphs for all d ∈ N.

Intuition: At every constant depth we see a sparse class, but the parameters can deteriorate with increasing depth.
Note: Nowhere dense classes are also sparse: if H ∈ C▽d, then H has O_{ε,d}(n^{1+ε}) edges, for any ε > 0.
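To see why shallow minors repair Attempt 2: contracting each subdivision vertex into one of its endpoints is a depth-1 minor model, so the depth-1 minors of subdivided cliques already contain every K_n. A small sketch of this contraction (names mine):

```python
from itertools import combinations

def subdivided_clique_edges(n):
    """Edge list of K_n with every edge subdivided once."""
    edges = []
    for u, v in combinations(range(n), 2):
        s = ('sub', u, v)                 # the subdivision vertex of edge uv
        edges += [(u, s), (s, v)]
    return edges

def contract_subdivisions(edges):
    """Depth-1 minor: merge each subdivision vertex ('sub', u, v) into its
    endpoint u.  Each branch set is a star around u, so it has radius <= 1."""
    rep = lambda x: x[1] if isinstance(x, tuple) else x
    return {tuple(sorted((rep(a), rep(b))))
            for a, b in edges if rep(a) != rep(b)}

n = 8
minor = contract_subdivisions(subdivided_clique_edges(n))
print(minor == set(combinations(range(n), 2)))  # True: we recovered K_n
```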

SLIDE 32

Hierarchy of sparsity

[Figure by Felix Reidl: the inclusion hierarchy of sparse graph classes, from linear forests, star forests, forests, bounded treedepth, outerplanar, planar, bounded genus, bounded degree, and bounded treewidth, through excluding a minor, excluding a topological minor, locally bounded treewidth, and locally excluding a minor, up to bounded expansion, locally bounded expansion, and nowhere dense.]

SLIDES 33-42

Theory of sparsity

Developed by Nešetřil and Ossona de Mendez since 2005.
The monograph Sparsity presents the field as of 2012.
Many concepts appeared much earlier; the earliest definition of nowhere denseness is due to Podewski and Ziegler in 1976.

In summary:
Bounded expansion and nowhere denseness are fundamental concepts that have multiple equivalent characterizations.
Each characterization yields a different viewpoint and a tool.
There are applications in combinatorics, algorithms, and logic.
Nowhere denseness delimits tractability for many basic problems.
The toolbox seems much more suitable than using decomposition theorems for classes excluding a fixed (topological) minor.

SLIDES 43-48

Characterizations

Sparsity of shallow minors
Sparsity of shallow topological minors
Degeneracy
Weak coloring number / generalized coloring numbers
Uniform quasi-wideness
Neighborhood complexity
Low treedepth colorings
Fraternal augmentations
k-Helly property
Neighborhood covers
Splitter game

SLIDES 49-56

Applications

Our definition of sparsity is based on local contractions, so we should study local problems in this framework.
Q: What (meta-)class of problems is famously local on graphs?

FO model-checking
Input: Graph G, FO sentence ϕ
Question: Does G ⊨ ϕ?

In general graphs, there is an O(n^|ϕ|)-time algorithm.
Goal: runtime f(ϕ) · n^c for a fixed constant c and some function f. Such an algorithm is called fixed-parameter tractable (FPT), parameterized by ϕ.
No such algorithm exists on general graphs, unless FPT = AW[⋆].
There are FPT algorithms for bounded degree, planar, H-minor-free, ...
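For intuition on the O(n^|ϕ|) bound: the naive algorithm is one nested loop per quantifier. A toy evaluation of the sentence ∃x ∃y ∃z (x, y, z pairwise distinct and non-adjacent), i.e. "G has an independent set of size 3" (sketch, names mine):

```python
from itertools import product

def has_independent_triple(adj):
    """Evaluate  Ex Ey Ez (distinct, pairwise non-adjacent)  by brute
    force: one nested loop per quantifier, hence O(n^3) time here and
    O(n^|phi|) for a general sentence phi."""
    vertices = list(adj)
    for x, y, z in product(vertices, repeat=3):
        if len({x, y, z}) == 3 and y not in adj[x] \
                and z not in adj[x] and z not in adj[y]:
            return True
    return False

cycle5 = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
print(has_independent_triple(cycle5))  # False: C5 has no independent triple
```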

SLIDES 57-59

FO model-checking dichotomy

Theorem [Grohe et al., Dvořák et al.]
Let C be a monotone graph class (closed under taking subgraphs). Then:
If C is nowhere dense, then FO model-checking can be done in time f(ϕ) · n^{1+ε} on graphs from C, for any ε > 0.
If C is somewhere dense, then FO model-checking is AW[⋆]-complete on graphs from C.

Nowhere denseness exactly characterizes the monotone classes where FO model-checking is tractable from the parameterized viewpoint. This provides a natural barrier for locality-based methods.

SLIDES 60-65

Gaifman normal form

Gaifman normal form. Every FO sentence on graphs is equivalent to a boolean combination of basic local sentences, each having the following form: there exist u1, u2, ..., uk that are pairwise at distance > 2r, and ψr(ui) holds for each i = 1, ..., k, where r is some integer and ψr(x) is an r-local formula, i.e., the satisfaction of ψr(u) depends only on the r-neighborhood of u.

Ergo, FO model-checking reduces to basic local sentences.
Roughly, the approach for bounded-degree, planar, and H-minor-free classes:
Design a procedure for checking r-local formulas.
Solve an (annotated) instance of r-Scattered Set.
This can be lifted to bounded expansion and nowhere dense classes.
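An r-local formula can be evaluated by extracting the r-ball of each vertex with BFS and querying only that induced subgraph; a sketch (function names and the sample formula are mine):

```python
from collections import deque

def r_ball(adj, u, r):
    """Vertices at distance at most r from u (plain BFS)."""
    dist = {u: 0}
    q = deque([u])
    while q:
        v = q.popleft()
        if dist[v] == r:
            continue
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    return set(dist)

def satisfying_vertices(adj, r, psi):
    """Evaluate an r-local formula at every vertex: psi receives only the
    subgraph induced on the r-ball of u, never the rest of G."""
    result = set()
    for u in adj:
        ball = r_ball(adj, u, r)
        induced = {v: adj[v] & ball for v in ball}   # induced r-neighborhood
        if psi(induced, u):
            result.add(u)
    return result

# sample 1-local formula: "u has at least 2 neighbors"
path = {i: {j for j in (i - 1, i + 1) if 0 <= j < 5} for i in range(5)}
print(satisfying_vertices(path, 1, lambda ball, u: len(ball[u]) >= 2))  # {1, 2, 3}
```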

SLIDE 66

Scattered sets and dominating sets

r-Scattered Set
Input: Graph G, vertex subset A ⊆ V(G), integer k
Question: Is there I ⊆ A with |I| = k such that the r-balls around the vertices of I are pairwise disjoint?

r-Dominating Set
Input: Graph G, vertex subset A ⊆ V(G), integer k
Question: Is there D ⊆ V(G) with |D| = k such that every vertex of A is at distance ≤ r from some vertex of D?

SLIDES 67-72

Scattered sets and dominating sets

Note: sca_r(G, A) ≤ dom_r(G, A).
Fact: For every class C of bounded expansion and every r ∈ N, there is a constant c such that for each G ∈ C and A ⊆ V(G),
sca_r(G, A) ≤ dom_r(G, A) ≤ c · sca_r(G, A).

For both problems, there are dichotomy theorems for monotone classes:
r-ScaSet: FPT for all r on every nowhere dense class; W[1]-hard for some r on every somewhere dense class.
r-DomSet: FPT for all r on every nowhere dense class; W[2]-hard for some r on every somewhere dense class.

Now: a runtime of f(k) · |G|² for r-ScaSet on any nowhere dense C.
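The inequality sca_r(G, A) ≤ dom_r(G, A) holds because the r-balls of a scattered set are pairwise disjoint, so each must contain its own dominator. A greedy sketch that produces an r-scattered subset of A, the object both bounds talk about (names mine):

```python
from collections import deque

def r_ball(adj, u, r):
    """Vertices at distance at most r from u (plain BFS)."""
    dist = {u: 0}
    q = deque([u])
    while q:
        v = q.popleft()
        if dist[v] < r:
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
    return set(dist)

def greedy_r_scattered(adj, A, r):
    """Greedily pick vertices of A whose r-balls avoid all r-balls chosen
    so far; the result is r-scattered (balls pairwise disjoint)."""
    used, chosen = set(), []
    for u in sorted(A):
        ball = r_ball(adj, u, r)
        if not (ball & used):
            chosen.append(u)
            used |= ball
    return chosen

path = {i: {j for j in (i - 1, i + 1) if 0 <= j < 10} for i in range(10)}
print(greedy_r_scattered(path, set(range(10)), 1))  # [0, 3, 6, 9]
```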

SLIDES 73-77

Uniform quasi-wideness

Uniform quasi-wideness. A class C is uniformly quasi-wide with margins s(·) and N(·, ·) if for every graph G ∈ C, all r, m ∈ N, and every vertex subset A ⊆ V(G) of size larger than N(r, m), there exist sets S ⊆ V(G) and B ⊆ A − S with |S| ≤ s(r) and |B| > m such that B is r-scattered in G − S.

Theorem [Nešetřil and Ossona de Mendez]
A class is uniformly quasi-wide iff it is nowhere dense.

Remark: For a fixed C, we have N(r, m) ≤ m^{f(r)}. Given (G, A), the sets S and B can be found in time poly(m) · |G|. This is a very useful statement when working with the Gaifman normal form.

slide-78
SLIDE 78

Removing irrelevant candidates

Setting: Fix r ∈ N, class C, graph G ∈ C, and A ⊆ V (G).

A

Marcin Pilipczuk Sparsity 16/38

slide-79
SLIDE 79

Removing irrelevant candidates

Setting: Fix r ∈ N, class C, graph G ∈ C, and A ⊆ V (G).

Goal: Is there an r-scattered set of size k contained in A? A

Marcin Pilipczuk Sparsity 16/38

slide-80
SLIDE 80

Removing irrelevant candidates

Setting: Fix r ∈ N, class C, graph G ∈ C, and A ⊆ V (G).

Goal: Is there an r-scattered set of size k contained in A?

Claim: There is some M = M(k) with the following property: Provided |A| > M, we can find some u ∈ A such that A contains an r-scattered set of size k iff A − {u} does.

u

A

|A| > M Marcin Pilipczuk Sparsity 16/38

slide-81
SLIDE 81

Removing irrelevant candidates

Setting: Fix r ∈ N, class C, graph G ∈ C, and A ⊆ V (G).

Goal: Is there an r-scattered set of size k contained in A?

Claim: There is some M = M(k) with the following property: Provided |A| > M, we can find some u ∈ A such that A contains an r-scattered set of size k iff A − {u} does. Algorithm:

u

A

|A| > M Marcin Pilipczuk Sparsity 16/38

slide-82
SLIDE 82

Removing irrelevant candidates

Setting: Fix r ∈ N, class C, graph G ∈ C, and A ⊆ V (G).

Goal: Is there an r-scattered set of size k contained in A?

Claim: There is some M = M(k) with the following property: Provided |A| > M, we can find some u ∈ A such that A contains an r-scattered set of size k iff A − {u} does. Algorithm:

Starting from original A, remove vertices one by one until A reaches size M(k).

u

A

|A| > M Marcin Pilipczuk Sparsity 16/38

slide-83
SLIDE 83

Removing irrelevant candidates

Setting: Fix r ∈ N, class C, graph G ∈ C, and A ⊆ V (G).

Goal: Is there an r-scattered set of size k contained in A?

Claim: There is some M = M(k) with the following property: Provided |A| > M, we can find some u ∈ A such that A contains an r-scattered set of size k iff A − {u} does. Algorithm:

Starting from original A, remove vertices one by one until A reaches size M(k). Then brute-force through all M(k)

k

  • = f (k) subsets of size k.

u

A

|A| > M Marcin Pilipczuk Sparsity 16/38

slide-84
SLIDE 84

Removing irrelevant candidates

Setting: Fix r ∈ N, class C, graph G ∈ C, and A ⊆ V (G).

Goal: Is there an r-scattered set of size k contained in A?

Claim: There is some M = M(k) with the following property: Provided |A| > M, we can find some u ∈ A such that A contains an r-scattered set of size k iff A − {u} does. Algorithm:

Starting from original A, remove vertices one by one until A reaches size M(k). Then brute-force through all M(k)

k

  • = f (k) subsets of size k.

Variant of the irrelevant vertex technique.

u

A

|A| > M Marcin Pilipczuk Sparsity 16/38

slide-85
SLIDE 85

Removing irrelevant candidates

Setting: Fix r ∈ N, class C, graph G ∈ C, and A ⊆ V (G).

Goal: Is there an r-scattered set of size k contained in A?

Claim: There is some M = M(k) with the following property: Provided |A| > M, we can find some u ∈ A such that A contains an r-scattered set of size k iff A − {u} does. Algorithm:

Starting from the original A, remove vertices one by one until A reaches size M(k). Then brute-force through all (M(k) choose k) = f(k) subsets of size k.

A variant of the irrelevant vertex technique. Note: one may obtain M(k) = O(k^(1+ε)) for any ε > 0.

[Figure: set A with |A| > M and an irrelevant candidate u ∈ A] Marcin Pilipczuk Sparsity 16/38
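The outer loop of this algorithm is easy to sketch. The fragment below is a minimal Python sketch, not the authors' implementation: graphs are adjacency dicts, r-scattered is taken to mean that the r-balls around the chosen vertices are pairwise disjoint, and the hypothetical helper `find_irrelevant` stands in for the uqw-based Steps 1–3 of the following slides.

```python
from collections import deque
from itertools import combinations

def ball(adj, v, r):
    """All vertices within distance r of v (BFS truncated at depth r)."""
    dist = {v: 0}
    q = deque([v])
    while q:
        u = q.popleft()
        if dist[u] < r:
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
    return set(dist)

def is_r_scattered(adj, pts, r):
    """r-scattered: the r-balls around the points are pairwise disjoint."""
    seen = set()
    for v in pts:
        b = ball(adj, v, r)
        if seen & b:
            return False
        seen |= b
    return True

def scattered_subset_exists(adj, A, k, r, M, find_irrelevant):
    """Shrink A to size M by repeatedly discarding an irrelevant candidate
    (find_irrelevant is a caller-supplied stand-in for Steps 1-3),
    then brute-force over all (M choose k) subsets of size k."""
    A = set(A)
    while len(A) > M:
        A.remove(find_irrelevant(adj, A))
    return any(is_r_scattered(adj, sub, r) for sub in combinations(sorted(A), k))
```

On a path with 9 vertices, two vertices at distance 8 are 1-scattered but not 4-scattered, which matches what the sketch reports.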

slide-90
SLIDE 90

Removing irrelevant candidates

Fix M(k) := N( 2r , (2r + 1)^s(2r) · k ), where s(·) and N(·, ·) are margins for the uqw of C. Step 1: If |A| > M(k), then we can find S and B ⊆ A − S with |S| ≤ s(2r) and |B| > (2r + 1)^s(2r) · k, such that B is 2r-scattered in G − S. Step 2: Classify vertices of B according to profiles towards S:

Profile of b ∈ B is the vector of distances to elements of S, where anything > 2r maps to +∞. At most (2r + 1)^s(2r) possible profiles ⇒ there is a set B′ ⊆ B of more than k vertices with the same profile.

[Figure: example profile (5, 3, +∞)] Marcin Pilipczuk Sparsity 17/38
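Step 2 translates directly into code: one BFS per vertex of S gives all distances, and vertices of B are bucketed by their truncated distance vectors. A small sketch (graphs as adjacency dicts; names are illustrative, not from the talk):

```python
from collections import deque, defaultdict

def bfs_dist(adj, s):
    """Distances from s to every reachable vertex (plain BFS)."""
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return dist

def profile_classes(adj, B, S, r):
    """Group b in B by its vector of distances to S, where anything
    larger than 2r (or unreachable) maps to infinity."""
    S = sorted(S)
    dists = {s: bfs_dist(adj, s) for s in S}
    inf = float('inf')
    groups = defaultdict(list)
    for b in sorted(B):
        prof = []
        for s in S:
            d = dists[s].get(b, inf)
            prof.append(d if d <= 2 * r else inf)
        groups[tuple(prof)].append(b)
    return dict(groups)
```

Any class with more than k vertices then yields the set B′ of the proof.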

slide-97
SLIDE 97

Removing irrelevant candidates

Step 3: We claim that any u ∈ B′ is an irrelevant candidate. Fix any r-scattered I ⊆ A with |I| = k; suppose u ∈ I.

We need to find some I′ ⊆ A − u with the same properties.

2r-balls in G − S around vertices of B are pairwise disjoint. Since |B′| > k, we can find some v ∈ B′, v ≠ u, with no vertex of I in the corresponding ball. Claim: I′ := I − u + v is still r-scattered. Pf: If some w ∈ I − u conflicted with v, then it would already conflict with u.

[Figure: u moved to v; a potential conflicting vertex w] Marcin Pilipczuk Sparsity 18/38

slide-103
SLIDE 103

Neighborhood complexity

Suppose we have a graph G ∈ C for some class C, and a subset of vertices A ⊆ V (G). For fixed r ∈ N, define the following equivalence relation on V (G): u ∼r v iff Br(u) ∩ A = Br(v) ∩ A, where Br(x) is the r-ball with center x. How many equivalence classes may ∼r have? In general, even 2^|A|. Thm: C has b.e. ⇒ index(∼r) ≤ cr · |A| for some constant cr. Thm: C is n.d. ⇒ for any ε > 0, index(∼r) ≤ cr,ε · |A|^(1+ε) for some constant cr,ε. Again, a very useful tool for algorithmic analysis of the instance.

Marcin Pilipczuk Sparsity 19/38
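For a concrete graph, index(∼r) can be computed by brute force: one truncated BFS per vertex and a set of fingerprints. This is a small sketch (fine for graphs that fit in memory; it is not the machinery behind the theorems):

```python
from collections import deque

def r_ball(adj, v, r):
    """Vertices within distance r of v (BFS truncated at depth r)."""
    dist = {v: 0}
    q = deque([v])
    while q:
        u = q.popleft()
        if dist[u] < r:
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
    return set(dist)

def neighborhood_complexity(adj, A, r):
    """index(~_r): the number of distinct traces B_r(v) ∩ A over v ∈ V(G)."""
    A = set(A)
    return len({frozenset(r_ball(adj, v, r) & A) for v in adj})
```

On a 5-vertex path with A = {0, 4}, the 1-ball traces are {0}, ∅, and {4}, i.e. three classes, while for r = 4 every ball covers all of A and one class remains.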

slide-107
SLIDE 107

Low treedepth colorings

A p-centered coloring of G is a function c : V (G) → [k] such that every connected subgraph of G either uses more than p colors or has a vertex of unique color. The same as: every connected subgraph that uses ℓ ≤ p colors has treedepth at most ℓ. Equivalent characterization of bounded expansion: a graph class G has bounded expansion iff for every p there exists k(p) such that every G ∈ G admits a p-centered coloring with at most k(p) colors. Very useful for finding or counting constant-sized subgraphs.

Say we want to count the number of P4 subgraphs in G. Take a 4-centered coloring c. For every X ⊆ [k] of size at most 4, count the number of P4s that use exactly the colors of X. Complexity: O(k^4 · n) (not counting finding the coloring).

Marcin Pilipczuk Sparsity 20/38
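The definition can be checked by brute force on small graphs using two textbook characterizations, assumed here rather than stated on the slide: a coloring is centered on a vertex set iff every connected component contains a uniquely colored vertex whose removal again leaves centered components, and a coloring is p-centered iff its restriction to every set of at most p colors is centered. A sketch of such a verifier (not an algorithm for constructing the colorings):

```python
from itertools import combinations

def components(adj, verts):
    """Connected components of the subgraph induced by `verts` (DFS)."""
    verts, comps = set(verts), []
    while verts:
        v = verts.pop()
        comp, stack = {v}, [v]
        while stack:
            u = stack.pop()
            for w in adj[u]:
                if w in verts:
                    verts.discard(w)
                    comp.add(w)
                    stack.append(w)
        comps.append(comp)
    return comps

def is_centered(adj, verts, col):
    """Each component must have a uniquely colored vertex; removing it
    must again leave centered components."""
    for comp in components(adj, verts):
        counts = {}
        for v in comp:
            counts[col[v]] = counts.get(col[v], 0) + 1
        uniq = next((v for v in comp if counts[col[v]] == 1), None)
        if uniq is None or not is_centered(adj, comp - {uniq}, col):
            return False
    return True

def is_p_centered(adj, col, p):
    """Restrict to every set of at most p colors and check centeredness."""
    palette = sorted(set(col.values()))
    for size in range(1, p + 1):
        for X in combinations(palette, size):
            verts = {v for v in adj if col[v] in X}
            if not is_centered(adj, verts, col):
                return False
    return True
```

For example, 2-coloring a 4-cycle alternately is 1-centered but not 2-centered: taking both colors gives a connected subgraph with no uniquely colored vertex.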

slide-111
SLIDE 111

Degeneracy

1

A graph G is d-degenerate iff every subgraph of G has a vertex of degree at most d.

2

If a graph is d-degenerate, then its vertices can be put in an order so that every vertex has at most d edges going to the left.

3

G is d-degenerate ⇒ ∇0(G) ≤ d

4

∇0(G) ≤ d ⇒ G is 2d-degenerate

Marcin Pilipczuk Sparsity 21/38

slide-116
SLIDE 116

Generalized coloring number

1

Weak reachability: u is weakly r-reachable from v with respect to order σ if there exists a path P from v to u using at most r edges whose every vertex w satisfies u ≤σ w.

2

Strong reachability: the inner vertices of P are additionally required to lie to the right of v.

3

Weak r-coloring number of an order: wcolr(G, σ) = max over v ∈ V (G) of |WReachr[G, σ, v]|

4

Weak r-coloring number of a graph: wcolr(G) = min over σ ∈ Π(G) of wcolr(G, σ)

5

G has bounded expansion iff there exists a function f : N → N such that ∀r ∀G ∈ G: wcolr(G) ≤ f (r).

Marcin Pilipczuk Sparsity 22/38
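For a concrete order σ, wcolr(G, σ) is directly computable from the definition: from each vertex u, run a depth-r BFS restricted to vertices not left of u; every vertex reached this way weakly r-reaches u. A faithful (if not asymptotically optimal) sketch:

```python
from collections import deque

def wcol(adj, order, r):
    """wcol_r(G, sigma) for a concrete order sigma, given as a list."""
    pos = {v: i for i, v in enumerate(order)}
    wreach = {v: {v} for v in order}       # every vertex weakly reaches itself
    for u in order:
        # BFS from u, depth at most r, using only vertices not left of u;
        # any vertex reached has a path to u staying right of u.
        dist = {u: 0}
        q = deque([u])
        while q:
            x = q.popleft()
            if dist[x] == r:
                continue
            for y in adj[x]:
                if y not in dist and pos[y] >= pos[u]:
                    dist[y] = dist[x] + 1
                    q.append(y)
                    wreach[y].add(u)
    return max(len(s) for s in wreach.values())
```

On a path ordered left to right, WReachr of a vertex is itself plus its r predecessors, so wcolr = r + 1 (capped by the path length).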

slide-129
SLIDE 129

Beyond sparsity

Obs: For every nowhere dense class C, FO model checking on the class of complements C̄ := {Ḡ : G ∈ C} is also FPT.

Pf: Negate all edge predicates in the input formula. Might generalize to images under FO interpretations.

Idea 1: Characterize images of nowhere dense classes under (simple) FO interpretations.

Given G = (V, E) and a formula ϕ(x, y), define Gϕ := (V, {(u, v) : G ⊨ ϕ(u, v)}) and Cϕ := {Gϕ : G ∈ C}. Problem: Is FO model checking FPT on Cϕ, for each nowhere dense C and formula ϕ(x, y)?

Issue: We are given only the graph Gϕ, without the preimage G.

On-going work: so far we are able to non-effectively characterize images of bounded expansion classes.

Idea 2: Find a more general property of graph classes such that:

(1) restricted to monotone classes, it collapses to nowhere denseness; (2) is closed under FO interpretations; and (3) gives hope for fixed-parameter tractability of FO model checking.

Candidate: Stability

Marcin Pilipczuk Sparsity 23/38

slide-138
SLIDE 138

Conclusions for the theory part

Theory of structural sparsity in a nutshell:

Basic definitions capture the concept of sparsity that persists under local contractions. An abundance of equivalent characterizations and viewpoints. Each characterization provides a tool that can be used to study combinatorial, algorithmic, and logical aspects. Intuition: Delimits the border of tractability for “local” problems.

Intriguing connections to stability.

Rather a transfer of concepts, techniques, and proof strategies than concrete results. Still largely unexplored.

If interested, see lecture notes: https://www.mimuw.edu.pl/~mp248287/sparsity/

Marcin Pilipczuk Sparsity 24/38

slide-141
SLIDE 141

Experiments

Is this theory anywhere close to being practical? Are real-world graphs of bounded expansion or nowhere dense? How good are the parameters? How good are our algorithmic primitives?

weak coloring numbers uniform quasi-wideness low treedepth colorings

Evaluate one complete algorithm.

Kernelization for Dominating Set

Marcin Pilipczuk Sparsity 25/38

slide-145
SLIDE 145

Experiments: literature

Are real-world graphs of bounded expansion or nowhere dense? How good are the parameters? Demaine, Reidl, Rossmanith, Sanchez Villaamil, Sikdar, Sullivan, arXiv:1406.2587 Analysis of a number of random graph models, proving sparsity properties. Analysis of low treedepth coloring numbers for a number of real-world graphs.

Marcin Pilipczuk Sparsity 26/38

slide-146
SLIDE 146

DRRSSS results

Marcin Pilipczuk Sparsity 27/38

slide-151
SLIDE 151

Counting small subgraphs

O’Brien, Sullivan, arXiv:1712.06690. Compared with NXVF2 on a number of random graphs from sparse models. Punchline: too slow; the low treedepth colorings have too many colors. It outperforms NXVF2 only on artificially prepared data. The NCSU group is working on a new implementation of pattern counting with a new algorithm based on weak coloring numbers.

Marcin Pilipczuk Sparsity 28/38

slide-154
SLIDE 154

Wcols: Determining good orders

1

Determining good degeneracy orders - easy!

2

Determining good generalized coloring numbers orders - hard!

3

Nadara, P., Rabinovich, Reidl, Siebertz, SEA 2018.

Marcin Pilipczuk Sparsity 29/38

slide-155
SLIDE 155

Tested approaches

1

Distance-constrained transitive fraternal augmentations

2

Flat decompositions

3

Treedepth heuristic

4

Treewidth heuristic

5

Two greedy approaches

1

Left-to-right, take vertex maximizing current size of the WReach set

2

Right-to-left, take vertex minimizing current size of the SReach set

6

Other simple heuristics

1

Sorting by descending degree

2

Degeneracy ordering

3

Doing these on the power graph G^r (where V (G^r) = V (G) and uv ∈ E(G^r) ⇔ distG(u, v) ≤ r)

7

Local search applied on top of any produced result

Marcin Pilipczuk Sparsity 30/38
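The G^r construction from item 6 is a one-liner given truncated BFS balls. A sketch (note that G^r can be far denser than G, which is why this is only sensible for small r):

```python
from collections import deque

def power_graph(adj, r):
    """G^r: same vertex set, uv an edge iff 0 < dist_G(u, v) <= r."""
    padj = {}
    for v in adj:
        # BFS from v truncated at depth r collects the r-ball around v.
        dist = {v: 0}
        q = deque([v])
        while q:
            u = q.popleft()
            if dist[u] < r:
                for w in adj[u]:
                    if w not in dist:
                        dist[w] = dist[u] + 1
                        q.append(w)
        padj[v] = sorted(set(dist) - {v})
    return padj
```

On a 5-vertex path, the middle vertex of G^2 is adjacent to everything else.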

slide-156
SLIDE 156

Test data

1

Selection of tests from the KONECT database (modelling various real-world data: collaboration graphs, airline connections, the political blogosphere, neural networks, etc.)

2

Random planar graphs generated by LEDA library

3

Tests from Parameterized Algorithms and Computational Experiments Challenge 2016 competition — Feedback Vertex Set track

4

Graphs generated by random models producing graphs of bounded expansion: the stochastic block model, the Chung-Lu model, and the Chung-Lu model with households.

We partitioned all tests into small, medium, big, and huge based on the number of edges (respectively [0, 10^3), [10^3, 10^4), [10^4, 4.8 · 10^4), and [4.8 · 10^4, ∞)).

Marcin Pilipczuk Sparsity 31/38

SLIDE 157

Comparison of all approaches

SLIDE 158

Local search

SLIDE 159

UQW: Tested approaches

1. Distance trees (variants denoted by tree1, tree2, ld it)
2. From weak coloring numbers to uniform quasi-wideness (variants mfcs, new1, new2, new ld)
3. Naive approach: remove the vertices with the biggest degrees, then greedily compute an independent set in the r-th power of the remaining graph (denoted as ld)
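The naive approach from the last item can be sketched as follows (a hedged reconstruction, assuming an adjacency-dict graph; `to_delete` is a tuning knob introduced here for illustration, not a value from the paper):

```python
from collections import deque

def uqw_naive(adj, r, to_delete):
    """Delete the `to_delete` highest-degree vertices, then greedily
    pick an r-independent set (pairwise distance > r) in the rest."""
    by_degree = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
    deleted = set(by_degree[:to_delete])
    chosen = []
    blocked = set(deleted)
    for v in adj:
        if v in blocked:
            continue
        chosen.append(v)
        # Block everything within distance r of v in the graph with
        # the deleted vertices removed (depth-bounded BFS).
        dist = {v: 0}
        q = deque([v])
        while q:
            x = q.popleft()
            if dist[x] == r:
                continue
            for y in adj[x]:
                if y not in dist and y not in deleted:
                    dist[y] = dist[x] + 1
                    q.append(y)
        blocked |= dist.keys()
    return deleted, chosen
```

Greedily picking an independent set in the r-th power is exactly this BFS-blocking loop, without ever building G^r explicitly.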

SLIDES 160-162

Comparing different outputs

1. Different outputs can vary greatly. How to compare? UQW is a two-dimensional measure.
2. We want the best tradeoff between the number of deleted vertices and the size of the r-independent set.
3. Take the biggest class of r-independent vertices when grouped by r-distance profiles!
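The last point can be sketched as: compute, for every surviving vertex, its profile of (capped) distances to the deleted vertices, group by profile, and keep the largest class. This is an assumed illustration rather than the experimental code; vertices in one class relate to the deleted set identically, which is what the UQW arguments exploit.

```python
from collections import deque, defaultdict

def largest_profile_class(adj, deleted, survivors, r):
    """Group `survivors` by their r-distance profile: the tuple of
    distances to each deleted vertex, capped at r + 1 ("far").
    Returns the largest class."""
    def capped_dists(s):
        # BFS from deleted vertex s in the full graph, up to depth r.
        dist = {s: 0}
        q = deque([s])
        while q:
            x = q.popleft()
            if dist[x] == r:
                continue
            for y in adj[x]:
                if y not in dist:
                    dist[y] = dist[x] + 1
                    q.append(y)
        return dist

    dists = {s: capped_dists(s) for s in deleted}
    classes = defaultdict(list)
    for v in survivors:
        profile = tuple(dists[s].get(v, r + 1) for s in sorted(deleted))
        classes[profile].append(v)
    return max(classes.values(), key=len)
```

For instance, deleting the center of a star puts all leaves into one class, since each leaf has the same distance profile to the deleted vertex.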

SLIDE 163

Results

SLIDE 164

Results

SLIDES 165-170

Conclusions for the experimental part

1. Low treedepth colorings seem too inefficient.
2. Recommended for wcol: right-to-left greedy with local search on top.
3. Recommended for UQW: delete a few high-degree vertices and get greedy.
4. An explanation for the performance of the best algorithms is still missing.
5. Nadara: applied UQW to Dominating Set kernelization and compared with the old experiments on planar graphs (Alber, Fellows, Niedermeier, J. ACM 2004; Alber, Betzler, Niedermeier, Annals of OR 2006). The sparsity-based reduction does not add any power over the known simple neighborhood rules.

Thank you for your attention!