SLIDE 1

Quantum Speedup for Graph Sparsification, Cut Approximation and Laplacian Solving

Simon Apers¹, Ronald de Wolf²

¹ Inria, France and CWI, the Netherlands
² QuSoft, CWI and University of Amsterdam, the Netherlands

Simons Institute, April 2020

(arXiv:1911.07306)

SLIDE 2

Graphs

SLIDES 3–10

graphs are nice
• all over computer science, discrete math, biology, ...
• describe relations, networks, groups, ...

sparse graphs are nicer
• less space to store
• less time to process
• example: expanders are more interesting than complete graphs

can we compress general graphs to sparse graphs?

SLIDE 11

Graph Sparsification

SLIDES 12–13

undirected, weighted graph G = (V, E, w)
n nodes and m edges, m ≤ n²

• adjacency-list access: query (i, k) returns the k-th neighbor j of node i

SLIDE 14

Graph Sparsification
“graph sparsification” = reduce number of edges, while preserving interesting quantities

SLIDES 15–18

Graph Sparsification
what are “interesting quantities”?
extremal cuts, eigenvalues, random walk properties, ...
→ typically captured by the graph Laplacian L_G

L_G = D − A with D_ii = Σ_j w(i, j) and A_ij = w(i, j)

SLIDES 19–21

Graph Laplacian
equivalently, L_G = Σ_{(i,j)∈E} w(i, j) L_(i,j)

with L_(i,j) = (e_i − e_j)(e_i − e_j)^T, the matrix with 1 at entries (i, i) and (j, j), −1 at entries (i, j) and (j, i), and 0 elsewhere

SLIDES 22–31

Graph Laplacian
mainly interested in quadratic forms in L_G:

x^T L_G x = Σ_{(i,j)} w(i, j) x^T L_(i,j) x = Σ_{(i,j)} w(i, j) (x(i) − x(j))²

e.g., if x_S is the indicator vector of S ⊆ V:

x_S^T L_G x_S = Σ_{(i,j)} w(i, j) (x_S(i) − x_S(j))² = Σ_{i∈S, j∈S^c} w(i, j) = cut_G(S)
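The cut identity above is easy to verify numerically; a minimal illustration (the example graph and helper are ours, and assume numpy):

```python
import numpy as np

# Check that x_S^T L_G x_S = cut_G(S) for an indicator vector x_S.

def laplacian(n, weighted_edges):
    # weighted_edges: list of (i, j, w) for an undirected weighted graph
    L = np.zeros((n, n))
    for i, j, w in weighted_edges:
        L[i, i] += w; L[j, j] += w
        L[i, j] -= w; L[j, i] -= w
    return L

edges = [(0, 1, 2.0), (1, 2, 3.0), (0, 2, 1.0), (2, 3, 5.0)]
L = laplacian(4, edges)

S = {0, 1}
x = np.array([1.0 if v in S else 0.0 for v in range(4)])
quad = x @ L @ x                                   # quadratic form x^T L x
cut = sum(w for i, j, w in edges if (i in S) != (j in S))
```

Here the edges (1, 2) and (0, 2) cross the cut, so both sides equal 3.0 + 1.0 = 4.0.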

SLIDES 32–33

Graph Laplacian
as it turns out, the quadratic forms x^T L_G x and x^T L_G^+ x for x ∈ R^n describe cut values, eigenvalues, effective resistances, hitting times, ...
→ interested in preserving quadratic forms!

SLIDES 34–39

Spectral Sparsification
= approximately preserve all quadratic forms

definition: H is an ε-spectral sparsifier of G iff x^T L_H x = (1 ± ε) x^T L_G x for all x ∈ R^n
equivalently: x^T L_H^+ x = (1 ± O(ε)) x^T L_G^+ x
equivalently: (1 − ε) L_G ⪯ L_H ⪯ (1 + ε) L_G
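The Loewner-order form of the definition can be checked numerically: H is an ε-sparsifier iff every eigenvalue of L_G^{+1/2} L_H L_G^{+1/2}, restricted to range(L_G), lies in [1 − ε, 1 + ε]. A small sketch of such a check (our own code, assuming numpy and that H and G share a connected vertex set):

```python
import numpy as np

def laplacian(n, edges):
    L = np.zeros((n, n))
    for i, j, w in edges:
        L[i, i] += w; L[j, j] += w
        L[i, j] -= w; L[j, i] -= w
    return L

def is_sparsifier(LG, LH, eps, tol=1e-9):
    vals, vecs = np.linalg.eigh(LG)
    nz = vals > tol                                   # drop the kernel of L_G
    pinv_sqrt = vecs[:, nz] @ np.diag(vals[nz] ** -0.5) @ vecs[:, nz].T
    ev = np.linalg.eigvalsh(pinv_sqrt @ LH @ pinv_sqrt)
    ev = ev[np.abs(ev) > tol]                         # eigenvalues on range(L_G)
    return bool(np.all(ev >= 1 - eps - tol) and np.all(ev <= 1 + eps + tol))

LG = laplacian(3, [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.0)])   # unit triangle
LH = 1.05 * LG                                                # a 0.05-sparsifier
```

Scaling all weights by 1.05 is trivially a 0.05-sparsifier; scaling by 1.2 is not.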

SLIDES 40–44

Spectral Sparsification
how sparse can we go?
Karger ’94, Benczúr-Karger ’96, Spielman-Teng ’04, Batson-Spielman-Srivastava ’08:

Theorem
Every graph has an ε-spectral sparsifier H with Õ(n/ε²) edges; H can be found in time Õ(m).

(only relevant when ε ≫ √(n/m))

SLIDES 45–46

Applications
important building block of many Õ(m) cut approximation algorithms:
• max cut (Arora-Kale ’07)
• min cut (Karger ’00)
• min st-cut (Peng ’16)
• sparsest cut (Sherman ’09)
• ...

SLIDES 47–50

Applications
crucial component of the Spielman-Teng breakthrough Laplacian solver:

Theorem (Spielman-Teng ’04)
Let G be a graph with m edges. The Laplacian system L_G x = b can be approximately solved in time Õ(m).

= Gödel Prize 2015

→ Õ(m) approximation algorithms for
• electrical flows and max flows
• spectral clustering
• random walk properties
• learning from data on graphs
• ...
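What "solving L_G x = b" means can be shown with a dense stand-in (this is NOT the near-linear-time Spielman-Teng solver, just a pseudo-inverse solve with numpy): L_G is singular, with the all-ones vector in its kernel for a connected graph, so b must have zero sum to lie in range(L_G).

```python
import numpy as np

# Dense stand-in for a Laplacian solve: x = L_G^+ b for zero-sum b.

def laplacian(n, edges):
    L = np.zeros((n, n))
    for i, j, w in edges:
        L[i, i] += w; L[j, j] += w
        L[i, j] -= w; L[j, i] -= w
    return L

L = laplacian(3, [(0, 1, 1.0), (1, 2, 2.0)])   # a weighted path graph
b = np.array([1.0, 0.0, -1.0])                  # zero-sum right-hand side
x = np.linalg.pinv(L) @ b
residual = np.linalg.norm(L @ x - b)
```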

SLIDES 51–53

Our Contribution
classically, Õ(m) runtime is optimal for most graph algorithms
can we do better using a quantum computer?

(disclaimer: not with this one we won’t)

SLIDES 54–57

Our Contribution

this work:
1. quantum algorithm to find an ε-spectral sparsifier H in time Õ(√(mn)/ε)
2. matching Ω(√(mn)/ε) lower bound
3. applications: quantum speedup for
   ◮ max cut, min cut, min st-cut, sparsest cut, ...
   ◮ Laplacian solving, approximating resistances and random walk properties, spectral clustering, ...


SLIDE 59

Classical Sparsification Algorithm

SLIDES 60–64

Classical Sparsification Algorithm
Sparsification by edge sampling:
1. associate a probability p_e with every edge
2. keep every edge e with probability p_e, rescale its weight by 1/p_e

ensures that E(w_e^H) = w_e^G and hence E(L_H) = E(Σ_e w_e L_e) = L_G

how to ensure concentration?
[Spielman-Srivastava ’08]: give high p_e to edges with high effective resistance!
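The two-step sampling above can be illustrated numerically (our own example; the probabilities are arbitrary): keeping edge e with probability p_e and rescaling its weight by 1/p_e makes the kept weight unbiased, E(w_e^H) = w_e^G.

```python
import random

# Keep each edge with probability p_e and rescale its weight by 1/p_e.

def sample_graph(weighted_edges, probs, rng):
    H = []
    for e, w in weighted_edges:
        p = probs[e]
        if rng.random() < p:
            H.append((e, w / p))      # rescaled weight is unbiased
    return H

edges = [(("a", "b"), 2.0), (("b", "c"), 4.0)]
probs = {("a", "b"): 0.5, ("b", "c"): 0.25}

# Empirical average weight of each edge over many independent samples:
rng = random.Random(0)
trials = 20000
avg = {e: 0.0 for e, _ in edges}
for _ in range(trials):
    for e, w in sample_graph(edges, probs, rng):
        avg[e] += w / trials
```

The empirical averages concentrate around the original weights 2.0 and 4.0.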

SLIDES 65–69

Classical Sparsification Algorithm
effective resistance R_(i,j)
= resistance between i and j after replacing all edges with resistors
= (Ohm’s law) voltage difference required between i and j when sending unit current from i to j
→ small if many short and parallel paths from i to j!

[figure: example graph with a red edge of R_e = 1 and black edges with R_e ∈ O(1/n)]
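Effective resistance has the standard Laplacian formula R_(i,j) = (e_i − e_j)^T L_G^+ (e_i − e_j); a small numerical check (our own example, assuming numpy) on a triangle of unit resistors, where the direct 1 Ω edge is in parallel with a 2 Ω path, giving 2/3 Ω:

```python
import numpy as np

# R(i, j) = (e_i - e_j)^T L^+ (e_i - e_j)

def laplacian(n, edges):
    L = np.zeros((n, n))
    for i, j, w in edges:
        L[i, i] += w; L[j, j] += w
        L[i, j] -= w; L[j, i] -= w
    return L

def effective_resistance(L, i, j):
    Lp = np.linalg.pinv(L)
    e = np.zeros(L.shape[0]); e[i] = 1.0; e[j] = -1.0
    return float(e @ Lp @ e)

# Triangle of unit resistors: 1 Ohm in parallel with 2 Ohm = 2/3 Ohm
L = laplacian(3, [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.0)])
R = effective_resistance(L, 0, 1)
```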

SLIDES 70–75

? how to identify high-resistance edges ?
[Koutis-Xu ’14]: a graph spanner must contain all high-resistance edges
= subgraph F of G with Õ(n) edges, all distances stretched by a factor ≤ log n:
for all i, j: d_G(i, j) ≤ d_F(i, j) ≤ log(n) · d_G(i, j)

[figure: a graph G next to a spanner F of G]

SLIDES 76–78

[Koutis-Xu ’14]: a graph spanner must contain all high-resistance edges!
proof idea for R_e = 1: if R_e = 1, there are no alternative paths between the endpoints of e; hence, e must be present in the spanner
SLIDE 79

Classical Sparsification Algorithm Iterative sparsification:

1

construct O(1/ǫ2) spanners and keep these edges

2

keep any remaining edge with probability 1/2, and double its weight

22

slide-80
SLIDE 80

Classical Sparsification Algorithm Iterative sparsification:

1

construct O(1/ǫ2) spanners and keep these edges

2

keep any remaining edge with probability 1/2, and double its weight

(i.e., we set pe = 1 for spanner edges and pe = 1/2 for other edges)

22

slide-81
SLIDE 81

Classical Sparsification Algorithm Iterative sparsification:

1

construct O(1/ǫ2) spanners and keep these edges

2

keep any remaining edge with probability 1/2, and double its weight

(i.e., we set pe = 1 for spanner edges and pe = 1/2 for other edges)

Theorem (Spielman-Srivastava ’08, Koutis-Xu ’14)

W.h.p. output is ǫ-spectral sparsifier with m/2 + O(n/ǫ2) edges

22

slide-82
SLIDE 82

Classical Sparsification Algorithm Iterative sparsification:

1

construct O(1/ǫ2) spanners and keep these edges

2

keep any remaining edge with probability 1/2, and double its weight

(i.e., we set pe = 1 for spanner edges and pe = 1/2 for other edges)

Theorem (Spielman-Srivastava ’08, Koutis-Xu ’14)

W.h.p. output is ǫ-spectral sparsifier with m/2 + O(n/ǫ2) edges → repeat O(log n) times: ǫ-spectral sparsifier with O(n/ǫ2) edges

22

SLIDES 83–84

Quantum Sparsification Algorithm
= quantum spanner algorithm + k-independent oracle + a magic trick

SLIDES 85–93

Quantum Spanner Algorithm

Theorem (“easy”)
There is a quantum spanner algorithm with query complexity Õ(√(mn)).

greedy spanner algorithm:
1. set F = (V, E_F = ∅)
2. iterate over every edge (i, j) ∈ E \ E_F: if δ_F(i, j) > log n, add (i, j) to F

quantum greedy spanner algorithm:
1. set F = (V, E_F = ∅)
2. until no more edges are found, do: Grover search for an edge (i, j) such that δ_F(i, j) > log n; add (i, j) to F

→ can prove: Õ(n) edges are found using Õ(√(mn)) queries
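The classical greedy loop can be sketched as follows (our own Python illustration of the unweighted case; the quantum version replaces the linear edge scan with Grover search):

```python
import math
from collections import deque

# Greedy (log n)-stretch spanner: add edge (i, j) iff the current
# distance in F between i and j exceeds log n.

def bfs_dist(adj, s, t, cutoff):
    # shortest-path distance from s to t in F, or math.inf if beyond cutoff
    if s == t:
        return 0
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        if dist[u] >= cutoff:
            continue
        for v in adj.get(u, []):
            if v not in dist:
                dist[v] = dist[u] + 1
                if v == t:
                    return dist[v]
                q.append(v)
    return math.inf

def greedy_spanner(n, edges):
    stretch = max(math.log(n), 1.0)
    adj, F = {}, []
    for i, j in edges:
        if bfs_dist(adj, i, j, stretch) > stretch:
            F.append((i, j))
            adj.setdefault(i, []).append(j)
            adj.setdefault(j, []).append(i)
    return F

# Complete graph on 8 nodes: the star from node 0 already gives stretch 2 < log 8
n = 8
edges = [(i, j) for i in range(n) for j in range(i + 1, n)]
F = greedy_spanner(n, edges)
```

On K₈ the greedy rule keeps only the 7 star edges scanned first, since every later pair is already at distance 2 ≤ log 8.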

SLIDES 94–96

Quantum Spanner Algorithm

Theorem (“less easy”)
There is a quantum spanner algorithm with time complexity Õ(√(mn)).

= (roughly) [Thorup-Zwick ’01] classical construction of a spanner by growing small shortest-path trees (SPTs)
+ [Dürr-Heiligman-Høyer-Mhalla ’04] quantum speedup for constructing SPTs

SLIDES 97–100

Quantum Sparsification Algorithm
Iterative sparsification:
1. use the quantum algorithm to construct O(1/ε²) spanners, keep these edges
2. keep any remaining edge with probability 1/2, and double its weight

→ after 1 iteration: “intermediate” graph with ≈ m/2 edges
? how to keep track of it in time o(m) ?

SLIDES 101–104

Query Access to Random String
maintain an (offline) random string x ∈ {0, 1}^(n choose 2)

[figure: bit string 1 0 0 1 1 0 1 1 1 0 1 0 0; a 0-bit means edge (i, j) is discarded, a 1-bit means edge (i′, j′) is kept]

(oblivious to the graph!)

query (i, k) → (j, x(i, j))

SLIDES 105–107

Query Access to Random String
problem: time Ω(n²) to generate a random x ∈ {0, 1}^(n choose 2)

classical solution: “lazy sampling” (generate bits on demand)
quantumly this is not possible: queries can address all bits in superposition

SLIDES 108–110

Rid of Random String
luckily, we can outsmart this quantum demon:

Fact
A k/2-query quantum algorithm cannot distinguish a uniformly random string from a k-wise independent string.*
= easy consequence of the polynomial method [Beals-Buhrman-Cleve-Mosca-de Wolf ’98]

* a k-wise independent string x ∈ {0, 1}^(n choose 2) behaves uniformly random on every subset of k bits

SLIDES 111–115

Rid of Random String
aim for a quantum algorithm making ∼ √(mn) queries, so it suffices to use a k-wise independent (n choose 2)-bit string with k ∼ √(mn)

? can we efficiently query such a string ? (without explicitly generating it!)
→ use recent results on “efficient k-independent hash functions”

Theorem (Christiani-Pagh-Thorup ’15)
Can construct in preprocessing time Õ(k) a k-independent oracle that simulates queries to a k-wise independent string in time O(1) per query.

Corollary
Any k-query quantum algorithm that queries a uniformly random string can be simulated in time Õ(k) without the random string.
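For intuition, here is a textbook k-wise independent construction (NOT the fast Christiani-Pagh-Thorup oracle of the theorem, and with one caveat we gloss over): evaluate a random polynomial of degree < k over a prime field and take the value's parity as the bit. Evaluating such a polynomial at any k distinct points gives jointly uniform field elements, so any k bits are (close to) independent; the parity of a value mod an odd prime is only near-uniform.

```python
import random

PRIME = (1 << 61) - 1  # a Mersenne prime, comfortably larger than any index

def make_k_wise_oracle(k, rng):
    # random polynomial of degree < k over F_PRIME
    coeffs = [rng.randrange(PRIME) for _ in range(k)]
    def bit(index):
        # Horner evaluation at `index`, then parity of the field element
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * index + c) % PRIME
        return acc & 1
    return bit

rng = random.Random(42)
bit = make_k_wise_oracle(k=8, rng=rng)
samples = [bit(i) for i in range(100)]
```

The oracle is deterministic once constructed: repeated queries to the same index return the same bit, as the simulation requires.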

SLIDES 116–119

Quantum Sparsification Algorithm
Quantum iterative sparsification:
1. use the quantum algorithm to construct O(1/ε²) spanners, keep these edges
2. construct a k-independent oracle that marks remaining edges with probability 1/2, and double their weights

→ per iteration: complexity Õ(√(mn)/ε²)

Theorem
There is a quantum algorithm that constructs an ε-spectral sparsifier with Õ(n/ε²) edges in time Õ(√(mn)/ε²).

SLIDE 120

A Magic Trick

SLIDES 121–127

A Magic Trick
to improve the ε-dependency:
1. create a rough ε-spectral sparsifier H for ε = 1/10 → Õ(√(mn)) using our quantum algorithm
2. estimate effective resistances for H → Õ(n) using classical Laplacian solving
   = approximation of the effective resistances of G!
3. sample Õ(n/ε²) edges from G using these estimates → in time Õ(√(mn)/ε) using Grover search

Theorem (our main result)
There is a quantum algorithm that constructs an ε-spectral sparsifier with Õ(n/ε²) edges in time Õ(√(mn)/ε).

* assuming ε ≥ √(n/m), it holds that Õ(√(mn)/ε) ∈ Õ(m)
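The resistance-based sampling of step 3 can be sketched classically (our own stand-in for the Grover-accelerated version, assuming numpy; the oversampling parameter q and exact probabilities are illustrative, not the paper's choices): sample each edge with probability proportional to w_e · R_e and rescale kept weights by 1/p_e.

```python
import numpy as np

# Spielman-Srivastava-style sampling: p_e ~ w_e * R_e (effective resistance).

def laplacian(n, edges):
    L = np.zeros((n, n))
    for i, j, w in edges:
        L[i, i] += w; L[j, j] += w
        L[i, j] -= w; L[j, i] -= w
    return L

def sample_by_resistance(n, edges, q, rng):
    Lp = np.linalg.pinv(laplacian(n, edges))
    H = []
    for i, j, w in edges:
        e = np.zeros(n); e[i] = 1.0; e[j] = -1.0
        R = float(e @ Lp @ e)              # effective resistance of (i, j)
        p = min(1.0, q * w * R)            # q is an oversampling parameter
        if rng.random() < p:
            H.append((i, j, w / p))        # rescale kept weight by 1/p_e
    return H

rng = np.random.default_rng(0)
edges = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (0, 3, 1.0), (0, 2, 1.0)]
H = sample_by_resistance(4, edges, q=2.0, rng=rng)
```

High-resistance edges get p_e = 1 and are always kept, which is exactly why spanner edges are safe to keep deterministically.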


SLIDES 129–130

Matching Quantum Lower Bound
intuition: finding k marked elements among M elements takes Ω(√(Mk)) quantum queries
“hence” finding the Õ(n/ε²) edges of a sparsifier among m edges takes time Ω(√(mn)/ε)

SLIDES 131–132

Unsparsifiable Graph
random bipartite graph on 1/ε² nodes

SLIDES 133–134

Unsparsifiable Graph
ε²n copies = random graph H(n, ε) with n nodes and O(n/ε²) edges

Theorem (Andoni-Chen-Krauthgamer-Qin-Woodruff-Zhang ’16)
Any ε-spectral sparsifier of H(n, ε) must contain a constant fraction of its edges.

SLIDES 135–137

Hiding a Sparsifier
given n, m, ε: we “hide” H(n, ε) in a larger graph G(n, m, ε) with n nodes and m edges
→ an ε-spectral sparsifier of G(n, m, ε) must find a constant fraction of H(n, ε)

SLIDES 138–141

Proving a Lower Bound
“hidden” copy of the random graph: every edge of the sparsifier is hidden among N = m/(nε²) entries

[figure: the adjacency matrix of the original graph, with 1-entries for its edges, next to the adjacency matrix of the hidden graph, a much larger sparse 0/1 matrix in which each original 1-entry is hidden somewhere inside a block]

SLIDES 142–146

Proving a Lower Bound
forgetting about graphs:

A ∈ {0, 1}^(n×n) = OR_(N,blockwise)(B) for a matrix B ∈ {0, 1}^(Nn×Nn): each entry of A is the OR of an N × N block of B

task: output a constant fraction of the 1-bits of A, each described by an OR_N-function
= relational problem composed with OR_N
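The blockwise-OR reduction can be made concrete with a toy matrix (the sizes and the single hidden 1-bit are our own illustration, assuming numpy):

```python
import numpy as np

# Each entry of A is the OR of an N x N block of the larger matrix B.

def blockwise_or(B, N):
    n = B.shape[0] // N
    A = np.zeros((n, n), dtype=int)
    for r in range(n):
        for c in range(n):
            block = B[r * N:(r + 1) * N, c * N:(c + 1) * N]
            A[r, c] = int(block.any())
    return A

N, n = 3, 2
B = np.zeros((N * n, N * n), dtype=int)
B[1, 4] = 1              # a single 1-bit hidden inside block (0, 1)
A = blockwise_or(B, N)
```

Recovering each 1-entry of A means evaluating an OR over N² hidden positions, which is where the √N factor in the lower bound comes from.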

slide-147
SLIDE 147

Proving a Lower Bound ? quantum lower bound for composition of relational problem and ORN-function ?

41

slide-148
SLIDE 148

Proving a Lower Bound ? quantum lower bound for composition of relational problem and OR_N-function ?

Theorem (proof by A. Belovs and T. Lee, to be published)

The quantum query complexity of an efficiently verifiable relational problem, with lower bound L, composed with the OR_N-function, is Ω(L√N).

41

slide-149
SLIDE 149

Proving a Lower Bound ? quantum lower bound for composition of relational problem and OR_N-function ?

Theorem (proof by A. Belovs and T. Lee, to be published)

The quantum query complexity of an efficiently verifiable relational problem, with lower bound L, composed with the OR_N-function, is Ω(L√N). For L = Ω(n) and N = m/(nǫ²):

Corollary

The quantum query complexity of explicitly outputting an ǫ-spectral sparsifier of a graph with n nodes and m edges is Ω(√mn/ǫ).

41
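Spelling out how the corollary's bound arises from the theorem, by substituting the slide's parameters L = Ω(n) and N = m/(nǫ²) into Ω(L√N):

```latex
% substituting L = \Omega(n) and N = m/(n\epsilon^2) into \Omega(L\sqrt{N}):
L\sqrt{N}
  \;=\; \Omega\!\Bigl( n \sqrt{\tfrac{m}{n\epsilon^2}} \Bigr)
  \;=\; \Omega\!\Bigl( \sqrt{\tfrac{n^2 m}{n\epsilon^2}} \Bigr)
  \;=\; \Omega\!\Bigl( \tfrac{\sqrt{mn}}{\epsilon} \Bigr).
```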

slide-150
SLIDE 150

this work:

1. quantum algorithm to find ǫ-spectral sparsifier H in time O(√mn/ǫ)

2. matching Ω(√mn/ǫ) lower bound

3. applications: quantum speedup for
   ◮ max cut, min cut, min st-cut, sparsest cut, . . .
   ◮ Laplacian solving, approximating resistances and random walk properties, spectral clustering, . . .

42

slide-151
SLIDE 151

Quantum Speedups by Quantum Sparsification

43

slide-152
SLIDE 152

Quantum Speedups by Quantum Sparsification graph quantity P, approximately preserved under sparsification

43

slide-153
SLIDE 153

Quantum Speedups by Quantum Sparsification graph quantity P, approximately preserved under sparsification + classical O(m) algorithm for P

43

slide-154
SLIDE 154

Quantum Speedups by Quantum Sparsification

graph quantity P, approximately preserved under sparsification
+ classical O(m) algorithm for P
↓ quantum sparsify G to H in O(√mn/ǫ) + classical algorithm on H in O(n/ǫ²)

43

slide-155
SLIDE 155

Quantum Speedups by Quantum Sparsification

graph quantity P, approximately preserved under sparsification
+ classical O(m) algorithm for P
↓ quantum sparsify G to H in O(√mn/ǫ) + classical algorithm on H in O(n/ǫ²)
= approximate O(√mn/ǫ) quantum algorithm for P

43
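The sparsify-then-run-classically pipeline can be simulated end to end for cut values. In the sketch below, classical uniform edge sampling stands in for the quantum sparsifier; uniform sampling is only a valid sparsifier for edge-transitive graphs such as K_n (all effective resistances coincide), whereas the talk's algorithm samples by effective resistance on arbitrary graphs. All function names are illustrative.

```python
import random

def complete_graph(n):
    """Unit-weight complete graph K_n as an edge -> weight dict."""
    return {(i, j): 1.0 for i in range(n) for j in range(i + 1, n)}

def sparsify_uniform(G, p, rng):
    """Keep each edge independently with probability p and reweight the
    kept edges by 1/p, so every cut is preserved in expectation."""
    return {e: w / p for e, w in G.items() if rng.random() < p}

def cut_value(G, S):
    """Total weight of edges crossing the cut (S, S^c)."""
    return sum(w for (u, v), w in G.items() if (u in S) != (v in S))

rng = random.Random(0)
n = 60
G = complete_graph(n)              # m = n(n-1)/2 = 1770 edges
H = sparsify_uniform(G, 0.3, rng)  # roughly 0.3 * m edges survive

S = set(range(n // 2))             # a balanced test cut: 30 * 30 = 900 crossing edges
exact = cut_value(G, S)
approx = cut_value(H, S)
rel_err = abs(approx - exact) / exact
```

With roughly 70% of the edges gone, the reweighted sample still estimates the cut to within a few percent, which is the "approximately preserved under sparsification" property the pipeline relies on.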

slide-156
SLIDE 156

Cut Approximation

MIN CUT:

find cut (S, Sᶜ) that minimizes cut value cut_G(S)

44

slide-157
SLIDE 157

Cut Approximation

MIN CUT:

find cut (S, Sᶜ) that minimizes cut value cut_G(S)
classically: can find MIN CUT in time O(m) (Karger ’00)

44

slide-158
SLIDE 158

Cut Approximation

MIN CUT of ǫ-spectral sparsifier H

gives ǫ-approximation of MIN CUT of G

45

slide-159
SLIDE 159

Cut Approximation

MIN CUT of ǫ-spectral sparsifier H

gives ǫ-approximation of MIN CUT of G
quantum sparsify G to H in O(√mn/ǫ) + classical MIN CUT on H in O(n/ǫ²) (Karger ’00)
= O(√mn/ǫ) quantum algorithm for ǫ-MIN CUT

45
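The slides invoke Karger's near-linear-time min cut for the classical step. As a compact, runnable stand-in for "classical MIN CUT on H", here is the Stoer-Wagner algorithm (O(n³), pure Python, on an adjacency-matrix representation); the example graph is illustrative.

```python
def stoer_wagner(w):
    """Global minimum cut of an undirected weighted graph.

    w: symmetric n x n adjacency matrix (list of lists); w[i][j] is the
    edge weight (0 if absent). Returns the minimum cut value. O(n^3).
    """
    w = [row[:] for row in w]            # work on a copy; nodes get merged
    vertices = list(range(len(w)))
    best = float("inf")
    while len(vertices) > 1:
        # Minimum-cut phase: grow a set A by repeatedly adding the
        # vertex most tightly connected to it.
        a = vertices[0]
        weights = {v: w[a][v] for v in vertices if v != a}
        order = [a]
        cut_of_phase = 0.0
        while weights:
            z = max(weights, key=weights.get)
            cut_of_phase = weights.pop(z)
            order.append(z)
            for v in weights:
                weights[v] += w[z][v]
        s, t = order[-2], order[-1]
        best = min(best, cut_of_phase)   # value of the cut ({t}, rest)
        # Merge t into s and discard t.
        for v in vertices:
            if v not in (s, t):
                w[s][v] += w[t][v]
                w[v][s] = w[s][v]
        vertices.remove(t)
    return best

# Example: two triangles of weight-2 edges joined by one weight-1
# bridge; the global min cut is the bridge, value 1.
n = 6
w = [[0.0] * n for _ in range(n)]
for u, v in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    w[u][v] = w[v][u] = 2.0
w[2][3] = w[3][2] = 1.0
```

Running this on a sparsified H instead of G gives the ǫ-approximate min cut the slide describes, at cost polynomial in n rather than m.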

slide-160
SLIDE 160

Cut Approximation

                                  Classical                        Quantum (this work)
ǫ-MIN CUT                         O(m) (Karger ’00)                O(√mn/ǫ)
ǫ-MIN st-CUT                      O(m + n/ǫ⁵) (Peng ’16)           O(√mn/ǫ + n/ǫ⁵)
√log n-SPARSEST CUT /
  BAL. SEPARATOR                  O(m + n^{1+δ}) (Sherman ’09)     O(√mn + n^{1+δ})
.878-MAX CUT                      O(m) (Arora–Kale ’07)            O(√mn)

46

slide-161
SLIDE 161

Laplacian Solving

47

slide-162
SLIDE 162

Laplacian Solving general linear system Ax = b

47

slide-163
SLIDE 163

Laplacian Solving general linear system Ax = b given A and b, with nnz(A) = m, complexity of approximating x is O(min{mn, n^ω}) (ω < 2.373)

47

slide-164
SLIDE 164

Laplacian Solving Laplacian system Lx = b

48

slide-165
SLIDE 165

Laplacian Solving Laplacian system Lx = b given L and b, with nnz(L) = m, complexity of approximating x is O(m) [Spielman-Teng ’04]

48

slide-166
SLIDE 166

Laplacian Solving Laplacian system Lx = b given L and b, with nnz(L) = m, complexity of approximating x is O(m) [Spielman-Teng ’04]
+ if H is a sparsifier of G then L⁺_H b ≈ L⁺_G b

48

slide-167
SLIDE 167

Laplacian Solving Laplacian system Lx = b given L and b, with nnz(L) = m, complexity of approximating x is O(m) [Spielman-Teng ’04]
+ if H is a sparsifier of G then L⁺_H b ≈ L⁺_G b
↓ quantum algorithm to sparsify G to H in O(√mn/ǫ) + solve L_H x = b classically in O(n/ǫ²)

48

slide-168
SLIDE 168

Laplacian Solving Laplacian system Lx = b given L and b, with nnz(L) = m, complexity of approximating x is O(m) [Spielman-Teng ’04]
+ if H is a sparsifier of G then L⁺_H b ≈ L⁺_G b
↓ quantum algorithm to sparsify G to H in O(√mn/ǫ) + solve L_H x = b classically in O(n/ǫ²)
= quantum algorithm for Laplacian solving in O(√mn/ǫ)

48

slide-169
SLIDE 169

Laplacian Solving Laplacian system Lx = b given L and b, with nnz(L) = m, complexity of approximating x is O(m) [Spielman-Teng ’04]
+ if H is a sparsifier of G then L⁺_H b ≈ L⁺_G b
↓ quantum algorithm to sparsify G to H in O(√mn/ǫ) + solve L_H x = b classically in O(n/ǫ²)
= quantum algorithm for Laplacian solving in O(√mn/ǫ)

(+ quantum reduction for symmetric, diagonally dominant systems)

48
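To make the L⁺_H b ≈ L⁺_G b step concrete: the sketch below solves a Laplacian system exactly by grounding one node (a tiny dense stand-in for Spielman-Teng's near-linear solver), then compares the solution on K_n against the solution on a uniformly sampled, reweighted sparsifier. Uniform sampling is only appropriate here because K_n is edge-transitive; this is an illustration, not the talk's algorithm.

```python
import random

def laplacian(edges, n):
    """Dense graph Laplacian L = D - W from an edge -> weight dict."""
    L = [[0.0] * n for _ in range(n)]
    for (u, v), w in edges.items():
        L[u][u] += w
        L[v][v] += w
        L[u][v] -= w
        L[v][u] -= w
    return L

def solve_grounded(L, b):
    """Solve Lx = b (connected graph, zero-sum b) by fixing x[0] = 0
    and eliminating the remaining (n-1)x(n-1) system; the result is
    shifted to mean zero, which is exactly the pseudoinverse solution L^+ b."""
    n = len(L)
    m = n - 1
    A = [[L[i][j] for j in range(1, n)] + [b[i]] for i in range(1, n)]
    for c in range(m):                        # elimination with partial pivoting
        p = max(range(c, m), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(c + 1, m):
            f = A[r][c] / A[c][c]
            for k in range(c, m + 1):
                A[r][k] -= f * A[c][k]
    x = [0.0] * m
    for r in range(m - 1, -1, -1):            # back substitution
        x[r] = (A[r][m] - sum(A[r][k] * x[k] for k in range(r + 1, m))) / A[r][r]
    x = [0.0] + x
    mu = sum(x) / n
    return [v - mu for v in x]                # project out the all-ones kernel

n = 60
G = {(i, j): 1.0 for i in range(n) for j in range(i + 1, n)}   # K_n
b = [n - 1.0] + [-1.0] * (n - 1)                               # zero-sum right-hand side
xG = solve_grounded(laplacian(G, n), b)
# For K_n, L acts as n * I on zero-sum vectors, so L^+ b = b / n exactly.

rng = random.Random(1)
p = 0.7
H = {e: w / p for e, w in G.items() if rng.random() < p}       # reweighted sample
xH = solve_grounded(laplacian(H, n), b)
rel_err = (sum((a - c) ** 2 for a, c in zip(xH, xG))
           / sum(c ** 2 for c in xG)) ** 0.5
```

The sparsifier's solution agrees with the exact one up to a modest relative error, which is the guarantee that lets the pipeline solve L_H x = b instead of L_G x = b.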

slide-170
SLIDE 170

Laplacian Solving and Friends

                                   Classical                                Quantum (this work)
ǫ-SDD Solving                      O(m) (ST ’04)                            O(√mn/ǫ)
ǫ-Effective Resistance (single)    O(m)                                     O(√mn/ǫ)   [prior: O(√mn/ǫ²)]
ǫ-Effective Resistance (all)       O(m + n/ǫ⁴) (Spielman–Srivastava ’08)    O(√mn/ǫ + n/ǫ⁴)
O(1)-Cover Time                    O(m) (Ding–Lee–Peres ’10)                O(√mn)
k bottom eigenvalues               O(m + kn/ǫ²)                             O(√mn/ǫ + kn/ǫ²)   [prior, k = 1: O(n²/ǫ)]
Spectral k-means clustering        O(m + n poly(k))                         O(√mn + n poly(k))

49
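For context on the effective-resistance rows: the effective resistance is a quadratic form in the Laplacian pseudoinverse, so an ǫ-spectral sparsifier preserves it up to a 1 ± O(ǫ) factor. In standard notation (definitions not spelled out on the slides):

```latex
R_{\mathrm{eff}}(u,v) \;=\; (e_u - e_v)^{\top} L^{+} (e_u - e_v),
\qquad
(1-\epsilon)\, x^{\top} L_G\, x \;\le\; x^{\top} L_H\, x \;\le\; (1+\epsilon)\, x^{\top} L_G\, x
\;\;\Rightarrow\;\;
R^{H}_{\mathrm{eff}}(u,v) \in \Bigl[\tfrac{1}{1+\epsilon},\, \tfrac{1}{1-\epsilon}\Bigr]\, R^{G}_{\mathrm{eff}}(u,v).
```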

slide-171
SLIDE 171

summary:

50

slide-172
SLIDE 172

summary:

quantum algorithm for spectral sparsification in time O(√mn/ǫ)

50

slide-173
SLIDE 173

summary:

quantum algorithm for spectral sparsification in time O(√mn/ǫ) matching Ω(√mn/ǫ) lower bound

50

slide-174
SLIDE 174

summary:

quantum algorithm for spectral sparsification in time O(√mn/ǫ) matching Ω(√mn/ǫ) lower bound speedup for cut approximation, Laplacian solving, . . .

50

slide-175
SLIDE 175

summary:

quantum algorithm for spectral sparsification in time O(√mn/ǫ) matching Ω(√mn/ǫ) lower bound speedup for cut approximation, Laplacian solving, . . .

open questions:

50

slide-176
SLIDE 176

summary:

quantum algorithm for spectral sparsification in time O(√mn/ǫ) matching Ω(√mn/ǫ) lower bound speedup for cut approximation, Laplacian solving, . . .

open questions:

matching lower bounds for applications? e.g., Ω(√mn/ǫ) for approximate min cut or Laplacian solving?

50

slide-177
SLIDE 177

summary:

quantum algorithm for spectral sparsification in time O(√mn/ǫ) matching Ω(√mn/ǫ) lower bound speedup for cut approximation, Laplacian solving, . . .

open questions:

matching lower bounds for applications? e.g., Ω(√mn/ǫ) for approximate min cut or Laplacian solving?

our O(√mn/ǫ) sparsification algorithm is tight for weighted graphs. can we do better for unweighted graphs?

50

slide-178
SLIDE 178

summary:

quantum algorithm for spectral sparsification in time O(√mn/ǫ) matching Ω(√mn/ǫ) lower bound speedup for cut approximation, Laplacian solving, . . .

open questions:

matching lower bounds for applications? e.g., Ω(√mn/ǫ) for approximate min cut or Laplacian solving?

our O(√mn/ǫ) sparsification algorithm is tight for weighted graphs. can we do better for unweighted graphs?

thank you! stay safe!

50