DeepTMA: Predicting Effective Contention Models for Network Calculus - PowerPoint PPT Presentation


SLIDE 1

DeepTMA: Predicting Effective Contention Models for Network Calculus using Graph Neural Networks

Fabien Geyer (1, 2) and Steffen Bondorf (3)
INFOCOM 2019, Wednesday 1st May 2019

(1) Chair of Network Architectures and Services, Technical University of Munich (TUM)
(2) Airbus Central R&T, Munich
(3) Dept. of Information Security and Communication Technology, Norwegian University of Science and Technology (NTNU)

SLIDE 2

Motivation

Worst-Case End-to-End Performance Analysis

[Figure: probability distribution of the end-to-end network delay, with the worst case and the deadline marked]

  • Important for critical applications
  • Need a formal proof on network delay

F. Geyer and S. Bondorf — DeepTMA: Predicting Effective Contention Models for Network Calculus using Graph Neural Networks

SLIDE 3

Motivation

Worst-Case End-to-End Performance Analysis

[Figure: probability distribution of the end-to-end delay annotated with simulation, measurements, and bound tightness; inset: computation effort vs. tightness improvement of analysis methods, with an ideal point]

  • Trade-off between computational effort and tightness
  • This talk: a network analysis method with good tightness and fast execution

SLIDES 4–20

Motivation

Network Calculus – Basics

Basis: Cumulative arrivals and services [Cruz, 1991a]

[Figure: cumulative arrival function A and departure function D over time; the vertical distance between them is the backlog, the horizontal distance is the delay]

Arrival curve α: A(t) − A(t − s) ≤ α(s), ∀ 0 ≤ s ≤ t

Service curve β: a server is said to offer a strict service curve β if, during any backlogged period of duration u, the output of the system is at least equal to β(u)

[Figure: curves α and β bounding arrivals and service; the vertical deviation between them gives the backlog bound, the horizontal deviation gives the delay bound]
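For the common token-bucket arrival curve α(s) = b + r·s and rate-latency service curve β(u) = R·max(u − T, 0), the two deviations described above have simple closed forms. A minimal sketch (function and parameter names are mine, not from the slides):

```python
def delay_bound(b, r, R, T):
    # Horizontal deviation h(alpha, beta) between a token-bucket
    # arrival curve alpha(s) = b + r*s and a rate-latency service
    # curve beta(u) = R * max(u - T, 0): the worst-case delay.
    assert r <= R, "stability requires arrival rate <= service rate"
    return T + b / R

def backlog_bound(b, r, R, T):
    # Vertical deviation v(alpha, beta): the worst-case backlog,
    # reached at u = T where the service curve starts to rise.
    assert r <= R
    return b + r * T
```

For example, b = 2, r = 1, R = 4, T = 0.5 gives a delay bound of 1.0 and a backlog bound of 2.5.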


SLIDE 22

Motivation

Network Calculus – Network Analysis

How to compute end-to-end performance?

[Figure: tandem of servers s1–s4 traversed by flows f1–f4]

SLIDES 23–24

Motivation

Network Calculus – Network Analysis

TFA – Total Flow Analysis [Cruz, 1991b]

  • Step 1: Compute the delay at each server on the path
  • Step 2: Sum the delays

Server concatenation [Le Boudec and Thiran, 2001]: the (min, +) algebra lets us replace the sequence of servers β1, β2, β3 by the single server β1 ⊗ β2 ⊗ β3

→ Pay Bursts Only Once principle
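The concatenation step can be made concrete with a small numeric (min, +) convolution. This is an illustrative sketch on an integer time grid, not the tool implementation used in the paper:

```python
def min_plus_conv(f, g):
    # Numeric (min,+) convolution on an integer time grid:
    # (f (x) g)[t] = min over 0 <= s <= t of f[s] + g[t - s]
    n = min(len(f), len(g))
    return [min(f[s] + g[t - s] for s in range(t + 1)) for t in range(n)]

def rate_latency(R, T, n):
    # Sample a rate-latency curve beta(t) = R * max(t - T, 0) at t = 0..n-1.
    return [R * max(t - T, 0) for t in range(n)]
```

Convolving two rate-latency curves (R1, T1) and (R2, T2) yields the rate-latency curve (min(R1, R2), T1 + T2), which is why a concatenated tandem pays each burst only once.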

SLIDES 25–26

Motivation

Network Calculus – Network Analysis

SFA – Separate Flow Analysis [Le Boudec and Thiran, 2001]

  • Step 1: Compute the per-server residual service (s3^l.o., s4^l.o.)
  • Step 2: Concatenate the servers (s3^l.o. ⊗ s4^l.o.)
  • Step 3: Compute the delay over the concatenated server

PMOO – Pay Multiplexing Only Once [Schmitt et al., 2008b]

  • Step 1: Concatenate the servers (s3 ⊗ s4)
  • Step 2: Compute the residual service ((s3 ⊗ s4)^l.o.)
  • Step 3: Compute the delay over the concatenated server

[Figure: flow f3 crossing servers s1–s4 together with flows f1, f2, f4; SFA subtracts cross traffic per server before concatenating, PMOO concatenates first]
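The residual-service step has a well-known closed form under arbitrary multiplexing: subtracting token-bucket cross traffic from a rate-latency server again yields a rate-latency curve. A sketch with assumed parameter names:

```python
def leftover_service(R, T, b, r):
    # Residual (leftover) service curve under arbitrary multiplexing:
    # [beta - alpha]^+ for rate-latency beta = (R, T) and token-bucket
    # cross traffic alpha = (b, r) is again rate-latency, with
    # rate R - r and latency (R*T + b) / (R - r).
    assert r < R, "cross traffic must not saturate the server"
    return (R - r, (R * T + b) / (R - r))
```

For example, a server (R = 10, T = 1) crossed by traffic (b = 2, r = 4) leaves the residual rate-latency curve (6, 2.0).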

SLIDES 27–28

Motivation

Network Calculus – TMA

TMA – Tandem Matching Analysis [Bondorf et al., 2017]

  • Main concept: apply concatenation only to some of the servers
  • Exhaustive search to find which concatenations result in the tightest end-to-end delay → O(2^(n−1)) alternatives

[Figure: the four decompositions of a three-server tandem: SFA cuts between every pair of servers (β1 | β2 | β3), PMOO uses no cut (β1 ⊗ β2 ⊗ β3), and the two mixed alternatives are β1 ⊗ β2 | β3 and β1 | β2 ⊗ β3]
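The search space can be enumerated directly: each of the n − 1 boundaries between adjacent servers is either cut or not. A sketch (names are mine):

```python
from itertools import product

def tandem_decompositions(n_servers):
    # All 2^(n-1) cut patterns of an n-server tandem: between each
    # pair of adjacent servers we either cut (True) or concatenate
    # (False). All-False corresponds to PMOO, all-True to SFA.
    return list(product((False, True), repeat=n_servers - 1))
```

For n = 3 this yields the four alternatives shown on the slide; for a 16-hop path it already yields 32 768, which motivates replacing the exhaustive search by a heuristic.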

SLIDES 29–31

Motivation

Network Calculus – DeepTMA

[Figure: computation effort vs. tightness improvement, placing TFA, SFA, PMOO, TMA, DeepTMA, and the optimization-based analyses (Opt.: [Schmitt et al., 2008a], [Bouillard et al., 2010]) relative to the ideal]

Question: Can we avoid TMA's exhaustive search? → DeepTMA:

  • Main idea: use a fast heuristic for predicting the best cuts
  • Even if the heuristic is wrong, the bounds are still valid

Figure 1: Approach. The network (servers and flows) is fed to the heuristic, whose cut recommendation drives the network calculus TMA analysis that produces the end-to-end latencies.

SLIDE 32

Outline

  • Heuristic based on Graph Neural Networks
  • Numerical evaluation
  • Conclusion

SLIDES 33–34

Heuristic based on Graph Neural Networks

Introduction

Heuristic

  • Use a Graph Neural Network
  • Formulate the choice of cuts as a classification problem

Graph formulation

  • Nodes: flows, servers, cuts
  • Edges: relationships between elements
  • Prediction: whether each cut is applied or not

Figure 2: Classification problem (for each potential cut in the tandem, predict "cut" or "no cut")

Figure 3: Approach (a graph transformation of the network feeds the neural network heuristic; the TMA analysis provides the training points)

SLIDES 35–41

Heuristic based on Graph Neural Networks

Problem formulation as graph

[Figure: graph built from the network: server nodes s1–s4, flow nodes f1–f4, path-ordering nodes, and cut nodes]

Input features: [rate, latency] for servers, [rate, burst] for flows, [path order] on path edges

Output features: [Pr(cut)] for each cut node
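The graph encoding above might be assembled as follows. This is an illustrative sketch with assumed data formats and node names, not the paper's exact feature layout:

```python
def build_analysis_graph(servers, flows, foi):
    # Sketch of the slide's graph encoding (formats assumed):
    #   servers: {name: (rate, latency)}      -> server nodes
    #   flows:   {name: (rate, burst, path)}  -> flow nodes
    #   foi: flow of interest; one cut node is added between each
    #        pair of adjacent servers on its path (TMA decision points).
    nodes, edges = {}, []
    for s, (rate, latency) in servers.items():
        nodes[s] = ("server", [rate, latency])
    for f, (rate, burst, path) in flows.items():
        nodes[f] = ("flow", [rate, burst])
        for order, s in enumerate(path):
            edges.append((f, s, order))        # edge feature: path order
    path = flows[foi][2]
    for a, b in zip(path, path[1:]):
        cut = f"cut_{a}_{b}"
        nodes[cut] = ("cut", [])               # output: Pr(cut) predicted here
        edges += [(cut, a, None), (cut, b, None)]
    return nodes, edges
```

For a four-server tandem and one flow crossing all of it, this produces three cut nodes, matching the three boundaries where TMA can cut.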

SLIDE 42

Heuristic based on Graph Neural Networks

Graph Neural Networks – Introduction

Graph Neural Networks [Scarselli et al., 2009] and related architectures are able to process general graphs and predict features of nodes.

Principle

  • Each node has a hidden vector h_v ∈ R^k
  • ... computed according to the vectors of its neighbors
  • ... and propagated through the graph

Algorithm

  • Initialize h_v^(0) according to the features of the nodes
  • for t = 1, ..., T do
      a_v^(t) = AGGREGATE({h_u^(t−1) | u ∈ Nbr(v)})
      h_v^(t) = COMBINE(h_v^(t−1), a_v^(t))
  • return READOUT(h_v^(T))

SLIDES 43–48

Heuristic based on Graph Neural Networks

Graph Neural Networks – Illustration

[Figure: example graph with nodes A–E; the computation of h_A^(t) unrolls as a tree: AGGREGATE & COMBINE merge the neighbor states h_B^(t−1), h_C^(t−1), h_D^(t−1), which in turn depend on states at t−2, t−3, and so on down to the initial states h^(0); a final READOUT produces the prediction for node A]

SLIDES 49–50

Heuristic based on Graph Neural Networks

Graph Neural Networks – Implementation

Implementation (simplified)

  • Initialize h_v^(0) according to the features of the nodes
  • for t = 1, ..., T do
      AGGREGATE → a_v^(t) = Σ_{u ∈ Nbr(v)} h_u^(t−1)
      COMBINE → h_v^(t) = NeuralNetwork(h_v^(t−1), a_v^(t))
  • READOUT → return NeuralNetwork(h_v^(T))

Training

  • Using standard gradient descent techniques

Different approaches

  • Gated Graph Neural Networks
  • Graph Convolutional Networks
  • Graph Attention Networks
  • Graph Spatial-Temporal Networks
  • ...

→ Hot area of research in the ML community
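The loop above can be sketched in a few lines of NumPy. The single dense layer for COMBINE, the linear READOUT, and the sigmoid output are simplifying assumptions, not the paper's exact architecture:

```python
import numpy as np

def gnn_forward(adj, feats, W_comb, w_read, T=3):
    # Simplified message passing matching the slide:
    #   AGGREGATE: sum of neighbor states via the adjacency matrix
    #   COMBINE:   one dense layer + tanh on [h, a]  (assumed)
    #   READOUT:   linear layer + sigmoid per node   (assumed)
    h = feats                                  # h^(0): initial features, (n, k)
    for _ in range(T):
        a = adj @ h                            # AGGREGATE: sum over neighbors
        h = np.tanh(np.concatenate([h, a], axis=1) @ W_comb)  # COMBINE
    logits = h @ w_read                        # READOUT
    return 1.0 / (1.0 + np.exp(-logits))       # per-node output, e.g. Pr(cut)
```

Reading off the outputs at the cut nodes then gives the predicted cut probabilities; training the two weight matrices end to end is ordinary gradient descent.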

SLIDE 51

Numerical evaluation

Dataset generation

  • Generation of 100 000 networks with tandem or tree topology
  • Random generation of curve parameters for servers and flows
  • Evaluation of each network using DiscoDNC, extracting the intermediary results of TMA
  • Dataset available online: https://github.com/fabgeyer/dataset-infocom2019

Table 1: Statistics about the generated dataset.

Parameter                            Min   Max       Mean    Median
# of servers                         2     41        14.2    12.0
# of flows                           1     63        23.0    18.0
# of flows per server                1     44        5.8     4.6
# of tandem combinations             2     113 100   596.2   134.0
# of tandem combinations per flow    2     32 768    25.9    4.0
# of nodes in analyzed graph         6     717       159.0   127.0

SLIDE 52

Numerical evaluation

Prediction accuracy

[Figure: average accuracy (%) vs. number of tandem decompositions per flow (2^1 to 2^15), comparing DeepTMA against random heuristics RND1 through RND32]

SLIDE 53

Numerical evaluation

Tightness

Failures to predict the optimal decomposition only result in a relative error below 6%.

[Figure: relative error to TMA (%) vs. path length of flow (1 to 15), comparing the consistent worst choice, SFA, PMOO, and DeepTMA]

SLIDE 54

Numerical evaluation

Runtime

[Figure: execution time per network (ms, log scale) vs. maximum flow path length in the network (2 to 16), comparing TMA, SFA, PMOO, and DeepTMA on CPU and GPU]

SLIDE 55

Numerical evaluation

Additional results

Three other, simpler heuristics are defined in the paper:

  • Random choice of tandem decomposition
  • Path length of flows up to the location of interference
  • Hop count heuristic

Results

  • DeepTMA performs better than the random-based heuristics

SLIDE 56

Conclusion

Contributions

  • Framework combining network calculus and graph-based deep learning
  • New NC analysis with fast execution times and good tightness
  • Dataset: https://github.com/fabgeyer/dataset-infocom2019

Future work

  • Evaluation on more complex networks and curves
  • Predictions for other NC analyses

Final thoughts

→ Graph Neural Networks are a promising paradigm for computer networks

[Figure: computation effort vs. tightness improvement, with DeepTMA close to TMA's tightness at a fraction of the effort]

SLIDE 57

Bibliography

[Bondorf et al., 2017] Bondorf, S., Nikolaus, P., and Schmitt, J. B. (2017). Quality and cost of deterministic network calculus – design and evaluation of an accurate and fast analysis. Proc. ACM Meas. Anal. Comput. Syst. (POMACS), 1(1):16:1–16:34.

[Bouillard et al., 2010] Bouillard, A., Jouhet, L., and Thierry, É. (2010). Tight performance bounds in the worst-case analysis of feed-forward networks. In Proc. of IEEE INFOCOM.

[Cruz, 1991a] Cruz, R. L. (1991a). A calculus for network delay, part I: Network elements in isolation. IEEE Trans. Inf. Theory, 37(1):114–131.

[Cruz, 1991b] Cruz, R. L. (1991b). A calculus for network delay, part II: Network analysis. IEEE Trans. Inf. Theory, 37(1):132–141.

[Le Boudec and Thiran, 2001] Le Boudec, J.-Y. and Thiran, P. (2001). Network Calculus: A Theory of Deterministic Queuing Systems for the Internet. Springer-Verlag.

[Scarselli et al., 2009] Scarselli, F., Gori, M., Tsoi, A. C., Hagenbuchner, M., and Monfardini, G. (2009). The graph neural network model. IEEE Trans. Neural Netw., 20(1):61–80.

[Schmitt et al., 2008a] Schmitt, J. B., Zdarsky, F. A., and Fidler, M. (2008a). Delay bounds under arbitrary multiplexing: When network calculus leaves you in the lurch... In Proc. of IEEE INFOCOM.

[Schmitt et al., 2008b] Schmitt, J. B., Zdarsky, F. A., and Martinovic, I. (2008b). Improving performance bounds in feed-forward networks by paying multiplexing only once. In Proc. of GI/ITG MMB.