Towards a Methodology for Benchmarking Edge Processing Frameworks



SLIDE 1

Towards a Methodology for Benchmarking Edge Processing Frameworks

Pedro Silva, Alexandru Costan, Gabriel Antoniu (Inria, IRISA, France)

Invited Talk, BenchCouncil’19, Denver, November 2019

SLIDE 2

Data Shifts to the Edge

Gartner predicts that by 2022, 75% of enterprise-generated data will be created and processed outside of the data center and cloud infrastructures, compared with 10% today.

Source: Smarter with Gartner, "What Edge Computing Means for Infrastructure and Operations", October 3, 2018. Extract from the BullSequana Edge positioning paper (Atos).


SLIDE 3

(figure only)

SLIDE 4

(figure only)
SLIDE 5

Why Edge Processing?

Advantages

- Easier access to data
- Bandwidth savings
- Privacy
- High potential for parallelism

(diagram: data flowing across Edge, Fog, and Cloud/DC)


SLIDE 6

Edge Processing Tools

- Custom software
- Generic frameworks:
  - Apache Edgent
  - Amazon Greengrass
  - Azure Stream Analytics
  - IBM Watson IoT
  - Intel IoT
  - Oracle Edge Analytics
  - …



SLIDE 7

Edge Processing Tools Are Great! :)



SLIDE 8

How Great?


What is their performance? Under which conditions? Do they integrate well with my app?


SLIDE 9

We Need Benchmarking!

Goal: Understand performance



SLIDE 10

Benchmarking: Questions

- Are the cost models precise?
- What is the impact of networking on performance?
- How do my algorithms react to real-time scenarios?
- How does my hybrid approach compare to a fully centralized solution?


SILVA, P.; COSTAN, A.; ANTONIU, G. Towards a Methodology for Benchmarking Edge Processing Frameworks. 1st Workshop on Parallel AI and Systems for the Edge (PAISE, collocated with IPDPS 2019).


SLIDE 11

Benchmarking Platform: Objectives

- Benchmark complete scenarios
- Control network characteristics
- Control framework configuration parameters
- Control Edge, Fog and Cloud infrastructures
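Controlling network characteristics between tiers is typically done with Linux traffic control; a minimal sketch, assuming Linux nodes with `tc`/`netem` available and root access (the function names are hypothetical, not the platform's actual API):

```python
import subprocess

def netem_command(iface, delay_ms=0, loss_pct=0.0, rate_mbit=None):
    """Build a `tc netem` command emulating an Edge-Fog or Fog-Cloud link."""
    cmd = ["tc", "qdisc", "add", "dev", iface, "root", "netem"]
    if delay_ms:
        cmd += ["delay", f"{delay_ms}ms"]
    if loss_pct:
        cmd += ["loss", f"{loss_pct}%"]
    if rate_mbit is not None:
        cmd += ["rate", f"{rate_mbit}mbit"]
    return cmd

def apply_netem(iface, **kwargs):
    """Apply the emulated link conditions (requires root on a Linux node)."""
    subprocess.run(netem_command(iface, **kwargs), check=True)
```

For example, `netem_command("eth0", delay_ms=50, loss_pct=1.0, rate_mbit=10)` builds the command for a constrained Edge-to-Fog link.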


SLIDE 12

Benchmarking Edge Processing: Related Work

- TPCx-IoT
  - Created for hardware benchmarking
  - Fog oriented
- Academic benchmarks
  - Difficult to reproduce
  - Lack of a clear methodology (metrics, workloads, parameters)
  - Not focused on the tools


SLIDE 13

Benchmarking Edge Processing Tools

Metrics:

- Edge/Fog data processing tools
  - Processing performance
  - Supported programming languages
  - Connectivity
  - Ease of development
- Use cases
  - Overall application performance
  - Viability on different infrastructure configurations

Workload: data transmission, processing, …


SLIDE 14

Benchmarking Edge Processing Tools: Zoom



SLIDE 15

Benchmarking Edge Processing Tools: Parameters


- Workloads: CCTV, NYC Taxi, EEW
- Network (Edge to Fog, Fog to Cloud): bandwidth, loss, latency
- Edge: processing tools
- Fog: MQTT server + processing tools
- Cloud: Kafka + Flink
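One way to sweep these parameters is to expand every combination of axis values into a concrete experiment configuration; a minimal sketch (the axis names are illustrative, not the prototype's actual schema):

```python
import itertools

def parameter_grid(**axes):
    """Yield one experiment configuration per combination of axis values."""
    keys = sorted(axes)
    for combo in itertools.product(*(axes[k] for k in keys)):
        yield dict(zip(keys, combo))

# 3 workloads x 2 Edge tools x 3 Edge-to-Fog latencies = 18 configurations
configs = list(parameter_grid(
    workload=["CCTV", "NYC Taxi", "EEW"],
    edge_tool=["Apache Edgent", "custom"],
    edge_fog_latency_ms=[10, 50, 200],
))
```

Each yielded dict is one run for the experiment manager to execute.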

SLIDE 16

Benchmarking Edge Processing Tools: Metrics


- Throughput (measured at the Edge and at the Cloud)
- Latency: Edge to Fog
- Latency: Fog to Cloud
- Processing latency

Each component has a resource utilization log.
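Given per-message timestamps recorded at each tier, the per-hop latencies and end-to-end throughput follow directly; a minimal sketch (the timestamp field names are assumptions, not the platform's actual log format):

```python
def hop_latencies(events):
    """Average Edge-to-Fog, Fog-to-Cloud and processing latency (seconds).

    Each event carries per-message timestamps: edge_send, fog_recv,
    fog_send, cloud_recv, cloud_done.
    """
    n = len(events)
    edge_fog = sum(e["fog_recv"] - e["edge_send"] for e in events) / n
    fog_cloud = sum(e["cloud_recv"] - e["fog_send"] for e in events) / n
    processing = sum(e["cloud_done"] - e["cloud_recv"] for e in events) / n
    return edge_fog, fog_cloud, processing

def throughput(events):
    """Messages per second over the whole observation window."""
    start = min(e["edge_send"] for e in events)
    end = max(e["cloud_done"] for e in events)
    return len(events) / (end - start)
```

Percentiles rather than averages would be a natural refinement for latency-sensitive workloads.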

SLIDE 17

Benchmarking Platform: Implementation

- Experiment manager
  - Configures the infrastructure
  - Deploys frameworks/tools
  - Deploys applications and manages their executions
  - Monitors resource usage
  - Gathers metrics and logs
- Edge+Fog+Cloud processing management
  - Wrappers/interfaces
  - Metric generation, configuration, connection

Implementation: Python / Execo / EnosLib, on Grid'5000 (VMs/containers and bare metal).
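The experiment manager's loop can be summarized as configure, deploy, run, collect; a skeleton sketch against a hypothetical `infra` driver interface (the actual prototype builds on Execo/EnosLib, whose APIs are not reproduced here):

```python
class ExperimentManager:
    """Drives one benchmarking run across the Edge, Fog and Cloud tiers."""

    def __init__(self, config, infra):
        self.config = config   # one parameter combination
        self.infra = infra     # driver for the target infrastructure

    def run(self):
        # 1. Configure the infrastructure (nodes, emulated network links).
        self.infra.configure_network(self.config["network"])
        # 2. Deploy the frameworks/tools on each tier.
        self.infra.deploy(self.config["edge"], self.config["fog"],
                          self.config["cloud"])
        # 3. Start the workload and wait for it to finish.
        self.infra.start_workload(self.config["workload"])
        self.infra.wait()
        # 4. Gather metrics and resource-utilization logs.
        return self.infra.collect_logs()
```

Swapping the `infra` driver is what lets the same run logic target VMs, containers, or bare metal.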


SLIDE 18

Earthquake Early Warning Systems (EEW)

(diagram: a seismometer detects the P-wave and uploads data to a data center, which triggers the warning broadcaster)


SLIDE 19

Earthquake Early Warning Systems (EEW)

(diagram: scientific instruments send data to intermediate machines with computing capabilities, then to a centralized data center, which broadcasts warnings to users; Deem makes local decisions at the intermediate tier and global decisions at the data center)

- Deem: hierarchical and distributed ML algorithm
  - Enables the usage of multiple types of sensors
  - Enables deployment on less powerful networks
  - Enables local decision making
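The hierarchical local/global split can be illustrated as follows; this is an illustrative sketch only, with placeholder scoring, not Deem's actual classifier:

```python
def local_decision(scores, threshold=0.5):
    """Fog-side rule: flag a possible event when the local model's mean
    score over a sensor window exceeds a threshold (placeholder logic)."""
    return sum(scores) / len(scores) > threshold

def global_decision(local_votes, quorum=0.3):
    """Cloud-side aggregation: raise a warning when the fraction of Fog
    nodes voting positive reaches a quorum."""
    return sum(local_votes) / len(local_votes) >= quorum
```

The point of the split is that each Fog node only ships a vote upstream, not raw sensor streams, which is what makes the approach viable on less powerful networks.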

- FAUVEL, K.; BALOUEK-THOMERT, D.; MELGAR, D.; SILVA, P.; SIMONET, A.; ANTONIU, G.; COSTAN, A.; MASSON, V.; PARASHAR, M.; RODERO, I.; TERMIER, A. A Distributed Multi-Sensor Machine Learning Approach to Earthquake Early Warning. Accepted at AAAI 2020.
- SILVA, P.; BALOUEK-THOMERT, D.; FAUVEL, K.; MELGAR, D.; SIMONET, A.; ANTONIU, G.; COSTAN, A.; MASSON, V.; PARASHAR, M.; RODERO, I.; TERMIER, A. A Hybrid Fog and Cloud Computing Based Approach for Earthquake Early Warning Systems. (In preparation.)

SLIDE 20

EEW: Fog-Based Infrastructure

- Thousands of producers
- High load on Fog and Cloud
- Objectives:
  - Reduction of network costs
  - Reduction of Cloud costs
  - Easier network reconfiguration (intelligent fog nodes)


SLIDE 21

Next Steps

- Improve the benchmark prototype
- Experiment with the EEW scenario
- Integrate extra scenarios and use cases (e.g., DL-based)

Thank you!