ON THE ROAD TO BENCHMARKING BPMN 2.0 WORKFLOW ENGINES

Architecture, Design and Web Information Systems Engineering Group

Marigianna Skouradaki, Dieter H. Roller, Frank Leymann; Vincenzo Ferme, Cesare Pautasso


SLIDE 1

ON THE ROAD TO BENCHMARKING BPMN 2.0 WORKFLOW ENGINES

Vincenzo Ferme, Cesare Pautasso
Faculty of Informatics, University of Lugano (USI), Switzerland

Marigianna Skouradaki, Dieter H. Roller, Frank Leymann
Institute of Architecture and Application Systems, University of Stuttgart, Germany

SLIDE 2

What is a Workflow Engine?

Context » BPMN 2.0 Adoption » Why a Benchmark? » Challenges » BenchFlow » Next Steps » Conclusions

[Diagram: a Workflow Engine deployed in an Application Server on top of a DBMS. The Core Engine contains the Process Navigator, Transaction Manager, Persistent Manager, Job Executor, Task Dispatcher and Service Invoker; process instances (A → B, C, D) are persisted in the Instance Database; the engine interacts with Users and external Web Services]

SLIDE 3

Many Business Process Modeling/Execution Languages

[Timeline, 1992–2008 and beyond: BPEL, EPC, YAWL, XPDL, BPMN, PNML]

SLIDE 4

BPMN 2.0: A Widely Adopted Standard

[Timeline: BPMN 2.0 released Jan 2011; BPMN 2.0.2 released Jan 2014; standardized as ISO/IEC 19510]

https://en.wikipedia.org/wiki/List_of_BPMN_2.0_engines

SLIDE 5

Main Challenges in Benchmarking BPMN 2.0 Workflow Engines


SLIDE 8

Main Challenges in Benchmarking BPMN 2.0 Workflow Engines

WORKLOAD CHARACTERIZATION
  • 1. Define the Workload Mix [pie chart: e.g. an 80% / 20% split over process models A, B, C, D]
  • 2. Define the Load Functions

BENCHMARK EXECUTION
  • 3. Deal with engine-specific interfaces and BPMN 2.0 customizations (Client → Engine)
  • 4. Asynchronous execution of business processes (Engine ↔ Users, Engine ↔ Web Services, Instance Database)
  • 5. Define meaningful and reliable KPIs
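The workload mix above can be driven by a simple weighted draw: each time the driver starts an instance, it picks which process model to instantiate according to the mix percentages. A minimal sketch; the model names and shares are illustrative, not the ones BenchFlow actually uses:

```python
import random

# Illustrative workload mix: process model name -> share of started instances.
WORKLOAD_MIX = {"A": 0.8, "B": 0.1, "C": 0.05, "D": 0.05}

def pick_model(rng: random.Random, mix: dict[str, float]) -> str:
    """Draw the process model to start next, proportionally to the mix."""
    names = list(mix)
    weights = [mix[n] for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

if __name__ == "__main__":
    rng = random.Random(42)
    draws = [pick_model(rng, WORKLOAD_MIX) for _ in range(1000)]
    # Observed shares should approximate the configured mix.
    print({n: draws.count(n) / len(draws) for n in WORKLOAD_MIX})
```

Over a long run the observed shares converge to the configured mix, which is what makes the mix reproducible across engines.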


SLIDE 10

  • 1. Define the Workload Mix

[Diagram: dimensions along which a process model is characterized: Control Flow, Data Flow, Events, Activities, Task Types, Execution Behavior]


SLIDE 14

  • 1. Define the Workload Mix

[Chart: BPMN 2.0 features plotted by the number of engines supporting the feature (values between 2 and 12) against the number of real-world models using it (roughly 200 to 950)]

SLIDE 15

  • 2. Define the Load Functions

[Diagram: Users and Web Services drive the Workflow Engine; users start process instances and web services deliver start events. The engine runs in an Application Server with its Instance Database on a DBMS]
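A load function describes how intensely users and web services hit the engine over time. One common way to express it, sketched here under the assumption of a Poisson arrival process (the actual BenchFlow load functions may be shaped differently), is a schedule of instance-start timestamps with exponential inter-arrival times:

```python
import random

def poisson_schedule(rate_per_s: float, duration_s: float,
                     rng: random.Random) -> list[float]:
    """Timestamps (seconds from test start) at which a new process
    instance is started; exponential inter-arrivals give Poisson load."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate_per_s)
        if t >= duration_s:
            return times
        times.append(t)

if __name__ == "__main__":
    rng = random.Random(7)
    schedule = poisson_schedule(rate_per_s=10.0, duration_s=60.0, rng=rng)
    print(f"{len(schedule)} instance starts planned over 60 s")
```

The same driver can then replay one such schedule per load source (users, web services), keeping the load reproducible across engines.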

SLIDE 16

  • 3. Deal with engine-specific interfaces and BPMN 2.0 customizations

[Diagram: the Loading Driver must talk to each Workflow Engine (Application Server, Instance Database, DBMS, Web Service) through its own, engine-specific interface]
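One way to cope with engine-specific deployment and instantiation APIs is a thin adapter per engine behind a common interface, so the driver never sees engine details. A sketch of that idea; the method names, id formats, and the in-memory stand-in are hypothetical, not BenchFlow's actual API:

```python
from abc import ABC, abstractmethod

class EngineAdapter(ABC):
    """Per-engine facade hiding deployment and instantiation details."""

    @abstractmethod
    def deploy(self, model_name: str, bpmn_xml: str) -> str:
        """Deploy a BPMN 2.0 model, return an engine-side definition id."""

    @abstractmethod
    def start_instance(self, definition_id: str) -> str:
        """Start one instance asynchronously, return the instance id."""

class InMemoryEngine(EngineAdapter):
    """Toy stand-in used to exercise a driver without a real engine."""

    def __init__(self) -> None:
        self.definitions: dict[str, str] = {}
        self.instances: list[str] = []

    def deploy(self, model_name: str, bpmn_xml: str) -> str:
        def_id = f"def-{len(self.definitions)}"
        self.definitions[def_id] = bpmn_xml
        return def_id

    def start_instance(self, definition_id: str) -> str:
        inst_id = f"{definition_id}/inst-{len(self.instances)}"
        self.instances.append(inst_id)
        return inst_id

if __name__ == "__main__":
    engine = InMemoryEngine()
    d = engine.deploy("A", "<definitions/>")
    print(engine.start_instance(d))
```

A real adapter for, say, a REST-based engine would implement the same two methods on top of that engine's own endpoints, and BPMN 2.0 customizations would be handled while preparing the `bpmn_xml` per engine.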


SLIDE 18

  • 4. Asynchronous execution of processes

[Diagram: the Loading Driver starts a process instance, but the engine executes it asynchronously; the instance's Start and End are only visible in the Instance Database, not in the client's response]
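Because a start request returns before the instance finishes, the driver has to detect completion out of band, for example by polling the engine's instance database until every started instance has an end timestamp. A sketch of that polling loop; the table and column names are hypothetical, since each engine's instance schema differs:

```python
import sqlite3
import time

def all_instances_ended(conn: sqlite3.Connection) -> bool:
    """True once no started instance is missing an end timestamp.
    Assumes a (hypothetical) `instance` table with an `end_time` column."""
    row = conn.execute(
        "SELECT COUNT(*) FROM instance WHERE end_time IS NULL").fetchone()
    return row[0] == 0

def wait_for_run_end(conn: sqlite3.Connection,
                     poll_s: float = 1.0, timeout_s: float = 60.0) -> bool:
    """Poll the instance database until the run has drained or timed out."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if all_instances_ended(conn):
            return True
        time.sleep(poll_s)
    return False
```

The timeout guards against instances that never complete, which would otherwise stall the whole benchmark run.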

SLIDE 19

The BenchFlow Project

"Design the first benchmark to assess and compare the performance of Workflow Engines that are compliant with the Business Process Model and Notation 2.0 (BPMN 2.0) standard"

[Recap of the challenges: Client → Engine interfaces, Engine ↔ Users and Engine ↔ Web Services interactions, the Instance Database, and the workload mix]


SLIDE 23

  • 1. Define the Workload Mix

What we need: even more (anonymized) real-world BPMN 2.0 process models

[Pipeline: REAL-WORLD PROCESSES → Graph Matching (Skouradaki et al. [SOSE2015]) → REOCCURRING STRUCTURES → Selection Criteria and Composition Criteria → WORKLOAD MIX, e.g. a 50% / 50% mix of synthesized models]
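[SOSE2015] extracts reoccurring structures via sub-graph isomorphism. For intuition only, a brute-force check over tiny directed graphs; the real extraction uses a far more elaborate algorithm and BPMN-aware node types rather than bare edges:

```python
from itertools import permutations

def occurs_in(pattern: set[tuple[str, str]],
              graph: set[tuple[str, str]]) -> bool:
    """Brute-force check: does `pattern` (a set of directed edges)
    occur as a subgraph of `graph` under some injective node mapping?
    Only feasible for the tiny fragments used here."""
    p_nodes = sorted({n for e in pattern for n in e})
    g_nodes = sorted({n for e in graph for n in e})
    for image in permutations(g_nodes, len(p_nodes)):
        mapping = dict(zip(p_nodes, image))
        if all((mapping[a], mapping[b]) in graph for a, b in pattern):
            return True
    return False

if __name__ == "__main__":
    # A two-step sequence fragment looked up inside a larger model.
    model = {("s", "a1"), ("a1", "a2"), ("a2", "a3"), ("a3", "e")}
    sequence = {("x", "y"), ("y", "z")}
    print(occurs_in(sequence, model))
```

Fragments that occur across many real-world models are the "reoccurring structures" candidates for composing the workload mix.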


SLIDE 25

Enabling the Benchmark Execution

[Diagram: the Faban harness and Faban Drivers apply the Loading Functions, emulating Users against the Workflow Engine (Application Server, Instance Database, DBMS) and its Web Services]

SLIDE 26

Enabling the Benchmark Execution

  • 1. Flexible deployment
  • 2. Flexible HW Resources
  • 3. Frozen Initial Condition

[Diagram: the Faban harness and drivers, the Workflow Engine, the DBMS and the Web Services are each packaged as Docker Containers distributed over the available Servers]

  • 2. Automatically deploy the workload mix;
SLIDE 31

Enabling the Benchmark Execution

  • 1. Automatically deploy and start the benchmark environment;
  • 2. Automatically deploy the workload mix;
  • 3. Determine when the benchmark ends;
  • 4. Collect the execution and process logs.

[Diagram: the Faban harness and drivers drive the Workflow Engine and its DBMS; a MONITOR observes the Instance Database to detect the end of the run, and COLLECTORS gather the logs]
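Once the execution and process logs are collected, candidate KPIs can be derived from them. A sketch computing two obvious ones, throughput and mean instance duration; the record layout (per-instance start/end seconds) is illustrative, not BenchFlow's actual log format:

```python
from statistics import mean

def kpis(instances: list[tuple[float, float]]) -> dict[str, float]:
    """Two candidate KPIs from (start_s, end_s) pairs taken from
    the collected process logs (layout is illustrative)."""
    durations = [end - start for start, end in instances]
    span = (max(end for _, end in instances)
            - min(start for start, _ in instances))
    return {
        "throughput_per_s": len(instances) / span,
        "avg_duration_s": mean(durations),
    }

if __name__ == "__main__":
    logs = [(0.0, 1.0), (0.5, 2.0), (1.0, 2.5)]
    print(kpis(logs))
```

Making such KPIs meaningful and reliable (challenge 5) is mostly about where the timestamps come from: engine-side timestamps from the instance database avoid counting client and network overhead.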
SLIDE 32

The BenchFlow Project: Next Steps

  • Release the first prototype of the Benchmark environment
    » Yes: Abstract the Interaction with the Engines; Automatic Deploy and Undeploy of the S.U.T.; Execution and Process Log Gathering
    » No: Automatic Generation of Drivers; Users, Web Services and External Catching Business Events
  • Release the first prototype of the Workload Mix synthesizer
  • First Experiments with KPIs Definition and Computation
  • Collect More Process Models and Process Execution Logs

BenchFlow Project: http://design.inf.usi.ch/research/projects/benchflow

SLIDE 33

BACKUP SLIDES

  • Cited Works;
  • Related Works.
SLIDE 34

Cited Works

[SOSE2015] Skouradaki, Marigianna; Goerlach, Katharina; Hahn, Michael; Leymann, Frank. Application of Sub-Graph Isomorphism to Extract Reoccurring Structures from BPMN 2.0 Process Models. In Proceedings of the 9th International IEEE Symposium on Service-Oriented System Engineering (SOSE 2015). San Francisco Bay, USA, March 30 - April 3, 2015. (to appear)

SLIDE 35

Related Works

  • Active Endpoints Inc. Assessing ActiveVOS performance, 2011. http://www.activevos.com/content/developers/technical_notes/assessing_activevos_performance.pdf
  • D. Bianculli, W. Binder, and M. L. Drago. SOABench: Performance evaluation of service-oriented middleware made easy. In Proc. of ICSE'10 - Volume 2, pages 301–302, 2010.
  • J. Cardoso. Business process control-flow complexity: Metric, evaluation, and validation. International Journal of Web Services Research, 5(2):49–76, 2008.
  • G. Din, K.-P. Eckert, and I. Schieferdecker. A workload model for benchmarking BPEL engines. In Proc. of ICSTW'08, pages 356–360, 2008.
  • M. Dumas, L. García-Bañuelos, and R. M. Dijkman. Similarity search of business process models. IEEE Data Eng. Bull., 32(3):23–28, 2009.
  • J. Gray. The Benchmark Handbook for Database and Transaction Systems. Morgan Kaufmann, 2nd edition, 1992.
  • G. Hackmann, M. Haitjema, C. Gill, and G.-C. Roman. Sliver: A BPEL workflow process execution engine for mobile devices. In Proc. of ICSOC'06, pages 503–508. Springer, 2006.
  • S. Harrer, J. Lenhard, and G. Wirtz. BPEL conformance in open source engines. In Proc. of SOCA'12, pages 1–8, 2012.

SLIDE 36

Related Works

  • K. Huppler. The art of building a good benchmark. In Performance Evaluation and Benchmarking, pages 18–30. Springer, 2009.
  • Intel and Cape Clear. BPEL scalability and performance testing. White paper, 2007.
  • F. Leymann. Managing business processes via workflow technology. In Proc. of VLDB 2001, pages 729–, 2001.
  • A. Liu, Q. Li, L. Huang, and M. Xiao. FACTS: A framework for fault-tolerant composition of transactional web services. IEEE Trans. on Services Computing, 3(1):46–59, 2010.
  • J. Mendling. Metrics for Process Models: Empirical Foundations of Verification, Error Prediction, and Guidelines for Correctness. Springer, 2008.
  • I. Molyneaux. The Art of Application Performance Testing: Help for Programmers and Quality Assurance. O'Reilly, 2009.
  • M. Z. Muehlen and J. Recker. How much language is enough? Theoretical and practical use of the Business Process Modeling Notation. In Proc. of CAiSE'08, pages 465–479, 2008.
  • C. Röck and S. Harrer. Literature survey of performance benchmarking approaches of BPEL engines. Technical report, Otto-Friedrich University of Bamberg, 2014.
  • D. H. Roller. Throughput Improvements for BPEL Engines: Implementation Techniques and Measurements applied in SWoM. PhD thesis, University of Stuttgart, 2013.

SLIDE 37

Related Works

  • N. Russell, W. M. van der Aalst, and A. Hofstede. All that glitters is not gold: Selecting the right tool for your BPM needs. Cutter IT Journal, 20(11):31–38, 2007.
  • D. Schumm, D. Karastoyanova, O. Kopp, F. Leymann, M. Sonntag, and S. Strauch. Process fragment libraries for easier and faster development of process-based applications. CSSI, 2(1):39–55, 2011.
  • M. Skouradaki, D. Roller, C. Pautasso, and F. Leymann. BPELanon: Anonymizing BPEL processes. In Proc. of ZEUS'14, pages 9–15, 2014.
  • Sun Microsystems. Benchmarking BPEL service engine, 2007. http://wiki.open-esb.java.net/Wiki.jsp?page=BpelPerformance.html
  • B. Wetzstein, P. Leitner, F. Rosenberg, I. Brandic, S. Dustdar, and F. Leymann. Monitoring and analyzing influential factors of business process performance. In Proc. of EDOC'09, pages 141–150, 2009.