Domain-Level Observation and Control for Compiled Executable DSLs



slide-1
SLIDE 1

Domain-Level Observation and Control for Compiled Executable DSLs

MODELS 2019 Foundations Track − Munich, Germany

Erwan Bousse

University of Nantes – LS2N, France

Manuel Wimmer

CDL-MINT, Johannes Kepler University Linz, Austria

1/67

slide-2
SLIDE 2

Behavioral models (e.g. state machines) can conveniently describe the behaviors of systems under design. Domain-specific languages (DSLs) can be engineered and used to build such models. Dynamic analyses of behavioral models are crucial in early design phases to see how a described behavior unfolds over time. This requires the possibility to execute models ⚙!

Behavioral model

Dynamic Analysis of Behavioral Models

2/67

slide-3
SLIDE 3

Behavioral models (e.g. state machines) can conveniently describe the behaviors of systems under design. Domain-specific languages (DSLs) can be engineered and used to build such models. Dynamic analyses of behavioral models are crucial in early design phases to see how a described behavior unfolds over time. This requires the possibility to execute models ⚙!

Behavioral model

conforms to

DSL

Dynamic Analysis of Behavioral Models

3/67

slide-4
SLIDE 4

Behavioral models (e.g. state machines) can conveniently describe the behaviors of systems under design. Domain-specific languages (DSLs) can be engineered and used to build such models. Dynamic analyses of behavioral models are crucial in early design phases to see how a described behavior unfolds over time. This requires the possibility to execute models ⚙!

Behavioral model

conforms to

DSL

Dynamic Analysis

Dynamic Analysis of Behavioral Models

4/67

slide-5
SLIDE 5

Behavioral models (e.g. state machines) can conveniently describe the behaviors of systems under design. Domain-specific languages (DSLs) can be engineered and used to build such models. Dynamic analyses of behavioral models are crucial in early design phases to see how a described behavior unfolds over time. This requires the possibility to execute models ⚙!

Behavioral model

conforms to

DSL

Dynamic Analysis Model Execution

Dynamic Analysis of Behavioral Models

5/67

slide-6
SLIDE 6

Model Procedure dependency conforms to input/output

Model execution with an interpreted DSL

6/67

slide-7
SLIDE 7

Model Procedure dependency conforms to input/output

Interpreted DSL

Model execution with an interpreted DSL

7/67

slide-8
SLIDE 8

Model Procedure dependency conforms to input/output

Interpreted DSL

Abstract Syntax (metamodel)

Model execution with an interpreted DSL

8/67

slide-9
SLIDE 9

Model Procedure dependency conforms to input/output

Interpreted DSL

Abstract Syntax (metamodel) Model State Definition

Model execution with an interpreted DSL

9/67

slide-10
SLIDE 10

Model Procedure dependency conforms to input/output

Interpreted DSL

Abstract Syntax (metamodel) Model State Definition Execution Steps Definition Interpretation Rules

Model execution with an interpreted DSL

10/67

slide-11
SLIDE 11

Model Procedure dependency conforms to input/output

Interpreted DSL

Abstract Syntax (metamodel) Model State Definition Execution Steps Definition Interpretation Rules

Target interpreter

Engine

Model execution with an interpreted DSL

11/67

slide-12
SLIDE 12

Model Procedure dependency conforms to input/output

Interpreted DSL

Abstract Syntax (metamodel) Model State Definition Execution Steps Definition Interpretation Rules

Target interpreter

Engine Executed Model

Model execution with an interpreted DSL

12/67

slide-13
SLIDE 13

Model Procedure dependency conforms to input/output

Interpreted DSL

Abstract Syntax (metamodel) Model State Definition Execution Steps Definition Interpretation Rules

Target interpreter

Engine Executed Model State Steps

runtime data

Model execution with an interpreted DSL

13/67

slide-14
SLIDE 14

Model Procedure dependency conforms to input/output

Interpreted DSL

Abstract Syntax (metamodel) Model State Definition Execution Steps Definition Interpretation Rules

Target interpreter

Engine Executed Model State Steps

runtime data Dynamic analysis services Tracer Debugger ...

Model execution with an interpreted DSL

14/67

slide-15
SLIDE 15

Abstract Syntax (metamodel):

- Net: contains places * (Place) and transitions * (Transition)
- Place: name: String, initialTokens: Integer
- Transition: name: String, with input 1..* and output 1..* Places

Model State Definition (imports and merges the Abstract Syntax):

- Place: tokens: Integer

Interpretation rules (summarized):

- run(Net): while there is an enabled transition, fires it.
- fire(Transition): removes a token from each input Place and adds one to each output Place.

Example of an Interpreted DSL

15/67
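The summarized interpretation rules can be sketched as a minimal interpreter. This is an illustrative sketch only (the GEMOC implementation uses Xtend/Java on top of a metamodeling framework, not plain Python); class and method names mirror the slide's metamodel:

```python
class Place:
    def __init__(self, name, initial_tokens=0):
        self.name = name
        self.tokens = initial_tokens  # dynamic property from the model state definition

class Transition:
    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = inputs    # input Places (1..*)
        self.outputs = outputs  # output Places (1..*)

    def enabled(self):
        return all(p.tokens > 0 for p in self.inputs)

    def fire(self):
        # removes a token from each input Place and adds one to each output Place
        for p in self.inputs:
            p.tokens -= 1
        for p in self.outputs:
            p.tokens += 1

class Net:
    def __init__(self, places, transitions):
        self.places = places
        self.transitions = transitions

    def run(self):
        # while there is an enabled transition, fire it
        while True:
            t = next((t for t in self.transitions if t.enabled()), None)
            if t is None:
                return
            t.fire()
```

Note that run does not terminate for nets with a live cycle; a real engine would also notify observers at each fire, which is exactly the role of the observation points shown next.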

slide-16
SLIDE 16

Abstract Syntax (metamodel):

- Net: contains places * (Place) and transitions * (Transition)
- Place: name: String, initialTokens: Integer
- Transition: name: String, with input 1..* and output 1..* Places

Model State Definition (imports and merges the Abstract Syntax):

- Place: tokens: Integer

Interpretation rules (summarized):

- run(Net): while there is an enabled transition, fires it.
- fire(Transition): removes a token from each input Place and adds one to each output Place.

[Figure: an example Petri net with places p1–p5 and transitions t1–t3, in its initial state A.]

Example of an Interpreted DSL

16/67

slide-17
SLIDE 17

Abstract Syntax (metamodel):

- Net: contains places * (Place) and transitions * (Transition)
- Place: name: String, initialTokens: Integer
- Transition: name: String, with input 1..* and output 1..* Places

Model State Definition (imports and merges the Abstract Syntax):

- Place: tokens: Integer

Interpretation rules (summarized):

- run(Net): while there is an enabled transition, fires it.
- fire(Transition): removes a token from each input Place and adds one to each output Place.

[Figure: the example Petri net (places p1–p5, transitions t1–t3) executed from state A: steps fire(t1), fire(t2), and fire(t3) lead to model states B, C, and D, all nested in a composite run(net) step; numbered observation points (1–6) mark where each step starts or ends.]

Example of an Interpreted DSL

17/67

slide-18
SLIDE 18

Abstract Syntax (metamodel):

- Net: contains places * (Place) and transitions * (Transition)
- Place: name: String, initialTokens: Integer
- Transition: name: String, with input 1..* and output 1..* Places

Model State Definition (imports and merges the Abstract Syntax):

- Place: tokens: Integer

Interpretation rules (summarized):

- run(Net): while there is an enabled transition, fires it.
- fire(Transition): removes a token from each input Place and adds one to each output Place.

[Figure: the example Petri net (places p1–p5, transitions t1–t3) executed from state A: steps fire(t1), fire(t2), and fire(t3) lead to model states B, C, and D, all nested in a composite run(net) step; numbered observation points (1–6) mark where each step starts or ends.]

Dynamic Analysis

Example of an Interpreted DSL

18/67

slide-19
SLIDE 19

Debugging/Tracing an interpreted model in the GEMOC Studio

19/67

slide-20
SLIDE 20

Question

What about DSLs built with a compiler (eg. a code generator) instead of an interpreter?

20/67

slide-21
SLIDE 21

Model Procedure dependency conforms to input/output

Model execution with a compiled DSL

21/67

slide-22
SLIDE 22

Model Procedure dependency conforms to input/output

Compiled DSL

Model execution with a compiled DSL

22/67

slide-23
SLIDE 23

Model Procedure dependency conforms to input/output

Compiled DSL

Source Abstract Syntax

Model execution with a compiled DSL

23/67

slide-24
SLIDE 24

Model Procedure dependency conforms to input/output

Compiled DSL

Source Abstract Syntax Compiler

Model execution with a compiled DSL

24/67

slide-25
SLIDE 25

Model Procedure dependency conforms to input/output

Compiled DSL

Source Abstract Syntax Compiler

Target Language (interpreted)

Target Abstract Syntax Target Model State Definition Target Execution Steps Definition Interpretation Rules Target Engine

Target interpreter

Model execution with a compiled DSL

25/67

slide-26
SLIDE 26

Model Procedure dependency conforms to input/output

Compiled DSL

Source Abstract Syntax Compiler

Target Language (interpreted)

Target Abstract Syntax Target Model State Definition Target Execution Steps Definition Interpretation Rules Target Engine

Target interpreter

Source Model

Model execution with a compiled DSL

26/67

slide-27
SLIDE 27

Model Procedure dependency conforms to input/output

Compiled DSL

Source Abstract Syntax Compiler

Target Language (interpreted)

Target Abstract Syntax Target Model State Definition Target Execution Steps Definition Interpretation Rules Target Engine

Target interpreter

Source Model Target Model

Model execution with a compiled DSL

27/67

slide-28
SLIDE 28

Model Procedure dependency conforms to input/output

Compiled DSL

Source Abstract Syntax Compiler

Target Language (interpreted)

Target Abstract Syntax Target Model State Definition Target Execution Steps Definition Interpretation Rules Target Engine

Target interpreter

Source Model Target Model

Dynamic analysis services

Target State

Tracer Debugger

Source Steps Target Steps

... Target runtime data

Model execution with a compiled DSL

28/67

slide-29
SLIDE 29

AD abstract syntax (summarized):

- NamedElement (abstract): +name: String
- Node (abstract): subclasses Action, ForkNode, JoinNode, InitialNode, FinalNode
- Activity: contains nodes * and edges *
- Edge: source 1 Node and target 1 Node; a Node has outgoing * and incoming * Edges

Compiler (summarized) — imports both the AD and the Petri nets abstract syntaxes:

- transformActivity(Activity): Creates a Net
- transformEdge(Edge): Creates a Place
- transformAction(Action): Creates a Place and two Transitions
- ...

Example of a compiled DSL (1)

29/67
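The summarized compiler rules can be sketched as a traceability-recording transformation. The function below is a hypothetical illustration: it only generates named target elements and the compilation links needed later for feedback, omitting the actual Petri net wiring; the generated names follow the pattern visible on the next slide (A_node, A_take, A_offer, e1_edge):

```python
def compile_ad(nodes, edges):
    """Sketch of the AD -> Petri net compiler (names are illustrative).
    nodes/edges are lists of element names; returns the generated target
    element names plus the compilation links (source -> target elements)."""
    target = []
    links = {}
    for n in nodes:
        # transformAction: creates a Place and two Transitions
        generated = [n + "_node", n + "_take", n + "_offer"]
        target.extend(generated)
        links[n] = generated
    for e in edges:
        # transformEdge: creates a Place
        generated = [e + "_edge"]
        target.extend(generated)
        links[e] = generated
    return target, links
```

For instance, compile_ad(["Init", "A"], ["e1"]) records that the three target elements A_node, A_take, and A_offer all stand for the source action A — exactly the kind of link a feedback manager needs at runtime.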

slide-30
SLIDE 30

A B C e1 e2 e3 e4 e5 e6 e7 Init End J F

Source activity diagram

Example of a compiled DSL (2)

30/67

slide-31
SLIDE 31

A B C e1 e2 e3 e4 e5 e6 e7 Init End J F

Source activity diagram

Init_node Init_offer A_node A_offer e1_edge A_take F_node F_take e2_edge e3_edge e4_edge F_offer B_node C_node B_take C_take B_offer C_offer e5_edge e6_edge J_take J_node J_offer e7_edge End_take

Target Petri net obtained after compilation

Example of a compiled DSL (2)

30/67

slide-32
SLIDE 32

Model Procedure dependency conforms to input/output

Compiled DSL

Source Abstract Syntax Compiler

Target Language (interpreted)

Target Abstract Syntax Target Model State Definition Target Execution Steps Definition Interpretation Rules Target Engine

Target interpreter

Source Model Target Model

Dynamic analysis services

Target State

Tracer Debugger

Source Steps Target Steps

... Target runtime data

Problem: Dynamic analysis is performed at the level of the target domain!

Problem (1)

31/67

slide-33
SLIDE 33
i.e., when debugging activity diagrams, we must use a Petri nets debugger:

Problem (2)

32/67

slide-34
SLIDE 34

The case of programming languages

33/67

slide-35
SLIDE 35

Most general-purpose programming languages rely on efficient compilers for their semantics, either targeting some form of bytecode (e.g. Java or Python) or machine code (e.g. C or C++).

The case of programming languages

33/67

slide-36
SLIDE 36

Most general-purpose programming languages rely on efficient compilers for their semantics, either targeting some form of bytecode (e.g. Java or Python) or machine code (e.g. C or C++). Most of these languages do provide an interactive debugger at the source domain level to step through the execution and observe the program state.

The case of programming languages

33/67

slide-37
SLIDE 37

Most general-purpose programming languages rely on efficient compilers for their semantics, either targeting some form of bytecode (e.g. Java or Python) or machine code (e.g. C or C++). Most of these languages do provide an interactive debugger at the source domain level to step through the execution and observe the program state. But these debuggers result from ad-hoc language engineering work! This does not give us a systematic recipe for engineering new DSLs.

The case of programming languages

33/67

slide-38
SLIDE 38

Most general-purpose programming languages rely on efficient compilers for their semantics, either targeting some form of bytecode (e.g. Java or Python) or machine code (e.g. C or C++). Most of these languages do provide an interactive debugger at the source domain level to step through the execution and observe the program state. But these debuggers result from ad-hoc language engineering work! This does not give us a systematic recipe for engineering new DSLs. How can we engineer compiled DSLs compatible with dynamic analyses at the source domain level, just as for common general-purpose programming languages?

The case of programming languages

33/67

slide-39
SLIDE 39

Contribution

An architecture to support observation and control for compiled DSLs.

34/67

slide-40
SLIDE 40

Runtime services Target Language Source Language

Source Abstract Syntax Compiler Source Model Target Abstract Syntax Target Model State Definition Target Execution Steps Definition Interpretation Rules Target State

Tracer Debugger

Source Steps Target Steps

...

Target Engine

Target runtime data Source runtime data Target interpreter

Model Procedure dependency conforms to input/output

Approach Overview

35/67

slide-41
SLIDE 41

Runtime services Target Language Source Language

Source Abstract Syntax Compiler Source Model Target Abstract Syntax Target Model State Definition Target Execution Steps Definition Interpretation Rules Target State

Tracer Debugger

Source Steps Target Steps

...

Target Engine

Target runtime data Source runtime data Target interpreter

Model Procedure dependency conforms to input/output

Compilation Results

Compilation Links Target Model

a

Approach Overview (1)

36/67

slide-42
SLIDE 42

Runtime services Target Language Source Language

Source Abstract Syntax Compiler Source Model Target Abstract Syntax Target Model State Definition Target Execution Steps Definition Interpretation Rules Target State

Tracer Debugger

Source Steps Target Steps

...

Target Engine

Target runtime data Source runtime data Target interpreter

Model Procedure dependency conforms to input/output

Compilation Results

Compilation Links Target Model

a

Source Model State Definition

b

Approach Overview (2)

37/67

slide-43
SLIDE 43

Step (b) − Source model state definition

38/67

slide-44
SLIDE 44

Observing the execution of a model requires accessing its state as it changes (tokens, variables, activated elements, etc.).

Step (b)− Source model state denition

38/67

slide-45
SLIDE 45

Observing the execution of a model requires accessing its state as it changes (tokens, variables, activated elements, etc.). For interpreted DSLs, the possible states are defined by a model state definition which extends the abstract syntax of the DSL with new dynamic properties and metaclasses (e.g. tokens for the Petri nets DSL).

Step (b) − Source model state definition

38/67

slide-46
SLIDE 46

Observing the execution of a model requires accessing its state as it changes (tokens, variables, activated elements, etc.). For interpreted DSLs, the possible states are defined by a model state definition which extends the abstract syntax of the DSL with new dynamic properties and metaclasses (e.g. tokens for the Petri nets DSL). But for compiled DSLs, everything related to execution is delegated to the target language, including the state definition.

Step (b) − Source model state definition

38/67

slide-47
SLIDE 47

Observing the execution of a model requires accessing its state as it changes (tokens, variables, activated elements, etc.). For interpreted DSLs, the possible states are defined by a model state definition which extends the abstract syntax of the DSL with new dynamic properties and metaclasses (e.g. tokens for the Petri nets DSL). But for compiled DSLs, everything related to execution is delegated to the target language, including the state definition. Hence, it is necessary to extend a compiled DSL with a model state definition, to explicitly define the possible states of conforming source models.

Step (b) − Source model state definition

38/67

slide-48
SLIDE 48

When executing a UML activity diagram, tokens flow through both nodes and edges of the model. We add a TokenHolder metaclass to reflect that:

AD execution metamodel (merges the activity diagram abstract syntax):

- TokenHolder (abstract): heldTokens * Token
- Node and Edge both become TokenHolders

Example of model state definition for the AD DSL

39/67
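The model state definition above can be sketched in plain classes, approximating the language-workbench "merge" of TokenHolder into Node and Edge by inheritance. Names are illustrative, not the actual metamodel implementation:

```python
class Token:
    """A token flowing through the activity diagram."""

class TokenHolder:
    """<<abstract>> -- merged into both Node and Edge by the model state
    definition, adding the dynamic heldTokens property."""
    def __init__(self):
        self.held_tokens = []  # heldTokens *

class Node(TokenHolder):
    """Abstract-syntax Node, extended with the dynamic state."""
    def __init__(self, name):
        super().__init__()
        self.name = name

class Edge(TokenHolder):
    """Abstract-syntax Edge, extended with the dynamic state."""
    def __init__(self, name, source, target):
        super().__init__()
        self.name = name
        self.source, self.target = source, target
```

With this extension, a snapshot of the source model state is simply the heldTokens contents of every node and edge.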

slide-49
SLIDE 49

Runtime services Target Language Source Language

Source Abstract Syntax Compiler Source Model Target Abstract Syntax Target Model State Definition Target Execution Steps Definition Interpretation Rules Target State

Tracer Debugger

Source Steps Target Steps

...

Target Engine

Target runtime data Source runtime data Target interpreter

Model Procedure dependency conforms to input/output

Compilation Results

Compilation Links Target Model

a

Source Model State Definition

b

Approach Overview (2)

40/67

slide-50
SLIDE 50

Runtime services Target Language Source Language

Source Abstract Syntax Compiler Source Model Target Abstract Syntax Target Model State Definition Target Execution Steps Definition Interpretation Rules Target State

Tracer Debugger

Source Steps Target Steps

...

Target Engine

Target runtime data Source runtime data Target interpreter

Model Procedure dependency conforms to input/output

Compilation Results

Compilation Links Target Model

a

Source Model State Definition

b

Source Execution Steps Definition

c

Approach Overview (3)

41/67

slide-51
SLIDE 51

Step (c) − Source execution steps definition

42/67

slide-52
SLIDE 52

Observing and controlling require knowing the execution steps of the model execution, i.e., what are the observable changes made to the state.

Step (c) − Source execution steps definition

42/67

slide-53
SLIDE 53

Observing and controlling require knowing the execution steps of the model execution, i.e., what are the observable changes made to the state.

For interpreted DSLs, specific interpretation rules can be tagged as producers of execution steps (e.g. the fire step for Petri nets).

Step (c) − Source execution steps definition

42/67

slide-54
SLIDE 54

Observing and controlling require knowing the execution steps of the model execution, i.e., what are the observable changes made to the state.

For interpreted DSLs, specific interpretation rules can be tagged as producers of execution steps (e.g. the fire step for Petri nets).

For compiled DSLs, we propose a trivial step definition metamodel to declare possible execution steps.

Step (c) − Source execution steps definition

42/67

slide-55
SLIDE 55

Observing and controlling require knowing the execution steps of the model execution, i.e., what are the observable changes made to the state.

For interpreted DSLs, specific interpretation rules can be tagged as producers of execution steps (e.g. the fire step for Petri nets).

For compiled DSLs, we propose a trivial step definition metamodel to declare possible execution steps.

Step Definition Metamodel (defined with a metamodeling language):

- StepDefinition: +name: String, parameters * StepParameter
- StepParameter: +name: String, type 1 Classifier
- Classifier (abstract): +name: String

Step (c) − Source execution steps definition

42/67
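The step definition metamodel is small enough to sketch directly. Below, plain Python classes stand in for a metamodeling language, and the metamodel is instantiated for a fire(Transition) step as named in the interpretation rules; the class names follow the slide, everything else is illustrative:

```python
class Classifier:
    """<<abstract>> -- any metaclass of the source DSL usable as a type."""
    def __init__(self, name):
        self.name = name

class StepParameter:
    def __init__(self, name, type_):
        self.name = name
        self.type = type_  # type 1 (a Classifier)

class StepDefinition:
    def __init__(self, name, parameters=()):
        self.name = name
        self.parameters = list(parameters)  # parameters *

# Declaring a possible execution step: fire(Transition)
transition = Classifier("Transition")
fire = StepDefinition("fire", [StepParameter("transition", transition)])
```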

slide-56
SLIDE 56

Example of execution steps definition for the AD DSL

43/67

slide-57
SLIDE 57

In UML activity diagrams, a node will take tokens from incoming edges, and offer tokens on its outgoing edges when it finishes its task.

Example of execution steps definition for the AD DSL

43/67

slide-58
SLIDE 58

In UML activity diagrams, a node will take tokens from incoming edges, and offer tokens on its outgoing edges when it finishes its task. We define the following execution steps to reflect that:

- offer(Node): offering of tokens of a Node to the outgoing edges of the Node;
- take(Node): taking of tokens by a Node from the incoming edges of the Node;
- executeNode(Node): taking and offering of tokens by a Node, i.e., a composite step containing both an offer step and a take step;
- executeActivity(Activity): execution of the Activity until no tokens can be offered or taken, i.e., a composite step containing executeNode steps.

Example of execution steps definition for the AD DSL

43/67

slide-59
SLIDE 59

Runtime services Target Language Source Language

Source Abstract Syntax Compiler Source Model Target Abstract Syntax Target Model State Definition Target Execution Steps Definition Interpretation Rules Target State

Tracer Debugger

Source Steps Target Steps

...

Target Engine

Target runtime data Source runtime data Target interpreter

Model Procedure dependency conforms to input/output

Compilation Results

Compilation Links Target Model

a

Source Model State Definition

b

Source Execution Steps Definition

c

Approach Overview (3)

44/67

slide-60
SLIDE 60

Runtime services Target Language Source Language

Source Abstract Syntax Compiler Source Model Target Abstract Syntax Target Model State Definition Target Execution Steps Definition Interpretation Rules Target State

Tracer Debugger

Source Steps Target Steps

...

Target Engine

Target runtime data Source runtime data Target interpreter

Model Procedure dependency conforms to input/output

Compilation Results

Compilation Links Target Model

a

Source Model State Definition

b

Source Execution Steps Definition

c

Feedback Manager

d

+ Compilation Links

Source State Source Steps Source Steps

Approach Overview (4)

45/67

slide-61
SLIDE 61

Step (d) − Feedback manager definition

46/67

slide-62
SLIDE 62

What remains is the runtime translation of the states and steps of the target model back to the source model, so that they can be observed by dynamic analysis tools.

Step (d) − Feedback manager definition

46/67

slide-63
SLIDE 63

What remains is the runtime translation of the states and steps of the target model back to the source model, so that they can be observed by dynamic analysis tools. Our approach: the definition of a feedback manager attached to the execution, which performs this translation on the fly during the model execution.

Step (d) − Feedback manager definition

46/67

slide-64
SLIDE 64

What remains is the runtime translation of the states and steps of the target model back to the source model, so that they can be observed by dynamic analysis tools. Our approach: the definition of a feedback manager attached to the execution, which performs this translation on the fly during the model execution. Proposed interface for feedback managers:

- feedbackState: Update the source model state based on the set of changes applied to the target model state in the last target execution step.
- processTargetStepStart: Translate a starting target step into source steps.
- processTargetStepEnd: Translate an ending target step into source steps.

Step (d) − Feedback manager definition

46/67
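The proposed interface can be sketched as follows. The ADFeedbackManager below is a hypothetical minimal implementation: instead of using real compilation links, it decodes the compiler's naming convention (X_offer, X_take) from step names, purely for illustration:

```python
from abc import ABC, abstractmethod

class FeedbackManager(ABC):
    @abstractmethod
    def feedback_state(self, target_changes):
        """Update the source model state from the changes applied to the
        target model state during the last target execution step."""

    @abstractmethod
    def process_target_step_start(self, target_step):
        """Translate a starting target step into source steps."""

    @abstractmethod
    def process_target_step_end(self, target_step):
        """Translate an ending target step into source steps."""

class ADFeedbackManager(FeedbackManager):
    """Illustrative only: decodes steps such as 'fire(A_offer)' by name."""

    def feedback_state(self, target_changes):
        pass  # would map target Place markings back to heldTokens

    def _decode(self, target_step):
        if not target_step.startswith("fire("):
            return []                           # not observable at source level
        inner = target_step[len("fire("):-1]    # e.g. "A_offer"
        node, kind = inner.rsplit("_", 1)       # ("A", "offer")
        return ["%s(%s)" % (kind, node)]        # e.g. "offer(A)"

    def process_target_step_start(self, target_step):
        return self._decode(target_step)

    def process_target_step_end(self, target_step):
        return self._decode(target_step)
```

A real feedback manager would additionally reconstruct the composite executeNode and executeActivity steps, as shown in the trace example on the following slides.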

slide-65
SLIDE 65

Target Petri net execution trace (invisible to users and tools) Source activity diagram execution trace (seen by users and tools)

Example of execution reconstructed by a feedback manager

47/67

slide-66
SLIDE 66 Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

Target Petri net execution trace (invisible to users and tools) Source activity diagram execution trace (seen by users and tools)

Example of execution reconstructed by a feedback manager

48/67

slide-67
SLIDE 67 Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

Target Petri net execution trace (invisible to users and tools)

e1 e2 Init End A

Source activity diagram execution trace (seen by users and tools)

Example of execution reconstructed by a feedback manager

49/67

slide-68
SLIDE 68 Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

1

run(net)

Target Petri net execution trace (invisible to users and tools)

e1 e2 Init End A

Source activity diagram execution trace (seen by users and tools)

Example of execution reconstructed by a feedback manager

50/67

slide-69
SLIDE 69 Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

1

run(net)

Target Petri net execution trace (invisible to users and tools)

e1 e2 Init End A

I

executeActivity(activity)

Source activity diagram execution trace (seen by users and tools)

Example of execution reconstructed by a feedback manager

51/67

slide-70
SLIDE 70 Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

1

run(net)

2 3

fire(Init_offer)

Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

Target Petri net execution trace (invisible to users and tools)

e1 e2 Init End A

I

executeActivity(activity)

Source activity diagram execution trace (seen by users and tools)

Example of execution reconstructed by a feedback manager

52/67

slide-71
SLIDE 71 Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

1

run(net)

2 3

fire(Init_offer)

Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

Target Petri net execution trace (invisible to users and tools)

e1 e2 Init End A

I

executeActivity(activity) executeNode(Init)

II.1

Source activity diagram execution trace (seen by users and tools)

Example of execution reconstructed by a feedback manager

53/67

slide-72
SLIDE 72 Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

1

run(net)

2 3

fire(Init_offer)

Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

Target Petri net execution trace (invisible to users and tools)

e1 e2 Init End A

I

executeActivity(activity) executeNode(Init)

II.1

e1 e2 Init End A

III.1

offer(Init)

II.2

Source activity diagram execution trace (seen by users and tools)

Example of execution reconstructed by a feedback manager

54/67

slide-73
SLIDE 73 Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

1

run(net)

2 3

fire(Init_offer)

Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

fire(A_take)

4

Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

Target Petri net execution trace (invisible to users and tools)

e1 e2 Init End A

I

executeActivity(activity) executeNode(Init)

II.1

e1 e2 Init End A

III.1

offer(Init)

II.2

Source activity diagram execution trace (seen by users and tools)

Example of execution reconstructed by a feedback manager

55/67

slide-74
SLIDE 74 Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

1

run(net)

2 3

fire(Init_offer)

Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

fire(A_take)

4

Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

Target Petri net execution trace (invisible to users and tools)

e1 e2 Init End A

I

executeActivity(activity) executeNode(Init)

II.1

e1 e2 Init End A

III.1

offer(Init)

II.2

executeNode(A)

III.2

Source activity diagram execution trace (seen by users and tools)

Example of execution reconstructed by a feedback manager

56/67

slide-75
SLIDE 75 Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

1

run(net)

2 3

fire(Init_offer)

Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

fire(A_take)

4

Init_node Init_offer A_take A_offer End_take e1_edge A_node e2_edge

Target Petri net execution trace (invisible to users and tools)

e1 e2 Init End A

I

executeActivity(activity) executeNode(Init)

II.1

e1 e2 Init End A

III.1

offer(Init)

II.2

executeNode(A)

III.2

e1 e2 Init End A

IV III.3

take(A)

Source activity diagram execution trace (seen by users and tools)

Example of execution reconstructed by a feedback manager

57/67

slide-76
SLIDE 76

[Figure: the target Petri net execution trace (invisible to users and tools) — target states 1–7, produced by run(net) and its nested steps fire(Init_offer), fire(A_take), fire(A_offer), and fire(End_take).]

[Figure: the source activity diagram execution trace (seen by users and tools) — source states I–VII, produced by executeActivity(activity) and its nested steps executeNode(Init) (containing offer(Init)), executeNode(A) (containing take(A) and offer(A)), and executeNode(End) (containing take(End)).]

Example of execution reconstructed by a feedback manager

58/67

slide-77
SLIDE 77

Runtime services Target Language Source Language

Source Abstract Syntax Compiler Source Model Target Abstract Syntax Target Model State Definition Target Execution Steps Definition Interpretation Rules Target State

Tracer Debugger

Source Steps Target Steps

...

Target Engine

Target runtime data Source runtime data Target interpreter

Model Procedure dependency conforms to input/output

Compilation Results

Compilation Links Target Model

a

Source Model State Definition

b

Source Execution Steps Definition

c

Feedback Manager

d

+ Compilation Links

Source State Source Steps Source Steps

Approach Overview (4)

59/67

slide-78
SLIDE 78

Runtime services Target Language Source Language

Source Abstract Syntax Compiler Source Model Target Abstract Syntax Target Model State Definition Target Execution Steps Definition Interpretation Rules Target State

Tracer Debugger

Source Steps Target Steps

...

Target Engine

Target runtime data Source runtime data Target interpreter

Model Procedure dependency conforms to input/output

Compilation Results

Compilation Links Target Model

a

Source Model State Definition

b

Source Execution Steps Definition

c

Feedback Manager

d

+ Compilation Links

Source State Source Steps Source Steps Feedback Engine

e

Approach Overview (5)

60/67

slide-79
SLIDE 79

Evaluation

Can we observe and control compiled models? In reasonable time?

61/67

slide-80
SLIDE 80

The common parts (e.g. glue code, APIs, integration layer) of the approach are implemented for the GEMOC Studio, an Eclipse-based language workbench. The source code (Eclipse plugins written in Xtend and Java) is available on GitHub:

Note

As the GEMOC Studio originally focused on interpreted DSLs, this is the first attempt to support compiled DSLs in the GEMOC Studio.

https://github.com/tetrabox/gemoc-compilation-engine

Implementation

62/67

slide-81
SLIDE 81

Evaluation: RQs

63/67

slide-82
SLIDE 82

RQ#1

Given an interpreted DSL and a compiled DSL with trace-equivalent semantics, does the approach make it possible to observe the same traces with both DSLs?

Evaluation: RQs

63/67

slide-83
SLIDE 83

RQ#1

Given an interpreted DSL and a compiled DSL with trace-equivalent semantics, does the approach make it possible to observe the same traces with both DSLs?

RQ#2

Does the approach enable the use of runtime services at the domain-level of compiled DSLs?

Evaluation: RQs

63/67

slide-84
SLIDE 84

RQ#1

Given an interpreted DSL and a compiled DSL with trace-equivalent semantics, does the approach make it possible to observe the same traces with both DSLs?

RQ#2

Does the approach enable the use of runtime services at the domain-level of compiled DSLs?

RQ#3

What is the time overhead when executing compiled models with feedback management?

Evaluation: RQs

63/67

slide-85
SLIDE 85

Evaluation: Setup

64/67

slide-86
SLIDE 86

Considered DSLs − 2 UML-based languages

- a subset of fUML activity diagrams, using Petri nets as a target language;
- a subset of UML state machines, using a subset of Java as a target language.

Each DSL was implemented twice: one interpreted variant and one compiled variant.

Evaluation: Setup

64/67

slide-87
SLIDE 87

Considered DSLs − 2 UML-based languages

- a subset of fUML activity diagrams, using Petri nets as a target language;
- a subset of UML state machines, using a subset of Java as a target language.

Each DSL was implemented twice: one interpreted variant and one compiled variant.

Considered Runtime Services − 2 tools from our previous work

- a trace constructor (ECMFA 2015, SoSym 2017);
- an omniscient debugger (SLE 2015, JSS 2018).

Evaluation: Setup

64/67

slide-88
SLIDE 88

Considered DSLs − 2 UML-based languages

- a subset of fUML activity diagrams, using Petri nets as a target language;
- a subset of UML state machines, using a subset of Java as a target language.

Each DSL was implemented twice: one interpreted variant and one compiled variant.

Considered Runtime Services − 2 tools from our previous work

- a trace constructor (ECMFA 2015, SoSym 2017);
- an omniscient debugger (SLE 2015, JSS 2018).

Considered Models − random generation

100 fUML activity diagrams in 10 groups ranging from 10 to 100 nodes, 30 UML state machines from 10 to 100 states, and 3 scenarios per state machine.

Evaluation: Setup

64/67

slide-89
SLIDE 89

Evaluation: Results

65/67

slide-90
SLIDE 90

RQ#1: same traces between interpreted and compiled variants?

All 130 generated models were executed with both the interpreted and the compiled variants of both executable DSLs; no difference was found when comparing the traces.

Evaluation: Results

65/67

slide-91
SLIDE 91

RQ#1: same traces between interpreted and compiled variants?

All 130 generated models were executed with both the interpreted and the compiled variants of both executable DSLs; no difference was found when comparing the traces.

RQ#2: working runtime services?

both runtime services (trace constructor and omniscient debugger) work as expected at the domain-level

Evaluation: Results

65/67

slide-92
SLIDE 92

RQ#1: same traces between interpreted and compiled variants?

All 130 generated models were executed with both the interpreted and the compiled variants of both executable DSLs; no difference was found when comparing the traces.

RQ#2: working runtime services?

both runtime services (trace constructor and omniscient debugger) work as expected at the domain-level

RQ#3: execution time overhead when using the feedback manager?

fUML activity diagrams → Petri nets: 1.6 times slower on average
UML state machines → MiniJava: 1.01 times slower on average

Evaluation: Results

65/67

slide-93
SLIDE 93

Summary

Observing and controlling the execution of compiled models is difficult, and there is a lack of a systematic approach to designing compiled DSLs with that goal in mind. Our proposal: a generic language engineering architecture to define explicit feedback management in compiled DSLs.

Perspectives (excerpt)

- handling compilers defined as code generators;
- providing an easier way to define feedback managers;
- managing stimuli sent to the source model during the execution;
- measuring the amount of effort required to define a feedback manager, compared to defining an interpreter.

Conclusion

66/67

slide-94
SLIDE 94

Thank you!

GitHub: https://github.com/tetrabox/gemoc-compilation-engine
Twitter: @erwan_bousse
Email: erwan.bousse@ls2n.fr

67/67