Model-Based Testing Real-Time and Interactive Music Systems
Clément Poncelet Sanchez, Florent Jacquemard
Team: RepMus
SYNCHRON 2016
Thesis defended on 11/10/16
Score-Based Interactive Music Systems
A Mixed Score specifies input events (e) and output actions (a); the IMS processes these discrete inputs/outputs under hard real-time, synchronous constraints.
Mixed Score
Specified inputs → Interpretation: infinitely many (∞) possible performances.
Set of relevant inputs
Set of corresponding implementation outputs
Computation of expected outputs
Timed conformance
Manual methods: rehearsals, i.e. performances in the IMS environment.
generation
execution (virtual clocks)
comparison
conformance criteria
feedback
Environment model ↔ Implementation Under Test (exchanging events e and actions a)
Automatic construction of the environment model; bounded performances.
1. Objectives
2. Testing Framework
3. Interactive Real-Time Model
Offline Approach (.tin, .tref, .tout) and Online Approach
Model: e1? d11 msg11! d12 e2? d21 msg21! …
Construction: from high level (Mixed Score) to models
Model-Based Testing: from models to verdict
Offline workflow (Model + Mixed Score):
1. Inputs generation → .tin
2. Simulation: compute expected outputs → .tref
3. Execution: compute real outputs → .tout
4. Timed conformance
1. Objectives  2. Testing Framework  3. Interactive Real-Time Model
Timed Input Trace
Definition: a timed trace is a sequence of tuples <s, t, p>:
  s: symbol
  t: timestamp, in time units
  p: pace, in time units per minute
Specified inputs (tempo: 120 bpm):
  <e1, 0, 120> <e2, 2, 120> <e3, 2.33, 120> <e4, 2.66, 120> <e5, 3, 120> <e6, 6, 120>
An interpretation introduces variations:
  <e1, 0, 120> <e2, 1.9, 120> <e4, 2.76, 120> <e5, 3.2, 120> <e6, 5.9, 120>
and errors (here e3 is missed), with tempo fluctuations (.tin):
  <e1, 0, 119> <e2, 1.9, 80.9> <e4, 2.76, 114> <e5, 3.2, 115.3> <e6, 5.9, 119>
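The trace definition above can be sketched in code (an illustrative sketch; the class and function names are mine, not from the thesis tooling):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TimedEvent:
    symbol: str   # s: event symbol, e.g. "e1"
    time: float   # t: timestamp, in time units (musical beats here)
    pace: float   # p: pace, in time units per minute (the local tempo)

# Specified inputs at a constant tempo of 120 bpm
spec = [TimedEvent("e1", 0.0, 120.0),
        TimedEvent("e2", 2.0, 120.0),
        TimedEvent("e3", 2.33, 120.0)]

def to_seconds(ev: TimedEvent) -> float:
    """Convert a musical timestamp to wall-clock seconds at the event's pace."""
    return ev.time * 60.0 / ev.pace

print(to_seconds(spec[1]))  # 2 beats at 120 bpm take 1.0 second
```

The pace field is what lets the same symbolic score yield different wall-clock timings across interpretations.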
Timed Output Trace: <a1, 0, 60> <a2, 2.66, 60> <a3, 3, 60>
Expected trace (.tref): which timed outputs should correspond to a given timed input trace (.tin)?
Set of relevant inputs → .tin
Computation of expected outputs → .tref
Set of corresponding implementation outputs → .tout
Timed conformance
Interactive Real-Time Model (IRTM), based on Timed Automata with Inputs/Outputs (TAIO): model checking and decidability results apply.
Model = E + S:
  Environment model (E): bounds the possible performances.
  System model (S): computes the expected outputs; combines TA aspects with synchronous aspects.
Input/Output System Specification: e1? a1! a2!
[Jan Tretmans. Model Based Testing with Labelled Transition Systems. Formal Methods and Testing, An Outcome of the FORTEST Network, 2008.]
[Jan Tretmans. Model Based Testing. Software and Systems Safety: Specification and Verification.]
Simulation:
  Receive e1 (from .tin)
  Send a1 (to .tref)
  Send a2 (to .tref)
System Specification: Timed Automata
[Alur, Dill. A Theory of Timed Automata. Theoretical Computer Science, 1994.]
Example annotations: guard c1 == 0.5, invariant c1 <= 0.5, reset c1 := 0.0
Time: a finite set of clocks valued over the reals (ci ∈ ℝ+), all progressing at the same rate (abstract time); constrained by guards and invariants, reset by assignments; urgent locations.
[Testing real-time systems under uncertainty. FMCO'10.]
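As a minimal concrete illustration of these clock annotations (a sketch only; the thesis models use symbolic, UPPAAL-style timed automata, not concrete clock values):

```python
# One TA edge with invariant c1 <= 0.5, guard c1 == 0.5, reset c1 := 0.0
def invariant(c1: float) -> bool:
    return c1 <= 0.5          # time may elapse only while this holds

def guard(c1: float) -> bool:
    return c1 == 0.5          # the edge is enabled exactly at 0.5

clocks = {"c1": 0.0}
dt = 0.5
assert invariant(clocks["c1"] + dt)   # elapsing dt keeps the invariant
clocks["c1"] += dt                    # all clocks advance at the same rate
assert guard(clocks["c1"])            # edge enabled: take it...
clocks["c1"] = 0.0                    # ...and apply the reset
```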
Extended aspects: multiple time units, alternation.
Example: e1? a1! 0.125 mu a2! e2? a3! 93 ms a4! 93 ms a5!
(mu: musical time unit; ms: milliseconds)
TA aspects: clock constraints; discrete/temporal transitions.
Synchronous aspects. A state is <t, n> [controls] { symbols }, with dense time t and controls Ci ∈ ℝ+.
[Poncelet, Jacquemard. Model Based Testing of an Interactive Music System. 30th ACM/SIGAPP Symposium on Applied Computing (ACM SAC, 2015).]
[Poncelet, Jacquemard. Model-Based Testing for Building Reliable Realtime Interactive Music Systems. Science of Computer Programming (SCP, 2016).]
Simulation of: e1? a1! 0.125 mu a2! e2? a3! 93 ms a4! 93 ms a5!
  <0, 0> [C1=0] { }                      initial state
  <0, 1> [C1=0 :: C2=0] { }              lead (alternation)
  <0, 1> [C1=0 :: C2=0] { }              lead, wait (cooperative scheduling)
  <0, 1> [C1=0 :: C2=0] { }              lead, wait, suspended (end of logical instant)
  Receive e1 (from .tin): <0, 1> [C1=0 :: C2=0] { e1 }
  <0, 2> [C1=0 :: C2=0] { e1 }           extended aspects: priorities
  Send a1 (to .tref):     <0, 3> [C1=0 :: C2=0] { e1 }
  Delay 0.040:            <0.040, 0> [C1=0.040 :: C2=0.040] { }
  Expire:                 <0.040, 1> [C1=0 :: C2=0.040] { }
  Send a2 (to .tref):     <0.040, 2> [C1=0 :: C2=0.040] { }
  Remove control point:   <0.040, 3> [C2=0.040] { }
Variant with two inputs in the same instant:
  Receive e2:             <0, 2> [C1=0 :: C2=0] { e1, e2 }
  Send min(a1, a3):       <0, 3> [C1=0 :: C2=0] { e1, e2 }   (priority between pending sends)
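The step sequence above can be mimicked by a small state object (an illustrative sketch; the actual virtual machine is written in C++, and the names here are mine):

```python
class SimState:
    """State <t, n> [controls] { symbols } of the model simulation."""
    def __init__(self, controls):
        self.t = 0.0                    # dense time
        self.n = 0                      # discrete step inside the logical instant
        self.controls = dict(controls)  # control-point clocks, e.g. {"C1": 0.0}
        self.symbols = set()            # input symbols received in this instant

    def receive(self, sym):             # discrete transition: input event
        self.symbols.add(sym)
        self.n += 1

    def send(self, action, tref):       # discrete transition: emit an action
        tref.append((action, self.t))
        self.n += 1

    def delay(self, d):                 # temporal transition: end the instant,
        self.t += d                     # advancing every clock at the same rate
        for c in self.controls:
            self.controls[c] += d
        self.n = 0
        self.symbols.clear()

tref = []
s = SimState({"C1": 0.0, "C2": 0.0})
s.receive("e1")                         # step 1 of the instant, symbols = {e1}
s.send("a1", tref)                      # step 2, a1 logged at t = 0
s.delay(0.040)                          # new instant at t = 0.040
print(s.t, s.controls, tref)
```

Clock expiration and control-point removal are omitted here; the sketch only shows how discrete steps (n) and temporal steps (t) interleave.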
Environment model (Interpretation, 𝓕): e1! 0.5 mu e2! 0.5 mu e3! 0.5 mu
TA aspects, non-determinism:
  missed events: each ei! may be skipped;
  duration bounds: e1! [0.3,0.7] mu e2! [0.3,0.7] mu e3! [0.3,0.7] mu
1. Objectives  2. Interactive Real-Time Model  3. Testing Framework
Publications (journals and conferences):
- Poncelet, Jacquemard. Model Based Testing of an Interactive Music System. 30th ACM/SIGAPP Symposium on Applied Computing (ACM SAC, 2015).
- Poncelet, Jacquemard. Model-Based Testing for Building Reliable Realtime Interactive Music Systems. Science of Computer Programming (SCP, 2016).
- An automatic test framework for interactive music systems. Journal of New Music Research (JNMR), 2016.
- Test Methods for Score-Based Interactive Music Systems. ICMC-SMC, 2014.
- Burloiu, Cont, Poncelet. A visual framework for dynamic mixed music notation. Journal of New Music Research (JNMR), 2016.
Developments:
- Front-end compiler of the Antescofo DSL (C++, ~13,000 LOC)
- Virtual machine (C++, ~3,000 LOC)
- Conformance and trace manager (C++, ~4,000 LOC)
- Antescofo adaptors (C++); application to Antescofo and regression tests
- ~20 scripts for test execution (Perl)
Contributions: Interactive Real-Time Model; Testing Framework (Gherkin); other applications.
Offline Approach and Online Approach: construction from high level (Mixed Score) to models; Model-Based Testing from models to verdict.
[An automatic test framework for interactive music systems. Journal of New Music Research (JNMR), 2016.]
Translation: from IRTM into TA (under restrictions): build, simulate, verify.
[Testing real-time systems under uncertainty. FMCO'10.]
[Blom, Hessel, Jonsson, Pettersson. Specifying and Generating Test Cases Using Observer Automata. FATES'04.]
Offline input generation with Timed Automata: covering queries (location/transition/path coverage) & durations → generation of relevant inputs, with a coverage guarantee.
Example model: e1? a1! 0.125 mu a2! e2? a3! 0.125 mu a4! 0.125 mu a5!
[Test Methods for Score-Based Interactive Music Systems. ICMC-SMC, 2014.]
[Blom, Hessel, Jonsson, Pettersson. Specifying and Generating Test Cases Using Observer Automata. FATES'04.]
Fuzz input generation with Time Functions (TIF): a random interpretation = local shifts and global tempo changes applied to the reference trace → .tin
[Henkjan Honing. From Time to Time: The Representation of Timing and Tempo. Computer Music Journal, 2001.]
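The idea of a random interpretation can be sketched as follows (a simplification: the thesis uses Time Functions to shape tempo over the whole score, whereas this sketch applies one global tempo change and independent local shifts; names and ranges are illustrative):

```python
import random

def random_interpretation(spec, local_shift=0.05, tempo_var=0.05, seed=None):
    """Fuzz a specified trace of <s, t, p> tuples into a plausible
    performance: one global tempo change plus a small local shift
    on each event's timestamp."""
    rng = random.Random(seed)
    factor = 1.0 + rng.uniform(-tempo_var, tempo_var)  # global tempo change
    performed = []
    for sym, t, p in spec:
        shift = rng.uniform(-local_shift, local_shift) if t > 0 else 0.0
        performed.append((sym, max(0.0, t * factor + shift), p * factor))
    return performed

spec = [("e1", 0.0, 120.0), ("e2", 2.0, 120.0), ("e3", 2.33, 120.0)]
print(random_interpretation(spec, seed=42))
```

Seeding the generator keeps each fuzzed trace reproducible, which matters when a failing input must be replayed against the implementation.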
Execution: the Implementation Under Test, wrapped in an adapter, consumes .tin (events e) and produces .tout (actions a).
Timed conformance (Definition): set inclusion of the real timed output traces (.tout) into the expected timed output traces (.tref).
Non-conformance cases: missed actions; unexpected actions; timing deviation Delta > ε.
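Per pair of traces, the comparison behind this verdict can be sketched as follows (a simplification: the thesis defines conformance as set inclusion over all expected traces; the function name and default ε are illustrative):

```python
def conforms(tout, tref, eps=0.05):
    """Compare a real output trace against one expected trace.
    Returns (verdict, reason): fails on a missed action, an
    unexpected action, or a timing deviation delta > eps."""
    for i, (action, t_exp) in enumerate(tref):
        if i >= len(tout):
            return False, f"missed action {action}"
        a_real, t_real = tout[i]
        if a_real != action:
            return False, f"unexpected action {a_real}"
        if abs(t_real - t_exp) > eps:
            return False, f"delta > eps on {action}"
    if len(tout) > len(tref):
        return False, f"unexpected action {tout[len(tref)][0]}"
    return True, "conformant"

tref = [("a1", 0.0), ("a2", 2.66), ("a3", 3.0)]
tout = [("a1", 0.0), ("a2", 2.70), ("a3", 3.01)]
print(conforms(tout, tref))  # deltas 0.04 and 0.01 are within eps
```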
Online Approach: from model to verdict, on the fly.
70
e1? d11 msg11! d12 e2? d21 msg21! . . . . . .
(Simulation) Compute Expected Outputs On-the-fly Inputs Generation
Model = bytecode
e1? d11 msg11! d12 e2? d21 msg21! . . . . . .et at
Implementation Under Test
guarantee
a’
(Execution) Compute Real Outputs
a d a p t
On-the-fly Comparison
Virtual Machine
a e
Generator
e
71
1. Objectives  2. Interactive Real-Time Model  3. Testing Framework  4. Application to Antescofo
Antescofo: a Listening machine (estimating tempo and position) coupled to a Reactive engine.
Construction: from high level to model. Workflow with a Mixed Score in the Antescofo DSL:
1. Inputs generation → .tin
2. Simulation: compute expected outputs → .tref
3. Execution: compute real outputs → .tout
4. Timed conformance
Construction: from high level to model, automatic.
Domain Specific Language (Antescofo) → Interactive Real-Time Model, via inference rules over FSM parts & connectors, plus operators.
The overall model is the parallel composition Menv ∥ P ∥ A: environment model (Menv), proxy (P), and system model (A), each in ms.
Antescofo's Listening machine and Reactive engine are tested as a black box.
Execution on Antescofo (black box: Listening machine + Reactive engine, with tempo feedback):
.tin <e1, 0, 120> <e2, 0.5, 120> <e3, 1, 120> → .tout
Approach Offline: CoVer
Benchmark: Sonata in F major, Georg Friedrich Händel
  Measures   Duration   Events   Actions
      5        10 s        25        84
      8        16 s        48       185
     10        20 s        74       264
     15        30 s       122       444
     40        80 s       360      1218
[Plots: coverage (% locations), generation duration (seconds), and number of generated traces vs measures, for consecutive misses k and % allowed duration variation (series 0-00, 3-10, 7-25).]
Approach Offline: Fuzz (40 measures, 80 s, 360 events, 1218 actions; 10 traces)
Generated inputs are relevant, but without a coverage guarantee.
[Plots: coverage (% locations) vs number of traces (1 to 1000); generation duration (seconds), with consecutive misses and % allowed duration variation.]
Approach Online: 10 traces over entire measures; inputs are relevant, with guarantee.
[Plot: coverage (% locations), around 58-62%, for 10, 50 and 100 traces.]
Appendix: simulation of the model a! b! b? a?
  <0, 0> [C1=0 :: C2=0] { }
  <0, 1> [C1=0 :: C2=0] { a }
  <0, 2> [C1=0 :: C2=0] { a, b }
  <0, 3> [C2=0] { a, b }
  <0, 4> [C2=0] { a, b }
  <0, 5> [C2=0] { a, b }