Model-Based Testing: There is Nothing More Practical than a Good Theory - PowerPoint PPT Presentation



SLIDE 1

Model-Based Testing

There is Nothing More Practical than a Good Theory

Jan Tretmans

ESI – Embedded Systems Innovation by TNO
Radboud University Nijmegen
Högskolan i Halmstad
jan.tretmans@tno.nl

[diagram: model → test generation → SUT → verdict pass / fail]

SLIDE 2

Jan Tretmans

ESI – Embedded Systems Innovation (TNO), Eindhoven, The Netherlands
Radboud University, Nijmegen, The Netherlands

SLIDE 3

Overview: Model-Based Testing

1. Introduction
   – Motivation
   – Basics: what is it
   – Context: model-based verification, validation, code generation, ...

2. Theory: MBT with labelled transition systems
   – models
   – the implementation relation ioco
   – testing for ioco

3. Tools
   – TorXakis, a tool for ioco testing
   – representation of models: language

4. Applications
   – Dropbox

SLIDE 4

Model-Based Testing Motivation

SLIDE 5

Trends & Challenges

multidisciplinarity · complexity · size · connectivity · systems-of-systems · change · variability · evolvability · uncertainty · heterogeneous components

Model-Based Testing

SLIDE 6

connectivity · multidisciplinarity · change · variability · evolvability · complexity · uncertainty · heterogeneous components

MBT: Next-Step Challenges

model composition · abstraction · under-specification · uncertainty · nondeterminism · concurrency · parallelism · state + complex data · statistical usage profiles · partial specification · multiple paradigms · integration · test selection

Model-Based Testing

SLIDE 7

Model-Based Testing Basics

SLIDE 8

Software Testing

Checking or measuring some quality characteristics of an executing software object, by performing experiments in a controlled way, w.r.t. a specification.

[diagram: tester, guided by the specification, experiments on the SUT (System Under Test)]

Here: specification-based, active, black-box testing of functionality.

SLIDE 9

1: Manual Testing

[diagram: tester manually operates the SUT (System Under Test); verdict pass / fail]

SLIDE 10

2: Scripted Testing

[diagram: test cases (e.g. TTCN) → test execution → SUT; verdict pass / fail]

SLIDE 11

3: Keyword-Driven Testing

[diagram: high-level test notation → test scripts → test execution → SUT; verdict pass / fail]

SLIDE 12

4: Model-Based Testing

[diagram: system model → model-based test generation → test cases (TTCN) → test execution → SUT; verdict pass / fail]

SLIDE 13

MBT: Example Models

[coffee-machine LTS with labels ?coin ?button !alarm ?button !coffee]

SLIDE 14

MBT : Example Models

SLIDE 15

MBT: next step in test automation

• automatic test generation, plus test execution and result analysis
• more, longer, and diversified test cases: more variation in test flow and in test data
• the model is a precise and consistent test basis: unambiguous analysis of test results
• test maintenance by maintaining models: improved regression testing
• expressing test coverage: model coverage, customer-profile coverage

MBT: Benefits – detecting more bugs, faster and cheaper

[diagram: model → model-based test generation → test execution → SUT; verdict pass / fail]

SLIDE 16

Model-Based Verification, Validation, Testing, ...

SLIDE 17

Doing Something with Models

• Modelling: making a model reveals errors
• Simulation: go step-by-step through the model
• Model checking: go through all states of the model
• Theorem proving: prove theorems about the model
• Code generation: executable code from the model
• Testing: test an implementation for compliance
• Model learning: generate a model from observations

SLIDE 18

Validation, Verification, Testing

[diagram: informal requirements (informal world), model (formal world), SUT (real world); validation, verification, and model-based testing relate them]

SLIDE 19

Code Generation from a Model

A model is more (and less) than code generation:

• views
• abstraction
• testing of aspects
• verification and validation of aspects
SLIDE 20

Code Generation from a Model

? x (x ≥ 0)   ! y ( y·y = x )   – a model of √x

• specification of properties rather than construction
• under-specification
• non-determinism

SLIDE 21

Spectrum of Models

realization – virtualization – design models – abstract (test) models

SLIDE 22

Model-Based Testing

[diagram: system model → model-based test generation → test cases (TTCN) → test execution → SUT; verdict pass / fail; SUT conforms to model]

SLIDE 23

Testing with Models

[diagram: tester, environment, and system with software, mechanical, and physical parts]

SLIDE 24

Testing with Models

[diagram: tester with an environment model; the system's software, mechanical, and physical parts each have a model]

SLIDE 25

A Theory of Model-Based Testing with Labelled Transition Systems

SLIDE 26

MBT with LTS Models

SLIDE 27

Models: Labelled Transition Systems

Labelled Transition System:  ⟨ S, LI, LU, T, s0 ⟩
• states S
• input labels LI, output labels LU  (? = input, ! = output)
• transitions T ⊆ S × (L ∪ {τ}) × S
• initial state s0 ∈ S

Example: ?coin ?button !alarm ?button !coffee
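A ⟨S, LI, LU, T, s0⟩ structure like this is easy to encode directly. A minimal Python sketch of the coffee-machine example (the labels come from the slide; the state names and the exact branching are assumptions):

```python
# The coffee-machine LTS from the slide, encoded as plain Python.
# Assumed behaviour: ?button without a prior ?coin raises !alarm.

COFFEE_MACHINE = {
    "inputs": {"?coin", "?button"},
    "outputs": {"!coffee", "!alarm"},
    "transitions": {                     # T : (state, label) -> next state
        ("s0", "?coin"): "s1",
        ("s0", "?button"): "s2",         # button pressed without a coin ...
        ("s2", "!alarm"): "s0",          # ... raises an alarm
        ("s1", "?button"): "s3",
        ("s3", "!coffee"): "s0",
    },
    "initial": "s0",
}

def enabled(lts, state):
    """Labels enabled in a given state."""
    return {lab for (s, lab) in lts["transitions"] if s == state}

def run(lts, trace):
    """Execute a trace from the initial state; None if it is not a trace of the LTS."""
    state = lts["initial"]
    for label in trace:
        state = lts["transitions"].get((state, label))
        if state is None:
            return None
    return state
```

For instance, `run(COFFEE_MACHINE, ["?coin", "?button", "!coffee"])` returns to the initial state, while a trace the model does not allow yields `None`.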

SLIDE 28

MBT with LTS ioco

SLIDE 29

MBT: Labelled Transition Systems

[diagram: LTS model → ioco test generation → set of LTS test cases → LTS test execution → SUT (behaving as an input-enabled LTS); verdict pass / fail]

input/output conformance: ioco

SUT passes tests  ⇔  SUT ioco model
⇐ : sound;  ⇒ : exhaustive

SLIDE 30

Input/Output Conformance: ioco

quiescence:  δ(p)  ⇔  ∀ !x ∈ LU ∪ {τ} : p —!x↛
out(P)  =  { !x ∈ LU | p —!x→, p ∈ P }  ∪  { δ | δ(p), p ∈ P }
Straces(s)  =  { σ ∈ (L ∪ {δ})* | s =σ⇒ }
p after σ  =  { p′ | p =σ⇒ p′ }

i ioco s  ⇔def  ∀ σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)

where s is a labelled transition system and i is (assumed to be) an input-enabled LTS.
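These definitions are directly executable for small, τ-free LTSs. A bounded sketch (the dict encoding, the toy trace bound, and the examples are assumptions of this sketch; it checks Straces only up to a fixed length, so it is a bounded check, not a proof):

```python
# An LTS is {state: [(label, next_state), ...]}; "!" marks outputs,
# "?" marks inputs, and DELTA stands for the quiescence label delta.

DELTA = "delta"

def quiescent(lts, s):
    """delta(s): s enables no output (tau is assumed absent)."""
    return not any(l.startswith("!") for (l, _) in lts.get(s, []))

def succ(lts, states, label):
    """states 'after' one label; delta loops on quiescent states."""
    if label == DELTA:
        return {s for s in states if quiescent(lts, s)}
    return {t for s in states for (l, t) in lts.get(s, []) if l == label}

def out(lts, states):
    """out(P): enabled outputs, plus delta if some state is quiescent."""
    o = {l for s in states for (l, _) in lts.get(s, []) if l.startswith("!")}
    if any(quiescent(lts, s) for s in states):
        o.add(DELTA)
    return o

def after(lts, init, sigma):
    states = {init}
    for label in sigma:
        states = succ(lts, states, label)
    return states

def straces(lts, init, depth):
    """Suspension traces of length <= depth (bounded to handle loops)."""
    result, frontier = [()], [((), {init})]
    for _ in range(depth):
        nxt = []
        for sigma, states in frontier:
            labels = {l for s in states for (l, _) in lts.get(s, [])} | {DELTA}
            for l in labels:
                states2 = succ(lts, states, l)
                if states2:
                    nxt.append((sigma + (l,), states2))
        frontier = nxt
        result += [sigma for sigma, _ in frontier]
    return result

def ioco(impl, i0, spec, s0, depth=6):
    """Bounded check of: for all sigma in Straces(spec):
       out(impl after sigma) subset of out(spec after sigma)."""
    for sigma in straces(spec, s0, depth):
        i_states = after(impl, i0, sigma)
        if i_states and not out(impl, i_states) <= out(spec, after(spec, s0, sigma)):
            return False
    return True
```

For example, an implementation that answers `!coffee` where the specification only allows `!tea` after `?dime` is rejected by this check.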

SLIDE 31

Example: ioco

specification model: a vending machine over inputs ?dime, ?quart and outputs !coffee, !tea, !choc
[LTS diagrams comparing the specification model with several candidate implementations]

non-determinism · uncertainty · under-specification

SLIDE 32

MBT: Nondeterminism, Underspecification

specification model:  ? x (x ≥ 0)   ! y ( |y·y − x| < ε )

[SUT models: candidates handling ? x (x ≥ 0) and ? x (x < 0), replying ! x, ! -x, or !error]

• non-determinism
• under-specification
• specification of properties rather than construction

SLIDE 33

MBT with LTS Testing for ioco

SLIDE 34

Test Case

test case = labelled transition system
– 'quiescence' / 'time-out' label θ
– tree-structured
– finite, deterministic
– final states pass and fail
– from each state ≠ pass, fail:
  • either one input !a
  • or all outputs ?x and θ

[example test tree over !dub, !kwart, ?coffee, ?tea with pass and fail leaves]

SLIDE 35

Test Generation Algorithm: ioco

Algorithm to generate a test case t(S) from a set S of states, S ≠ ∅ (initially S = s0 after ε). Apply the following steps recursively, non-deterministically:

1. end test case: verdict pass
2. supply input !a: continue with t( S after ?a )
3. observe all outputs:
   – allowed outputs ?x (with !x ∈ out(S)): continue with t( S after !x )
   – forbidden outputs ?y (with !y ∉ out(S)): verdict fail
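The three steps can be sketched as a small recursive generator. A Python sketch (the tree encoding, the random 50/50 choice between steps, and the omission of quiescence observation are assumptions of this sketch):

```python
import random

# Specification: a tau-free LTS {state: [(label, next_state), ...]},
# "?" marking inputs, "!" marking outputs. A generated test is a tree:
#   ("pass",) | ("input", a, subtest) | ("observe", {output: subtest})
# Outputs missing from an "observe" node are forbidden and yield fail.

def succ(spec, S, label):
    """The state set S 'after' one label."""
    return {t for s in S for (l, t) in spec.get(s, []) if l == label}

def gen_test(spec, S, depth, rnd=random):
    if depth == 0 or not S:
        return ("pass",)                          # step 1: end test case
    inputs = [l for s in S for (l, _) in spec.get(s, []) if l.startswith("?")]
    if inputs and rnd.random() < 0.5:             # step 2: supply an input
        a = rnd.choice(inputs)
        return ("input", a, gen_test(spec, succ(spec, S, a), depth - 1, rnd))
    allowed = {l for s in S for (l, _) in spec.get(s, []) if l.startswith("!")}
    return ("observe",                            # step 3: observe outputs
            {x: gen_test(spec, succ(spec, S, x), depth - 1, rnd) for x in allowed})

def run_test(test, outputs):
    """Run a test against a recorded sequence of SUT outputs
    (quiescence observation is omitted in this sketch)."""
    if test[0] == "pass":
        return "pass"
    if test[0] == "input":
        return run_test(test[2], outputs)         # the input is supplied to the SUT
    if not outputs:
        return "pass"
    head, rest = outputs[0], outputs[1:]
    if head not in test[1]:
        return "fail"                             # forbidden output observed
    return run_test(test[1][head], rest)
```

A hand-built test `("input", "?dime", ("observe", {"!tea": ("pass",)}))` passes an SUT that answers `!tea` and fails one that answers `!coffee`.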

SLIDE 36

Example: ioco Test Generation

specification model s:  ?dime, then !tea
implementation i:  ?dime, then !tea or !choc
generated test case t:  !dime, then observe ?coffee / ?tea / ?choc, with pass and fail leaves

Does i ioco s hold?  i fails t.

i ioco s  ⇔def  ∀ σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
SLIDE 37

MBT with ioco is Sound and Exhaustive

s ∈ LTS;  SUT behaving as an input-enabled LTS;  test tool gen : LTS → ℘(TTS)

SUT passes gen(s)  ⇔  SUT conforms to s
⇐ : sound;  ⇒ : exhaustive

Prove soundness and exhaustiveness:
∀ m ∈ IOTS :  ( ∀ t ∈ gen(s) : m passes t )  ⇔  m ioco s

Test assumption:
∀ SUT ∈ IMP : ∃ m_SUT ∈ IOTS : ∀ t ∈ TESTS : SUT passes t ⇔ m_SUT passes t

verdict: pass / fail

SLIDE 38

TorXakis: A Tool for MBT with LTS

SLIDE 39

MBT Tools ioco

  • AETG
  • Agatha
  • Agedis
  • Autolink
  • Axini Test Manager
  • Conformiq
  • Cooper
  • Cover
  • DTM
  • fMBT
  • Gst
  • Gotcha
  • Graphwalker
  • JTorX
  • MaTeLo
  • MBTsuite
  • M-Frame
  • MISTA
  • NModel
  • OSMO
  • ParTeG
  • Phact/The Kit
  • PyModel
  • QuickCheck
  • Reactis
  • Recover
  • RT-Tester
  • SaMsTaG
  • Smartesting CertifyIt
  • Spec Explorer
  • StateMate
  • STG
  • tedeso
  • Temppo
  • TestGen (Stirling)
  • TestGen (INT)
  • TestComposer
  • TestOptimal
  • TGV
  • Tigris
  • TorX
  • TorXakis
  • T-Vec
  • Tveda
  • Uppaal-Cover
  • Uppaal-Tron
  • . . . . . . . . . . .

SLIDE 40

Yet Another MBT Tool : TorXakis

[the same tool list as above, with TorXakis highlighted]

TorXakis

SLIDE 41

MBT Next-Step Challenges – TorXakis

under-specification · partial specification · uncertainty · nondeterminism · multiple paradigms · integration · test selection · model composition · state + complex data · abstraction · concurrency · parallelism · statistical usage profiles

[diagram: TorXakis between model and SUT]

SLIDE 42

TorXakis: A Black-Box View on Systems

[black-box system view: SUT with channels a, b, x, y, modelled as a state-transition system]

model fragment:  a?n  a?k  x!n+1  x!p [[p<n]]  b?m  y!`no`  a?n  y!`yes`

SLIDE 43

MBT with LTS TorXakis Representation of LTS Models

SLIDE 44

Representing LTS

• Explicit:  ⟨ { S0, S1, S2, S3 }, { C10, Coffee, Tea }, { (S0, C10, S1), (S1, Coffee, S2), (S1, Tea, S3) }, S0 ⟩
• Transition tree / graph:  S0 —C10→ S1;  S1 —Coffee→ S2;  S1 —Tea→ S3
• Language:  S ::= C10 >-> ( Coffee ## Tea )
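The language operators can be given a toy trace semantics in a few lines. A Python sketch (the function names are hypothetical; a process is represented here by its prefix-closed set of traces, which is enough to relate the expression to the explicit LTS):

```python
# Toy denotation of the behaviour operators, assuming:
#   prefix(a, p)  models  a >-> p    (action prefix)
#   choice(p, q)  models  p ## q     (choice)
#   STOP          models  deadlock

STOP = {()}                       # only the empty trace

def prefix(a, p):                 # a >-> p
    return {()} | {(a,) + t for t in p}

def choice(p, q):                 # p ## q
    return p | q

# The expression  S ::= C10 >-> ( Coffee ## Tea )
S = prefix("C10", choice(prefix("Coffee", STOP), prefix("Tea", STOP)))
```

`S` then contains exactly the traces of the explicit LTS above: the empty trace, ("C10",), ("C10", "Coffee"), and ("C10", "Tea").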

SLIDE 45

Representing LTS

A >-> B          action prefix: A, then B
A ## B           choice between A and B
STOP             deadlock: no transitions
A >-> B ||| C >-> D    interleaving of A·B and C·D

SLIDE 46

Representing LTS

p where p ::= A >-> p            (A, repeated forever)
q where q ::= A >-> ( B ||| q )  (each A interleaved with a following B)

SLIDE 47

STS: Symbolic Transition Systems

in ? n :: Int  [[ n ≠ 0 ]]
out ! m :: Int  [[ 0 < m < n ]]
out ! m :: Int  [[ 0 < m < -n ]]

semantics: an (infinite) concrete LTS with transitions in 1, in 2, in -1, in -2, ..., out 1, out 2, out 3, ...

SLIDE 48

TorXakis: Defining Behaviour – STS

basic behaviour = transition system
complex behaviour = process-algebraic composition of transition systems:
• named behaviour definition
• named behaviour use
• sequence
• choice
• parallel
• communication
• exception
• interrupt
• hiding / abstraction

Example: A ?n  A ?k  X !n+1  X !p [[p<n]]  B ?m  Y !`no`  A ?n  Y !`yes`

SLIDE 49

Hello World

SLIDE 50

TorXakis: Overview

Models
• state-based control flow and complex data
• support for parallel, concurrent systems
• composing complex models from simple models
• non-determinism, uncertainty
• abstraction, under-specification

Tool
• on-line MBT tool

Under the hood
• powerful constraint/SMT solvers (Z3, CVC4)
• well-defined semantics and algorithms
• ioco testing theory for symbolic transition systems
• algebraic data-type definitions

But ...
• research prototype
• poor usability

Current research
• test selection
• partial models & composition

Applications
• several high-tech systems companies
• experimental level

https://github.com/TorXakis
SLIDE 51

Model-Based Testing Applications

SLIDE 52

Experience with MBT in Embedded Systems Industry

Systems
• large, complex systems
• complex state + complex data
• not always up-to-date specifications
• compositional, parallelism
• under-specification, non-determinism

Testing
• state of practice: keyword-driven test automation
• instrumentation: existing keyword-driven test automation of the SUT
• testing on real software with simulated hardware/physics

Models
• how to make models?
• who makes models? testers
• DSLs (Domain-Specific Languages)
SLIDE 53

Model-Based Testing Testing Dropbox with TorXakis

SLIDE 54

Model-Based Testing of Dropbox

Inspired by Dropbox testing with Quviq QuickCheck:
J. Hughes, B. Pierce, T. Arts, U. Norell, Mysteries of Dropbox: Property-Based Testing of a Distributed Synchronization Service. In IEEE ICST, pp. 135–145.

1. Dropbox
2. Testing approach
3. Model
4. Test runs

SLIDE 55

Dropbox: A File Synchronization Service

[diagram: Dropbox Server synchronizing multiple client devices]

SLIDE 56

Testing Dropbox

[diagram: Dropbox Server with Dropbox Clients 0, 1, and 2; each client i is accessed through channels Read_i and Write_i]

SLIDE 57

connectivity · multidisciplinarity · change · variability · evolvability · complexity · uncertainty · heterogeneous components

MBT: Next-Step Challenges

model composition · abstraction · under-specification · uncertainty · nondeterminism · concurrency · parallelism · state + complex data · statistical usage profiles · partial specification · multiple paradigms · integration · test selection

Model-Based Testing

SLIDE 58

A Dropbox Model

per client N:  localValN, cleanN, freshN;  global:  serverVal, conflicts

In ? ReadN  →  Out ! localValN
In ? WriteN Vnew  →  Out ! localValN ;  localValN ::= Vnew ;  cleanN ::= False
¬freshN ∧ cleanN  →  DownN :  localValN ::= serverVal ;  freshN ::= True
¬cleanN  →  UpN :  cleanN ::= True ;
    if freshN then ( if localValN ≠ serverVal then freshN′ ::= False for all N′ ≠ N ;  serverVal ::= localValN )
    else if localValN ∉ { serverVal, ⊥ } then conflicts ::= conflicts ∪ { localValN }

TorXakis fragment:

[[ not(lookup(fresh,Node(N))) /\ lookup(clean,Node(N)) ]] =>> Down >-> dropBox [ In,Out,Down,Up ] ( serverVal , conflicts , update(localVal,Node(N),serverVal) , update(fresh,Node(N),True) , clean )

HIDE [Up,Down] IN dropbox NI
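The update rules above can be read as plain Python. A sketch (the grouping of the rules into methods and the simplified conflict test are assumptions of this sketch, not taken from the slide):

```python
# Each client N keeps localValN / cleanN / freshN; the server keeps
# serverVal and the set of conflicts.

BOTTOM = None        # "no file yet" (the model's bottom value)

class DropboxModel:
    def __init__(self, n_clients):
        self.server = BOTTOM
        self.conflicts = set()
        self.local = [BOTTOM] * n_clients
        self.clean = [True] * n_clients
        self.fresh = [True] * n_clients

    def read(self, n):               # In ? ReadN  ->  Out ! localValN
        return self.local[n]

    def write(self, n, v):           # In ? WriteN Vnew
        self.local[n] = v
        self.clean[n] = False

    def down(self, n):               # enabled when not freshN and cleanN
        if not self.fresh[n] and self.clean[n]:
            self.local[n] = self.server
            self.fresh[n] = True

    def up(self, n):                 # enabled when not cleanN
        if self.clean[n]:
            return
        self.clean[n] = True
        if self.fresh[n]:
            if self.local[n] != self.server:
                self.fresh = [i == n for i in range(len(self.fresh))]
                self.server = self.local[n]
        elif self.local[n] not in (self.server, BOTTOM):
            self.conflicts.add(self.local[n])
```

For example: client 0 writes "X" and syncs up, client 1 syncs down and then reads "X"; a concurrent stale write by client 1 ends up in `conflicts`.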

SLIDE 59

MBT

[diagram: system model → model-based test generation → test cases (TTCN) → test execution → SUT; verdict pass / fail]

SLIDE 60

TorXakis: On-the-Fly MBT

[diagram: system model → TorXakis → SUT; verdict pass / fail]

on-the-fly MBT: test generation and test execution are combined in a single tool run

SLIDE 61

TorXakis: Dropbox

[test setup: TorXakis, with dropbox model.txs, connected via adapters to Dropbox Client 0 (VM 0), Dropbox Client 1 (VM 1), and Dropbox Client 2 (VM 2), all synchronizing with the Dropbox Server; verdict pass / fail]
SLIDE 62

Dropbox Test Run

TXS >> ..11: IN:  In0 ! Write(Value("X"))
TXS >> ..12: OUT: Out0 ! File(Value("SHK"))
TXS >> ..13: IN:  In2 ! Write(Value("A"))
TXS >> ..14: OUT: Out2 ! File(Value("$"))
TXS >> ..15: IN:  In2 ! Write(Value("SP"))
TXS >> ..16: OUT: Out2 ! File(Value("A"))
TXS >> ..17: IN:  In1 ! Write(Value("BH"))
TXS >> ..18: OUT: Out1 ! File(Value("P"))
TXS >> ..19: IN:  In2 ! Read
TXS >> ..20: OUT: Out2 ! File(Value("SP"))
TXS >> ..21: IN:  In0 ! Read
TXS >> ..22: OUT: Out0 ! File(Value("X"))
TXS >> ..23: IN:  In2 ! Write(Value("PXH"))
TXS >> ..24: OUT: Out2 ! File(Value("X"))

SLIDE 63

Dropbox Test Run

TXS >> ..77: IN:  In0 ! Stabilize
TXS >> ..78: OUT: Out0 ! File(Value("P"))
TXS >> ..79: OUT: Out0 ! File(Value("L"))
TXS >> ..80: OUT: Out0 ! File(Value("TK"))
TXS >> ..81: OUT: Out0 ! File(Value("P"))
TXS >> Expected: ( { Out0[ $"Out0"$1266 ] } , ( IF isFile($"Out0"$1266) THEN isValueInList(["PH","H"],value($"Out0"$1266)) ELSE False FI ) )
TXS >> FAIL: Out0 ! File(Value("P"))

SLIDE 64

Model-Based Testing: The Next Step in Test Automation!?

[diagram: model → TorXakis → SUT; verdict pass / fail]

New software-testing methodologies are needed if testing is to keep up with software development. Model-based testing may be one of them.

https://github.com/TorXakis