Model-Based Testing

Alexander Pretschner TU Kaiserslautern and Fraunhofer IESE Saarbrücken, 31/05/2010


Motivation

► The oracle problem
► Automatically deriving tests that include fine-granular expected output information: more than robustness testing
► Specifications (expected output) tend to be bad
► Common “methodologies” for deriving test cases are, because of their level of abstraction, not too helpful
► “Build partitions”—but that’s the nature of the beast
► Process of deriving tests not reproducible and not systematic; bound to the ingenuity of single engineers


Overview

► Motivation
► Models and Abstraction
► Scenarios
► Selection Criteria
► Generation Technology
► Cost Effectiveness and Evidence
► Summary


Goal of Today’s Class

► Understand the ideas of model-based testing
► Understand where you have to think about its deployment
► Know what it can do and what it can’t
► Know where automation is likely to be possible and where it is not
► Be able to, in principle, conceive a set-up for model-based testing in your context
► Decide on abstraction, build the model, decide on test selection criteria, perform test case generation, execute the generated tests, judge what you did
► Clearly, that’s domain-specific


Testing

[Figure: in conventional testing, test cases are derived from an understanding of the specification, a mental model, and run against the system in its environment.]


Model-Based Testing

[Figure: an explicit behavior model, validated against the requirements, plus a test case specification (e.g., AG ϕ⇒ψ) yields test cases; verification asks whether the model’s output equals the system’s output in its environment.]


Test Generation and Execution

[Figure: a test case (steps 1–4) is derived from the model and run against the system; test execution compares the outputs of model and system step by step.]


Levels of Abstraction

[Figure: abstract test cases are concretized (γ) on the input side, and the system’s outputs are abstracted (α) before comparison against the model’s outputs; the complexity is distributed between model and driver. Test case specification (AG ϕ⇒ψ), system, and environment as before.]
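To make the distribution of complexity concrete, here is a minimal Python sketch of such a driver; all names (gamma, alpha, sut_send, run_test) are hypothetical, and the command encoding follows the chip card example on the next slide.

```python
# Minimal sketch of a test driver: gamma concretizes abstract inputs
# into APDUs, alpha abstracts concrete responses so that comparison
# with the model's expected outputs happens at the abstract level.
# All names and encodings are hypothetical.

def gamma(abstract_cmd):
    """Concretization: abstract command -> concrete APDU (byte list)."""
    if abstract_cmd == ("AskRandom", 19):
        return [0x81, 0x84, 0x00, 0x00, 0x13]      # 0x13 = 19 bytes requested
    raise ValueError(f"no concretization for {abstract_cmd}")

def alpha(response):
    """Abstraction: concrete APDU response -> model-level output."""
    if response[-2:] == [0x90, 0x00]:              # status word: success
        return ("ResRand", len(response) - 2)      # e.g., 19 random bytes
    return ("Error", tuple(response[-2:]))

def run_test(test_case, sut_send):
    """test_case: (abstract input, expected abstract output) pairs
    generated from the model; sut_send sends an APDU to the card
    and returns its response."""
    for abstract_in, expected in test_case:
        observed = alpha(sut_send(gamma(abstract_in)))
        assert observed == expected, f"{abstract_in}: {observed} != {expected}"
```

Note how the card’s actual random bytes are abstracted away: the model cannot predict them, so the comparison is on length and status word only.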


Levels of Abstraction: Example

[Figure: test cases pass through a concretization layer on the way to the card and a comparison layer on the way back. Example: the abstract command “AskRandom(19)” is concretized to the APDU << 81 84 00 00 13 >>; the response << 12 47 A4 A8 E5 38 62 6F 09 22 83 22 B9 3E F2 3F 5E 85 60 90 00 >> is abstracted back to ResRand(19). Card-specific data (keys, PINs) is needed on both levels.]

Slide: Jan Philipps


Example II: Autonomous Parking

[Figure: the concrete parking functionality vs. its abstraction for testing: don’t enter the collision area.]

Taken from Buehler, Wegener: Evolutionary Functional Testing of an Automated Parking System, CCCT’03


Flavors of Model-Based Testing

Utting, Pretschner, Legeard: A taxonomy of MBT, technical report 04/2006, University of Waikato, May 2006


Difficult Questions

► What is modeled? How are models validated?
► What is tested, and how is this specified?
► How are test cases computed and executed?
► Do explicit behavior models yield better and cheaper products?
► Or is it better to just define test cases?
► E.g., test cases in XP serve as specification
► Aren’t reviews or inspections more efficient and effective?


Overview

► Models
► Scenarios
► Selection Criteria
► Generation Technology
► Cost Effectiveness and Evidence
► Summary


[Figure: the model-based testing overview diagram from above, repeated: explicit behavior model, test case specification (AG ϕ⇒ψ), test cases, validation, verification by comparing the model’s output with the system’s output.]


Implementation and Environment

► Models of (partial) environment often necessary
► SW almost always based on assumptions (⇒ integration/system tests)
► Simulation, test case generation


Abstraction: Models of SUT and Environment

Utting, Pretschner, Legeard: A taxonomy of MBT, technical report 04/2006, University of Waikato, May 2006


Purpose of Abstractions

► Insights into a system
► Specification
► Encapsulated access to parts of a system
► Communication among developers
► Code generation
► Test case generation
► …


One: Models encapsulate Details

► Like “abstractions” in programming languages: subroutines, exceptions, garbage collection, Swing
► No or “irrelevant” loss of information
  • “macro expansion”
  • Example: MDA for communication infrastructure
► Separation of concerns, orthogonality
► Matlab-Simulink-like
► Block diagrams: architecture and behavior
► 1:1 representation of a differential equation
► Encapsulation of concrete computation
► Helpful for MBT but not sufficient if validation of model is done by simulation only
► Is it easier to test a Java program than to test the corresponding bytecode?


Two: Models omit Details

► Simplification with “relevant” loss of information
► Intellectual mastery; “refinement”
► “Complexity essential, not accidental” [Brooks’87]
► Functionality, Data, Scheduling, Communication, Performance


Abstractions I

► Function
  ► Restriction to a particular function(ality)
  ► Detection of feature interactions?
► Data
  ► No loss of information: binary numbers → integers
  ► Loss of information: equivalence classes → 1 symbol
► Communication
  ► ISO/OSI stack: complex interaction at bottom → 1 (inter-)action above
  ► Corba, J2EE


Abstractions II

► Time (more general: QoS)
► Ignore physical time; nondeterministic timeouts
► Granularity of time
► Permutations of sequences of signals (underspecification in the model)
► Implies natural restrictions w.r.t. tests


Levels of Abstraction

► Model as precise as SUT—directly validate SUT!
► Reuse of model components?
► Validate integrated model
► Reuse of environment models?
► Directly test SUT
► Parametrization of the model?
► Informal inductive argument
► One model as reference implementation?
► Conformance tests—why not directly use test cases?


Behavior Models

► Executability helps with validation
► Prototypes
► Some disagree: carrying out proofs is much better for validation
► Behavior models need not be executable
► E.g., specification of a sorted array
► Quantifiers very powerful modeling abstractions
► Many specification styles; many boil down to pre- and postconditions
► “Declarative” rather than “operational”
► Doesn’t impact our analysis of model-based testing


So what?

► Encapsulation helpful if model is to be reviewed (not simulated/tested)
► But models for test case generation must be written down
  ► Appropriate languages
  ► SUT and environment
► Models “better” since “simpler”
  ► But complexity essential, not accidental
  ► Missing information must be given by a human
► Simplifying models for test case generation rather than for code generation!


Example – Part I

► Chip card
► Components encapsulate behavior and private data state
► Communication exclusively via channels
► Structure motivated by functional decomposition

Philipps et al., Model-based Test Case Generation for Smart Cards, Proc. FMICS’03


Example – Part I

► Behavior of one CardHolderVerification component
► Wrong PIN increases PIN counter
► Max PIN counter → card blocked
► Extended Finite State Machine (a minimal sketch follows below)
► Transitions: i?X ∧ γ / o!Y ∧ α (input and guard / output and action)
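As an illustration, here is a minimal Python sketch of such an EFSM; the state names, the counter limit, and the PIN value are hypothetical simplifications, not the actual case-study model.

```python
# Sketch of the CardHolderVerification EFSM: a wrong PIN increments
# the PIN counter; reaching the maximum blocks the card.
# All names and values are hypothetical.

MAX_TRIES = 3
CORRECT_PIN = "1234"          # card-specific data; hypothetical value

class CardHolderVerification:
    def __init__(self):
        self.state = "idle"   # idle | verified | blocked
        self.counter = 0      # PIN counter

    def verify(self, pin):
        """Transition on input VerifyPIN(pin); returns the abstract output."""
        if self.state == "blocked":
            return "ResBlocked"
        if pin == CORRECT_PIN:
            self.state, self.counter = "verified", 0
            return "ResOK"
        self.counter += 1                    # wrong PIN increases PIN counter
        if self.counter >= MAX_TRIES:        # max PIN counter -> card blocked
            self.state = "blocked"
            return "ResBlocked"
        return "ResWrongPIN"
```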


Example – Part I

► Environment models
► Restrict possible input/output


Example – Part I – Abstraction

► Function: rudimentary file system
► Random numbers: “rnd”
► No actual computation of crypto operations
► Driver
► Abstract commands
► No testing at the level of corrupt APDUs
► Done separately
► No hardware-based attacks


Example – Part I – Abstraction

[Figure: test sequences pass through concretization and comparison layers. Example: the abstract command “PSOVerifyDigSig” — PSOVerifyDigSig(SigCA) with expected response ResVerifyDigSig(KeyPubCA, DigCA, SigCA) — is concretized to the APDU << 81 2A 00 A8 83 9E 81 ... (signature of CA) >>, answered by << 90 00 >>. MSE: public key and digest of CA. Card-specific data (keys, PINs) is needed on both levels.]

Slide: Jan Philipps


Overview

► Models
► Scenarios
► Selection Criteria
► Generation Technology
► Cost Effectiveness and Evidence
► Summary


[Figure: the model-based testing overview diagram from above, repeated.]


Scenario I: Tests and Code generated from 1 Model

[Figure, Scenario I: requirements and environment assumptions feed one model (AG ϕ⇒ψ). A code generator derives the code (running on HW, OS, legacy components), and generation from the same model together with test case specs derives the test cases (abstraction/concretization α/γ).]


Discussion: One Model for Both

► Generation: no redundancy → no verification
► “Exceptions” don’t occur—model is valid, generator as well (or is it?)
► Tests for
  ► Code generators (simulation and production)—MDD
  ► Assumptions on the environment
  ► Possibly performance/stress
  ► Exceptions
► Models valid → that‘s alright!
► Different flavor of MBT
► No “double check” model ⇔ implementation
► Abstraction levels
  ► Test and development models
  ► Model as basis for manual implementation


Scenario II: Two Models

[Figure, Scenario II: from the requirements, two models are built. One model is the basis for generation or manual construction of the code (HW, OS, legacy); the second, together with test case specs (AG ϕ⇒ψ), is the basis for generating the test cases (α/γ). Redundancy between the two models.]


Discussion: Two Models

► Expensive
► Redundancy
► Different levels of abstraction
► Both tests and code profit from the (alleged) advantages of model-based development
► Precise specifications
  ► Car manufacturers and suppliers
  ► Behavior models lead to better specifications
  ► Model alone no (good) specification


Scenario III: Model only for TC Generation

[Figure, Scenario III: the code is built manually from the requirements/specification (HW, OS, legacy); a model (AG ϕ⇒ψ) is built only for test case generation and, together with test case specs, yields the test cases (α/γ). Redundancy between model and code.]


Discussion: model for test only

► Redundancy
► Expensive; concentration on critical parts possible (?)
► Interleaving code/model with changing requirements
► Specification doesn’t profit from benefits of model-based development
► Assessment of new model-based testing technology
► “Conformance” tests: suppliers must show adherence to model
► Scenario of our running chip card example


Scenario IV: Model Extraction from Code

[Figure, Scenario IV: the code is built manually from the requirements/specification; a model is extracted from the code and, together with test case specs, used to generate the test cases (α/γ). Possible redundancy only.]


Discussion: Model Extraction

► Abstractions always bound to purpose and domain: automation?
► Automatic generation: redundancy?
► Interleaving code/model?
► Ex-post development of tests
► Assessment of new generation technology with manual extraction
► Tests for “exception/no exception” possible


Continuous Testing

► Assume execution and analysis of tests come at no cost
► Generation of tests in the background
► Execution of tests in the background
► Abstraction level possibly exceptions/no exceptions
► Maturity of software
► Too many detected errors → tedious analysis
► Embedded systems
► Execution takes time
► Simulators
► Business information systems are different


Summary I

► 1 model for both
  ► No redundancy, no double check
  ► “Test models” different from “development models”
  ► Cf. argument on using abstract models
► 2 distinct models
  ► Redundancy
  ► Expensive
  ► Different levels of abstraction possible


Summary II

► 1 model for tests
  ► Redundancy
  ► Changing requirements: interleaving model and code development?
  ► OEM builds model, suppliers have to conform to it
► 1 model from code
  ► Redundancy?
  ► Ex-post development of test cases only
► [Pretschner’05]


And in the real world?

► Model-based testing in the hardware industry
  ► Need for redundancy is acknowledged
► Reluctance in the SW industry!
► Stochastic testing: reliability engineering
► Continuous systems in Matlab: test code generators
► Models primarily built for test case generation: stage of case studies
► For SW, I haven’t encountered the situation where two distinct models are built ($$$)
► Generating tests to validate models is rather common


Overview

► Models
► Scenarios
► Selection Criteria
► Generation Technology
► Cost Effectiveness and Evidence
► Summary


[Figure: the model-based testing overview diagram from above, repeated.]


Test Purpose and Test Case Specification

► Familiar problem …
► Irrelevant if model-based or not
► Test cases: selected “relevant” traces
► What’s “relevant”? What’s “good”?
► Test purpose informal, TC spec formal


Test purpose, TC specification, test case

► TC spec. formalizes test purpose and renders it operational
  ► E.g., an invariant cannot directly be tested (illustrated below)

[Figure: a test purpose (informal) is formalized into a TC specification (intensional), which selects concrete test cases (extensional); this parallels the chain requirements spec → specification → implementation.]
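To illustrate the three levels, here is a hedged sketch that reuses the CardHolderVerification sketch from the example slides: the informal test purpose “the card must block after too many wrong PINs” corresponds to an invariant that cannot be executed directly, and to an operational TC specification, a predicate over finite traces, that a generator can search for.

```python
# The invariant, e.g. AG (counter = MAX_TRIES -> state = blocked),
# quantifies over all reachable states and cannot be run as a test.
# The TC specification below renders the test purpose operational.
# Reuses the hypothetical CardHolderVerification sketch from above.

def tc_spec(trace):
    """Intensional: select traces whose execution ends with a blocked card."""
    m = CardHolderVerification()
    return [m.verify(pin) for pin in trace][-1] == "ResBlocked"

# Extensional: one concrete test case satisfying the specification.
assert tc_spec(["0000", "0000", "0000"])   # three wrong PINs
```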


Selection Criteria

[Figure: classification of selection criteria: functional, structural, stochastic, fault-based, ad-hoc.]


Summary

► Functional criteria
  ► Specific to domain or application; requirements
  ► Methodological support
► Structural criteria
  ► Independent of domain
  ► Data flow, control flow, data
  ► Automatic generation of TC specs and test cases
  ► Measurable (see the sketch below)
  ► Ability to reveal faults unclear
  ► Models of SUT and environment
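As a hedged illustration of measurability, the sketch below runs a test suite against the CardHolderVerification sketch from the example slides and collects the exercised transitions; a transition-coverage criterion would relate this set to the model’s full transition relation.

```python
# Structural criterion made measurable: which (source state, input
# class, target state) transitions does a test suite exercise?
# Reuses the hypothetical CardHolderVerification and CORRECT_PIN.

def transitions_covered(suite):
    covered = set()
    for trace in suite:
        m = CardHolderVerification()
        for pin in trace:
            src = m.state
            kind = "ok" if pin == CORRECT_PIN else "wrong"
            m.verify(pin)
            covered.add((src, kind, m.state))
    return covered

suite = [["0000", "0000", "0000"], ["1234"]]
print(transitions_covered(suite))
# {('idle', 'wrong', 'idle'), ('idle', 'wrong', 'blocked'),
#  ('idle', 'ok', 'verified')}
```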


Summary II

► Stochastic criteria
  ► Uniform distributions: “purely at random”
  ► User profiles
  ► In general, not “worse” than structural criteria
► People tend to agree that there’s not one single good criterion!


Test Case Generation

► Search problem
► Techniques
  ► Dedicated algorithms for dedicated criteria
  ► (Bounded) model checking
  ► Deductive theorem proving
  ► Symbolic execution
  ► [Lucio’05]


Search Problem

► Enumerate traces and select w.r.t. TC specification
► Respect constraints during enumeration
► Functional criteria
► General problem: find traces that cover edges/nodes/special data values in the control flow and data flow graphs
► Structural criteria
► Directed/heuristic search
► Often, it is a good idea not to visit states twice
► State storage (see the sketch after this list)
► Minimization of test suites not covered today
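A hedged, self-contained sketch of this search view: breadth-first enumeration of input traces over the PIN model, encoded here as a plain transition function, with stored states so that no state is expanded twice; the TC specification is the selection predicate. The input alphabet and all names are hypothetical.

```python
from collections import deque

def step(state, inp):
    """(mode, counter) x input -> successor state. Hypothetical model."""
    mode, counter = state
    if mode == "blocked":
        return state
    if inp == "pin_ok":
        return ("verified", 0)
    counter += 1
    return ("blocked", counter) if counter >= 3 else ("idle", counter)

def generate(tc_spec, inputs=("pin_ok", "pin_wrong"), max_depth=6):
    """Return the shortest trace satisfying tc_spec, or None."""
    start = ("idle", 0)
    queue, visited = deque([(start, [])]), {start}
    while queue:
        state, trace = queue.popleft()
        if tc_spec(state, trace):
            return trace                  # BFS: shortest witness first
        if len(trace) >= max_depth:
            continue
        for inp in inputs:
            succ = step(state, inp)
            if succ not in visited:       # don't visit states twice
                visited.add(succ)
                queue.append((succ, trace + [inp]))
    return None

# Usage: find a trace that blocks the card.
print(generate(lambda s, t: s[0] == "blocked"))
# ['pin_wrong', 'pin_wrong', 'pin_wrong']
```

Note the trade-off: pruning visited states keeps the search small but may miss traces that a purely trace-based specification would require; dropping the visited set turns this into exhaustive bounded enumeration.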


Overview

► Models
► Scenarios
► Selection Criteria
► Generation Technology
► Cost Effectiveness and Evidence
► Summary


Assumptions

► Effectiveness and cost effectiveness
► Models help with getting requirements/specs straight
► Test suite vs. model: creation and maintenance
► Existence of adequate level of abstraction
► Abstraction and precision
► Easy model validation and maintenance
► Distribution of complexity
► Reuse
► Simpler changes in the model (plus push button)
► Adaptor and environment models/TC specifications


Evidence: (Cost) Effectiveness

► “Model-Based Testing does find errors”
  ► Different/more errors in SUT?
  ► Farchi et al. ’02, Pretschner et al. ’05
  ► Except for the last study: no precise description of the reference
  ► Ongoing dispute on comparison with reviews
  ► Errors in model or specs
► Cost effectiveness
  ► Farchi et al. ’02, Bernard et al. ’04, Sinha et al. ’06
  ► “Building tests took less time”
► In sum: hard to admit, but very little evidence!
► But: there is no empirical evidence about the benefits of OO software either


Coverage?

► Unsettled discussion on benefits of structural criteria
► Inconclusive studies on both control and data flow
► Not surprisingly, using such a criterion “leads to failures that would have gone undetected”
► DO-178B recommends MC/DC for level A software
► Unclear if things change when used on specifications
► People agree: structural tests complement functional tests


Empirical Evidence

► Compare any “new” approach to random tests and “traditionally developed tests”
► Homogeneous systems?
  ► Domain
  ► Stage of development
  ► Programming language
  ► Skills of programmers
  ► Complexity
► As always: generalization?!


(Personal) Summary and Gut Feel

► Don’t rely on structural criteria only!
► Large state spaces, big problems, anyway!
► Abstract models for testing for exceptions might be cost-effective
► Run tests in the background
► Continuous testing if at no cost
► Model-Based Testing does find additional failures
► But it’s not entirely clear if these wouldn’t also have been found as a result of carefully studying the specs
► Model in itself definitely helps (XP: tests are spec/model)
► Not necessarily automated generation
► Plenty of other low-level problems in the real world


Overview

► Models
► Scenarios
► Selection Criteria
► Generation Technology
► Cost Effectiveness and Evidence
► Summary


Summary

► Model of SUT and environment at different levels of abstraction
► Abstraction compulsory
► Oracle
► Possibly automated test generation with environment model (statistical testing; structural criteria on encoded scenarios) and structure of the model of the SUT
► But we still need to tell the machine what a good test consists of!
► Different scenarios
► Different generation technologies
► As usual, little evidence …


My Personal Bottom Line

► Go for it! I do eat my own cooking!
► Don’t use it to write a script; model a stack?
► Use of models beyond testing important
  ► Specifications, contracts for suppliers/OEM
  ► Cost-effectiveness unlikely if nobody uses models anyway
► Different levels of abstraction are acceptable
► Not so sure about automation
► Enforcement of test rationales can help tremendously
  ► Use knowledge on earlier failures; user profiles


Literature

► M. Broy, B. Jonsson, J.-P. Katoen, M. Leucker, A. Pretschner, “Model-Based Testing of Reactive Systems”, Springer Verlag, 2005
► S. Sandberg, “Homing and Synchronizing Sequences”, chapter 1 in [Broy et al.’05]
► M. Krichen, “State Identification”, chapter 2 in [Broy et al.’05]
► H. Björklund, “State Verification”, chapter 3 in [Broy et al.’05]
► A. Gargantini, “Conformance Testing”, chapter 4 in [Broy et al.’05]
► A. Pretschner, J. Philipps, “Methodological Issues in Model-Based Testing”, chapter 10 in [Broy et al.’05]
► L. Lucio, M. Samer, “Technology of Test-Case Generation”, chapter 12 in [Broy et al.’05]
► A. Pretschner, W. Prenninger, S. Wagner, C. Kühnel, M. Baumgartner, B. Sostawa, R. Zölch, T. Stauner, “One Evaluation of Model-Based Testing and its Automation”, Proc. ICSE 2005, pp. 392–401, 2005
► E. Farchi, A. Hartman, S. S. Pinter, “Using a model-based test generator to test for standard conformance”, IBM Systems Journal 41(1):89–110, 2002
► E. Bernard, B. Legeard, X. Luck, F. Peureux, “Generation of test sequences from formal specifications: GSM 11.11 standard case-study”, Software Practice and Experience 34(10):915–948, 2004
► A. Sinha, C. Williams, P. Santhanam, “A measurement framework for evaluating model-based test generation tools”, IBM Systems Journal 45(3):501–514, 2006