Metamorphic Testing: A Simple Method for Testing Non-Testable - - PowerPoint PPT Presentation



SLIDE 1

Metamorphic Testing: A Simple Method for Testing Non-Testable Programs

Tsong Yueh Chen

tychen@swin.edu.au
Swinburne University of Technology, Australia

SLIDE 2

Test Oracle

  • A mechanism or procedure against which the computed outputs could be verified

SLIDE 3

Example

To find the roots of the following polynomial:

x**100 + 3*(x**99) – x**98 + … + 5

Suppose the solutions for x are: 2.0, 6.5, …
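As a sketch of what a (partial) oracle would require here, a computed root can be checked by substituting it back into the polynomial and testing for a near-zero result. The quadratic below is a hypothetical stand-in for the degree-100 case, chosen so its roots really are 2.0 and 6.5:

```python
def poly(coeffs, x):
    # Horner evaluation; coeffs run from the highest degree down to the constant.
    acc = 0.0
    for c in coeffs:
        acc = acc * x + c
    return acc

coeffs = [1.0, -8.5, 13.0]          # x^2 - 8.5x + 13 = (x - 2.0)(x - 6.5)
reported_roots = [2.0, 6.5]

# Partial oracle: each reported root should make the polynomial (numerically) zero.
print(all(abs(poly(coeffs, r)) < 1e-6 for r in reported_roots))  # → True
```

For a genuine degree-100 polynomial this back-substitution is still cheap, but deciding how close to zero is "close enough" under floating point is itself part of the oracle problem the slide is pointing at.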

SLIDE 4

Example

  • sin function

– sin(0°) = 0
– sin(30°) = 0.5

  • Suppose the program returns:

– sin(29.8°) = 0.51234: incorrect
– sin(29.8°) = 0.49876: correct?

SLIDE 5

Example

  • Shortest path problem SP(G, a, b)
  • Suppose the program returns:

– |SP(G, a, a)| = 5: incorrect
– |SP(G, a, b)| = 10, where a and b are neighbours
– |SP(G, a, b)| = 1,000,001: correct or incorrect?

SLIDE 6

Non-Testable Programs

  • Absence of test oracle
  • Too expensive to apply test oracle

SLIDE 7

Intuition of Metamorphic Testing

Though we do not know the correctness of the output for any individual input, we may know the relation between some related inputs and their outputs.

SLIDE 8

Example

  • Suppose sin(29.8°) returns 0.49876
  • The sin function has the following property

– sin(x) = sin(x + 360°)

  • Compute 29.8° + 360° = 389.8°
  • Execute the program with 389.8° as an input
  • Check whether sin(29.8°) = sin(389.8°)
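The steps above can be sketched in Python; `sin_deg` stands in for a hypothetical program under test that takes its angle in degrees:

```python
import math

def sin_deg(x):
    # Hypothetical program under test: sine of an angle given in degrees.
    return math.sin(math.radians(x))

def satisfies_period_mr(x, tol=1e-9):
    # MR: sin(x) = sin(x + 360°).
    # Source test case: x; follow-up test case: x + 360.
    return abs(sin_deg(x) - sin_deg(x + 360.0)) <= tol

print(satisfies_period_mr(29.8))  # → True for a correct implementation
```

Note that the check never needs the true value of sin(29.8°); it only compares the two outputs against each other, which is exactly what makes it usable without an oracle.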

SLIDE 9

Metamorphic Testing (A Simplified Form)

  • Define source (initial) test cases using some test case selection strategies
  • Identify some properties of the problem (referred to as the metamorphic relations)
  • Construct follow-up test cases from the source test cases with reference to the identified metamorphic relations
  • Verify the metamorphic relations
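The four steps above can be sketched as a small generic harness; the names (`mt_test`, `make_followup`, `relation`) are invented for illustration, not part of any MT tool:

```python
import math
import random

def mt_test(program, make_followup, relation, sources):
    """Minimal metamorphic-testing loop (a sketch).

    program:       the function under test
    make_followup: builds a follow-up input from a source input
    relation:      checks the MR between source and follow-up outputs
    """
    violations = []
    for x in sources:
        y = make_followup(x)                      # construct follow-up test case
        if not relation(program(x), program(y)):  # verify the MR
            violations.append((x, y))
    return violations

# Source test cases chosen randomly (one possible selection strategy).
random.seed(0)
sources = [random.uniform(-10, 10) for _ in range(100)]

# MR: sin(x) = sin(x + 2*pi).
bad = mt_test(math.sin,
              lambda x: x + 2 * math.pi,
              lambda out1, out2: abs(out1 - out2) < 1e-9,
              sources)
print(len(bad))  # → 0 for a correct sin
```

Any MR violation is a revealed failure; satisfaction of the MR, as with any testing, does not prove correctness.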

SLIDE 10

Categories of Research in MT

  • Applications of MT to specific application domains with the oracle problem
  • Integration of MT with other software analysis and testing methods
  • Theory for MT

SLIDE 11

Applications of MT

SLIDE 12

Successful Applications of MT

  • Bioinformatics programs
  • Embedded systems
  • Machine learning software
  • Optimization systems
  • Compilers
  • Simulation systems
  • ….

SLIDE 13

Interesting Results

Revealed previously undetected faults

  • Siemens suite
– print_token, schedule, and schedule_2
  • Compilers
  • Machine learning tool – Weka
  • …

SLIDE 14

A Recent Paper

  • Compiler Validation via Equivalence Modulo Inputs, V. Le, M. Afshari and Z. Su, Proceedings of the 35th ACM SIGPLAN Conference on Programming Language Design & Implementation (PLDI ’14), 216–226, 2014. Best Paper Award

SLIDE 15

Testing Compilers

Their testing method is an MT method. Its MR is:

If programs P and P’ are equivalent with respect to input I, then their object codes are equivalent with respect to I.

http://blog.regehr.org/archives/1161

SLIDE 16

A new test case selection method!

SLIDE 17

Three Recent Papers

  • Metamorphic Model-Based Testing Applied on NASA DAT – An Experience Report, M. Lindvall, D. Ganesan, R. Ardal and R. E. Wiegand, ICSE 2015, 129-138.
  • Research Directions for Engineering Big Data Analytics Software, C. E. Otero and A. Peter, IEEE Intelligent Systems, 14-19, January/February 2015.
  • A Methodology for Validating Cloud Models Using Metamorphic Testing, A. Nunez and R. M. Hierons, Annals of Telecommunications, Vol. 70(3), 127-135, 2015.

SLIDE 18

Integration with Other Methods

SLIDE 19

Example: Spectrum Based Fault Localization

  • A statistical approach to predict how likely a program entity is faulty
  • Intuition

– More likely to be faulty if executed by more failed test cases
– More likely to be faulty if executed by fewer passed test cases

SLIDE 20

SBFL

  • Use a set of test cases with

– testing results of pass or fail
– coverage information on whether a program entity is executed or not

  • Use a formula to predict how likely a program entity is faulty

SLIDE 21

Example

Test suite TS: t1 t2 t3 t4 t5 t6
Program P: statements s1 … s6, with statement-coverage matrix MS (1 = statement executed by the test case, 0 = not) [matrix omitted]
Results RE: p p p p f f (p = pass, f = fail)

SLIDE 22

Example

Test suite TS: t1 t2 t3 t4 t5 t6, with coverage matrix MS and results RE: p p p p f f
For each statement s_i, A_i = <a_ef, a_ep, a_nf, a_np>: the numbers of failed and passed test cases that execute s_i (a_ef, a_ep), and that do not (a_nf, a_np)
MA: the matrix of these counts for s1 … s6 [matrix omitted]

SLIDE 23

  • Tarantula
  • Risk values
  • Ranking list

R_T(s_i) = (a_ef / (a_ef + a_nf)) / (a_ef / (a_ef + a_nf) + a_ep / (a_ep + a_np))

Ranking of s1 … s6 by risk value [risk values omitted]: s5, s4, s1, s6, s2, s3
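The Tarantula computation can be sketched as follows. The coverage counts are made-up illustrative data, not the slide's example (the deck's own matrix is a placeholder above):

```python
def tarantula(a_ef, a_nf, a_ep, a_np):
    # Tarantula risk value for one statement from its SBFL counts:
    # a_ef/a_ep = failed/passed tests that execute it; a_nf/a_np = those that don't.
    fail_ratio = a_ef / (a_ef + a_nf) if a_ef + a_nf else 0.0
    pass_ratio = a_ep / (a_ep + a_np) if a_ep + a_np else 0.0
    denom = fail_ratio + pass_ratio
    return fail_ratio / denom if denom else 0.0

# Hypothetical counts: statement -> (a_ef, a_nf, a_ep, a_np).
counts = {"s1": (2, 0, 3, 1), "s2": (1, 1, 4, 0), "s3": (0, 2, 2, 2)}

# Rank statements by risk value, most suspicious first.
ranking = sorted(counts, key=lambda s: tarantula(*counts[s]), reverse=True)
print(ranking)  # → ['s1', 's2', 's3']
```

s1 is executed by every failed test and only some passed tests, so it tops the ranking; s3 is executed by no failed test and drops to the bottom.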

SLIDE 24

SBFL

An Open Problem:

How can SBFL be applied to non-testable programs?

SLIDE 25

Integration of SBFL with MT

  • test case – metamorphic test group
  • pass or failure of a test case – satisfaction or violation of a metamorphic test group
  • coverage by a test case – coverage by a metamorphic test group

SLIDE 26

Integration of SBFL with MT

  • Intuition

– More likely to be faulty if executed by more violated metamorphic test groups
– More likely to be faulty if executed by fewer non-violated metamorphic test groups

SLIDE 27

Example

Metamorphic test groups MTS: g1 g2 g3 g4 g5 g6
Program P: statements s1 … s6, with group-coverage matrix MS [matrix omitted]
Results RE: n v n n v v (n = non-violated, v = violated)

SLIDE 28

Example

MTS: g1 g2 g3 g4 g5 g6, with coverage matrix MS and results RE: n v n n v v
For each statement s_i, A_i = <a_ef, a_ep, a_nf, a_np>: the numbers of violated and non-violated metamorphic test groups that execute s_i, and that do not
MA: the matrix of these counts for s1 … s6 [matrix omitted]

SLIDE 29

  • Tarantula
  • Risk values
  • Ranking list

R_T(s_i) = (a_ef / (a_ef + a_nf)) / (a_ef / (a_ef + a_nf) + a_ep / (a_ep + a_np))

Ranking of s1 … s6 by risk value [risk values omitted]: s5, s1, s6, s4, s2, s3
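The substitution of metamorphic test groups for test cases can be sketched concretely. `buggy_sin` is a hypothetical faulty program (a truncated Taylor series, accurate only near 0), and each group's verdict replaces a test case's pass/fail:

```python
import math

def buggy_sin(x):
    # Hypothetical program under test: truncated Taylor series for sin.
    # Accurate near 0, so it violates periodicity MRs for large inputs.
    return x - x**3 / 6 + x**5 / 120

# Metamorphic test groups: (source input, follow-up builder, relation).
groups = [
    (0.3, lambda x: -x, lambda o1, o2: abs(o1 + o2) < 1e-6),               # sin(-x) = -sin(x)
    (0.3, lambda x: x + 2 * math.pi, lambda o1, o2: abs(o1 - o2) < 1e-6),  # sin(x) = sin(x + 2*pi)
]

# In MT-based SBFL, each group is "violated" or "non-violated" in place of
# a test case's fail/pass verdict; coverage is gathered per group likewise.
verdicts = ["non-violated" if rel(buggy_sin(x), buggy_sin(f(x))) else "violated"
            for x, f, rel in groups]
print(verdicts)  # → ['non-violated', 'violated']
```

The odd-symmetry MR cannot catch this fault (a truncated odd series is still odd), while the periodicity MR does, which is the deck's later point about diverse MRs and white-box knowledge.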

SLIDE 30

Other Successful Integrations

SLIDE 31

One Recent Paper

  • A Methodology for Validating Cloud Models Using Metamorphic Testing, A. Nunez and R. M. Hierons, Annals of Telecommunications, Vol. 70(3), 127-135, 2015.

SLIDE 32

Theory for Metamorphic Testing

SLIDE 33

Metamorphic Testing

  • Some reminders

– MRs are not restricted to identity relations and numeric relations
– Multiple executions are involved
– Follow-up test cases may depend on the outputs of the source test cases
– MT is applicable even if a test oracle exists

SLIDE 34

Metamorphic Relations

SLIDE 35

Metamorphic Relations

  • Identification of MRs
  • Prioritization of MRs
  • Fault Detection Effectiveness of MRs

SLIDE 36

Identification of MRs

  • MT can be automated except for the identification of MRs

SLIDE 37

Identification of MRs

  • Is it feasible to identify or generate MRs?

SLIDE 38

A Simple and Intuitive Approach

  • Select an input
  • Modify it, hoping that the corresponding change of the output is somehow predictable. If yes, can it be generalised? If so, an MR has been identified

SLIDE 39

Identification of MRs

Various approaches

  • Machine learning (Columbia; Colorado State)
  • Data mutation (Oxford Brookes)
  • Coding (Peking)
  • Composition (Swinburne)
  • Category-choice framework (HK Poly; Wuhan)
  • …

SLIDE 40
Generation by Composition

  • Generation of new MRs from existing MRs by composition

SLIDE 41

Example

  • Shortest path problem: SP(G, a, b)
  • Suppose we have the following MRs

– MRA: |SP(G, a, b)| = |SP(G, b, a)|
– MRB: |SP(G, a, b)| = |SP(G’, a’, b’)|

  • By composition, a new MR is defined as

– MRAB: |SP(G, a, b)| = |SP(G’, b’, a’)|
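MRA, MRB and their composition MRAB can be checked concretely with a small Dijkstra implementation. The slide does not define G’, so vertex relabelling is assumed here as one instance of MRB:

```python
import heapq

def sp_length(graph, a, b):
    # Dijkstra over an undirected weighted graph given as an adjacency dict.
    dist = {a: 0}
    pq = [(0, a)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == b:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

# Hypothetical graph; G2 relabels its vertices (our assumed instance of MRB).
G = {"a": {"b": 4, "c": 1}, "b": {"a": 4, "c": 2}, "c": {"a": 1, "b": 2}}
relabel = {"a": "x", "b": "y", "c": "z"}
G2 = {relabel[u]: {relabel[v]: w for v, w in nbrs.items()} for u, nbrs in G.items()}

assert sp_length(G, "a", "b") == sp_length(G, "b", "a")    # MRA: symmetry
assert sp_length(G, "a", "b") == sp_length(G2, "x", "y")   # MRB: relabelling
assert sp_length(G, "a", "b") == sp_length(G2, "y", "x")   # MRAB: composition
print(sp_length(G, "a", "b"))  # → 3 (a → c → b)
```

Composing MRA with MRB gives MRAB for free: any implementation that satisfies both necessarily satisfies their composition, so MRAB adds no new source MRs to identify, only new follow-up test cases.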

SLIDE 42

Prioritization of MRs

Consider sin(x):

MR1: sin(x) = sin(x + 2π)
MR2: sin(x) = −sin(x + π)
MR3: sin(−x) = −sin(x)
MR4: sin(x) = sin(π − x)
MR5: sin(x) = −sin(2π − x)
…

SLIDE 43

Prioritization Approaches

  • Usage profile
  • Algorithm

SLIDE 44

Usage Profile

  • Restricted use of programs – interested in a subset of properties

SLIDE 45

Usage Profile

  • sin(x)
  • Electrical Engineers

– sin(x) = sin(x + 2π)

  • Land Surveyors

– sin(−x) = −sin(x)
– sin(x) = sin(π − x)

SLIDE 46

Algorithm

  • A problem may be solved by more than one algorithm – sorting, adaptive random testing
  • An algorithm may be implemented in different ways

SLIDE 47

Example

  • Shortest Path problem: SP(G, a, b) using forward expansion

[diagram: forward expansion from node A towards node B]

SLIDE 48

Example

  • |SP(G, a, b)| = |SP(G’, a’, b’)|
  • |SP(G, a, b)| = |SP(G, a, c)| +|SP(G, c, b)|
  • |SP(G, a, b)| = |SP(G, b, a)|

SLIDE 49

Fault-Detection Effectiveness

How many MRs are required?

SLIDE 50

Empirical Observation: a few diverse MRs

SLIDE 51

[plot: f(x) = ax^n + bx^(n-1) + … against x]

SLIDE 52

[plot: f(x) against x]

SLIDE 53

Is MT a Black-Box Method?

SLIDE 54

Example

    ! 5 ! 3 ) sin(

5 3

x x x x

SLIDE 55

Example

  • MR1: sin(−x) = −sin(x)
  • MR2: sin(x) = sin(x + 2π)

SLIDE 56

End-User Software Engineering

  • Limited knowledge of testing
  • Inaccessibility of testing tools
  • Need a testing method that is

– easy to learn
– easy to use
– easy to automate

SLIDE 57

End-User Software Engineering

  • Source test case selection strategy – any available technique or test suite; otherwise special values, random or ad hoc selection
  • Selection of MRs –

– usage profile
– end-user’s domain knowledge
– end-user’s code knowledge

SLIDE 58

Diversity

the key underlying concept in test case selection strategies

SLIDE 59

Diversity

  • Success of MT in revealing faults undetected by other conventional testing methods
  • Diverse MRs in MT

SLIDE 60

Diversity

underlying concept in software testing

SLIDE 61

Conclusion: Simplicity

SLIDE 62

Thanks!

SLIDE 63

References:

  • Metamorphic Testing: A Literature Review, S. Segura, A. B. Sanchez and A. Ruiz-Cortes, Technical Report ISA-15-TR-01, University of Seville, 2015.