

SLIDE 1

Automatically Identifying Dead Fields in Test Code by Resolving Method Call and Field Dependency

Presented by Abdus Satter Institute of Information Technology University of Dhaka, Dhaka 1000, Bangladesh

5th International Workshop on Quantitative Approaches to Software Quality

SLIDE 2

Outline

Dec 4, 2017 Automatic Identification of Dead Field 2

 Broad Domain: Test Smell, Dead Field
 Problem Specification
 Literature Review
 Research Question
 Proposed Technique
 Result Analysis
 Conclusion

SLIDE 3

Test Smell

Production Code:

DBContext dbContext = new DBContext();
void login(String userName, String password){
    bool found = dbContext.exists(userName, password);
    if(found == true)
        session.set("logged_in");
}

Test Code:

String conStr = "+++++";
File sampleDataFile;
testLogin(){
    for each(String line: sampleDataFile.readLines()){
        splitStr = line.split(' ');
        assertEqual(login(splitStr[0], splitStr[1]), splitStr[2]);
    }
}

SLIDE 4

Test Smell

Production Code:

DBContext dbContext = new DBContext();
void login(String userName, String password){
    bool found = dbContext.exists(userName, password);
    if(found == true)
        session.set("logged_in");
}

Test Code:

String conStr = "+++++";
File sampleDataFile;
testLogin(){
    for each(String line: sampleDataFile.readLines()){
        splitStr = line.split(' ');
        assertEqual(login(splitStr[0], splitStr[1]), splitStr[2]);
        assertEqual(logout(splitStr[0]), true);
    }
}

Smells highlighted on the slide:
 External Resource
 Poor Explanation
 Assertion
 Unused Code
 Assume that Data Exists in Database
 Testing Multiple Methods

SLIDE 5

Dead Field

Production Code

SLIDE 6

Dead Field

Production Code Setup Method

SLIDE 7

Dead Field

Production Code  Setup Method
Setup Fields: account, facebookAccount, googleAccount

SLIDE 8

Dead Field

Production Code  Setup Method
Setup Fields: account, facebookAccount, googleAccount
Used Field: account

SLIDE 9

Dead Field

Production Code  Setup Method
Setup Fields: account, facebookAccount, googleAccount
Used Field: account
Dead Fields: facebookAccount, googleAccount
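In set terms, this example reduces to a set difference. A minimal sketch, with the field names taken from the slide (the computation is illustrative, not the tool's implementation):

```python
# Dead fields are the fields initialized in the setup method
# that no test method ever uses.
setup_fields = {"account", "facebookAccount", "googleAccount"}
used_fields = {"account"}

dead_fields = setup_fields - used_fields  # set difference
print(sorted(dead_fields))  # ['facebookAccount', 'googleAccount']
```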

SLIDE 10

Impact of Dead Field

 Increases lines of code
 Decreases maintainability
 Makes the code difficult to comprehend
 Increases the amount of production code in the Test Driven Development (TDD) approach

SLIDE 11

Refactoring: improving the design of existing code[1]

 Martin Fowler first introduced the notion of code smells
 Discussed the impact of code smells on software maintainability
 Provided refactoring mechanisms to remove code smells

SLIDE 12

Refactoring Test Code[2]

 van Deursen first introduced the term test smell
 Identified eleven test smells based on the concept of code smell
 Proposed different techniques to remove test smells from test code
 However, no technique was provided to detect dead fields, because this smell had not yet been discovered

SLIDE 13

An Empirical Analysis of the Distribution of Unit Test Smells[3]

 Bavota et al. carried out two studies on 18 software systems with twenty master's students
 Explained the distribution of test smells in test code
 Described the impact of test smells on test code comprehension in a controlled experiment
 However, they did not provide any information regarding dead fields

SLIDE 14

TestQ: Exploring structural and maintenance characteristics of unit test suites[4]

 A tool presented by Bart Van Rompaey for exploring structural and maintenance characteristics of test code
 Eleven different test smells, such as assertionless and eager test, can be identified by the tool
 Uses test smell characteristics for test smell detection
 A visualization feature helps to explore test suites visually
 The tool could not identify dead fields, as no metric was defined for their detection

SLIDE 15

Rule-based assessment of test quality[5]

 A rule-based tool proposed by Stefan Reichhart for assessing test code quality
 The tool parses the source code, analyzes the source tree, detects patterns, and identifies smells
 Pattern detection involved test smell characteristics
 The tool could not detect dead fields, as no rule was provided for them

SLIDE 16

Automated Detection of Test Fixture Strategies and Smells[6]

 Introduced five new test smells:
 Test Maverick
 Dead Fields
 Lack of Cohesion of Test Methods
 Obscure In-Line Setup
 Vague Header Setup
 Implemented a tool named TestHound to detect these smells
 Could identify test smells well, but could not identify dead fields correctly because it did not properly consider field dependency and the usage of setup fields

SLIDE 17

Research Question

 How can a process be developed to identify dead fields automatically by analyzing method calls and field dependencies in test methods?

SLIDE 18

Dead Field Detector (DFD)

 The technique comprises four steps:
 Call Graph Generation
 Data Dependency Graph Generation
 Setup Field Detection
 Dead Field Detection
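A minimal sketch of the four steps on the toy example used in the following slides. The call/read/write tables are hand-built stand-ins for what a parser would extract; they are not the tool's actual data structures:

```python
# Hand-built stand-ins for parser output on the toy example.
CALLS = {"setup": ["m2"], "m2": ["m3"], "m3": [], "m4": []}
READS = {"setup": {"x"}, "m2": set(), "m3": {"y"}, "m4": {"p", "x", "y"}}
WRITES = {"setup": {"x"}, "m2": set(), "m3": set(), "m4": set()}
FIELD_DEPS = {"x": {"p"}}  # from the initializer: int x = p - 3;

def reachable(root):
    """Step 1: methods reachable from root in the call graph."""
    seen, stack = set(), [root]
    while stack:
        m = stack.pop()
        if m not in seen:
            seen.add(m)
            stack.extend(CALLS[m])
    return seen

def closure(fields):
    """Step 2: expand a field set along field-to-field dependencies."""
    out, work = set(fields), list(fields)
    while work:
        for dep in FIELD_DEPS.get(work.pop(), ()):
            if dep not in out:
                out.add(dep)
                work.append(dep)
    return out

# Step 3: setup fields = fields touched by setup and everything it calls.
setup_fields = closure(set().union(
    *(READS[m] | WRITES[m] for m in reachable("setup"))))

# Step 4: fields the test methods use (testDummy -> dummy1 reads x),
# expanded through dependencies; dead fields are the leftover setup fields.
used = closure({"x"})
dead = setup_fields - used
print(setup_fields, used, dead)
```

On this example the sketch reproduces the result shown on slide 38: setup fields {p, x, y}, used setup fields {x, p}, dead field {y}.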

SLIDE 19

Dead Field Detector (DFD)

import java.lib2
import java.lib3

int p = 200;
int x = p - 3;
int y = 20;
int z;

int setup(int a, int b){
    a = lib2.pow(a, 4);
    x = x + a + b;
    return m2(x);
}
int m2(int a){
    return m3() - 2*a;
}
int m3(){
    return y*y - 2*y;
}
int m4(){
    return p + x - y;
}

SLIDE 20

Dead Field Detector (DFD)

Parsing Methods

SLIDE 21

Dead Field Detector (DFD)

Call Graph Generation

Call graph: setup

SLIDE 22

Dead Field Detector (DFD)

Call Graph Generation

Call graph: setup → m2

SLIDE 23

Dead Field Detector (DFD)

Call Graph Generation

Call graph: setup → m2 → m3

SLIDE 24

Dead Field Detector (DFD)

Call Graph Generation

Call graph: setup → m2 → m3; m4 (not called)
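The call-graph step can be approximated by scanning each method body for call sites. A hedged sketch on the running example — the regex matching below is a simplification of real parsing (the authors use JavaParser), and the body strings are hand-copied from the slide:

```python
import re

# Method bodies from the running example (whitespace-normalized).
BODIES = {
    "setup": "a=lib2.pow(a,4); x=x+a+b; return m2(x);",
    "m2": "return m3()-2*a;",
    "m3": "return y*y-2*y;",
    "m4": "return p+x-y;",
}

def callees(body, known):
    # A known method name followed by '(' counts as a call site;
    # library calls like lib2.pow are skipped since 'pow' is not known here.
    return {m for m in known if re.search(rf"\b{m}\s*\(", body)}

call_graph = {name: callees(body, BODIES) for name, body in BODIES.items()}
print(call_graph)  # setup -> m2 -> m3; m4 has no callees and no callers
```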



SLIDE 27

Dead Field Detector (DFD)

Method Call Resolution

Call graph: setup → m2 → m3; m4 (not called)

SLIDE 28

Dead Field Detector (DFD)

Method Call Resolution

Call graph resolved against the full source: setup → m2 → m3; m4 (not reachable from setup)


SLIDE 30

Dead Field Detector (DFD)

After method call resolution, m4 is removed from the analyzed source:

import java.lib2
import java.lib3

int p = 200;
int x = p - 3;
int y = 20;
int z;

int setup(int a, int b){
    a = lib2.pow(a, 4);
    x = x + a + b;
    return m2(x);
}
int m2(int a){
    return m3() - 2*a;
}
int m3(){
    return y*y - 2*y;
}

SLIDE 31

Dead Field Detector (DFD)

Data Dependency Graph Generation

Field nodes: p, x, y, z

SLIDE 32

Dead Field Detector (DFD)

Data Dependency Graph Generation

Highlighted: int x = p - 3;  (x depends on p)
Field nodes: p, x, y, z
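The dependency edges drawn on these slides can be sketched by scanning field initializers for references to other fields. A minimal illustration, with declarations copied from the example (the regex is a stand-in for real AST-based extraction):

```python
import re

FIELDS = {"p", "x", "y", "z"}
DECLARATIONS = ["int p=200;", "int x=p-3;", "int y=20;", "int z;"]

edges = set()  # (field, field it depends on)
for decl in DECLARATIONS:
    m = re.match(r"int\s+(\w+)\s*=\s*(.+);", decl)
    if not m:
        continue  # 'int z;' has no initializer, so it contributes no edge
    lhs, rhs = m.groups()
    for f in FIELDS:
        if re.search(rf"\b{f}\b", rhs):
            edges.add((lhs, f))

print(edges)  # only 'int x = p-3;' references another field
```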

SLIDE 33

Dead Field Detector (DFD)

Data Dependency Resolution

Highlighted: x = x + a + b;  (setup writes x)
Field nodes: p, x, y, z

SLIDE 34

Dead Field Detector (DFD)

Data Dependency Resolution

Highlighted: x = x + a + b; and return y*y - 2*y;  (y is read in m3)
Field nodes: p, x, y, z

SLIDE 35

Dead Field Detector (DFD)

Data Dependency Resolution

Highlighted: int z;  (z is declared but never used)
Field nodes: p, x, y, z

SLIDE 36

Dead Field Detector (DFD)

Data Dependency Graph

Field nodes: p, x, y

Test code:

int testDummy(){
    dummy1();
}
void dummy1(){
    pow(2, x);
}

SLIDE 37

Dead Field Detector (DFD)

Data Dependency Graph

Used setup fields = {x, p}

SLIDE 38

Dead Field Detector (DFD)

Data Dependency Graph

Used setup fields = {x, p}
Dead Fields = {x, p, y} - {x, p} = {y}

SLIDE 39

Experimental Setup

 Implementation Details
 Java
 JavaParser
 Maven
 Experimental Project
 eGit
 130K lines of code, 85 test classes, and 10 modules

SLIDE 40

Result Analysis

[Bar chart: number of detected dead fields, compared across TH, DFD, and MI]

[Bar chart: number of detected setup fields, compared across TH, DFD, and MI]

TH = TestHound   DFD = Dead Field Detector   MI = Manual Inspection

SLIDE 41

Conclusion

 DFD seems to identify more setup fields and dead fields correctly than existing approaches
 Automatic identification reduces time and effort in software maintenance
 In the future, experiments will be performed on industrial projects

SLIDE 42

Thank You


SLIDE 43

References

 [1] Martin Fowler. Refactoring: Improving the Design of Existing Code. Pearson Education India, 1999.
 [2] Arie van Deursen, Leon Moonen, Alex van den Bergh, and Gerard Kok. Refactoring test code. CWI, 2001.
 [3] Gabriele Bavota, Abdallah Qusef, Rocco Oliveto, Andrea De Lucia, and David Binkley. An empirical analysis of the distribution of unit test smells and their impact on software maintenance. In Proceedings of the 28th IEEE International Conference on Software Maintenance (ICSM), pages 56–65. IEEE, 2012.
 [4] Manuel Breugelmans and Bart Van Rompaey. TestQ: Exploring structural and maintenance characteristics of unit test suites. In WASDeTT-1: 1st International Workshop on Advanced Software Development Tools and Techniques, 2008.
 [5] Stefan Reichhart, Tudor Gîrba, and Stéphane Ducasse. Rule-based assessment of test quality. Journal of Object Technology, 6(9):231–251, 2007.
 [6] Michaela Greiler, Arie van Deursen, and Margaret-Anne Storey. Automated detection of test fixture strategies and smells. In Proceedings of the Sixth International Conference on Software Testing, Verification and Validation (ICST), pages 322–331. IEEE, 2013.