Automatic Debugging of Android Applications

SLIDE 1

Automatic Debugging of Android Applications

Pedro Machado
Supervisors: Rui Maranhão (PhD), José Campos (MSc)

SLIDE 2

“Context”

SLIDE 3

Context

SLIDE 4

Context

SLIDE 5

Context
Source: Gartner

SLIDE 6

Software Development Process

Design → Implement → Test → Release
Debug

SLIDE 7

“Challenges”

SLIDE 8

Debugging

Debugging costs time and money.

SLIDE 9

Challenges

  • Debugging is a manual task
  • Resource constraints
  • Very specific architectures
SLIDE 10

Challenges

Are mobile apps… reliable?

SLIDE 11

Hypothesis

Static and dynamic analysis improve the reliability of Mobile Apps.

SLIDE 12

“Related Work”

SLIDE 13

Android Debugging tools [6]

SLIDE 14

GROPG [3]

SLIDE 15

Fault localization tools

GZoltar [4]
Tarantula [5]

SLIDE 16

“Motivational Example”

SLIDE 17

Spectrum-based fault localization [1][2][4][5]

  class CharCount {
    ...
    static void count(String s) {
      int let, dig, other;
      for (int i = 0; i < s.length(); i++) {
        char c = s.charAt(i);
        if ('A' <= c && 'Z' >= c)
          let += 2; /* FAULT */
        else if ('a' <= c && 'z' >= c)
          let += 1;
        else if ('0' <= c && '9' >= c)
          dig += 1;
        else if (isprint(c))
          other += 1;
      }
      System.out.println(let + " " + dig + " " + other);
    }
    ...
  }

Similarity coefficients (S.C.) per line 1-12, computed from the hit spectra of tests 1-10 and the error vector: 0.63, 0.63, 0.67, 0.67, 1.00, 0.67, 0.22, 0.53, 0.57, 0.00, 0.00, 0.63. The faulty line receives the highest score (1.00).
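The per-line similarity coefficients are computed from the program spectrum (which lines each test executed) and the error vector (which tests failed). A minimal sketch, assuming the Ochiai coefficient from the authors' earlier SBFL work [1][2]; the class and toy matrix below are illustrative, not MZoltar's code:

```java
// Sketch of spectrum-based fault localization with the Ochiai
// coefficient (assumed from [1][2]); not MZoltar's actual code.
// spectrum[i][j] == true iff test i executed line j;
// errors[i] == true iff test i failed.
public class OchiaiRanking {
    static double[] rank(boolean[][] spectrum, boolean[] errors) {
        int lines = spectrum[0].length;
        double[] score = new double[lines];
        for (int j = 0; j < lines; j++) {
            int covFail = 0, covPass = 0, uncovFail = 0;
            for (int i = 0; i < spectrum.length; i++) {
                if (spectrum[i][j]) {
                    if (errors[i]) covFail++; else covPass++;
                } else if (errors[i]) {
                    uncovFail++;
                }
            }
            double denom = Math.sqrt((double) (covFail + uncovFail) * (covFail + covPass));
            score[j] = denom == 0.0 ? 0.0 : covFail / denom;
        }
        return score;
    }

    public static void main(String[] args) {
        // Toy spectrum: 3 tests over 3 lines; both failing tests
        // (t1, t2) execute line 2, so line 2 ranks most suspicious.
        boolean[][] spectrum = {
            {true,  true,  false},   // t0 (passed)
            {false, true,  true},    // t1 (failed)
            {false, false, true},    // t2 (failed)
        };
        boolean[] errors = {false, true, true};
        double[] s = rank(spectrum, errors);
        System.out.printf("%.2f %.2f %.2f%n", s[0], s[1], s[2]); // 0.00 0.50 1.00
    }
}
```

Lines covered by many failing runs and few passing runs score near 1.0, which is why the faulty line on the slide tops the ranking.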

SLIDE 18

Lint static analysis

Lint reports issues that indicate potential bugs. Issues are characterized by their:

  • Category
  • Severity (Warning, Error, or Fatal)
  • Priority (values from 1 to 10)
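This characterization can be modeled directly. A hypothetical sketch (the Issue record and ordering below are illustrative, not Lint's real API) that sorts reported issues so the most severe, highest-priority ones get inspected first:

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

// Hypothetical model of a Lint issue; not Lint's actual API.
public class LintIssues {
    enum Severity { WARNING, ERROR, FATAL }  // Lint's three severity levels

    record Issue(String category, Severity severity, int priority) { }  // priority: 1..10

    // Order issues for inspection: most severe first, then highest priority.
    static List<Issue> triage(List<Issue> issues) {
        return issues.stream()
                .sorted(Comparator.comparing(Issue::severity).reversed()
                        .thenComparing(Comparator.comparingInt(Issue::priority).reversed()))
                .toList();
    }

    public static void main(String[] args) {
        List<Issue> sorted = triage(Arrays.asList(
                new Issue("Performance", Severity.WARNING, 9),
                new Issue("Correctness", Severity.FATAL, 4),
                new Issue("Security", Severity.ERROR, 8)));
        sorted.forEach(System.out::println);  // Fatal issue first
    }
}
```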
SLIDE 19

Motivational Example

SLIDE 20

“Tool”

SLIDE 21

Visualizations

SLIDE 22

Static and Dynamic Analysis

SLIDE 23

Static and Dynamic Analysis

(Side-by-side visualizations: Static vs. Dynamic + Static)
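One plausible reading of "Dynamic + Static" is to use Lint's static findings to re-weight the dynamic SBFL suspiciousness of each line. The fusion formula below is purely an illustration; the slides do not specify how MZoltar actually combines the two analyses:

```java
// Illustrative fusion of dynamic SBFL scores with static Lint hits;
// the weighting scheme here is an assumption, not MZoltar's method.
public class CombinedScore {
    // sbfl[j]: dynamic suspiciousness of line j, in [0,1]
    // lintPriority[j]: highest Lint priority (1..10) reported on line j, 0 if none
    static double[] combine(double[] sbfl, int[] lintPriority) {
        double[] out = new double[sbfl.length];
        for (int j = 0; j < sbfl.length; j++) {
            // Boost lines flagged by both analyses; lines with no
            // Lint issue keep a down-weighted dynamic score.
            double staticWeight = lintPriority[j] / 10.0;
            out[j] = sbfl[j] * (0.5 + 0.5 * staticWeight);
        }
        return out;
    }

    public static void main(String[] args) {
        // Two lines tie dynamically (0.8); the Lint-flagged one wins.
        double[] s = combine(new double[]{0.8, 0.8, 0.2}, new int[]{0, 10, 0});
        System.out.printf("%.2f %.2f %.2f%n", s[0], s[1], s[2]); // 0.40 0.80 0.10
    }
}
```

The point of any such fusion is the one slide 33 makes empirically: static information breaks ties among dynamically suspicious lines and shrinks the set the developer must inspect.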

SLIDE 24

MZoltar's Workflow

1. Instrument the app: ASM, ASMDex, JaCoCo
2. Run tests: ADT / ADB
3. Obtain program spectra: Logcat, Sockets, Files, EMMA

SLIDE 25

“Evaluation”

SLIDE 26

Research Questions

  • Is the instrumentation overhead negligible?
  • Do diagnostics remain accurate under Android's constrained environment?
  • Does Lint integration contribute to better diagnostic quality?

SLIDE 27

Experimental Setup

Subject               LOC     Tests   Coverage   Resources LOC
CharCount               148      10      92.2%             115
ConnectBot            32911      14       0.7%            7673
Google Authenticator   3659     170      76.6%            5275
StarDroid             13783     187      29.7%            2694

SLIDE 28

Experimental Results

Subject               Original   Instrumented   Overhead
CharCount                1.82s          1.86s         2%
ConnectBot               1.25s          1.35s         8%
Google Authenticator    80.49s         87.26s         8%
StarDroid               14.70s         15.46s         5%

Average Overhead: 5.75%
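The overhead column follows directly from the two timing columns. A quick check of the table's arithmetic (rounding each subject's overhead to a whole percent, as the slide does, before averaging):

```java
// Verifies the overhead figures on this slide from the raw timings.
public class OverheadCheck {
    public static void main(String[] args) {
        // {original, instrumented} run times in seconds, from the table above.
        double[][] t = {{1.82, 1.86}, {1.25, 1.35}, {80.49, 87.26}, {14.70, 15.46}};
        double sum = 0;
        for (double[] row : t) {
            // Per-subject overhead, rounded to a whole percent: 2, 8, 8, 5.
            long pct = Math.round((row[1] - row[0]) / row[0] * 100);
            sum += pct;
        }
        System.out.println("Average overhead: " + (sum / t.length) + "%"); // 5.75%
    }
}
```

Note the 5.75% average is taken over the rounded per-subject percentages; averaging the unrounded ratios gives roughly 5.9%.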

SLIDE 29

Metrics – Diagnostic Quality

  • Average number of lines to inspect
  • Coverage density (Cd)
SLIDE 30

Multiple Faults / Run

(Per-subject results charts: CharCount, ConnectBot, Google Authenticator, StarDroid)

SLIDE 31

Multiple Tests / Run

(Per-subject results charts: CharCount, ConnectBot, Google Authenticator, StarDroid)

SLIDE 32

Multiple Tests / Run

Average Time Reduction: 79%
Average Cd Growth: 74%

SLIDE 33

Lint analysis results

Lines to inspect, per bug:

Subject               Dynamic (Bug 1 / Bug 2)   Dynamic + Static (Bug 1 / Bug 2)
CharCount                 148 / 148                     1 / –
ConnectBot              32911 / 32911                 30 / 19
Google Authenticator     3659 / 3659                   3 / 1
StarDroid               13783 / 13783                 18 / 12

Average Cd Reduction: 99.9%

SLIDE 34

“Conclusions”

SLIDE 35

Conclusions

  • The developer only has to inspect an average of 5 out of an average of 12625 lines before finding the bug;
  • Negligible instrumentation overhead (5.75%), despite the Android constrained environment;
  • Grouping test cases reduces time by 79%, but decreases diagnostic quality by 74%;
  • Lint integration reduces the number of lines to inspect by 99.9%.

SLIDE 36

Conclusions

  • Reduce development costs
  • Reduce time to market
  • Increase quality and reliability

SLIDE 37

Future Work

  • Port MZoltar to the Android Studio IDE;
  • Port MZoltar to other mobile technologies;
  • Correlate Lint issues and runtime failures;
  • Perform a user study to evaluate MZoltar;
  • Investigate a bug prediction approach.

SLIDE 38

Publications

Published:

  • Pedro Machado, José Campos and Rui Abreu. Automatic Debugging of Android Applications. In IJUP'12, Porto, Portugal.

Submitted:

  • Pedro Machado, José Campos and Rui Abreu. MZoltar: Automatic Debugging of Android Applications. In DeMobile'13, St. Petersburg, Russia (co-located with ESEC/FSE).
  • Pedro Machado, José Campos and Rui Abreu. Combining Static and Dynamic Analysis in Mobile Apps Fault Localization. In ISSRE'13, Pasadena, CA, USA.

SLIDE 39

“Questions, anyone?”

SLIDE 40

Work Plan – 2nd Semester (February to June)

  • Framework
  • Architecture particularities
  • Integration with GZoltar
  • Specific visualizations
  • Testing and validating results
  • Article writing
  • Thesis writing

SLIDE 41

Motivation

Global smartphone sales vs. PC sales, 2011 (according to Canalys)

Device        Shipments 2011 (millions)   Annual growth
Smartphones        487.7                       62.7%
Tablets             63.2                      274.2%
Notebooks          209.6                        7.5%
Desktops           112.4                        2.3%
Netbooks            29.4                      -25.3%

Source: Canalys (Feb 2012)

SLIDE 42

Work Plan – 1st Semester (October to January)

  • SFL concept
  • Android testing
  • Architecture particularities
  • GZoltar
  • Preparation of first presentation
  • ADT
  • Preparation of second presentation
  • Writing the state of the art

SLIDE 43

Metrics – Diagnostic Quality

(Figures: formulas for diagnostic quality and density)

SLIDE 44

Similarity Coefficients

SLIDE 45

References

[1] R. Abreu, P. Zoeteweij, and A.J.C. van Gemund. Spectrum-based Fault Localization. In Proceedings of the 24th IEEE/ACM International Conference on Automated Software Engineering (ASE'09), Auckland, New Zealand, November 2009. IEEE Computer Society.
[2] R. Abreu, P. Zoeteweij, and A.J.C. van Gemund. On the Accuracy of Spectrum-based Fault Localization. In Proc. TAIC PART'07, September 2007.
[3] T.A. Nguyen, C. Csallner, and N. Tillmann. GROPG: A Graphical On-Phone Debugger.
[4] J. Campos, A. Riboira, A. Perez, and R. Abreu. GZoltar: An Eclipse Plug-in for Testing and Debugging. ASE 2012.
[5] J. Jones and M.J. Harrold. Empirical Evaluation of the Tarantula Automatic Fault-Localization Technique. ASE 2005.
[6] http://developer.android.com
[7] A.P. Felt, E. Chin, S. Hanna, D. Song, and D. Wagner. Android Permissions Demystified. In Proceedings of the 18th ACM Conference on Computer and Communications Security (CCS'11), 2011.