IPA: Error Propagation Analysis of Multithreaded Programs Using Likely Invariants


SLIDE 1

IPA: Error Propagation Analysis of Multithreaded Programs Using Likely Invariants

Abraham Chan*, Stefan Winter♰, Habib Saissi♰, Karthik Pattabiraman*, Neeraj Suri♰ *The University of British Columbia

♰Technische Universität Darmstadt

SLIDE 2

Background Motivating Example Methodology Evaluation Conclusion

Fault Injection

▷ Evaluate the robustness of software
▷ Inject software-simulated faults into programs

Background 2

SLIDE 3

Error Propagation Analysis (EPA)

▷ Understand where and why errors manifest [Hiller et al.]
▷ Compare a golden trace with a faulty trace [Ammar et al.]
▷ What about multithreaded programs?


Our contribution: EPA framework for multithreaded programs

SLIDE 4

EPA Example

1. int func (int x) {
2.   int a = x - 3;
3.   if (a > 0) {
4.     x = 1;
5.   } else {
6.     x = 2;
7.   }
8.   a = a + 5;
9.   return x;
10. }

Trace for x = 4:
1. x = 4, a = 1
2. x = 1, a = 1
3. x = 1, a = 6

SLIDE 5

Fault Detection using Tracing

1. int func (int x) {
2.   int a = x - 3;
3.   if (a < 0) {    *Fault*
4.     x = 1;
5.   } else {
6.     x = 2;
7.   }
8.   a = a + 5;
9.   return x;
10. }


Traces for x = 4:

Golden Trace:
1. x = 4, a = 1
2. x = 1, a = 1
3. x = 1, a = 6

Faulty Trace:
1. x = 4, a = 1
2. x = 2, a = 1
3. x = 2, a = 6
SLIDE 6

Multithreading

1. int func (int x) {
2.   int a = x - 3;
3.   if (a > 0) {
4.     x = 1;
5.   } else {
6.     x = 2;
7.   }
8.   a = a + 5;
9.   return x;
10. }


Thread 1

  • 1. x = 4, a = 1
  • 2. x = 1, a = 1
  • 3. x = 1, a = 6

Thread 2

  • 4. x = 4, a = 1
  • 5. x = 1, a = 1
  • 6. x = 1, a = 6

Traces for x = 4 on 2 threads. Which interleaving should be trusted as golden: 1, 2, 3, 4, 5, 6 or 1, 4, 5, 6, 2, 3?

SLIDE 7

Likely Invariants

▷ Statistically inferred from dynamic program traces with a confidence threshold
▷ Cheaper to infer than true invariants
▷ Likely invariant detection tool: Daikon [Ernst et al.]
▷ Example: this.theArray.length >= 5


SLIDE 8

Invariant Example

1. int func (int x) {
2.   int a = x - 3;
3.   if (a > 0) {
4.     x = 1;
5.   } else {
6.     x = 2;
7.   }
8.   a = a + 5;
9.   return x;
10. }


Daikon invariants inferred at x = 4: entry x == 4; exit x == 1, a > 0


SLIDE 9

Invariant Fault Detection

1. int func (int x) {
2.   int a = x - 3;
3.   if (a < 0) {    *Fault*
4.     x = 1;
5.   } else {
6.     x = 2;
7.   }
8.   a = a + 5;
9.   return x;
10. }


Daikon invariants inferred at x = 4: entry x == 4; exit x == 1, a > 0

Faulty Trace:
1. x = 4, a = 1
2. x = 2, a = 1
3. x = 2, a = 6

The final state x = 2 violates the exit invariant x == 1, so the fault is detected.

SLIDE 10

Error Propagation

1. int func (int x) {
2.   int a = x - 3;
3.   if (a < 0) {    *Fault*
4.     x = 1;
5.   } else {
6.     x = 2;
7.   }
8.   a = a + 5;
9.   return x;
10. }


Daikon invariants inferred at x = 4: entry x == 4; exit x == 1, a > 0

Fault has propagated through the function.

SLIDE 11

Main Contributions

▷ Develop an EPA framework for multithreaded programs using likely invariants
▷ Empirically assess the efficacy of invariants for fault detection


SLIDE 12

Traditional EPA Workflow

Workflow diagram: Program + Inputs → Compilation → Instrumentation → Profiling → Golden Trace, and Fault Injection → Faulty Trace; the two traces are compared to serve as the Test Oracle.


SLIDE 13

Invariant Propagation Analysis (IPA)

Workflow diagram: Program + Inputs → Compilation → Instrumentation → Profiling → Golden Trace, and Fault Injection → Faulty Trace; Invariant Inference over the golden traces feeds Fault Detection, which serves as the Test Oracle (our work).

SLIDE 14

Main Contributions

▷ Develop an EPA framework for multithreaded programs using likely invariants
▷ Empirically assess the efficacy of invariants for fault detection


SLIDE 15

Research Questions


Q1: Stability
Q2: Fault Coverage
Q3: Performance


SLIDE 16

Experimental Setup

▷ 6 multithreaded benchmarks

  • Domains: Sorting, scientific computing, web server
  • 3 from PARSEC suite

▷ Fault Injection Tool: LLFI

  • LLVM-based fault injector
  • Developed at UBC’s Dependable Systems Lab
  • https://github.com/DependableSystemsLab/LLFI


SLIDE 17

Fault Model

▷ Common software bugs that are hard to detect through regression or unit tests [Vipindeep et al.]
▷ Faults considered:

  • Data Corruption
  • File I/O Buffer Overflow
  • Buffer Overflow (Malloc)
  • Function Call Corruption
  • Invalid Pointer
  • Race Condition


SLIDE 18

RQ1: Stability


Chart: number of likely invariants (10-100) vs. number of profiling runs (1-16) for Swaptions, Nbds, Blackscholes, Quicksort, Streamcluster, and Nullhttpd.

SLIDE 19

RQ2: Fault Coverage


Chart: fault coverage (0-100%) by fault type for Quicksort, with outcomes broken down into SDC, Crash/Hang, and Benign.

SLIDE 20

RQ3: Performance


▷ Setup overhead: IPA is 2-90% slower than EPA
▷ Fault detection: IPA is 2.7x to 151x faster than EPA
▷ Fault detection time is amortized over experimental runs


SLIDE 21

Lessons Learned

▷ Fault coverage is dependent on the application and fault type
▷ Possible trade-off between invariant stability and fault coverage
▷ Certain types of invariants may be better at detecting different faults


SLIDE 22

Summary

▷ Problem: Multithreaded programs produce nondeterministic golden traces
▷ Approach: Use likely invariants to detect faults
▷ Result: Likely invariants offer good fault detection in many applications
▷ Available at: http://github.com/DependableSystemsLab/LLFI-IPA
▷ Contact: http://ece.ubc.ca/~abrahamc/
