
Parameter Tuning for Search-Based Test-Data Generation Revisited - PowerPoint PPT Presentation



  1. Parameter Tuning for Search-Based Test-Data Generation Revisited: Support for Previous Results
     Anton Kotelyanskii and Gregory M. Kapfhammer
     (creative commons licensed (BY-NC-ND) flickr photo shared by sunface13)

  2. Software Testing
     - Test Suites
     - Automatic Generation
     - Confronting Challenges
     - Evaluation Strategies

  3. Empirical Studies
     - Challenges
     - Importance
     - Replication
     - Rarity

  4. EvoSuite
     - Amazing test suite generator
     - Uses a genetic algorithm
     - Input: a Java class
     - Output: a JUnit test suite (a rough sketch of such a suite follows below)
     - http://www.evosuite.org/
     (creative commons licensed (BY-SA) flickr photo shared by mcclanahoochie)
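Not part of the deck: a hand-written sketch of roughly what a generated JUnit test suite looks like. Each test constructs objects, exercises methods, and asserts on the observed behaviour. It targets java.util.ArrayDeque purely for illustration and is not actual EvoSuite output, which typically also carries scaffolding for sandboxing and timeouts.

    import static org.junit.Assert.*;
    import org.junit.Test;

    // Hand-written tests in the style of generated suites: build objects,
    // call methods, assert on return values and thrown exceptions.
    public class StackTest {

        @Test
        public void testPushThenPopReturnsSameElement() {
            java.util.ArrayDeque<Integer> stack = new java.util.ArrayDeque<>();
            stack.push(42);
            assertEquals(Integer.valueOf(42), stack.pop());
        }

        @Test
        public void testPopOnEmptyStackThrows() {
            java.util.ArrayDeque<Integer> stack = new java.util.ArrayDeque<>();
            try {
                stack.pop();
                fail("Expecting exception: NoSuchElementException");
            } catch (java.util.NoSuchElementException e) {
                // expected: popping an empty deque signals the error this way
            }
        }
    }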

  5. Parameter Tuning
     - RSM: response surface methodology
     - SPOT: sequential parameter optimization toolbox (the sequential tuning idea is sketched below)
     - Successfully applied to many diverse problems!
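Not part of the deck: a minimal Java sketch of the sequential model-based tuning loop that SPOT embodies: evaluate an initial design, then repeatedly screen candidates with a cheap surrogate and spend the expensive evaluation only on the most promising one. The inverse-distance surrogate and the synthetic objective below are crude stand-ins for SPOT's regression models and for an actual EvoSuite run; all names, ranges, and numbers are illustrative.

    import java.util.*;

    // Sequential model-based tuning in the spirit of SPOT: initial random
    // design, then a loop of (1) propose candidates, (2) rank them with a
    // cheap surrogate built from past evaluations, (3) run the expensive
    // objective only on the best-ranked candidate.
    public class SequentialTuningSketch {

        static final Random RNG = new Random(42);

        // Stand-in for "run the generator with this crossover rate and rank
        // bias and report 1 - statement coverage" (lower is better).
        static double objective(double[] p) {
            return Math.pow(p[0] - 0.75, 2) + Math.pow(p[1] - 1.7, 2)
                    + 0.01 * RNG.nextGaussian();
        }

        static double[] randomConfig() {
            return new double[] { 0.01 + 0.98 * RNG.nextDouble(),   // crossover rate in [0.01, 0.99]
                                  1.01 + 0.98 * RNG.nextDouble() }; // rank bias in [1.01, 1.99]
        }

        // Crude surrogate: inverse-distance-weighted average of past results.
        static double predict(double[] cand, List<double[]> xs, List<Double> ys) {
            double num = 0, den = 0;
            for (int i = 0; i < xs.size(); i++) {
                double d = Math.hypot(cand[0] - xs.get(i)[0], cand[1] - xs.get(i)[1]);
                double w = 1.0 / (d + 1e-9);
                num += w * ys.get(i);
                den += w;
            }
            return num / den;
        }

        public static void main(String[] args) {
            List<double[]> xs = new ArrayList<>();
            List<Double> ys = new ArrayList<>();
            for (int i = 0; i < 5; i++) {                 // initial design
                double[] x = randomConfig();
                xs.add(x); ys.add(objective(x));
            }
            for (int iter = 0; iter < 20; iter++) {       // sequential phase
                double[] best = null; double bestPred = Double.POSITIVE_INFINITY;
                for (int c = 0; c < 200; c++) {           // cheap screening
                    double[] cand = randomConfig();
                    double pred = predict(cand, xs, ys);
                    if (pred < bestPred) { bestPred = pred; best = cand; }
                }
                xs.add(best); ys.add(objective(best));    // one expensive evaluation
            }
            int argmin = 0;
            for (int i = 1; i < ys.size(); i++) if (ys.get(i) < ys.get(argmin)) argmin = i;
            System.out.printf("best config: crossover=%.3f rankBias=%.3f score=%.4f%n",
                    xs.get(argmin)[0], xs.get(argmin)[1], ys.get(argmin));
        }
    }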

  6. Defaults or Tuned Values?

  7. Experiment Design
     - Eight EvoSuite parameters
     - Ten projects from SF100
     - 475 Java classes as subjects
     - 100 trials after parameter tuning
     - Aiming to improve statement coverage
     (creative commons licensed (BY-NC) flickr photo shared by Michael Kappel)

  8. Parameters

     Parameter Name                  Minimum   Maximum
     Population Size                 5         99
     Chromosome Length               5         99
     Rank Bias                       1.01      1.99
     Number of Mutations             1         10
     Max Initial Test Count          1         10
     Crossover Rate                  0.01      0.99
     Constant Pool Use Probability   0.01      0.99
     Test Insertion Probability      0.01      0.99

     (these ranges are encoded in the sketch below)
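Not part of the deck: a small Java sketch that encodes the eight ranges from the table above and draws one uniform random configuration from them, the kind of point a tuner evaluates. The map keys are the slide's human-readable names, not EvoSuite's internal property identifiers, and integer-valued parameters would be rounded in a real run.

    import java.util.LinkedHashMap;
    import java.util.Map;
    import java.util.Random;

    // The eight tuned ranges from the table as [minimum, maximum] pairs.
    public class TunedParameterSpace {

        static final Map<String, double[]> RANGES = new LinkedHashMap<>();
        static {
            RANGES.put("Population Size",               new double[] { 5,    99   });
            RANGES.put("Chromosome Length",             new double[] { 5,    99   });
            RANGES.put("Rank Bias",                     new double[] { 1.01, 1.99 });
            RANGES.put("Number of Mutations",           new double[] { 1,    10   });
            RANGES.put("Max Initial Test Count",        new double[] { 1,    10   });
            RANGES.put("Crossover Rate",                new double[] { 0.01, 0.99 });
            RANGES.put("Constant Pool Use Probability", new double[] { 0.01, 0.99 });
            RANGES.put("Test Insertion Probability",    new double[] { 0.01, 0.99 });
        }

        // Sample one configuration uniformly from the boxed search space.
        static Map<String, Double> drawConfig(Random rng) {
            Map<String, Double> config = new LinkedHashMap<>();
            for (Map.Entry<String, double[]> e : RANGES.entrySet()) {
                double lo = e.getValue()[0], hi = e.getValue()[1];
                config.put(e.getKey(), lo + (hi - lo) * rng.nextDouble());
            }
            return config;
        }

        public static void main(String[] args) {
            System.out.println(drawConfig(new Random(1)));
        }
    }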

  9. Experiments
     - 184 days of computation time estimated
     - Cluster of 70 computers running for weeks
     - Identified 139 "easy" and 21 "hard" classes
     - Mann-Whitney U-test and Vargha-Delaney effect size (the effect-size computation is sketched below)
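Not part of the deck: a self-contained Java sketch of the Vargha-Delaney A12 statistic named on this slide; the Mann-Whitney U-test itself would come from a statistics library and is not reimplemented here. The two sample arrays are made-up numbers for illustration only.

    // Vargha-Delaney A12 effect size: the probability that a value drawn
    // from the first sample exceeds a value drawn from the second (ties
    // count half). With the slides' lower-is-better inverse statement
    // coverage, A12 > 0.5 for tuned-vs-default means the tuned
    // configuration tends to be worse.
    public class VarghaDelaneyA12 {

        static double a12(double[] first, double[] second) {
            double wins = 0;
            for (double x : first) {
                for (double y : second) {
                    if (x > y)       wins += 1.0;
                    else if (x == y) wins += 0.5;
                }
            }
            return wins / (first.length * (double) second.length);
        }

        public static void main(String[] args) {
            double[] tunedInverseCoverage   = { 0.40, 0.35, 0.55, 0.62, 0.48 };
            double[] defaultInverseCoverage = { 0.38, 0.36, 0.54, 0.60, 0.47 };
            System.out.printf("A12(tuned, default) = %.4f%n",
                    a12(tunedInverseCoverage, defaultInverseCoverage));
        }
    }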

  10. Results

      Category                             Effect Size   p-value
      Results Across Trials and Classes    0.5029        0.1045
      No "Easy" and "Hard" Classes         0.5048        0.0314

      - Using lower-is-better inverse statement coverage
      - Effect size greater than 0.5 means that tuning is worse
      - Testing shows we do not always reject the null hypothesis
      - Additional empirical results in the QSIC 2014 paper!

  11. Discussion
      - Tuning improved scores for 11 classes
      - Otherwise, same as or worse than defaults
      - A "soft floor" may exist for parameter tuning
      - Additional details in the QSIC 2014 paper!
      (creative commons licensed (BY) photo shared by Startup Stock Photos)

  12. Practical Implications
      - Fundamental Challenges
      - Tremendous Confidence
      - Great Opportunities

  13. Important Contributions
      - Comprehensive Experiments
      - Conclusive Confirmation
      - For EvoSuite, Defaults = Tuned
      (creative commons licensed (BY-NC-ND) flickr photo shared by sunface13)
