
Systematic Analysis of testing-related publications concerning reproducibility and comparability. Bachelor's Thesis Defense by Artur Solomonik.


  1. Systematic Analysis of testing-related publications concerning reproducibility and comparability. Bachelor's Thesis Defense by Artur Solomonik. Referees: Prof. Dr. Norbert Siegmund, Prof. Dr. Martin Potthast

  2. Software Testing

  3. Software Testing Life Cycle

  4. Software Testing Life Cycle

  5. Software Testing Life Cycle

  6. Software Testing Research: generating test suites, exploration principles, mutation testing, executing generated test suites, prioritization and reduction of test cases, automating test case creation, selection and execution, finding new approaches to organizing testing processes. Testing Workflow Decision Making Process: when and what to automate?

  7. Software Testing Research: Testing Levels. Data-Flow Testing, Static Code Analysis | Unit Testing; Backbone-, Client-Server-, Bottom-Up | Integration Testing; GUI Testing, End-To-End Testing | System Testing; Reliability and Stability, Chaos Testing | Acceptance Testing. Execution Paradigms.

  8. Test Execution Paradigms

  9. How do we know the testing system is working?

  10. Empirical Software Evaluations

  11. Evaluating result data: present the result data set and identify significant values; connect hypotheses and results; compare related work and their findings; argue for the improvement or benefits of the approach; apply suitable metrics.

  12. Reproducibility. Goal: provide the reader with all the information and resources necessary to recreate the findings presented in the paper.

  13. Reproducibility Attributes. The reproduction score is influenced by data set attributes. Identification: explanation of where the data is and what it is called. Description: level of the explanation regarding the element. Availability: ease of accessing or obtaining the research elements. Persistence: confidence in the future state and availability of the elements. Flexibility: adaptability of the elements to new environments.

  14. Reproducibility Attributes

  15. Reproducibility Attributes: varying data sources mean the attributes are not applicable to every element.
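Slides 13-15 only state that the reproduction score is influenced by these five attributes; the concrete scoring scheme is defined in the thesis itself. Purely as an illustration of the idea, a minimal sketch could rate each attribute on a small scale and aggregate the ratings. The 0-2 scale and the equal weights below are assumptions, not the thesis' actual scheme:

```typescript
// Illustrative only: rates each of the five attributes named on the slides
// on an assumed 0-2 scale and aggregates them with assumed equal weights.
type Attribute = "identification" | "description" | "availability" | "persistence" | "flexibility";
type Rating = 0 | 1 | 2;

const weights: Record<Attribute, number> = {
  identification: 1,
  description: 1,
  availability: 1,
  persistence: 1,
  flexibility: 1,
};

// Aggregate per-attribute ratings into a single score normalized to [0, 1].
function reproductionScore(ratings: Record<Attribute, Rating>): number {
  const attrs = Object.keys(weights) as Attribute[];
  const total = attrs.reduce((sum, a) => sum + weights[a] * ratings[a], 0);
  const max = attrs.reduce((sum, a) => sum + weights[a] * 2, 0);
  return total / max;
}

// Example: a well-identified but hard-to-obtain data set.
const score = reproductionScore({
  identification: 2,
  description: 1,
  availability: 0,
  persistence: 1,
  flexibility: 1,
});
console.log(score.toFixed(2)); // 0.50
```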

  16. Comparability. Goal: assess papers on whether empirical comparisons in the evaluation are appropriate or present at all. Criteria for comprehensible evaluations; strategies of comparison; connectivity to related work.

  17. How can we understand the research strategies of software testing publications in terms of reproducibility and comparability?

  18. Paper Classification

  19. Data Source: papers from 10 popular software engineering conferences (ASE, ICSE, ISSTA, ...); additional publications from two journals (ESE, TOSEM); frequently mentioned publications; papers from modification / refinement phases.

  20. Processed Data Set

  21. Screenshot of the "testing papers and evaluation data" spreadsheet, with the tabs year, papers, paper_evaluation, benchmark and paper_benchmark.

  22. Raw Data Set: a spreadsheet with 8060 registered papers, of which 360 are classified by 23 columns; 205 documented benchmarks; over 15,000 bibliographic and semantic connections between records (tables: papers, paper_evaluation, benchmark, paper_benchmark).
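The slides name the spreadsheet's tables but not their columns. A hypothetical sketch of the implied relational layout follows; every field name beyond the table names is an assumption:

```typescript
// Hypothetical sketch of the spreadsheet's relational layout, inferred from the
// table names shown on slides 21-22. All field names are assumptions.

interface Paper {
  id: string;      // assumed key used by the link tables
  title: string;
  venue: string;   // e.g. ASE, ICSE, ISSTA, ESE, TOSEM
  year: number;
}

// One row per classified paper, holding the 23 classification columns
// (see slides 23-25 for the parameters).
interface PaperEvaluation {
  paperId: string; // references Paper.id
}

interface Benchmark {
  id: string;
  name: string;
}

// Many-to-many link: which paper evaluates against which benchmark.
interface PaperBenchmark {
  paperId: string;     // references Paper.id
  benchmarkId: string; // references Benchmark.id
}
```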

  23. Classification Parameters: Availability [open/closed], Data Set State [vanilla/modified], Selection Cause [...], Modification Cause [...], Sub-Check Systems [single/multiple] [named/unnamed]

  24. Classification Parameters: Contribution [...], Choice of Metric [functionality/performance/both], Metrics [...]

  25. Classification Parameters: Error Creation [generation/real world/both], Error Annotation [TRUE/FALSE], Comparison [TRUE/FALSE] [former/foreign/parallel] [exclusive/inclusive]
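Taken together, slides 23-25 suggest a per-paper classification record (fleshing out the paper_evaluation columns sketched above) roughly like the following. This is a hypothetical reconstruction: field names and types are assumptions, and the value lists the slides elide ([...]) are kept as free-form strings:

```typescript
// Hypothetical record for one classified paper, based on the parameters named
// on slides 23-25. Field names and types are assumptions, not the thesis schema.
type Availability = "open" | "closed";
type DataSetState = "vanilla" | "modified";
type ChoiceOfMetric = "functionality" | "performance" | "both";
type ErrorCreation = "generation" | "real world" | "both";
type ComparisonTarget = "former" | "foreign" | "parallel";
type ComparisonScope = "exclusive" | "inclusive";

interface PaperClassification {
  availability: Availability;
  dataSetState: DataSetState;
  selectionCause: string;        // value list elided on the slide ([...])
  modificationCause: string;     // value list elided on the slide ([...])
  systems: {                     // "Sub-Check Systems" on slide 23
    count: "single" | "multiple";
    naming: "named" | "unnamed";
  };
  contribution: string;          // value list elided on the slide ([...])
  choiceOfMetric: ChoiceOfMetric;
  metrics: string[];             // concrete metric list elided on the slide
  errorCreation: ErrorCreation;
  errorAnnotation: boolean;      // [TRUE/FALSE]
  comparison: {
    performed: boolean;          // [TRUE/FALSE]
    target: ComparisonTarget;
    scope: ComparisonScope;
  };
}
```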

  26. Open Source vs. Closed Source

  27. Software Testing Evaluation Metrics

  28. Choice of Metric and Error Annotation

  29. Selection and modification causes of benchmarks

  30. Bibliographic Networks

  31. Goal: visualize large amounts of bibliographic data, increase the interactivity with a set of publications, and create dynamic, time-based insight into the network's evolution.

  32. Current implementations of paper networks: visualize the connection and influence between authors; give insight rather than specific values; connect papers via citations, bibliographic coupling, co-citations or co-authorship relations; color- and size-code node information; geographic hierarchies.

  33. Additions and Improvements: benchmarks and software systems as their own entities in the network; more insight on reproducibility; multidimensional graph data visualization without clutter; tailoring the visualization to a certain aspect of a publication (e.g. the evaluation).

  34. Visualizing bibliographic networks

  35. Screenshot of the TESTING LITERATURE OVERVIEW SYSTEM (TeLO-S) user interface: a node-link graph of testing publications grouped by contribution (e.g. test generation, test coverage, systematic testing, symbolic execution, regression testing, race testing) next to a query panel with the Cypher query MATCH n = ({contribution: 'mutation testing'})-->() return n, SEND and SAVE QUERY buttons, and the options Use Cypher Query, Sort By Venue, Temporary Highlighting, Permanent Highlighting and Color Nodes by Contribution.

  36. TeLO-S: D3 visualization of testing publications in a node-link force-directed graph.

  37. Cypher Query Input and Configuration: selecting specific nodes from the Neo4j graph database and manipulating the layout and color-coding.

  38. Contribution Plot: immediate assessment of the proportions of contribution representatives.
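The deck shows the tool's UI rather than its source. A minimal sketch of the pipeline the slides describe, assuming nothing about the actual TeLO-S code: run a Cypher query such as the one shown against Neo4j, turn the returned paths into nodes and links, and lay them out with a D3 force simulation. The connection details, property names and helper names below are assumptions:

```typescript
// Minimal sketch of the pattern described on slides 35-38: run a Cypher query
// against Neo4j and feed the resulting paths into a D3 force-directed layout.
import neo4j from "neo4j-driver";
import * as d3 from "d3";

interface GraphNode extends d3.SimulationNodeDatum { id: string; contribution?: string; }
interface GraphLink extends d3.SimulationLinkDatum<GraphNode> {
  source: string | GraphNode;
  target: string | GraphNode;
}

async function loadGraph(cypher: string): Promise<{ nodes: GraphNode[]; links: GraphLink[] }> {
  // Assumed local connection details, not the thesis setup.
  const driver = neo4j.driver("bolt://localhost:7687", neo4j.auth.basic("neo4j", "password"));
  const session = driver.session();
  const nodes = new Map<string, GraphNode>();
  const links: GraphLink[] = [];
  try {
    // e.g. the query shown on the slide:
    // MATCH n = ({contribution: 'mutation testing'})-->() RETURN n
    const result = await session.run(cypher);
    for (const record of result.records) {
      const path = record.get("n"); // a Cypher path: nodes plus relationships
      for (const segment of path.segments) {
        for (const node of [segment.start, segment.end]) {
          const id = node.identity.toString();
          if (!nodes.has(id)) nodes.set(id, { id, contribution: node.properties.contribution });
        }
        links.push({
          source: segment.start.identity.toString(),
          target: segment.end.identity.toString(),
        });
      }
    }
  } finally {
    await session.close();
    await driver.close();
  }
  return { nodes: [...nodes.values()], links };
}

// Lay the graph out with a force simulation, as in the slides' node-link view.
function layout(nodes: GraphNode[], links: GraphLink[]) {
  return d3.forceSimulation(nodes)
    .force("link", d3.forceLink<GraphNode, GraphLink>(links).id(d => d.id))
    .force("charge", d3.forceManyBody().strength(-80))
    .force("center", d3.forceCenter(400, 300));
}
```

At runtime D3 resolves the string endpoints of each link to the corresponding node objects via the id accessor, which is why source and target are typed as string | GraphNode.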
