
SCAN: An approach to Label and Relate Execution Trace Segments - PowerPoint PPT Presentation

SCAN: An approach to Label and Relate Execution Trace Segments, by Soumaya Medini (Soccer & Ptidej Lab, École Polytechnique de Montréal). The talk introduces the SCAN approach and evaluates its performance and usefulness through five research questions (RQ1-RQ5).


  1. SCAN: An approach to Label and Relate Execution Trace Segments
Soumaya Medini
Soccer & Ptidej Lab, DGIGL, École Polytechnique de Montréal
May 16, 2014
Pattern Trace Identification, Detection, and Enhancement in Java (Ptidej)
SOftware Cost-effective Change and Evolution Research (SOCCER) Lab

  2. Outline
- Introduction
- SCAN Approach
- SCAN Performances Evaluation
  - RQ1: How do the labels of the trace segments produced by the participants change when providing them different amounts of information?
  - RQ2: How do the labels of the trace segments produced by the participants compare to the labels generated by SCAN?
  - RQ3: To what extent does SCAN correctly identify relations among segments?
- SCAN Usefulness Evaluation
  - RQ4: Does SCAN have the potential to support feature location?
  - RQ5: To what extent does SCAN support feature location tasks if used as a standalone technique?
- Conclusion
- References

  3. Introduction
- Software maintenance can account for up to 90% of software cost. [Standish, 1984]
- Program comprehension occupies up to 90% of software maintenance. [Standish, 1984]
- Concept location is an important task during program comprehension. [Rajlich, 2002]

Concept location: aims to identify the source code elements that implement a concept of the software.

  4. Introduction
Drawbacks
- Scalability [Cornelissen et al., 2009]: “The scalability of dynamic analysis due to the large amounts of data that may be introduced in dynamic analysis, affecting performance, storage, and the cognitive load humans can deal with.”
- Large and noisy: the execution trace corresponding to “Draw a rectangle” in JHotDraw contains 3,000 method calls.

  5. Introduction
To address these drawbacks
- Compact the trace (e.g., Reiss and Renieris [Reiss and Renieris, 2001], Hamou-Lhadj and Lethbridge [Hamou-Lhadj and Lethbridge, 2006]).
- Segment the trace (e.g., Asadi et al. [Asadi et al., 2010], Medini et al. [Medini et al., 2011], Pirzadeh and Hamou-Lhadj [Pirzadeh and Hamou-Lhadj, 2011]).

  6. SCAN Approach
Step 1: Trace Segmentation
- Uses a dynamic programming optimization technique.
- The cost function relies on conceptual cohesion and coupling measures.

Step 2: Label Identification
- Extract the terms contained in the signatures of all methods called in each segment.
- Rank the terms by their tf-idf values and keep the top ones.
- Keeping the topmost 10 terms yields meaningful segment labels (a labeling sketch follows below).
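
As a rough illustration of step 2, the sketch below labels segments by ranking the terms split out of their called-method signatures with a plain tf-idf weight. The segments, method names, term-splitting rule, and weighting details are illustrative assumptions, not the exact implementation used by SCAN.

```python
import math
import re
from collections import Counter

def signature_terms(signature):
    """Split a method signature into lower-case terms on camelCase and punctuation."""
    return [w.lower() for w in re.findall(r"[A-Za-z][a-z]*", signature)]

def label_segments(segments, top_k=10):
    """Label each segment (a list of called-method signatures) with its top-k tf-idf terms."""
    docs = [Counter(t for sig in seg for t in signature_terms(sig)) for seg in segments]
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))  # document frequency per term
    labels = []
    for doc in docs:
        total = sum(doc.values())
        scores = {t: (c / total) * math.log(n / df[t]) for t, c in doc.items()}
        labels.append([t for t, _ in sorted(scores.items(), key=lambda x: -x[1])[:top_k]])
    return labels

# Hypothetical trace segments: each is the list of method signatures it contains.
segments = [
    ["RectangleFigure.drawRectangle()", "Canvas.repaint()", "RectangleFigure.setBounds()"],
    ["TextFigure.drawText()", "Font.getFontMetrics()", "Canvas.repaint()"],
]
print(label_segments(segments, top_k=5))
```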

  7. SCAN Approach
Step 3: Relation Identification
- Uses Formal Concept Analysis (FCA).
- Identified relations: same feature, sub-feature, and macro-phase.

[Figure: ArgoUML FCA lattice example.]
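
To make step 3 concrete, here is a toy sketch that enumerates the formal concepts of a binary context relating segments to their label terms; segments grouped under the same concept are candidates for a relation. The context, term sets, and brute-force enumeration are illustrative assumptions; a real FCA library would build the lattice instead.

```python
from itertools import combinations

def concepts(context):
    """Enumerate the formal concepts of a binary context {object: set of attributes}.

    A concept is a pair (extent, intent) where the extent is the set of objects
    sharing exactly the attributes in the intent, and vice versa. Brute force,
    which is fine for the handful of segments in one trace.
    """
    objects = list(context)
    found = set()
    for r in range(len(objects) + 1):
        for subset in combinations(objects, r):
            # intent = attributes shared by every object in the subset
            intent = (set.intersection(*(context[o] for o in subset)) if subset
                      else set.union(*context.values()))
            # extent = every object that has all attributes of the intent
            extent = frozenset(o for o in objects if intent <= context[o])
            found.add((extent, frozenset(intent)))
    return found

# Hypothetical context: segments described by their label terms.
context = {
    "seg1": {"draw", "rectangle", "figure"},
    "seg2": {"draw", "text", "figure"},
    "seg3": {"save", "file"},
}
for extent, intent in sorted(concepts(context), key=lambda c: len(c[0])):
    print(sorted(extent), sorted(intent))
```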

  8. SCAN Performances Evaluation
[Table: Segments characteristics.]
[Table: Participants characteristics.]

  9. SCAN Performances Evaluation: RQ1
- To reduce the participants' time and effort, segments are characterized using 5 and 15 different unique methods.
- Small version (5 methods): may result in a loss of relevant information.
- Medium version (15 methods): may better preserve the relevant information.
- Unique methods are selected according to their tf-idf values.

RQ1: How do the labels of the trace segments produced by the participants change when providing them different amounts of information?

  10. SCAN Performances Evaluation
RQ1: How do the labels of the trace segments produced by the participants change when providing them different amounts of information?

Experiment Design
- We select 9 segments (between 50 and 200).
- We group the participants into 3 groups; each version is assigned to a different group.
- Oracle: the labels of the full segments.
- Evaluation: the intersection between the terms of the small and medium versions and the terms of the full segment (see the sketch below).
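
A minimal sketch of how this evaluation could be computed, assuming "intersection" means the fraction of full-segment label terms that a small or medium version recovers; the metric, labels, and numbers are illustrative only and not taken from the study.

```python
def term_overlap(version_terms, full_terms):
    """Fraction of the full-segment label terms also present in a reduced version's label."""
    version, full = set(version_terms), set(full_terms)
    return len(version & full) / len(full) if full else 0.0

# Hypothetical labels produced by a participant for one segment.
full_label = ["draw", "rectangle", "figure", "canvas", "repaint"]
small_label = ["draw", "rectangle"]                        # from the 5-method version
medium_label = ["draw", "rectangle", "figure", "canvas"]   # from the 15-method version

print(term_overlap(small_label, full_label))   # 0.4
print(term_overlap(medium_label, full_label))  # 0.8
```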
