
  1. Presentation Notes, Paper T1. Thursday, October 29, 1998, 10:30 AM.
     Software Testing Turnovers
     Jeffrey Davis, VISA International
     International Conference On Software Testing, Analysis & Review, October 26-30, 1998, San Diego, CA

  2. Software Testing Turnovers. Jeffrey Davis, Visa International.

  3. Software Testing Turnovers
     • Where do they fit in the SDLC?
     • Levels of software testing turnover
     • What should the Test Analyst get out of the turnover in order to do the job?
     • How do we communicate with development?
     • What are the Test Analyst's deliverables?

  4. Where do Software Turnovers fit into the SDLC?
     • Before testing (recommended!)
     • After development coding changes?
     • In initial design stages?

  5. Levels of Software Testing Turnover
     • Informal
     • "Informal-tive"
     • Formal

  6. Informal
     • You've got mail
     • Check migration dates: something has changed
     • Roundtable topic at a meeting
     "Informal-tive"
     • e-Mail with attachments
     • Inbox / chair
     • Drop-in 5-minute "chat"

  7. Formal Test Turnover (Software Turnover Document)
     • List of program changes
     • Unit test results
     • Variances from detail design (change control)
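The three sections of the formal turnover document above lend themselves to a simple data model. The sketch below is hypothetical (the names `TurnoverDocument`, `ProgramChange`, and the acceptance rule are illustrative, not from the paper); it shows one way a test team might record a turnover and gate acceptance on unit test results.

```python
from dataclasses import dataclass, field

@dataclass
class ProgramChange:
    module: str        # program or module that was changed
    description: str   # what was changed

@dataclass
class TurnoverDocument:
    # The three sections named on the slide: program changes,
    # unit test results, and variances from the detail design.
    program_changes: list = field(default_factory=list)
    unit_test_results: dict = field(default_factory=dict)   # module -> unit test passed?
    design_variances: list = field(default_factory=list)    # change-control deviations

    def ready_for_test(self):
        """Accept the turnover only if every changed module has a passing unit test."""
        return all(self.unit_test_results.get(c.module, False)
                   for c in self.program_changes)

# Hypothetical module name "AUTH01" for illustration only.
doc = TurnoverDocument(
    program_changes=[ProgramChange("AUTH01", "new currency field")],
    unit_test_results={"AUTH01": True},
)
print(doc.ready_for_test())  # True
```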

  8. What do we need to get out of the Test Turnover?
     • Environmental definitions
     • Time constraints
     • Documentation
     • "Hot line" to development

  9. Avenues of Communication
     • Hey, "Psssst, buddy"
     • e-Mail
     • Meetings
     • Documents

  10. What do I need to do my job?
     • Design specifications (What did they do?)
     • Design specifications (Why did they do it?)
     • Design specifications (Is anything else affected?)
     • We need to evaluate ... design specifications

  11. Evaluate the Design Document
     • Does it address all the business requirements?
     • Does it list the business reason for the change?
     • Does it list all modules that require changing?
     • Does it list all new modules required?
     • Does it define the changes to each module?
     • Does it define how those changes affect the overall system?
     • Does it provide a timetable for software turnover to test?
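The checklist above is mechanical enough to encode as data. A minimal sketch, assuming one yes/no answer per question (the function and variable names are illustrative, not from the presentation):

```python
# The seven checklist questions from the slide, as data.
DESIGN_REVIEW_CHECKLIST = [
    "Addresses all the business requirements",
    "Lists the business reason for the change",
    "Lists all modules that require changing",
    "Lists all new modules required",
    "Defines the changes to each module",
    "Defines how those changes affect the overall system",
    "Provides a timetable for software turnover to test",
]

def review_design(answers):
    """Given one yes/no answer per checklist item, return the gaps to raise with development."""
    if len(answers) != len(DESIGN_REVIEW_CHECKLIST):
        raise ValueError("one answer per checklist item")
    return [item for item, ok in zip(DESIGN_REVIEW_CHECKLIST, answers) if not ok]

gaps = review_design([True, True, False, True, True, True, False])
print(gaps)
```

Each unmet item becomes a concrete question to put to development at turnover time.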

  12. Turnover Time: What questions do we ask?
     • Unit test results
     • Change control: differences between the design document and the implementation
     • Installation scripts, file conversions, environment changes

  13. What are the Test Deliverables?
     • Test Plan
       - Test analysis
       - Test validation approach
       - Test cases
       - Expected results
       - Sign-offs
       - Auditable
     • Test Incident Report
     • Findings / Risk Assessment
     • Post-Mortem

  14. Questions???

  15. Software Testing Turnovers
      Jeffrey S. Davis, Visa International
      jedavis@visa.com

      ABSTRACT
      As companies grow and become more sophisticated, they recognize the need for analysts, both business and QA, to support their Information System functions. Often the practice of software turnover to the QA team is a matter of informal notification. At best, this notification includes a module list, functional changes, and a deadline for moving the changes into production. This arrangement for turnover developed naturally because all organizations start with an IS department that has a minimal development staff. It should be the role of a proactive testing department to increase its own involvement in the development process, to define its participation in the life cycle process, and to manage this change.

      INTRODUCTION
      We find the roots of most test organizations planted firmly in the development department of most Information System organizations. This close relationship with the development organization can lead to problems as the organization grows. The development of formal test organizations requires a formal interface with development teams. We can label this interface Entry Criteria. In turn, the output from the test organization can be labeled Exit Criteria. "The quality and effectiveness of software testing are primarily determined by the quality of the test processes used." [1] The identification of Entry and Exit Criteria is a critical step in improving the test process.

      ENTRY CRITERIA
      Entry criteria are defined as "specific conditions or on-going activities that must be present before a process can begin." [2] The Systems Development Life Cycle specifies which entry criteria are required at each phase. It is also important to define the time interval, or required amount of lead time, by which each entry criteria item must be available to the process. Input can be divided into two categories. The first is what we receive from development. The second is what we produce ourselves that acts as input to later test process steps.

      [1] Kit, Edward, "Software Testing in the Real World", p. 3
      [2] Visa Software Project Life Cycle, p. B-2

  16. The required input from development includes:
      • Technical Requirements / Statement of Need
      • Design Document
      • Change Control
      • Turnover Document

      The required input from test includes:
      • Evaluation of available test tools
      • Test Strategy
      • Test Plan
      • Test Incident Reports

      By referencing the Entry/Exit Criteria matrix following the next section, you can see how the process uses these deliverables as entry criteria for later process steps. The matrix supplies a "date required" for each item. These dates are a reference point and should be modified to meet the specific goals and requirements of each test effort, based on its size and complexity.

      EXIT CRITERIA
      Exit Criteria are often viewed as a single document commemorating the end of a life cycle phase. Exit criteria are defined as "the specific conditions or on-going activities that must be present before a life cycle phase can be considered complete. The life cycle specifies which exit criteria are required at each phase." [3] This definition identifies the intermediate deliverables and allows us to track them as independent events.

      The output from test includes:
      • Test Strategy
      • Test Plan
      • Test Scripts / Test Case Specifications
      • Test Logs
      • Test Incident Report Log
      • Test Summary Report / Findings Report

      By identifying the specific exit criteria, we are able to identify and plan how these steps and processes fit into the life cycle. All of the exit criteria listed above, except the Test Summary/Findings Report, act as entry criteria to a later process. It is this level of process understanding that provides us with the tools we need to improve the overall test process.

      [3] Visa Software Project Life Cycle, p. B-2
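The core idea of this section, that one phase's exit criteria become a later phase's entry criteria, can be sketched as a small gating check. The phase and deliverable names below are simplified illustrations, not the paper's exact matrix entries:

```python
# Exit criteria produced by earlier phases (illustrative names).
EXIT_PRODUCES = {
    "Test Planning": ["Test Plan"],
    "Build Environments": ["Environment ready"],
    "Create Test Data": ["Functional test files"],
}

# Entry criteria required before a phase may begin (illustrative names).
ENTRY_REQUIRES = {
    "Test Execution": ["Test Plan", "Environment ready", "Functional test files"],
}

def phase_may_begin(phase, completed_phases):
    """A phase may begin only when every entry criterion has already been
    produced as the exit criterion of some completed phase."""
    available = {item for p in completed_phases for item in EXIT_PRODUCES.get(p, [])}
    missing = [c for c in ENTRY_REQUIRES.get(phase, []) if c not in available]
    return len(missing) == 0, missing

ok, missing = phase_may_begin("Test Execution", ["Test Planning", "Build Environments"])
print(ok, missing)  # test data not yet created, so execution may not begin
```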

  17. Entry/Exit Criteria Matrix
      The following matrix assumes that regression testing is completed prior to, and independently of, functional testing. This is consistent with large test efforts and Year 2000 testing.

      Phase I: Review and Forecast
      Input: 1. Publication of General/Detail Design Document by Development Group.
      Date Required: 1 week prior to start of test plan.
      Output: 1. Preliminary estimate. 2. Project plan. 3. Resource(s) assigned.

      Input: 1. Publication of Test Strategy Document by Development Group.
      Date Required: 2 weeks prior to start of functional testing.
      Output: 1. Requirements for environments and tools. 2. Test data requirements. 3. Information on interfaces in the environment section of the test plan.

      Phase II: Test Planning
      Input: 1. Turnover to Test Department.
      Date Required: 1 week prior to start of functional testing (coincides with the publication of the test strategy).
      Output: 1. Updated estimate. 2. Design review completed. 3. Revised project plan. 4. Beginning of project status reports.

      Input: 1. Verify regression test completed.
      Date Required: 1 week prior to start of functional testing.
      Output: 1. Request for any necessary baseline data from regression testing. 2. Review of TIRs written during regression testing.

      Input: 1. Assess tools requirements.
      Date Required: 2 weeks prior to start of functional testing.
      Output: 1. Submit requests to Test Tools group.

      Phase III: Test Design
      Input: 1. Test plan draft.
      Date Required: 1 week prior to start of functional testing.
      Output: 1. Draft test plan. 2. Updated project plan. 3. Current status report.

      Phase IV: Build Test Environments and Tools
      Input: 1. Environment set-up.
      Date Required: Start 1 week prior to start of functional testing.
      Output: 1. Environment ready for testing.

      Phase V: Create Test Data
      Input: 1. Test Plan.
      Date Required: Start of functional testing.
      Output: 1. Functional test files.

      Phase VI: Finalize Test Plan and Scripts
      Input: 1. Test plan walk-through. 2. Publish final draft of Test Plan.
      Date Required: 3 days prior to start of functional testing (walk-through); 1 day prior to start of functional testing (final draft).
      Output: 1. Final Test Plan. 2. Publish Test Plan. 3. Updated project plan.

      Phase VII: Test Execution
      Input: 1. Run test cases (functional testing).
      Date Required: Start of testing.
      Output: 1. Functionally tested software. 2. Log any DRs found.

  18. Phase VIII: Certification Turnover / Report Results
      Input: 1. Close Test Incident Reports (TIRs).
      Date Required: Prior to end of phase testing.
      Output: 1. Update of TIRs to closed status.

      Input: 2. Produce Findings Reports.
      Date Required: 2 weeks after end of phase testing.
      Output: 1. Findings report. 2. Publish Findings report.

      Input: 3. Compliance sign-off.
      Date Required: 2 weeks after end of Findings report.
      Output: 1. End of functional testing. 2. Publish compliance sign-off.
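The matrix's "date required" column expresses deadlines as lead times relative to the start of functional testing. A minimal sketch of turning a few of those lead times into calendar deadlines (the subset of items chosen here is illustrative, and the matrix itself notes the dates should be adjusted per test effort):

```python
from datetime import date, timedelta

# Lead times in days before the start of functional testing,
# taken from a few rows of the matrix above.
LEAD_TIMES = {
    "Test Strategy Document published": 14,
    "Turnover to Test Department": 7,
    "Test plan walk-through": 3,
    "Final Test Plan published": 1,
}

def due_dates(functional_test_start):
    """Translate each lead time into a concrete calendar deadline."""
    return {item: functional_test_start - timedelta(days=days)
            for item, days in LEAD_TIMES.items()}

schedule = due_dates(date(1998, 10, 26))
for item, due in sorted(schedule.items(), key=lambda kv: kv[1]):
    print(due.isoformat(), item)
```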
