

SLIDE 1

PRESENTATION

T1

Thursday, October 29, 1998 10:30AM

SOFTWARE TESTING TURNOVERS

Jeffrey Davis

VISA International

International Conference On

Software Testing, Analysis & Review

October 26-30, 1998

San Diego, CA


SLIDE 2

Software Test Turnovers 1

Software Testing Turnovers

Jeffrey Davis, Visa International

SLIDE 3


Software Testing Turnovers

  • Where do they fit in the SDLC?
  • Levels of Software Testing Turnover.
  • What should the Test Analyst get out of the turnover in order to do the job?
  • How do we communicate with development?

  • What are the Test Analyst's deliverables?
SLIDE 4


Where do Software Turnovers fit into the SDLC?

  • Before testing (recommended!)
  • After development coding changes?
  • In initial design stages?
SLIDE 5


Levels of Software Testing Turnover

  • Informal
  • Informal-tive
  • Formal
SLIDE 6


Informal

  • You've got Mail
  • Check Migration Dates - something has changed
  • Roundtable topic at a meeting

Informal-tive

  • e-Mail with attachments
  • Inbox / Chair
  • Drop-in 5 minute “Chat”
SLIDE 7


Formal Test Turnover (Software Turnover Document)

  • List of Program Changes
  • Unit Test Results
  • Variances from Detail Design (Change Control)
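The contents of a formal turnover document can be pictured as a simple structured record. The sketch below is a hypothetical illustration; the class, field names, and sample values are assumptions for this paper's three sections, not a prescribed Visa format:

```python
from dataclasses import dataclass


@dataclass
class TurnoverDocument:
    """Minimal sketch of a formal software turnover document.

    Field names are illustrative assumptions, not an official format.
    """
    program_changes: list      # names of changed programs/modules
    unit_test_results: dict    # program name -> unit test outcome
    design_variances: list     # change-control variances from detail design

    def is_complete(self) -> bool:
        # A formal turnover should list at least one change, and every
        # changed program should arrive with a unit test result.
        return bool(self.program_changes) and all(
            p in self.unit_test_results for p in self.program_changes)


# Hypothetical example turnover for two changed programs.
doc = TurnoverDocument(
    program_changes=["AUTH01", "SETTLE02"],
    unit_test_results={"AUTH01": "pass", "SETTLE02": "pass"},
    design_variances=["SETTLE02: record layout changed under change control"],
)
print(doc.is_complete())  # True
```

A turnover missing unit test results for a listed program would fail the same check, which is the point of making the turnover formal.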

SLIDE 8


What do we need to get out of the Test Turnover?

  • Environmental Definitions
  • Time Constraints
  • Documentation
  • “Hot-Line” to Development
SLIDE 9


Avenues of Communication

  • Hey “Psssst-Buddy”
  • e-Mail
  • Meetings
  • Documents
SLIDE 10


What do I need to do my job?

  • Design Specifications (What did they do?)
  • Design Specifications (Why did they do it?)
  • Design Specifications (Is anything else affected?)
  • We need to evaluate … Design Specifications
SLIDE 11


Evaluate the Design Document

  • Does it address all the business requirements?
  • Does it list the business reason for the change?
  • Does it list all modules that require changing?
  • Does it list all new modules required?
  • Does it define changes to each module?
  • Does it define how those changes affect the overall system?
  • Does it provide for a time-table for software turnover to test?
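The evaluation questions above lend themselves to a checklist. A minimal sketch, assuming a simple yes/no answer per question (the function and list names are illustrative, not from the paper):

```python
# The seven design-document evaluation questions, expressed as a checklist.
CHECKLIST = [
    "Addresses all the business requirements",
    "Lists the business reason for the change",
    "Lists all modules that require changing",
    "Lists all new modules required",
    "Defines the changes to each module",
    "Defines how those changes affect the overall system",
    "Provides a time-table for software turnover to test",
]


def evaluate(answers):
    """Given one yes/no answer per question, return the items the
    design document fails to address."""
    return [item for item, ok in zip(CHECKLIST, answers) if not ok]


# Example review: one question answered "no".
gaps = evaluate([True, True, True, True, True, False, True])
print(gaps)  # ['Defines how those changes affect the overall system']
```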

SLIDE 12


Turnover Time: What questions do we ask?

  • Unit Test results
  • Change Control
      – Difference between Design Document and implementation
  • Installation scripts
  • File conversions
  • Environment changes
SLIDE 13


What are the Test Deliverables?

  • Test Plan
      – Test Analysis
      – Test Validation Approach
      – Test Cases
      – Expected Results
      – Sign-Offs
      – Auditable
  • Test Incident Report
  • Findings / Risk Assessment
  • Post Mortem
SLIDE 14


Questions???

SLIDE 15

Software Testing Turnovers
Jeffrey S. Davis, Visa International
jedavis@visa.com

ABSTRACT

As companies grow and become more sophisticated, they recognize the need for analysts, both business and QA, to support their Information System functions. Often the practice of software turnover to the QA team is a matter of informal notification. This notification at best includes a module list, functional changes, and a deadline for moving the changes into production. This arrangement for turnover developed naturally because all organizations start with an IS department that has a minimal development staff. It should be the role of a proactive testing department to increase its own involvement in the development process, define its participation in the life cycle process, and manage this change.

INTRODUCTION

We find the roots of most test organizations planted firmly in the development department of most Information System organizations. This close relationship with the development organization can lead to problems as the organization grows. The development of formal test organizations requires a formal interface with development teams. We can label this interface as Entry Criteria. In addition, the output from the test organization can be labeled as Exit Criteria. “The quality and effectiveness of software testing are primarily determined by the quality of the test processes used”[1]. The identification of Entry and Exit Criteria is a critical step toward improving the test process.

ENTRY CRITERIA

Entry Criteria are defined as “Specific conditions or on-going activities that must be present before a process can begin”[2]. The Systems Development Life Cycle also specifies which entry criteria are required at each phase. Additionally, it is important to define the time interval, or required amount of lead time, for which an entry-criteria item must be available to the process. Input can be divided into two categories. The first is what we receive from development. The second is what we produce that acts as input to later test process steps.

[1] Kit, Edward, “Software Testing in the Real World”, p. 3
[2] Visa Software Project Life Cycle, p. B-2

SLIDE 16

The type of required input from development includes:

  • Technical Requirements/Statement of Need
  • Design Document
  • Change Control
  • Turnover Document

The type of required input from test includes:

  • Evaluation of available test tools
  • Test Strategy
  • Test Plan
  • Test Incident Reports

By referencing the Entry/Exit Criteria matrix following the next section, you can see how the process uses these deliverables as Entry Criteria for later process steps. The matrix also supplies a “date required” for each item. These dates are offered as a reference and should be modified to meet the specific goals and requirements of each test effort, based on its size and complexity.

EXIT CRITERIA

Exit Criteria is often viewed as a single document commemorating the end of a life cycle phase. Exit Criteria is defined as “The specific conditions or on-going activities that must be present before a life cycle phase can be considered complete. The life cycle specifies which exit criteria are required at each phase”[3]. This definition identifies the intermediate deliverables and allows us to track them as independent events. The type of output from test includes:

  • Test Strategy
  • Test Plan
  • Test Scripts/Test Case Specifications
  • Test Logs
  • Test Incident Report Log
  • Test Summary Report/Findings Report

By identifying the specific Exit Criteria, we are able to identify and plan how these steps and processes fit into the life cycle. All of the Exit Criteria listed above, except the Test Summary/Findings Report, act as Entry Criteria to a later process. It is this level of process understanding that provides us with the tools we need to improve the overall test process.

[3] Visa Software Project Life Cycle, p. B-2
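One way to picture this chaining of Exit Criteria into later Entry Criteria is as a small dependency map. The sketch below is a hypothetical illustration; the phase names and groupings are abridged assumptions, not the full matrix:

```python
# Minimal sketch: each phase's exit criteria re-enter the process as entry
# criteria for a later phase, except the final Test Summary/Findings Report.
PHASES = {
    "Test Planning":  {"entry": ["Test Strategy"], "exit": ["Test Plan"]},
    "Test Design":    {"entry": ["Test Plan"],     "exit": ["Test Scripts"]},
    "Test Execution": {"entry": ["Test Scripts"],
                       "exit": ["Test Logs", "Test Incident Report Log"]},
    "Report Results": {"entry": ["Test Incident Report Log"],
                       "exit": ["Test Summary Report"]},
}


def can_begin(phase, produced):
    """A phase may begin only once all of its entry criteria exist."""
    return all(item in produced for item in PHASES[phase]["entry"])


produced = {"Test Strategy"}
print(can_begin("Test Planning", produced))  # True
print(can_begin("Test Design", produced))    # False: Test Plan not yet produced
```

Tracking deliverables this way is what lets each one be treated as an independent, plannable event rather than a single end-of-phase document.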

SLIDE 17

Entry/Exit Criteria Matrix

The following matrix assumes that regression testing is completed prior to, and independent of, functional testing. This would be consistent with large test efforts and year 2000 testing.

Phase I: Review and Forecast

  Input (Entry): Publication of General/Detail Design Document by Development Group.
  Date Required: 1 week prior to start of test plan.
  Output (Exit): 1. Preliminary estimate. 2. Project plan. 3. Resource(s) assigned.

  Input (Entry): Publication of Test Strategy Document by Development Group.
  Date Required: 2 weeks prior to start of functional testing.
  Output (Exit): 1. Requirements for environments and tools. 2. Test data requirements. 3. Information on interfaces included in the environment section of the test plan.

Phase II: Test Planning

  Input (Entry): Turnover to Test Department.
  Date Required: 1 week prior to start of functional testing (coincides with the publication of the test strategy).
  Output (Exit): 1. Updated estimate. 2. Design review completed. 3. Revised project plan. 4. Beginning of project status reports.

  Input (Entry): Verify regression test completed.
  Date Required: 1 week prior to start of functional testing.
  Output (Exit): 1. Request for any necessary baseline data from regression testing. 2. Review of TIRs written during regression testing.

  Input (Entry): Assess tools requirements.
  Date Required: 2 weeks prior to start of functional testing.
  Output (Exit): Requests submitted to Test Tools group.

Phase III: Test Design

  Input (Entry): Test plan draft.
  Date Required: 1 week prior to start of functional testing.
  Output (Exit): 1. Draft test plan. 2. Updated project plan. 3. Current status report.

Phase IV: Build Test Environments and Tools

  Input (Entry): Environment set-up.
  Date Required: Start 1 week prior to start of functional testing.
  Output (Exit): Environment ready for testing.

Phase V: Create Test Data

  Input (Entry): Test Plan.
  Date Required: Start of functional testing.
  Output (Exit): Functional test files.

Phase VI: Finalize Test Plan and Scripts

  Input (Entry): 1. Test plan walk-through. 2. Publication of final draft of Test Plan.
  Date Required: 1. 3 days prior to start of functional testing. 2. 1 day prior to start of functional testing.
  Output (Exit): 1. Final Test Plan. 2. Test Plan published. 3. Updated project plan.

Phase VII: Test Execution

  Input (Entry): Run test cases (functional testing).
  Date Required: Start of testing.
  Output (Exit): Functionally tested software.
SLIDE 18

Phase VII: Test Execution (continued)

  Output (Exit): Log of any DRs found.

Phase VIII: Certification Turnover/Report Results

  Input (Entry): Close Test Incident Reports (TIRs).
  Date Required: Prior to end of phase testing.
  Output (Exit): TIRs updated to closed status.

  Input (Entry): Produce Findings Report.
  Date Required: 2 weeks after end of phase testing.
  Output (Exit): 1. Findings report. 2. Findings report published.

  Input (Entry): Compliance Sign-off.
  Date Required: 2 weeks after end of Findings report.
  Output (Exit): 1. End of functional testing. 2. Compliance Sign-off published.
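The lead times in the “Date Required” column become concrete dates once a start of functional testing is chosen. A minimal worked sketch, assuming an arbitrary start date (the date and item names are illustrative; the offsets come from the matrix):

```python
from datetime import date, timedelta

# "Date Required" offsets, in days relative to the start of functional testing.
OFFSETS_DAYS = {
    "Test Strategy published":     -14,  # 2 weeks prior
    "Turnover to Test Department":  -7,  # 1 week prior
    "Test plan walk-through":       -3,  # 3 days prior
    "Final draft of Test Plan":     -1,  # 1 day prior
}


def schedule(functional_test_start):
    """Map each entry-criteria item to its latest acceptable date."""
    return {item: functional_test_start + timedelta(days=offset)
            for item, offset in OFFSETS_DAYS.items()}


# Assumed start date, chosen only for illustration.
for item, due in schedule(date(1998, 11, 2)).items():
    print(due.isoformat(), item)
```

Laying the dates out this way makes it easy to re-derive the whole column when a test effort's size or complexity forces the start date to move.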
SLIDE 19

CONCLUSION

The refinement of the test process helps us to improve the overall test effort. It allows us to define re-usable test cases, to develop repeatable test results, and to better measure the effectiveness of our testing. As we improve the test process, we also improve how we test. The end result is an efficient, cost-effective, documented test process.

REFERENCES

  1. Visa Software Project Life Cycle
  2. IEEE, “Standard for Software Test Documentation”, IEEE Std 829-1983 (1991)
  3. Hetzel, Bill, “The Complete Guide to Software Testing”, 1988
  4. Kit, Edward, “Software Testing in the Real World”, 1996
SLIDE 20

Jeffrey S. Davis

Jeffrey Davis is a Project Leader for a Year 2000 Test Team at VISA International. He has been in the software development and test field for over 10 years, working in the financial and insurance industries. His accomplishments include developing and implementing a test process to reduce the risk associated with the payment of HMO claims. Jeffrey holds a Bachelor of Science and a Master of Science in Management Information Systems, both from California State University, Sacramento.