Software Quality Assurance, by Dr. Linda H. Rosenberg (PowerPoint presentation)




Slide 1 V&V 10/2002

Software Quality Assurance

  • Dr. Linda H. Rosenberg

Assistant Director for Information Sciences, Goddard Space Flight Center, NASA. 301-286-5710, Linda.Rosenberg@gsfc.nasa.gov

Slide 2

Agenda

  • Introduction
  • Defining Software Quality Assurance
  • Quality Assurance and Software Development
  • IV&V within SQA
  • Summary


Slide 3

Introduction

Slide 4

“Traditional” Development

[Diagram: quality squeezed between requirements and testing along the project timeline]

Slide 5

Results in

Slide 6

Quality Assurance

Slide 7

Why SOFTWARE Assurance

[Graphs: functionality vs. time and cost vs. time, comparing hardware and software]

Slide 8

Software Quality Assurance

IEEE 12207, Standard for Information Technology - Software Life Cycle Processes: "The quality assurance process is a process for providing adequate assurance that the software products and processes in the project life cycle conform to their specified requirements and adhere to their established plans."

IEEE 730, Quality Assurance Plans: "Quality Assurance - a planned and systematic pattern of all actions necessary to provide adequate confidence that the item or product conforms to established technical requirements."

Slide 9

Quality Attributes

Product Revision

  • Maintainability - Can I fix it?
  • Flexibility - Can I change it?
  • Testability - Can I test it?

Product Transition

  • Portability - Will I be able to use it on another machine?
  • Reusability - Will I be able to reuse some of the software?
  • Interoperability - Will I be able to interface it with another machine?

Product Operations

  • Correctness - Does it do what I want?
  • Reliability - Does it do it accurately all the time?
  • Efficiency - Will it run on my machine as well as it can?
  • Integrity - Is it secure?
  • Usability - Can I run it?

Slide 10

SQA Life Cycle

Concept/Requirements

  • Reviews (SCR, SRR)
  • Requirement trace
  • SW development plans
  • Define success criteria
  • Prototyping
  • Metrics
  • Safety considerations
  • IV&V

Design

  • Reviews (PDR, CDR)
  • Requirement trace
  • Support tools
  • Metrics
  • Safety considerations
  • IV&V

Development & Coding

  • Walkthroughs and reviews
  • Requirement trace
  • SW development folders
  • Capture deficiencies
  • Metrics
  • Safety considerations
  • IV&V

Test

  • Witnessing
  • Requirement trace
  • Monitoring
  • Reliability metrics
  • Metrics
  • Safety considerations
  • IV&V

Deployment

  • Capture anomalies
  • Report trending
  • Sustaining engineering
  • Metrics
  • Safety considerations
  • IV&V

Slide 11

SQA Across the Life Cycle

Phases: Concept/Requirements, Design, Development & Coding, Test, Deployment.

Cross-cutting throughout the life cycle: IV&V, Risk Management, Metrics, Safety, Reliability.

Slide 12

Why IV&V at NASA

[Image: Mars missions]

Slide 13

Independent Verification & Validation

Software IV&V is a systems engineering process employing rigorous methodologies for evaluating the correctness and quality of the software product throughout the software life cycle.

Independent

– Technical: IV&V prioritizes its own efforts
– Managerial: independent reporting route to Program Management
– Financial: budget is allocated by the program and controlled at a high level so that IV&V effectiveness is not compromised

Verification (Are we building the product right?) Validation (Are we building the right product?)

Slide 14

IV&V Approach

[Diagram: the traditional life cycle (Requirements, Design, Code, Test: Unit, Integration, Acceptance) shown with IV&V performing verification and validation in parallel across every phase; a clean-room variant is shown for comparison.]

Slide 15

IV&V Activities

Requirements Phase (Verify)

  • System Reqts Analysis
  • S/W Reqts Analysis
  • Interface Analysis
  • Process Analysis
  • Technical Reviews & Audits

Design Phase (Verify)

  • Design Analysis
  • Interface Analysis
  • Test Program Analysis
  • Supportability Analysis
  • Process Analysis
  • Technical Reviews & Audits

Code Phase (Verify)

  • Code Analysis
  • Test Program Analysis
  • Supportability Analysis
  • Process Analysis
  • Technical Reviews & Audits

Test Phase (Validate)

  • Test Program Analysis
  • Independent Test
  • Supportability Analysis
  • Technical Reviews & Audits

Activities spanning all phases: Catastrophic/Critical/High Risk Functions List, Traceability Analysis, Issues Tracking, Metrics Assessment, Loading Analysis, Change Impact Analysis, Special Studies.

Slide 16

Implementing IV&V at NASA

Slide 17

IV&V Criteria

IV&V is intended to mitigate risk:

  • Probability of an undesired event
  • Consequences if that event should occur

Risk = Probability * Consequence

∴ IV&V must be based on risk: probability and consequence
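The Risk = Probability * Consequence relation can be sketched numerically. The slides define only the relation, not the units, so the scales below (probability as a 0-1 likelihood, consequence on a 1-4 ordinal from Insignificant to Grave) are illustrative assumptions:

```python
# Illustrative sketch of Risk = Probability * Consequence.
# The numeric scales are assumed for demonstration, not taken
# from the slides.

def risk(probability: float, consequence: int) -> float:
    """probability: likelihood of the undesired event, 0.0-1.0 (assumed scale).
    consequence: severity, 1 (insignificant) to 4 (grave) (assumed scale)."""
    if not 0.0 <= probability <= 1.0:
        raise ValueError("probability must be in [0, 1]")
    if consequence not in (1, 2, 3, 4):
        raise ValueError("consequence must be 1-4")
    return probability * consequence

# A moderately likely event with grave consequences outranks a
# near-certain event with insignificant consequences:
assert risk(0.6, 4) > risk(0.9, 1)
```

The point of the product form is exactly this ranking behavior: neither a high probability nor a grave consequence alone drives the IV&V decision; their combination does.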

Slide 18

IV&V Probability Risk Factors

Factors that impact the difficulty of the development:
– Software Team Complexity
– Contractor Support
– Organization Complexity
– Schedule Pressure
– Process Maturity of Software Provider
– Degree of Innovation
– Level of Integration
– Requirement Maturity
– Software Lines of Code

Slide 19

IV&V Probability Risk Factors

Factors contributing to the probability of software failure (likelihood-of-failure rating 1 = lowest, 16 = highest; each rating is multiplied by the factor's weighting factor):

  • Software team complexity (×2): up to 5 people at one location (1); up to 10 people at one location (2); up to 20 people at one location, or 10 with external support (4); up to 50 people at one location, or 20 with external support (8); more than 50 people at one location, or 20 with external support (16)
  • Contractor support (×2): none; contractor with minor tasks; contractor with major tasks; contractor with major tasks critical to project success
  • Organization complexity* (×1): one location; two locations with the same reporting chain; multiple locations with the same reporting chain; multiple providers with a prime/sub relationship; multiple providers with an associate relationship
  • Schedule pressure** (×2): no deadline; negotiable deadline; non-negotiable deadline
  • Process maturity of software provider (×2): independent assessment at Capability Maturity Model (CMM) Level 4 or 5; independent assessment at CMM Level 3; independent assessment at CMM Level 2; CMM Level 1 with a record of repeated mission success; CMM Level 1 or equivalent
  • Degree of innovation (×1): proven and accepted; proven but new to the development organization; cutting edge
  • Level of integration (×2): simple, stand-alone; extensive integration required
  • Requirement maturity (×2): well-defined objectives with no unknowns; well-defined objectives with few unknowns; preliminary objectives; changing, ambiguous, or untestable objectives
  • Software lines of code*** (×2): less than 50K; over 500K; over 1000K

The un-weighted ratings are multiplied by their weighting factors and summed to give the total probability-of-failure score.

Table 1 Likelihood of Failures Based on Software Environment
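Table 1's scoring can be sketched as a weighted sum: each factor receives a likelihood rating (1, 2, 4, 8, or 16), that rating is multiplied by the factor's weighting factor (×1 or ×2), and the products are totaled. The weights below come from the table; the example ratings are hypothetical inputs, not from the slides:

```python
# Weighted likelihood-of-failure score per Table 1: each factor's
# rating (1, 2, 4, 8, or 16) is multiplied by its weighting factor
# and the results are summed. Example ratings are hypothetical.

WEIGHTS = {
    "software_team_complexity": 2,
    "contractor_support": 2,
    "organization_complexity": 1,
    "schedule_pressure": 2,
    "process_maturity": 2,
    "degree_of_innovation": 1,
    "level_of_integration": 2,
    "requirement_maturity": 2,
    "lines_of_code": 2,
}

VALID_RATINGS = {1, 2, 4, 8, 16}

def likelihood_score(ratings: dict) -> int:
    """Sum of rating * weight over all nine environment factors."""
    if set(ratings) != set(WEIGHTS):
        raise ValueError("must rate all nine factors")
    if any(r not in VALID_RATINGS for r in ratings.values()):
        raise ValueError("ratings must be 1, 2, 4, 8, or 16")
    return sum(r * WEIGHTS[f] for f, r in ratings.items())

# Hypothetical project: ~20-person team, major contractor tasks,
# two locations, negotiable deadline, CMM Level 2, proven-but-new
# technology, extensive integration, few unknowns, ~100K SLOC.
example = {
    "software_team_complexity": 4,
    "contractor_support": 4,
    "organization_complexity": 2,
    "schedule_pressure": 2,
    "process_maturity": 2,
    "degree_of_innovation": 2,
    "level_of_integration": 16,
    "requirement_maturity": 2,
    "lines_of_code": 2,
}
print(likelihood_score(example))  # prints 68
```

Note that with every factor at its lowest rating of 1, the total is simply the sum of the weights, 16, which matches the low end of the score scale used on the criteria-determination chart.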

Slide 20

Consequence Factors

  • Potential for loss of life
  • Potential for serious injury
  • Potential for catastrophic mission failure
  • Potential for partial mission failure
  • Potential for loss of equipment
  • Potential for waste of software resource investment
  • Potential for adverse visibility
  • Potential effect on routine operations

Consequence ratings: Grave, Substantial, Marginal, Insignificant.

Slide 21

Criteria Determination for IV&V

[Chart: consequence of software failure (Grave, Substantial, Marginal, Insignificant) plotted against the total likelihood-of-failure score from the software environment (scale 16 to 250, with gridlines at 32, 64, 96, and 128). High risk: IV&V required. Intermediate risk: evaluate for IV&V.]

Slide 22

Summary

Slide 23

SQA vs. IV&V

[Diagram: for Project X, SQA spans the whole project while IV&V concentrates on risk.]

∴ SQA ≠ IV&V

Slide 24

IV&V Benefits

Technical

  • Better software/system performance
  • Higher confidence in software reliability
  • Compliance between specs & code
  • Criteria for program acceptance

Management

  • Better visibility into development
  • Better decision criteria
  • Second-source technical alternative
  • Reduced maintenance cost
  • Reduced frequency of operational change

Slide 25

Conclusion

  • Applied early in the software development process, IV&V can reduce overall project cost.
  • NASA policy provides the management process for assuring that the right level of IV&V is applied.
  • IV&V Implementation Criteria provide a quantitative approach for determining the right level based on mission risk.
  • IV&V cannot replace quality assurance but must supplement it to be successful.
  • IV&V requires a strong quality assurance base.

Slide 26

References

IV&V Facility, Fairmont, WV
Director: Ned Keeler, nelson.keeler@ivv.nasa.gov
Deputy Director: Bill Jackson, William.L.Jackson@ivv.nasa.gov