Software Quality Assurance - Dr. Linda H. Rosenberg - PowerPoint PPT Presentation


  1. Software Quality Assurance. Dr. Linda H. Rosenberg, Assistant Director for Information Sciences, Goddard Space Flight Center, NASA. 301-286-5710, Linda.Rosenberg@gsfc.nasa.gov

  2. Agenda: • Introduction • Defining Software Quality Assurance • Quality Assurance and Software Development • IV&V within SQA • Summary

  3. Introduction

  4. “Traditional” Development. [Figure: chart with TIME on one axis and QUALITY on the other, marking REQUIREMENTS at the start and TESTING at the end of the life cycle.]

  5. Results in ... [image-only slide]

  6. Quality Assurance

  7. Why SOFTWARE Assurance. [Figures: functionality vs. time and cost vs. time, comparing software and hardware.]

  8. Software Quality Assurance. IEEE 12207, Standard for Information Technology - Software Life Cycle Processes: “The quality assurance process is a process for providing adequate assurance that the software products and processes in the project life cycle conform to their specified requirements and adhere to their established plans.” IEEE 730, Quality Assurance Plans: “Quality Assurance - a planned and systematic pattern of all actions necessary to provide adequate confidence that the item or product conforms to established technical requirements.”

  9. Quality Attributes.
      Product Operations: Correctness - Does it do what I want? Reliability - Does it do it accurately all the time? Efficiency - Will it run on my machine as well as it can? Integrity - Is it secure? Usability - Can I run it?
      Product Revision: Maintainability - Can I fix it? Flexibility - Can I change it? Testability - Can I test it?
      Product Transition: Portability - Will I be able to use it on another machine? Reusability - Will I be able to reuse some of the software? Interoperability - Will I be able to interface it with another machine?
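      For teams that want to track these attributes during reviews, here is a minimal sketch of the slide's grouping as a data structure. The grouping follows the slide; the dictionary name, helper function, and output format are illustrative, hypothetical choices, not part of the presentation.

          # The slide's three attribute groups captured as a simple checklist
          # structure; the names below are illustrative, not from the presentation.
          QUALITY_ATTRIBUTES = {
              "Product Operations": ["Correctness", "Reliability", "Efficiency", "Integrity", "Usability"],
              "Product Revision": ["Maintainability", "Flexibility", "Testability"],
              "Product Transition": ["Portability", "Reusability", "Interoperability"],
          }

          def review_checklist() -> list:
              """Flatten the groups into (group, attribute) pairs for a review form."""
              return [(group, attr) for group, attrs in QUALITY_ATTRIBUTES.items() for attr in attrs]

          for group, attr in review_checklist():
              print(f"[{group}] {attr}: reviewed? y/n")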

  10. SQA Life Cycle. SQA activities span the phases from Concept/Requirements through Design, Devel. & Coding, Test, and Deployment into sustaining engineering. They include reviews (SCR, SRR, PDR, CDR), requirement tracing, SW development plans, defining success criteria, prototyping, support tools, walkthroughs and reviews, SW development folders, capturing deficiencies, test witnessing and monitoring, reliability metrics, capturing anomalies, and report trending, with metrics, safety considerations, and IV&V applied at every phase.

  11. SQA Across the Life Cycle. [Figure: life-cycle phases (Concept/Requirements, Design, Devel. & Coding, Test, Deployment) with IV&V, Safety, Reliability, Metrics, and Risk Management running across all phases.]

  12. Why IV&V at NASA: MARS

  13. Independent Verification & Validation. Software IV&V is a systems engineering process employing rigorous methodologies for evaluating the correctness and quality of the software product throughout the software life cycle. Independent means: Technical (IV&V prioritizes its own efforts), Managerial (an independent reporting route to Program Management), and Financial (the budget is allocated by the program and controlled at a high level so that IV&V effectiveness is not compromised). Verification: are we building the product right? Validation: are we building the right product?

  14. IV&V Approach. [Diagrams comparing where verification & validation occurs around the Req, Design, Code, and Test (unit, integration, acceptance) activities in traditional software development, in the Clean Room approach, and in an IV&V implementation.]

  15. IV&V Activities.
      Requirements Phase: system reqts analysis, S/W reqts analysis. Design Phase: design analysis. Code Phase: code analysis. Test Phase: independent test analysis. Interface analysis, test program analysis, supportability analysis, process analysis, and technical reviews & audits are performed in the relevant phases; the Requirements, Design, and Code phase activities verify, while the Test phase activities validate.
      Across all phases: catastrophic/critical/high risk functions list, traceability analysis, issues tracking, metrics assessment, loading analysis, change impact analysis, special studies.

  16. Implementing IV&V at NASA

  17. IV&V Criteria. IV&V is intended to mitigate risk: the probability of an undesired event together with the consequences if that event should occur. Risk = Probability * Consequence. ∴ IV&V must be based on risk, i.e. on both probability and consequence.
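      To make the relationship concrete, a minimal sketch follows, assuming nothing beyond the slide's formula; the numeric inputs and the function name are illustrative, not part of the presentation.

          # Minimal sketch of the slide's risk relationship (illustrative values only).
          # A low-probability but grave-consequence event can carry as much risk as a
          # high-probability but minor-consequence one, which is why the IV&V criteria
          # must weigh both factors.
          def risk(probability: float, consequence: float) -> float:
              """Risk = Probability * Consequence."""
              return probability * consequence

          print(risk(0.1, 100))  # unlikely, grave consequence   -> 10.0
          print(risk(0.5, 20))   # likely, moderate consequence  -> 10.0
          print(risk(0.9, 1))    # very likely, insignificant    -> 0.9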

  18. IV&V Probability Risk Factors. Factors that impact the difficulty of the development: software team complexity; contractor support; organization complexity; schedule pressure; process maturity of the software provider; degree of innovation; level of integration; requirement maturity; software lines of code.

  19. IV&V Probability Risk Factors (Table 1: Likelihood of Failure Based on Software Environment). Each factor receives an un-weighted probability-of-failure score of 1, 2, 4, 8, or 16 (lowest to highest risk), which is multiplied by the factor's weighting; the weighted scores are summed to give the total likelihood of software failure.
      Software team complexity (weight x2): up to 5 people at one location; up to 10 people at one location; up to 20 people at one location or 10 people with external support; up to 50 people at one location or 20 people with external support; more than 50 people at one location or 20 people with external support.
      Contractor support (x2): none; contractor with minor tasks; contractor with major tasks; contractor with major tasks critical to project success.
      Organization complexity (x1): one location; two locations but same reporting chain; multiple locations but same reporting chain; multiple providers with prime/sub relationship; multiple providers with associate relationship.
      Schedule pressure (x2): no deadline; deadline is negotiable; non-negotiable deadline.
      Process maturity of software provider (x2): independent assessment of Capability Maturity Model (CMM) Level 4 or 5; independent assessment of CMM Level 3; independent assessment of CMM Level 2; CMM Level 1 with record of repeated mission success; CMM Level 1 or equivalent.
      Degree of innovation (x1): proven and accepted; proven but new to the organization; cutting-edge development.
      Level of integration (x2): simple, stand-alone; extensive integration required.
      Requirement maturity (x2): well defined objectives, no unknowns; well defined objectives, few unknowns; preliminary objectives; changing, ambiguous, or untestable objectives.
      Software lines of code (x2): less than 50K; over 500K; over 1000K.
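      The arithmetic behind Table 1 is a simple weighted sum. Below is a minimal sketch assuming only the weights listed on the slide; the dictionary keys, function name, and example scores are illustrative choices, not part of the presentation. With these weights the total runs from 16 (every factor scored 1) to 256 (every factor scored 16), roughly the 16-250 range used on the criteria chart of slide 21.

          # Minimal sketch of the Table 1 scoring; key names, function name, and the
          # example project's scores are illustrative, only the weights come from the slide.
          WEIGHTS = {
              "software_team_complexity": 2,
              "contractor_support": 2,
              "organization_complexity": 1,
              "schedule_pressure": 2,
              "process_maturity_of_provider": 2,
              "degree_of_innovation": 1,
              "level_of_integration": 2,
              "requirement_maturity": 2,
              "software_lines_of_code": 2,
          }

          ALLOWED_SCORES = {1, 2, 4, 8, 16}  # un-weighted probability-of-failure scores

          def total_likelihood(scores: dict) -> int:
              """Sum of weight * un-weighted score over all nine factors."""
              total = 0
              for factor, weight in WEIGHTS.items():
                  score = scores[factor]
                  if score not in ALLOWED_SCORES:
                      raise ValueError(f"{factor}: score must be one of {sorted(ALLOWED_SCORES)}")
                  total += weight * score
              return total

          # Hypothetical mid-sized in-house project (scores chosen for illustration).
          example = {
              "software_team_complexity": 4,
              "contractor_support": 1,
              "organization_complexity": 2,
              "schedule_pressure": 8,
              "process_maturity_of_provider": 2,
              "degree_of_innovation": 2,
              "level_of_integration": 16,
              "requirement_maturity": 4,
              "software_lines_of_code": 1,
          }
          print(total_likelihood(example))  # 76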

  20. Consequence Factors (GRAVE, SUBSTANTIAL, MARGINAL, INSIGNIFICANT):
      • Potential for loss of life
      • Potential for serious injury
      • Potential for catastrophic mission failure
      • Potential for partial mission failure
      • Potential for loss of equipment
      • Potential for waste of software resource investment
      • Potential for adverse visibility
      • Potential effect on routine operations

  21. Criteria Determination for IV&V. [Chart: consequence of software failure (Grave, Substantial, Marginal, Insignificant) plotted against the total likelihood of failure based on the software environment (scale marked 16, 32, 64, 96, 128, 250); regions indicate High Risk - IV&V Required and Intermediate Risk - Evaluate for IV&V.]
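      As a sketch of how the chart could be applied programmatically: the consequence level selects the score threshold above which IV&V is required or should be evaluated. The exact cutoffs are not legible in this transcript, so the numbers below are hypothetical placeholders, as are the function and constant names.

          # Hedged sketch of the criteria chart; the threshold numbers are hypothetical
          # placeholders, NOT the values from the original figure.
          REQUIRED_ABOVE = {          # total score at or above which IV&V is required
              "grave": 32,
              "substantial": 64,
              "marginal": 96,
              "insignificant": None,  # chart does not mandate IV&V at this level
          }
          EVALUATE_ABOVE = {          # total score at or above which to evaluate for IV&V
              "grave": 16,
              "substantial": 32,
              "marginal": 64,
              "insignificant": 128,
          }

          def ivv_determination(consequence: str, total_score: int) -> str:
              required = REQUIRED_ABOVE[consequence]
              if required is not None and total_score >= required:
                  return "High risk: IV&V required"
              if total_score >= EVALUATE_ABOVE[consequence]:
                  return "Intermediate risk: evaluate for IV&V"
              return "Below the IV&V criteria thresholds"

          print(ivv_determination("grave", 76))     # High risk: IV&V required
          print(ivv_determination("marginal", 76))  # Intermediate risk: evaluate for IV&V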

  22. Summary

  23. SQA vs. IV&V. [Diagram relating Project X, SQA, IV&V, and risk.] ∴ SQA ≠ IV&V

  24. IV&V Benefits.
      Management: better visibility into development; better decision criteria; second source technical alternative; criteria for program acceptance.
      Technical: better software/system performance; higher confidence in software reliability; compliance between specs & code; reduced maintenance cost; reduced frequency of operational change.

  25. Conclusion
      • Applied early in the software development process, IV&V can reduce overall project cost.
      • NASA policy provides the management process for assuring that the right level of IV&V is applied.
      • IV&V Implementation Criteria provide a quantitative approach for determining the right level based on mission risk.
      • IV&V CANNOT replace quality assurance but must supplement it to be successful.
      • IV&V requires a strong quality assurance base.

  26. References. IV&V Facility, Fairmont, WV. Director: Ned Keeler, nelson.keeler@ivv.nasa.gov. Deputy Director: Bill Jackson, William.L.Jackson@ivv.nasa.gov
