Business Value and Customer Benefits Derived from High Maturity



  1. CMMI(SM) Technology Conference and User Group, November 2002
     Business Value and Customer Benefits Derived from High Maturity
     Alan Pflugrad, Northrop Grumman Information Technology, Defense Enterprise Solutions (DES)
     Chair, DES Engineering Process Group; Executive Manager, Systems and Process Engineering
     Email: apflugrad@northropgrumman.com  Phone: (703) 883-5128

  2. Discussion Purpose and Agenda
     • Purpose:
       – Communicate the business value and customer benefits derived from an application of "high maturity" system/software engineering processes, and
       – How an integrated process framework helps
     • Discussion Agenda
       – Business Value/Customer Benefits & Process Highlights
       – Quality and Process Goals
       – Quality and Process Performance
       – Process Highlights
       – Integrated Process Improvement (CMMI)
     • Limit – 40 minutes including questions

  3. Organizational and Project Quantitative Management Process Overview
     Driven by the DES Business Objectives:
     1. DES management selects quality and process goals & measurements (process capability baseline).
     2. Projects select related goals & measurements for each life cycle phase.
     3. Projects track process performance over time.
     4. Projects improve performance by removing root causes for out-of-bound conditions.
     5. Projects check performance against project goals and business objectives.
     6. DES management checks org and project data against DES goals.

  4. DES Process and Quality Measures

     Acronym   Measurement                                    Process
     CPIm      Cost Performance Index, monthly                Earned Value System
     SPIm      Schedule Performance Index, monthly            Earned Value System
     EPVPm     ETC Performance Variance Percentage, monthly   Earned Value System or other financial process
     DDr       Defect Density from Peer Review                Peer Review (all Life Cycle Stages)
     DDt       Defect Density from Test & Operations          Test
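As an illustration of how measures like these are typically computed, here is a minimal sketch using standard earned-value and defect-density formulas; the function names, inputs, and the reading of EPVPm are assumptions for illustration, not DES tooling.

```python
# Sketch of the measures above, using standard earned-value and defect-density
# formulas. Names, inputs, and the EPVPm definition are illustrative.

def cpi(bcwp: float, acwp: float) -> float:
    """Cost Performance Index (monthly): earned value (BCWP) / actual cost (ACWP)."""
    return bcwp / acwp

def spi(bcwp: float, bcws: float) -> float:
    """Schedule Performance Index (monthly): earned value (BCWP) / planned value (BCWS)."""
    return bcwp / bcws

def etc_variance_pct(current_etc: float, baseline_etc: float) -> float:
    """Assumed reading of EPVPm: percentage change of the estimate-to-complete
    against its baseline value."""
    return 100.0 * (current_etc - baseline_etc) / baseline_etc

def defect_density(defects: int, size_ksloc: float) -> float:
    """Defect density (DDr or DDt) in defects per KSLOC."""
    return defects / size_ksloc

# Example: a month with BCWP=950, ACWP=970, BCWS=1000 gives CPI ~0.98 and SPI 0.95.
print(round(cpi(950, 970), 3), round(spi(950, 1000), 3))
```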

  5. Process/Quality Improvements support Organizational Business Objectives

     DES Business Objectives (Annual Operating Plan) map to DES Process & Quality Performance Goals (collective across participating projects):

     Achieve revenue and margin objectives:
       1. Achieve Cost Performance Index = 1 ± 5%.
       2. Achieve Schedule Performance Index = 1 ± 5%.
       3. Achieve Est-To-Complete Variance = 0 ± 5%.
       4. Achieve 5% improvement in Defect Density for each life cycle phase.

     Improve customer satisfaction rating:
       1. Achieve Cost Performance Index = 1 ± 5%.
       2. Achieve Schedule Performance Index = 1 ± 5%.
       3. Achieve Est-To-Complete Variance = 0 ± 5%.
       4. Achieve 5% improvement in Defect Density for each life cycle phase.
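A minimal sketch of how a project-level check against these goal bands might look; the targets and tolerances mirror the slide, but the data layout and the monthly values are illustrative assumptions.

```python
# Sketch: flag monthly measures that fall outside the DES goal bands above.
# Targets and tolerances come from the slide; layout and values are illustrative.

GOALS = {
    "CPIm":  (1.0, 0.05),  # Cost Performance Index: 1 +/- 5%
    "SPIm":  (1.0, 0.05),  # Schedule Performance Index: 1 +/- 5%
    "EPVPm": (0.0, 5.0),   # ETC variance: 0 +/- 5 percentage points
}

def out_of_bounds(measure: str, actual: float) -> bool:
    target, tolerance = GOALS[measure]
    return abs(actual - target) > tolerance

monthly = {"CPIm": 0.98, "SPIm": 0.91, "EPVPm": 2.0}
flags = {m: out_of_bounds(m, v) for m, v in monthly.items()}
print(flags)  # SPIm falls outside 1 +/- 5% and would trigger root cause analysis
```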

  6. Optimizing Process Strategy Overview
     [Diagram] Peer review and test defect data feed a defect Pareto analysis; the resulting defect categories point to common causes, which undergo root cause analysis; removing root causes from the process prevents defects. The resulting process changes, together with technology and process innovations, feed the organization-defined process and each project-tailored process.
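The Pareto step in this strategy can be sketched as follows; the defect categories and counts are hypothetical, not DES data. The point is that the few categories accounting for roughly 80% of defects are the ones carried forward into root cause analysis.

```python
# Sketch of a defect Pareto analysis: rank peer-review defect categories and
# keep the "vital few" that account for ~80% of defects. Category names and
# counts are invented for illustration.
from collections import Counter

defect_log = (["interface"] * 42 + ["requirements"] * 31 + ["logic"] * 15 +
              ["documentation"] * 8 + ["standards"] * 4)

counts = Counter(defect_log)
total = sum(counts.values())

cumulative = 0
vital_few = []
for category, n in counts.most_common():
    cumulative += n
    vital_few.append((category, n, round(100 * cumulative / total, 1)))
    if cumulative / total >= 0.8:
        break

# The categories selected here become candidates for root cause analysis.
print(vital_few)
```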

  7. SATS/SIGS Program and QM Indicators

     Defect Density at Review (all defects) — Technical (status S)
     • Goal: 20 +/- 5 defects/KLoC
     • Actual: 22.9 defects/KLoC
     • Action: Implementing DDt
     • Technical Highlights: Only 2% of all defects are found in the fielded system

     Cost Performance Index (monthly chart, Mar-00 through May-02) — Financial (status S)
     • Goal: 1.0 +/- 0.1
     • Actual: 0.98
     • Action: DP cycle for SCoV in April; countermeasures – improve estimation; change EV tracking
     • Technical Highlights: CPI is still on target

  8. SATS/SIGS Program and QM Indicators

     Schedule Performance Index (monthly chart, Mar-00 through May-02) — Schedule (status M)
     • Goal: 1.0 +/- 0.1
     • Actual: 0.975
     • Action: Watching closely; DP cycle for SCoV in April; countermeasures – improve estimation; change EV tracking
     • Technical Highlights: Will be Satisfactory by 7/02

     Customer Satisfaction Award Fee Scores — Customer Satisfaction (status E)
     • Goal: >= 95%
     • Actual: 98.8%
     • Action: Continue to deliver
     • Technical Highlights: Customer is very flexible due to track record

  9. Controlling Quality Performance — Build
     [Chart] AWIPS CM Build Defect Density, AWIPS Release 5.1.1 — errors per file by build, plotted against the release average, mean, and upper/lower control limits (UCL/LCL).
     Statistical process control identifies build issues that can impact the development schedule.
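As a rough sketch of the control-chart mechanics behind this chart: compute a center line and control limits from per-build defect densities and flag any build outside the limits. Simple 3-sigma limits are assumed here for illustration; the AWIPS chart's UCL/LCL may be derived differently, and the input values below are invented.

```python
# Sketch: center line and simple 3-sigma control limits for per-build defect
# density, flagging any build outside the limits. Values are invented.
from statistics import mean, stdev

build_dd = [0.00012, 0.00009, 0.00015, 0.00031, 0.00011, 0.00010, 0.00014]

center = mean(build_dd)
sigma = stdev(build_dd)
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # a negative LCL is usually treated as 0

flagged = [(i, dd) for i, dd in enumerate(build_dd, start=1) if dd > ucl or dd < lcl]
print(f"mean={center:.5f}  UCL={ucl:.5f}  LCL={lcl:.5f}  flagged={flagged}")
```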

  10. Predicting Quality – Example
     [Chart] AWIPS Release 4.2 discrepancy reports (DRs) from 6/98 through 9/99, plotted against a Rayleigh model prediction (y-axis 0–400 DRs).
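The Rayleigh model referenced here is the standard defect-discovery curve: cumulative DRs by month t follow K·(1 − exp(−t²/(2·t_d²))), where K is the total expected DR count and t_d is the month of peak discovery. A minimal sketch follows; the parameter values are illustrative, not the actual AWIPS 4.2 fit.

```python
# Sketch of the Rayleigh defect-discovery model used for predicting quality:
#   cumulative DRs by month t:  F(t) = K * (1 - exp(-t**2 / (2 * td**2)))
#   DR discovery rate:          f(t) = (K * t / td**2) * exp(-t**2 / (2 * td**2))
# K = total expected DRs, td = month of peak discovery. Values are illustrative.
import math

def rayleigh_cumulative(t: float, K: float, td: float) -> float:
    return K * (1.0 - math.exp(-t * t / (2.0 * td * td)))

def rayleigh_rate(t: float, K: float, td: float) -> float:
    return (K * t / (td * td)) * math.exp(-t * t / (2.0 * td * td))

K, td = 400.0, 6.0   # e.g. ~400 total DRs with peak discovery around month 6
for month in range(1, 16):
    print(month, round(rayleigh_cumulative(month, K, td)),
          round(rayleigh_rate(month, K, td), 1))
```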

  11. Controlling Process Performance
     • Cost and schedule can be managed with statistical process control
     • Improves predictions of future performance
     • Results:
       – Build 4: 2% underrun
       – R5.0: 4% underrun
       – R5.1: 5% underrun
       – Build 5 variance in last 12 months: 10%
     [Charts] AWIPS Release 5.1 Monthly CPIm X Chart (Dec-2000 through Dec-2001) and AWIPS Release 5.2 Monthly SPIm X Chart (Aug-2001 through Dec-2001), each showing the monthly value, mean, natural process limits (UNPLX/LNPLX), and 2-sigma and 1.5-sigma bands.
     Statistical process control improves cost & schedule performance.
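The UNPLX/LNPLX lines on these X charts correspond to the natural process limits of an individuals (XmR) chart, conventionally taken as the mean plus or minus 2.66 times the average moving range. A minimal sketch with invented CPIm values:

```python
# Sketch of XmR (individuals and moving range) natural process limits for
# monthly CPIm, the conventional basis for the UNPLX/LNPLX lines on an X chart.
# The CPIm values below are invented, not actual AWIPS data.
from statistics import mean

cpim = [1.02, 0.97, 1.05, 0.99, 1.01, 0.96, 1.03, 1.00, 0.98, 1.04, 1.01, 0.99]

center = mean(cpim)
moving_ranges = [abs(b - a) for a, b in zip(cpim, cpim[1:])]
mr_bar = mean(moving_ranges)

# 2.66 = 3 / d2, with d2 = 1.128 for moving ranges of size 2.
unpl = center + 2.66 * mr_bar
lnpl = center - 2.66 * mr_bar

signals = [(i, x) for i, x in enumerate(cpim, start=1) if x > unpl or x < lnpl]
print(f"mean={center:.3f}  UNPL={unpl:.3f}  LNPL={lnpl:.3f}  signals={signals}")
```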

  12. JEDMICS Defect Density & Customer Satisfaction Survey
     [Chart] 1998–2001 trend of Averaged Customer Satisfaction (left axis, roughly 3.80–4.60) against Defect Density (Test) and Defect Density (Operations) (right axis, 0–0.08).

  13. Quality Improvement Realized
     [Chart] Mean Defect Density by Life Cycle Phase (Analysis, PDesign, CDesign, Code, Test, Ops), in defects per page or KSLOC, comparing the PRC 9909 and PRC 9912 baselines; lower is better, and density falls to .05 defects/KSLOC in operations.

  14. Process Implementation Support – Best Practice
     [Diagram] Elements supporting each process and asset:
     • Support – Process Champion; Internal Consultant; Subject Matter Experts; One per Domain
     • Process/Asset Tailoring – Process; Product; Shows variations in each process
     • Process Integration – Links to other KPAs/PAs; Links to SIM checklists; Links to other processes
     • Guidance – Expert knowledge; Templates; Samples; Output per process/asset; Work products
     • Roles & Responsibilities
     • Processes
     • Policy & Requirements – SWCMM; SECMM; ISO; Customer standards; Policy statements
     • Verifications – Quality Assurance; Audits
     • Rollout Plans – Proposals; Pilot projects; Startups; Ongoing projects
     • Training – Corporate; OJT; Customer; Internal
     • Metrics – Corporate; Support
     • Tools – COTS & "Glue"; Compatible formats

  15. Core Processes Common to Multiple Disciplines
     [Diagram] CMMI core processes are shared across the SW discipline (SW CMM), SE discipline (SE CMM), IPD discipline (IPPD CMM), and other CMMs, supporting information technology products and services in constant change: IT Consulting, Sys Arch, Engin & Delivery, Enterprise Integration, Data Center Operation, IT Infrastructure Management, Applications Management, SETA, and Functional Process Outsourcing.

  16. Context: Acquisition/Development Space
     [2x2 matrix: acquirer maturity increasing on the vertical axis, developer process maturity increasing on the horizontal axis]
     • Mismatch (mature acquirer, low maturity developer): Mature buyer must mentor low maturity developer; outcome not predictable
     • Matched Team (mature acquirer, mature developer): Match of maturity; team risk approach; execution to plan; measurable performance; predictable results
     • Disaster (low maturity acquirer and developer): No discipline; no process; ad hoc; crisis management
     • Mismatch (low maturity acquirer, mature developer): "Customer is always right"; customer encourages "short cuts"; outcome not predictable

  17. Why the CMMI Fits
     [Diagram, courtesy Mitre/Mike Bloom] Acquisition activities – mission area planning, budgeting and priorities, requirements definition (with directives and constraints), mission shortfalls, contracting activity, program management, assessment & certification, and outcome & feedback on system product deliveries and deficiencies – map onto CMMI process areas:
     • Front-end and planning: Requirements Development; Decision Analysis and Resolution; Supplier Agreement Management; Project Planning; Requirements Management
     • Project management and control: Integrated Project Management; Risk Management; Project Monitoring and Control; Configuration Management; Product Quality Assurance
     • Technical execution: Technical Solution; Product Integration; Verification; Validation; Measurement and Analysis; Causal Analysis and Resolution
     • Organizational process management (process maturation): Process Focus; Process Definition; Training; Process Performance; Quantitative Management; Innovation and Deployment
