Agile Software Development Cost Modeling for the US DoD


  1. Agile Software Development Cost Modeling for the US DoD. Wilson Rosa, Naval Center for Cost Analysis; Ray Madachy, Naval Postgraduate School; Bradford Clark, Software Metrics, Inc.; Barry Boehm, University of Southern California. SEI Software and Cyber Solutions Symposium, March 27, 2018

  2. A Short History of Software Estimation Accuracy • [Figure: relative productivity and estimation error plotted against time and domain understanding, moving from unprecedented (A) and precedented systems through component-based, COTS, and Agile development to SoS, apps, widgets, clouds, security, and MBSSE (D), with IDPD effects along the way] • IDPD: Incremental Development Productivity Decline • MBSSE: Model-Based Systems and Software Engineering • COTS: Commercial Off-the-Shelf • SoS: Systems of Systems

  3. Problem Statement • In DoD, popular size measures are often not available for Agile effort estimation in the early phase: Function Points (FP), COSMIC FP, Story Points, Source Lines of Code • No publicized, empirical Agile effort estimation models

  4. Purpose • Publish Agile effort estimation models for crosschecking contractor cost proposals and validating independent government cost estimates • Examine the validity of using Initial Software Requirements as a proxy size measure • Develop useful cost models using early-phase information • Model calibration comparison

  5. Outline • Experimental Design • Dataset Demographics • Productivity Benchmarks • Agile Effort Estimation Models • Conclusion

  6. Experimental Design

  7. Primary Data Collection Form • 2011 Software Resource Data Report (SRDR), DD Form 2630 • SRDR Initial Developer Report (Sample Format 2): due 60 days after contract award and 60 days after start of any release or build • SRDR Final Developer Report (Sample Format 3): due 60 days after final software delivery and 60 days after delivery of any release or build • [Figure: side-by-side images of the Initial and Final report forms; the fields drawn on in this study are Estimated Functional Requirements, Estimated External Interfaces, Estimated Peak Staff, Application Domain, Actual Development Effort, and Actual Development Process]

  8. Population and Sample Size • Empirical data from 20 recent US DoD Agile programs: 12 paired SRDRs from the Cost Assessment Data Enterprise (CADE), where each pair includes an SRDR Initial Developer Report (estimates) and an SRDR Final Developer Report (actuals) (http://dcarc.cape.osd.mil/Default.aspx); 4 additional SRDRs from CADE (SRDR Final only); 4 Agile projects from a proprietary source • 20 Agile projects analyzed in this study

  9. Data Normalization and Analysis Workflow • Dataset normalized to "account for sizing units, application complexity, and content so they are consistent for comparisons" (source: GAO) • Workflow: Counting Software Requirements → Grouping Dataset by Super Domain → Variable Selection → Regression Analysis → Model Selection

  10. Counting Software Requirements (a counting sketch follows below)
      • Initial Functional Requirements*: measured as "shall" statements contained in the baseline Software Requirements Specification (SRS); source: SRDR Initial Report
      • Initial External Interfaces*: measured as "shall" statements contained in the baseline Interface Requirements Specification (IRS); source: SRDR Initial Report
      • Together these form the Initial Software Requirements size measure
      *Typically available before contract award; definitions align with IEEE Std 830-1998
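The counting convention above is mechanical enough to automate. Below is a minimal sketch, assuming the baseline SRS and IRS are available as plain text and that each requirement is a sentence containing the word "shall"; the filenames and the closing sum (Initial Software Requirements as the total of the two counts) are illustrative assumptions, not part of the SRDR specification.

```python
import re

def count_shall_statements(spec_text: str) -> int:
    """Count "shall" statements in a requirements specification.

    Assumes one requirement per sentence and that every requirement
    sentence contains the word "shall" (IEEE Std 830-1998 style).
    A real SRS/IRS parser would also handle requirement identifiers
    and tables.
    """
    sentences = re.split(r"(?<=[.;])\s+", spec_text)
    return sum(
        1 for s in sentences if re.search(r"\bshall\b", s, re.IGNORECASE)
    )

# Hypothetical plain-text exports of the two baseline specifications.
with open("srs.txt") as f:
    initial_functional_reqts = count_shall_statements(f.read())
with open("irs.txt") as f:
    initial_external_interfaces = count_shall_statements(f.read())

# Assumption: the size proxy is the sum of the two counts.
initial_software_reqts = initial_functional_reqts + initial_external_interfaces
```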

  11. Grouping Dataset by Super Domain • 1) Dataset initially mapped into 17 Application Domains* • 2) Then into 4 complexity groups called Super Domains (a lookup-table sketch follows below)
      Mission Support (SUPP): Software Tools; Training
      Automated Information System (AIS): Enterprise Information System; Enterprise Services; Custom AIS Software
      Engineering (ENG): Mission Planning; Test, Measurement, and Diagnostic Equipment; Scientific & Simulation; Process Control; System Software
      Real Time (RTE): Command & Control; Communications; Real Time Embedded; Vehicle Control/Payload; Signal Processing; Microcode & Firmware
      *New DoD policy (http://cade.osd.mil/policy/srdr) requires that Application Domains are identified for reported software activities.
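In code this grouping is a simple lookup table. The sketch below transcribes the mapping above into a Python dict; the placement of Mission Planning under ENG is a best guess where the flattened source layout is ambiguous.

```python
# Application domain -> super domain, transcribed from the table above.
# Mission Planning's group (ENG) is an assumption; the source layout
# is ambiguous at that column break.
SUPER_DOMAIN = {
    "Software Tools": "SUPP",
    "Training": "SUPP",
    "Enterprise Information System": "AIS",
    "Enterprise Services": "AIS",
    "Custom AIS Software": "AIS",
    "Mission Planning": "ENG",
    "Test, Measurement, and Diagnostic Equipment": "ENG",
    "Scientific & Simulation": "ENG",
    "Process Control": "ENG",
    "System Software": "ENG",
    "Command & Control": "RTE",
    "Communications": "RTE",
    "Real Time Embedded": "RTE",
    "Vehicle Control/Payload": "RTE",
    "Signal Processing": "RTE",
    "Microcode & Firmware": "RTE",
}

def super_domain(application_domain: str) -> str:
    """Map a reported SRDR application domain to its super domain."""
    return SUPER_DOMAIN[application_domain]
```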

  12. Grouping Dataset by Super Domain
      Operating Environment | Support | AIS | Engineering | Real Time | TOTAL
      Aircraft              |    2    |  0  |      4      |     0     |   6
      Business              |    1    |  3  |      0      |     0     |   4
      C4I                   |    0    |  1  |      3      |     5     |   9
      Missile               |    0    |  0  |      0      |     1     |   1
      TOTAL                 |    3    |  4  |      7      |     6     |  20
      Top 2 operating environments → C4I and Aircraft

  13. Variable Selection • 1) Pairwise correlation analysis to select independent variables • 2) Stepwise analysis to select categorical variables (a sketch of both steps follows below)
      Dependent variable: Final Effort
      Candidate independent variables (pairwise correlation analysis): Initial Software Requirements; Initial Functional Requirements; Initial External Interfaces; Initial Equivalent SLOC (ESLOC); Initial Peak Staff; Initial Duration
      Candidate categorical variables (stepwise analysis on the original effort equation): Process Maturity; Development Process; Super Domain; Scope (New vs. Enhancement)
      Selected variables feed into Regression Analysis
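A minimal sketch of both selection steps, assuming the normalized dataset is a pandas DataFrame with one row per project; all column names here are hypothetical stand-ins for the variables listed above, and the single stepwise step shown is illustrative rather than the authors' exact procedure.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("agile_srdr.csv")  # hypothetical normalized dataset

# 1) Pairwise correlation: rank candidate independent variables by
#    their correlation with the dependent variable, final effort.
candidates = ["init_sw_reqts", "init_func_reqts", "init_ext_interfaces",
              "init_esloc", "init_peak_staff", "init_duration"]
print(df[candidates].corrwith(df["final_effort"]).sort_values(ascending=False))

# 2) One stepwise step: test whether adding a categorical variable
#    (here Super Domain) significantly improves the base effort model.
base = smf.ols("final_effort ~ init_sw_reqts", data=df).fit()
extended = smf.ols("final_effort ~ init_sw_reqts + C(super_domain)",
                   data=df).fit()
print(anova_lm(base, extended))  # keep the variable if the F-test p < alpha
```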

  14. Model Selection • Model selection based on p-value, lowest MMRE, and CV (a computation sketch follows below)
      Measure | Symbol | Description
      Coefficient of Variation | CV | Percentage expression of the standard error compared to the mean of the dependent variable; a relative measure allowing direct comparison among models
      P-value | p | Level of statistical significance established through the coefficient alpha (p ≤ α)
      Variance Inflation Factor | VIF | Indicates whether multi-collinearity (correlation among predictors) is present in multiple regression analysis
      Coefficient of Determination | R² | Shows how much variation in the dependent variable is explained by the regression equation
      Mean Magnitude of Relative Error | MMRE | Sample mean of the magnitude of relative error (MRE), where MRE = |A − E| / A for actual effort A and estimated effort E; low MMRE indicates high accuracy
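The accuracy measures are straightforward to compute. A sketch follows, using one common formulation of CV (standard error of the estimate over the mean of the actuals); the slide does not spell out its exact CV computation, so the degrees-of-freedom correction is an assumed convention.

```python
import numpy as np

def mmre(actual: np.ndarray, estimated: np.ndarray) -> float:
    """Mean Magnitude of Relative Error: mean of |A - E| / A."""
    return float(np.mean(np.abs(actual - estimated) / actual))

def cv_percent(actual: np.ndarray, estimated: np.ndarray,
               n_model_params: int) -> float:
    """Coefficient of Variation: standard error of the estimate as a
    percentage of the mean of the dependent variable.

    n_model_params counts the fitted coefficients, intercept included;
    the degrees-of-freedom correction is an assumed convention.
    """
    dof = len(actual) - n_model_params
    std_err = np.sqrt(np.sum((actual - estimated) ** 2) / dof)
    return float(100.0 * std_err / np.mean(actual))
```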

  15. Dataset Demographics

  16. Dataset by Delivery Year • [Figure: bar chart of number of projects (0–7) by Agile software project delivery year, 2008–2016] • The number of completed Agile projects reported in CADE has increased since 2014
